DevOps Home Server – Part two – Software

For the purposes of my own home DevOps server, I decided to use the range of Atlassian software.

  • Source code control – Atlassian Bitbucket
  • Project and issue tracking – Atlassian JIRA
  • Documentation – Atlassian Confluence
  • Build and deploy server – Atlassian Bamboo
  • Code analysis – SonarQube

The main reasons for this are:

  • I have used these applications in a previous job
  • They all integrate really nicely with each other
  • They are affordable (even though there are free open source equivalents)

Once they are all set up, and connected together, you can quickly change between them when browsing the web interfaces.

In terms of price, each of these applications (apart from SonarQube, which is free) costs $10 for a perpetual licence for 10 users.  If you want software maintenance, you can pay that price again every year (it's actually half that to renew maintenance after the initial purchase), but in my case the initial purchase is going to be fine, costing not much more than a decent takeaway.

The Virtual Machines

Both VMs are running on my original Intel NUC server (i7, 16GB RAM), set up with Microsoft Windows Server 2012.  I decided to create a separate SQL Server machine for a couple of reasons: to reduce the risk of failure (so one VM is not hosting everything), but also so that if I ever need a general-purpose SQL Server, it can be used for that as well.

SQL Server

I installed Microsoft SQL Server 2016 on the VM, opened up the appropriate ports, added firewall rules etc., and that was about it.  It now sits there chugging away nicely.  I have yet to sort out any form of backup system.  I will probably set up some automated database backups, but I will also aim to have the actual VM backed up as well.
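As a rough sketch of what those automated backups might look like, here is a small Python helper that builds the T-SQL `BACKUP DATABASE` statement for each application's database. The database names and the backup directory are placeholders, not the actual names on my server.

```python
from datetime import date

def backup_statement(database: str, backup_dir: str = r"D:\Backups") -> str:
    """Build a full-backup T-SQL statement for one database,
    date-stamping the backup file name."""
    stamp = date.today().strftime("%Y%m%d")
    return (
        f"BACKUP DATABASE [{database}] "
        f"TO DISK = N'{backup_dir}\\{database}_{stamp}.bak' "
        f"WITH INIT, COMPRESSION;"
    )

# Hypothetical database names for the five applications
for db in ["jira", "confluence", "bitbucket", "bamboo", "sonarqube"]:
    print(backup_statement(db))
```

Something like this could be fed to `sqlcmd` from a scheduled task, which would cover the database side; the VM-level backup would still be separate.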

DevOps Server

The DevOps server is another VM, although I have given it a bit more power than the SQL one, purely because it will be hosting a number of applications at the same time and will need a fair amount of oomph to keep everything running smoothly.

Installing the software

NOTE: this is not a guide on how to install everything, just an overview of the experience.

The first step was to install Java, as most of the Atlassian products are built on it, after which I also installed Visual Studio as I knew that was a requirement for the Bamboo build and deploy server.  I then installed and configured the following applications one by one:

  • Atlassian JIRA
  • Atlassian Confluence
  • Atlassian Bitbucket
  • Atlassian Bamboo
  • SonarQube

Each piece of software required me to choose a program location and a data directory.  Most of the software also added a Windows service (although some had to be installed manually by running command scripts).  Each application would then start and allow me to perform the first-time configuration.  This involved the usual initial settings, database connection and so on.  Most of the databases had to be created prior to running the first-time config, and also had to adhere to the software's requirements in terms of collation settings.  Each database was given its own specific user for the application to use (a standard SQL user).
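The per-application database setup above follows a repeatable pattern: create the database with the required collation, then create a dedicated login and user for it. The sketch below builds that T-SQL; the names, password and collation are illustrative only (each Atlassian product documents its own required SQL Server collation, so check the docs for the real value).

```python
def provision_sql(database: str, user: str, password: str,
                  collation: str) -> str:
    """Build the T-SQL to create one application database with a
    specific collation, plus a dedicated SQL login and db_owner user."""
    return "\n".join([
        f"CREATE DATABASE [{database}] COLLATE {collation};",
        f"CREATE LOGIN [{user}] WITH PASSWORD = N'{password}';",
        f"USE [{database}];",
        f"CREATE USER [{user}] FOR LOGIN [{user}];",
        f"ALTER ROLE db_owner ADD MEMBER [{user}];",
    ])

# Example collation is a placeholder, not necessarily the one JIRA needs
print(provision_sql("jira", "jira_app", "example-password",
                    "SQL_Latin1_General_CP437_CI_AI"))
```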

As I wanted them all to share the same user database, I elected to use Active Directory, although this had to be set up after the initial install.  Most of the applications required the same AD configuration, and once I had figured it out for the first one, the others were straightforward.
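The AD settings each application asks for are essentially the same handful of values, which is why the second and later installs were quick. The sketch below captures the shape of that shared configuration; the server name, base DN and IP are placeholders for my real domain, and the filter is the standard AD pattern for matching user objects.

```python
# The kind of shared LDAP settings each Atlassian application asked for.
# All values are placeholders for my actual home AD domain.
AD_CONFIG = {
    "server": "ldap://dc01.home.local:389",
    "base_dn": "DC=home,DC=local",
    "username_attribute": "sAMAccountName",
}

def user_search_filter(username: str) -> str:
    """AD search filter matching one user account by sAMAccountName."""
    return ("(&(objectCategory=person)(objectClass=user)"
            f"(sAMAccountName={username}))")

print(user_search_filter("alice"))
```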

Once everything was up and running, I was able to connect all of the Atlassian applications together so they show up on each application's hamburger menu, and as they are all configured with the same set of AD users, I can switch seamlessly between them.

External access

Obviously, as this is all just running on my home network, I don't want to lose access to it whenever I go anywhere, so I wanted to expose it to the outside world.  I am not worried about security, or about having my home broadband thrashed by outside users, as it's only going to be me using it.

To expose them online, all I had to do was configure some port forwards on my router, so that any traffic hitting it on the relevant application ports was forwarded to the correct port on the virtual server.  I configured my router to use a dynamic DNS service (Netgear's own) so that my home network could be reached at somehost.mynetgear.com on all of the configured ports.

I then configured an appropriate domain so that the various subdomains would also redirect to somehost.mynetgear.com.

jira.someexternaldomain.co.uk → somehost.mynetgear.com

confluence.someexternaldomain.co.uk → somehost.mynetgear.com

bitbucket.someexternaldomain.co.uk → somehost.mynetgear.com

bamboo.someexternaldomain.co.uk → somehost.mynetgear.com

sonar.someexternaldomain.co.uk → somehost.mynetgear.com

Unfortunately, this means that everything is still reliant on port numbers, so when accessing one of the sites I still need to use something like http://confluence.someexternaldomain.co.uk:8090, but I am not too bothered about that.
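The subdomain-plus-port scheme above can be sketched as a simple lookup table. The ports below are the usual out-of-the-box HTTP ports for each application; an actual install may well use different ones, and the domain is the same placeholder used throughout this post.

```python
# Usual default HTTP ports for each application (assumption: a real
# install may be configured differently).
PORTS = {
    "jira": 8080,
    "confluence": 8090,
    "bitbucket": 7990,
    "bamboo": 8085,
    "sonar": 9000,
}

DOMAIN = "someexternaldomain.co.uk"  # placeholder domain, as in the post

def external_url(app: str) -> str:
    """External URL for one application: its subdomain plus its port."""
    return f"http://{app}.{DOMAIN}:{PORTS[app]}"

print(external_url("confluence"))
# → http://confluence.someexternaldomain.co.uk:8090
```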

Internal DNS

Now, one issue I discovered with this plan was that when the applications tried to communicate with each other internally, because all of the URLs (on the server itself) were full domain names and not just IP addresses, the traffic would actually do the round trip out to the internet and back.  This was not good, as it would have caused delays.  To fix it, I went on to my Active Directory VM and set up forward lookup zones in DNS, so that any machine on the internal network using AD would not need to go outside the network for these URLs.
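This is essentially split-horizon DNS: the internal DNS server answers for the external names with a LAN address, and anything it doesn't know about falls through to normal public resolution. A minimal sketch of that decision, with placeholder private and public IPs:

```python
# Forward-lookup records held by the internal AD DNS server.
# Addresses are placeholders, not my real ones.
INTERNAL_ZONE = {
    "jira.someexternaldomain.co.uk": "192.168.1.50",
    "confluence.someexternaldomain.co.uk": "192.168.1.50",
    "bitbucket.someexternaldomain.co.uk": "192.168.1.50",
}

def resolve(hostname: str, external_ip: str = "203.0.113.10") -> str:
    """Return the LAN address when the internal zone has a record,
    otherwise fall through to the public (round-trip) address."""
    return INTERNAL_ZONE.get(hostname, external_ip)
```

With records like these in place, internal clients hit the virtual server directly instead of bouncing off the router's public side.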

Performance

In terms of performance, it all seems to work well.  There are times (such as when a build job is running) when things slow down somewhat, but I would say it's no worse than I have seen in a real production environment.  It all works rather well, and it feels like I have my own professional business environment right at home.  The only issue is that the actual server makes a bit of noise: the fan seems to be working pretty hard, but I can just shut the door on my office and forget about it.

In the next blog post, I am going to describe the workflow of using it all.

DevOps home server – part one – the equipment

 

Ah, DevOps, such a buzzword now.  It seems that everyone wants to bring operations and development together in a harmonious gathering of intellectual minds.  To get in on the action, I wanted to do some hands-on development, with a little saucy operation to go with it, and so decided to experiment with some home server shenanigans.

Why bother, I hear you say; why not just use the existing cloud services, I hear you cry.  Well, I really don't have an explanation, other than to say: why not?  Sometimes a workplace may not be in a position to use the various cloud services and may have to host everything itself, so it's worth having a bit of experience with such a situation.  So, I bring to you my experience of setting it all up, using Windows Hyper-V.

Firstly, a little bit of background as to what I already had, before I get into the most recent state of my little home development environment.

Equipment

The existing Hyper-V server

 

I have always had the drive to have my own little server at home, primarily to run my own instance of Microsoft Dynamics, simply because for development purposes it was too expensive to have an online instance for general experimentation.  As a result, quite a while ago, I purchased a small Intel NUC small-form-factor PC to run Hyper-V on and host my development environments.  This was an Intel i7 PC with 16GB of RAM, a 512GB SSD, and a 1TB hard disk.  It was a small machine, took very little power, and since Windows 10 Pro also provided Hyper-V virtualisation, there was no need for it to run Windows Server.  On this PC, I set up one virtual machine to host Active Directory, and a second where I installed Microsoft SQL Server and Microsoft Dynamics.  This provided me with a nice little CRM test environment.  Over time, I hit a snag when my requirements outgrew the little machine: I needed other virtual machines to host other bits and bobs, and the initial CRM VM also became bogged down with the number of different CRM organisations I was running.  It was time for an upgrade.

Hyper-V server, the second coming

 

So, keeping with the excellent experience I had with the Intel NUC, I decided to get a new one, the latest version.  This little baby had the latest i7 and supported a maximum of 32GB of RAM, but was otherwise pretty similar to the original.

https://www.intel.co.uk/content/www/uk/en/products/boards-kits/nuc/kits/nuc7i7bnh.html

Coupled with 32GB of RAM, a 512GB SSD for the operating system, and a 2TB 2.5-inch drive, it was ready to receive Windows 10 Professional.

Installing Windows 10 was easy, as was adding Hyper-V and moving my VMs across.  With the 32GB of memory, I was able to increase the memory of the Dynamics virtual machine to give it room to grow, and I still had memory left over.

Enabling remote desktop access on the server allowed me to unplug it from the screen and keyboard and position it out of the way, connected simply to power and ethernet.  And there it sits, chugging away.

Servers in action

And this is what they look like.