Setting up your own Docker host engine

Discussion

When I first got serious about learning Docker it was for rather selfish reasons: I wanted to save money on hosting by containerizing the websites I had at the time.  I'm not finished yet, but I'm getting there.  What I wanted was a single "rented" host running virtualization of some sort.

Digital Ocean has turned out to be a quality and economical way to get this done.  Digital Ocean does offer a pre-packaged Docker application image that can have you up and running in a very short amount of time.  I am a thick-headed DIY sort of guy, though, and decided "nope! let me just spin up a droplet and set Docker up myself."

Following the instructions I published in a previous article, I set up the Docker engine; what follows are the instructions for connecting the docker-machine CLI tool to this new instance.  It's not that hard to do and is fairly rewarding.

Get ‘er Done!

OS Considerations

In case you're wondering, I set my Docker hosting machine up with Ubuntu 18.04 LTS.  It's a distro I've learned to love: aside from the fact that I reboot all my machines about once a quarter, the ones provisioned with it could probably have run for years.  I have one last aging Debian server that I plan to swap out for Ubuntu sometime this calendar year.

Authentication Bits

I am assuming at this point that you have a machine set up with docker-ce installed and running. If not, go back and do that first.

You'll want to create an account on your hosting machine; I used the same account name that exists on my laptop.

On your hosting machine as root
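A minimal sketch of what that looks like, assuming the account name is jdoe (substitute whatever name you use on your laptop):

    # create the account (use your own account name in place of jdoe)
    adduser jdoe

    # add the user to the docker group so it can talk to the Docker daemon
    usermod -aG docker jdoe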

The usermod invocation adds the user to the docker group, allowing access to the docker CLI and its associated operations.

Next you'll want to create (if you don't already have one) an RSA SSH key.

On your local host as yourself
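Assuming you are fine with the default key location of ~/.ssh/id_rsa, something like this will do:

    # generate a 4096-bit RSA key pair; accept the default path when prompted
    ssh-keygen -t rsa -b 4096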

Take the public key that is generated (~/.ssh/id_rsa.pub) and copy it to the clipboard.

On the remote host as yourself you are going to create (if it doesn’t already exist) the .ssh directory and an authorized_keys file inside the .ssh directory.
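Roughly, that amounts to the following; paste your actual public key in place of the placeholder:

    # create the .ssh directory with the permissions sshd expects
    mkdir -p ~/.ssh && chmod 700 ~/.ssh

    # append the public key you copied from your laptop
    echo "ssh-rsa AAAA...rest-of-your-public-key" >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys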

On your local host ensure you are able to ssh as yourself to the remote host without being prompted for a password.
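A quick check, using the same USER and FQDN placeholders as in the next section:

    # should drop you into a shell on the remote host with no password prompt
    ssh USER@FQDN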

Once that is done it is now time to initialize the docker-machine.

Set up docker-machine

What you need going into this is the following information:

  • Fully qualified domain name of the Docker host
  • IP Address of the Docker host
  • Path to your SSH public key

On your local host, as yourself, execute the following:
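The exact flags depend on your docker-machine version, but with the generic driver the invocation looks roughly like this:

    docker-machine create \
      --driver generic \
      --generic-ip-address=IP \
      --generic-ssh-user=USER \
      --generic-ssh-key ~/.ssh/id_rsa \
      FQDN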

Where USER is the userid you are logging into the remote host with, IP is the IP address of the remote host, and FQDN is the fully qualified domain name of the remote host.

Validation

You should now be able to connect to and redirect your docker CLI to the remote host.

To set up the connection do the following on your local host as yourself:
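Assuming you named the machine after its FQDN as above, it is a one-liner:

    # point the docker CLI at the remote host for this shell session
    eval $(docker-machine env FQDN)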

This sets environment variables in your shell that tell the docker CLI where to send the requests you are making.  If you want to see what is actually set, simply run:
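For example, using the same FQDN placeholder:

    docker-machine env FQDN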

In both cases substitute FQDN with the fully qualified domain name you used earlier.

 

Start with something simple:
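The classic smoke test works well here (hello-world is just Docker's stock test image):

    # pulls and runs Docker's tiny test image on the remote host
    docker run hello-world

    # list containers on the remote host
    docker ps -a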

 

That’s it!

Getting Started with Docker and Docker Compose

Introduction and chatter.

I've been using Docker as a "hobbyist" for a few years now. I use it for everything from hosting this blog to testing out new ideas I plan to use at work.  Using Docker to test new ideas is a good thing, especially if the tests have the potential to destroy something.  Who cares if I screw up an ephemeral container that I can always rebuild?

Of course Docker has a serious side, and while I've not needed it in a professional setting so far (except to test Puppet modules, but that's another topic for another post), learning Docker can't hurt professionally.  Having one's own Docker environment is a good thing and a great way to get comfortable.

I am not going to go into great detail about setting up a hosting environment in this post.  You actually do end up setting one up on your desktop or laptop just by following my instructions; what I am talking about is hosting one on a remote machine in a data center. You can pretty well read between the lines to figure out that there isn't much difference.

What you’ll Need

  • An internet connection to download packages and scripts through
  • A fairly "beefy" laptop or desktop.  In this case, a machine with enough memory (8 GB+) to support running your containers.
  • docker-ce package
  • docker-compose script
  • docker-machine script

docker-ce

The docker-ce package is the main thing needed here. It has all the software infrastructure to create containers on your local host, plus the commands needed to create containers remotely.

docker-compose

This provides a layer on top of docker (and I'll get into more of this in another post) that allows you to define entire infrastructures in one file. For instance, for a three-tier application you could define the build of a web server, application middleware server, and database server as a unit, and docker-compose would handle the tasks associated with spinning them up.
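As a rough illustration only (the image names here are placeholders, not anything from a real deployment), a three-tier stack in a docker-compose.yml might look like this:

    version: "3"
    services:
      web:
        image: nginx:latest           # front-end web server
        ports:
          - "80:80"
        depends_on:
          - app
      app:
        image: my-middleware:latest   # hypothetical application middleware image
        depends_on:
          - db
      db:
        image: postgres:11            # database tier
        environment:
          POSTGRES_PASSWORD: example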

docker-machine

This provides an interface for redirecting the docker and docker-compose commands you run locally to a remote Docker hosting machine.

Commentary

Most if not all of the instructions are geared toward working on Linux, and in particular Ubuntu.  My hosted Docker machine is running Ubuntu 18.04 LTS at the moment and seems to be doing well.  For those on the spawn of the Evil Empire there are of course differences, but I don't do Windows, so those instructions won't be here.

Docker Hosting

If you want to jump right in and get a Docker hosting system there are a few ways of doing that.

Build your own

Got a nice server of your own?  Are you renting one from any of the myriad hosting providers that exist?  No sweat: follow the instructions for installing docker-ce, add the docker-machine setup to your list of short projects, and away you go.

Hosted Solutions

Both Digital Ocean and Microsoft Azure offer hosted Docker solutions.  So far, of the two, I am happier with Digital Ocean's offering and I use the heck out of it.  Again, you'll want docker-machine set up on your laptop or desktop to interact with it.  Both have different approaches to setting up the credentials you need to access your Docker machine.

Digital Ocean also has support for Kubernetes coming soon.. just saying…

Let’s Get to Work

docker-ce

Again, I'm covering how to do this on Ubuntu. First you'll need to point apt at the official Docker repositories.

As root you want to execute the following:
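At the time of writing, Docker's own instructions for Ubuntu boiled down to roughly this:

    # prerequisites for using an https apt repository
    apt-get update
    apt-get install -y apt-transport-https ca-certificates curl gnupg-agent software-properties-common

    # import Docker's gpg key
    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | apt-key add -

    # add the stable docker-ce repository for your Ubuntu release
    add-apt-repository \
      "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"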

This will import the gpg key for the docker-ce package and install the configuration for apt to find the docker-ce repository.  Now we want to install docker-ce along with its dependencies.

NOTE: if you have a previous version of docker, docker-ce, or docker-engine on your machine, it is advisable to uninstall it before proceeding any further.

Installing docker-ce:
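With the repository in place, the install itself is a couple of apt commands:

    apt-get update
    apt-get install -y docker-ce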

Now that you have docker-ce installed, you might want to validate that everything is working for you.
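A couple of quick checks will tell you whether the daemon is up and answering:

    # should print both client and server version information
    docker version

    # pulls and runs Docker's tiny test image
    docker run hello-world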

docker-compose

Now it is time to install docker-compose.

Become root and perform the following:
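At the time, the documented approach was to pull the release binary straight from GitHub (substitute the current release number for 1.24.0):

    # download the docker-compose binary for your platform
    curl -L "https://github.com/docker/compose/releases/download/1.24.0/docker-compose-$(uname -s)-$(uname -m)" \
      -o /usr/local/bin/docker-compose

    # make it executable
    chmod +x /usr/local/bin/docker-compose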

That's it; it really is that simple.

If all you want is to run Docker locally, you are done.   However, if you are planning on using Docker against a remote host you'll also need docker-machine. Very simply, become root and run the following:
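Again, this pulls the release binary from GitHub; substitute the current docker-machine release for v0.16.0 if a newer one exists:

    # download the docker-machine binary for your platform
    curl -L "https://github.com/docker/machine/releases/download/v0.16.0/docker-machine-$(uname -s)-$(uname -m)" \
      -o /usr/local/bin/docker-machine

    # make it executable
    chmod +x /usr/local/bin/docker-machine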

In my next article I'll tie all this together in a neat package for you so you can start spinning up containers to your heart's content.

 

 
