Quick-Starts in Three Lines or Less
```shell
git clone
./install.sh
./run.sh
```
There’s nothing more pleasing than being able to execute something like the above, and nothing more, and have it result in a running service or library ready for import. Having a short and effective quick-start such as this is a must for any project I wish to expose to the world. It allows prospective users to quickly dip their toes in, new developers to be quickly on-boarded, and other team members to quickly spin up the code as a dependency without needing to be intimately familiar with it. It also shows that some thought has been put into the developer experience, a good sign that the project is probably nice to work with and not some internal developer’s collection of one-off scripts. When a project involves many code bases, even more benefit arises from consolidating this process to be similar across all of them, enabling muscle-memory-like invocation. Here, I will walk through my approach to providing a friendly and familiar bootstrapping process across a project.
A Common Abstraction
Every language and many frameworks have at least one common approach to developing in them. Each code base will likely have a unique (and hopefully documented!) process dependent on what language is being used, what the target platform is, the background of the developers involved, et cetera. These individual characteristics are important to allow the specific type of development to happen, but, of course, are counter-productive to the goal of offering a standard bootstrap procedure. It’s then important to find an interface that can contain any toolchain.
Docker and containers offer just this.
Any amount of complexity required for a project’s build, installation, or execution may be neatly tucked into a Dockerfile.
If you’ve investigated a new backend service recently, you’ve probably been told to run something like one of the following on the first page of the Getting Started guide:
```shell
# Elasticsearch
docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch

# Prometheus
docker run --name prometheus -d -p 127.0.0.1:9090:9090 quay.io/prometheus/prometheus
```
All it takes is a Docker installation and one command and, voila! A running service with no deep diving into all the knobs and complexities of that individual service. The developer is now free to evaluate the capabilities of the service, or move straight to implementing it into their current work.
Provide a Dockerfile for your service that provides a zero-configuration, runnable service.
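As a sketch, a zero-configuration Dockerfile for a hypothetical Node.js service might look like the following; the base image, file names, and port are assumptions for illustration only:

```dockerfile
FROM node:20-alpine

WORKDIR /app

# Install dependencies first so Docker can cache this layer between builds
COPY package*.json ./
RUN npm ci --omit=dev

# Copy in the rest of the source
COPY . .

# The service listens on 8080 by default; no configuration required
EXPOSE 8080
CMD ["node", "server.js"]
```

With something like this in place, `docker build -t my-service . && docker run -p 8080:8080 my-service` is all a newcomer needs to see it running.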
This is a good start, although we would ideally like to be able to run a code base with a functioning default configuration. Passing configuration flags to Docker just won’t cut it.
Configuring a code base is often a task for developers or those familiar with the intricacies and inner workings of the project. First-time users should not have to read up on how to set each of the knobs before running it for the first time, so they should be given a hand with some functioning defaults.
Providing these defaults, such as with commands similar to the above, in the How-To or other quick-start documentation is good, but it still works against the ‘one process’ agenda.
Better yet, the configuration for a code base can be set up using Docker Compose and a docker-compose.yml file. This allows environment variables, port mappings, volume mappings, and other settings to be masked behind a single command, while still being configurable for users who need more control.
Use docker-compose and set sensible and functioning defaults in docker-compose.yml to mask them from the user.
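A minimal docker-compose.yml along these lines might set working defaults while leaving every knob overridable; the service name, port, and environment variables here are hypothetical:

```yaml
services:
  my-service:
    build: .                       # build from the Dockerfile in this repo
    ports:
      - "8080:8080"                # sensible default port mapping
    environment:
      LOG_LEVEL: info              # functioning defaults, not required reading
      DATA_DIR: /var/lib/my-service
    volumes:
      - service-data:/var/lib/my-service

volumes:
  service-data:
```

Variable interpolation such as `"${HOST_PORT:-8080}:8080"` lets power users override a default from their shell without ever opening the file.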
Now, instead of writing docker run <project specific defaults> ... in our How-To guides, we can consistently specify docker-compose up.
By utilizing docker-compose and a standard Dockerfile, I can wrap any service’s build, installation, execution, and configuration in one simple command.
However, in modern service development, a project scarcely consists of a single service. Most services build upon one or more persistence, networking, logging, metrics, or any number of other services.
Often, a Getting Started guide walks the user through setting up these dependencies before finally running the service itself. Even worse, the guide may simply leave it to the user to configure the dependency, or not even mention that one is required!
Docker Compose is a tool that is commonly used to manage multiple
containers and networking between them.
This means that, for a service that has dependent services, a
docker-compose.yml may be provided to define them.
Configuration, versions, and networking may all be abstracted away into the docker-compose file, allowing yet another one-liner for the developer to perform that results in a fully functional service, no matter how complex its dependencies.
For services with other dependent services, include a docker-compose.yml that takes care of the dependencies and configurations.
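For example, a service that depends on PostgreSQL and Redis could declare them alongside itself in the same file; the service names, image versions, and credentials below are illustrative only:

```yaml
services:
  my-service:
    build: .
    ports:
      - "8080:8080"
    environment:
      # "db" and "cache" resolve over the Compose-managed network
      DATABASE_URL: postgres://app:app@db:5432/app
      REDIS_URL: redis://cache:6379
    depends_on:
      - db
      - cache

  db:
    image: postgres:16
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: app
      POSTGRES_DB: app

  cache:
    image: redis:7
```

Note that depends_on only controls start order, not readiness; a service that needs its database fully up before accepting traffic should retry its connection or use a Compose healthcheck condition.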
The result of following these practices leaves us with two lines we can paste into the head of every README:
```shell
git clone <my-repo>
docker-compose up
```
Now, assuming this approach is followed across all code bases, a user wanting to try out a new service may immediately clone any repository and know exactly what to do to get it running.
More on Getting-Started Fast
There’s still more to discuss about keeping the development section of the Quick Start short and sweet using similar approaches, and once this practice is enforced organization-wide, there are even more ways to simplify a complex, micro-services project.