Docker on macOS? Not so fast.

Tweaking and optimizing will help, but is it worth your time?

Before Docker made its first appearance, we were all struggling with building and maintaining stable and performant local development environments. As Drupal developers, our LAMP stack evolved with us and new dependencies and restrictions presented themselves.

And then came Docker. And Lando. Containerization - and virtualization in general - took the web by storm and promised us a new era in web development and deployment - no more messy dependency management, no more unexpected DevOps issues, and the ability to spin up systems locally, or deploy them to scalable environments, quickly and efficiently.

How could we not jump on the train?

If you’re reading this I can only assume you have first-hand experience, and at least some frustration, with the slowness of Docker on macOS-based machines. You’ve probably followed a few guides for tweaking and optimizing your Docker configuration, and noticed a difference, but you probably also know that things could be much, much faster.

Without getting too deep into the technical details, it’s safe to say that the slowness you’re experiencing is inherent and, at least as of writing this blog post, inevitable. On macOS, Docker runs containers inside a Linux virtual machine, so every file shared with a container must cross a translation layer (osxfs, or more recently gRPC FUSE) between your native filesystem and the virtual one. In essence, any change to any file needs to be synchronized between the two, and I/O-heavy operations consume a considerable amount of processing overhead to accomplish an otherwise simple task.

Many partial solutions exist, and if you use Docker it’s important to stay up-to-date and test different setups for squeezing the most performance out of your machine. Some very interesting and promising tools, such as Mutagen.io, add an extra synchronization layer that keeps a fast native copy of your files inside the VM; others replace the default file sharing with NFS mounts. More solutions may already exist, and it’s certain that the Docker development team is hard at work on their own.
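One tweak worth knowing about: Docker Desktop for Mac has historically accepted per-volume consistency flags in a Compose file, where `delegated` relaxes the guarantee that the host sees container writes immediately, which can noticeably speed up write-heavy workloads. A minimal sketch (the service name and paths are placeholders):

```yaml
# docker-compose.yml (fragment)
services:
  appserver:            # placeholder service name
    image: php:8.1-apache
    volumes:
      # "delegated" lets the container's view of the mount lag slightly
      # behind the host's, trading strict consistency for write speed.
      - ./web:/var/www/html:delegated
```

Note that recent Docker Desktop releases (using VirtioFS) accept but ignore these flags, so measure before and after rather than assuming a win.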

So where are we now?

The promise was not fully kept, but we cannot ignore the amazing advantages - Docker, and virtualization in general, are solid solutions for pre-production and production environments. It’s a game changer.

Let’s also not underestimate some of the other advantages - any Docker-enabled project can be easily set up locally in an isolated, purpose-built environment, allowing any developer to hop on board any project with no friction. The value of these containers also presents itself in the testing and automation of code, processes, and systems, all of which require a high level of standardization and certainty that virtual environments provide by design.

And what about local development?

When we examine the requirements for efficient local development environments, we find a few key factors:

  1. Speed - We want to be able to quickly set up and operate systems locally.
  2. Overhead - Our time is limited and we want to spend it coding, not tweaking or waiting.
  3. Ease of use - We want a minimal learning curve and high flexibility.
  4. Standardization - We want our environment to be as identical as possible to production.

As you have probably experienced yourself, Docker falls short on the first three points. It is true that significant optimization can be achieved if time is put into the set-up, but that only adds more overhead and erodes much of the “ease of use” value if we constantly need to fiddle with updates, settings, and configurations.

The solution?

There are many solutions, and they all come with their own advantages and disadvantages. The two most obvious solutions are:

  1. Switching to a Linux-based machine. Most of us will encounter significant overhead running and operating such a machine, and might end up spending even more time getting everything else in our set-up working smoothly.
  2. Staying up-to-date with an optimized Docker set-up that utilizes additional tools for speeding things up. True, some overhead is required and you’ll need to get your hands dirty with some “under the hood” configurations to truly get the most out of this set-up, but it will have the added value of standardization (and a real sense of accomplishment!).

But I would like to suggest a third, hybrid, solution.

If you are a predominantly LAMP-based developer who works on database-intensive frameworks such as WordPress or Drupal, you may want to consider going native. macOS has half the stack built in, and the other half can quickly be installed using Homebrew (or an equivalent package manager).
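Getting the missing half of the stack in place is only a few commands; a minimal sketch, assuming Homebrew is already installed (package names current as of writing, and `php@8.1` is just an example pinned version):

```shell
# Install PHP and MySQL with Homebrew (Apache already ships with macOS).
brew install php mysql

# Run MySQL as a background service that restarts on login.
brew services start mysql

# For edge cases, keep a parallel pinned PHP version around and
# switch between versions by re-linking.
brew install php@8.1
brew unlink php && brew link --force --overwrite php@8.1
```

`brew services list` shows what is currently running, which is handy when juggling several versions.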

It is relatively painless to set up wildcard SSL and dynamic DNS for a flexible local setup that allows working on multiple projects in parallel. Keeping your stack up-to-date is important, but there are many ways to run parallel versions of PHP and MySQL locally for those edge cases, and realizing that an upgrade to part of your stack broke your system is a valuable lesson in and of itself.
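As one illustration of the dynamic-DNS half, dnsmasq (installable via Homebrew) can resolve every `*.test` domain to localhost, so each new project gets a hostname with no per-project configuration; `.test` is a good choice because it is reserved for exactly this purpose. The paths below are the Apple-silicon Homebrew defaults and may differ on your machine:

```
# /opt/homebrew/etc/dnsmasq.conf - send every *.test domain to localhost
address=/.test/127.0.0.1

# /etc/resolver/test - tell macOS to ask dnsmasq for the .test TLD
nameserver 127.0.0.1
```

After `sudo brew services start dnsmasq` (it needs root to bind port 53), any hostname such as `myproject.test` should resolve to 127.0.0.1.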

Running your stack natively requires some work, no doubt. The setup process involves more than double-clicking an installer, and you may need additional PHP extensions and some obscure configuration in the long run. But we are web developers, and understanding how the different parts of our stack interact, as well as the changes and advancements in the underlying technologies, is an integral part of our job.

And most importantly - a native stack is blazingly fast.

So should I just ditch Docker?

Absolutely not. Docker is here to stay and knowing how to work with it is a fundamental skill for any developer. Containers will still be used for testing environments as well as higher up the deployment chain. It is best practice these days to include a Docker/Lando file in every repository, pre-configured to match the optimal stack for development, to enable any developer to jump on board with no friction, but that does not necessarily mean that you must use it.
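A repository-level Lando file can be very small; a sketch, assuming a Drupal project (the name, recipe, webroot, and PHP version are placeholders you would adapt):

```yaml
# .lando.yml - minimal, hypothetical example
name: myproject          # placeholder project name
recipe: drupal10         # Lando recipe matching the framework
config:
  webroot: web           # path to the docroot within the repo
  php: '8.1'             # pin PHP to match production
```

With this committed, a new developer runs `lando start` and gets a matching stack - whether or not you use it yourself day to day.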

I’ve tried to hop on the Docker train a few times in recent years. The promise was too compelling to ignore, but the time I found myself waiting for otherwise snappy operations always led me back to my native setup (think cache clearing and migration processing).

The reality is that most of what we tend to work on utilizes a very similar, if not identical, stack. And for those rare cases where my local setup does not match the specific requirements of a project - I look at those as educational opportunities and take the time to understand the “why” and the “how” of those requirements, often leading to a better understanding of the project as a whole.

My personal recommendation?

Keep Docker around, but use it deliberately. Keep it optimized, updated, and ready to launch, but don’t feel that you must use it, especially if you’re working on database-intensive projects and find yourself reading articles such as this while waiting for the cache to clear.

Take the time to set up a native local environment and experiment with it. You will learn a lot about the underlying technologies of your work, and you will certainly experience a considerable boost in performance.

Sometimes, knowing when not to use a tool is as important as knowing how and when to use it.

Docker and Lando are great solutions for specific tasks, and they provide the world of web development tremendous value when used correctly, but if you’re feeling that you are spending too much time fiddling or waiting, and not enough time coding, don’t hesitate to try a more native environment setup.