Wrangling Complex Rails apps with Docker Part 3: Testing Rails under Docker
In the previous post in this series, we created a portable development environment with docker-compose. I've noticed that a lot of blog posts and tutorials on Dockerizing Rails apps finish before covering testing within Docker. Testing is an important part of Rails culture, and I believe many guides skip it because testing under Docker can be difficult to configure. My aim for this post is to alleviate some of that difficulty and give you a solid base on which to build your own infrastructure for testing in Docker.
There are many frameworks available for testing Rails apps but I'll be covering RSpec in this post. If you have minitest or anything else in your arsenal, this post can be used as a rough guide, as the infrastructure to set up testing is fairly similar.
Unit testing inside Docker is largely the same as it is outside of Docker, but browser testing requires special consideration. Feature testing in a browser can be difficult enough without Docker; add containers, networking, and the lack of a desktop environment inside containers, and browser testing becomes doubly difficult.
Running feature tests in Docker presents the following challenges:
- Our Rails server (puma) usually runs in a different container, so it has a different hostname.
- Controlling a browser in a docker container is not straightforward.
- Test containers should be persistent so that we don't incur Rails startup costs with each test run.
Prerequisites
This post assumes that you have a working docker-compose setup with Rails. It's also heavily geared toward RSpec, so if you're using another testing framework, you'll have to adjust accordingly.
If you're just following along, feel free to clone our example app.
The first thing we need is a test environment file, .env.docker.test:
DATABASE_URL="postgres://myuser:letmein@db/appdb_test"
BUNDLE_PATH=/bundle/vendor
We also need a test service definition in our docker-compose.yml. There's one in the example app or in the previous post in the series. The relevant service configuration is here:
test:
# appimage is a reference to the main application image earlier in the compose file.
image: appimage
# This is a different env file, for testing only.
env_file: .env.docker.test
# This does two things... Allows our test container to be persistent, and loads the test boilerplate
# for faster test runs. You need the "spring-commands-rspec" gem installed in order to make this work (assuming you're using rspec)
command: bin/spring server
# We don't need to allow http access from outside of docker during testing, so it's a backend service
networks:
- backend
<<: *volumes
There are a few things to take note of here:
The image: key references our main app's image. The image is built under the web: service config, named "appimage", and reused here. This is so that we build the image once and only once when issuing a docker-compose build command.
You might expect the command: to be bin/rspec. If that were the case, when the environment is brought up by docker-compose up, all specs would be run and the container would immediately exit. The spring server preloads a copy of our application and runs as a daemon. This way, our test container stays up and RSpec is preloaded. In order to take advantage of spring with RSpec, we need the spring-commands-rspec gem.
We could have specified an immediately terminating command: such as /bin/true, and used docker-compose run test rspec instead of docker-compose exec test rspec to run our tests, but then we'd incur the cost of loading Rails on every spec run. Docker also expects services to run as daemons, so we get the best of both worlds here: a daemonized Docker service and quick-loading specs.
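For reference, here's roughly where that gem lives; a minimal Gemfile sketch, assuming the usual development/test grouping (adjust to your own Gemfile):

```ruby
# Gemfile (sketch) -- spring-commands-rspec teaches Spring how to run rspec
# through the preloaded application, so repeated runs skip Rails boot time.
group :development, :test do
  gem "rspec-rails"
  gem "spring-commands-rspec"
end
```

After a bundle install inside the container, Spring knows how to run rspec against the preloaded app.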
One crucial thing that we need in order to get started is a test database.
First, bring the application up with docker-compose up -d, then bootstrap our test database:
docker-compose exec test rails db:create
docker-compose exec test rails db:migrate
Setting up RSpec with Docker
Setting up RSpec can be a daunting task. I've done it a handful of times and always have to refer back to several forms of documentation to achieve the optimal setup. The focus of this post will be to get set up so that we can run a bare minimum example under Docker.
The way I set up new applications with RSpec is very close to the way Jason Swett sets up his applications. If you're new to RSpec and Rails, that's an excellent resource. His "Complete Guide to Rails Testing" is also a wonderful resource if you're interested in the what, how, and why of Rails testing.
At a bare minimum, install the rspec-rails gem and then run rails generate rspec:install. This will give you a small testing skeleton to build upon.
RSpec does not install binstubs by default, so we have to do that ourselves:
docker-compose exec test bundle binstubs rspec-core
Running model and request specs
Running a model or request spec is pretty straightforward:
docker-compose exec test rspec spec/models/
docker-compose exec test rspec spec/requests/
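To be clear, there's nothing Docker-specific about the specs themselves. Here's a minimal sketch of a model spec from a todo app; the Todo model and its title validation are assumptions, so adjust for your own app:

```ruby
# spec/models/todo_spec.rb -- a sketch; the Todo model and its title
# validation are assumptions, not necessarily the example app's actual code.
require 'rails_helper'

RSpec.describe Todo, type: :model do
  it 'is invalid without a title' do
    expect(Todo.new(title: nil)).not_to be_valid
  end
end
```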
You may notice that we're repeating docker-compose exec test over and over again. One thing that makes this less tedious is using shell aliases similar to the ones we defined in part 2 of this series.
Capybara Setup
We're using the ubiquitous Capybara for our feature tests.
Running Capybara in a browser with Docker is a bit of a pain. We do not have access to a windowing system, since Docker is basically CLI-only. Thankfully, the Selenium project provides a Docker image that runs a virtual windowing environment. We can connect to this virtual environment with a VNC viewer.
Installing Capybara is out of the scope of this post, but you can find the installation instructions here.
Running tests outside of the browser is as easy in Docker as it is natively. Here's an example with my sample todo app mentioned earlier:
docker-compose exec test rspec spec/features/todos/index_spec_no_js.rb
Easy peasy. Out of the box, this uses the rack_test driver, which cannot test pages that rely on JavaScript. Most pages need JavaScript testing, so I'll show you the easiest way to achieve that in Docker.
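For context, the spec run above could look something like the following sketch; the /todos path and the content assertion are assumptions, so adjust them to your own app:

```ruby
# spec/features/todos/index_spec_no_js.rb -- a sketch; the /todos path and
# the page content are assumptions. There is no js: true tag, so this spec
# runs under Capybara's default rack_test driver.
require 'rails_helper'

RSpec.describe 'Todos index (no JS)', type: :feature do
  it 'lists existing todos' do
    visit '/todos'
    expect(page).to have_content('Todos')
  end
end
```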
Running feature tests under Capybara/Selenium
Our Capybara setup needs a few tweaks in order to run against a Selenium server. The rack_test driver is Capybara's default, but it's easy enough to change.
In our test service, we have to tell Capybara which Docker host Selenium is running on. We do that by adding an environment variable to our docker-compose.yml file:
environment:
- SELENIUM_REMOTE_HOST=selenium
We also have to configure Capybara in spec/spec_helper.rb:
require 'capybara/rspec'
Capybara.server = :puma
Capybara.server_host = '0.0.0.0'
Capybara.server_port = 4000
Capybara.run_server = true
Capybara.app_host = 'http://test:4000'
Capybara.javascript_driver = :selenium_remote_chrome
Capybara.register_driver :selenium_remote_chrome do |app|
Capybara::Selenium::Driver.new(
app,
browser: :remote,
url: "http://#{ENV['SELENIUM_REMOTE_HOST']}:4444/wd/hub",
capabilities: :chrome
)
end
In the example above, we tell Capybara to start a puma server on 0.0.0.0:4000 (listening on all interfaces, not just localhost). Since we're running under the test service and Docker automatically assigns that hostname, we tell Capybara that our test app instance will be found at http://test:4000. Finally, we register our Selenium driver using the SELENIUM_REMOTE_HOST configuration that we added to our docker-compose.yml under the environment key.
There's a full example here which shows how both the test and selenium services are configured.
In the example application referenced above, note that I am matching Selenium version numbers in docker-compose.yml and the Gemfile. If you set this up yourself, you'll have to ensure that the version of the selenium-webdriver gem matches that of the Selenium service you're running inside of Docker.
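As a sketch of what that pinning might look like (the version number here is an assumption; use whichever release matches the selenium/standalone-chrome image tag you pull):

```ruby
# Gemfile (sketch) -- "4.1" is an assumed version; keep it in step with the
# selenium/standalone-chrome image tag used in docker-compose.yml.
group :test do
  gem "capybara"
  gem "selenium-webdriver", "~> 4.1"
end
```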
Browser/JavaScript testing
Selenium offers a Docker image which includes a Selenium server with an embedded browser to run our tests against. In order to take advantage of that, put the following in your docker-compose.yml:
selenium:
image: selenium/standalone-chrome
environment:
- VNC_NO_PASSWORD=1
networks:
- backend
ports:
- '4444:4444'
- '5900:5900'
The cool thing about this is that the Selenium image bundles a windowing system within it (xvfb), so we can run our tests in a browser. It also has a VNC server, which we can connect to with a VNC viewer in order to watch our browser tests run.
In order to view your browser tests, you need a VNC viewer. The built-in VNC viewer in macOS will not work, so I recommend RealVNC Viewer. Open a connection to localhost:5900 (no password). You should see a banner across the VNC connection that says "Selenium Grid."
Run your browser tests:
docker-compose exec test rspec spec/features/todos/index_spec.rb
and switch back to the VNC viewer window. You should see your tests running in a Chrome browser. Pretty neat!
Viewing other feature tests in a browser is as simple as tagging them with js: true, switching over to your VNC window, and watching them run in the embedded Chrome browser.
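As a sketch, the tagged version of the earlier feature spec might look like this; again, the path and content assertions are assumptions, so adjust them for your app:

```ruby
# spec/features/todos/index_spec.rb -- a sketch; the /todos path and page
# content are assumptions. The js: true tag switches Capybara to the
# :selenium_remote_chrome driver we registered above.
require 'rails_helper'

RSpec.describe 'Todos index', type: :feature, js: true do
  it 'lists existing todos' do
    visit '/todos'
    expect(page).to have_content('Todos')
  end
end
```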
This concludes this blog series about developing Rails applications with Docker. I hope you've found it useful, and if for any reason you get stuck, feel free to clone the example application.
Previous: Wrangling Complex Rails apps with Docker Part 2: Creating a docker-compose configuration
Image Attributions
Photo by Bill Oxford on Unsplash