Testing Rails loggers with minitest

Having good logs around gives you insight into what's currently happening in your application. It also gives you a track record to fall back on when reacting to users' inquiries and issue reports.

And in a way, logging is also a type of behavior. It's not as user-facing as things that happen in the UI, but it has its internal users. Good logging might very well be part of the requirements of a feature you are building, especially if it's a backend-heavy feature or a refactoring. Sometimes a specific log message is even the only behavior that happens:

if x
  Go.calculate!
else
  logger.info("Abort calculation, nothing to do here")
end

And often the messages are quite dynamic:

logger.info("Great #{Calculation.get} value for #{X.count} objects")

So let's test our loggers!

A good starting point for testing your Rails loggers is to inject the logger as a dependency into the service you are building instead of hard-coding a dependency on it.
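For contrast, this is roughly what the hard dependency looks like (a sketch with a made-up class name, not the converter we build below):

# A sketch of the hard-coded variant: the service reaches for the global
# Rails.logger directly, so tests can only hook in by stubbing that global.
class HardcodedConverter
  def convert
    Rails.logger.info("Start conversion")
    # ... conversion code
  end
end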

Let's imagine you convert a series of images to a video at a user's request.

# services/images_converter.rb
class ImagesConverter
  def initialize(images:, logger: Rails.logger)
    @images = images
    @logger = logger
  end

  attr_reader :images, :logger

  def convert
    logger.info("Start conversion with #{images.count} images")
    return unless valid_conversion?
    
    logger.info("Conversion step 1")
    
    # ... conversion code
    
    logger.info("Finished conversion 🎉")
  end

  private

  def valid_conversion?
    if images.any?(&:invalid?)
      logger.info("Invalid images detected, stop conversion")
      return false
    end

    true
  end
end

This converter is a pretty backend-heavy beast. There might be a lot going on, and we want to track the different conversion steps, including validation and any special handling that is yet to come.

We pass Rails.logger as the default argument because you won't always care about a custom logger, but injecting one can be useful, even outside of tests.

For example, you could decide to have a custom logger that writes everything around conversion into a conversion.log file, instead of having it get lost in the heavily frequented Rails.logger output. Or pass a different logger with different converter invocations? Or the same logger with different parameters for different Sidekiq workers? There are many reasons to inject it as a dependency here.
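A minimal sketch of what such a dedicated logger could look like; the log file name and the tag are assumptions for illustration, and images stands in for the collection you want to convert:

# A dedicated logger writing to log/conversion.log (file name assumed here),
# wrapped in tagged logging so every line carries some context.
conversion_logger = ActiveSupport::TaggedLogging.new(
  Logger.new(Rails.root.join("log", "conversion.log"))
)

conversion_logger.tagged("ImagesConverter") do
  ImagesConverter.new(images: images, logger: conversion_logger).convert
end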

To test the logs in the example above, I'd go for a happy path test first. It can get a bit tricky to test multiple invocations of an object's method with different parameters in minitest (the mocking here comes from Mocha), so here's how it goes:

# test/services/images_converter_test.rb
class ImagesConverterTest < ActiveSupport::TestCase
  test "#convert logs the happy test path" do
    logger = Rails.logger
    images = FactoryBot.create_list(:image, 3)
    
    expected_messages = [
      "Start conversion with 3 images",
      "Conversion step 1",
      "Finished conversion 🎉"
    ]

    logger.expects(:info).times(3).with do |actual|
      actual == expected_messages.shift
    end

    ImagesConverter.new(images: images, logger: logger).convert
  end
end

From the code above, this is probably the most interesting piece when testing loggers:

    logger.expects(:info).times(3).with do |actual|
      actual == expected_messages.shift
    end

You are probably more used to this kind of usage:

logger.expects(:info).with("logged!")

But .with can also take a block instead of just values in its parameter list. Within the block above, we shift the very first value out of the expected_messages array and compare it to the actual value we got from the tested code. If the block returns a truthy value, the invocation counts as a match.

Keep in mind that both the order of expected_messages and the count passed to .times(3) matter.
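The block also comes in handy for the dynamic messages mentioned earlier: instead of comparing full strings, you can match patterns. A sketch, with made-up patterns:

# Match dynamic log messages by pattern instead of exact string equality.
expected_patterns = [
  /\AStart conversion with \d+ images\z/,
  /\AConversion step 1\z/,
  /\AFinished conversion/
]

logger.expects(:info).times(expected_patterns.size).with do |actual|
  actual.match?(expected_patterns.shift)
end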

With highly volatile code that's expected to change often, I also like to determine dynamically how many times we expect messages to come in:

    expected_messages = [
      "Start conversion with 3 images",
      "Conversion step 1",
      "MANY MORE CONVERSION STEPS WILL COME HERE",
      "Finished conversion 🎉",
    ]

    # Let's softcode the .times() invocation:
    logger.expects(:info).times(expected_messages.size).with do |actual|
      actual == expected_messages.shift
    end
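If shifting messages out of an array feels too implicit, Mocha sequences are an alternative that makes the expected order explicit, roughly like this:

# One expectation per message, ordered via a Mocha sequence.
log_order = sequence("conversion logs")

expected_messages.each do |message|
  logger.expects(:info).with(message).in_sequence(log_order)
end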

And once this is done, we can add another test for the case where conversion stops early because the images are invalid:

# test/services/images_converter_test.rb
class ImagesConverterTest < ActiveSupport::TestCase

  # ... 
  test "#convert logs start and validation issue when invalid" do
    logger = Rails.logger
    image = FactoryBot.create(:invalid_image)
    
    expected_messages = [
      "Start conversion with 1 images",
      "Invalid images, stop conversion"
    ]

    logger.expects(:info).times(2).with do |actual|
      actual == expected_messages.shift
    end
    
    ImagesConverter.new(images: [image], logger: logger).convert
  end
end

If you grow to have many different unhappy paths in the validation step, you could extract setting the expectation into a helper method:

# test/services/images_converter_test.rb
class ImagesConverterTest < ActiveSupport::TestCase

  def set_invalid_log_expectation(logger, invalid_message, count: 1)
    expected_messages = [
      "Start conversion with 1 memory images",
      invalid_message
    ]

    logger.expects(:info).times(2).with do |actual|
      actual == expected_messages.shift
    end
  end
  
  test "#convert logs start and validation issue when invalid" do
    logger = Rails.logger
    image = FactoryBot.create(:invalid_image)

    set_invalid_log_expectation("Invalid images, stop conversion")
    
    ImagesConverter.new(images: [image], logger: logger).convert
  end
  
  test "#convert logs start and validation issue when too many pegs" do
    logger = Rails.logger
    pegs = FactoryBot.create_list(:peg, 100)
    
    set_invalid_log_expectation("Can't convert 100 pegs, stop conversion")
    
    # (you'd need to change a bit of code in ImagesConverter to accept pegs and produce this log)
    ImagesConverter.new(images: pegs, logger: logger).convert
  end
end

💬
That's it for logger tests. If you have any comments, drop them here 👇