Cucumber scenarios that depend on Sphinx

June 1, 2009

I love writing apps that make heavy use of search indexes, but testing them can be a bit of a pain. Here's how I got Thinking Sphinx to play nicely with Cucumber.

Here is the relevant part of what I put in features/support/env.rb:

# Cucumber::Rails.use_transactional_fixtures

require 'database_cleaner'
DatabaseCleaner.strategy = :truncation

# Clean out the database before each scenario
Before do
  DatabaseCleaner.clean
end

# Build the index and start the Sphinx daemon, shutting it down when Ruby exits
ts = ThinkingSphinx::Configuration.instance
FileUtils.mkdir_p ts.searchd_file_path
ts.controller.index
ts.controller.start
at_exit do
  ts.controller.stop
end

# Deltas and updates are disabled by default in the test environment
ThinkingSphinx.deltas_enabled = true
ThinkingSphinx.updates_enabled = true
ThinkingSphinx.suppress_delta_output = true

# Re-generate the index before each Scenario
Before do
  ThinkingSphinx::Configuration.instance.controller.index
end
What’s going on here?

Start by commenting out the line about using transactional fixtures in env.rb. Using transactional fixtures will run each scenario inside of a transaction and roll it back at the end of the scenario to revert the database state. Thinking Sphinx uses an after_commit callback for kicking off the delta indexing, but the callback never gets run when transactional fixtures are enabled because the entire scenario is run inside of a big transaction.
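To see why that matters, here is a toy model of an after_commit callback queue. This is just an illustration of the mechanism, not ActiveRecord's actual implementation: callbacks registered during a transaction are queued and only fire on a real commit, never on a rollback.

```ruby
# Toy model: after_commit callbacks are queued during a transaction and
# only fire on a real COMMIT. A rollback discards them without running them.
class ToyTransaction
  def initialize
    @after_commit = []
  end

  def after_commit(&block)
    @after_commit << block
  end

  def run(rollback = false)
    # ... the scenario's database work happens here ...
    return if rollback           # rolled back: callbacks never fire
    @after_commit.each(&:call)   # committed: callbacks fire now
  end
end

fired = false
txn = ToyTransaction.new
txn.after_commit { fired = true }

txn.run(true)    # like a scenario wrapped in a transactional fixture
puts fired       # => false -- delta indexing never gets kicked off

txn.run(false)   # a real commit
puts fired       # => true
```

With transactional fixtures enabled, every scenario behaves like the rollback case, which is why the delta indexing never runs.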

Once we’ve disabled transactional fixtures, our test database will start to fill up, likely causing some problems. So we need to add a Before block that clears out the database before each scenario. I’m using database_cleaner, which gives you some different strategies for cleaning the database. Alternatively, the brute-force solution is just to reload the schema before each scenario, but this is slower than truncating the data.

Before do
  ActiveRecord::Schema.verbose = false
  load "#{RAILS_ROOT}/db/schema.rb"
end

Next, we start Sphinx when env.rb is loaded, and shut it down when the Ruby process exits. We also enable deltas and updates, which are disabled by default in test mode. Finally, we define a Before block that updates the index before each scenario so we don’t end up with a stale index.

Putting it all together

I’m using Sphinx’s delayed delta support, so whenever I update records, I need to have delayed_job process jobs. Instead of trying to get delayed_job to run in the background, I took the easy way out and defined a step: “When the system processes jobs”.

Scenario: Posting a new listing
  Given I am logged in as "MovinMan"
  When I create a new listing titled "Lots of Boxes" near "49423"
  And the system processes jobs
  And I browse listings near "49423"
  Then I can see a listing titled "Lots of Boxes"

Which is just implemented as:

When 'the system processes jobs' do
  Delayed::Job.work_off
end

If you’re just using the default deltas, and not delayed deltas, then you can update the index like this:

When /^the system updates the index$/ do
  ThinkingSphinx::Configuration.instance.controller.index
end

I hope that helps. Post your suggestions in the comments for improving this.



I am Brandon Keepers, and I work at GitHub on making Open Source more approachable, effective, and ubiquitous. I tend to think like an engineer, work like an artist, dream like an astronaut, love like a human, and sleep like a baby.