
Developing Tests for Islandora Modules


Islandora implements a fairly specialized extension of Drupal's SimpleTest module, providing a great deal of specific Islandora-related functionality. Several libraries are included to make it easy to assert that objects and their datastreams are as they should be.

This document is designed to help developers create tests using the existing framework, as well as to provide some light information on expanding that framework.


Table of Contents

The Base Test Classes

What tests require

A base test skeleton

Customizations available to tests

Available test classes

Available utilities

Available assertions

Datastream validation: a special case

Adding new utilities and assertions

Running your test

Getting Your Test To Run In Travis-CI

Enabling the Travis-CI webhook

Creating a .travis.yml

The Travis-CI Islandora environment

A sample .travis.yml

Troubleshooting Travis builds

The Base Test Classes

Tests in Drupal are defined at the end of a tree of classes extended from the DrupalTestCase class. The hierarchy looks a bit like:

DrupalTestCase
  DrupalUnitTestCase
    IslandoraUnitTestCase
      WhateverTestCase
  DrupalWebTestCase
    IslandoraWebTestCase
      WhateverTestCase
      IslandoraCollectionWebTestCase
        WhateverTestCase

The WhateverTestCase entries are the actual test classes you would be creating. These test classes define the test as it shows up in the list of tests, as well as what the test does.


What tests require

Tests are contained within a class, ending in TestCase, that is extended from a parent TestCase class (e.g. class WhateverWebTestCase extends IslandoraWebTestCase). These classes require the following two methods:

  • getInfo(), a public static function returning an array containing name (a string containing the name of the test), description (a description for the test), and group (a group name to gather several tests under, so that they can be run together in Drush or found in the same fieldset in the testing UI).
  • At least one public function beginning with test (e.g. public function testSomething()). This is the method that will run your test code and add assertions to the result set.

Tests should probably contain:

  • A public setUp() function. This is not necessary unless you want to enable additional modules besides Islandora during the setup phase, or establish global settings across all of the tests in the class. Both are detailed in the base skeleton below.

Tests should ABSOLUTELY NOT CONTAIN an extension of the test tearDown() function WITHOUT CALLING parent::tearDown(). THIS CANNOT BE STRESSED ENOUGH. The existing parent tearDown() functionality does everything in its power to restore the Fedora environment to the state it was in before the tests began. Changing this functionality through extensions of the method can irreparably damage your Fedora repository. Do not extend this method without calling parent::tearDown().
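
If your tests do need extra cleanup of their own, the safe pattern is a minimal override that does its work and then hands control back to the parent. A sketch (the variable being deleted is purely hypothetical):

  /**
   * Implements tearDown().
   */
  public function tearDown() {
    // Any extra cleanup your own tests require goes here; this variable is a
    // hypothetical example.
    variable_del('mymodule_some_test_setting');
    // Always call the parent so the Fedora environment is restored to its
    // pre-test state.
    parent::tearDown();
  }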


A base test skeleton

/**
 * @file
 * Base test skeleton.
 */

/**
 * Base skeleton class.
 */
class BaseSkeletonTestCase extends IslandoraWebTestCase {

  protected $classVariable;

  /**
   * Implements getInfo().
   */
  public static function getInfo() {
    return array(
      'name' => 'Base Skeleton Tests',
      'description' => 'A base skeleton test set.',
      'group' => 'Sample Tests',
    );
  }
  
  /**
   * Implements setUp().
   */
  public function setUp() {
    // Here, you can establish class variables before the actual setup phase.
    // After the setup is complete, the Drupal database will mimic a fresh
    // install, so this is the time to grab anything you need.
    $this->classVariable = variable_get('some_variable', NULL);
    // Now, we run the parent setUp(), which takes an array of strings
    // representing modules to enable.
    parent::setUp(array('module_1', 'module_2'));
    // After here, if we'd like, we can use those class variables.
    variable_set('some_variable', $this->classVariable);
  }
  
  /**
   * A sample test.
   */
  public function testSomething() {
    // Test goes here.
    $thing = do_something();
    // By making an assertion, a result is added to the test's result set.
    $this->assertTrue($thing, t("Doing something returned true"));
  }
  
  /**
   * Another sample test.
   */
  public function testSomethingElse() {
    // Another test goes here.
  }

}

Running "Base Skeleton Tests" or "BaseSkeletonTestCase" (or all "Sample Tests") will cause both testSomething() and testSomethingElse() to run.


Customizations available to tests

At present, tests can be customized in two important ways:

  • When performing web tests only, adding Drupal users to the class variable $users array allows you to define a set of users whose Fedora objects should be purged after tests are run. Users who are logged in during the test using drupalLogin() are added to this array automatically. Generally, this array shouldn't be touched, and care should be taken not to add users that exist in the actual Drupal database to it.
  • When performing web tests or unit tests, setting the class variable $deleteObjectsOnTeardown to TRUE or FALSE allows you to determine whether objects created during the tests should be purged. This can be very useful for debugging, and can also be used to persist a single object across several test methods (as opposed to recreating the object in every test method), speeding up your tests.
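
For example, a minimal sketch of a web test class that keeps the objects it creates around after teardown (the class and module names here are placeholders):

class SomeDebuggingTestCase extends IslandoraWebTestCase {

  /**
   * Implements setUp().
   */
  public function setUp() {
    parent::setUp(array('some_module'));
    // Leave objects created during these tests in the repository on teardown;
    // handy while debugging, or when persisting one object across several
    // test methods.
    $this->deleteObjectsOnTeardown = FALSE;
  }

}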

Available test classes

There are actually more test classes available than these three, but they are almost exclusively module-specific. Most tests you build should extend one of these three classes:

  • IslandoraUnitTestCase: A base unit test case with a connection to Fedora, but no connection to the Drupal database.
  • IslandoraWebTestCase: A base web test case with a connection to both Fedora and the Drupal database.
  • IslandoraCollectionWebTestCase: A web test case extended from IslandoraWebTestCase that includes some basic collection manipulation functionality.

Available utilities

NOTE: These utilities are supplementary to the utilities available from SimpleTest out of the box. An ancient and highly incomplete list of utility functions in the base SimpleTest API can be found at https://www.drupal.org/node/265762; however, it shouldn't be considered a useful reference, and it is recommended to check the actual DrupalWebTestCase to see what's available to you. (Note from QA Dan: This isn't a recommendation I would ever make anywhere else, but hot dang, drupal.org, get ye some post-2008 docs for this!)

To make sure that they are consistently available across separate, unrelated test classes, many test utilities have been moved to a special test utility framework that gets called magically during tests and returns test results in a format compatible with SimpleTest. Because they are called magically, they may not show as available in an IDE that autocompletes code and checks for methods through the class hierarchy. They are nonetheless available for use.

More methods are available than are described below; however, these have been omitted as they are generally used by the testing framework and are not expected to be called during tests.

  • ingestConstructedObject() (web and unit tests): Constructs and ingests an object into Fedora using tuque and the given parameters. If no parameters are given, a basic object will be ingested; otherwise, the method accepts a $properties array and a $datastreams array (check the code description block for more details). Returns the object if it was created, or FALSE on failure.
  • deleteUserCreatedObjects() (web and unit tests): Deletes all objects owned by the given $username, a string representing the owner to search for. Refuses to delete any objects owned by the configured $this->configuration['admin_user']. Returns TRUE if all objects were removed, or FALSE if any remained after attempted removal.
  • getObjectFromPath() (web tests only): Gets a tuque object given $path, the path to try to extract a PID and load an object from. Returns the object if it is found; otherwise FALSE.
  • deleteObject() (web tests only): Deletes an object through the UI. Parameters: $pid, the PID of the object; $button, a string representing the text on the deletion button (if NULL, defaults to "Permanently remove '(label)' from repository"); $safety, which, if TRUE, forces the function to only delete objects owned by users in the $users array.
  • drupalPostByID() (web tests only): Mimics the SimpleTest drupalPost() function, but allows you to select the submit button by ID, which is useful when multiple submit buttons on the page have the same label. Parameters: $path, the path to the post form; $edit, field data in a drupalPost-compatible associative array; $submit, the label on the submit button; $id, the ID of the correct button; $options, options to be forwarded to url(); $headers, an array of additional HTTP request headers, each as 'name: value'; $form_html_id, an optional HTML ID of the form to be submitted; $extra_post, a string containing additional data to append to the POST submission. Returns the content returned from the POST request, or FALSE if it fails.
  • createTestCollection() (collection web tests): Creates a basic collection in the top-level collection. Parameters: $label, the collection label; $models, a string representing the collection's configured content model, or an array of strings for multiple; $pid, a PID or namespace to use for the collection. No return value.
  • deleteTestCollection() (collection web tests): Attempts to delete a collection through the user interface, and then through tuque; if UI deletion fails, a fail assertion will be generated. Parameter: $pid, the PID of the collection to delete. No return value.
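
As a rough illustration of the most commonly used of these, here is a hedged sketch of ingesting a simple object; the exact keys accepted in $properties and $datastreams should be confirmed against the method's code description block, as the ones shown are assumptions:

// With no parameters, a basic object is ingested.
$object = $this->ingestConstructedObject();

// A more elaborate ingest; the property and datastream keys used here are
// assumptions to be checked against the code description block.
$properties = array(
  'label' => 'Test Object',
  'models' => array('islandora:collectionCModel'),
);
$datastreams = array(
  array(
    'dsid' => 'TEST',
    'string' => 'Hello!',
    'mimetype' => 'text/plain',
  ),
);
$object = $this->ingestConstructedObject($properties, $datastreams);
if ($object === FALSE) {
  $this->fail(t('Failed to ingest the test object.'));
}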

Available assertions

NOTE: These assertions are supplementary to the assertions available from SimpleTest out of the box. These assertions can be found here: https://www.drupal.org/node/265828

Assertions are the actual purpose of your tests, as they pass information back to the user about what has passed and failed. Each of these assertions adds a 'pass' or 'fail' entry to the test result set.

Assertions always return TRUE or FALSE depending on whether they passed or failed. This is done so that you can tailor your tests to the results of the assertions, rather than run all of the assertions in a giant stack one after the other. For example, if you assert that an object doesn't exist, there's no reason to then attempt to assert the contents of one of its datastreams, so something like:

$object = islandora_object_load('PID');
$object_is_real = $this->assertFedoraObject($object);
if ($object_is_real) {
  $this->assertDatastreams($object, array('DSID'));
}

would be helpful in this case.

  • assertDatastreams() (web and unit tests): Asserts that the given DSIDs represent datastreams that exist on a given object. Parameters: $object, a loaded Fedora object to check; $datastreams, an array of DSIDs to check for.
  • assertNoDatastreams() (web and unit tests): Asserts that the given DSIDs do not represent datastreams that exist on a given object. Parameters: $object, a loaded Fedora object to check; $datastreams, an array of DSIDs to check for.
  • assertFedoraObject() (web and unit tests): Asserts that the given object is a Fedora object. Parameter: $object, the object to check.
  • assertError() (web tests): Asserts that an error was found on the page. Parameters: $message, a message to pass on to the test results; $group, the group that the message should belong to.
  • assertWarning() (web tests): Asserts that a warning was found on the page. Parameters: $message, a message to pass on to the test results; $group, the group that the message should belong to.
  • assertNoError() (web tests): Asserts that no errors can be found on the page. Parameters: $message, a message to pass on to the test results; $group, the group that the message should belong to.
  • assertNoWarning() (web tests): Asserts that no warnings can be found on the page. Parameters: $message, a message to pass on to the test results; $group, the group that the message should belong to.
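
For instance, a brief sketch of how the page-level assertions might be used after submitting a form in a web test (the form path, field and button label are placeholders):

// Submit a hypothetical form, then confirm that no Drupal errors or warnings
// were rendered on the resulting page.
$edit = array('some_field' => 'some value');
$this->drupalPost('admin/islandora/some_form', $edit, t('Save'));
$this->assertNoError('No errors were produced when saving the form.', 'Islandora');
$this->assertNoWarning('No warnings were produced when saving the form.', 'Islandora');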

Datastream validation: a special case

A special assertion is available to use, validateDatastreams(), which attempts to assert whether or not the datastreams of an object are the type of file they claim to be. This is done by performing binary assertions on the content of the file, rather than checking mime types or extensions, for the sake of accuracy.

validateDatastreams() takes two parameters:

  • $object, a loaded Fedora object to check
  • $datastreams, an array of datastream arrays to check. These are formatted like so:
$datastreams = array(
  // Most datastream validators simply take the DSID as the first item in the
  // array, and the datastream validator class prefix as the second item.
  array('DSID', 'prefix'),
  // Some datastream validators take additional parameters, set as the third
  // item in the array.
  array('DSID2', 'prefix2', array('arg1', 'arg2')),
);

So, how do you determine the prefix? Good question. Each validator for a particular file type is actually a class extended from the abstract DatastreamValidator class. These classes use the naming convention "PREFIXDatastreamValidator", and it's that PREFIX that validateDatastreams() is looking for. Examples can be found in the classes included in islandora/tests/includes/datastream_validators.inc.

On construction, datastream validators load the object into the class variable $object, and store the DSID in the class variable $datastream. They then dump the contents of that datastream into the class variable $datastreamContent, making it available for use by assertions within the class.

The design is such that by simply extending the DatastreamValidator class and using that naming convention, you can create datastream validators of your own for file types not included in the base Islandora datastream validator set. All a datastream validator class needs in order to work is at least one method whose name begins with the prefix "assert" and which makes at least one call to addResult(). For example:

class TestDatastreamValidator extends DatastreamValidator {

  public function assertSomeThing() {
    if ($this->datastreamContent === "Hello!") {
      $this->addResult(TRUE, "It passed!");
    }
    else {
      $this->addResult(FALSE, "It failed!");
    }
  }
}

If, during your test, you were to load an object $object with a datastream "DSID" containing just the word "Hello!" as its binary content and run the following:

$object = islandora_object_load('hello:object');
$this->validateDatastreams($object, array(
  array('DSID', 'test'),
));

then a passing test result with the message "It passed!" would be added to the test result set.


Adding new utilities and assertions

Because there is no guarantee that any individual utility or assertion will be run from a particular test class type, test utilities have their own class that generates test results in a consistent format which the base Islandora test cases are able to parse. If a new utility or assertion can be considered 'useful' to any test class and not just to the class it's currently being used in, or if it needs to be used across both unit and web tests, it should be added to Islandora using this framework.

The test utility framework, found in islandora/tests/includes/test_utility_abstraction.inc, works like so:

  1. An object extended from the test utility abstraction is created (e.g. $utilities = new IslandoraTestUtilities($this->configuration, array('db_access' => TRUE));).
  2. A test utility is run from this object that makes an assertion of some kind.
  3. The results of that assertion are expressed as a boolean, with TRUE for a pass and FALSE for a fail.
  4. A call is made to addResult() from within the test utility object. This should add a new IslandoraTestUtilityResult to the test utility class's $results array.
  5. The base test class calling out to the test utility class runs the test utility object's getResults() method to grab the result set.
  6. The base test class then iterates through the result set, and passes the results on to SimpleTest. Results have the methods getMessage(), getCaller() and getType() available to them to pull the relevant information required by SimpleTest's assert() method.

Both IslandoraUnitTestCase and IslandoraWebTestCase implement the PHP magic __call() method, which is run any time a method is called that can't be found in the class. Both of them create an IslandoraTestUtilities object and attempt to run the method being called. Both also implement a method, parseUtilityResults(), specifically to give SimpleTest an assertion based on the results passed back from the utility method.

When creating and running test utilities, you can either create a new IslandoraTestUtilities object and call parseUtilityResults() manually, or you can run utilities as $this->testUtility() from within the base test class and have it do the work for you.
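
A condensed sketch of the two approaches, using the utilities and assertions described earlier (the exact parseUtilityResults() signature should be confirmed against the base test classes):

// Option 1: call the utility directly on the test case; __call() builds the
// IslandoraTestUtilities object behind the scenes and parses the results.
$object = $this->ingestConstructedObject();

// Option 2: build the utilities object yourself and parse the results
// manually.
$utilities = new IslandoraTestUtilities($this->configuration, array('db_access' => TRUE));
$utilities->assertFedoraObject($object);
$this->parseUtilityResults($utilities);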


Running your test

Before your test can run, the following steps must be completed:

  1. You must have a test configuration file set up in your islandora/tests folder. Islandora tests look for a test_config.ini file in this folder, and if it isn't found, they fall back to a default.test_config.ini file. If neither is found, the tests will refuse to run.
  2. If 'use_drupal_filter' is flagged in your test configuration, then your drupal_filter.xml must be writable by Drupal. This may mean changing its permissions, or adding the webserver user to a particular group; it's up to you how this gets done, but it is a strict requirement. Islandora will refuse to run any tests if this step is not completed.
  3. A reference to the file containing the tests must be added to the module's .info file. This simply means creating a line in the .info file that reads files[] = path/to/your/test.test, with the path beginning relative to your module's root folder.
  4. The Drupal cache should then be cleared. A reference to your test class must exist in the registry for SimpleTest to pick it up, and clearing the cache is the easiest way to get this done.

The test configuration file should contain the following entries, under a [fedora] group:

  • fedora_url, the base path to your Fedora install
  • use_drupal_filter, a boolean designating whether or not the Drupal filter should be used (for now, this is almost exclusively required to be TRUE)
  • drupal_filter_file, the absolute path to your drupal_filter.xml
  • admin_user, the Fedora administrator username
  • admin_pass, the Fedora administrator password
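
Put together, a test_config.ini might look something like the following; every value here is a placeholder to be replaced with the details of your own environment:

[fedora]
fedora_url = "http://localhost:8080/fedora"
use_drupal_filter = TRUE
drupal_filter_file = "/usr/local/fedora/server/config/filter-drupal.xml"
admin_user = "fedoraAdmin"
admin_pass = "fedoraAdmin"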

After this, your test can be run in one of two ways:

  • From the user interface, at yoursite.com/admin/config/development/testing. Search for the group you established in your test class's getInfo(); your tests will be contained inside that fieldset.
  • Via drush, using drush test-run. When running tests via drush, always use the --uri flag to designate where the tests are being run, even if that's just http://localhost. Drush accepts either a group name (e.g. drush test-run --uri=http://localhost "Test Group") or a test class (e.g. drush test-run --uri=http://localhost ExampleTestCase) for parameters of what to run.

Getting Your Test To Run In Travis-CI

We would prefer that, at the very least, all modules that are part of Islandora core have their commits checked by Travis-CI, including having their tests run. This isn't actually a complicated process, but it does require a few steps.


Enabling the Travis-CI webhook

Once your module is created, you can add it to Travis by:

  • Logging into travis-ci.org with your GitHub account, then
  • Navigating to your account page (travis-ci.org/profile/username), and then
  • Possibly clicking the 'Sync now' button to get the most up-to-date repository information from GitHub, and then
  • Switching the 'Off' switch to 'On'.

From here on out, by default, pushes and pull requests to that repository will trigger Travis builds. You can change this behaviour at travis-ci.org/username/module_name/settings.


Creating a .travis.yml

The .travis.yml file defines how the testing environment is set up and how tests are run, and is a hard requirement for properly running tests in Travis. The file must be placed in the root folder of your module; otherwise, Travis will not properly detect and parse it.

The Travis-CI Islandora environment

A shell script is included in islandora/tests/scripts/travis_setup.sh that establishes an Islandora environment during a Travis build. This includes the following:

  • A full installation of Fedora at /home/travis/islandora_tomcat, including Solr. Available Fedora versions are 3.5, 3.6.2 and 3.7.0.
  • Drush 6.3
  • PHP CodeSniffer 1.5.6
  • PHP copy-paste detection
  • A full installation of the latest version of Drupal, located in a version-numbered folder at /home/travis, running at http://localhost:8081, with logs written to /tmp/drush_webserver.log.

The Drupal installation includes the following modules downloaded and enabled:

  • Coder 7.x-2.4
  • potx 7.x-1.0
  • SimpleTest
  • Coder Review

A sample .travis.yml

# The language section tells Travis how to customize the environment. We tend
# to test against PHP 5.3 through 5.5.
language: php
php:
  - 5.3.3
  - 5.4
  - 5.5
# Restricting the branches to test against makes sure that we're only testing
# updates to the main branch.
branches:
  only:
    - /^7.x/
# The environment matrix lets us set environment variables to use when
# establishing our test environment and running tests. FEDORA_VERSION is used
# by travis_setup.sh to determine what version to download.
env:
  matrix:
    - FEDORA_VERSION="3.5"
    - FEDORA_VERSION="3.6.2"
    - FEDORA_VERSION="3.7.0"
  global:
    # Global variables across all tests can be established here.
# The before_install script is used to set up the environment. This is where
# we run travis_setup.sh, install packages, and move everything to the right
# place. If anything fails here, the test script will not be run.
before_install:
  # Before doing pretty much anything else, you should clone the Islandora
  # repository and run the travis setup script.
  - cd $HOME
  - git clone -b 7.x git://github.com/Islandora/islandora.git
  - $HOME/islandora/tests/scripts/travis_setup.sh
  # Moving into $HOME/drupal-* means you don't need to know the current Drupal
  # version.
  - cd $HOME/drupal-*
  # We generally clone down modules to the home folder and then symlink them
  # into the modules folder. They can then be enabled at this step.
  - git clone -b some_branch git://github.com/account/some_module.git
  - ln -s $HOME/some_module sites/all/modules/some_module
  - drush -y -u 1 en some_module
# At this point, the test script is run.
script:
  # Islandora and its contributed modules require Drupal coding standards to be
  # met, so it's in your best interests to use the code sniffing tools at your
  # disposal as well as run your tests.
  - drush coder-review --reviews=production,security,style,i18n,potx,sniffer some_module
  # This is also the point when your actual SimpleTest tests should be run.
  # Remember that travis_setup.sh establishes the webserver at localhost:8081,
  # so set that.
  - drush test-run --uri=http://localhost:8081 "Some Module Tests"
# If the tests failed, it may be a good idea to cat the log files in an
# after_failure section. Maybe do it cleaner than this example.
after_failure:
  - cat /tmp/drush_webserver.log
  - cat $HOME/sites/default/files/simpletest/verbose/*
  - cat /home/travis/islandora_tomcat/server/logs/*

Troubleshooting Travis builds

Two chief methods exist for troubleshooting Travis builds:

  • The .travis.yml after_failure section
  • A local Travis environment

The after_failure section is appropriate for troubleshooting if you know where the error logs for the failing part should end up, and can simply cat them (or cat | grep "what i'm looking for").

If you don't, however, a local Travis environment can be used to dig around in the failed build, check the status of things, re-run tests rapidly, and see what happens. Unfortunately, building a local Travis box is an incredibly time-intensive process (about four hours would have to be set aside just to build the base box and Vagrant instance, and that assumes everything goes smoothly), so an exported Travis PHP virtual machine is available here:

Travis Build Environment

This build:

  • Uses the Travis travis-precise base box (Ubuntu 12.04)
  • Is provisioned by the chef-solo php-precise provisioner from travis-cookbooks
  • Has the login credentials 'travis' for both the username and password

To use this, you will need:

  • The above Travis build environment
  • VirtualBox, which is expected as the .ova import environment
  • Git; how to install this (or whether it even needs to be installed) depends on your operating system
  • The ability to install Ruby gems

To run the build (on a POSIX-y machine):

  • gem install travis
  • gem install bundler
  • git clone https://github.com/travis-ci/travis-build.git
  • cd travis-build
  • bundle install --gemfile Gemfile
  • ln -s /path/to/travis-build ~/.travis/travis-build
  • git clone https://github.com/your-account/your-module.git
  • cd your-module
  • travis compile m.n > build.sh where m is the number of the build you'd like to run, and n is the expanded number of the job you'd like to run from the matrix (e.g. for Islandora use cases that have 3 PHP versions and 3 Fedora versions, n 1 represents the first listed PHP version and the first listed Fedora version, n 2 represents the first listed PHP version and the second listed Fedora version, and so on)
  • NOTE: the Git repository and branch aren't populated into the resultant shell script. Until we figure out why, these will have to be added in by hand to the git.checkout portion of the script.
  • Import the Travis build environment into VirtualBox
  • Take a snapshot of it; otherwise, you'll need to re-import every time you want to re-run the build
  • Log into the environment
  • Move the build.sh script created with travis-build to your box's /home/travis/builds folder using Wizard Magic
  • chmod +x /home/travis/builds/build.sh
  • /home/travis/builds/build.sh

This should run your build. You can then revert your snapshot, copy build.sh over again, and re-run it.
