Docker and Maven for integration testing

I recently implemented LDAP authentication in our Dropwizard application. How should I integration-test it?

Here’s what I used:

* This docker image: nickstenning/slapd, which contains a basic configuration of the OpenLDAP server.
* The fabric8io docker-maven-plugin.
* Some Maven foo.

The docker image

I needed an LDAP server configured with some known groups and users for testing. I put the following Dockerfile in src/test/docker/ldap, along with a users.ldif which defines the users I wanted to load.

FROM nickstenning/slapd

# Environment variables read by the slapd image
ENV LDAP_ROOTPASS topsecret
ENV LDAP_ORGANISATION My Org
ENV LDAP_DOMAIN domain.com

# Users and groups to be loaded
COPY users.ldif        /users.ldif

# Install the LDAP client tools needed by ldapadd
RUN apt-get update && apt-get install -y ldap-utils && rm -rf /var/lib/apt/lists/*

# Start slapd in the background, load the ldif, then shut slapd down
RUN (/etc/service/slapd/run &) && sleep 2 && \
  ldapadd -h localhost -p 389 -c -x -D cn=admin,dc=domain,dc=com -w topsecret -v -f /users.ldif && killall slapd
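
With the containers running (configuration below), an integration test can bind straight to the LDAP container. Here’s a minimal sketch using plain JNDI; the DN and password are hypothetical placeholders for entries defined in users.ldif, and the IT suffix is what tells failsafe to pick the test up:

import java.util.Hashtable;

import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.DirContext;
import javax.naming.directory.InitialDirContext;

import org.junit.Test;

public class LdapAuthIT {

    @Test
    public void canBindAsTestUser() throws NamingException {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        // port 389 is mapped to the host in the docker-maven-plugin config below
        env.put(Context.PROVIDER_URL, "ldap://localhost:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        // hypothetical entry -- substitute a DN from users.ldif
        env.put(Context.SECURITY_PRINCIPAL, "uid=testuser,ou=people,dc=domain,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "password");

        // InitialDirContext performs the bind and throws if authentication fails
        DirContext ctx = new InitialDirContext(env);
        ctx.close();
    }
}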

The parent config

In my parent pom, I put the shared configuration for the fabric8io plugin into pluginManagement. This includes any images I want to run for multiple submodules (I’m using postgres for other integration tests, in combination with Liquibase to set up a test database), and the executions which apply to all submodules.

<pluginManagement>
    <plugins>
        <plugin>
            <groupId>io.fabric8</groupId>
            <artifactId>docker-maven-plugin</artifactId>
            <version>0.18.1</version>
            <configuration>                    
                <images>                       
                    <image>
                        <name>postgres:9.5.1</name>
                        <alias>pg</alias>                           
                        <run>                                    
                            <env>
                                <POSTGRES_PASSWORD>topsecret</POSTGRES_PASSWORD>
                                <POSTGRES_USER>myuser</POSTGRES_USER>
                            </env>
                            <ports>
                                <port>5432:5432</port>
                            </ports> 
                            <wait>
                                <tcp>
                                    <ports>
                                        <port>5432</port>
                                    </ports>
                                </tcp>
                            </wait>  
                        </run>
                    </image>
                </images>
            </configuration>
            <executions>
                <execution>
                    <id>build</id>
                    <phase>pre-integration-test</phase>
                    <goals>
                        <goal>build-nofork</goal>
                    </goals>
                </execution>
                <execution>
                    <id>run</id>
                    <phase>pre-integration-test</phase>
                    <goals>
                        <goal>start</goal>
                    </goals>
                </execution>
                <execution>
                    <id>stop</id>
                    <phase>post-integration-test</phase>
                    <goals>
                        <goal>stop</goal>
                    </goals>
                </execution>
            </executions>                    
        </plugin>
    </plugins>
</pluginManagement>

Remember that putting a plugin in pluginManagement in the parent does not mean it will be active in the child modules. It simply defines the default configuration which will be used, should you add the plugin to the plugins section in the child pom. This approach means we can avoid spinning up docker containers for submodules which don’t need them.

Remember to use the maven-failsafe-plugin to run your integration tests. Failsafe only fails the build at its verify goal, which runs after post-integration-test, so the docker containers are stopped even if your integration tests fail and you don’t get containers hanging around disrupting your next test run.
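
The declaration is the standard failsafe setup (version number illustrative):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.19.1</version>
    <executions>
        <execution>
            <goals>
                <!-- integration-test runs the ITs; verify fails the build
                     afterwards, by which time post-integration-test has
                     already stopped the containers -->
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>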

The sub module config

A key optimisation I wanted to make was not to run the docker-maven-plugin at all when skipping tests – if I’m skipping tests it’s for a fast build, so the last thing I want is to spin up unnecessary containers. To do this, I activated the docker-maven-plugin within a profile in the appropriate submodules, which is only active if you haven’t supplied the skipTests property (so mvn install -DskipTests skips the containers too).

<profiles>  
    <profile>
        <id>integ-test</id>
        <activation>
            <property>
                <name>!skipTests</name>
            </property>
        </activation>
        <build>
            <plugins>
                <plugin>
                    <groupId>io.fabric8</groupId>
                    <artifactId>docker-maven-plugin</artifactId>
                </plugin>
            </plugins>
        </build>
    </profile>
</profiles>

For submodules which only require the docker images defined in the parent pom, the above configuration is sufficient. However, in one submodule I also wanted to spin up the ldap image, adding to the images defined in the parent. This led me to a useful blog post on Merging Plugin Configuration in Complex Maven Projects. The combine.children="append" attribute ensures that the images declared here are appended to those inherited from the parent, rather than replacing them.

 <profile>
    <id>integ-test</id>
    <activation>
        <property>
            <name>!skipTests</name>
        </property>
    </activation>
    <build>
        <plugins>
            <plugin>
                <groupId>io.fabric8</groupId>
                <artifactId>docker-maven-plugin</artifactId>                
                <configuration>   
                    <!-- append docker images for this module -->                 
                    <images combine.children="append">
                        <image>
                            <name>test-ldap</name>
                            <build>
                                <skip>${env.skip.docker.run}</skip>
                                <dockerFileDir>${project.basedir}/src/test/docker/ldap</dockerFileDir>
                            </build>
                            <run>
                                <skip>${env.skip.docker.run}</skip>
                                <ports>
                                    <port>389:389</port>
                                </ports> 
                                <wait>
                                    <tcp>
                                        <ports>
                                            <port>389</port>
                                        </ports>
                                    </tcp>
                                </wait>  
                            </run>
                        </image>                                         
                    </images>                      
                </configuration>
            </plugin>
        </plugins>
    </build>
</profile>

Result: completely portable integration tests.


Don’t commit your target folder

or: an insidious error which resulted in Jenkins builds where the sources jar did not match the binary jar

I still can’t work out exactly how this happened. I recently made some changes to one of our Maven archetypes, and released the new version using Jenkins. I was confused to discover that projects created with the new version of the archetype did not contain the changes. I checked the git repository tag. I checked the source jar in Artifactory. Both contained the changes. I checked the binary jar and it did not (the changes were to resource files).

I checked out locally, ran mvn clean and then mvn package. The resulting binary jar was fine.

I wiped out the Jenkins workspace and repeated the build. The binary jar was created incorrectly! Cue lots of head scratching, until a quick-thinking colleague asked if I’d checked in the target directory at any point. And I HAD! So although I’d wiped the Jenkins workspace, the fresh checkout brought back the out-of-date target contents. AND the Jenkins job did not begin with mvn clean.

A lesson in better house-keeping.
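
The other half of the lesson: make sure build output can’t be committed in the first place, e.g.

# .gitignore -- Maven build output should never be committed
target/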


Finally… maven release and git flow!

I am really looking forward to using the new Maven JGit-Flow Plugin by Atlassian.

At the moment, we’re following the git flow branching structure, with master, develop and feature branches. But we’re not using git flow for releases. Our release process is as follows (sketched as commands after the list):

  • Create release/1.0.0 branch.
  • Set version on develop branch to 1.1.0-SNAPSHOT
  • Push to origin
  • QA and final bug fixing on release branch
  • Merge to master
  • Maven Release on master branch
  • Merge to develop
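
For concreteness, the manual steps look roughly like this (version numbers illustrative):

git checkout -b release/1.0.0 develop
git checkout develop
mvn versions:set -DnewVersion=1.1.0-SNAPSHOT
git commit -am "bump develop to 1.1.0-SNAPSHOT"
git push origin develop release/1.0.0

# ...QA and final bug fixing on the release branch...

git checkout master
git merge --no-ff release/1.0.0
mvn release:prepare release:perform
git checkout develop
git merge --no-ff master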

This was as far as we could streamline the process. It ensures we don’t release something which can’t be merged to master.
(I should admit: sometimes we skip the release branch and merge straight from develop to master when we are ready for a release.)

Anyway, I’m really hoping that the Atlassian plugin does away with all these manual steps. It looks like it will. The only thing it doesn’t seem to do is bump the version on the develop branch as soon as the release branch has been created. There’s a discussion here: https://bitbucket.org/atlassian/maven-jgitflow-plugin/issue/16/develop-version-on-release

I think it’s really important to do this. If work continues on the develop branch whilst release QA is underway, both the develop branch and the release branch will be installing snapshots with the same version number but different functionality. I’m sure this could lead to confusion somewhere, especially with multi-module projects. I’m pretty sure I read somewhere that the git flow model says you should bump the version on develop straight after release-start, but of course I can’t find it now. The only argument I can see for not bumping is what happens if you decide not to release after all – but I don’t see it as a problem to skip a minor version number, and if it were, it’s easy to use the Maven Versions plugin to reset the version on develop. That will be the minority case, so I think the plugin should cater first for the normal case, where the release goes ahead.

I’d be interested to know when other developers bump the version on the develop branch. Either way, I’ll certainly be trying out the plugin when I next need to do some releasing.


Writing a maven 3 plugin in groovy

I thought I’d document this, as we found it a bit confusing to get going, and the documentation is fairly sparse.

We wanted to knock up a quick maven plugin which would parse some XML files and output some HTML. Great, a chance to do some groovy foo! Parsing and writing XML with groovy is so easy that we thought we’d try writing the plugin in groovy. Once compiled, groovy is just java, so Maven shouldn’t care what the plugin is written in.

First, DON’T follow these instructions: http://www.sonatype.com/books/mcookbook/reference/writing-plugins-alternative-sect-writing-groovy.html. They only work in Maven 2.

For reference, the error message when compiling the plugin:

Failed to execute goal org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs (default) on project firstgroovy-maven-plugin: Execution default of goal org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs failed: An API incompatibility was encountered while executing org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs: java.lang.NoSuchMethodError: org.codehaus.plexus.PlexusContainer.hasChildContainer(Ljava/lang/String;)Z

The correct approach for maven 3 is documented here:
http://docs.codehaus.org/display/GMAVEN/Implementing+Maven+Plugins.

It still doesn’t work straight away. If you follow the guidelines and create the pom and mojo as on that page (specifying 1.5 for the gmaven versions), the mojo appears to install, but when you execute it from another project you get the following error:

 org.sonatype.guice.bean.reflect.Logs$JULSink warn
WARNING: Error injecting: sample.plugin.GreetingMojo
java.lang.NoClassDefFoundError: Lorg/codehaus/groovy/reflection/ClassInfo;

The answer is on this page: http://jira.codehaus.org/browse/GMAVEN-110.

We need to add the following to the gmaven-plugin configuration:

<configuration>
    <providerSelection>1.5</providerSelection>
</configuration>
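
For reference, the mojo itself is just a Groovy class with javadoc-style tags, along the lines of the GreetingMojo named in the error above (a sketch based on the GMaven sample – check the base class package against your GMaven version):

package sample.plugin

import org.codehaus.gmaven.mojo.GroovyMojo

/**
 * Says hello.
 *
 * @goal greet
 */
class GreetingMojo extends GroovyMojo {
    void execute() {
        log.info 'Hello, World'
    }
}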

I’ve attached an archive containing a working mojo and a test project: groovy-maven-plugin.tar.gz. Run mvn install on the first to install it into your local repository, then mvn compile on the second to see it execute.


Sharing junit tests with Maven

We’ve now had two cases at work where we have created a set of junit tests which we want to re-use in other maven modules. It is a two-stage process, and there are a couple of options for how to run the tests in the project which imports them.

Creating re-usable tests

First, we create the tests in the ‘tester-module’. These are standard unit tests in src/test/java. Of course they must pass, or your project will not build, so you need some sample implementations of whatever you are testing within that project. For ease of use, we put all the tests which will be exported into a single package.

To create a test-jar, which can be a dependency for other projects, add to the pom:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.3.1</version>
    
    <executions>
        <execution>           
            <id>test-jar</id>
            <goals>
                <goal>test-jar</goal>
            </goals>
            <configuration>
                <includes>
                    <include>**/exported_tests/*</include>
                </includes>
            </configuration>
        </execution>       
    </executions>    
</plugin>

This will generate a test-jar containing only the tests in the package specified.

Importing the tests

To import the tests into the module which will use them, add the following dependency:

<dependency>
    <groupId>tester-group</groupId>
    <artifactId>tester-module</artifactId>
    <version>1.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>

Running the tests

There are two ways of running the attached tests. They DO NOT get run automatically when you run mvn test.

Method 1. Extracting the dependency

Adding the following to the pom will extract the test classes into target/test-classes, so that they all get run when you run mvn test. This works well if you always want to run ALL the attached tests.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack</id>
            <phase>process-test-classes</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>tester-group</groupId>
                        <artifactId>tester-module</artifactId>
                        <version>1.0</version>
                        <type>test-jar</type>
                        <outputDirectory>${project.build.directory}/test-classes</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>

Method 2. Using test suites

If you want to run certain tests only, you can add a TestSuite to the module, and define which of the attached tests should be run as part of the test suite.

import org.junit.runner.RunWith;
import org.junit.runners.Suite;

@RunWith(Suite.class)
@Suite.SuiteClasses({TestClass1.class, TestClass2.class})
public class TestSuite {
    // nothing needed in the body; the annotations do the work
}

An afterthought: testing the tests

In the situations where we are re-using our tests, the tests themselves become a critical component of our systems, so we wanted to be able to test the tests too. We got round this by not including logic in the tests themselves. Instead, the tester-module contains a set of ‘validators’ in src/main. We can then write tests for these validators as usual. The tests in ‘exported_tests’ simply delegate to an appropriate validator, which we know has been tested:

assertTrue(validator.validate(classUnderTest));

The only difference this makes is that you have to add a normal dependency on the tester-module, as well as the test-jar dependency.
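
A sketch of the shape this takes; the validator logic here is a hypothetical stand-in:

// In src/main/java (part of the normal jar, so it can be unit tested as usual):
public class ResourceValidator {
    public boolean validate(java.io.File resource) {
        // stand-in for the real validation logic
        return resource.isFile() && resource.length() > 0;
    }
}

// In src/test/java, in the exported_tests package, so it lands in the test-jar:
import java.io.File;

import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class ResourceContentTest {
    @Test
    public void resourceIsValid() {
        // no logic here: delegate to the validator, which has tests of its own
        assertTrue(new ResourceValidator().validate(new File("src/main/resources/example.xml")));
    }
}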

We’ve found this approach very useful, as we’re using maven and junit as a framework for testing other file resources. However, I think it is useful for java code too, if you have a set of interfaces with different implementations in different modules.

References:
http://maven.apache.org/guides/mini/guide-attached-tests.html
http://softwaremavens.blogspot.co.uk/2009/09/running-tests-from-maven-test-jar-in.html


maven release, git and the parent pom

Some strange maven behaviour took up some of my time today.

A lot of our projects have some common settings, so we have a ‘parent pom’ artifact. I’ve been trying to tidy up our release procedure, and wanted to use the maven release plugin.

I thought I could be clever and define the scm settings in the parent pom like this:

<scm>
   <developerConnection>scm:git:https://our-repo-url/${repo.name}.git</developerConnection>
</scm>

All projects using this parent would then just have to define a “repo.name” property in the pom.

However, it didn’t work! When running the maven release, maven attempted to push to ‘https://our-repo-url/${repo.name}.git/artifactId’, i.e. it was appending the artifactId to the scm url. Needless to say the push failed, and the maven release failed with it.

After some googling I found that the same problem is encountered when trying to release a single module of a multi-module project (for example, see http://stackoverflow.com/questions/6727450/releasing-a-multi-module-maven-project-with-git).

I guess the maven release plugin is trying to be clever; perhaps this behaviour makes more sense in an SVN world.

To fix the problem, I defined an “scm.root” property in the parent pom, and defined the scm connection for individual projects as below:

<scm>
   <developerConnection>${scm.root}/reponame.git</developerConnection>
</scm>
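
For completeness, scm.root in the parent pom is just an ordinary property – something like this (value illustrative):

<properties>
    <scm.root>scm:git:https://our-repo-url</scm.root>
</properties>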

maven, junit, cobertura and BeanCreationException

I have a set of junit tests for my project. When I run them with mvn test, they all pass.

However, when I ran them with mvn cobertura:cobertura, they failed with the following error:

org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'com.anorakgirl.MyTest': Autowiring of fields failed;
nested exception is org.springframework.beans.factory.BeanCreationException:
Could not autowire field: private com.anorakgirl.MyService com.anorakgirl.MyTest.myService;
nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException:
No unique bean of type [com.anorakgirl.MyService] is defined:
Unsatisfied dependency of type [class com.anorakgirl.MyService]:
expected at least 1 matching bean

After some googling, this is the problem. The service field was annotated with @Autowired, and Spring autowires by type. I have only one class of type ‘com.anorakgirl.MyService’, so when run with junit this works fine. However, cobertura changes the MyService class during instrumentation, so Spring no longer recognises it as type MyService, and so cannot autowire it.

There are two possible answers:

The easy answer (what I did)

In your test application context (with the Spring aop namespace declared, as shown below), add the line:

<aop:config proxy-target-class="true"/>
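
Note that the aop namespace has to be declared in the context file, so in full it looks something like this:

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:aop="http://www.springframework.org/schema/aop"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/aop
           http://www.springframework.org/schema/aop/spring-aop.xsd">

    <!-- use class-based (CGLIB) proxies so beans keep their concrete type -->
    <aop:config proxy-target-class="true"/>

</beans>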

The other answer (what I read about)

Make sure the classes that you want to autowire all implement interfaces, and autowire the interface rather than the implementation. The instrumented class still implements the same interface and can therefore still be autowired.
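
A sketch of that arrangement, reusing the names from the error above (@Service stands in for however the beans are actually defined):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

// depend on the interface, not the concrete class
interface MyService {
    void doSomething();
}

@Service
class MyServiceImpl implements MyService {
    public void doSomething() { /* ... */ }
}

class MyTest {
    // matched by interface type, so instrumentation of the implementation
    // class no longer breaks autowiring
    @Autowired
    private MyService myService;
}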

I didn’t try this as the service classes do not have interfaces and I haven’t time to add them!