Groovy Grails Exchange 2013

This week I attended the 2013 Groovy Grails Exchange, organised by Skills Matter (with whom I’ve also been lucky enough to attend a couple of courses over the years).

Finally got time to jot down a few thoughts – I came back with an almost full notebook, but these are the highlights:

1. What a great conference! Skills Matter did an amazing job of hosting. The speakers were varied and interesting. It was fantastically motivating to be amongst all these people who are actually pushing the technology forwards. As I work in a very small dev team (out in the sticks) I found it really interesting to see what goes on in the rest of the world.

2. Must. Learn. Spock.

3. Must also get back on board with Geb. I dabbled a little a while ago, and now that we’re about to start a new Grails project, functional testing with Spock & Geb has to be central to it. Lots of food for thought in the DevQA talk by Alvaro Sanchez-Mariscal, applicable even though we don’t have a dedicated QA team for our internal dev.

4. Forces on code – what makes code “good” depends on the context. Common sense really, but well illustrated in the talk by David the Coder.

5. Do I need to learn a JavaScript framework? I enjoyed the very persuasive talk on Developing SPA Applications by Alvaro Sanchez-Mariscal. Separating the front end and back end into independent apps makes a lot of sense, especially if you have dedicated UI developers. I’m not sure we have the resources to move away from GSPs just yet, but I’m certainly going to have a browse around TodoMVC to get an idea of the options.

6. Jeff Brown’s live coding demo of Metaprogramming With The Groovy Runtime was a great refresher – nothing that hadn’t been covered on the Groovy course I attended last year, but a reminder that I am pretty guilty of just writing Java-like code inside .groovy files and not taking full advantage of Groovy’s awesomeness.

7. The “Open Source and You” session by Peter Ledbrook made me think a bit more deeply about Open Source software – what to expect from it, the costs involved, how to manage a successful open source project. I’m definitely motivated to get more involved.

8. Gradle for deployment? @danveloper’s talk on Groovy for Sysadmins gave me lots to think about. I doubt I’ll ever end up hacking the kernel, but I like the software centric approach to deploying and maintaining servers.

9. Must pay more attention to release notes and road maps. New and upcoming versions of Groovy and Grails have some great new features that I’m looking forward to using. The change notes for Groovy 2.2 and Grails 2.3 are definitely worth a look. Also looking forward to the plugin and build system changes in Grails 3.0, due some time in 2014.

10. And finally, how come all these Open Source aficionados use Macs?! I’ve not got an iAnything yet and don’t plan to. Linux rocks :)

Anyway, in summary, well worth going for anyone working with Groovy or Grails. Podcasts of all the talks are available on the Skills Matter website, but I think attending in person is a fantastic opportunity to absorb knowledge from an enthusiastic and knowledgeable crowd, and worth every penny. Better get my early bird ticket for next year…

  

Updated JIRA sub-task sorting plugin

Following some user feedback, I’ve made some changes to the JIRA plugin I wrote for dragging sub-tasks, first mentioned in this post: Jira web-resource plugin to drag and drop subtasks.

If you’ve used it, you may have found it annoying that it was actually too easy to drag subtasks, causing page refreshes when they weren’t wanted! Copying and pasting from the subtask list was impossible. You could also drag issues in the Issue Navigator, although it didn’t achieve anything.

Just a few teething problems then.

First I put a delay in the jQuery sortable options. I also stopped the page from submitting if the list order had not been changed. This was a definite improvement, but users still wanted to copy text from the list, which wasn’t possible. So now I’ve limited the “drag handle” to the cell in the table where the up and down arrows appear for sorting. You can only drag if you click in that area; clicking anywhere else won’t work. It’s still not ideal, as it’s not immediately obvious how to drag. Better ideas welcomed!

You can get the updated source from github: https://github.com/anorakgirl/subtask-dragger/releases/tag/v0.6.
Or if you don’t feel like packaging it yourself, there’s a jar for download here: subtask-dragger-0.6.jar. USE AT YOUR OWN RISK!

  

Issue Security in Jira sub-tasks and the Script Runner plugin

I’ve been trying to do some advanced JIRA configuration recently, with mixed success.

The requirement was to have issues in a project which could be viewed only by a specific user (not necessarily the assignee).

I discovered that “Issue Security Schemes” can have levels which allow access based on a “User Custom Field Value”. Fantastic. I added a custom field, let’s call it “Employee”, and added it to an Issue Security Level. Applied to the project and bingo, users can only see issues where they have been entered into the “Employee” field.

Too easy. I added a subtask to an issue where I was the employee. No error message, but no subtask appeared either.

My understanding was that you could not set issue security on subtasks, because they inherit the issue security of the parent. I assumed the intention of this was that if you could see an issue, you should be able to see its children. But no – they specifically inherit the issue security setting from the parent, rather than its visibility – as confirmed in Atlassian Answers. So because the sub-task does not have an “Employee” field, I can’t see it. Bother.

Of course, I could use a workflow post-function so that when subtasks are created, the employee field is set to that of the parent. But what if the value is changed in the parent? It’s a nuisance to have to change all the sub-tasks too.
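
For the record, the create-time copy would be simple enough – a rough sketch of what such a post-function script might look like (the field lookup by name is illustrative, and it assumes the newly created sub-task is bound as issue):

import com.atlassian.jira.ComponentManager

// look up the "Employee" user custom field (by name here, purely for illustration)
def cfManager = ComponentManager.getInstance().getCustomFieldManager()
def employeeField = cfManager.getCustomFieldObjectByName("Employee")

// copy the parent's value onto the newly created sub-task
def parent = issue.getParentObject()
if (parent) {
    issue.setCustomFieldValue(employeeField, parent.getCustomFieldValue(employeeField))
}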

Time to investigate the Script Runner plugin.

Here’s the code for my Scripted field, which gets the value of the Employee field in the parent issue.

import com.atlassian.jira.ComponentManager
import com.atlassian.jira.issue.Issue
import com.atlassian.jira.issue.CustomFieldManager
import com.atlassian.jira.issue.fields.CustomField

// the sub-task's parent issue
Issue parentIssue = issue.getParentObject()

// look up the "Employee" user custom field by its ID
CustomFieldManager cfManager = ComponentManager.getInstance().getCustomFieldManager()
CustomField customField = cfManager.getCustomFieldObject(99999L)

// the value of the last expression becomes the value of the scripted field
// (null-safe, in case the field is ever evaluated on an issue with no parent)
parentIssue?.getCustomFieldValue(customField)

It is almost too easy. I think it may be the answer to some of the other requests I’ve had for custom fields in JIRA, but unfortunately it doesn’t solve this problem. The calculated value appears perfectly on the page when viewing a sub-task. But, when I go to the “Issue Security Level” and click in the “User Custom Field Value” drop down, it doesn’t appear! It’s configured with the “User Picker” Searcher and template, so I don’t think there’s much more I can do.

Back to the drawing board…

  

Unexpected side effects using Groovy MockFor in a Grails test

Recently, I was trying to write a unit test for a Grails class which instantiates an HTTPBuilder object inside one of its methods.

I couldn’t use the normal Grails ‘mockFor’ syntax, because the HTTPBuilder was not injected or accessible from outside the class. No problem, because standard Groovy MockFor is also available.
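
For context, the kind of class involved looks roughly like this (a simplified sketch – MyClient, doSomething and the URL are illustrative stand-ins, not the real code):

import groovyx.net.http.HTTPBuilder
import groovyx.net.http.Method

class MyClient {

    boolean doSomething(String param) {
        // the builder is created inside the method, so mockFor can't get at it
        def http = new HTTPBuilder('http://example.com')
        def ok = false
        http.request(Method.GET) {
            uri.path = '/something'
            uri.query = [q: param]
            response.success = { ok = true }
            response.failure = { ok = false }
        }
        return ok
    }
}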

With a bit of help from google (http://stackoverflow.com/questions/9101084/groovy-httpbuilder-mocking-the-response) I came up with the test below.

// requires: import groovy.mock.interceptor.MockFor and groovyx.net.http.Method
def testDoSomething() {
    def requestDelegate = [response: [:], uri: [:]]

    def mockHttpBuilder = new MockFor(groovyx.net.http.HTTPBuilder)

    mockHttpBuilder.demand.request { Method method, Closure body ->
        body.delegate = requestDelegate
        body.call()
        requestDelegate.response.success() // make it call success
    }

    mockHttpBuilder.use {
        // All calls to HTTPBuilder will be intercepted within this block of code.
        MyClient myClient = new MyClient()
        assert myClient.doSomething("Some Param")
    }
}

This works really well in isolating the code under test from the HTTP request.

However, after spending a bit of time working on this test and getting it to pass, I ran test-app on the entire project, and was very confused to see that some totally unrelated tests had begun to fail. Grails was reporting errors like “No more calls to ‘get’ expected at this point.” in tests where I had used no mocking at all. I was bewildered.

Eventually I came across GRAILS-8535. Although unit tests are supposed to be just that, it seems that Grails continues to use the proxy in subsequent tests. This appears to be fixed in Grails 2.2.3 but, at the time of writing, that version hadn’t been released.

Luckily, it is possible to manually reset the MetaClass in the Grails MetaClassRegistry at the end of the test.

At the start of the test which uses MockFor, I record what the class was originally:

MetaClassRegistry registry = GrailsMetaClassUtils.getRegistry()
def originalMetaClass = registry.getMetaClass(HTTPBuilder)

And at the end of the test, set it back:

MetaClassRegistryCleaner.addAlteredMetaClass(HTTPBuilder, originalMetaClass) 
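
Putting it together, the shape of the test ends up something like this (just a sketch, reusing the lines above):

def testDoSomething() {
    // remember the original metaclass before MockFor replaces it
    MetaClassRegistry registry = GrailsMetaClassUtils.getRegistry()
    def originalMetaClass = registry.getMetaClass(HTTPBuilder)

    try {
        // ... MockFor setup and the mockHttpBuilder.use { ... } block as above ...
    } finally {
        // restore the original metaclass so later tests aren't affected
        MetaClassRegistryCleaner.addAlteredMetaClass(HTTPBuilder, originalMetaClass)
    }
}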
  

Best method for functional testing in grails

or ‘why isn’t there more quality control on the internet…?’

This week I tried to add some automated functional tests to a grails project. (See my post on agile best practices; we agreed that our definition of ‘Done’ should definitely include functional tests)

There are a few plugins for grails which look like they could do the job.

My final selection criterion turned out to be: the only one I could get working in less than half a day.

Canoo Webtest

http://grails.org/plugin/webtest

I came away from a recent grails training course thinking that this was the de-facto standard for functional testing. In fact, I have since then used the standalone Webtest tool to test a legacy PHP application, with good success.

However, trying to get going in grails was a different matter.

UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: net.sourceforge.htmlunit#htmlunit;2.8-SNAPSHOT: not found
::::::::::::::::::::::::::::::::::::::::::::::

I found the following bug report, and its advice to specify the htmlunit version explicitly did allow me to get started.

http://jira.grails.org/browse/GPWEBTEST-72

But I was already disappointed.  If I install a plugin using the grails install-plugin command, I think there should be some level of assurance that the plugin version to be installed will be a tested version, dependent on NON-snapshot versions of any required libraries.

Grails functional-test plugin

I then realised that other plugins were available, as mentioned in the grails functional testing user guide.
The documentation for this one defines the dependency as

compile ":functional-test:2.0.RC1"

Again, I do not want the install-plugin command to install a Release Candidate. I want the tested and released version. I tried using an older version as well, but gave up on this pretty quickly too.

selenium-rc

See: http://grails.org/plugin/selenium-rc
The grails plugin page states:

This plugin is no longer maintained. Consider looking at Geb.

So this is what I did.

Geb

After a bad day, I was pleasantly surprised to get going quickly with Geb. And I really like the separation between modelling the available functions on a page and calling them in tests. Although the plugin version is only 0.9.0, it feels current and like it will be maintained.
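
To give a flavour of that separation, a page class and a spec look something like this (a rough sketch – the page, content names and titles are made up):

import geb.Page
import geb.spock.GebSpec

class LoginPage extends Page {
    static url = "login"
    static at = { title == "Login" }
    static content = {
        username { $("input", name: "username") }
        password { $("input", name: "password") }
        loginButton { $("input", type: "submit") }
    }
}

class LoginSpec extends GebSpec {
    def "a user can log in"() {
        when:
        to LoginPage
        username.value("admin")
        password.value("secret")
        loginButton.click()

        then:
        title == "Dashboard"
    }
}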

I’m surprised at how much trouble I had getting going with functional testing. It feels like there is a lack of quality control on the publicly available Grails plugins, and little attempt to ensure versions are compatible with other standard plugins such as jQuery.

I’ll be spending some more time learning to use Geb in the near future, so I’ll try and report my progress…

  

Grails unit tests failing on jenkins

I spent a while confused today, when some fairly straightforward unit tests were running fine locally, but failing on our continuous integration server, Jenkins.

The error was occurring in the setUp method, annotated @Before, where we were setting a flash variable prior to running tests.

java.lang.NullPointerException: Cannot invoke method getFlashScope() on null object
at grails.test.mixin.web.ControllerUnitTestMixin.getFlash(ControllerUnitTestMixin.groovy:159)

After much head scratching I came across this link
http://www.mopri.de/2013/grails-unit-testing-and-a-little-fun-with-before/
which explains a similar problem.

It appears that because the test class uses a Mixin, it effectively extends ControllerUnitTestMixin, which is not abstract and also has a method annotated with @Before. JUnit does not guarantee the order in which the two methods annotated with @Before will be run (see http://junit.sourceforge.net/doc/faq/faq.htm#tests_2).

If the Mixin method does not run before my setUp method, the flash scope has not been mocked, hence the failure.

The solution suggested in the post is to call super.bindGrailsWebRequest() in the setUp method. In our case it was simple enough to set the flash variable within the individual tests, instead of the setUp method.
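
In sketch form, the first option looks something like this (the controller and flash value are illustrative):

import grails.test.mixin.TestFor
import org.junit.Before
import org.junit.Test

@TestFor(MyController) // MyController stands in for the real controller
class MyControllerTests {

    @Before
    void setUp() {
        // run the mixin's web request setup explicitly before touching flash
        bindGrailsWebRequest()
        flash.message = "hello"
    }

    @Test
    void testIndex() {
        controller.index()
        // ... assertions ...
    }
}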

  

Immutability and Collections.unmodifiableList

Or: why is my list still mutable?

Today I learnt a new thing about Collections.unmodifiableList.

First, I noted that Sonar wasn’t raising a violation where I thought it should.

Take the sample class below

public final class MyImmutableClass {

    private final List<String> myList;

    public MyImmutableClass(List<String> myList) {
        this.myList = myList;
    }

    public List<String> getMyList() {
        return myList;
    }
}

I would expect sonar to raise the following:

Malicious code vulnerability - May expose internal representation by incorporating reference to mutable object (Findbugs: EI_EXPOSE_REP2)
Malicious code vulnerability - May expose internal representation by returning reference to mutable object (Findbugs: EI_EXPOSE_REP)

But it didn’t. This is a mystery in itself, as I am sure it has picked me up on this before.

Anyway, no problem, I wrote a test for the class myself:

/**
 * Test of getMyList method, of class MyImmutableClass.
 */
@Test
public void testGetMyList() {
    List<String> list = new ArrayList<String>();
    list.add("Potato");
    list.add("Apple");

    MyImmutableClass instance = new MyImmutableClass(list);

    assertEquals(2, instance.getMyList().size());

    // check for immutability
    list.add("item 3");
    assertEquals(2, instance.getMyList().size());

    try {
        instance.getMyList().add("Message3");
        fail("Should not be possible");
    } catch (Exception ex) {
        assertTrue(ex instanceof UnsupportedOperationException);
    }
}

Of course, the test fails with the code as it is.

So I modified it to this:

public final class MyImmutableClass {

    private final List<String> myList;

    public MyImmutableClass(List<String> myList) {
        this.myList = Collections.unmodifiableList(myList);
    }

    public List<String> getMyList() {
        return Collections.unmodifiableList(myList);
    }
}

And it STILL failed. I was very confused. How come Collections.unmodifiableList in the constructor wasn’t stopping the list inside the class from changing?

It took some googling to find the answer. For example: this stackoverflow post.
If you pay proper attention to the javadoc for Collections.unmodifiableList, it makes sense.


     * Returns an unmodifiable view of the specified list.  This method allows
     * modules to provide users with "read-only" access to internal
     * lists.  Query operations on the returned list "read through" to the
     * specified list, and attempts to modify the returned list, whether
     * direct or via its iterator, result in an
     * UnsupportedOperationException.

So, this just wraps the original list inside an unmodifiable view. I can’t modify the one inside my class, but if I modify the one I used to create the class, the view reflects the update.

The correct way to make my class immutable is:

public final class MyImmutableClass {

    private final List<String> myList;

    public MyImmutableClass(List<String> myList) {
        this.myList = Collections.unmodifiableList(new ArrayList<String>(myList));
    }

    public List<String> getMyList() {
        return Collections.unmodifiableList(myList);
    }
}

Phew. As an afterthought, because sonar and findbugs did not pick this up, I’m thinking of taking a look at this: http://mutability-detector.blogspot.co.uk/. We like to make all classes immutable, unless there is a good reason not to. It would be interesting to see what else has slipped through.

  

Writing a maven 3 plugin in groovy

I thought I’d document this, as we found it a bit confusing to get going, and the documentation is fairly sparse.

We wanted to knock up a quick Maven plugin which would parse some XML files and output some HTML. Great, a chance to do some Groovy foo! Parsing and writing XML with Groovy is so easy that we thought we’d try writing the plugin in Groovy. Once compiled, Groovy is just Java, so Maven shouldn’t care what the plugin is written in.

First, DON’T follow these instructions: http://www.sonatype.com/books/mcookbook/reference/writing-plugins-alternative-sect-writing-groovy.html. They only work in Maven 2.

For reference, the error message when compiling the plugin:

Failed to execute goal org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs (default) on project firstgroovy-maven-plugin: Execution default of goal org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs failed: An API incompatibility was encountered while executing org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs: java.lang.NoSuchMethodError: org.codehaus.plexus.PlexusContainer.hasChildContainer(Ljava/lang/String;)Z

The correct approach for maven 3 is documented here:
http://docs.codehaus.org/display/GMAVEN/Implementing+Maven+Plugins.
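
The mojo itself is just a Groovy class – a minimal sketch along these lines (not the exact example from the GMaven docs; the goal name and message are made up):

package sample.plugin

import org.apache.maven.plugin.AbstractMojo

/**
 * Says hello.
 *
 * @goal greet
 */
class GreetingMojo extends AbstractMojo {

    void execute() {
        // getLog() comes from AbstractMojo; Groovy lets us use it as a property
        log.info("Hello from a Groovy mojo")
    }
}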

It still doesn’t work straight away. If you follow the guidelines and create the pom and mojo as on that page (specifying 1.5 for the gmaven versions), the mojo appears to install, but then when you execute it from another project you get the following error:

 org.sonatype.guice.bean.reflect.Logs$JULSink warn
WARNING: Error injecting: sample.plugin.GreetingMojo
java.lang.NoClassDefFoundError: Lorg/codehaus/groovy/reflection/ClassInfo;

The answer is on this page: http://jira.codehaus.org/browse/GMAVEN-110.

We need to add the following to the gmaven-plugin configuration:

<configuration>
    <providerSelection>1.5</providerSelection>
</configuration>

I’ve attached an archive containing a working mojo and a test project: groovy-maven-plugin.tar.gz. Run mvn install on the first to install it in your local repository, then mvn compile on the second to see it execute.

  

Sharing junit tests with Maven

We’ve now had two cases at work where we have created a set of JUnit tests which we want to re-use in other Maven modules. It is a two-stage process, and there are a couple of options for how to run the tests in the project which imports them.

Creating re-usable tests

First, we create the tests in the ‘tester-module’. These are standard unit tests in src/test/java. Of course, they must pass, or your project will not build, so you need some sample implementations of what you are testing within that project. For ease of use, we put all the tests which will be exported into a single package.

To create a test-jar, which can be a dependency for other projects, add to the pom:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.3.1</version>
    
    <executions>
        <execution>           
            <id>test-jar</id>
            <goals>
                <goal>test-jar</goal>
            </goals>
            <configuration>
                <includes>
                    <include>**/exported_tests/*</include>
                </includes>
            </configuration>
        </execution>       
    </executions>    
</plugin>

This will generate a test-jar containing only the tests in the package specified.

Importing the tests

To import the tests into the module which will use them, add the following dependency:

<dependency>
    <groupId>tester-group</groupId>
    <artifactId>tester-module</artifactId>
    <version>1.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>

Running the tests

There are two ways of running the attached tests. They DO NOT get run automatically when you run mvn test.

Method 1. Extracting the dependency

Adding the following to the pom will extract the test classes into target/test-classes, so that they all get run when you run mvn test. This works well if you always want to run ALL the attached tests.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack</id>
            <phase>process-test-classes</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>tester-group</groupId>
                        <artifactId>tester-module</artifactId>
                        <version>1.0</version>
                        <type>test-jar</type>
                        <outputDirectory>${project.build.directory}/test-classes</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>

Method 2. Using test suites

If you want to run certain tests only, you can add a TestSuite to the module, and define which of the attached tests should be run as part of the test suite.

@RunWith(Suite.class)
@Suite.SuiteClasses({TestClass1.class, TestClass2.class})
public class TestSuite {
  //nothing
}

An afterthought: testing the tests

In the situations where we are re-using our tests, the tests themselves become a critical component of our systems. We wanted to be able to test the tests too. We got round this by not including logic in the tests themselves. Instead, the test-module contains a set of ‘validators’, in src/main. We can then write tests for these ‘validators’ as usual. The tests in ‘exported_tests’ can then simply delegate to an appropriate validator, which we know has been tested.

assertTrue(validator.validate(classUnderTest));

The only difference this makes is that you have to add a normal dependency to the tester-module, as well as the test-jar dependency.

We’ve found this approach very useful, as we’re using Maven and JUnit as a framework for testing other file resources. However, I think it is useful for Java code too, if you have a set of interfaces and different implementations in different modules.

References:
http://maven.apache.org/guides/mini/guide-attached-tests.html
http://softwaremavens.blogspot.co.uk/2009/09/running-tests-from-maven-test-jar-in.html

  

Groovy use of the Jira REST API to update a custom field

We use Jira for managing a specific type of project at work. For some time, we’ve used the description field of certain issues to hold a particular piece of information. We recently decided to make a custom field for this instead, so that we could use the description field for its intended purpose, and include our new custom field in issue lists.

The challenge was to update the new custom field with the data from the description field for all historical issues. I couldn’t see a way to do it through Jira directly, so I thought I’d take the opportunity to exercise my groovy skills, and explore the Jira REST API.

I decided to use the HTTPBuilder library. Here is the grab and the imports:

@Grab(group='org.codehaus.groovy.modules.http-builder', module='http-builder', version='0.5.2' )

import groovy.json.*

import groovyx.net.http.RESTClient
import groovy.util.slurpersupport.GPathResult
import groovyx.net.http.ContentType;

Good practice next: I don’t want to hard-code my username and password in a script, so let’s prompt for them:

String username = System.console().readLine("%s ", ["Jira Username:"] as String[])
String password = String.valueOf(System.console().readPassword("%s ", ["Jira Password:"] as String[]))

Next we make a client for all our REST requests:

def jiraClient = new RESTClient("https://my.jira.url");

Now I’m going to set up the authorisation header, and some settings that I’ll reuse each time I make a REST request. I spent a while working out how to do the authorisation. You can post to the /auth/1/session resource and then store a cookie, but sending an Authorization header seems nice and simple.

def authString = "${username}:${password}".getBytes().encodeBase64().toString()
def requestSettings = [contentType: ContentType.JSON, requestContentType: ContentType.JSON, headers:['Authorization':"Basic ${authString}"]]

Here’s the first REST request, to the search resource. The clever Groovy bit is adding two maps together using the plus operator – I spent a while looking for a special function to do that. Sometimes Groovy makes things too easy! The plus operator adds two maps and returns a new one; in contrast, using the left shift operator would alter the map on the left.

def jqlQuery = "type = \"My Issue Type\" and cf[11720] is EMPTY and project = MYPROJ"
def issueRequest = jiraClient.post(requestSettings + [path: "/rest/api/latest/search", body: [jql: jqlQuery, maxResults: 10000]])
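
As a quick illustration of the difference between the two operators (values made up):

def defaults = [contentType: ContentType.JSON]
def merged = defaults + [path: "/rest/api/latest/search"]   // new map; defaults is untouched
defaults << [path: "/rest/api/latest/search"]                // alters defaults in place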

Next we run a REST PUT request for each issue returned. The Groovy JsonBuilder makes it really easy to construct a correctly formatted JSON string to send. The syntax is a bit complicated – as well as the REST API documentation, I found this example helpful: https://developer.atlassian.com/display/JIRADEV/JIRA+REST+API+Example+-+Edit+issues.

issueRequest.data.issues.each() { issue ->
    def json = new groovy.json.JsonBuilder()
    def root = json.update {
        customfield_11720([{
            set "${issue.fields.description}"
        }])
    }

    try {
        def result = jiraClient.put(requestSettings + [path: "/rest/api/latest/issue/${issue.key}", body: json.toString()])
        println "${issue.key} | ${issue.fields.summary} | ${issue.fields.description} | ${result.status}"
    } catch (Exception ex) {
        println "${issue.key} | ${issue.fields.summary} | ${issue.fields.description} | FAIL"
    }
}

Unfortunately some of the PUT requests failed, because we’ve made more fields mandatory since we started and some of the older issues were missing data for mandatory fields – but it still saved a lot of time!