Writing a maven 3 plugin in groovy

I thought I’d document this, as we found it a bit confusing to get going, and the documentation is fairly sparse.

We wanted to knock up a quick maven plugin which would parse some XML files, and output some HTML. Great, a chance to do some groovy foo! Parsing and writing XML with groovy is so easy, we thought we’d try writing the plugin in groovy. Once compiled, groovy is just java, so Maven should not care what the plugin is written in.
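To give a flavour of why, here’s the sort of thing Groovy lets you do in a few lines. This is just a sketch, not the plugin code; the input structure and file names are invented:

import groovy.xml.MarkupBuilder

// read some XML (assumes an input file with <item name="..."/> elements)...
def doc = new XmlSlurper().parse(new File('input.xml'))

// ...and write some HTML out again
new File('output.html').withWriter { writer ->
    new MarkupBuilder(writer).html {
        body {
            doc.item.each { item ->
                p(item.@name.toString())
            }
        }
    }
}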

First, DON’T follow these instructions: http://www.sonatype.com/books/mcookbook/reference/writing-plugins-alternative-sect-writing-groovy.html. They only work in Maven 2.

For reference, the error message when compiling the plugin:

Failed to execute goal org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs (default) on project firstgroovy-maven-plugin: Execution default of goal org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs failed: An API incompatibility was encountered while executing org.codehaus.mojo.groovy:groovy-maven-plugin:1.0-beta-3:generateStubs: java.lang.NoSuchMethodError: org.codehaus.plexus.PlexusContainer.hasChildContainer(Ljava/lang/String;)Z

The correct approach for maven 3 is documented here:
http://docs.codehaus.org/display/GMAVEN/Implementing+Maven+Plugins.

It still doesn’t work straight away. If you follow the guidelines and create the pom and mojo as on that page (specifying 1.5 for the gmaven versions), the mojo appears to install, but when you execute it from another project you get the following error:

 org.sonatype.guice.bean.reflect.Logs$JULSink warn
WARNING: Error injecting: sample.plugin.GreetingMojo
java.lang.NoClassDefFoundError: Lorg/codehaus/groovy/reflection/ClassInfo;

The answer is on this page: http://jira.codehaus.org/browse/GMAVEN-110.

We need to add the following to the gmaven-plugin configuration:

<configuration>
    <providerSelection>1.5</providerSelection>
</configuration>

I’ve attached an archive containing a working mojo and a test project: groovy-maven-plugin.tar.gz. Run mvn install on the first to install it in your local repository, then mvn compile on the second to see it execute.
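For anyone who doesn’t want to download the archive, the mojo itself needn’t be anything clever. Something along these lines is enough to prove the setup works (a sketch rather than the exact code in the archive; the goal name and message are made up):

import org.apache.maven.plugin.AbstractMojo

/**
 * Stands in for our real XML-to-HTML mojo.
 *
 * @goal greet
 */
class GreetingMojo extends AbstractMojo {
    void execute() {
        log.info('Hello from a Groovy mojo')
    }
}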

Sharing junit tests with Maven

We’ve now had two cases at work where we have created a set of junit tests which we want to re-use in other maven modules. It is a two-stage process, and there are a couple of options for how to run the tests in the project which imports them.

Creating re-usable tests

First, we create the tests in the ‘tester-module’. These are standard unit tests in src/test/java. Of course, they must pass, or your project will not build, so you need some sample implementations of what you are testing within that project. For ease of use, we put all the tests which will be exported into a single package.

To create a test-jar, which can be a dependency for other projects, add to the pom:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.3.1</version>
    
    <executions>
        <execution>           
            <id>test-jar</id>
            <goals>
                <goal>test-jar</goal>
            </goals>
            <configuration>
                <includes>
                    <include>**/exported_tests/*</include>
                </includes>
            </configuration>
        </execution>       
    </executions>    
</plugin>

This will generate a test-jar containing only the tests in the package specified.

Importing the tests

To import the tests into the module which will use them, add the following dependency:

<dependency>
    <groupId>tester-group</groupId>
    <artifactId>tester-module</artifactId>
    <version>1.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>

Running the tests

There are two ways of running the attached tests. They DO NOT get run automatically when you run mvn test.

Method 1. Extracting the dependency

Adding the following to the pom will extract the test classes into target/test-classes, so that they all get run when you run mvn test. This works well if you always want to run ALL the attached tests.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack</id>
            <phase>process-test-classes</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>tester-group</groupId>
                        <artifactId>tester-module</artifactId>
                        <version>1.0</version>
                        <type>test-jar</type>
                        <outputDirectory>${project.build.directory}/test-classes</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>

Method 2. Using test suites

If you want to run certain tests only, you can add a TestSuite to the module, and define which of the attached tests should be run as part of the test suite.

@RunWith(Suite.class)
@Suite.SuiteClasses({TestClass1.class, TestClass2.class})
public class TestSuite {
  //nothing
}

An afterthought: testing the tests

In the situations where we are re-using our tests, the tests themselves become a critical component of our systems. We wanted to be able to test the tests too. We got round this by not including logic in the tests themselves. Instead, the tester-module contains a set of ‘validators’ in src/main. We can then write tests for these ‘validators’ as usual. The tests in ‘exported_tests’ simply delegate to an appropriate validator, which we know has been tested.

assertTrue(validator.validate(classUnderTest));
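As a rough sketch of the split (sketched in Groovy here, but the same shape works in Java; class, package and file names are invented for illustration):

// tester-module, src/main: ordinary, unit-testable logic
package sample.validators

class NamingValidator {
    boolean validate(File file) {
        // the real rules go here; kept trivial for the sketch
        file.name.endsWith('.xml')
    }
}

// tester-module, src/test, in the exported package: a thin test that just delegates
package sample.exported_tests

import org.junit.Test
import sample.validators.NamingValidator
import static org.junit.Assert.assertTrue

class NamingTest {
    @Test
    void filesFollowTheNamingConvention() {
        assertTrue(new NamingValidator().validate(new File('src/test/resources/sample.xml')))
    }
}

The validator gets its own unit tests in the tester-module, and the exported test stays a one-liner.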

The only difference this makes is that you have to add a normal dependency to the tester-module, as well as the test-jar dependency.

We’ve found this approach very useful, as we’re using maven and junit as a framework for testing other file resources. However I think it is useful for java code too, if you have a set of interfaces, and different implementations in different modules.

References:
http://maven.apache.org/guides/mini/guide-attached-tests.html
http://softwaremavens.blogspot.co.uk/2009/09/running-tests-from-maven-test-jar-in.html

Groovy use of the Jira REST API to update a custom field

We use Jira for managing a specific type of project at work. For some time, we’ve used the description field of certain issues to hold a particular piece of information. We recently decided to make a custom field for this instead, so that we could use the description field for its intended purpose, and include our new custom field in issue lists.

The challenge was to update the new custom field with the data from the description field for all historical issues. I couldn’t see a way to do it through Jira directly, so I thought I’d take the opportunity to exercise my groovy skills, and explore the Jira REST API.

I decided to use the HTTPBuilder library. Here is the grab and the imports:

@Grab(group='org.codehaus.groovy.modules.http-builder', module='http-builder', version='0.5.2' )

import groovy.json.*

import groovyx.net.http.RESTClient
import groovy.util.slurpersupport.GPathResult
import groovyx.net.http.ContentType;

Good practice next: I don’t want to hard-code my username and password in a script, so let’s prompt for them:

String username = System.console().readLine("%s ", ["Jira Username:"] as String[])
String password = String.valueOf(System.console().readPassword("%s ", ["Jira Password:"] as String[]))

Next we make a client for all our rest requests:

def jiraClient = new RESTClient("https://my.jira.url");

Now I’m going to set up the authorisation header, and some settings that I’ll reuse each time I make a REST request. I spent a while working out how to do the authorisation: you can post to the /auth/1/session resource and then store a cookie, but sending an Authorization header seems nice and simple.

def authString = "${username}:${password}".getBytes().encodeBase64().toString()
def settings = [contentType: ContentType.JSON, requestContentType: ContentType.JSON, headers: ['Authorization': "Basic ${authString}"]]

Here’s the first REST request, to the search resource. The clever groovy bit is adding two maps together, using the plus operator. I spent a while looking for a special function to do that. Sometimes groovy makes things too easy! The plus operator adds two maps and returns a new one. In contrast using the left shift operator would alter the map on the left.

def jqlQuery = "type = \"My Issue Type\" and cf[11720] is EMPTY and project = MYPROJ"
def issueRequest = jiraClient.post(settings + [path: "/rest/api/latest/search", body:  [jql: jqlQuery, maxResults: 10000]])
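As an aside, here’s a tiny illustration of the difference between the two operators (nothing Jira-specific about this):

def defaults = [contentType: 'JSON']
def combined = defaults + [path: '/rest/api/latest/search']

assert combined == [contentType: 'JSON', path: '/rest/api/latest/search']
assert defaults == [contentType: 'JSON']       // plus returned a new map; defaults is untouched

defaults << [path: '/rest/api/latest/search']  // left shift modifies defaults in place
assert defaults.containsKey('path')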

Next we run a REST PUT request for each issue returned. The groovy JsonBuilder makes it really easy to construct a correctly formatted JSON string to send. The syntax is a bit complicated; as well as the REST API documentation, I found this example helpful: https://developer.atlassian.com/display/JIRADEV/JIRA+REST+API+Example+-+Edit+issues.

issueRequest.data.issues.each { issue ->
    def json = new groovy.json.JsonBuilder()
    def root = json.update {
        customfield_11720([{
            set "${issue.fields.description}"
        }])
    }

    try {
        def result = jiraClient.put(settings + [path: "/rest/api/latest/issue/${issue.key}", body: json.toString()])
        println "${issue.key} | ${issue.fields.summary} | ${issue.fields.description} | ${result.status}"
    } catch (Exception ex) {
        println "${issue.key} | ${issue.fields.summary} | ${issue.fields.description} | FAIL"
    }
}

Unfortunately some of the PUT requests failed, because we’ve made more fields mandatory since we started and some of the older issues were missing data for those mandatory fields. Still, it saved a lot of time!

Jira web-resource plugin to drag and drop subtasks

Today I’ve been making a simple plugin for Jira, to allow drag and drop reordering of subtasks. As standard, a list of subtasks comes with links to move the items one position up or down – tedious if you want to move a long way!

Getting up and running with the Atlassian plugin SDK is really straightforward.

The plugin contains only a web-resource with the JavaScript I want to run. I’m not particularly familiar with jQuery, so it may be a little hacky or non-optimal, but it works!

A couple of useful things I learnt:

Jira provides some very specific contexts to determine which pages will include the resource. In my case, the web-resource definition in my atlassian-plugin.xml contains the line:

<context>jira.view.issue</context>

You can see all the web-resource contexts here: Web Resource plugin module contexts.

Because the list of subtasks could contain issues hidden from the current user, I parsed the new position for the subtask from the links which already exist for moving subtasks up and down, rather than just using the row index.

You can see the source here: https://github.com/anorakgirl/subtask-dragger

Or you can download a jar here: subtask-dragger-0.1.jar. DISCLAIMER: This is just a demo, use at your own risk!

Updating from a list in Grails

This week I had the pleasure of attending both Groovy and Grails courses at Skills Matter in London. The courses were taught by Dierk König, a committer to both Groovy and Grails, and the author of Groovy in Action. He’s very knowledgeable and the courses were really interesting. I would have liked more time for advanced Grails topics, but I guess you can’t fit it all into two days!

During the course I asked for help with one of my pet grails problems – the best way to update multiple records at once from a list view. For example, you might have a set of records with a check box field, and want to tick/un-tick for several rows and then save all with a single click. We didn’t quite get it working in class, but I got some helpful advice which meant I was able to finish it off on the train home.

The domain class is nothing special:

class Person {
    String firstName
    String lastName
    boolean available
}

My PersonController looks like this (I’ve left the scaffolding in, so you get all the standard pages too):

class PersonController {
    static scaffold = true

    def multiEdit = {
        List<Person> list = Person.list()
        [personInstanceList: list]
    }

    def multiSave = { MultiplePersonCommand command ->
        command.people.each { p ->
            Person person = Person.get(p.id)
            person.properties = p.properties
            person.save()
        }
        redirect action: 'list'
    }
}

class SinglePersonCommand {
    Integer id
    boolean available
}

class MultiplePersonCommand {
    List<SinglePersonCommand> people = [].withDefault { new SinglePersonCommand() }
}

The important thing here is the use of Command objects. I’ve defined these in the same file as the Controller, but they could be in a separate file. The really important tip is the use of `withDefault` on the list in the MultiplePersonCommand. When the binding takes place, we can’t guarantee what order it will happen in; for example, Grails might try to bind the second list item before the first. Without `withDefault`, that would cause an error.
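A quick sketch of what `withDefault` buys us, using the command classes above:

def people = [].withDefault { new SinglePersonCommand() }

people[1].available = true   // index 1 can be bound before index 0 exists...
people[0].id = 7             // ...and index 0 is still created on demand when first touched

assert people.size() == 2

Without the default, people[1] would simply be null and setting available on it would fail.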

And finally, multiEdit.gsp looks like this:

<g:form action="multiSave">
  <g:each var="person" in="${personInstanceList}" status="i">
    <div id="person${i}">
      <g:hiddenField name='people[${i}].id' value='${person.id}'/>
      <g:fieldValue bean="${person}" field="firstName"/>
      <g:checkBox name='people[${i}].available' value='1' checked="${person.available}"/>
    </div>
  </g:each>
  <g:submitButton name="action" />
</g:form>

The important thing here is the use of the `i` index variable in square brackets in the field names. This means that the params that come back to the Controller will effectively contain people[0].id, people[0].available, people[1].id, people[1].available and so on. Grails is clever enough to bind all the people[0] values to the first SinglePersonCommand in the people list inside the MultiplePersonCommand, and so on. Then I can access this list, copy the values across to Person objects and save them.

I hope this is useful to someone. I’m looking forward to spending some time on Groovy and Grails development, so hopefully more here soon!

 

maven, junit, cobertura and BeanCreationException

I have a set of junit tests for my project. When I run them with mvn test they run fine and pass.

However, when I ran them with mvn cobertura:cobertura, they failed with the following error:

org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'com.anorakgirl.MyTest': Autowiring of fields failed;
nested exception is org.springframework.beans.factory.BeanCreationException:
Could not autowire field: private com.anorakgirl.MyService com.anorakgirl.MyTest.myService;
nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException:
No unique bean of type [com.anorakgirl.MyService] is defined:
Unsatisfied dependency of type [class com.anorakgirl.MyService]:
expected at least 1 matching bean

After some googling, I understood the problem. The MyService field was annotated with @Autowired, and Spring autowires by type. I have only one class of type com.anorakgirl.MyService, so when run with junit this works fine. However, cobertura changes the type of the MyService class during instrumentation, so Spring no longer recognises it as type MyService and so cannot autowire it.

There are two possible answers:

The easy answer (what I did)

In your test application context add the line:

<aop:config proxy-target-class="true"/>

The other answer (what I read about)

Make sure the classes that you want to autowire all implement interfaces, and autowire the interface rather than the implementation. The instrumented class still implements the same interface and can therefore still be autowired.
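An untested sketch of that shape (in Groovy for brevity; class names borrowed from the error above, everything else invented):

import org.junit.runner.RunWith
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.test.context.ContextConfiguration
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner

// the service is defined by an interface...
interface MyService {
    String doSomething()
}

// ...with the implementation kept behind it
class MyServiceImpl implements MyService {
    String doSomething() { 'done' }
}

// the test autowires the interface, which the instrumented class still implements
@RunWith(SpringJUnit4ClassRunner)
@ContextConfiguration(locations = ['classpath:test-context.xml'])
class MyTest {
    @Autowired
    MyService myService
}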

I didn’t try this as the service classes do not have interfaces and I haven’t time to add them!

Tomcat Error listenerStart

I’m sure I’m not the only one who has battled with this startup error:

4985 [main] ERROR org.apache.catalina.core.StandardContext  – Error listenerStart
4985 [main] ERROR org.apache.catalina.core.StandardContext  – Context [/mycontext] startup failed due to previous errors

The tricky part is getting the real cause to appear in the log. I’ve had this problem various times in the past (I must be repeating the same mistakes). Today’s answer was simple:

  1. Remove log4j.jar from myapp/WEB-INF/lib (it is already in Tomcat/common/lib)
  2. Restart app.
  3. Debug using the reams of useful error messages which suddenly appear in the output.

Spring Security: Method Level Security with JSF so far…

My personal gotchas, in case they are of use to anyone else:

1. Ensure you have compatible versions of Spring and Spring Security. I am using Spring Security 2.0.4 and Spring 2.5.6. Originally my Spring was a slightly older version (2.5) and I got the following error:

java.lang.NoSuchMethodError:
org.springframework.aop.config.AopNamespaceUtils.registerAutoProxyCreatorIfNecessary

I fixed this by upgrading to the latest Spring. I think the problem was resolved in Spring 2.5.2 and relates to this bug: http://jira.springframework.org/browse/SPR-4459

2. Make sure the methods you are securing are actually in Spring-managed beans, doh! My @Secured annotation was being ignored entirely, and it took me ages to realise why: some of my beans were still defined in faces-config files, so Spring had no way of knowing about them. Moving the beans into the Spring configuration fixed the problem straight away.
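In other words, @Secured only has any effect on beans that Spring itself creates. A minimal sketch (Groovy for brevity, names invented; it assumes secured annotations are enabled in the security namespace config and that the bean is declared in the Spring context rather than faces-config):

import org.springframework.security.annotation.Secured  // package as at Spring Security 2.0; it moved in later versions

class ReportService {
    @Secured(['ROLE_ADMIN'])
    void deleteReport(Long id) {
        // only reached if the current user has ROLE_ADMIN
    }
}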

Spring Security 2.0: NTLM filter with custom UserDetailsService

I used this blog to get started: http://blog.mediasoft.be/ntlm-with-spring-security-20/

My application context is pretty much as per the Spring Security 2.0 configuration at the bottom of that post, with the following changes:

  • Different properties for the ntlm filter
  • servlet-api-provision="false" on the <security:http> tag
  • Addition of a custom UserDetailsService.

The full config is below:

<security:authentication-manager alias="_authenticationManager" />

    <bean id="authenticationProvider"
     class="com.mydomain.security.UserDetailsAuthenticationProvider">
        <security:custom-authentication-provider />
        <property name="userDetailsService" ref="userDetailsService" />
    </bean>

    <bean id="userDetailsService" class="com.mydomain.security.UserDetailsServiceImpl">
        <property name="daoUser" ref="daoUser" />
    </bean>

    <bean id="ntlmFilter" class="org.springframework.security.ui.ntlm.NtlmProcessingFilter">
        <security:custom-filter position="NTLM_FILTER" />
        <property name="stripDomain" value="true" />
        <property name="defaultDomain" value="mydomain.com" />
        <property name="domainController" value="mycontroller" />
        <property name="authenticationManager" ref="_authenticationManager" />
    </bean>

    <bean id="ntlmEntryPoint"
     class="org.springframework.security.ui.ntlm.NtlmProcessingFilterEntryPoint">
        <property name="authenticationFailureUrl" value="/denied.xhtml" />
    </bean>

    <bean id="exceptionTranslationFilter"
     class="org.springframework.security.ui.ExceptionTranslationFilter">
        <property name="authenticationEntryPoint" ref="ntlmEntryPoint" />
    </bean>

    <security:http access-denied-page="/denied.xhtml"
     entry-point-ref="ntlmEntryPoint" servlet-api-provision="false">
        <security:intercept-url pattern="/denied.xhtml" filters="none" />
        <security:intercept-url pattern="/**" access="ROLE_USER" />
    </security:http>

My UserDetailsAuthenticationProvider is exactly as per the blog.

My UserDetailsService is below. Note that the UserDetails created should have a blank password; this confused me for a bit.

public class UserDetailsServiceImpl implements UserDetailsService {
	private UserDAO daoUser;
	private static Logger log = Logger.getLogger(UserDetailsService.class);

	public UserDAO getDaoUser() {
		return daoUser;
	}

	public void setDaoUser(UserDAO daoUser) {
		this.daoUser = daoUser;
	}

	public UserDetails loadUserByUsername(String username)
			throws UsernameNotFoundException, DataAccessException {

		MyUser user;

		try {
			user = daoUser.getUser(username);
		} catch (DAOException ex) {
			throw new DataRetrievalFailureException(ex.getMessage());
		}

		if (user != null) {
			ArrayList<GrantedAuthority> ga = new ArrayList<GrantedAuthority>();
			ga.add(new GrantedAuthorityImpl("ROLE_USER"));
			GrantedAuthority[] grantedAuthorities = new GrantedAuthority[ga.size()];
			ga.toArray(grantedAuthorities);

			// note the blank password
			UserDetailsImpl ud = new UserDetailsImpl(username, "", true, grantedAuthorities, user);
			return ud;
		} else {
			throw new UsernameNotFoundException("Username Not Found");
		}
	}
}

My UserDetailsImpl simply extends org.springframework.security.userdetails.User and has an additional field for my ‘MyUser’:

public class UserDetailsImpl extends org.springframework.security.userdetails.User {

	private static final long serialVersionUID = 1584153957347843760L;

	private MyUser user;

	public UserDetailsImpl(String username, String password, boolean enabled,
			 GrantedAuthority[] authorities, MyUser user)
			throws IllegalArgumentException {
		super(username, password, enabled, true, true,
				true, authorities);
		this.user = user;
	}

	public MyUser getUser() {
		return user;
	}

	public void setUser(MyUser user) {
		this.user = user;
	}
}

And that seems to work. Now I am trying to enable method level security, so more to come soon…

a4j:commandLink action not executed in datatable

I had an <a4j:commandLink> in a <rich:dataTable>. The same problem applies to <a4j:commandButton> and <a4j:repeat>. The bean action specified was not executed, and the <a4j:actionparam> values were not bound.

For example this was not working:

<a4j:form>
   <rich:dataTable id="searchResults" value="#{myBean.searchResults}" var="item">
            <rich:column>
               <a4j:commandLink value="#{item.code}" action="#{myBean.myAction}"
                reRender="myRegion">
                    <a4j:actionparam name="code" value="#{item.code}"
                     assignTo="#{myBean.selectedCode}"/>
                </a4j:commandLink>
              </rich:column>
   </rich:dataTable>
</a4j:form>

The region was getting rerendered, but myBean.myAction was not executed.

Then I tried moving the <a4j:form> inside the table, so there was a form on each row:

   <rich:dataTable id="searchResults" value="#{myBean.searchResults}" var="item">
      <rich:column>
              <a4j:form>
               <a4j:commandLink value="#{item.code}" action="#{myBean.myAction}"
                reRender="myRegion">
                    <a4j:actionparam name="code" value="#{item.code}"
                     assignTo="#{myBean.selectedCode}"/>
                </a4j:commandLink>
                </a4j:form>
              </rich:column>
   </rich:dataTable>

This seemed to work for the first row, but not any subsequent ones.

The answer seems to be to base the dataTable on a session-scoped bean. I didn’t want my original bean session-scoped, so I split it into two like this:

 <rich:dataTable id="searchResults" value="#{mySessionBean.searchResults}" var="item">
      <rich:column>
              <a4j:form>
               <a4j:commandLink value="#{item.code}" action="#{myBean.myAction}"
                reRender="myRegion">
                    <a4j:actionparam name="code" value="#{item.code}"
                     assignTo="#{myBean.selectedCode}"/>
                </a4j:commandLink>
                </a4j:form>
              </rich:column>
   </rich:dataTable>

And it works. The actions are still carried out on my request bean as I wanted, and I just have to be careful about how I update the session bean.