Transaction handling woes in JDBI 3

I’ve stalled on a project, because I decided to have a go with JDBI 3. I’ve only used JDBI 2 in previous projects – this is the general pattern I was using:

An Interface:

public interface AccountService {
    void addAccount(Account account, User user);
}  

Implementation:

public abstract class AccountServiceJdbi implements AccountService {
    
    @Override  
    @Transaction  
    public final void addAccount(Account account, User user) {
        long accountId =  accountDao().insertAccount(account);
        accountDao().linkAccountToOwner(accountId, user.getId());
    }
    
    @CreateSqlObject
    abstract AccountDao accountDao();
}

You can imagine the DAO – just a simple interface with @SqlQuery and @SqlUpdate annotations.
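For reference, a minimal sketch of what that DAO might look like in JDBI 2 (the table and column names here are made up, not from the real project):

public interface AccountDao {

    @SqlUpdate("insert into account (name) values (:name)")
    @GetGeneratedKeys
    long insertAccount(@BindBean Account account);

    @SqlUpdate("insert into account_owner (account_id, user_id) values (:accountId, :userId)")
    void linkAccountToOwner(@Bind("accountId") long accountId, @Bind("userId") long userId);
}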
The Service is instantiated using

dbi.onDemand(AccountServiceJdbi.class);

I like this approach, because it is easy to write a Mock implementation of AccountService for use when testing other components, and to create a concrete extension of AccountServiceJdbi with a mock AccountDao for testing the service class logic.

The abstract class allows me to have transactional methods that combine data access methods from one or more DAOs. The implemented methods are final, because my class is not intended to be subclassed. This prohibits unintended overriding of methods, for example when creating a concrete implementation for testing.
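For example, a concrete extension for testing might look something like this (MockAccountDao is a hypothetical hand-written test double, and the class lives in the same package as AccountServiceJdbi):

class TestAccountService extends AccountServiceJdbi {

    private final AccountDao mockDao = new MockAccountDao(); // hypothetical test double

    @Override
    AccountDao accountDao() {
        return mockDao;
    }

    // addAccount(...) is final in AccountServiceJdbi, so it cannot be overridden here
}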

HOWEVER…

If I try to follow this pattern in JDBI 3 I get the following error:

java.lang.IllegalArgumentException: On-demand extensions are only supported for interfaces

As far as I can tell, these are my options in JDBI 3:

1. Use default methods in interfaces:

public interface AccountServiceJdbi extends AccountService {
    
    @Override  
    @Transaction  
    default void addAccount(Account account, User user) {
        long accountId =  accountDao().insertAccount(account);
        accountDao().linkAccountToOwner(accountId, user.getId());
    }
    
    @CreateSqlObject
    AccountDao accountDao();
}

This may look similar to what I was doing in JDBI 2, but it feels WRONG. I’ve lost all control over concrete implementations of this interface. You can’t make default methods final, so if I create an implementation for testing, I can’t guarantee that some of the default methods won’t be overridden.
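For example, nothing stops a hand-rolled test implementation doing this (MockAccountDao and the class below are hypothetical, shown only to illustrate the risk):

class MockAccountService implements AccountServiceJdbi {

    @Override
    public void addAccount(Account account, User user) {
        // silently replaces the transactional default method – nothing prevents this
    }

    @Override
    public AccountDao accountDao() {
        return new MockAccountDao(); // hypothetical test double
    }
}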

Furthermore, in the abstract class the `accountDao` method has default (package-private) access, so it can't be called from outside the package. In JDBI 3 I lose this restriction; all methods in an interface are public, so my AccountDao can be accessed from outside the package.

It just feels less concise; I am less able to prescribe how the classes should be used. More commenting and self-enforced discipline will be required.

2. Handle transactions programmatically:

public final class AccountServiceJdbi implements AccountService {

    // the Jdbi instance is supplied from outside, e.g. via the constructor
    private final Jdbi jdbi;

    public AccountServiceJdbi(Jdbi jdbi) {
        this.jdbi = jdbi;
    }

    @Override
    public void addAccount(Account account, User user) {
        jdbi.useTransaction(handle -> {
            AccountDao accountDao = handle.attach(AccountDao.class);
            long accountId = accountDao.insertAccount(account);
            accountDao.linkAccountToOwner(accountId, user.getId());
        });
    }
}

Because this is a concrete implementation, I can make it final and prohibit unintended subclassing. There is no unintended access to DAO classes.

However, the transaction handling code is entwined with the business logic, making it difficult to unit test this class without connecting to a database. I can't easily create mock implementations of the DAO classes to test the business logic in isolation, because the DAOs are instantiated within the business logic. I guess I could do this with a proper mocking framework, but personally I'd much rather implement simple mocks myself for testing.

SO…

I’m still stalled. I don’t like either of these approaches, and I’m not very good at writing code I don’t like. I could go back to JDBI 2, but that doesn’t seem right either. Suggestions welcome…

Executing JavaScript on the JVM with Nashorn

We’ve been using Nashorn for a while to execute some very simple JavaScript expressions. The latest challenge was to run some JavaScript from a Node.js project, which in turn had dependencies on a third-party package. There were a few gotchas which I thought I’d share (disclaimer: I’m not a JavaScript developer, so this might all be obvious stuff to some).

The original JavaScript was in a plain old text file, with functions at the top level:

function testThing(thing) {
   return thing === 'tiger';
}

This is how I was executing it:

ScriptEngine engine = new ScriptEngineManager().getEngineByName("nashorn");
String js = IOUtils.toString(this.getClass().getClassLoader().getResource("my-script.js"), "UTF-8");            
engine.eval(js);
Invocable invocable = (Invocable) engine;
Object result = invocable.invokeFunction("testThing", theThing);

Running JavaScript generated from a Node.js project required a bit of tinkering:

1. In Java 8, Nashorn will only load ES5-compatible JavaScript

I was running browserify to create a single JavaScript file from the Node.js project. Trying to load the generated file instead of the plain JavaScript failed miserably:

javax.script.ScriptException: :1079:0 Expected : but found }

To get Nashorn to evaluate the JS file, I had to transpile it to ES5 using Babel.

I decided I needed to run browserify in standalone mode, so that my exported functions were attached to a global variable. The scripts section in package.json had the following:

"build": "browserify src/main.js -r --standalone MyLib -o dist/build.js",
"transpile": "babel dist/build.js --out-file dist/build.es5.js"

This was a small step forward, but Nashorn was still not able to load the script successfully…

2. Nashorn cannot set property of undefined!

Loading the transpiled file in Nashorn threw the following:

javax.script.ScriptException: TypeError: Cannot set property “MyLib” of undefined in <eval> at line number 19

Looking at the beginning of the transpiled file, you can see it is trying to work out a suitable place to attach the global variable:

(function (f) {
  if ((typeof exports === "undefined" ? "undefined" : _typeof(exports)) === "object" && typeof module !== "undefined") {
    module.exports = f();
  } else if (typeof define === "function" && define.amd) {
    define([], f);
  } else {
    var g;if (typeof window !== "undefined") {
      g = window;
    } else if (typeof global !== "undefined") {      
      g = global;
    } else if (typeof self !== "undefined") {
      g = self;
    } else {
      g = this;
    }
    g.MyLib = f();
  }
})

A little bit of detective work using print() showed that it was falling through to the final “else” as it couldn’t find anything else to attach to. Adding this line to the Java, before parsing the file, magicked away this problem.

engine.eval("var global = this;");

(It didn’t quite make sense to me at first why this works, when without this line “this” appears to be undefined. My guess is that the transpiled file runs in strict mode, so inside the UMD wrapper “this” is undefined; the eval above runs at the top level of the script engine, where “this” is still Nashorn’s global object, so it creates a “global” variable for the wrapper to attach MyLib to.)

3. Executing methods of objects is different from executing functions

At this point, the file was loading without an error, but trying to execute the function using the original Java code with the new JavaScript file threw a NoSuchMethodException

java.lang.NoSuchMethodException: No such function testThing

This makes sense, as “testThing” is now a method of the MyLib global variable. However, none of the permutations I tried for accessing the function worked. I tried all sorts, e.g.:

Object result = invocable.invokeFunction("global.MyLib.testThing", theThing);

The key issue here is that testThing isn’t a top-level function any more; it is a method on an object, so we have to use invokeMethod:

Object result = invocable.invokeMethod(engine.eval("global.MyLib"), "testThing", theThing);

And that worked :)

Not sure if any of this is the correct way to approach executing Node.js-based JavaScript on the JVM, but these were the three gotchas that took me a while to work out.

Partial updates of JSON data in Postgres (using JDBI)

With Postgres 9.5 comes the jsonb_set function, for updating a single key within a JSON column. Hooray!

A sample bit of SQL might look like this:

update mytable 
set myfield = jsonb_set(myfield,'{key, subkey}', '"new string value"'::jsonb) 
where id = 5

I’ve put a text value in the example, but the new value can be an entire JSON structure.

I’ve posted previously on using JSON and Postgres with JDBI. To use the jsonb_set function, we need to reuse the BindJson annotation covered in that post. The jsonb_set function also takes an array parameter, defining the path to the key to be set. For this I wrote a new Bind annotation:

@BindingAnnotation(BindTextArray.JsonBinderFactory.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindTextArray {
    
    String value();

    public static class JsonBinderFactory implements BinderFactory {

        @Override
        public Binder build(Annotation annotation) {
            return new Binder<BindTextArray, String[]>() {
                
                @Override
                public void bind(SQLStatement sqlStatement, BindTextArray bind, String[] array) {
                    try {
                        String fieldName = bind.value();
                        Connection con = sqlStatement.getContext().getConnection();                        
                        Array sqlArray = con.createArrayOf("text", array);
                        sqlStatement.bindBySqlType(fieldName, sqlArray, Types.ARRAY);
                    } catch (SQLException ex) {
                        throw new IllegalStateException("Error Binding Array",ex);
                    }
                }
            };
        }
    }
}

(Code based on this post: http://codeonthecobb.com/2015/04/18/using-jdbi-with-postgres-arrays/).

Here’s the DAO for the SQL example above, using the new Bind annotation:

 
@SqlUpdate("update mytable set myfield = jsonb_set(myfield, :path,:json) where id = :id")
void updateMyTable(@Bind("id") int id, @BindTextArray("path") String[] path, @BindJson("json") String json)
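To match the SQL example at the top of the post, a call might look like this (assuming the method above lives in an interface called MyDao, obtained on demand as usual):

MyDao dao = dbi.onDemand(MyDao.class);

// equivalent to: jsonb_set(myfield, '{key, subkey}', '"new string value"') ... where id = 5
dao.updateMyTable(5, new String[]{"key", "subkey"}, "\"new string value\"");

// the new value can also be a whole JSON structure
dao.updateMyTable(5, new String[]{"key", "subkey"}, "{\"nested\": \"value\"}");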

As you can see, there are limitations to this functionality. We can’t update two different elements of the same JSON column in a single call, so if you want to do that, you still need to do it in code. However, the new syntax is handy if you want to update one section of your JSON document without loading the whole document into your application first.

Using JDBI with Postgres JSON data

I’ve been migrating some raw JDBC code over to JDBI, and joyfully stripping out lines of boilerplate code for preparing statements, opening record sets, sometimes remembering to close them, handling SQL exceptions which won’t ever occur anyway, and so on. Using the SQL Object API means the only code you have to write is the SQL and a ResultSetMapper to determine how to create your domain objects from the resultset. It really promotes adherence to the single responsibility principle and discourages you from mixing logic in with your database access code.

The database in question has a number of fields containing JSON data. More specifically, they use the PostgreSQL jsonb data type. This has required a little more tinkering to get working.

Inserting jsonb data

Out of the box, JDBI provides two annotations for binding parameters. The @Bind annotation binds a single named argument, and @BindBean binds bean properties with matching names. However, to insert jsonb data, you need to first create a PGobject instance and bind that. To do this, I created a new binding annotation, following the guidance here: http://jdbi.org/sql_object_api_argument_binding/

The annotation code looks like this:

@BindingAnnotation(BindJson.JsonBinderFactory.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindJson {
    String value();

    public static class JsonBinderFactory implements BinderFactory {
        @Override
        public Binder build(Annotation annotation) {
            return new Binder<BindJson, String>() {                
                @Override
                public void bind(SQLStatement q, BindJson bind, String jsonString) {
                    try {
                        PGobject data = new PGobject();
                        data.setType("jsonb");
                        data.setValue(jsonString);
                        q.bind(bind.value(), data);                        
                    } catch (SQLException ex) {
                        throw new IllegalStateException("Error Binding JSON",ex);
                    }
                }
            };
        }
    }
}

To use it, annotate the json parameter with the new annotation:

@SqlUpdate("insert into my_table (id,data) VALUES (:id,:data)")
void insertJson(@Bind("id") int id, @BindJson("data") String jsonString);

And that’s it; it just works.
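For example, assuming the method lives in an interface called MyDao (a made-up name) and the DAO is created on demand as usual:

MyDao dao = dbi.onDemand(MyDao.class);
dao.insertJson(1, "{\"element\": {\"key1\": \"value1\"}}");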

Querying json dynamically

I had a requirement where the parameter supplied to the query was the name of the json element to return. For example, consider the json below. I wanted to be able to parameterise a query to return the value of any one of the keys.

{
   "element": {
      "key1": "value1",
      "key2": "value2",
      "key3": "value3"
   }
}

Using raw JDBC it was possible (although not very pretty) to concatenate a suitable SQL statement and then execute it:

String sql = "select data->'element1'->'" + subKeyName + "' as value from mytable"
...

This isn’t possible when the SQL string is specified as a JDBI annotation. I found some useful Postgres json processing functions, including jsonb_extract_path_text which allows you to bind parameters normally:

@SqlQuery("select jsonb_extract_path_text(data,'element1',:subKeyName) as value from mytable")
List<String> getSubKey(@Bind("subKeyName") String subKeyName)
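With the sample document above, a call like this should come back with “value2” (same hypothetical MyDao interface as before):

MyDao dao = dbi.onDemand(MyDao.class);
List<String> values = dao.getSubKey("key2"); // "value2" for the sample document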

So far I haven’t come across any other issues using JDBI with a PostgreSQL JSON data store. I’m looking forward to trying out the new jsonb functionality in PostgreSQL 9.5, which supports writing partial updates to json fields, yippee!

Immutability and Collections.unmodifiableList

Or: why is my list still mutable?

Today I learnt a new thing about Collections.unmodifiableList.

First, I noted that Sonar wasn’t raising a violation where I thought it should.

Take the sample class below

public final class MyImmutableClass {
    
    private final List<String> myList;

    public MyImmutableClass(List<String>myList) {
        this.myList = myList;
    }

    public List<String> getMyList() {
        return myList;
    }
    
}

I would expect sonar to raise the following:

Malicious code vulnerability - May expose internal representation by incorporating reference to mutable object (Findbugs: EI_EXPOSE_REP2)
Malicious code vulnerability - May expose internal representation by returning reference to mutable object (Findbugs: EI_EXPOSE_REP)

But it didn’t. This is a mystery in itself, as I am sure it has picked me up on this before.

Anyway, no problem, I wrote a test for the class myself:

    /**
     * Test of getMyList method, of class MyImmutableClass.
     */
    @Test
    public void testGetMyList() {
        List<String> list = new ArrayList<String>();
        list.add("Potato");
        list.add("Apple");
        
        MyImmutableClass instance = new MyImmutableClass(list);
        
        assertEquals(2,instance.getMyList().size());
        
        //check for immutability
        list.add("item 3");
        assertEquals(2,instance.getMyList().size());
        
        try {
            instance.getMyList().add("Message3");
            fail("Should not be possible");
        } catch (Exception ex) {
            assertTrue( ex instanceof UnsupportedOperationException);
        }
    }

Of course, the test fails with the code as it is.

So I modified it to this:

public final class MyImmutableClass {
    
    private final List<String> myList;

    public MyImmutableClass(List<String>myList) {
        this.myList = Collections.unmodifiableList(myList);
    }

    public List<String> getMyList() {
        return Collections.unmodifiableList(myList);
    }
    
}

And it STILL failed. I was very confused. How come Collections.unmodifiableList in the constructor wasn’t stopping the list inside the class from changing?

It took some googling to find the answer. For example: this Stack Overflow post.
If you pay proper attention to the javadoc for Collections.unmodifiableList, it makes sense.


     * Returns an unmodifiable view of the specified list.  This method allows
     * modules to provide users with "read-only" access to internal
     * lists.  Query operations on the returned list "read through" to the
     * specified list, and attempts to modify the returned list, whether
     * direct or via its iterator, result in an
     * UnsupportedOperationException.

So, this just wraps the original list inside an unmodifiable view. I can’t modify the one inside my class, but if I modify the one I used to create the class, the view reflects the update.
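A minimal standalone illustration of the view behaviour:

List<String> original = new ArrayList<String>();
original.add("Potato");
original.add("Apple");

List<String> view = Collections.unmodifiableList(original);

original.add("Banana");          // modifying the backing list is still allowed
System.out.println(view.size()); // prints 3 – the view reflects the change

view.add("Carrot");              // throws UnsupportedOperationException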

The correct way to make my class immutable is:

public final class MyImmutableClass {
    
    private final List<String> myList;

    public MyImmutableClass(List<String>myList) {
        this.myList = Collections.unmodifiableList(new ArrayList<String>(myList));
    }

    public List<String> getMyList() {
        return Collections.unmodifiableList(myList);
    }
    
}

Phew. As an afterthought, because Sonar and FindBugs did not pick this up, I’m thinking of taking a look at this: http://mutability-detector.blogspot.co.uk/. We like to make all classes immutable, unless there is a good reason not to. It would be interesting to see what else has slipped through.

Sharing junit tests with Maven

We’ve now had two cases at work where we have created a set of JUnit tests that we want to re-use in other Maven modules. It is a two-stage process, and there are a couple of options for how to run the tests in the project that imports them.

Creating re-usable tests

First, we create the tests in the ‘tester-module’. These are standard unit tests in src/test/java. Of course, they must pass, or your project will not build, so you need some sample implementations of what you are testing within that project. For ease of use, we put all the tests which will be exported into a single package.
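For example, one of the exported tests might look something like this (the names below are made up, just to show the shape):

package com.example.exported_tests; // hypothetical package name

import static org.junit.Assert.assertNotNull;

import org.junit.Test;

public class WidgetContractTest {

    @Test
    public void widgetAlwaysHasAnId() {
        // Widget and SampleWidgets are stand-ins for whatever the module actually tests
        Widget widget = SampleWidgets.createValid();
        assertNotNull(widget.getId());
    }
}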

To create a test-jar, which can be a dependency for other projects, add to the pom:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.3.1</version>
    
    <executions>
        <execution>           
            <id>test-jar</id>
            <goals>
                <goal>test-jar</goal>
            </goals>
            <configuration>
                <includes>
                    <include>**/exported_tests/*</include>
                </includes>
            </configuration>
        </execution>       
    </executions>    
</plugin>

This will generate a test-jar containing only the tests in the package specified.

Importing the tests

To import the test to the module which will use them, the following dependency is added.

<dependency>
    <groupId>tester-group</groupId>
    <artifactId>tester-module</artifactId>
    <version>1.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>

Running the tests

There are two ways of running the attached tests. They DO NOT get run automatically when you run mvn test.

Method 1. Extracting the dependency

Adding the following to the pom will extract the test classes into target/test-classes, so that they all get run when you run mvn test. This works well if you always want to run ALL the attached tests.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack</id>
            <phase>process-test-classes</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>tester-group</groupId>
                        <artifactId>tester-module</artifactId>
                        <version>1.0</version>
                        <type>test-jar</type>
                        <outputDirectory>${project.build.directory}/test-classes</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>

Method 2. Using test suites

If you want to run certain tests only, you can add a TestSuite to the module, and define which of the attached tests should be run as part of the test suite.

@RunWith(Suite.class)
@Suite.SuiteClasses({TestClass1.class, TestClass2.class})
public class TestSuite {
  //nothing
}

An afterthought: testing the tests

In the situations where we are re-using our tests, the tests themselves become a critical component of our systems. We wanted to be able to test the tests too. We got round this by not including logic in the tests themselves. Instead, the test-module contains a set of ‘validators’, in src/main. We can then write tests for these ‘validators’ as usual. The tests in ‘exported_tests’ can then simply delegate to an appropriate validator, which we know has been tested.

assertTrue(validator.validate(classUnderTest));
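As a rough sketch (again with made-up names), the validator lives in src/main where it can be unit tested as normal, and the exported test just delegates to it:

// src/main/java – plain class, tested like any other code
public class WidgetValidator {
    public boolean validate(Widget widget) {
        return widget.getId() != null && !widget.getName().isEmpty();
    }
}

// src/test/java, in the exported package – no logic of its own
public class WidgetTest {

    private final WidgetValidator validator = new WidgetValidator();

    @Test
    public void widgetIsValid() {
        assertTrue(validator.validate(SampleWidgets.createValid())); // SampleWidgets is hypothetical
    }
}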

The only difference this makes is that you have to add a normal dependency to the tester-module, as well as the test-jar dependency.

We’ve found this approach very useful, as we’re using Maven and JUnit as a framework for testing other file resources. However, I think it is useful for Java code too, if you have a set of interfaces and different implementations in different modules.

References:
http://maven.apache.org/guides/mini/guide-attached-tests.html
http://softwaremavens.blogspot.co.uk/2009/09/running-tests-from-maven-test-jar-in.html

maven, junit, cobertura and BeanCreationException

I have a set of JUnit tests for my project. When I run them with mvn test, they run fine and pass.

However, when I ran them with mvn cobertura:cobertura, they failed with the following error:

org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'com.anorakgirl.MyTest': Autowiring of fields failed;
nested exception is org.springframework.beans.factory.BeanCreationException:
Could not autowire field: private com.anorakgirl.MyService com.anorakgirl.MyTest.myService;
nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException:
No unique bean of type [com.anorakgirl.MyService] is defined:
Unsatisfied dependency of type [class com.anorakgirl.MyService]:
expected at least 1 matching bean

After some googling, this is the problem: the MyService field was annotated with @Autowired, and Spring autowires by type. I have only one class of type ‘com.anorakgirl.MyService’, so when run with JUnit this works fine. However, Cobertura changes the type of the MyService class during instrumentation, so Spring no longer recognises it as type MyService, and so cannot autowire it.

There are two possible answers:

The easy answer (what I did)

In your test application context add the line:

<aop:config proxy-target-class="true"/>

The other answer (what I read about)

Make sure the classes that you want to autowire all implement interfaces. Then autowire the interface rather than the implementation. The Cobertura class will still implement the same interface and therefore can still be autowired.

I didn’t try this as the service classes do not have interfaces and I haven’t time to add them!
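For reference, that approach would look something like this (a rough, untested sketch; MyServiceImpl is a made-up name):

public interface MyService {
    void doSomething();
}

// declared as a bean in the application context
public class MyServiceImpl implements MyService {
    public void doSomething() {
        // ...
    }
}

public class MyTest {

    // autowire the interface rather than the concrete class; the instrumented
    // class still implements MyService, so Spring can match it by type
    @Autowired
    private MyService myService;
}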

Tomcat Error listenerStart

I’m sure I’m not the only one who has battled with this startup error:

4985 [main] ERROR org.apache.catalina.core.StandardContext  – Error listenerStart
4985 [main] ERROR org.apache.catalina.core.StandardContext  – Context [/mycontext] startup failed due to previous errors

The problem is getting the real cause to appear in the log. I’ve had this error various times in the past (must be repeating the same mistakes). Today’s answer was simple:

  1. Remove log4j.jar from myapp/WEB-INF/lib (it is already in Tomcat/common/lib)
  2. Restart app.
  3. Debug using the reams of useful error messages which suddenly appear in the output.

Spring Security: Method Level Security with JSF so far…

My personal gotchas, in case they are of use to anyone else:

1. Ensure you have compatible versions of Spring and Spring Security. I am using Spring Security 2.0.4 and Spring 2.5.6. Originally my Spring was a slightly older version (2.5) and I got the following error:

java.lang.NoSuchMethodError:
org.springframework.aop.config.AopNamespaceUtils.registerAutoProxyCreatorIfNecessary

I fixed this by upgrading to the latest Spring. I think the problem was resolved in Spring 2.5.2 and relates to this bug: http://jira.springframework.org/browse/SPR-4459

2. Make sure the methods you are securing are actually in Spring-managed beans, doh! My @Secured annotation was being ignored entirely, and it took me ages to realise why – some of my beans are still defined in faces-config files, so Spring has no way of knowing about them. Moving the beans into the Spring configuration fixed the problem straight away.
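In other words, something like the bean below only gets its @Secured annotation honoured if Spring itself creates it (hypothetical class, shown for illustration):

public class AccountManager { // must be declared in the Spring application context, not faces-config

    @Secured({"ROLE_ADMIN"})
    public void deleteAccount(long accountId) {
        // ...
    }
}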

Spring Security 2.0: NTLM filter with custom UserDetailsService

I used this blog to get started: http://blog.mediasoft.be/ntlm-with-spring-security-20/

My application context is pretty much as per the Spring 2.0 configuration at the bottom of the post, with the following changes:

  • Different properties for the ntlm filter
  • servlet-api-provision="false" on the <security:http> tag
  • Addition of a custom UserDetailsService.

The full config is below:

<security:authentication-manager alias="_authenticationManager" />

    <bean id="authenticationProvider"
     class="com.mydomain.security.UserDetailsAuthenticationProvider">
        <security:custom-authentication-provider />
        <property name="userDetailsService" ref="userDetailsService" />
    </bean>

    <bean id="userDetailsService" class="com.mydomain.security.UserDetailsServiceImpl">
        <property name="daoUser" ref="daoUser" />
    </bean>

    <bean id="ntlmFilter" class="org.springframework.security.ui.ntlm.NtlmProcessingFilter">
        <security:custom-filter position="NTLM_FILTER" />
        <property name="stripDomain" value="true" />
        <property name="defaultDomain" value="mydomain.com" />
        <property name="domainController" value="mycontroller" />
        <property name="authenticationManager" ref="_authenticationManager" />
    </bean>

    <bean id="ntlmEntryPoint"
     class="org.springframework.security.ui.ntlm.NtlmProcessingFilterEntryPoint">
        <property name="authenticationFailureUrl" value="/denied.xhtml" />
    </bean>

    <bean id="exceptionTranslationFilter"
     class="org.springframework.security.ui.ExceptionTranslationFilter">
        <property name="authenticationEntryPoint" ref="ntlmEntryPoint" />
    </bean>

    <security:http access-denied-page="/denied.xhtml"
     entry-point-ref="ntlmEntryPoint" servlet-api-provision="false">
        <security:intercept-url pattern="/denied.xhtml" filters="none" />
        <security:intercept-url pattern="/**" access="ROLE_USER" />
    </security:http>

My UserDetailsAuthenticationProvider is exactly as per the blog.

My UserDetailsService implementation is below. Note that the UserDetails created should have a blank password. This confused me for a bit.

public class UserDetailsServiceImpl implements UserDetailsService {
	private UserDAO daoUser;
	private static Logger log = Logger.getLogger(UserDetailsService.class);

	public UserDAO getDaoUser() {
		return daoUser;
	}

	public void setDaoUser(UserDAO daoUser) {
		this.daoUser = daoUser;
	}

	public UserDetails loadUserByUsername(String username)
			throws UsernameNotFoundException, DataAccessException {

		MyUser user;

		try {
			user = daoUser.getUser(username);
		} catch (DAOException ex) {
			throw new DataRetrievalFailureException(ex.getMessage());
		}

		if (user != null) {
			ArrayList<GrantedAuthority> ga = new ArrayList<GrantedAuthority>();
			ga.add(new GrantedAuthorityImpl("ROLE_USER"));
			GrantedAuthority[] grantedAuthorities = new GrantedAuthority[ga.size()];
			ga.toArray(grantedAuthorities);

			UserDetailsImpl ud = new UserDetailsImpl(username, "", true, grantedAuthorities, user);
			return ud;
		} else {
			throw new UsernameNotFoundException("Username Not Found");
		}
	}
}

My UserDetailsImpl simply extends org.springframework.security.userdetails.User and has an additional field for my ‘MyUser’:

public class UserDetailsImpl extends org.springframework.security.userdetails.User {

	private static final long serialVersionUID = 1584153957347843760L;

	private MyUser user;

	public UserDetailsImpl(String username, String password, boolean enabled,
			 GrantedAuthority[] authorities, MyUser user)
			throws IllegalArgumentException {
		super(username, password, enabled, true, true,
				true, authorities);
		this.user = user;
	}

	public MyUser getUser() {
		return user;
	}

	public void setUser(MyUser user) {
		this.user = user;
	}
}

And that seems to work. Now I am trying to enable method level security, so more to come soon…