Partial updates of JSON data in Postgres (using JDBI)

With Postgres 9.5 comes the jsonb_set function, for updating a single key within a JSON column. Hooray!

A sample bit of SQL might look like this:

update mytable 
set myfield = jsonb_set(myfield,'{key, subkey}', '"new string value"'::jsonb) 
where id = 5

I’ve put a text value in the example, but the new value can be an entire JSON structure.
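For example, the same statement can replace a whole subtree in one go (a sketch; table, column and key names are illustrative):

```sql
-- Replace the entire object under "key", not just one scalar value
update mytable
set myfield = jsonb_set(myfield, '{key}', '{"subkey": "x", "other": 1}'::jsonb)
where id = 5;
```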

I’ve posted previously on using JSON and Postgres with JDBI. To use the jsonb_set function, we need to reuse the BindJson annotation covered in that post. The jsonb_set function also takes an array parameter, defining the path to the key to be set. For this I wrote a new Bind annotation:

@BindingAnnotation(BindTextArray.JsonBinderFactory.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindTextArray {
    
    String value();

    public static class JsonBinderFactory implements BinderFactory {

        @Override
        public Binder build(Annotation annotation) {
            return new Binder<BindTextArray, String[]>() {
                
                @Override
                public void bind(SQLStatement sqlStatement, BindTextArray bind, String[] array) {
                    try {
                        String fieldName = bind.value();
                        Connection con = sqlStatement.getContext().getConnection();                        
                        Array sqlArray = con.createArrayOf("text", array);
                        sqlStatement.bindBySqlType(fieldName, sqlArray, Types.ARRAY);
                    } catch (SQLException ex) {
                        throw new IllegalStateException("Error Binding Array",ex);
                    }
                }
            };
        }
    }
}

(Code based on this post: http://codeonthecobb.com/2015/04/18/using-jdbi-with-postgres-arrays/).

Here’s the DAO for the SQL example above, using the new Bind annotation:

 
@SqlUpdate("update mytable set myfield = jsonb_set(myfield, :path,:json) where id = :id")
void updateMyTable(@Bind("id") int id, @BindTextArray("path") String[] path, @BindJson("json") String json);

As you can see, there are limitations to this functionality. We can’t update two different elements in the same JSON column, so if you want to do that, you still need to do it in code. However, the new syntax is handy if you want to update one section of your JSON document, without loading the whole thing into your code base.
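At the plain SQL level there is a partial workaround: jsonb_set calls can be nested to touch two keys in one statement (a sketch; it doesn't fit the single-annotation DAO pattern above, so in JDBI you'd still handle this in code):

```sql
update mytable
set myfield = jsonb_set(jsonb_set(myfield, '{key1}', '"a"'), '{key2}', '"b"')
where id = 5;
```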

  

Using JDBI with Postgres JSON data

I’ve been migrating some raw JDBC code over to JDBI, and joyfully stripping out lines of boilerplate code for preparing statements, opening record sets, sometimes remembering to close them, handling SQL exceptions which won’t ever occur anyway, and so on. Using the SQL Object API means the only code you have to write is the SQL and a ResultSetMapper to determine how to create your domain objects from the resultset. It really promotes adherence to the single responsibility principle and discourages you from mixing logic in with your database access code.

The database in question has a number of fields containing JSON data. More specifically, they use the PostgreSQL jsonb data type. This has required a little more tinkering to get working.

Inserting jsonb data

Out of the box, JDBI provides two annotations for binding parameters. The @Bind annotation binds a single named argument, and @BindBean binds bean properties with matching names. However, to insert jsonb data, you need to first create a PGobject instance and bind that. To do this, I created a new binding annotation, following the guidance here: http://jdbi.org/sql_object_api_argument_binding/

The annotation code looks like this:

@BindingAnnotation(BindJson.JsonBinderFactory.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindJson {
    String value();

    public static class JsonBinderFactory implements BinderFactory {
        @Override
        public Binder build(Annotation annotation) {
            return new Binder<BindJson, String>() {                
                @Override
                public void bind(SQLStatement q, BindJson bind, String jsonString) {
                    try {
                        PGobject data = new PGobject();
                        data.setType("jsonb");
                        data.setValue(jsonString);
                        q.bind(bind.value(), data);                        
                    } catch (SQLException ex) {
                        throw new IllegalStateException("Error Binding JSON",ex);
                    }
                }
            };
        }
    }
}

To use it, annotate the json parameter with the new annotation:

@SqlUpdate("insert into my_table (id,data) VALUES (:id,:data)")
void insertJson(@Bind("id") int id, @BindJson("data") String jsonString);

And that’s it; it just works.

Querying json dynamically

I had a requirement where the parameter supplied to the query was the name of the JSON element to return. For example, consider the JSON below. I wanted to be able to parameterise a query to return any one of the key values.

{
   "element": {
      "key1": "value1",
      "key2": "value2",
      "key3": "value3"
   }
}

Using raw JDBC it was possible (although not very pretty) to concatenate a suitable SQL statement and then execute it:

String sql = "select data->'element'->'" + subKeyName + "' as value from mytable";
...

This isn’t possible when the SQL string is specified as a JDBI annotation. I found some useful Postgres json processing functions, including jsonb_extract_path_text which allows you to bind parameters normally:

@SqlQuery("select jsonb_extract_path_text(data,'element',:subKeyName) as value from mytable")
List<String> getSubKey(@Bind("subKeyName") String subKeyName);
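For reference, jsonb_extract_path_text is equivalent to the #>> operator with a text-array path, so the bound query should behave the same as this plain SQL (column and key names as in the example above):

```sql
select jsonb_extract_path_text(data, 'element', 'key2') as value from mytable;
-- equivalent operator form:
select data #>> '{element,key2}' as value from mytable;
```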

So far I haven’t come across any other issues using JDBI with a PostgreSQL JSON data store. I’m looking forward to trying out the new jsonb functionality in PostgreSQL 9.5 which supports writing partial updates to json fields, yippee!

  

Immutability and Collections.unmodifiableList

Or: why is my list still mutable?

Today I learnt a new thing about Collections.unmodifiableList.

First, I noted that Sonar wasn’t raising a violation where I thought it should.

Take the sample class below

public final class MyImmutableClass {
    
    private final List<String> myList;

    public MyImmutableClass(List<String> myList) {
        this.myList = myList;
    }

    public List<String> getMyList() {
        return myList;
    }
    
}

I would expect Sonar to raise the following:

Malicious code vulnerability - May expose internal representation by incorporating reference to mutable object (Findbugs: EI_EXPOSE_REP2)
Malicious code vulnerability - May expose internal representation by returning reference to mutable object (Findbugs: EI_EXPOSE_REP)

But it didn’t. This is a mystery in itself, as I am sure it has picked me up on this before.

Anyway, no problem, I wrote a test for the class myself:

/**
 * Test of getMyList method, of class MyImmutableClass.
 */
@Test
public void testGetMyList() {
    List<String> list = new ArrayList<String>();
    list.add("Potato");
    list.add("Apple");

    MyImmutableClass instance = new MyImmutableClass(list);

    assertEquals(2, instance.getMyList().size());

    // check for immutability
    list.add("item 3");
    assertEquals(2, instance.getMyList().size());

    try {
        instance.getMyList().add("Message3");
        fail("Should not be possible");
    } catch (Exception ex) {
        assertTrue(ex instanceof UnsupportedOperationException);
    }
}

Of course, the test fails with the code as it is.

So I modified it to this:

public final class MyImmutableClass {
    
    private final List<String> myList;

    public MyImmutableClass(List<String> myList) {
        this.myList = Collections.unmodifiableList(myList);
    }

    public List<String> getMyList() {
        return Collections.unmodifiableList(myList);
    }
    
}

And it STILL failed. I was very confused. How come Collections.unmodifiableList in the constructor wasn’t stopping the list inside the class from changing?

It took some googling to find the answer. For example: this stackoverflow post.
If you pay proper attention to the javadoc for Collections.unmodifiableList, it makes sense.


     * Returns an unmodifiable view of the specified list.  This method allows
     * modules to provide users with "read-only" access to internal
     * lists.  Query operations on the returned list "read through" to the
     * specified list, and attempts to modify the returned list, whether
     * direct or via its iterator, result in an
     * UnsupportedOperationException.

So, this just wraps the original list inside an unmodifiable view. I can’t modify the one inside my class, but if I modify the one I used to create the class, the view reflects the update.
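The view behaviour can be shown in a few lines of plain Java (a minimal sketch, separate from the class above):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class UnmodifiableViewDemo {

    // Size seen through an unmodifiable view after the backing list changes.
    static int sizeThroughView() {
        List<String> backing = new ArrayList<String>();
        backing.add("one");
        List<String> view = Collections.unmodifiableList(backing);
        backing.add("two"); // mutating the backing list is visible via the view
        return view.size();
    }

    // Size of a defensive copy taken before the backing list changes.
    static int sizeOfDefensiveCopy() {
        List<String> backing = new ArrayList<String>();
        backing.add("one");
        List<String> copy = Collections.unmodifiableList(new ArrayList<String>(backing));
        backing.add("two"); // the copy is insulated from this change
        return copy.size();
    }

    public static void main(String[] args) {
        System.out.println(sizeThroughView());     // 2: the view reflects the update
        System.out.println(sizeOfDefensiveCopy()); // 1: the copy does not
    }
}
```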

The correct way to make my class immutable is:

public final class MyImmutableClass {
    
    private final List<String> myList;

    public MyImmutableClass(List<String> myList) {
        this.myList = Collections.unmodifiableList(new ArrayList<String>(myList));
    }

    public List<String> getMyList() {
        return Collections.unmodifiableList(myList);
    }
    
}

Phew. As an afterthought, because sonar and findbugs did not pick this up, I’m thinking of taking a look at this: http://mutability-detector.blogspot.co.uk/. We like to make all classes immutable, unless there is a good reason not to. It would be interesting to see what else has slipped through.

  

Sharing junit tests with Maven

We’ve now had two cases at work where we have created a set of junit tests, which we want to re-use in other maven modules. It is a two stage process, and there are a couple of options for how to run the tests in the project which imports them.

Creating re-usable tests

First, we create the tests in the ‘tester-module’. These are standard unit tests in src/test/java. Of course, they must pass, or your project will not build, so you need some sample implementations of what you are testing within that project. For ease of use, we put all the tests which will be exported into a single package.

To create a test-jar, which can be a dependency for other projects, add to the pom:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.3.1</version>
    
    <executions>
        <execution>           
            <id>test-jar</id>
            <goals>
                <goal>test-jar</goal>
            </goals>
            <configuration>
                <includes>
                    <include>**/exported_tests/*</include>
                </includes>
            </configuration>
        </execution>       
    </executions>    
</plugin>

This will generate a test-jar containing only the tests in the package specified.

Importing the tests

To import the test to the module which will use them, the following dependency is added.

<dependency>
    <groupId>tester-group</groupId>
    <artifactId>tester-module</artifactId>
    <version>1.0</version>
    <type>test-jar</type>
    <scope>test</scope>
</dependency>

Running the tests

There are two ways of running the attached tests. They DO NOT get run automatically when you run mvn test.

Method 1. Extracting the dependency

Adding the following to the pom will extract the test classes into target/test-classes, so that they all get run when you run mvn test. This works well if you always want to run ALL the attached tests.

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack</id>
            <phase>process-test-classes</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>tester-group</groupId>
                        <artifactId>tester-module</artifactId>
                        <version>1.0</version>
                        <type>test-jar</type>
                        <outputDirectory>${project.build.directory}/test-classes</outputDirectory>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>

Method 2. Using test suites

If you want to run certain tests only, you can add a TestSuite to the module, and define which of the attached tests should be run as part of the test suite.

@RunWith(Suite.class)
@Suite.SuiteClasses({TestClass1.class, TestClass2.class})
public class TestSuite {
  //nothing
}

An afterthought: testing the tests

In the situations where we are re-using our tests, the tests themselves become a critical component of our systems. We wanted to be able to test the tests too. We got round this by not including logic in the tests themselves. Instead, the test-module contains a set of ‘validators’, in src/main. We can then write tests for these ‘validators’ as usual. The tests in ‘exported_tests’ can then simply delegate to an appropriate validator, which we know has been tested.

assertTrue(validator.validate(classUnderTest));
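A minimal sketch of that split (class and method names are invented for illustration): the validator lives in src/main and holds the actual logic, so it can be unit tested on its own, while the exported test simply delegates to it.

```java
// In src/main of the tester-module: a plain class containing the check,
// unit-testable directly like any other production code.
public class NonEmptyValidator {
    public boolean validate(String value) {
        return value != null && !value.trim().isEmpty();
    }
}

// The exported test (in the exported_tests package) would then just delegate:
//   assertTrue(new NonEmptyValidator().validate(classUnderTest.getName()));
```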

The only difference this makes is that you have to add a normal dependency to the tester-module, as well as the test-jar dependency.

We’ve found this approach very useful, as we’re using maven and junit as a framework for testing other file resources. However I think it is useful for java code too, if you have a set of interfaces, and different implementations in different modules.

References:
http://maven.apache.org/guides/mini/guide-attached-tests.html
http://softwaremavens.blogspot.co.uk/2009/09/running-tests-from-maven-test-jar-in.html

  

maven, junit, cobertura and BeanCreationException

I have a set of junit tests for my project. When I run them with mvn test they run fine and pass.

However, when I ran them with mvn cobertura:cobertura, they failed with the following error:

org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'com.anorakgirl.MyTest': Autowiring of fields failed;
nested exception is org.springframework.beans.factory.BeanCreationException:
Could not autowire field: private com.anorakgirl.MyService com.anorakgirl.MyTest.myService;
nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException:
No unique bean of type [com.anorakgirl.MyService] is defined:
Unsatisfied dependency of type [class com.anorakgirl.MyService]:
expected at least 1 matching bean

After some googling, I found the problem. The MyService field was annotated with @Autowired, and Spring autowires by type. I have only one class of type ‘com.anorakgirl.MyService’, so when run with plain JUnit this works fine. However, Cobertura modifies the MyService class during instrumentation, so Spring no longer recognises it as type MyService, and so cannot autowire it.

There are two possible answers:

The easy answer (what I did)

In your test application context add the line:

<aop:config proxy-target-class="true"/>

The other answer (what I read about)

Make sure the classes that you want to autowire all implement interfaces. Then autowire the interface rather than the implementation. The Cobertura class will still implement the same interface and therefore can still be autowired.
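A sketch of the shape described (untried here, and the names are invented): the test autowires the interface, and an instrumented or proxied class still satisfies it.

```java
// Autowire this interface rather than the concrete class.
public interface MyService {
    String process(String input);
}

// Concrete implementation. An instrumented or proxied version of this class
// still implements MyService, so by-type autowiring keeps working.
class MyServiceImpl implements MyService {
    public String process(String input) {
        return input.toUpperCase();
    }
}
```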

I didn’t try this as the service classes do not have interfaces and I haven’t time to add them!

  

Tomcat Error listenerStart

I’m sure I’m not the only one who has battled with this startup error:

4985 [main] ERROR org.apache.catalina.core.StandardContext  – Error listenerStart
4985 [main] ERROR org.apache.catalina.core.StandardContext  – Context [/mycontext] startup failed due to previous errors

The problem is getting the real cause of the problem to appear in the log. I’ve had this problem various times in the past (must be repeating the same mistakes). Today’s answer was simple:

  1. Remove log4j.jar from myapp/WEB-INF/lib (it is already in Tomcat/common/lib)
  2. Restart app.
  3. Debug using the reams of useful error messages which suddenly appear in the output.
  

Spring Security: Method Level Security with JSF so far…

My personal gotchas, in case they are of use to anyone else:

1. Ensure you have compatible versions of Spring and Spring Security. I am using Spring Security 2.0.4 and Spring 2.5.6. Originally my Spring was a slightly older version (2.5) and I got the following error:

java.lang.NoSuchMethodError:
org.springframework.aop.config.AopNamespaceUtils.registerAutoProxyCreatorIfNecessary

I fixed this by upgrading to the latest Spring. I think the problem was resolved in Spring 2.5.2 and relates to this bug: http://jira.springframework.org/browse/SPR-4459

2. Make sure the methods you are securing are actually in Spring-managed beans, doh! My @Secured annotation was being ignored entirely, and it took me ages to realise why: some of my beans were still in faces config files, so Spring had no way of knowing about them. Moving the beans into the Spring configuration fixed the problem straight away.

  

Spring Security 2.0: NTLM filter with custom UserDetailsService

I used this blog to get started: http://blog.mediasoft.be/ntlm-with-spring-security-20/

My application context is pretty much as per the Spring Security 2.0 configuration at the bottom of that post, with the following changes:

  • Different properties for the ntlm filter
  • servlet-api-provision="false" on the <security:http> tag
  • Addition of custom UserDetailsService.

The full config is below:

<security:authentication-manager alias="_authenticationManager" />

    <bean id="authenticationProvider"
     class="com.mydomain.security.UserDetailsAuthenticationProvider">
        <security:custom-authentication-provider />
        <property name="userDetailsService" ref="userDetailsService" />
    </bean>

    <bean id="userDetailsService" class="com.mydomain.security.UserDetailsServiceImpl">
        <property name="daoUser" ref="daoUser" />
    </bean>

    <bean id="ntlmFilter" class="org.springframework.security.ui.ntlm.NtlmProcessingFilter">
        <security:custom-filter position="NTLM_FILTER" />
        <property name="stripDomain" value="true" />
        <property name="defaultDomain" value="mydomain.com" />
        <property name="domainController" value="mycontroller" />
        <property name="authenticationManager" ref="_authenticationManager" />
    </bean>

    <bean id="ntlmEntryPoint"
     class="org.springframework.security.ui.ntlm.NtlmProcessingFilterEntryPoint">
        <property name="authenticationFailureUrl" value="/denied.xhtml" />
    </bean>

    <bean id="exceptionTranslationFilter"
     class="org.springframework.security.ui.ExceptionTranslationFilter">
        <property name="authenticationEntryPoint" ref="ntlmEntryPoint" />
    </bean>

    <security:http access-denied-page="/denied.xhtml"
     entry-point-ref="ntlmEntryPoint" servlet-api-provision="false">
        <security:intercept-url pattern="/denied.xhtml" filters="none" />
        <security:intercept-url pattern="/**" access="ROLE_USER" />
    </security:http>

My UserDetailsAuthenticationProvider is exactly as per the blog.

My UserDetailsService is below. Note that the UserDetails created should have a blank password. This confused me for a bit.

public class UserDetailsServiceImpl implements UserDetailsService {
	private UserDAO daoUser;
	private static Logger log = Logger.getLogger(UserDetailsService.class);

	public UserDAO getDaoUser() {
		return daoUser;
	}

	public void setDaoUser(UserDAO daoUser) {
		this.daoUser = daoUser;
	}

	public UserDetails loadUserByUsername(String username)
			throws UsernameNotFoundException, DataAccessException {

		MyUser user;

		try {
			user = daoUser.getUser(username);
		} catch (DAOException ex) {
			throw new DataRetrievalFailureException(ex.getMessage());
		}

		if (user != null) {
			ArrayList<GrantedAuthority> ga = new ArrayList<GrantedAuthority>();
			ga.add(new GrantedAuthorityImpl("ROLE_USER"));
			GrantedAuthority[] grantedAuthorities = new GrantedAuthority[ga.size()];
			ga.toArray(grantedAuthorities);

			UserDetailsImpl ud = new UserDetailsImpl(username, "", true, grantedAuthorities, user);
			return ud;
		} else {
			throw new UsernameNotFoundException("Username Not Found");
		}
	}
}

My UserDetailsImpl simply extends org.springframework.security.userdetails.User and has an additional field for my MyUser:

public class UserDetailsImpl extends org.springframework.security.userdetails.User {

	private static final long serialVersionUID = 1584153957347843760L;

	private MyUser user;

	public UserDetailsImpl(String username, String password, boolean enabled,
			 GrantedAuthority[] authorities, MyUser user)
			throws IllegalArgumentException {
		super(username, password, enabled, true, true,
				true, authorities);
		this.user = user;
	}

	public MyUser getUser() {
		return user;
	}

	public void setUser(MyUser user) {
		this.user = user;
	}
}

And that seems to work. Now I am trying to enable method level security, so more to come soon…

  

a4j:commandLink action not executed in datatable

I have an <a4j:commandLink> in a <rich:datatable>. The same problem applies to <a4j:commandButton> and <a4j:repeat>. The bean action specified was not executed, and the <a4j:actionparam> values were not bound.

For example this was not working:

<a4j:form>
   <rich:dataTable id="searchResults" value="#{myBean.searchResults}" var="item">
            <rich:column>
               <a4j:commandLink value="#{item.code}" action="#{myBean.myAction}"
                reRender="myRegion">
                    <a4j:actionparam name="code" value="#{item.code}"
                     assignTo="#{myBean.selectedCode}"/>
                </a4j:commandLink>
              </rich:column>
   </rich:dataTable>
</a4j:form>

The region was getting rerendered, but myBean.myAction was not executed.

Then I tried moving the <a4j:form> inside the table, so there was a form on each row:

   <rich:dataTable id="searchResults" value="#{myBean.searchResults}" var="item">
      <rich:column>
              <a4j:form>
               <a4j:commandLink value="#{item.code}" action="#{myBean.myAction}"
                reRender="myRegion">
                    <a4j:actionparam name="code" value="#{item.code}"
                     assignTo="#{myBean.selectedCode}"/>
                </a4j:commandLink>
                </a4j:form>
              </rich:column>
   </rich:dataTable>

This seemed to work for the first row, but not any subsequent ones.

The answer seems to be to base the dataTable on a session-scoped bean. I didn’t want my original bean session scoped, so I split it into two like this:

 <rich:dataTable id="searchResults" value="#{mySessionBean.searchResults}" var="item">
      <rich:column>
              <a4j:form>
               <a4j:commandLink value="#{item.code}" action="#{myBean.myAction}"
                reRender="myRegion">
                    <a4j:actionparam name="code" value="#{item.code}"
                     assignTo="#{myBean.selectedCode}"/>
                </a4j:commandLink>
                </a4j:form>
              </rich:column>
   </rich:dataTable>

And it works. The actions are still carried out on my request bean as I wanted and I just have to be careful about how I update the session bean.

  

NoClassDefFoundError com.sun.activation.registries.LogSupport

I’m trying to run a single JUnit test via Eclipse and getting this error. But I don’t get the error if I run the whole suite of JUnit tests using ant. The jars activation.jar and mail.jar are definitely in my project classpath.

The answer: go to Project > Properties > Java Build Path > Order and Export. Move activation.jar up the list.