Transaction handling woes in JDBI 3

I’ve stalled on a project because I decided to have a go with JDBI 3. I’ve only used JDBI 2 in previous projects – this is the general pattern I was using:

An Interface:

public interface AccountService {
    void addAccount(Account account, User user);
}  

Implementation:

public abstract class AccountServiceJdbi implements AccountService {
    
    @Override  
    @Transaction  
    public final void addAccount(Account account, User user) {
        long accountId = accountDao().insertAccount(account);
        accountDao().linkAccountToOwner(accountId, user.getId());
    }
    
    @CreateSqlObject
    abstract AccountDao accountDao();
}

You can imagine the DAO – just a simple interface with @SqlQuery and @SqlUpdate annotations.
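For completeness, here’s a minimal sketch of what it might look like – the SQL, table and column names are invented for illustration:

public interface AccountDao {

    @SqlUpdate("insert into account (name) values (:a.name)")
    @GetGeneratedKeys
    long insertAccount(@BindBean("a") Account account);

    @SqlUpdate("insert into account_owner (account_id, owner_id) values (:accountId, :ownerId)")
    void linkAccountToOwner(@Bind("accountId") long accountId, @Bind("ownerId") long ownerId);
}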
The Service is instantiated using

dbi.onDemand(AccountServiceJdbi.class);

I like this approach because it is easy to write a mock implementation of AccountService for use when testing other components, and to create a concrete extension of AccountServiceJdbi with a mock AccountDao for testing the service class logic.

The abstract class allows me to have transactional methods that combine data access methods from one or more DAOs. The implemented methods are final because my class is not intended to be subclassed; this prohibits unintended overriding of methods, for example when creating a concrete implementation for testing.
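To illustrate, such a test extension might look something like this – a hand-rolled sketch, not code from the project (note it must live in the same package, because accountDao() has package-private access):

final class TestableAccountService extends AccountServiceJdbi {

    // Hand-rolled mock DAO: returns a canned id instead of hitting a database.
    private final AccountDao mockDao = new AccountDao() {

        @Override
        public long insertAccount(Account account) {
            return 42L; // pretend the insert generated this key
        }

        @Override
        public void linkAccountToOwner(long accountId, long ownerId) {
            // record or assert the arguments here
        }
    };

    @Override
    AccountDao accountDao() {
        return mockDao;
    }
}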

HOWEVER…

If I try to follow this pattern in JDBI 3 I get the following error:

java.lang.IllegalArgumentException: On-demand extensions are only supported for interfaces

As far as I can tell, these are my options in JDBI 3:

1. Use default methods in interfaces:

public interface AccountServiceJdbi extends AccountService {
    
    @Override  
    @Transaction  
    default void addAccount(Account account, User user) {
        long accountId = accountDao().insertAccount(account);
        accountDao().linkAccountToOwner(accountId, user.getId());
    }
    
    @CreateSqlObject
    AccountDao accountDao();
}

This may look similar to what I was doing in JDBI 2, but it feels WRONG. I’ve lost all control over concrete implementations of this interface. You can’t make default methods final, so if I create an implementation for testing, I can’t guarantee that some of the default methods won’t be overridden.

Furthermore, in the JDBI 2 abstract class, the `accountDao` method has default (package-private) access, so it can’t be called from outside the package. In JDBI 3 I lose this restriction; all interface methods are implicitly public, so my AccountDao can be accessed from outside the package.

It just feels looser; I am less able to prescribe how the classes should be used, so more commenting and self-enforced discipline will be required.

2. Handle transactions programmatically:

public final class AccountServiceJdbi implements AccountService {

    private final Jdbi jdbi;

    public AccountServiceJdbi(Jdbi jdbi) {
        this.jdbi = jdbi;
    }

    @Override
    public void addAccount(Account account, User user) {
        jdbi.useTransaction(handle -> {
            AccountDao accountDao = handle.attach(AccountDao.class);
            long accountId = accountDao.insertAccount(account);
            accountDao.linkAccountToOwner(accountId, user.getId());
        });
    }
}

Because this is a concrete implementation, I can make it final and prohibit unintended subclassing. There is no unintended access to DAO classes.

However, the transaction handling code is entwined with the business logic, making it difficult to unit test this class without connecting to a database. I can’t easily create mock implementations of the DAO classes to test the business logic in isolation, because the DAOs are instantiated inside the business logic. I guess I could manage it with a proper mocking framework, but personally I’d much rather implement simple mocks myself for testing.
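For the record, here’s roughly what the mocking-framework route might look like with Mockito – an untested sketch, assuming the constructor-injected Jdbi from the snippet above and a long-valued User.getId():

@Test
void addAccountInsertsAndLinksInOneTransaction() throws Exception {
    Jdbi jdbi = mock(Jdbi.class);
    Handle handle = mock(Handle.class);
    AccountDao dao = mock(AccountDao.class);
    Account account = mock(Account.class);
    User user = mock(User.class);

    // Hand our mock Handle to whatever callback the service passes to useTransaction.
    doAnswer(invocation -> {
        HandleConsumer<?> callback = invocation.getArgument(0);
        callback.useHandle(handle);
        return null;
    }).when(jdbi).useTransaction(any());

    when(handle.attach(AccountDao.class)).thenReturn(dao);
    when(dao.insertAccount(account)).thenReturn(42L);
    when(user.getId()).thenReturn(7L);

    new AccountServiceJdbi(jdbi).addAccount(account, user);

    verify(dao).linkAccountToOwner(42L, 7L);
}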

SO…

I’m still stalled. I don’t like either of these approaches, and I’m not very good at writing code I don’t like. I could go back to JDBI 2, but that doesn’t seem right either. Suggestions welcome…

How to use a JDBI ResultSetMapper with a non-default constructor

If your ResultSetMapper has a default (no-argument) constructor, it is easy to specify a particular mapper for a DAO method:

@Mapper(PersonMapper.class)
@SqlQuery("select id, data from person where id = :id")
Person getPerson(@Bind("id") String id);

However, sometimes the ResultSetMapper needs constructor arguments. In that case you can register a mapper instance with the DBI:

dbi.registerMapper(new PersonMapper(extraParameters));

JDBI will use that mapper globally, wherever the return type of a DAO method matches the type mapped by the ResultSetMapper.
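For illustration, here’s the sort of mapper that needs constructor arguments – the EmployeeMapper used by the factory below. The lookup map, column names and Person methods are all invented:

public final class EmployeeMapper implements ResultSetMapper<Person> {

    // A collaborator the mapper needs in addition to the ResultSet.
    private final Map<Integer, String> deptLookup;

    public EmployeeMapper(Map<Integer, String> deptLookup) {
        this.deptLookup = deptLookup;
    }

    @Override
    public Person map(int index, ResultSet r, StatementContext ctx) throws SQLException {
        Person person = new Person(r.getString("id"), r.getString("data"));
        person.setDepartment(deptLookup.get(r.getInt("dept_id")));
        return person;
    }
}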

If you want to use different ResultSetMapper implementations for DAO methods that return the same data type, you can register a ResultSetMapperFactory instead:

dbi.registerMapper(new PersonMapperFactory(deptLookup));

The ResultSetMapperFactory can return different ResultSetMapper implementations based on the StatementContext. This gives you access to things like the name of the method being called and the raw SQL.

public final class PersonMapperFactory implements ResultSetMapperFactory {

    private final EmployeeMapper employeeMapper;
    private final PersonMapper personMapper;

    // the type of the lookup dependency is illustrative
    public PersonMapperFactory(Map<Integer, String> deptLookup) {
        this.employeeMapper = new EmployeeMapper(deptLookup);
        this.personMapper = new PersonMapper();
    }

    @Override
    public boolean accepts(Class type, StatementContext ctx) {
        return type.equals(Person.class);
    }

    @Override
    public ResultSetMapper mapperFor(Class type, StatementContext ctx) {
        // choose a mapper based on the name of the SQL Object method being called
        if (ctx.getSqlObjectMethod().getName().contains("Employee")) {
            return employeeMapper;
        } else {
            return personMapper;
        }
    }
}

(Note that these examples use JDBI 2 – I haven’t had a chance to update yet…)

There’s some more info on the scope of a registered mapper in the JDBI SQL Object Reference.

Partial updates of JSON data in Postgres (using JDBI)

With Postgres 9.5 comes the jsonb_set function, for updating a single key within a JSON column. Hooray!

A sample bit of SQL might look like this:

update mytable 
set myfield = jsonb_set(myfield,'{key, subkey}', '"new string value"'::jsonb) 
where id = 5

I’ve put a text value in the example, but the new value can be an entire JSON structure.

I’ve posted previously on using JSON and Postgres with JDBI. To use the jsonb_set function, we need to reuse the BindJson annotation covered in that post. The jsonb_set function also takes an array parameter, defining the path to the key to be set. For this I wrote a new Bind annotation:

@BindingAnnotation(BindTextArray.TextArrayBinderFactory.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindTextArray {

    String value();

    public static class TextArrayBinderFactory implements BinderFactory {

        @Override
        public Binder build(Annotation annotation) {
            return new Binder<BindTextArray, String[]>() {

                @Override
                public void bind(SQLStatement sqlStatement, BindTextArray bind, String[] array) {
                    try {
                        String fieldName = bind.value();
                        Connection con = sqlStatement.getContext().getConnection();
                        Array sqlArray = con.createArrayOf("text", array);
                        sqlStatement.bindBySqlType(fieldName, sqlArray, Types.ARRAY);
                    } catch (SQLException ex) {
                        throw new IllegalStateException("Error Binding Array", ex);
                    }
                }
            };
        }
    }
}

(Code based on this post: http://codeonthecobb.com/2015/04/18/using-jdbi-with-postgres-arrays/).

Here’s the DAO for the SQL example above, using the new Bind annotation:

 
@SqlUpdate("update mytable set myfield = jsonb_set(myfield, :path,:json) where id = :id")
void updateMyTable(@Bind("id") int id, @BindTextArray("path") String[] path, @BindJson("json") String json)
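Calling it to reproduce the SQL example at the top of this post looks like this (note the new value must itself be valid jsonb, hence the quoted string):

dao.updateMyTable(5, new String[] {"key", "subkey"}, "\"new string value\"");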

As you can see, there are limitations to this functionality. We can’t update two different elements of the same JSON column in one call, so if you need that, you still have to do it in code. However, the new syntax is handy if you want to update one section of your JSON document without loading the whole thing into your application.

Using JDBI with Postgres JSON data

I’ve been migrating some raw JDBC code over to JDBI, and joyfully stripping out lines of boilerplate code for preparing statements, opening record sets, sometimes remembering to close them, handling SQL exceptions which won’t ever occur anyway, and so on. Using the SQL Object API means the only code you have to write is the SQL and a ResultSetMapper to determine how to create your domain objects from the resultset. It really promotes adherence to the single responsibility principle and discourages you from mixing logic in with your database access code.

The database in question has a number of fields containing JSON data. More specifically, they use the PostgreSQL jsonb data type. This has required a little more tinkering to get working.

Inserting jsonb data

Out of the box, JDBI provides two annotations for binding parameters. The @Bind annotation binds a single named argument, and @BindBean binds bean properties with matching names. However, to insert jsonb data, you need to first create a PGobject instance and bind that. To do this, I created a new binding annotation, following the guidance here: http://jdbi.org/sql_object_api_argument_binding/

The annotation code looks like this:

@BindingAnnotation(BindJson.JsonBinderFactory.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindJson {
    String value();

    public static class JsonBinderFactory implements BinderFactory {
        @Override
        public Binder build(Annotation annotation) {
            return new Binder<BindJson, String>() {                
                @Override
                public void bind(SQLStatement q, BindJson bind, String jsonString) {
                    try {
                        PGobject data = new PGobject();
                        data.setType("jsonb");
                        data.setValue(jsonString);
                        q.bind(bind.value(), data);                        
                    } catch (SQLException ex) {
                        throw new IllegalStateException("Error Binding JSON",ex);
                    }
                }
            };
        }
    }
}

To use it, annotate the json parameter with the new annotation:

@SqlUpdate("insert into my_table (id,data) VALUES (:id,:data)")
void insertJson(@Bind("id") int id, @BindJson("data") String jsonString);

And that’s it; it just works.
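For example, with a DAO instance obtained in the usual JDBI 2 way (the interface name is mine):

MyDao dao = dbi.onDemand(MyDao.class);
dao.insertJson(1, "{\"key1\": \"value1\"}");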

Querying json dynamically

I had a requirement where the parameter supplied to the query was the name of the json element to return. For example, consider the json below. I wanted to be able to parameterise a query to return any one of the key values.

{
   "element": {
      "key1": "value1",
      "key2": "value2",
      "key3": "value3"
   }
}

Using raw JDBC it was possible (although not very pretty) to concatenate a suitable SQL statement and then execute it:

String sql = "select data->'element1'->'" + subKeyName + "' as value from mytable"
...

This isn’t possible when the SQL string is specified in a JDBI annotation. I found some useful Postgres json processing functions, including jsonb_extract_path_text, which allows you to bind parameters normally:

@SqlQuery("select jsonb_extract_path_text(data,'element1',:subKeyName) as value from mytable")
List<String> getSubKey(@Bind("subKeyName") String subKeyName)
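So, given the sample document above, passing "key2" returns "value2" for each matching row:

List<String> values = dao.getSubKey("key2"); // ["value2"] for the sample document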

So far I haven’t come across any other issues using JDBI with a PostgreSQL JSON data store. I’m looking forward to trying out the new jsonb functionality in PostgreSQL 9.5, which supports writing partial updates to json fields – yippee!