Single SPA, Lerna, Typescript and shared utility modules

I’ve been playing with using Lerna to manage a monorepo for my single-spa project. It seems like a really good fit so far. One stumbling block I encountered was implementing a single-spa utility module (in this case a styleguide/component library).

A quick overview of the setup. First the lerna.json file:

{
  "packages": ["apps/*"],
  "npmClient": "yarn",
  "useWorkspaces": true,
  "version": "0.1.0"
}

Each of the micro frontends (MFEs) is in a subdirectory under apps. The package.json for each has a start command with a different port:

    "start": "webpack serve --port 8501"

The root package.json uses Lerna to start all the micro frontends in parallel:

    "start": "lerna run start --parallel"

So far so good, and much easier than checking out several different projects and remembering to run start on all of them.

Typescript woes

I created a new single-spa utility module in the apps directory, using the single-spa CLI and selecting util-module as the module type.

First I installed the util module as a dependency in the other micro frontends. The lerna add command “Adds package to each applicable package. Applicable are packages that are not package and are in scope”

lerna add @myorg/my-util --ignore @myorg/root-config

However, when I imported anything from the util module into any of the MFEs, TypeScript was not happy:

TS2307: Cannot find module '@myorg/my-util' or its corresponding type declarations.

I read a few suggestions about needing to specify a “main” in package.json for the util module (I don’t want to depend on the built module) and about generating type declaration files (I think the ts-config-single-spa config already does this). In the end the answer was to add a paths entry to the compilerOptions in the tsconfig for each MFE:

"compilerOptions": {
    "jsx": "react",
    "baseUrl": ".",
    "paths": {
      "@myorg/my-util": ["../my-util/src/myorg-myutil"]
    }
  },

This made my IDE happy!

Configuring as a shared dependency

Because this is a shared utility module, I don’t want to bundle it with every MFE. So I added it to webpack externals for each MFE:

return merge(defaultConfig, {
    externals: {
      "react": "react",
      "react-dom": "react-dom",
      "@myorg/my-util": "@myorg/my-util"
    }
  });

And added it to the import map in the root config. Note that there is no need to call registerApplication() for the util-module because it won’t be loaded independently.

<% if (isLocal) { %>
    <script type="systemjs-importmap">
    {
      "imports": {
        "@myorg/mfe1": "//localhost:8501/myorg-mfe1.js",
        "@myorg/mfe2": "//localhost:8502/myorg-mfe2.js",
        "@myorg/my-util": "//localhost:8503/myorg-myutil.js"
      }
    }
  </script>
<% } %>

And that’s it – all working in a development environment. The next step will be to look at how to hook up to Lerna’s version management abilities so I can test and deploy micro frontends independently.

Transaction handling woes in JDBI 3

I’ve stalled on a project, because I decided to have a go with JDBI 3. I’ve only used JDBI 2 in previous projects – this is the general pattern I was using:

An Interface:

public interface AccountService {
    void addAccount(Account account, User user);
}  

Implementation:

public abstract class AccountServiceJdbi implements AccountService {
    
    @Override  
    @Transaction  
    public final void addAccount(Account account, User user) {
        long accountId =  accountDao().insertAccount(account);
        accountDao().linkAccountToOwner(accountId, user.getId());
    }
    
    @CreateSqlObject
    abstract AccountDao accountDao();
}

You can imagine the DAO – just a simple interface with @SqlQuery and @SqlUpdate annotations.
The service is instantiated using:

dbi.onDemand(AccountServiceJdbi.class);

I like this approach, because it is easy to write a Mock implementation of AccountService for use when testing other components, and to create a concrete extension of AccountServiceJdbi with a mock AccountDao for testing the service class logic.

The abstract class allows me to have transactional methods that combine data access methods from one or more DAOs. The implemented methods are final because my class is not intended to be subclassed. This prevents unintended overriding of methods, for example when creating a concrete implementation for testing.
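For example, a hand-rolled test double can be as simple as this (a sketch – MockAccountDao is a hypothetical in-memory implementation of the AccountDao interface, not something from the project above):

// Concrete test extension of the abstract service, in the same package,
// returning a hand-rolled mock DAO instead of a real SQL object.
class TestableAccountService extends AccountServiceJdbi {

    private final AccountDao mockDao = new MockAccountDao();

    @Override
    AccountDao accountDao() {
        return mockDao;
    }
}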

HOWEVER…

If I try to follow this pattern in JDBI 3, I get the following error:

java.lang.IllegalArgumentException: On-demand extensions are only supported for interfaces

As far as I can tell, these are my options in JDBI 3:

1. Use default methods in interfaces:

public interface AccountServiceJdbi extends AccountService {
    
    @Override  
    @Transaction  
    default void addAccount(Account account, User user) {
        long accountId =  accountDao().insertAccount(account);
        accountDao().linkAccountToOwner(accountId, user.getId());
    }
    
    @CreateSqlObject
    AccountDao accountDao();
}

This may look similar to what I was doing in JDBI 2, but it feels WRONG. I’ve lost all control over concrete implementations of this interface. You can’t make default methods final, so if I create an implementation for testing, I can’t guarantee that some of the default methods won’t be overridden.

Furthermore, in the abstract class the `accountDao` method has default access, so it can’t be accessed from outside the package. In JDBI 3 I lose this restriction: all methods in interfaces are public, so my AccountDao can be accessed from outside the package.

It just feels less concise; I am less able to prescribe how the classes should be used. More commenting and self enforced discipline will be required.

2. Handle transactions programmatically:

public final class AccountServiceJdbi implements AccountService {

    private final Jdbi jdbi;

    public AccountServiceJdbi(Jdbi jdbi) {
        this.jdbi = jdbi;
    }

    @Override
    public void addAccount(Account account, User user) {
        jdbi.useTransaction(handle -> {
            AccountDao accountDao = handle.attach(AccountDao.class);
            long accountId = accountDao.insertAccount(account);
            accountDao.linkAccountToOwner(accountId, user.getId());
        });
    }
}

Because this is a concrete implementation, I can make it final and prohibit unintended subclassing. There is no unintended access to DAO classes.

However, the transaction handling code is entwined with the business logic, making it difficult to unit test this class without connecting to a database; I can’t easily create mock implementations of the DAO classes to test the business logic in isolation, because the DAOs are instantiated within the business logic. I guess I could do this with a proper mocking framework, but personally I’d much rather implement simple mocks myself for testing.

SO…

I’m still stalled. I don’t like either of these approaches, and I’m not very good at writing code I don’t like. I could go back to JDBI 2, but that doesn’t seem right either. Suggestions welcome…

Executing JavaScript on the JVM with Nashorn

We’ve been using Nashorn for a while to execute some very simple Javascript expressions. The latest challenge was to run some JavaScript from a Node.js project, which in turn had dependencies on a third party package. There were a few gotchas which I thought I’d share (disclaimer: I’m not a JavaScript developer, so this might all be obvious stuff to some).

The original JavaScript was in a plain old text file, with functions at the top level

function testThing(thing) {
   return thing === 'tiger';
}

This is how I was executing it:

ScriptEngine engine = new ScriptEngineManager().getEngineByName("nashorn");
String js = IOUtils.toString(this.getClass().getClassLoader().getResource("my-script.js"), "UTF-8");            
engine.eval(js);
Invocable invocable = (Invocable) engine;
Object result = invocable.invokeFunction("testThing", theThing);

Running Javascript generated from a Node.js project required a bit of tinkering:

1. In Java 8, Nashorn will only load ES5-compatible JavaScript

I was running browserify to create a single JavaScript file from the Node.js project. Trying to load the generated file instead of the plain JavaScript failed miserably:

javax.script.ScriptException: :1079:0 Expected : but found }

To get Nashorn to evaluate the js file, I had to transpile to ES5 using babel.

I decided I needed to run browserify in standalone mode, so that my exported functions were attached to a global variable. The scripts section in package.json had the following:

"build": "browserify src/main.js -r --standalone MyLib -o dist/build.js",
"transpile": "babel dist/build.js --out-file dist/build.es5.js"

This was a small step forward but Nashorn was still not able to load the script successfully…

2. Nashorn cannot set property of undefined!

Loading the transpiled file in Nashorn threw the following:

javax.script.ScriptException: TypeError: Cannot set property "MyLib" of undefined in <eval> at line number 19

Looking at the beginning of the transpiled file, you can see it is trying to determine where would be suitable to attach the global variable:

(function (f) {
  if ((typeof exports === "undefined" ? "undefined" : _typeof(exports)) === "object" && typeof module !== "undefined") {
    module.exports = f();
  } else if (typeof define === "function" && define.amd) {
    define([], f);
  } else {
    var g;if (typeof window !== "undefined") {
      g = window;
    } else if (typeof global !== "undefined") {      
      g = global;
    } else if (typeof self !== "undefined") {
      g = self;
    } else {
      g = this;
    }
    g.MyLib = f();
  }
})

A little bit of detective work using print() showed that it was falling through to the final “else” because it couldn’t find anything else to attach to. Adding this line to the Java, before evaluating the file, magicked away the problem:

engine.eval("var global = this;");

(It doesn’t quite make sense to me why this works, since without this line the script seems to treat “this” as undefined.)

3. Executing methods of objects is different from executing functions

At this point, the file was loading without an error, but trying to execute the function using the original Java code with the new JavaScript file threw a NoSuchMethodException

java.lang.NoSuchMethodException: No such function testThing

This makes sense, as “testThing” is now a method of the MyLib global variable. However, none of the permutations I tried for accessing the function worked, e.g.:

Object result = invocable.invokeFunction("global.MyLib.testThing", theThing);

The key issue here is that testThing isn’t a function now, it is a method on an object, so we have to use invokeMethod:

Object result = invocable.invokeMethod(engine.eval("global.MyLib"), "testThing", theThing);

And that worked :)
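Putting the three fixes together, the loading and invocation code ended up looking roughly like this (a sketch based on the snippets above – the resource name and exception handling are illustrative):

// Create the engine and give the UMD wrapper a global object to attach to (gotcha 2)
ScriptEngine engine = new ScriptEngineManager().getEngineByName("nashorn");
engine.eval("var global = this;");

// Load the babel-transpiled ES5 bundle (gotcha 1)
String js = IOUtils.toString(this.getClass().getClassLoader().getResource("build.es5.js"), "UTF-8");
engine.eval(js);

// Call the exported function as a method of the MyLib object (gotcha 3)
Invocable invocable = (Invocable) engine;
Object result = invocable.invokeMethod(engine.eval("global.MyLib"), "testThing", theThing);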

Not sure if any of this is the correct way to approach executing Node.js-based JavaScript on the JVM, but these were the three gotchas that took me a while to work out.

Refactoring SQLiteOpenHelper so it isn’t a mile long

If, like me, you are new to Android Development you may find that your SQLiteOpenHelper implementation can quickly get long and disorganised, as data access methods for tables are added organically during development. It gets hard to see which methods are related, and starts to feel cumbersome. You can only have one SQLiteOpenHelper per database, but it isn’t difficult to refactor the single SQLiteOpenHelper into multiple classes with distinct responsibilities.

To keep things really tidy, I defined an interface for my new classes:

public interface MySQLiteDao {
    void onCreate(SQLiteDatabase db);
    void onUpdate(SQLiteDatabase db, int oldVersion, int newVersion);
}

The individual DAO classes are constructed with an instance of SQLiteOpenHelper, which is used in the data access methods. onCreate and onUpdate contain the create/update SQL specific to the table.

public class ThingDao implements MySQLiteDao {

    private SQLiteOpenHelper db;

    public ThingDao(SQLiteOpenHelper db) {
        this.db = db;
    }

    @Override
    public void onCreate(SQLiteDatabase sqlDb) {
        sqlDb.execSQL("CREATE TABLE ...)");
    }

    @Override
    public void onUpdate(SQLiteDatabase sqlDb, int oldVersion, int newVersion) {
        if (newVersion == 2) {
            sqlDb.execSQL("...");
        }
    }

    public List<Thing> getThings() {
        List<Thing> things = new ArrayList<>();
        Cursor c = db.getReadableDatabase().rawQuery("SELECT ... ", new String[]{});

        for (int i = 0; i < c.getCount(); i++) {
           c.moveToPosition(i);
           things.add(new Thing(...));
        }
        
        c.close();
        return things;
    }

    public Thing getThing(int id) {
       ...
    }

    public void deleteThing(Thing thing) {
       ...
    }
    ...
}
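A write method in the same DAO follows the same pattern – something like this sketch (the table, column and Thing accessor names are made up):

    public void addThing(Thing thing) {
        // getWritableDatabase() comes from the injected SQLiteOpenHelper
        ContentValues values = new ContentValues();
        values.put("name", thing.getName());
        db.getWritableDatabase().insert("thing", null, values);
    }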

Then the SQLiteOpenHelper implementation itself is much easier to read

public class MySqlHelper extends SQLiteOpenHelper {

    private static MySqlHelper instance;

    private static final String DATABASE_NAME = "my.db";
    private static final int SCHEMA = 1;


    private ThingDao thingDao;

    public synchronized static MySqlHelper getInstance(Context ctxt) {
        if (instance == null) {
            instance = new MySqlHelper(ctxt.getApplicationContext());
        }

        return (instance);
    }

    public MySqlHelper(Context context) {
        super(context, DATABASE_NAME, null, SCHEMA);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        getThingDao().onCreate(db);
        //other DAOs called here
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        getThingDao().onUpdate(db, oldVersion, newVersion);
        //other DAOs called here
    }

    public ThingDao getThingDao() {
        if (thingDao == null) {
            thingDao = new ThingDao(this);
        }
        return thingDao;
    }
}

You can group the data access methods into DAO classes in whatever way makes sense for the application – it doesn’t have to be one per table. In your application code, you can easily get an instance of the DAO you want.

ThingDao thingDao = MySqlHelper.getInstance(this).getThingDao();

This means you are working with smaller classes containing related methods – much easier to maintain than a sprawling mess containing every DAO method you dashed out in order to get the thing working…

How to use a JDBI ResultSetMapper with a non-default constructor

If your ResultSetMapper has a default constructor, it is easy to specify a particular mapper for a DAO method:

@UseMapper(PersonMapper.class)
@SqlQuery("select id, data from person where id = :id")
Person getPerson(@Bind("id") String id);

However, sometimes the ResultSetMapper may require a non-default constructor. In this case you can register a mapper with the DBI:

dbi.registerMapper(new PersonMapper(extraParameters));

JDBI will use the Mapper globally wherever the return type of the DAO method matches that of the ResultSetMapper.

If you want to use different ResultSetMapper implementations for DAO methods which return the same data type, you can register a ResultSetMapperFactory.

dbi.registerMapper(new PersonMapperFactory(extraParameters));

The ResultSetMapperFactory can return different ResultSetMapper implementations based on the StatementContext. This gives you access to things like the name of the method being called and the raw SQL.

public final class PersonMapperFactory implements ResultSetMapperFactory {

    private EmployeeMapper employeeMapper;
    private PersonMapper personMapper;

    public PersonMapperFactory(DepartmentLookup deptLookup) {
        this.employeeMapper = new EmployeeMapper(deptLookup);
        this.personMapper = new PersonMapper();
    }

    @Override
    public boolean accepts(Class type, StatementContext ctx) {
        return type.equals(Person.class);
    }

    @Override
    public ResultSetMapper mapperFor(Class type, StatementContext ctx) {        
        if (ctx.getSqlObjectMethod().getName().contains("Employee")) {
            return employeeMapper; 
        } else {
            return personMapper;
        }
    }
    
}
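For completeness, the mapper with the non-default constructor might look something like this (a sketch – Employee, DepartmentLookup and the column names are made up for illustration):

public final class EmployeeMapper implements ResultSetMapper<Person> {

    private final DepartmentLookup deptLookup;

    public EmployeeMapper(DepartmentLookup deptLookup) {
        this.deptLookup = deptLookup;
    }

    @Override
    public Person map(int index, ResultSet r, StatementContext ctx) throws SQLException {
        // Use the injected collaborator while mapping the row
        return new Employee(r.getString("id"), r.getString("data"),
                deptLookup.findDepartment(r.getString("dept_code")));
    }
}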

(Note that these examples use JDBI 2 – I haven’t had a chance to update yet…)

There’s some more info on the scope of a registered mapper in the JDBI SQL Object Reference.

Use Docker Compose to run a PHP site without contaminating your system

This must be the quickest way to get the LAMP stack up and running locally:
Nothing is installed on your machine, so you can run different versions of PHP/MySQL without conflicts.
The database gets created on first startup and is persisted in a named volume.
The PHP files are in a mapped volume, so you can edit them without rebuilding the container.

The docker-compose.yml file:

version: "2"
services:
  site:
    image: php:5.6.27-apache   
    volumes:
      - ./site:/var/www/html
    depends_on:
      - mysql
    networks:
      back-tier:        
  mysql:
    image: mysql:5.5
    environment:
      MYSQL_ROOT_PASSWORD: topsecret
      MYSQL_DATABASE: sitedbname
      MYSQL_USER: sitedbuser
      MYSQL_PASSWORD: sitedbpassword
    volumes:
      - site_db:/var/lib/mysql
    networks:
      back-tier:
networks:
  back-tier:
volumes:
  site_db: 

Then just

docker-compose up

It takes a bit of the pain out of working on a legacy PHP site! (Inside the site container, the database is reachable using the Compose service name mysql as the hostname.)

Unit Testing AngularJS Directives with templateUrl

Quick blog as this confused me for a while. When running a unit test for a directive which had the template in an html file, I got the following error:

Error: Unexpected request: GET assets/views/template.html

Odd, I thought – I haven’t been explicitly using $httpBackend. However, I discovered that when unit testing, all HTTP requests are processed locally, and as the template is requested via HTTP, it too is processed locally.

The answer is to use karma-ng-html2js-preprocessor to generate an Angular module which puts your HTML files into a $templateCache to use in your tests. Then Angular won’t try to fetch the templates from the server.

First, install the dependency:

npm install karma-ng-html2js-preprocessor --save-dev

Then add the following to karma.conf.js (adding to any existing entries under files and preprocessors):


    files: [
        'build/assets/views/*.html'
    ],
    preprocessors: {
        'build/assets/views/*.html': 'ng-html2js'
    },
    ngHtml2JsPreprocessor: {
        stripPrefix: 'build/',
        moduleName: 'ngTemplates' 
    }

Then in the unit test, add the line:


  beforeEach(module('ngTemplates'));

After doing this, you may encounter the following error:

Error: [$injector:modulerr] Failed to instantiate module ngTemplates due to:
Error: [$injector:nomod] Module 'ngTemplates' is not available! …

To make it available, you need to get the settings right – the module will only be created if HTML files exist in the specified directory. The stripPrefix setting lets you ensure that the path to the view matches what your application expects, if the basePath in your karma.conf.js isn’t the same as the base of your application. Other settings are available too.

Quick backendless development with AngularJS

There are occasions when you want to run your AngularJS app without accessing a live REST API. There are various blog posts on the internet about doing this using $httpBackend, which is part of angular-mocks and very handy for unit testing.

For example:

var cakeData = [{ name: 'Hot Cross Bun'}];
$httpBackend.whenGET('/cakes').respond(function(method,url,data) { 
    return [200, cakeData, {}];
});

This is fine if you have small snippets of JSON to return. However, in real life data is usually bigger and uglier than that!

It would seem logical to put your mock data into JSON files and return these when running without a live backend, keeping the code nice and succinct and readable. Unfortunately this doesn’t seem to be possible with the $httpBackend approach.

I tried something like this:

$httpBackend.whenGET('/cakes').respond(function(method, url, data) {
    return $http.get("/mock/cakes.json");
  });

This doesn’t work, because $httpBackend can’t handle a returned promise – the respond method needs static data.
Workarounds include falling back to a synchronous `XMLHttpRequest` to get the data (ugh) or using a preprocessor to insert the contents of the JSON file into the code at build time. Neither seems particularly nice.

Using Http Interceptors to serve mock data

I came across this blog post: Data mocking in angular E2E testing, which describes an alternate approach to serving mock data for testing. This approach works just as well for running your app without a backend.

Here’s the code for a simple interceptor

angular.module('mock-backend',[])
    .factory('MockInterceptor', mockInterceptor)
    .config(function ($httpProvider) {
        $httpProvider.interceptors.push("MockInterceptor");
    });

function mockInterceptor() {
    return {
        'request': function (config) {
            if (config.url.indexOf('/cakes') >= 0) {
                config.url = 'mock/cakes.json';
            } 
            return config;
        }
    };
}

It’s fairly easy to use your build script to include this module conditionally when you want to run without a backend.

You can extend the interceptor logic, for example to check the method and switch POST to GET (you can’t POST to a file!). It’s not as sophisticated as a full mock backend, as the data doesn’t change to reflect updates, but it is a really quick way to view your app with a big chunk of data in it.

Partial updates of JSON data in Postgres (using JDBI)

With Postgres 9.5 comes the jsonb_set function, for updating a single key within a JSON column. Hooray!

A sample bit of SQL might look like this:

update mytable 
set myfield = jsonb_set(myfield,'{key, subkey}', '"new string value"'::jsonb) 
where id = 5

I’ve put a text value in the example, but the new value can be an entire JSON structure.

I’ve posted previously on using JSON and Postgres with JDBI. To use the jsonb_set function, we need to reuse the BindJson annotation covered in that post. The jsonb_set function also takes an array parameter, defining the path to the key to be set. For this I wrote a new Bind annotation:

@BindingAnnotation(BindTextArray.JsonBinderFactory.class)
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.PARAMETER})
public @interface BindTextArray {
    
    String value();

    public static class JsonBinderFactory implements BinderFactory {

        @Override
        public Binder build(Annotation annotation) {
            return new Binder<BindTextArray, String[]>() {
                
                @Override
                public void bind(SQLStatement sqlStatement, BindTextArray bind, String[] array) {
                    try {
                        String fieldName = bind.value();
                        Connection con = sqlStatement.getContext().getConnection();                        
                        Array sqlArray = con.createArrayOf("text", array);
                        sqlStatement.bindBySqlType(fieldName, sqlArray, Types.ARRAY);
                    } catch (SQLException ex) {
                        throw new IllegalStateException("Error Binding Array", ex);
                    }
                }
            };
        }
    }
}

(Code based on this post: http://codeonthecobb.com/2015/04/18/using-jdbi-with-postgres-arrays/).

Here’s the DAO for the SQL example above, using the new Bind annotation:

 
@SqlUpdate("update mytable set myfield = jsonb_set(myfield, :path,:json) where id = :id")
void updateMyTable(@Bind("id") int id, @BindTextArray("path") String[] path, @BindJson("json") String json)
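Calling the DAO to perform the update from the SQL example above might then look like this (assuming a MyTableDao interface containing the method, created via onDemand):

MyTableDao dao = dbi.onDemand(MyTableDao.class);
// Sets myfield -> key -> subkey to the JSON string "new string value" for row 5
dao.updateMyTable(5, new String[]{"key", "subkey"}, "\"new string value\"");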

As you can see, there are limitations to this functionality. We can’t update two different elements in the same JSON column, so if you want to do that, you still need to do it in code. However, the new syntax is handy if you want to update one section of your JSON document without loading the whole thing into your application code.

Alternative Dropwizard token authentication

Out of the box, the Dropwizard auth module comes with OAuth and Basic Authentication. The documentation for implementing the related Authenticator and Authorizer is simple enough to follow, but there wasn’t much about how to implement an alternative mechanism for authentication, which I needed to do.

It turned out to be fairly straightforward to write a new AuthFilter which used a non-standard header for authorisation. I based the implementation on the existing OAuthCredentialAuthFilter – I just had to change it to get the credentials from the header I wanted to use.

Getting your priorities right

The authentication worked fine. Then I implemented authorisation as well. Remember:

Authentication is the mechanism for identifying the user (e.g. username + password)

Authorisation is the mechanism for determining what they are allowed to do (e.g. role membership)

I created an Authorizer implementation as per the user guide, annotated some resource methods with @RolesAllowed and even remembered to register the RolesAllowedDynamicFeature. However, on testing and on running the application, Dropwizard appeared to be checking role membership BEFORE authenticating, resulting in a 403 access denied, as the roles had not been loaded.
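(For reference, the wiring looked roughly like this – a sketch assuming Dropwizard 1.x, with a made-up User principal class and myAuthFilter being the custom filter:)

// Register the custom auth filter, role checking, and @Auth principal injection
environment.jersey().register(new AuthDynamicFeature(myAuthFilter));
environment.jersey().register(RolesAllowedDynamicFeature.class);
environment.jersey().register(new AuthValueFactoryProvider.Binder<>(User.class));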

Time for some head scratching, debugging, and trying to register things in a different order (no change), until I remembered noticing this line at the top of the AuthFilter I based mine on:

@Priority(Priorities.AUTHENTICATION)

Once I added that to my custom AuthFilter, everything started happening in the correct order.
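For reference, the skeleton of the custom filter ended up looking roughly like this (a sketch – the header name, scheme and generics are illustrative, and error handling is trimmed):

@Priority(Priorities.AUTHENTICATION)
public class CustomHeaderAuthFilter<P extends Principal> extends AuthFilter<String, P> {

    @Override
    public void filter(ContainerRequestContext requestContext) throws IOException {
        // Read credentials from a non-standard header instead of Authorization
        String token = requestContext.getHeaders().getFirst("X-Auth-Token");

        // Delegate to the Authenticator/Authorizer configured on AuthFilter
        if (!authenticate(requestContext, token, "CUSTOM")) {
            throw new WebApplicationException(unauthorizedHandler.buildResponse(prefix, realm));
        }
    }
}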

Today’s lesson: always get your priorities right!