Web app with H2 console to access your database via browser

Here’s a simple WAR file that deploys the H2 database web console, along with the H2, PostgreSQL and MySQL drivers, on your web container.

You can access it by simply going to the /h2console URL and manage your database remotely with nothing but your browser.

No dependencies and no manual steps required; simply deploy and connect to your database.
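Once deployed, the console’s login form just needs a JDBC URL for whichever database you want to manage; for example (hosts, ports and database names below are made up):

```
jdbc:h2:~/test
jdbc:postgresql://localhost:5432/mydb
jdbc:mysql://localhost:3306/mydb
```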

It’s based on the configuration from the JBoss quickstart; I’ve only added the H2 and other DB jars, so it’s ready to be used in any web app container.

Here’s a download link: h2console.war


How to control relay module with BeagleBone Black

I’ve finally received my 5V relay module and started messing around with my BeagleBone Black, a tiny ARM board won from Touk.pl at the Confitura conference.

The idea is to build a flower-watering system that calculates the water amount from the current temperature and the weather forecast, and keeps watering statistics.

Here’s how it works:

So, first I connected my relay module to the power (P8_7) and ground (P8_1) connectors on the BB, and its inputs to some GPIO pins (e.g. P9_8, P9_10). My relay module has only 2 relays, but you can connect as many as you like; just use more GPIO pins.
Here you can get a nice pinout image to check which pin is where:

Next we have to export our GPIOs through sysfs so we can control them:

if [ ! -d /sys/class/gpio/gpio67 ]; then echo 67 > /sys/class/gpio/export; fi

You have to do this as root to have access. Remember that GPIO numbers are different from pin numbers; check the pinout if needed.
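A GPIO number is computed as bank * 32 + pin-within-bank. As a quick sanity check of the numbering (the bank and pin values here are just an example), a pin muxed as GPIO2_4 gets GPIO number 68:

```shell
# GPIO number = bank * 32 + pin-within-bank
bank=2
pin=4
echo $((bank * 32 + pin))   # prints 68
```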

Next we can power up our GPIO, which should enable the relay channel:

echo out > /sys/class/gpio/gpio68/direction

to disable:

echo in > /sys/class/gpio/gpio68/direction

That’s all. Now you can use your BeagleBone to power almost any device on and off. Here’s the full script used in the demo if you’d like to check it out:


#!/bin/sh
# Full demo script: toggles two relay channels in a loop.
# Run as root; the GPIO numbers are the ones used in the examples above.
gpio1=67
gpio2=68

if [ ! -d /sys/class/gpio/gpio$gpio1 ]; then echo $gpio1 > /sys/class/gpio/export; fi
if [ ! -d /sys/class/gpio/gpio$gpio2 ]; then echo $gpio2 > /sys/class/gpio/export; fi

# Flip the direction of the given GPIO, clicking its relay on or off
click() {
    state="`cat /sys/class/gpio/gpio$1/direction`"
    if [ $state = "in" ] ; then
        state="out"
    else
        state="in"
    fi
    echo $state > /sys/class/gpio/gpio$1/direction
}

while : ; do
    click $gpio1
    sleep .2
    click $gpio2
    sleep .4
done

How to run SolR in Maven for tests that need to do some searches

Here’s a Maven pom section that launches a Jetty instance with Apache SolR deployed just before the tests, and stops it after the integration tests are finished.
You can select which war will be deployed and where the SolR home will be located (of course, you need to prepare it first).
Hope you find it useful:

<!--Need empty war tag so contextHandlers are loaded with Solr on proper path-->
<connector implementation="org.eclipse.jetty.server.nio.SelectChannelConnector">
    ...
</connector>
<contextHandler implementation="org.mortbay.jetty.plugin.JettyWebAppContext">
    ...
</contextHandler>
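Only a fragment of the original pom section survived above. A fuller jetty-maven-plugin section along the lines the post describes might look like this (the plugin version, ports and paths are my assumptions, not the original values):

```xml
<plugin>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>jetty-maven-plugin</artifactId>
    <version>8.1.16.v20140903</version>
    <configuration>
        <stopPort>8005</stopPort>
        <stopKey>STOP</stopKey>
        <!-- empty war tag so contextHandlers are loaded with Solr on the proper path -->
        <war/>
        <connectors>
            <connector implementation="org.eclipse.jetty.server.nio.SelectChannelConnector">
                <port>8983</port>
            </connector>
        </connectors>
        <contextHandlers>
            <contextHandler implementation="org.mortbay.jetty.plugin.JettyWebAppContext">
                <war>${basedir}/solr/solr.war</war>
                <contextPath>/solr</contextPath>
            </contextHandler>
        </contextHandlers>
        <systemProperties>
            <systemProperty>
                <name>solr.solr.home</name>
                <value>${basedir}/solr/home</value>
            </systemProperty>
        </systemProperties>
    </configuration>
    <executions>
        <execution>
            <id>start-jetty</id>
            <phase>pre-integration-test</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <!-- return to the build so the integration tests can run -->
                <daemon>true</daemon>
            </configuration>
        </execution>
        <execution>
            <id>stop-jetty</id>
            <phase>post-integration-test</phase>
            <goals>
                <goal>stop</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```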

JSON endpoint in Playframework with one annotation

Playframework 1.2 is really nice for getting things done quickly. Today I had to add JSON endpoints
to my application that would render exactly the same models as the HTML pages, on the same URLs.
I could have done it manually, but Play has some AOP mechanisms that can be useful here.

So I’ve implemented a simple aspect that cuts through requests and returns JSON instead
of rendering HTML templates:

import java.util.HashMap;
import java.util.Map;

import play.db.jpa.JPABase;
import play.mvc.Controller;
import play.mvc.Http;
import play.mvc.Scope;

public class JsonPointcut extends Controller {

    static void renderModelsJson() {
        if (isJsonRequest()) {
            Scope.RenderArgs renderArgs = Scope.RenderArgs.current();
            Map<String, Object> outputObjects = new HashMap<String, Object>();
            // collect every render argument that is a Play entity
            for (Map.Entry<String, Object> entry : renderArgs.data.entrySet()) {
                if (entry.getValue() instanceof JPABase) {
                    outputObjects.put(entry.getKey(), entry.getValue());
                }
            }
            renderJSON(outputObjects);
        }
    }

    static boolean isJsonRequest() {
        Http.Header accepts = request.headers.get("accept");
        return accepts != null && accepts.value().contains("application/json");
    }
}

It will simply override the render statements in controllers annotated with @With(JsonPointcut.class)
and render, as a JSON map, all parameters that would usually go into the HTML template and that extend JPABase (the base class of Play entities).
Just remember to send Accept: application/json in the request HTTP header.

Simple and beautiful.


Just uploaded a slightly more fail-safe and powerful version as a Playframework module. Check my company’s GitHub account:

No documentation yet, but the javadocs on the controller.JsonRenderer class describe pretty much everything. I will add some later on.

How to use Apache SolR SQL integration and not get hurt

Recently I’ve spent more than one day fixing crazy issues with SolR’s SQL database integration on my project. You can set everything up using the SolR documentation here: http://wiki.apache.org/solr/DataImportHandler. It’s not that difficult, and it is probably enough to handle data loading for many applications.

We have a quite complex query that feeds SolR with data. It has a few sub-selects, group concats, etc. We also use the SQL database to store the original content of the documents we feed to SolR (so we can recreate the index whenever it’s needed). Everything worked fine with simple varchars or integers, but when we wanted to process CLOB/LONGTEXT fields it didn’t work. First, for CLOB data SolR was indexing not its content but the class name and address of the database’s CLOB handler (I think it was something like org.h2.jdbc.Clob@1c341a for H2, oracle.sql.CLOB@24d12a for Oracle). It was an Object.toString() call, as you probably already guessed: the database API was not returning a String for the CLOB, but some internal representation that SolR should read the data from.

Everything should be fixed by using the ClobTransformer. Just a few changes in data-config.xml and it should be fine… but it wasn’t. I spent quite a few hours finding out that it won’t work if the data column you’re feeding to the ClobTransformer is not written all in upper case. Yes, it was just that. Adding an alias that made the column name upper case fixed everything.

select col as COLUMN from table

This, and sourceColName="COLUMN" in the entity mapping, helped. So the first piece of advice on how not to get hurt is:

Use only upper case names in query result table. Use aliases when needed.
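Putting both together, the entity mapping in data-config.xml ends up looking roughly like this (the table and field names are made up for illustration):

```xml
<entity name="doc" transformer="ClobTransformer"
        query="select id, text_body as CONTENT from documents">
    <field column="content" sourceColName="CONTENT" clob="true"/>
</entity>
```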

The second issue we had was that after switching the database to MySql, CLOBs (named LONGTEXT in MySql) stopped working again, and again it was some crazy issue documented nowhere. After some time spent in the debugger and in the SolR source, I found out that it was not using the field name from data-config.xml to map to the schema field, but sourceColName. It also has some logic that tries to resolve the name using sourceColName.toLowerCase() when it can’t match it. I have no idea why it works this way for CLOBs, as other fields worked fine; also, switching back to the H2 database worked fine. So the next advice is:

Use the same names as the schema field names for the query result table columns, but still in upper case. This keeps you safe from the first issue described here, and works fine thanks to the toLowerCase logic in Solr.

Hope this saves you a few hours of searching. I’ll keep posting new crazy things about SolR as I find them. That’s all I have found for now.


Ok, this is not really true. I’ve done some live debugging in SolR transformers, and what you really have to do is use exactly the same naming as your database returns.
SolR transformers use a Map with table column names (as Strings, in whatever case the database returns them) as keys and the data as values. So make sure you declare
sourceColNames in the mapping in exactly the same case as your database returns them, or map.get() won’t match.

Web Services – Java client

Robert Mac asked some time ago for a Java client for my old post here: Web Services in Ruby, Python and Java. So here it is (sorry for the delay). The simplest possible solution, no jars or IDE needed, just a plain Java 6 JDK.

First we have to generate proxy classes for our Web Service (you need to pass the WSDL location, as a URL or a path to a file):

wsimport http://localhost:8080/WSServer/Music?wsdl

wsimport is in the bin folder of your JDK (add the -keep option if you also want to retain the generated .java sources).

Now let’s use them:

public class WSClient {

    public static void main(String[] args) {
        // Music and Song come from the classes generated by wsimport
        Music music = new Music();
        String[] artists = music.listArtists();
        for (String artist : artists) {
            Song[] songs = music.listSongs(artist);
            for (Song song : songs) {
                System.out.format("\t%s : %s : %d%s\n",
                        song.getFileName(), song.getArtist(), song.getSize(), "MB");
            }
        }
    }
}

Now compile it and execute it with the classes generated by wsimport on the classpath.
That’s all. Simple, isn’t it?

Java annotations – a little disappointment

I must say I’m a little disappointed with Java annotations. There is no way to introduce dependencies between annotation parameters, so you lose part of the static error checking at compile time. An example?

Let’s create an annotation that generates some field in a class. What you might want is to declare some interface as the reference type and some implementation type to be assigned to it. But there is no way to make one type depend on the other, so a user of your annotation may declare the reference as List and the implementation as HashMap. There is nothing you can do about it.
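A minimal sketch of such an annotation (all names here are made up for illustration) shows the problem: both parameters are independent Class values, so the compiler happily accepts a nonsensical pair:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.HashMap;
import java.util.List;

// Hypothetical "generate a field" annotation: a reference type and an implementation type
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@interface GenerateField {
    Class<?> type();
    Class<?> implementation();
}

// Compiles without a single warning, even though HashMap is not a List
@GenerateField(type = List.class, implementation = HashMap.class)
class Mismatched {
}
```

The mismatch can only be caught at runtime (or by an annotation processor), e.g. by checking type().isAssignableFrom(implementation()).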

If only we had some generics there.

Log4j logger.error(Object) trap

Today I fell into an ugly Log4j logger trap. Everybody knows there are error, debug, info… methods on the Logger object. But what you might not know is that there is no error(Throwable) method there, only error(Object). What’s the difference, you ask? It’s quite big. There is error(String, Throwable), which will log your message (the String param), build the Throwable’s stack trace and log it along; but error(Object) will treat an exception just like every other object. There will be NO stack trace in your logs, only the exception message: Throwable.toString() will be called to generate it.
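The effect is easy to reproduce with plain JDK calls (Log4j itself is left out here; the two println calls stand in for what error(Object) and error(String, Throwable) end up writing):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class LoggerTrapDemo {
    public static void main(String[] args) {
        Exception boom = new IllegalStateException("connection lost");

        // error(Object): the exception is logged via toString() - no stack trace
        System.out.println(boom.toString());   // java.lang.IllegalStateException: connection lost

        // error(String, Throwable): your message plus the full stack trace
        StringWriter trace = new StringWriter();
        boom.printStackTrace(new PrintWriter(trace));
        System.out.println("Something failed" + System.lineSeparator() + trace);
    }
}
```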

Continue reading…

Generating Web Services server using Axis2

Everybody knows that in Java you can easily generate Web Services client classes from just a WSDL file.
You even have such a tool (wsimport) in every JDK distribution (check JDK_HOME/bin). It handles
all the XML-to-Java data type mapping and generates all the complex types needed. Similar tools exist
for many other programming languages; scripting languages don’t need them at all, as they generate the classes
on the fly.

But what do you do if you have to implement a WS client and the only thing you can get from your customer
is a WSDL file? Of course you can generate the client classes from it. But how do you test it? (You wouldn’t give
untested code to your customer, would you?)

Continue reading…

Web Services in Ruby, Python and Java

Today I had to prepare some examples to show that web services are interoperable. So I created a simple web service in Java using Metro and launched it on Tomcat. Then I tried to consume it using Python and Ruby. Here’s how it all finished…
Continue reading…
