This tutorial shows how to access databases from OSGi applications running in Karaf and how to abstract from the database product by installing DataSources as OSGi services. Some new Karaf shell commands can be used to work with the database from the command line. Finally, JDBC and JPA examples show how to use such a DataSource from user code.

Prerequisites

You need an installation of Apache Karaf 3.0.3 for this tutorial.

Example sources

The example projects can be found on GitHub in Karaf-Tutorial/db.

Drivers and DataSources

In plain Java it is quite popular to use the DriverManager to create a database connection (see this tutorial). In OSGi this does not work, as the ClassLoader of your bundle has no visibility of the database driver. So in OSGi the best practice is to create a DataSource at some place that knows about the driver and publish it as an OSGi service. The user bundle should then only use the DataSource without knowing the driver specifics. This is quite similar to the best practice in application servers, where the DataSource is managed by the server and published to JNDI.

So we need to learn how to create and use DataSources first.

The DataSourceFactory services

To make it easier to create DataSources in OSGi, the specs define a DataSourceFactory interface. It allows a DataSource to be created for a specific driver from a set of properties. Each database driver is expected to implement this interface and publish it with service properties for the driver class name and the driver name.
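As a small sketch of how a bundle could use such a factory directly (the driver name "H2" and the URL are just example values, not taken from the tutorial sources):

    import java.util.Collection;
    import java.util.Properties;
    import javax.sql.DataSource;
    import org.osgi.framework.BundleContext;
    import org.osgi.framework.ServiceReference;
    import org.osgi.service.jdbc.DataSourceFactory;

    public class DataSourceFromFactory {

        // Look up the DataSourceFactory published by the H2 driver and
        // create a DataSource from plain properties.
        public static DataSource create(BundleContext context) throws Exception {
            String filter = "(" + DataSourceFactory.OSGI_JDBC_DRIVER_NAME + "=H2)";
            Collection<ServiceReference<DataSourceFactory>> refs =
                context.getServiceReferences(DataSourceFactory.class, filter);
            DataSourceFactory factory = context.getService(refs.iterator().next());
            Properties props = new Properties();
            props.setProperty(DataSourceFactory.JDBC_URL, "jdbc:h2:mem:test");
            return factory.createDataSource(props);
        }
    }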

Introducing pax-jdbc

The pax-jdbc project aims at making it a lot easier to use databases in an OSGi environment. It does the following things:

  • Implement the DataSourceFactory service for databases that do not provide this service directly
  • Implement a pooling and XA wrapper for XADataSources (this is explained in the pax-jdbc docs)
  • Provide a facility to create DataSource services from Config Admin configurations
  • Provide Karaf features for many databases as well as for the above additional functionality

So it covers everything you need from driver installation to creation of production quality DataSources.

Installing the driver

The first step is to install the driver bundles for your database system into Karaf. Most drivers are already valid OSGi bundles and available from Maven Central.

For several databases pax-jdbc already provides Karaf features to install a current version of the database driver.

For H2 the following commands will work:
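(The pax-jdbc-features version below is only an example; use a current release.)

    feature:repo-add mvn:org.ops4j.pax.jdbc/pax-jdbc-features/0.5.0/xml/features
    feature:install transaction jndi pax-jdbc-h2 pax-jdbc-pool-dbcp2 pax-jdbc-config
    service:list DataSourceFactory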

Strictly speaking we would only need the pax-jdbc-h2 feature but we will need the others for the next steps.

This will install the pax-jdbc feature repository and the H2 database driver. This driver already implements the DataSourceFactory, so the last command will display this service.

DataSourceFactory
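The relevant part of the output should look roughly like this (service ids and bundle ids will differ):

    [org.osgi.service.jdbc.DataSourceFactory]
    -----------------------------------------
     osgi.jdbc.driver.class = org.h2.Driver
     osgi.jdbc.driver.name = H2
     service.id = 337
    Provided by :
     H2 Database Engine (69)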

The pax-jdbc-pool-dbcp2 feature wraps this DataSourceFactory to provide pooling and XA support.

pooled and XA DataSourceFactory
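In the same listing the wrapped factory appears under a driver name with the -pool-xa suffix, roughly:

    [org.osgi.service.jdbc.DataSourceFactory]
    -----------------------------------------
     osgi.jdbc.driver.class = org.h2.Driver
     osgi.jdbc.driver.name = H2-pool-xa
     service.id = 341
    Provided by :
     OPS4J Pax JDBC Pooling DBCP2 (73)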

Technically this DataSourceFactory also creates DataSource objects, but internally these handle pooling and XA support. So we want to use this one for our later code examples.

Creating the DataSource

Now we just need to create a configuration with the correct factory PID to have a DataSource created as a service.

So create the file etc/org.ops4j.datasource-tasklist.cfg with the following content:

config for DataSource
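(The property keys follow the OSGi DataSourceFactory spec and pax-jdbc-config; the values match the description below.)

    osgi.jdbc.driver.name=H2-pool-xa
    url=jdbc:h2:mem:test
    dataSourceName=person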

The config will automatically trigger the pax-jdbc-config module to create a DataSource.

  • The property osgi.jdbc.driver.name=H2-pool-xa selects the H2 DataSourceFactory with pooling and XA support that we installed previously.
  • The url configures H2 to create a simple in-memory database named test.
  • The dataSourceName will be reflected in a service property of the DataSource so we can find it later.
  • You could also set pooling configurations in this config, but we leave them at the defaults.

DataSource
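Listing services of type DataSource should show something like:

    karaf@root()> service:list DataSource
    [javax.sql.DataSource]
    ----------------------
     dataSourceName = person
     service.id = 345
    Provided by :
     OPS4J Pax JDBC Config (74)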

So when we search for services implementing the DataSource interface we find the person DataSource we just created.

When we installed the features above we also installed the Aries JNDI feature. This module maps OSGi services to JNDI objects. So we can also use JNDI to retrieve the DataSource, which will be used in the persistence.xml for JPA later.

jndi url of DataSource
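The jndi:names command should then show the DataSource under this name (the implementation class column depends on the pooling library, so only a placeholder is shown here):

    karaf@root()> jndi:names
    JNDI Name                 Class Name
    osgi:service/person       <pooled H2 DataSource class>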

Karaf jdbc commands

Karaf contains some commands to manage DataSources and run queries on databases. The commands for managing DataSources in Karaf 3.x still work with the older approach of using blueprint to create DataSources. So we will not use these commands, but we can use the functionality to list DataSources, list tables and execute queries.

jdbc commands
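A sketch of the session described below (the inserted row is just example data):

    feature:install jdbc
    jdbc:datasources
    jdbc:tables person
    jdbc:execute person "create table person (name varchar(100), twittername varchar(100))"
    jdbc:execute person "insert into person (name, twittername) values ('Christian Schneider', '@schneider_chris')"
    jdbc:query person "select * from person"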

We first install the Karaf jdbc feature, which provides the jdbc commands. Then we list the DataSources and show the tables of the database accessed by the person DataSource.

This creates a table person, adds a row to it and shows the table.

The output should look like this

select * from person
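With the example row above, the output looks roughly like this:

    NAME                 | TWITTERNAME
    ---------------------------------------
    Christian Schneider  | @schneider_chris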

Accessing the database using JDBC

The project db/examplejdbc shows how to use the DataSource we installed and execute JDBC statements on it. The example uses a blueprint.xml to refer to the OSGi service for the DataSource and injects it into the class DbExample. The test method is then called as init method and runs some JDBC statements on the DataSource. The DbExample class is completely independent of OSGi and can easily be tested standalone using the DbExampleTest. This test shows how to manually set up the DataSource outside of OSGi.
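A condensed sketch of this wiring, assuming hypothetical package and class names (check the example project for the real code). The blueprint.xml references the DataSource service by the dataSourceName property we configured:

    <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
        <!-- Reference the DataSource service created from the config above -->
        <reference id="dataSource" interface="javax.sql.DataSource"
                   filter="(dataSourceName=person)"/>
        <!-- Inject it into DbExample and call test() on startup -->
        <bean id="dbExample" class="net.lr.tutorial.karaf.db.examplejdbc.DbExample"
              init-method="test">
            <argument ref="dataSource"/>
        </bean>
    </blueprint>

The class itself only needs JDBC:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.sql.DataSource;

    public class DbExample {
        private final DataSource dataSource;

        public DbExample(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        // Called by blueprint as init method: print the db info,
        // create and fill the person table and print its content
        public void test() throws SQLException {
            try (Connection con = dataSource.getConnection();
                 Statement stmt = con.createStatement()) {
                System.out.println("Connected to " + con.getMetaData().getDatabaseProductName());
                stmt.execute("create table if not exists person (name varchar(100), twittername varchar(100))");
                stmt.execute("insert into person (name, twittername) values ('Christian Schneider', '@schneider_chris')");
                try (ResultSet rs = stmt.executeQuery("select name, twittername from person")) {
                    while (rs.next()) {
                        System.out.println(rs.getString("name") + ", " + rs.getString("twittername"));
                    }
                }
            }
        }
    }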

Build and install

The build works as always using Maven:
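    mvn clean install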

In Karaf we just need to install our own bundle, as we have no special dependencies:
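For example (the Maven coordinates below are an assumption; check the example project's pom for the real ones):

    bundle:install -s mvn:net.lr.tutorial.karaf.db/db-examplejdbc/1.0-SNAPSHOT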

After installation the bundle should directly print the db info and the persisted person.

Accessing the database using JPA

For larger projects, JPA is often used instead of hand-crafted SQL. Using JPA has two big advantages over JDBC:

  1. You need to maintain less SQL code.
  2. JPA provides dialects for the subtle differences between databases that you would otherwise have to code yourself.

For this example we use Hibernate as the JPA implementation. On top of it we add Apache Aries JPA, which supplies an implementation of the OSGi JPA Service Specification and blueprint integration for JPA.

The project examplejpa shows a simple project that implements a PersonService managing Person objects.
Person is just a Java bean annotated with JPA @Entity.
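A minimal sketch of such an entity (the field names are illustrative):

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;

    @Entity
    public class Person {

        @Id
        @GeneratedValue
        private Long id;

        private String name;
        private String twitterName;

        // getters and setters omitted for brevity
    }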

Additionally the project implements two Karaf shell commands, person:add and person:list, which make it easy to test the project.

persistence.xml

Like in a typical JPA project, the persistence.xml defines the DataSource lookup and the database settings, and lists the persistent classes. The DataSource is referred to using the JNDI name "osgi:service/person".
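A sketch of what this can look like (the persistence unit is named person, as referenced from the blueprint below; the entity class name is an assumption):

    <persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
        <persistence-unit name="person" transaction-type="JTA">
            <jta-data-source>osgi:service/person</jta-data-source>
            <class>net.lr.tutorial.karaf.db.examplejpa.Person</class>
            <exclude-unlisted-classes>true</exclude-unlisted-classes>
        </persistence-unit>
    </persistence>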

The OSGi JPA Service Specification defines that the manifest should contain an attribute "Meta-Persistence" that points to the persistence.xml. So this needs to be defined in the configuration of the maven-bundle-plugin in the pom. The Aries JPA container will scan for this attribute
and register an initialized EntityManagerFactory as an OSGi service on behalf of the user bundle.
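In the maven-bundle-plugin this can be done with an extra manifest instruction, roughly:

    <plugin>
        <groupId>org.apache.felix</groupId>
        <artifactId>maven-bundle-plugin</artifactId>
        <extensions>true</extensions>
        <configuration>
            <instructions>
                <Meta-Persistence>META-INF/persistence.xml</Meta-Persistence>
            </instructions>
        </configuration>
    </plugin>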

blueprint.xml

We use a blueprint.xml context to inject an EntityManager into our service implementation and to provide automatic transaction support.
The following snippet is the interesting part:
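Roughly (the Aries namespace versions and the implementation class name may differ in the actual project):

    <blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
               xmlns:jpa="http://aries.apache.org/xmlns/jpa/v1.1.0"
               xmlns:tx="http://aries.apache.org/xmlns/transactions/v1.0.0">

        <bean id="personService" class="net.lr.tutorial.karaf.db.examplejpa.impl.PersonServiceImpl">
            <!-- Inject a managed EntityManager for persistence unit "person" -->
            <jpa:context unitname="person" property="em"/>
            <!-- Open a transaction around every method call -->
            <tx:transaction method="*" value="Required"/>
        </bean>

        <service ref="personService" interface="net.lr.tutorial.karaf.db.examplejpa.PersonService"/>
    </blueprint>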

This makes a lookup for the EntityManagerFactory OSGi service that is suitable for the persistence unit person and injects a thread-safe EntityManager (using a ThreadLocal under the hood) into the
PersonServiceImpl. Additionally it wraps each call to a method of PersonServiceImpl with code that opens a transaction before the method and commits on success or rolls back on any exception thrown.

Build and Install

Build
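As before:

    mvn clean install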

A prerequisite is that the H2 DataSource is installed as described above. Then we have to install the bundles for Hibernate, Aries JPA, transaction, JNDI and of course our db-examplejpa bundle.
See ReadMe.txt for the exact commands to use.

Test

Then we add a person and list the persisted persons:
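A rough sketch of such a session (the exact argument syntax and output format depend on the example's command implementations):

    karaf@root()> person:add 'Christian Schneider' @schneider_chris
    karaf@root()> person:list
    Christian Schneider, @schneider_chris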

Summary

In this tutorial we learned how to work with databases in Apache Karaf. We installed the driver for our database and created a DataSource. We were able to check and manipulate the DataSource using the jdbc:* commands. In the examplejdbc project we learned how to acquire a DataSource
and work with it using plain JDBC. Last but not least we also used JPA to access our database.

Back to Karaf Tutorials