Friday, July 30, 2010

Using Seam 2 with JPA 2

It’s a difficult time to architect new Java web applications. Seam 2 is a proven and well-working application stack, but we will hardly see many new releases on the Seam 2 train. Java EE 6 is in general an excellent option, but if your customer’s choice of application server does not yet support it, it is not a realistic one. And there is still some time left before Seam 3, which builds on top of Java EE 6, is ready for prime time.

Facing this kind of choice recently, I looked into possible migration paths between the two variants. One thing I have often seen in Seam 2 applications is that people really like the Hibernate Criteria API and therefore use Hibernate directly. While Hibernate is an excellent ORM framework, it’s preferable to code against the JPA API when moving to Java EE 6. So - why not use Seam 2 with JPA 2, which finally features an even better (typesafe) Criteria API?
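
To illustrate the attachment: a typical Hibernate Criteria lookup looks roughly like this (a minimal sketch, assuming a Seam-managed EntityManager whose delegate is a Hibernate Session, and a User entity with a username property):

// Classic Hibernate Criteria API - concise, but tied to org.hibernate.Session
org.hibernate.Session session = (org.hibernate.Session) entityManager.getDelegate();
User user = (User) session.createCriteria(User.class)
        .add(org.hibernate.criterion.Restrictions.eq("username", "jdoe"))
        .uniqueResult();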

It turns out to be quite an easy setup (once you get classloading right), with some small extra modifications. I’ve been using Maven, Seam 2.2 and Hibernate 3.5.4 on JBoss 4.2.3. Let’s start with preparing the server. You need to remove the old Hibernate and JPA classes and add the new persistence provider with some dependencies (of course this will be different on other servers):

To be removed from server/lib:


Add to server/lib:


Next, let’s create a sample Seam project. I’m using the CTP Maven archetype for Seam. In the POM file, remove the references to Hibernate and the old JPA libraries and add the new JPA 2 libraries:

<!-- Remove all embedded dependencies
<dependency>
    <groupId>org.jboss.seam.embedded</groupId>
    <artifactId>...</artifactId>
</dependency -->

<dependency>
    <groupId>org.hibernate.javax.persistence</groupId>
    <artifactId>hibernate-jpa-2.0-api</artifactId>
    <version>1.0.0.Final</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-core</artifactId>
    <version>3.5.4-Final</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-jpamodelgen</artifactId>
    <version>1.0.0.Final</version>
    <scope>provided</scope>
</dependency>

Note: If you’ve been using embedded JBoss for your integration tests, this will probably not work without exchanging the JARs there too. I’ve been moving away from this approach, as it turned out to be a common source of headaches in our nightly builds as well as when running tests in Eclipse. I’m very excited to see Arquillian evolving on this topic!

JPA 2 integrates with JSR 303 bean validation, the successor of Hibernate Validator. Unfortunately Seam 2 has references to Hibernate Validator 3, whereas JPA 2 needs version 4. Adding the validator legacy JAR fixes this problem. As bean validation is now part of Java EE 6, we can add it to the server classpath as shown above, as well as to our POM:

<dependency>
    <groupId>javax.validation</groupId>
    <artifactId>validation-api</artifactId>
    <version>1.0.0.GA</version>
    <scope>provided</scope>
</dependency>
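
With the API in place, JSR 303 constraints can sit directly on the entities, and the JPA 2 provider validates them on persist and update. A minimal sketch (the entity and its fields are just an example, not code from the project):

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;

@Entity
public class User {

    @Id
    private Long id;

    // Checked automatically by the persistence provider before insert/update
    @NotNull
    @Size(min = 3, max = 32)
    private String username;
}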

Now it’s time to move to some code. You can jump straight into your persistence config files and bring the schema up to version 2:

<persistence xmlns="http://java.sun.com/xml/ns/persistence"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd"
             version="2.0"> ...

<entity-mappings xmlns="http://java.sun.com/xml/ns/persistence/orm"
                 xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                 xsi:schemaLocation="http://java.sun.com/xml/ns/persistence/orm http://java.sun.com/xml/ns/persistence/orm_2_0.xsd"
                 version="2.0"> ...

Seam proxies entity managers to implement features like EL replacements in JPQL queries. This proxy does not implement the methods new in JPA 2 and will therefore fail. You can write your own proxy very easily:

public class Jpa2EntityManagerProxy implements EntityManager {

    private EntityManager delegate;

    public Jpa2EntityManagerProxy(EntityManager entityManager) {
        this.delegate = entityManager;
    }

    @Override
    public Object getDelegate() {
        return PersistenceProvider.instance()
                .proxyDelegate(delegate.getDelegate());
    }

    @Override
    public void persist(Object entity) {
        delegate.persist(entity);
    }
    ...
}

Add the special Seam functionality as needed. In order to use the proxy with Seam, you’ll have to override the HibernatePersistenceProvider Seam component:

@Name("org.jboss.seam.persistence.persistenceProvider")
@Scope(ScopeType.STATELESS)
@BypassInterceptors
// The original component is precedence FRAMEWORK
@Install(precedence = Install.APPLICATION,
classDependencies={"org.hibernate.Session",
"javax.persistence.EntityManager"})
public class HibernateJpa2PersistenceProvider extends HibernatePersistenceProvider {

@Override
public EntityManager proxyEntityManager(EntityManager entityManager) {
return new Jpa2EntityManagerProxy(entityManager);
}

}

If you use Hibernate Search, have a look at the superclass implementation - you might want to return a FullTextEntityManager directly (as you have it on your classpath anyway - but note that this has not been tested here).
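
An untested sketch of what that override could look like (Search.getFullTextEntityManager comes from Hibernate Search’s JPA support; whether wrapping the proxy this way covers all Seam features is an assumption):

@Override
public EntityManager proxyEntityManager(EntityManager entityManager) {
    // Untested sketch: hand Seam a FullTextEntityManager wrapping the JPA 2 proxy
    return org.hibernate.search.jpa.Search.getFullTextEntityManager(
            new Jpa2EntityManagerProxy(entityManager));
}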

Both implementations are in our Google Code repository, and you can integrate them directly via the following Maven dependency:

<dependency>
    <groupId>com.ctp.seam</groupId>
    <artifactId>seam-jpa2</artifactId>
    <version>1.0.0</version>
</dependency>


You’re now ready to code JPA 2 queries! We’ve already included the metamodel generator utility, so let’s activate it for the build:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>add-source</id>
            <phase>validate</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <source>${basedir}/src/main/hot</source>
                    <source>${basedir}/target/metamodel</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.bsc.maven</groupId>
    <artifactId>maven-processor-plugin</artifactId>
    <version>1.3.1</version>
    <executions>
        <execution>
            <id>process</id>
            <goals>
                <goal>process</goal>
            </goals>
            <phase>generate-sources</phase>
            <configuration>
                <outputDirectory>${basedir}/target/metamodel</outputDirectory>
                <processors>
                    <processor>
                        org.hibernate.jpamodelgen.JPAMetaModelEntityProcessor
                    </processor>
                </processors>
            </configuration>
        </execution>
    </executions>
</plugin>
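
The processor generates one metamodel class per entity into target/metamodel. For a User entity with a username attribute it would produce roughly the following (a sketch; the exact output depends on your entity):

import javax.persistence.metamodel.SingularAttribute;
import javax.persistence.metamodel.StaticMetamodel;

// Generated by the JPAMetaModelEntityProcessor - referenced later as User_.username
@StaticMetamodel(User.class)
public abstract class User_ {

    public static volatile SingularAttribute<User, String> username;
}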


In order to use the processor plugin, you also need the following Maven repositories in your POM:

<pluginRepository>
    <id>annotation-processing-repository</id>
    <name>Annotation Processing Repository</name>
    <url>http://maven-annotation-plugin.googlecode.com/svn/trunk/mavenrepo</url>
</pluginRepository>
<pluginRepository>
    <id>jfrog-repository</id>
    <name>JFrog Releases Repository</name>
    <url>http://repo.jfrog.org/artifactory/plugins-releases</url>
</pluginRepository>

Run the build, update the project config to include the new source folder - and finally we’re ready for some sample code:

@Name("userDao")
@AutoCreate
public class UserDao {

@In
private EntityManager entityManager;

private ParameterExpression<String> param;
private CriteriaQuery<User> query;

@Create
public void init() {
CriteriaBuilder cb = entityManager.getCriteriaBuilder();
query = cb.createQuery(User.class);
param = cb.parameter(String.class);
Root<User> user = query.from(User.class);
query.select(user)
.where(cb.equal(user.get(User_.username), param));
}

public User lookupUser(String username) {
return entityManager.createQuery(query)
.setParameter(param, username)
.getSingleResult();
}
}

This is now quite close to Java EE 6 code - all we will have to do is exchange some annotations:

@Stateful
public class UserDao {

    @PersistenceContext
    private EntityManager entityManager;
    ...

    @Inject
    public void init() {
        ...
    }
}

Enjoy!

Tuesday, July 13, 2010

Test drive with Arquillian and CDI (Part 1)

Here at Cambridge Technology Partners we are as serious about testing as we are about cutting-edge technologies like CDI. Last time we wrote about testing EJBs on embedded Glassfish and now we are back with something even more powerful, so keep on reading!

Background

Recently I was involved in a project based on JBoss Seam where we used Unitils for testing business logic and JPA. I really like this library, mainly because of the following aspects:

  • It provides easy configuration and seamless integration of JPA (and a little bit of DI as well).
  • It greatly simplifies management of test data. All you need to do in order to seed your database with prepared data is provide an XML dataset (in DBUnit’s flat XML format) and add the @DataSet annotation to the test class or method.

The Unitils library is definitely an interesting topic for another blog entry, but since we are going to dive into Java EE 6 testing, those of you who are not patient enough for the next entry can jump directly to the tutorial site. I'm sure you will like it.

The only thing in Unitils I'm not really comfortable with is the fact that the library is not really designed for integration testing. An example which clearly demonstrates this is an observer for Seam events. In this particular case we might need to leave the unit testing world (mocks, spies and other test doubles) and develop real integration tests. The SeamTest module together with JBoss Embedded could help, but it's really a tough task to make it run with Maven. On the other hand, JBoss AS wasn't the target environment for us. Thankfully there is a new kid on the block from JBoss called Arquillian. In the next part of this post I will try to summarize my hands-on experience with this very promising integration testing library. But first things first, let's look briefly at CDI events.

CDI Events

We are going to run Java EE 6 workshops for our customers, and I was extremely happy when my colleagues asked me to play around with Arquillian and prepare some integration tests. I picked a piece of logic responsible for logging market transactions, based on CDI events. In brief, it is a design technique which allows components to interact without any compile-time dependencies. It's similar to the observer pattern, but in the case of CDI events, producers and observers are entirely decoupled from each other. If the following example doesn't give you a clear picture of the concept, please refer to this well-written blog post. Let's take a look at a rather simplified code example.

import javax.ejb.Stateless;
import javax.enterprise.event.Event;
import javax.inject.Inject;

@Stateless
public class TradeService {

    @Inject @Buy
    private Event<ShareEvent> buyEvent;

    public void buy(User user, Share share, Integer amount) {
        user.addShares(share, amount);
        ShareEvent shareEvent = new ShareEvent(share, user, amount);
        buyEvent.fire(shareEvent);
    }

    ...

}

import javax.enterprise.event.Observes;
import javax.inject.Inject;
import javax.inject.Singleton;

@Singleton
public class TradeTransactionObserver implements Serializable {

    ...

    @Inject
    TradeTransactionDao tradeTransactionDao;

    public void shareBought(@Observes @Buy ShareEvent event) {
        TradeTransaction tradeTransaction = new TradeTransaction(event.getUser(), event.getShare(), event.getAmount(), TransactionType.BUY);
        tradeTransactionDao.save(tradeTransaction);
    }

    ...

}

To keep the picture clear I'm not going to include the Share, TradeTransaction, User and ShareEvent classes. What is worth mentioning, however, is that the ShareEvent instance contains the user, the share he bought and the amount. In the User entity we store a map of shares together with amounts, using the new @ElementCollection annotation introduced in JPA 2.0. It allows entity classes to be used as keys in the map.

@ElementCollection
@CollectionTable(name="USER_SHARES")
@Column(name="AMOUNT")
@MapKeyJoinColumn(name="SHARE_ID")
private Map<Share, Integer> shares = new HashMap<Share, Integer>();

Then in the TradeTransaction entity we simply store this information, along with the date and the TransactionType. The complete code example can be downloaded from our Google Code page - see the Resources section at the bottom of the post.
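
Just to make the observer above easier to follow, the TradeTransaction entity could look roughly like this (a sketch with assumed field and mapping names; the real class is in the sample project):

import java.util.Date;
import javax.persistence.Entity;
import javax.persistence.EnumType;
import javax.persistence.Enumerated;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

@Entity
public class TradeTransaction {

    @Id @GeneratedValue
    private Long id;

    @ManyToOne
    private User user;

    @ManyToOne
    private Share share;

    private Integer amount;

    // Transaction timestamp, set when the observer creates the record
    @Temporal(TemporalType.TIMESTAMP)
    private Date date;

    @Enumerated(EnumType.STRING)
    private TransactionType type;

    protected TradeTransaction() {
        // for JPA
    }

    public TradeTransaction(User user, Share share, Integer amount, TransactionType type) {
        this.user = user;
        this.share = share;
        this.amount = amount;
        this.type = type;
        this.date = new Date();
    }
}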

The very first test

We will use the following scenario for our test example (written in BDD style):

  • given a user chooses a CTP share,
  • when he buys it,
  • then a market transaction should be logged.

So the test method could look as follows:

@Test
public void shouldLogTradeTransactionAfterBuyingShare() {
    // given
    User user = em.find(User.class, 1L);
    Share share = shareDao.getByKey("CTP");
    int amount = 1;

    // when
    tradeService.buy(user, share, amount);

    // then
    List<TradeTransaction> transactions = tradeTransactionDao.getTransactions(user);
    assertThat(transactions).hasSize(1);
}

In an ideal world we could simply run this test in our favourite IDE or build tool, without writing a lot of plumbing code or dirty hacks to set up an environment like Glassfish, JBoss AS or Tomcat. And here's where Arquillian comes into play. The main goal of this project is to provide a convenient way for developers to run tests either in embedded or remote containers. It's still in an alpha version, but the number of already supported containers is really impressive. It opens the door to a world of integration tests that are easy and pleasant to write. There are only two things required to make our tests "Arquillian infected":

  1. Put the @RunWith(Arquillian.class) annotation on your test class (or extend the Arquillian base class if you are a TestNG guy).
  2. Prepare a deployment package using the ShrinkWrap API in a method marked with the @Deployment annotation.

@Deployment
public static Archive<?> createDeploymentPackage() {
    return ShrinkWrap.create("test.jar", JavaArchive.class)
            .addPackages(false, Share.class.getPackage(),
                    ShareEvent.class.getPackage(),
                    TradeTransactionDao.class.getPackage())
            .addClass(TradeService.class)
            .addManifestResource(new ByteArrayAsset("<beans/>".getBytes()), ArchivePaths.create("beans.xml"))
            .addManifestResource("inmemory-test-persistence.xml", ArchivePaths.create("persistence.xml"));
}

Odds and ends

Until now, I guess, everything was rather easy to grasp. Unfortunately, while playing with the tests I encountered a few shortcomings, but I found solutions which I hope make this post valuable for readers. Otherwise you could simply jump to the user guide and code examples, couldn't you?

JAR hell

The most time-consuming issue was dependency conflicts, better known as JAR hell. The target environment for the workshop application is Glassfish v3, so I used the embedded version for my integration tests. I decided to have my tests as an integral part of the project, and here the problems began.

The main problem is that you cannot use javaee-api, because you will get exceptions while bootstrapping the container, more or less similar to: java.lang.ClassFormatError: Absent Code attribute in method that is not native or abstract in class file javax/validation/constraints/Pattern$Flag (related thread on the JBoss forum). I also recommend not downloading separate JARs for each API you are using, because you will get even more exceptions :)

Important remark here: if you are using JPA 2.0 with the Criteria API and the hibernate-jpamodelgen module for generating metamodel classes, then you should also exclude the org.hibernate.javax.persistence:hibernate-jpa-2.0-api dependency to avoid yet another class conflict.
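
With Maven, that exclusion could look roughly like this (shown here on the hibernate-jpamodelgen dependency, which pulls the API in transitively - verify in your own dependency tree where the conflicting JAR actually comes from):

<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-jpamodelgen</artifactId>
    <version>1.0.0.Final</version>
    <scope>provided</scope>
    <exclusions>
        <!-- The JPA 2 API is already contained in glassfish-embedded-all -->
        <exclusion>
            <groupId>org.hibernate.javax.persistence</groupId>
            <artifactId>hibernate-jpa-2.0-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>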

You have basically two options:

  1. Use the glassfish-embedded-all JAR, since it already contains all the needed APIs (see the snippet below).
  2. Create a separate project for integration testing and forget everything I mentioned in this section.
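
For the first option, the dependency might look like this (the coordinates and version are what was published for GlassFish v3 at the time of writing - treat them as an assumption and check for the current ones):

<dependency>
    <!-- Assumed coordinates of the all-in-one embedded GlassFish artifact -->
    <groupId>org.glassfish.extras</groupId>
    <artifactId>glassfish-embedded-all</artifactId>
    <version>3.0.1</version>
    <scope>test</scope>
</dependency>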

Preparing a database for testing

The next step is to create a data source for the Glassfish instance. But first we need to tell Arquillian not to delete the Glassfish server folder after each deployment / test execution (which is the default behaviour). All you need to do is create an arquillian.xml file and add the following configuration:

<arquillian xmlns="http://jboss.com/arquillian"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xmlns:glassfish="urn:arq:org.jboss.arquillian.glassfish.embedded30">
    <glassfish:container>
        <glassfish:bindPort>9090</glassfish:bindPort>
        <glassfish:instanceRoot>src/test/glassfish-embedded30</glassfish:instanceRoot>
        <glassfish:autoDelete>false</glassfish:autoDelete>
    </glassfish:container>
</arquillian>

Then we need to take a domain.xml file from a normal Glassfish instance (i.e. from ${glassfish_home}/glassfish/domains/domain1/config), remove all <applications> and <system-applications> nodes, add the new data source and copy it to the src/test/glassfish-embedded-30/config folder. We will use HSQL version 1.8.0.7 in our tests (version 2.0 causes some problems with DBUnit).

<domain log-root="${com.sun.aas.instanceRoot}/logs" application-root="${com.sun.aas.instanceRoot}/applications" version="22">
    <system-applications />
    <applications />
    <resources>
        ...
        <jdbc-connection-pool res-type="java.sql.Driver" description="In memory HSQLDB instance" name="arquilliandemo" driver-classname="org.hsqldb.jdbcDriver">
            <property name="URL" value="jdbc:hsqldb:mem:arquilliandemomem" />
            <property name="user" value="sa" />
            <property name="password" value="" />
        </jdbc-connection-pool>
        <jdbc-resource pool-name="arquilliandemo" jndi-name="arquilliandemo-ds" />
    </resources>
    <servers>
        <server name="server" config-ref="server-config">
            ...
            <resource-ref ref="arquilliandemo-ds" />
        </server>
    </servers>
    <configs>
        ...
    </configs>
</domain>

The last file you need to take from the ${glassfish_home}/glassfish/domains/domain1/config folder is server.policy. And that's it! You have a running Glassfish with an HSQL database, ready for some serious testing.

Data preparation

As I mentioned in the introductory section, I really like Unitils and the way you can seed the database with test data. The only thing to do is to provide an XML file in DBUnit's flat format, like this one:

<dataset>
    <user id="1" firstname="John" lastname="Smith" username="username" password="password" />
    <share id="1" key="CTP" price="18.00" />
</dataset>

and then put the @DataSet("test-data.xml") annotation either on the test method or on the class.

I was really missing this feature, so I decided to implement it myself. A very cool way of adding such behaviour is a JUnit rule. This mechanism, similar to interceptors, has been available since the 4.7 release. I chose to extend the TestWatchman class, since it provides methods to hook around test invocation. The actual rule, which seeds the database from DBUnit flat XML data, is in the example project.
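
A minimal sketch of the idea could look like this (the @PrepareData annotation and the JDBC URL are taken from the examples in this post; the rest - class layout, resource loading, error handling - is my own simplification of what the sample project does):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.sql.Connection;
import java.sql.DriverManager;

import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSet;
import org.dbunit.operation.DatabaseOperation;
import org.junit.rules.TestWatchman;
import org.junit.runners.model.FrameworkMethod;

// Assumed declaration of the project's @PrepareData annotation
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface PrepareData {
    String value();
}

public class DataHandlingRule extends TestWatchman {

    @Override
    public void starting(FrameworkMethod method) {
        // Seed the database only for tests that declare a dataset
        PrepareData prepareData = method.getAnnotation(PrepareData.class);
        if (prepareData != null) {
            seed(prepareData.value());
        }
    }

    private void seed(String dataSetLocation) {
        try {
            // Same in-memory HSQL database as configured in domain.xml above
            Connection jdbcConnection = DriverManager.getConnection(
                    "jdbc:hsqldb:mem:arquilliandemomem", "sa", "");
            IDatabaseConnection connection = new DatabaseConnection(jdbcConnection);
            IDataSet dataSet = new FlatXmlDataSet(
                    Thread.currentThread().getContextClassLoader()
                            .getResourceAsStream(dataSetLocation));
            DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet);
            connection.close();
        } catch (Exception e) {
            throw new RuntimeException("Could not seed database with " + dataSetLocation, e);
        }
    }
}

All you need to do then is create a public field in your test class and decorate it with the @Rule annotation. Here's the complete test class.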

@RunWith(Arquillian.class)
public class TradeServiceTest {

    @Deployment
    public static Archive<?> createDeploymentPackage() {
        return ShrinkWrap.create("test.jar", JavaArchive.class)
                .addPackages(false, Share.class.getPackage(),
                        ShareEvent.class.getPackage(),
                        TradeTransactionDao.class.getPackage())
                .addClass(TradeService.class)
                .addManifestResource(new ByteArrayAsset("<beans />".getBytes()), ArchivePaths.create("beans.xml"))
                .addManifestResource("inmemory-test-persistence.xml", ArchivePaths.create("persistence.xml"));
    }

    @Rule
    public DataHandlingRule dataHandlingRule = new DataHandlingRule();

    @PersistenceContext
    EntityManager em;

    @Inject
    ShareDao shareDao;

    @Inject
    TradeTransactionDao tradeTransactionDao;

    @Inject
    TradeService tradeService;

    @Test
    @PrepareData("datasets/shares.xml")
    public void shouldLogTradeTransactionAfterBuyingShare() {
        // given
        User user = em.find(User.class, 1L);
        Share share = shareDao.getByKey("CTP");
        int amount = 1;

        // when
        tradeService.buy(user, share, amount);

        // then
        List<TradeTransaction> transactions = tradeTransactionDao.getTransactions(user);
        assertThat(transactions).hasSize(1);
    }

}

I must admit that it's a JUnit-specific solution, but you can always implement your own @BeforeTest and @AfterTest methods to achieve the same result in TestNG.

DBUnit gotchas

Using DBUnit's CLEAN_INSERT strategy (or deleting table content after test execution with DELETE_ALL) can raise constraint violation exceptions. HSQL provides a special SQL statement for this purpose, and the sample project invokes this statement just before DBUnit runs.
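
In HSQLDB 1.8 that statement is presumably SET REFERENTIAL_INTEGRITY FALSE; a sketch of issuing it on the test's JDBC connection right before the DBUnit operation (connection handling as in the rule sketch above):

// Turn off FK checks so DBUnit can insert/delete tables in any order (HSQLDB 1.8 syntax)
java.sql.Statement disableChecks = jdbcConnection.createStatement();
disableChecks.execute("SET REFERENTIAL_INTEGRITY FALSE");
disableChecks.close();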

Final thoughts

All in all, Arquillian is a really great integration testing tool, full of potential. It's just great that the JBoss guys are aiming to provide support for almost all widely used application servers and web containers. As you can see from the examples above, it's not that hard to write tests for scenarios more sophisticated than those in the user guide. Keep your eyes on Arquillian - the roadmap is really promising.
In the upcoming second part I will dive into CDI contexts and demonstrate how to use Arquillian for testing contextual components.
If you are writing an application for the Java EE 6 stack, not using Arquillian would be a serious mistake!

Resources