
Tuesday, May 22, 2012

CDI Query Alpha 4 Released

It's been a while since my last post on CDI Query, but that doesn't mean we haven't been busy working on it! Last week I pushed another Alpha release to Maven Central, and I'm very curious about your feedback!

Summarizing the feature highlights since the last post:
  • Entities: Support for entities defined in orm.xml descriptors, @IdClass and composite primary keys: this should close the gap for all kinds of entities.
  • Method expressions have gotten some more love: first of all, validation. As those expressions are not very robust against refactoring, they are now validated at extension initialization. There's also support for ordering and nested properties now.
  • Auditing: Keeping track of entity creation and change dates is often a requirement in enterprise applications. This is supported by simply annotating your entities' temporal properties with @CreatedOn or @ModifiedOn.
  • Extensions in the Criteria Support API: it now allows selections. There's also been a major cleanup with regard to the separation of API and implementation.
  • Postprocessing of query methods with the QueryResult class. This encapsulates the result of a method expression or a @Query annotation, and allows adding ordering, paging and other post processing dynamically (see the sketch after this list).
  • The EntityHome API, inspired by the Seam 2 application framework, helps you connect your entities to the UI and implement CRUD pages with a few lines of code.
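
To illustrate the auditing and QueryResult features, here is a minimal sketch. The entity and the method names on QueryResult (sortAsc, firstResult, maxResults, getResultList) are assumptions based on the descriptions above, not verified against the Alpha 4 API:

@Entity
public class Invoice {

    // Hypothetical temporal properties, annotated for auditing as described above
    @Temporal(TemporalType.TIMESTAMP) @CreatedOn
    private Date created;

    @Temporal(TemporalType.TIMESTAMP) @ModifiedOn
    private Date updated;

}

@Dao
public interface InvoiceDao extends EntityDao<Invoice, Long> {

    QueryResult<Invoice> findByPaid(Boolean paid);

}

// Dynamic post processing of a query method result
List<Invoice> page = invoiceDao.findByPaid(Boolean.FALSE)
        .sortAsc("created")   // assumed ordering method
        .firstResult(0)       // assumed paging methods
        .maxResults(10)
        .getResultList();
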
A few people have already jumped on board, providing great feedback, bugfixes and new features! Special thanks to:
  • Jason Porter (bug fixes, Solder upgrade, support for PK related features and feature requests)
  • Marek Smigielski (bug fixes, feature requests)
  • Aaron Walker (API extensions)
  • and various others, providing some food for thought - thanks guys!

So what's next? We're still in Alpha, and some features have been - frankly - just hacked in to try them out, so there is a need for cleanups. Feature-wise, the various APIs also need some polish, but larger items could be improved support for stored procedures, and a Forge plugin (in case you don't know Forge - hurry up and install it!). Or whatever you feel is missing - looking forward to your issues on GitHub!

Tuesday, November 22, 2011

CDI Query Module First Alpha Released!


It's been some months now since we started exploring CDI extensions as a small exercise. As it turned out, the exercise grew into something usable, which we're now pushing into the open as a first Alpha shot.

So I'm happy to announce the availability of the CDI Query Module on Maven Central! The module helps you create JPA queries with much less boilerplate, and is of course leveraging the CDI extension API as well as Seam Solder. Some of the feature highlights are:

Query By Name

Assuming you have a Person entity which probably looks similar to this:

@Entity
public class Person { 

    ... // primary key etc. skipped

    @Getter @Setter // Lombok-generated accessors
    private String firstName;
    
    @Getter @Setter
    private String lastName;

}

You can simply create a DAO interface to query for Persons:

@Dao
public interface PersonDao extends EntityDao<Person, Long> {

    Person findByFirstNameAndLastName(String firstName, String lastName);

}

This interface does not need to be implemented. A client can just inject it and call the interface method; in the background, the JPA query is automatically created and executed. To create the query, the method name is analyzed and matched to entity properties.

public class PersonAction {

    @Inject
    private PersonDao personDao;

    // assumed view fields, not shown in the original snippet
    private String firstName;
    private String lastName;
    private Person person;

    public void lookup() {
        person = personDao.findByFirstNameAndLastName(firstName, lastName);
    }

}

Note that the base interface also contains a couple of other methods which you might expect from an entity DAO. Ideally, you should not need to inject an EntityManager anymore.
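
For illustration, a few operations one might expect on such a base interface - the signatures here are assumptions for the sake of the example, not the verified module API:

Person person = personDao.findBy(1L);         // lookup by primary key (assumed name)
personDao.save(person);                       // persist or merge
List<Person> everybody = personDao.findAll();
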

Query by Query Strings and Named Queries

Of course matching property names is not very robust against refactoring (some more validation support here is on the roadmap) - if you'd like to have more control over your JPA queries, you can also annotate the method with the query to execute:

@Dao
public interface PersonDao extends EntityDao<Person, Long> {

    @Query("select p from Person p where p.ssn = ?1")
    Person findBySSN(String ssn);

    @Query(named=Person.BY_FULL_NAME)
    Person findByFullName(String firstName, String lastName);

}
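
The named query referenced above is defined on the entity in the usual JPA way; a sketch (the constant value and query text are assumptions):

@Entity
@NamedQuery(name = Person.BY_FULL_NAME,
        query = "select p from Person p where p.firstName = ?1 and p.lastName = ?2")
public class Person {

    public static final String BY_FULL_NAME = "person.byFullName";

    ...

}
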

Criteria API Simplifications

If you're not a big fan of query strings but rather prefer the JPA 2 criteria API, we also allow simplifying this with a small utility API:

public abstract class PersonDao extends AbstractEntityDao<Person, Long> {

    public List<Person> findAdultFamilyMembers(String name, Integer minAge) {
        return criteria()
                    .like(Person_.name, "%" + name + "%")
                    .gtOrEq(Person_.age, minAge)
                    .eq(Person_.validated, Boolean.TRUE)
                    .orderDesc(Person_.age)
                    .createQuery()
                    .getResultList();
    }

}

All the code is hosted and documented on GitHub. Please:

  • Give feedback! If you find this useful or actually not so, we're happy to hear what is still missing.
  • Participate! Forking and creating pull requests are really a breeze on GitHub :-)


Credits:

  • Bartek Majsak for improving the initial code, taking care of quality reports and soon finalizing the stuff on the validation branch ;-) (just kidding, check out Bartek's cool work on the Arquillian Persistence Module!)
  • Grails GORM for inspiring me for this Java implementation
  • The CDI folks for a really great specification
  • Last but not least the Arquillian guys, developing and testing this stuff is pure fun with Arquillian!

Friday, November 11, 2011

JBoss Forge JRebel Plugin - Video Tutorial

It's actually been out for quite a while now (and is therefore already slightly out of date), but I've hardly found time to blog about it: if you're using JRebel and JBoss Forge, have a look at this video tutorial on how to use both of them together.



Make sure to watch it in HD and full screen mode. Thanks to Chris "Kubrik" Reimann for putting the tutorial together!

As an addition, Forge now also supports checking out specific versions of a plugin, which in our case is version 1.0.0.Beta1 of the JRebel plugin. We've also added the plugin to the Forge plugin repository index so you can search for it.

By the way, if you're only using one of the tools or (even worse) none of them, this is the perfect opportunity to get a quick hands-on.

Monday, April 4, 2011

A CDI Extension for Query Generation

While the DAO pattern has gone out of fashion with Java EE 6, it can still be a useful approach for centralizing query related logic when you have to do more than just delegate to the entity manager. Often this approach then leads to small frameworks containing reusable code within a project, or sometimes to a bigger framework spread across entire IT departments of a company.

Inspired by features of Grails and the Spring Data JPA project, my plan was to learn more about CDI extensions by implementing a proof of concept of such a DAO framework based on CDI. CDI is part of Java EE 6 and, already a powerful addition to the platform programming model by itself, provides SPIs to extend it even further. A prominent example is the Seam framework, which in its latest version contains a lot of those extensions. Just drop them in your classpath and they are ready for use. Impressive enough to learn more about the technology.

While I was getting my hands dirty, it seemed to me the result was useful enough to share it here - and of course to demonstrate the power and ease of creating CDI extensions. In this article I'll give you a quick start on CDI extensions as well as (hopefully) an idea of what a portable framework might look like based on this technology. All code presented here is available on GitHub.

The DAO Framework

Some common ingredients of a DAO framework are captured in the code snippet below:

@Dao
public interface SimpleDao extends EntityDao<Simple, Long> {

    Simple findByNameAndEnabled(String name, Boolean enabled);

    @Query(named=Simple.BY_NAME)
    List<Simple> findByNamedQuery(String name);

    @Query(named=Simple.BY_NAME)
    List<Simple> findByNamedQueryRestricted(String name,
            @MaxResults int max, @FirstResult int first);

    @Query(named=Simple.BY_ID)
    Simple findByNamedQueryNamedParams(
            @QueryParam("id") Long id,
            @QueryParam("enabled") Boolean enabled);

    @Query("select s from Simple s where s.name = ?1")
    Simple findByQueryString(String name);

}

Typically a DAO framework has a common interface concrete DAOs can extend from. Using generics here allows having a standard set of methods, like saving or retrieving all entities of a specific type, and can also be used during query generation. Usually this is nothing that cannot be done easily with an entity manager. But once you have injected a DAO into your service class - do you really want to inject the entity manager as well? In order to keep the code leaner, a DAO base interface should provide this kind of methods and of course implement them all automagically - nothing you would like to rewrite again and again.
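
As a sketch, such a base interface might declare operations like these (the names are assumed for illustration):

public interface EntityDao<E, PK extends Serializable> {

    E save(E entity);

    void remove(E entity);

    E findBy(PK primaryKey);

    List<E> findAll();

}
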

Some other features are shown in the method declarations above. Automatic query generation out of method names and parameters, as GORM method expressions do, or creating queries based on annotation meta data and parameters, will often allow leaving the easy cases to the query generator and keeping the code lean to focus on the complex ones.

The CDI Approach

One way to implement such a framework is via a CDI extension. CDI allows extensions to listen to various lifecycle events:
- Before CDI starts discovering beans.
- While it processes annotated types, injection targets, producers, beans and observers.
- And when it finishes with both discovery and validation.

As in our case we are dealing with plain interfaces, the easiest approach is to simply annotate the interface and then listen for annotation processing events. The sample above shows a Dao annotation on the interface, but this would be placed on the extended AbstractEntityDao interface so developers won’t have to worry about it.

So in the extension we listen for annotated types, check for our Dao annotation and register a proxy bean which implements the annotated type. Registering the extension is a matter of two things:
1. Implementing the extension class and listen for the Dao annotation.

public class QueryExtension implements Extension {

    <X> void processAnnotatedType(@Observes ProcessAnnotatedType<X> event, BeanManager beanManager) {
        // all the required information on the type is found in the event
        // the bean manager is used to register the proxy
    }

}

2. Register the extension class as a service provider in the appropriate file (META-INF/services/javax.enterprise.inject.spi.Extension).
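
The services file simply lists the fully qualified name of the extension class; in META-INF/services/javax.enterprise.inject.spi.Extension this would be a single line (the package name here is an assumption):

com.ctp.cdiquery.QueryExtension
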

Registering the proxy with the bean manager is slightly more work, but luckily someone has already done that. If you work with CDI extensions, make sure to include Seam Solder - the Swiss army knife for CDI developers. Solder has built-in support for so-called service handlers, where you annotate an abstract type with a reference to the handler class. The documentation use case probably looks kind of familiar ;-) All our extension has to do is override the handler lookup - and we're done with registering the proxy! The reason we don't use the ServiceHandlerExtension directly is to separate the handler reference from the annotation, and to have a chance to e.g. validate and process further meta data in the extension class.

public class QueryExtension extends ServiceHandlerExtension {

    @Override
    protected <X> Class<?> getHandlerClass(ProcessAnnotatedType<X> event) {
        if (event.getAnnotatedType().isAnnotationPresent(Dao.class)) {
            return QueryHandler.class;
        }
        return null;
    }

}

public class QueryHandler {

    @Inject
    private Instance<EntityManager> entityManager;

    @AroundInvoke
    public Object handle(InvocationContext ctx) throws Exception {
        ...
    }

}


The handler class simply has to provide a public method annotated with @AroundInvoke, where you get all the required information to build up your query dynamically. Note that in the handler class you are also able to use CDI services like injection.
As a framework user, all you have to do is drop the JAR in the classpath and annotate the interface. Look mom, no XML! Well, almost - you still need an (empty) beans.xml somewhere to activate all the CDI magic.
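
To make this more concrete, here is a rough sketch of what the inside of such a handler could look like - the createJpql helper is hypothetical and stands in for the actual analysis of method names and annotations:

@AroundInvoke
public Object handle(InvocationContext ctx) throws Exception {
    Method method = ctx.getMethod();
    // hypothetical helper: derives JPQL from @Query meta data or the method name
    String jpql = createJpql(method);
    javax.persistence.Query query = entityManager.get().createQuery(jpql);
    Object[] args = ctx.getParameters();
    for (int i = 0; i < args.length; i++) {
        query.setParameter(i + 1, args[i]); // bind positional parameters ?1, ?2, ...
    }
    return List.class.isAssignableFrom(method.getReturnType())
            ? query.getResultList() : query.getSingleResult();
}
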

Testing the Extension

Arquillian is a relatively new testing framework which allows you to create dynamic deployment units and run them in a container. A great feature for an extension, as you can easily test it live in a unit test without leaving the IDE! This looks like the following:

@RunWith(Arquillian.class)
public class QueryHandlerTest {

    @Deployment
    public static Archive<?> deployment() {
        return ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(QueryExtension.class)
                .addAsWebInfResource("test-persistence.xml", ArchivePaths.create("classes/META-INF/persistence.xml"))
                .addAsWebInfResource(EmptyAsset.INSTANCE, ArchivePaths.create("beans.xml"))
                .addAsWebInfResource("glassfish-resources.xml")
                .addClasses(SimpleDao.class)
                .addPackage(Simple.class.getPackage());
    }

    @Inject
    private SimpleDao dao;

    @Produces
    @PersistenceContext
    private EntityManager entityManager;

    @Test
    public void shouldCreateQueryByMethodName() {
        // given
        final String name = "testCreateQueryByMethodName";
        createSimple(name);

        // when
        Simple result = dao.findByNameAndEnabled(name, Boolean.TRUE);

        // then
        Assert.assertNotNull(result);
        Assert.assertEquals(name, result.getName());
    }

}

This creates a stripped-down web archive deployment, in this sample for an embedded GlassFish. The test class itself is registered as a bean in the test and can therefore use injection (CDI or container injection, see the @PersistenceContext). Resources like persistence units can be added on demand, as we see in the deployment method (persistence.xml for the persistence unit, glassfish-resources.xml contains a data source definition).

The container gets started and we can see our injected proxy in action. In the sample above we then create some test data and see if our generated query fetches the right data back.

Conclusion

Of course this article is a very simplified version of the whole setup - especially the Arquillian setup required some trial and error, as the framework is still in Alpha (if anybody finds out how to create the data source without deploying the glassfish-resources.xml into a web archive - let me know).

Check out the project source on GitHub to get started. The module structure might look a little complicated, but it follows the common Seam module structure, separating the API (in our case the client annotations) from the implementation (the extension code). Documentation is still "basic", but looking at the unit tests might give an indication of the usage.

Once the project setup is done, things get extremely productive though. Seam Solder provides a rich tool set you can use to create an extension, and Arquillian lets you immediately test your code inside a container. The result is an easy-to-reuse and easy-to-distribute framework you will probably get back to in many of your following Java EE 6 projects.

Friday, August 13, 2010

Test drive with Arquillian and CDI (Part 2)

The first part of the Arquillian series was mainly focused on working with an in-memory database, DI (dependency injection) and events from the CDI spec. Now we will take a closer look at how to deal with testing contextual components. For this purpose we will extend our sample project from the first part by adding a PortfolioController class, a conversation scoped bean handling the processing of the user's portfolio.

@ConversationScoped @Named("portfolioController")
public class PortfolioController implements Serializable {

    // ...

    Map<Share, Integer> sharesToBuy = new HashMap<Share, Integer>();

    @Inject @LoggedIn
    User user;

    @Inject
    private TradeService tradeService;

    @Inject
    private Conversation conversation;

    public void buy(Share share, Integer amount) {
        if (conversation.isTransient()) {
            conversation.begin();
        }
        Integer currentAmount = sharesToBuy.get(share);
        if (null == currentAmount) {
            currentAmount = Integer.valueOf(0);
        }

        sharesToBuy.put(share, currentAmount + amount);
    }

    public void confirm() {
        for (Map.Entry<Share, Integer> sharesAmount : sharesToBuy.entrySet()) {
            tradeService.buy(user, sharesAmount.getKey(), sharesAmount.getValue());
        }
        conversation.end();
    }

    public void cancel() {
        sharesToBuy.clear();
        conversation.end();
    }

    // ...

}

So, let's try out Arquillian! As we already know from the first part, we need to create a deployment package, which will then be deployed by Arquillian on the target container (in our case GlassFish 3.0.1 Embedded).


@Deployment
public static Archive<?> createDeploymentPackage() {
    return ShrinkWrap.create("test.jar", JavaArchive.class)
            .addPackages(false, Share.class.getPackage(),
                    ShareEvent.class.getPackage())
            .addClasses(TradeTransactionDao.class,
                    ShareDao.class,
                    PortfolioController.class)
            .addManifestResource(new ByteArrayAsset("<beans />".getBytes()), ArchivePaths.create("beans.xml"))
            .addManifestResource("inmemory-test-persistence.xml", ArchivePaths.create("persistence.xml"));
}

Next we can start developing a simple test scenario:


  • given the user chose a CTP share,

  • when he confirms the order,

  • then his portfolio should be updated.

Which in JUnit realms could be written as follows:


@RunWith(Arquillian.class)
public class PortfolioControllerTest {

    // deployment method

    @Inject
    ShareDao shareDao;

    @Inject
    PortfolioController portfolioController;

    @Test
    public void shouldAddCtpShareToUserPortfolio() {
        // given
        User user = portfolioController.getUser();
        Share ctpShare = shareDao.getByKey("CTP");

        // when
        portfolioController.buy(ctpShare, 1);
        portfolioController.confirm();

        // then
        assertThat(user.getSharesAmount(ctpShare)).isEqualTo(3);
    }

}

Looks really simple, doesn't it? Well, it's almost that simple but there are some small details which you need to be aware of.

Producers

CDI provides a feature similar to Seam factories or Guice providers. It's called a producer, and it allows you to create an injectable dependency. This can be especially useful when the creation of such an instance requires additional logic, e.g. when it needs to be obtained from an external source. A logged-in user in a web application is a good example here. Thanks to the CDI @Produces construct we can still have very clean code which just works! All we need to do in order to inject the currently logged-in user into our bean is as simple as that:

1. Create a @LoggedIn qualifier which will be used to define that a particular injection is expecting this concrete User bean.


@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.FIELD, ElementType.PARAMETER, ElementType.TYPE})
public @interface LoggedIn {
}

2. Implement the producer method which instantiates the logged-in user in the session scope just after he successfully accesses the application, so it provides an instance of the User class which is of the @LoggedIn "type".


@Produces @SessionScoped @LoggedIn User loggedInUser() {
    // code for retrieving current user from session
}

3. Decorate all injection points in other beans where we need this instance.


@Inject @LoggedIn
User user;

However, this construct can be problematic when writing tests, and an attentive reader will probably already be concerned about it. But with Arquillian we run our test code in the CDI container, so there is no need to simulate a login procedure using mock HTTP sessions or any other constructs. We can take full advantage of this fact and create a producer method which replaces our original one and provides the user directly from the entity manager, for example.


@Produces @LoggedIn User loggedInUser() {
    return entityManager.find(User.class, 1L);
}

Note: I removed the @SessionScoped annotation from the loggedInUser() producer method intentionally. Otherwise you could have trouble with Weld proxies and EclipseLink while trying to persist the entity class. For tests it actually does not make any difference.

Context handling

One small problem arose when I tried to test logic based on the conversation context. I had to figure out a way to programmatically create the appropriate context which would then be used by the SUT (or CUT if you prefer this abbreviation), because I was getting org.jboss.weld.context.ContextNotActiveException. Unfortunately I wasn't able to find anything related on the Arquillian forum or wiki, so I desperately jumped to the Seam 3 examples. I had read somewhere that they also use this library to test their modules and sample projects. Bingo! I found what I was looking for. To make the test code more elegant, I built my solution the same way as the database handling in the first part - using annotations and JUnit rules. Putting a @RequiredScope annotation on the test method instructs a JUnit rule to handle proper context initialization and cleanup after finishing the test. To make the code even cleaner, we can implement such logic in a dedicated class and treat the enum as a factory:


public enum ScopeType {

    CONVERSATION {
        @Override
        public ScopeHandler getHandler() {
            return new ConversationScopeHandler();
        }
    };

    // ... other scopes

    public abstract ScopeHandler getHandler();

}

public class ConversationScopeHandler implements ScopeHandler {

    @Override
    public void initializeContext() {
        ConversationContext conversationContext = Container.instance().services().get(ContextLifecycle.class).getConversationContext();
        conversationContext.setBeanStore(new HashMapBeanStore());
        conversationContext.setActive(true);
    }

    @Override
    public void cleanupContext() {
        ConversationContext conversationContext = Container.instance().services().get(ContextLifecycle.class).getConversationContext();
        if (conversationContext.isActive()) {
            conversationContext.setActive(false);
            conversationContext.cleanup();
        }
    }
}
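
The @RequiredScope annotation used on the test methods can itself be minimal; a sketch (the exact shape is an assumption, matching how the rule below reads it):

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface RequiredScope {
    ScopeType value();
}
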

The JUnit rule will only extract the annotation's value from the test method and delegate context handling to the proper implementation:


public class ScopeHandlingRule extends TestWatchman {

    private ScopeHandler handler;

    @Override
    public void starting(FrameworkMethod method) {
        RequiredScope rc = method.getAnnotation(RequiredScope.class);
        if (null == rc) {
            return;
        }
        ScopeType scopeType = rc.value();
        handler = scopeType.getHandler();
        handler.initializeContext();
    }

    @Override
    public void finished(FrameworkMethod method) {
        if (null != handler) {
            handler.cleanupContext();
        }
    }
}

Finally, here's the fully working test class with two additional test scenarios. I also used the DBUnit add-on from the first post for convenience.


@RunWith(Arquillian.class)
public class PortfolioControllerTest {

    @Rule
    public DataHandlingRule dataHandlingRule = new DataHandlingRule();

    @Rule
    public ScopeHandlingRule scopeHandlingRule = new ScopeHandlingRule();

    @Deployment
    public static Archive<?> createDeploymentPackage() {
        return ShrinkWrap.create("test.jar", JavaArchive.class)
                .addPackages(false, Share.class.getPackage(),
                        ShareEvent.class.getPackage())
                .addClasses(TradeTransactionDao.class,
                        ShareDao.class,
                        TradeService.class,
                        PortfolioController.class)
                .addManifestResource(new ByteArrayAsset("<beans />".getBytes()), ArchivePaths.create("beans.xml"))
                .addManifestResource("inmemory-test-persistence.xml", ArchivePaths.create("persistence.xml"));
    }

    @PersistenceContext
    EntityManager entityManager;

    @Inject
    ShareDao shareDao;

    @Inject
    TradeTransactionDao tradeTransactionDao;

    @Inject
    PortfolioController portfolioController;

    @Test
    @PrepareData("datasets/shares.xml")
    @RequiredScope(ScopeType.CONVERSATION)
    public void shouldAddCtpShareToUserPortfolio() {
        // given
        User user = portfolioController.getUser();
        Share ctpShare = shareDao.getByKey("CTP");

        // when
        portfolioController.buy(ctpShare, 1);
        portfolioController.confirm();

        // then
        assertThat(user.getSharesAmount(ctpShare)).isEqualTo(3);
    }

    @Test
    @PrepareData("datasets/shares.xml")
    @RequiredScope(ScopeType.CONVERSATION)
    public void shouldNotModifyUserPortfolioWhenCancelProcess() {
        // given
        User user = portfolioController.getUser();
        Share ctpShare = shareDao.getByKey("CTP");

        // when
        portfolioController.buy(ctpShare, 1);
        portfolioController.cancel();

        // then
        assertThat(user.getSharesAmount(ctpShare)).isEqualTo(2);
    }

    @Test
    @RequiredScope(ScopeType.CONVERSATION)
    @PrepareData("datasets/shares.xml")
    public void shouldRecordTransactionWhenUserBuysAShare() {
        // given
        User user = portfolioController.getUser();
        Share ctpShare = shareDao.getByKey("CTP");

        // when
        portfolioController.buy(ctpShare, 1);
        portfolioController.confirm();

        // then
        List<TradeTransaction> transactions = tradeTransactionDao.getTransactions(user);
        assertThat(transactions).hasSize(1);
    }

    @Produces @LoggedIn User loggedInUser() {
        return entityManager.find(User.class, 1L);
    }

}

For the full source code you can jump directly to our Google Code repository.

Conclusion

As you can see, playing with Arquillian is pure fun for me. The latest 1.0.0.Alpha3 release brought a lot of new goodies to the table. I hope that the examples in this blog post convinced you that working with different scopes is quite straightforward and requires just a little bit of additional code. However, it's still not the ideal solution, because it's using Weld's internal API to create and manage scopes. So if you are using a different CDI container, you need to figure out how to achieve this yourself - but it's just a matter of adjusting the ScopeHandler implementation to your needs.

There is much more to write about Arquillian so keep an eye on our blog and share your thoughts and suggestions through comments.

Friday, July 30, 2010

Using Seam 2 with JPA 2

It’s a difficult time to architect new Java web applications. Seam 2 is a proven and well working application stack, but we will hardly see many new versions on the Seam 2 train. Java EE 6 is in general an excellent option, but if your customer’s choice of application server does not yet support it, it is not reasonable. Also there is still some time left for Seam 3 prime time, which builds on top of Java EE 6.

Facing this kind of choice recently, I looked into possible migration paths between the two variants. One thing I have often seen in Seam 2 applications is that people really like the Hibernate criteria API and therefore use Hibernate directly. While Hibernate is an excellent ORM framework, it's preferable to use the JPA API when moving to Java EE 6. So - why not use Seam 2 with JPA 2, which finally features an even better (typesafe) criteria API?

It turns out to be quite an easy setup (once you get classloading right), with some small extra modifications. I've been using Maven, Seam 2.2 and Hibernate 3.5.4 on JBoss 4.2.3. Let's start with preparing the server. You need to remove the old Hibernate and JPA classes and add the new persistence provider with some dependencies (of course this will be different on other servers):

To be removed from server/lib:


Add to server/lib:


Next let's create a sample Seam project. I'm using the CTP Maven archetype for Seam. In the POM file, remove the references to Hibernate and the old JPA libraries and add the new JPA 2 libraries:

<!-- Remove all embedded dependencies
<dependency>
    <groupId>org.jboss.seam.embedded</groupId>
    <artifactId>...</artifactId>
</dependency -->

<dependency>
    <groupId>org.hibernate.javax.persistence</groupId>
    <artifactId>hibernate-jpa-2.0-api</artifactId>
    <version>1.0.0.Final</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-core</artifactId>
    <version>3.5.4-Final</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-jpamodelgen</artifactId>
    <version>1.0.0.Final</version>
    <scope>provided</scope>
</dependency>

Note: If you’ve been using embedded JBoss for your integration tests, this will probably not work without exchanging the JARS there too. I’ve been moving away from this approach as it turned out to be a common reason for headache on our nightly builds as well as running tests in Eclipse. I’m very excited to see Arquillian evolving on this topic!

JPA 2 integrates with JSR 303 bean validation, which is the successor of Hibernate Validator. Unfortunately Seam 2 has references to Hibernate Validator 3, where JPA needs version 4. Adding the validator legacy JAR fixes this problem. As bean validation is now part of Java EE 6, we add it to the server classpath as shown above, as well as to our POM:

<dependency>
    <groupId>javax.validation</groupId>
    <artifactId>validation-api</artifactId>
    <version>1.0.0.GA</version>
    <scope>provided</scope>
</dependency>

Now it’s time to move to some code. You can jump straight in your persistence config files and bring the schema up to version 2:

<persistence xmlns="http://java.sun.com/xml/ns/persistence"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd"
        version="2.0"> ...

<entity-mappings xmlns="http://java.sun.com/xml/ns/persistence/orm"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://java.sun.com/xml/ns/persistence/orm http://java.sun.com/xml/ns/persistence/orm_2_0.xsd"
        version="2.0"> ...

Seam proxies entity managers to implement features like EL replacements in JPQL queries. This proxy does not implement the methods new in JPA 2 and will therefore fail. You can write your own proxy very easily:

public class Jpa2EntityManagerProxy implements EntityManager {

    private EntityManager delegate;

    public Jpa2EntityManagerProxy(EntityManager entityManager) {
        this.delegate = entityManager;
    }

    @Override
    public Object getDelegate() {
        return PersistenceProvider.instance()
                .proxyDelegate(delegate.getDelegate());
    }

    @Override
    public void persist(Object entity) {
        delegate.persist(entity);
    }
    ...
}

Add the special Seam functionality as needed. In order to use the proxy with Seam, you'll have to override the HibernatePersistenceProvider Seam component:

@Name("org.jboss.seam.persistence.persistenceProvider")
@Scope(ScopeType.STATELESS)
@BypassInterceptors
// The original component is precedence FRAMEWORK
@Install(precedence = Install.APPLICATION,
        classDependencies={"org.hibernate.Session",
                "javax.persistence.EntityManager"})
public class HibernateJpa2PersistenceProvider extends HibernatePersistenceProvider {

    @Override
    public EntityManager proxyEntityManager(EntityManager entityManager) {
        return new Jpa2EntityManagerProxy(entityManager);
    }

}

If you use Hibernate Search, have a look at the superclass implementation - you might want to instantiate a FullTextEntityManager directly (as you have it in your classpath - but note that this has not been tested here).

Both implementations are on our Google Code repository, and you can integrate them directly via the following Maven dependency:

<dependency>
    <groupId>com.ctp.seam</groupId>
    <artifactId>seam-jpa2</artifactId>
    <version>1.0.0</version>
</dependency>


You’re now ready to code JPA 2 queries! We’ve already included the meta model generator utility, so let’s activate it for the build:

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>build-helper-maven-plugin</artifactId>
    <executions>
        <execution>
            <id>add-source</id>
            <phase>validate</phase>
            <goals>
                <goal>add-source</goal>
            </goals>
            <configuration>
                <sources>
                    <source>${basedir}/src/main/hot</source>
                    <source>${basedir}/target/metamodel</source>
                </sources>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <groupId>org.bsc.maven</groupId>
    <artifactId>maven-processor-plugin</artifactId>
    <version>1.3.1</version>
    <executions>
        <execution>
            <id>process</id>
            <goals>
                <goal>process</goal>
            </goals>
            <phase>generate-sources</phase>
            <configuration>
                <outputDirectory>${basedir}/target/metamodel</outputDirectory>
                <processors>
                    <processor>
                        org.hibernate.jpamodelgen.JPAMetaModelEntityProcessor
                    </processor>
                </processors>
            </configuration>
        </execution>
    </executions>
</plugin>


In order to use the processor plugin, you also need the following Maven repositories in your POM:

<pluginRepository>
    <id>annotation-processing-repository</id>
    <name>Annotation Processing Repository</name>
    <url>http://maven-annotation-plugin.googlecode.com/svn/trunk/mavenrepo</url>
</pluginRepository>
<pluginRepository>
    <id>jfrog-repository</id>
    <name>JFrog Releases Repository</name>
    <url>http://repo.jfrog.org/artifactory/plugins-releases</url>
</pluginRepository>

Run the build and update the project config to include the new source folder - and finally we're ready for some sample code:

@Name("userDao")
@AutoCreate
public class UserDao {

    @In
    private EntityManager entityManager;

    private ParameterExpression<String> param;
    private CriteriaQuery<User> query;

    @Create
    public void init() {
        CriteriaBuilder cb = entityManager.getCriteriaBuilder();
        query = cb.createQuery(User.class);
        param = cb.parameter(String.class);
        Root<User> user = query.from(User.class);
        query.select(user)
                .where(cb.equal(user.get(User_.username), param));
    }

    public User lookupUser(String username) {
        return entityManager.createQuery(query)
                .setParameter(param, username)
                .getSingleResult();
    }
}

This is now quite close to Java EE 6 code - all we will have to do is exchange some annotations:

@Stateful
public class UserDao {

    @PersistenceContext
    private EntityManager entityManager; ...

    @Inject
    public void init() { ...
    }

Enjoy!

Tuesday, July 13, 2010

Test drive with Arquillian and CDI (Part 1)

Here at Cambridge Technology Partners we are as serious about testing as we are about cutting-edge technologies like CDI. Last time we wrote about testing EJBs on embedded Glassfish and now we are back with something even more powerful, so keep on reading!

Background

Recently I was involved in a project based on JBoss Seam where we used Unitils for testing business logic and JPA. I really like this library, mainly because of the following aspects:

  • Provides easy configuration and seamless integration of JPA (also a little bit of DI).
  • Greatly simplifies management of the test data. All you need to do in order to seed your database with prepared data is provide an XML dataset (in the DBUnit flat XML format) and then add the @DataSet annotation on the test class or method.
The Unitils library is definitely an interesting topic for another blog entry, but since we are going to dive into Java EE 6 testing, those of you who are not patient enough for the next blog entry can jump directly to the tutorial site. I'm sure you will like it.

The only thing in Unitils which I'm not really comfortable with is the fact that this library is not really designed for integration testing. An example which clearly demonstrates this is an observer for Seam events. In this particular case we might need to leave the unit testing world (mocks, spies and other test doubles) and develop real integration tests. The SeamTest module together with JBoss Embedded could help, but it's really a tough task to make it run with Maven. On the other hand, JBoss AS wasn't the target environment for us. Thankfully there is a new kid on the block from JBoss called Arquillian. In the next part of this post I will try to summarize my hands-on experience with this very promising integration testing library. But first things first, let's look briefly at CDI events.

CDI Events

We are going to have Java EE 6 workshops for our customers, and I was extremely happy when my colleagues asked me to play around with Arquillian and prepare some integration tests. I picked a piece of logic responsible for logging market transactions based on CDI events. In brief, it is a design technique which provides component interaction without any compile-time dependencies. It's similar to the observer pattern, but in the case of CDI events, producers and observers are entirely decoupled from each other. If the following example doesn't give you a clear explanation of the concept, please refer to this well written blog post. Let's take a look at a quite simplified code example.

import javax.ejb.Stateless;
import javax.enterprise.event.Event;
import javax.inject.Inject;

@Stateless
public class TradeService {

    @Inject @Buy
    private Event<ShareEvent> buyEvent;

    public void buy(User user, Share share, Integer amount) {
        user.addShares(share, amount);
        ShareEvent shareEvent = new ShareEvent(share, user, amount);
        buyEvent.fire(shareEvent);
    }

    ...

}

import javax.enterprise.event.Observes;
import javax.inject.Inject;
import javax.inject.Singleton;

@Singleton
public class TradeTransactionObserver implements Serializable {

    ...

    @Inject
    TradeTransactionDao tradeTransactionDao;

    public void shareBought(@Observes @Buy ShareEvent event) {
        TradeTransaction tradeTransaction = new TradeTransaction(event.getUser(), event.getShare(), event.getAmount(), TransactionType.BUY);
        tradeTransactionDao.save(tradeTransaction);
    }

    ...

}

To preserve the clear picture I'm not going to include the Share, TradeTransaction, User and ShareEvent classes. What is worth mentioning, however, is that the instance of ShareEvent contains the user, the share which he bought and the amount. In the User entity we store a map of shares together with the amount, using the new @ElementCollection annotation introduced in JPA 2.0. It allows using entity classes as keys in the map.

@ElementCollection
@CollectionTable(name="USER_SHARES")
@Column(name="AMOUNT")
@MapKeyJoinColumn(name="SHARE_ID")
private Map<Share, Integer> shares = new HashMap<Share, Integer>();

Then in the TradeTransaction entity we simply store this information and additionally the date and TransactionType. The complete code example can be downloaded from our Google Code page - see the Resources section at the bottom of the post.

The very first test

We will use the following scenario for our test example (written in the BDD manner):

  • given the user chose a CTP share,
  • when he buys it,
  • then a market transaction should be logged.
So the test method could look as follows:

@Test
public void shouldLogTradeTransactionAfterBuyingShare() {
    // given
    User user = em.find(User.class, 1L);
    Share share = shareDao.getByKey("CTP");
    int amount = 1;

    // when
    tradeService.buy(user, share, amount);

    // then
    List<TradeTransaction> transactions = tradeTransactionDao.getTransactions(user);
    assertThat(transactions).hasSize(1);
}

In the ideal world we could simply run this test in our favourite IDE or build tool without writing a lot of plumbing code or dirty hacks to set up an environment like GlassFish, JBoss AS or Tomcat. And here's where Arquillian comes into play. The main goal of this project is to provide a convenient way for developers to run tests either in embedded or remote containers. It's still in an alpha version, but the number of already supported containers is really impressive. It opens the door to a world of integration tests which are easy and pleasant to write. There are only two things required in order to make our tests "Arquillian infected":

  1. Set the @RunWith(Arquillian.class) annotation for your test class (or extend the Arquillian base class if you are a TestNG guy).
  2. Prepare a deployment package using the ShrinkWrap API in a method marked with the @Deployment annotation.

@Deployment
public static Archive<?> createDeploymentPackage() {
    return ShrinkWrap.create("test.jar", JavaArchive.class)
            .addPackages(false, Share.class.getPackage(),
                    ShareEvent.class.getPackage(),
                    TradeTransactionDao.class.getPackage())
            .addClass(TradeService.class)
            .addManifestResource(new ByteArrayAsset("<beans/>".getBytes()), ArchivePaths.create("beans.xml"))
            .addManifestResource("inmemory-test-persistence.xml", ArchivePaths.create("persistence.xml"));
}

Odds and ends

Until now I guess everything was rather easy to grasp. Unfortunately, while playing with the tests I encountered a few shortcomings, but I found solutions which I hope make this post valuable for readers. Otherwise you could simply jump to the user guide and code examples, couldn't you?

JAR hell

The most time consuming issue was dependency conflicts, better known as JAR hell. The target environment for the workshop application is GlassFish v3, so I used the embedded version for my integration tests. I decided to have my tests as an integral part of the project, and here the problems began.

The main problem is that you cannot use javaee-api, because you will get exceptions while bootstrapping the container, more or less similar to: java.lang.ClassFormatError: Absent Code attribute in method that is not native or abstract in class file javax/validation/constraints/Pattern$Flag (related thread on the JBoss forum). I also recommend not downloading separate JARs for each project which you are using, because you will get even more exceptions :)

Important remark here: if you are using JPA 2.0 with the Criteria API and the hibernate-jpamodelgen module for generating metamodel classes, then you should also exclude the org.hibernate.javax.persistence:hibernate-jpa-2.0-api dependency to avoid yet another class conflict.

You have basically two options:

  1. Use the glassfish-embedded-all JAR, since it already contains all the needed APIs.
  2. Create a separate project for integration testing and forget about everything I mentioned in this section.

Preparing a database for testing

The next step is to create a data source for the GlassFish instance. But first we need to tell Arquillian not to delete the GlassFish server folder after each deployment / test execution (which is the default behaviour). All you need to do is create an arquillian.xml file and add the following configuration:

<arquillian xmlns="http://jboss.com/arquillian"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns:glassfish="urn:arq:org.jboss.arquillian.glassfish.embedded30">
    <glassfish:container>
        <glassfish:bindPort>9090</glassfish:bindPort>
        <glassfish:instanceRoot>src/test/glassfish-embedded30</glassfish:instanceRoot>
        <glassfish:autoDelete>false</glassfish:autoDelete>
    </glassfish:container>
</arquillian>

Then we need to take a domain.xml file from a normal GlassFish instance (i.e. from ${glassfish_home}/glassfish/domains/domain1/config), remove all <applications> and <system-applications> nodes, add the new data source and copy it to the src/test/glassfish-embedded30/config folder. We will use HSQL version 1.8.0.7 in our tests (the 2.0 version is causing some problems with DBUnit).

<domain log-root="${com.sun.aas.instanceRoot}/logs" application-root="${com.sun.aas.instanceRoot}/applications" version="22">
    <system-applications />
    <applications />
    <resources>
        ...
        <jdbc-connection-pool res-type="java.sql.Driver" description="In memory HSQLDB instance" name="arquilliandemo" driver-classname="org.hsqldb.jdbcDriver">
            <property name="URL" value="jdbc:hsqldb:mem:arquilliandemomem" />
            <property name="user" value="sa" />
            <property name="password" value="" />
        </jdbc-connection-pool>
        <jdbc-resource pool-name="arquilliandemo" jndi-name="arquilliandemo-ds" />
    </resources>
    <servers>
        <server name="server" config-ref="server-config">
            ...
            <resource-ref ref="arquilliandemo-ds" />
        </server>
    </servers>
    <configs>
        ...
    </configs>
</domain>

The last file which you need to take from the ${glassfish_home}/glassfish/domains/domain1/config folder is server.policy. And that's it! You have a running GlassFish with an HSQL database, ready for some serious testing.

Data preparation

As I mentioned in the introductory section, I really like Unitils and the way you can seed the database with test data. The only thing to do is to provide an XML file in the flat DBUnit format like this one:

<dataset>
    <user id="1" firstname="John" lastname="Smith" username="username" password="password" />
    <share id="1" key="CTP" price="18.00" />
</dataset>

and then put the @DataSet("test-data.xml") annotation either on the test method or a class.

I was really missing this feature, so I decided to implement it myself. A very cool way of adding such behaviour is by using a JUnit rule. This mechanism, similar to interceptors, has been available since the 4.7 release. I chose to extend the TestWatchman class, since it provides methods to hook around test invocations. You can see the rule's logic for seeding the database with DBUnit flat XML data in the example project. All you need to do is create a public field in your test class and decorate it with the @Rule annotation.
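
A rough sketch of what the rule could look like - the class shape is assumed from its usage in the test below, and the actual DBUnit seeding is elided:

public class DataHandlingRule extends TestWatchman {

    @Override
    public void starting(FrameworkMethod method) {
        PrepareData prepareData = method.getAnnotation(PrepareData.class);
        if (null == prepareData) {
            return;
        }
        // hypothetical helper: loads the flat XML dataset and runs a DBUnit CLEAN_INSERT
        seedDatabase(prepareData.value());
    }
}

And here's the complete test class:
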

@RunWith(Arquillian.class)
public class TradeServiceTest {

    @Deployment
    public static Archive<?> createDeploymentPackage() {
        return ShrinkWrap.create("test.jar", JavaArchive.class)
                .addPackages(false, Share.class.getPackage(),
                        ShareEvent.class.getPackage(),
                        TradeTransactionDao.class.getPackage())
                .addClass(TradeService.class)
                .addManifestResource(new ByteArrayAsset("<beans />".getBytes()), ArchivePaths.create("beans.xml"))
                .addManifestResource("inmemory-test-persistence.xml", ArchivePaths.create("persistence.xml"));
    }

    @Rule
    public DataHandlingRule dataHandlingRule = new DataHandlingRule();

    @PersistenceContext
    EntityManager em;

    @Inject
    ShareDao shareDao;

    @Inject
    TradeTransactionDao tradeTransactionDao;

    @Inject
    TradeService tradeService;

    @Test
    @PrepareData("datasets/shares.xml")
    public void shouldLogTradeTransactionAfterBuyingShare() {
        // given
        User user = em.find(User.class, 1L);
        Share share = shareDao.getByKey("CTP");
        int amount = 1;

        // when
        tradeService.buy(user, share, amount);

        // then
        List<TradeTransaction> transactions = tradeTransactionDao.getTransactions(user);
        assertThat(transactions).hasSize(1);
    }

}

I must admit that it's a JUnit specific solution, but you can always implement your own @BeforeTest and @AfterTest methods to achieve the same result in TestNG.

DBUnit gotchas

Using DBUnit's CLEAN_INSERT strategy (or deleting table content after test execution by using DELETE_ALL) can raise constraint violation exceptions. HSQL provides a special SQL statement for this purpose, and the sample project invokes this statement just before DBUnit runs.
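
For HSQLDB 1.8 this is presumably the referential integrity switch, issued around the DBUnit operation - something along these lines:

entityManager.createNativeQuery("SET REFERENTIAL_INTEGRITY FALSE").executeUpdate();
// ... DBUnit CLEAN_INSERT runs here ...
entityManager.createNativeQuery("SET REFERENTIAL_INTEGRITY TRUE").executeUpdate();
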

Final thoughts

All in all, Arquillian is a really great integration testing tool, full of potential. It's just great that the JBoss guys are aiming to provide support for almost all widely used application servers and web containers. As you can see from the examples above, it's not that hard to have tests for more sophisticated scenarios than you can find in the user guide. Keep your eyes on Arquillian - the roadmap is really promising.
In the upcoming second part I will dive into CDI contexts and demonstrate how to use Arquillian for testing contextual components.
If you are writing an application for the Java EE 6 stack, not using Arquillian is a serious mistake!

Resources

Wednesday, April 28, 2010

JBoss Seam Maven Archetype Video Tutorial

While the release of Seam 3 is getting closer, you might still want to do quick prototyping based on a Seam 2 project. Of course this is easiest done with either seam-gen or, if you're like me a Maven user, with an archetype.

Chris has prepared a video tutorial on how to get started with our Maven WAR archetype (featuring hot deploy, TestNG with embedded JBoss and either RichFaces or ICEFaces inclusion) based on one of my previous articles. Check it out here (best seen in HD [UPDATE: sorry, missed the HD embed]):



Enjoy!

Friday, October 30, 2009

Portal Update November 2009 and Java Buzz

After a busy summer I think it is absolutely necessary to summarize the latest updates in the Java Portal space.

From the commercial side, major players are:
  • Oracle WebLogic Portal:
    The current release is still WLP 10g3 but soon we expect the first 11g release with codename "Sunshine".
    Major improvements are
    - JSR-286 compliance (Portlet 2.0)
    - WSRP 2.0 support (Event based coordination, IPC for remote portlets, resource serving)
    - Full interoperability with WebCenter in both directions
    - Improved Ajax support
    - VCR Direct SPI Support for UCM
    - Even more REST APIs to access portal information
    - New REST API to access Unified User Profile data
    - First support of the new Content Management Standard driven by Oasis: CMIS
    - Apache Beehive still supported but not enhanced
    - Replacement of Autonomy Search Engine by SES

    The REST API Architecture in WLP 11g:




  • Oracle WebCenter Suite 11g R1:
    WebCenter Suite includes the formerly known product AquaLogic Interaction by BEA, now called WCI, WebCenter Interaction. Download it here.

  • Adobe: Adobe? Yes... Since the new release of Adobe ColdFusion 9, there is a new portal player to be considered when it comes to interoperability based on JSR-168/286 portlets. ColdFusion 9 is now fully compliant with the Portlet Containers from the Java world.

  • IBM WebSphere Portal 6.1: no updates.
Open Source portals that have the most promising potential at the moment are:
  • JBoss Portal and eXo Portal have been merged!



    This latest interesting announcement was made official on Sep 3rd at JBoss World in Chicago. eXo has fully committed its entire open source portal stack to Red Hat's newly introduced GateIn portal project, extending it with cutting-edge collaboration features as well as document and content management features.

  • Liferay Portal 5.2: no major updates since last post.

  • SUN continues to offer the Web Space Server based on the Liferay Portal source code. No major updates besides a brand new white paper.

  • JBoss Portal 2.7.2:
    - Since June 2009, JBoss fully focused on the new project GateIn

  • eXo Portal 2.5.1: no updates besides GateIn announcements

  • Jetspeed 2.2.0: After quite a long time without updates, a new version has been released (summer 2009) that is fully JSR-286 compliant! The new version comes along with quite a bunch of updated documentation pages... so it would be worth having a look at it again. Download it here.
Besides the Portal related activities, let's have a look on Java related quick news:
  • Java EE 6: JSR-316 has reached Proposed Final Draft. The hot discussion between JSR-299 and JSR-330 has finally found a common resolution, and both will be part of EE 6, where JSR-299 will be based on the dependency injection specification defined by JSR-330.
  • Oracle has still not acquired SUN: the OK from EMEA is still pending.
  • JSF 2.0: Mojarra 2.0, the production-quality reference implementation for JSF 2.0, is out! This will of course be part of GlassFish v3 (final release planned for Dec 12th, 2009), but you can grab the bits right now for your first dirty hands-on experience!
  • IntelliJ IDEA is now available in two editions. The Community Edition (Java SE focused) is now available as open source at JetBrains.org.
  • Geek Food: I discovered mainly two new things that I found deserve a Geek award:
    - Prezi ! Forget PPT and Google Presentations... Old school!
    - Play ! Clean alternative to develop JavaEE apps based on RESTful architectures.
  • Google Wave: I finally got an account (thanks a lot J. !!) but up to now I'm rather disappointed... nonetheless, the full features are not released yet, so I'm ready to get blown away. If you are interested, I still have some invitations left :-) Start your wave!
  • JavaOne 2010 : ... no, still no signs whether there will be a next JavaOne :-(

Wednesday, July 1, 2009

JBoss Seam Archetype - now with ICEfaces

Last week I visited Jazoon - and had a great opportunity to see several impressive demos of ICEfaces! Of course I had to get my hands on it immediately, and started to integrate ICEfaces into the JBoss Seam Maven archetype (as described in my previous post).

You can give it a spin by starting a shell and running Maven with the Archetype plugin:

>mvn archetype:generate -DarchetypeCatalog=http://tinyurl.com/jbsarch -DajaxLibrary=icefaces
[INFO] Scanning for projects...
[INFO] Searching repository for plugin with prefix: 'archetype'.
...
Choose archetype:
1: http://tinyurl.com/jbsarch -> jboss-seam-archetype (Archetype for JBoss Seam Projects)
Choose a number: (1): 1
Define value for serverDir: : [your JBoss 5 server location]
Define value for groupId: : [your groupId]
Define value for artifactId: : [your artifactId]
Define value for version: 1.0-SNAPSHOT: : [your version]
Define value for package: : [your package]
Confirm properties configuration:
serverType: jboss5
ajaxLibrary: icefaces
...
Y: : y
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
...


Make sure you don't misspell "icefaces", as this will otherwise screw up the application. There's no input validation in Maven Archetype yet, but I've started looking into it ;-)

Again, change to the project directory and build the project:

>mvn package

Now this also executes a sample unit test (fingers crossed it works this time ;-) - thanks to Oscar for the feedback!

Have fun with it! Anybody mind contributing a decent layout template?

Friday, June 19, 2009

Creating a JBoss Seam Maven Archetype

In case you're a regular reader of this blog, I guess you're aware that I'm a frequent user of both Maven and JBoss Seam - and that I'm regularly trying to combine working with both! My usual approach for setting up a new project was either to start with an empty Maven web application project and copy the missing files over, or to start with a seam-gen project and move it into a Maven structure. Both less than ideal...

Maven provides so called archetypes for a quick project setup. As there is not (yet?) an official archetype for Seam projects, I've been working on my own - and here's how you can use it as well!

All you need is
  • a recent version of Maven downloaded (I used 2.0.10)
  • and the Maven executable referenced in your path so you can use it on the console.
Open up a console, cd to your projects directory and type:

mvn archetype:generate -DarchetypeCatalog=http://tinyurl.com/jbsarch

This will start Maven and show the following command line output:

[INFO] Scanning for projects...
[INFO] Searching repository for plugin with prefix: 'archetype'.
[INFO] ------------------------------------------------------------------------
[INFO] Building Maven Default Project
[INFO] task-segment: [archetype:generate] (aggregator-style)
[INFO] ------------------------------------------------------------------------
[INFO] Preparing archetype:generate
[INFO] No goals needed for project - skipping
[INFO] Setting property: classpath.resource.loader.class => 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'.
[INFO] Setting property: velocimacro.messages.on => 'false'.
[INFO] Setting property: resource.loader => 'classpath'.
[INFO] Setting property: resource.manager.logwhenfound => 'false'.
[INFO] [archetype:generate]
[INFO] Generating project in Interactive mode
[INFO] No archetype defined. Using maven-archetype-quickstart (org.apache.maven.archetypes:maven-archetype-quickstart:1.0)
Choose archetype:
1: http://tinyurl.com/jbsarch -> jboss-seam-archetype (Archetype for JBoss Seam Projects)
Choose a number: (1):


The remote archetype catalog contains so far only one archetype (BTW: the jbsarch in tinyurl.com/jbsarch stands for JBoss Seam ARCHetype - hope you can remember this better than the full URL :-). Select the archetype by typing 1 and enter your Maven project properties as well as your JBoss server directory:

[INFO] snapshot com.ctp.archetype:jboss-seam-archetype:1.0.0-SNAPSHOT: checking for updates from jboss-seam-archetype-repo
Define value for serverDir: : /Developer/Servers/JBoss/jboss-5.1.0.GA
Define value for groupId: : com.ctp
Define value for artifactId: : fluxcapacitor
Define value for version: 1.0-SNAPSHOT: :
Define value for package: com.ctp: : com.ctp.fluxcapacitor
Confirm properties configuration:
serverType: jboss5
serverDir: /Developer/Servers/JBoss/jboss-5.1.0.GA
groupId: com.ctp
artifactId: fluxcapacitor
version: 1.0-SNAPSHOT
package: com.ctp.fluxcapacitor
Y: : y
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4 minutes 57 seconds
[INFO] Finished at: Fri Jun 19 19:12:19 CEST 2009
[INFO] Final Memory: 12M/79M
[INFO] ------------------------------------------------------------------------


Note that the serverType property defaults to jboss5. If you have a JBoss 4.2.x installation, quit with n, retype everything (hmm...) and use jboss4 instead.

In case anything fails here, make sure your archetype plugin is at least version 2.0-alpha-4 (I had to delete the plugin's repository info file in my local repository once). Now with your project created, let's build and deploy it!

Aragorn:sandbox thug$ cd fluxcapacitor/
Aragorn:fluxcapacitor thug$ mvn package
[INFO] Scanning for projects...
[INFO] Reactor build order:
[INFO] [fluxcapacitor]
[INFO] [fluxcapacitor :: JBoss Configuration]
[INFO] [fluxcapacitor :: Web Application]


The project currently features two modules: one containing the actual web application, the other containing JBoss server configuration files. With the basic archetype you only get a datasource there, but you can also use it to change server ports, define security realms, queues etc.
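
For illustration, such a JBoss datasource descriptor (a *-ds.xml file) looks roughly like the sketch below - all names and connection values here are made up, the archetype generates its own:

<?xml version="1.0" encoding="UTF-8"?>
<datasources>
    <local-tx-datasource>
        <!-- illustrative values only - adjust to your database -->
        <jndi-name>fluxcapacitorDatasource</jndi-name>
        <connection-url>jdbc:hsqldb:.</connection-url>
        <driver-class>org.hsqldb.jdbcDriver</driver-class>
        <user-name>sa</user-name>
        <password></password>
    </local-tx-datasource>
</datasources>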

Environment-specific properties are referenced in filter property files. You can find the development filter file in ${artifactId}/environment/filters/${artifactId}-development.properties. The filter file selection happens in the parent POM: it defines a development profile which sets the environment property. Setting this property to test will instead pick up the ${artifactId}/environment/filters/${artifactId}-test.properties filter file.
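
A minimal sketch of how such profiles in the parent POM can look - the environment property name follows the description above, the test profile is an assumption about how you could extend it:

<profiles>
    <profile>
        <!-- active unless another profile is selected explicitly -->
        <id>development</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <properties>
            <environment>development</environment>
        </properties>
    </profile>
    <profile>
        <!-- selects the *-test.properties filter file -->
        <id>test</id>
        <properties>
            <environment>test</environment>
        </properties>
    </profile>
</profiles>

Running mvn package -Ptest should then switch the filter file - explicitly activating a profile turns off the activeByDefault one.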

Fire up your JBoss instance - everything should start up well (fingers crossed...) and you can point your browser to the base URL of your application. One caveat: the parent POM contains your JBoss installation directory hard-coded. When working in a team, you might want to factor this out into a developer-specific profile.
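
A sketch of such a developer-specific profile in ~/.m2/settings.xml - the serverDir property name mirrors the archetype prompt and assumes the parent POM resolves the directory from a property:

<settings>
    <profiles>
        <profile>
            <id>local-jboss</id>
            <properties>
                <!-- each developer points this at their own installation -->
                <serverDir>/Developer/Servers/JBoss/jboss-5.1.0.GA</serverDir>
            </properties>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>local-jboss</activeProfile>
    </activeProfiles>
</settings>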


Done! Import the project in your favorite IDE and start prototyping!

Unfortunately this is still far away from what seam-gen provides. Maven archetypes have improved over time, but they are currently not flexible enough to be useful in complex setups:
  • Entering project properties is not even close to user-friendly. At least descriptions, input validation (enumerations) and reasonable handling of default values would help.
  • Conditional installation of files. If you know how this works - let me know :-) This would be required to distinguish between WAR and EAR projects, and for targeting different application servers (yeah, like GlassFish!).
In case I find some time to contribute something here, I'll report back. If you have feedback or contributions to the archetype, they are as usual very welcome (the code is available in our Google Code repository)!

Friday, May 29, 2009

JBoss Seam Hot Deploy with Maven - EAR Projects

As requested in some comments on my previous posts (initial post and update), the Seam Hotdeploy Maven plugin now also works with EAR projects.

Some more details on how to configure your project can be found in our Google Code Wiki. The plugin now defines a new packaging type for Seam WAR modules - the only way to override the war:war goal in the package phase.

If you're interested in how to override a default Maven lifecycle goal, you can check out the plugin source - something I haven't been able to google easily. Or maybe wait for a dedicated blog post - stay tuned!

Something you should definitely check out together with the Hot Deploy plugin is the Maven CLI plugin. See the great post from Dan here, or the integration in the sample POM.

As usual, feedback and contributions are very welcome!

Thursday, April 16, 2009

JBoss Seam on Google App Engine - First Steps

[UPDATE] The latest Mojarra release 1.2_13 as well as the latest App Engine SDK 1.2.2 seem to fix a couple of the problems described below. My task backlog is actually growing :-) but I hope to find some time to take another look at how things work with these new versions.

I spent the last three weeks in a repetition course with the Swiss army - far away from any Java code. Naturally, my fingers started to itch while reading Google's announcement that their App Engine now supports Java! So I grabbed my MacBook to enjoy the sun, Java coding and the Easter holidays.

As I usually write web applications with JBoss Seam, I decided to give the framework a try in the Google cloud - prepared for a bumpy road, as Seam builds on standards which are mostly listed as either not working or not known to be working on App Engine. The following article describes the (bad) tweaks I had to make to get something basic running - I guess once you start doing serious development, you will hit more walls.

First, install the Eclipse Plugin for App Engine and create a new project. I'll base my descriptions on this initial setup. Switch on sessions in appengine-web.xml:
<sessions-enabled>true</sessions-enabled>

Setting up JSF 1.2

Download the latest Mojarra 1.2 release [1.2_12] and put it into the WEB-INF/lib directory. You can configure the Faces servlet in web.xml as usual:
<servlet>
    <servlet-name>Faces Servlet</servlet-name>
    <servlet-class>javax.faces.webapp.FacesServlet</servlet-class>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>Faces Servlet</servlet-name>
    <url-pattern>*.seam</url-pattern>
</servlet-mapping>

Starting Jetty with only this in place will fail due to some incompatibilities between Jetty and Mojarra. As we will need JBoss EL for Seam anyway, we can configure Mojarra to use the JBoss EL ExpressionFactory. Add the JBoss EL JAR to WEB-INF/lib and the following XML to your web.xml:
<context-param>
    <param-name>com.sun.faces.expressionFactory</param-name>
    <param-value>org.jboss.el.ExpressionFactoryImpl</param-value>
</context-param>

Now it's already patch time. Jetty in the Google environment seems to have a bug in its Servlet API implementation, missing the ServletContext.getContextPath() method (new in version 2.5). Also, Mojarra tries to be clever about initialization work and uses threads in the ConfigManager - the App Engine SecurityManager will make the whole thing blow up. Something similar happens in the JBoss ReferenceCache class. All patched classes can be found here. Drop the Java code into your source folder; Jetty's classloader will pick it up.

This will at least make the whole thing start up. I also added Facelets - the JAR file, the view handler in faces-config.xml and the view suffix in web.xml, as sketched below.
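
For reference, the standard Facelets 1.x configuration - the view handler goes into faces-config.xml, the suffix into web.xml:

<!-- faces-config.xml: register the Facelets view handler -->
<application>
    <view-handler>com.sun.facelets.FaceletViewHandler</view-handler>
</application>

<!-- web.xml: resolve *.seam requests to .xhtml view files -->
<context-param>
    <param-name>javax.faces.DEFAULT_SUFFIX</param-name>
    <param-value>.xhtml</param-value>
</context-param>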

Unfortunately, using RichFaces components is an absolute no-go on App Engine: RichFaces is full of references to AWT and JAI classes, which Google blocks completely. If anybody wants to try ICEFaces - good luck!

Adding Seam

Now it's time to add all the Seam stuff - quite a few JARs to put into the application. I basically used Seam 2.1.1.GA and libraries from JBoss 4.2.3.GA. The complete list of what should go into WEB-INF/lib is shown in the screenshot below.



JTA is not even on the list of APIs Google gives a recommendation for, so let's fall back on Java SE behavior. Configure components.xml with:
<transaction:entity-transaction entity-manager="#{entityManager}"/>
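
The #{entityManager} referenced here needs to be defined as well. A minimal sketch for components.xml, assuming a persistence unit named transactions-optional as in the Google docs:

<!-- the persistence-unit-name must match your persistence.xml -->
<persistence:entity-manager-factory name="entityManagerFactory"
    persistence-unit-name="transactions-optional"/>
<persistence:managed-persistence-context name="entityManager" auto-create="true"
    entity-manager-factory="#{entityManagerFactory}"/>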


Although we cannot use Hibernate as the persistence provider, Seam has a couple of dependencies on it (e.g. when using Hibernate Validator). As soon as it is on the classpath, Seam activates its HibernatePersistenceProvider component. This behaves pretty badly here, so for simplicity we just override the component:
@Name("org.jboss.seam.persistence.persistenceProvider")
@Scope(ScopeType.STATELESS)
@BypassInterceptors
@Install(precedence = Install.APPLICATION,
classDependencies={"org.hibernate.Session", "javax.persistence.EntityManager"})
public class OverrideHibernatePersistenceProvider extends PersistenceProvider {
}

Now also add a persistence provider as described in the Google docs, and disable DataNucleus' check for multiple PersistenceContextFactory instantiations with this property in appengine-web.xml:
<property name="appengine.orm.disable.duplicate.emf.exception" value="true"/>


As before, Seam also needs a couple of patches. The main reasons are references to javax.naming.NamingException (which is not white-listed - credits to Toby for the hint) and session/conversation components not implementing Serializable correctly. That last point has probably not hit Seam or your application for the last time.

Identity

As a next step I tried to add some users to the application. Seam's Identity module builds around the javax.security.auth.Subject and Principal classes. Even though those classes are white-listed, the SecurityManager blocks any attempt to add a Principal to a Subject. Well, how useful is that... As a quick fallback I integrated the Google Accounts API:
@Name("org.jboss.seam.security.identity")
@Scope(SESSION)
@Install(precedence = Install.APPLICATION)
@BypassInterceptors
@Startup
public class AppEngineIdentity extends Identity {

private static final long serialVersionUID = -9111123179634646677L;

public static final String ROLE_USER = "user";
public static final String ROLE_ADMIN = "admin";

private transient UserService userService;

@Create
@Override
public void create() {
userService = UserServiceFactory.getUserService();
}

@Override
public boolean isLoggedIn() {
return getUserService().isUserLoggedIn();
}

@Override
public Principal getPrincipal() {
if (isLoggedIn())
return new SimplePrincipal(getUserService().getCurrentUser().getNickname());
return null;
}

@Override
public void checkRole(String role) {
if (!isLoggedIn())
throw new NotLoggedInException();
if ((ROLE_ADMIN.equals(role) && !getUserService().isUserAdmin()) || !ROLE_USER.equals(role))
throw new AuthorizationException(String.format(
"Authorization check failed for role [%s]", role));
}

@Override
public boolean hasRole(String role) {
if (!isLoggedIn())
return false;
return ((ROLE_ADMIN.equals(role) && getUserService().isUserAdmin()) || ROLE_USER.equals(role));
}

@Override
public String getUsername() {
if (isLoggedIn())
return getUserService().getCurrentUser().getNickname();
return null;
}

public String createLoginURL(String destination) {
return getUserService().createLoginURL(destination);
}

public String createLogoutURL(String destination) {
return getUserService().createLogoutURL(destination);
}

public User getUser() {
if (isLoggedIn())
return getUserService().getCurrentUser();
return null;
}

private UserService getUserService() {
if (userService == null)
userService = UserServiceFactory.getUserService();
return userService;
}

}

Both create... methods can be used in the UI for generating login/logout URLs; destination defines the URL the user gets redirected to after a successful login/logout. Also make sure that the identity configuration is removed from components.xml.
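
A minimal sketch of how this could look on a Facelets page - assuming the component resolves as identity and relying on JBoss EL for the parameterized method call; the target URL is made up:

<!-- show a login link for anonymous users, a logout link otherwise -->
<h:outputLink value="#{identity.createLoginURL('/home.seam')}"
        rendered="#{not identity.loggedIn}">
    <h:outputText value="Login"/>
</h:outputLink>
<h:outputLink value="#{identity.createLogoutURL('/home.seam')}"
        rendered="#{identity.loggedIn}">
    <h:outputText value="Logout"/>
</h:outputLink>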

Wrap up

Running this setup should give you a base for a very simple Seam app like in the screenshot below.



This first step did not involve any persistence work, and my first tries with DataNucleus were not as straightforward as I had expected from a JPA implementation - I hope Google will catch up here with something more mature. Also, even this simple setup required a couple of nasty tweaks to the frameworks. Another big hurdle is the runtime difference between the production and the local environment. For serious work on App Engine, it currently seems more advisable to look into GWT.

Anyway, if you found this useful - I'm looking forward to hearing about your next steps in the cloud.