Monday, January 6, 2014

New Year, New Blog Location

Dear Readers and Blog Followers

First of all, a Happy New Year to all of you! May 2014 bring us flying cars, light sabers and other geeky stuff (and, well, Java 8 maybe). While we wait for all this exciting stuff, we'll start with something less spectacular but nevertheless important - a new blog location. Please redirect your RSS feeds to

the New and Official CTP Java blog (hurray!)



where you will also find other articles about things we do at Cambridge - see Bartosz' article about recovering from mistakes in Git. We will be working hard to make our New Year's resolution (blog more often) come true!

We will leave all our old posts here, as other sites link to them, but new posts will only appear at the new location.

Monday, April 29, 2013

Efficient Structural Analysis of existing projects - part 1


Introduction

As a software engineer you might already have run into the following situation: a legacy application needs to be extended, and most of the initial developers have either left or moved on to new projects. Documentation is sparse and the code base is huge - but you need to quickly understand the structure of the project, or at least the parts you're about to change. How can you possibly achieve that?

Structural analysis to the rescue

By structural analysis we mean the analysis of:
  • project structure
  • project elements
  • element structure
  • element relations
We will give a general overview of these aspects below - the description refers mainly to two object-oriented languages: Java and C#. The details will be presented in a series of blog posts - so stay tuned for more!

Analysis of project structure

A typical project is composed of:
  • modules
    • packages / namespaces
      • elements (classes)
So first it is necessary to understand which modules the project contains and what their roles are. Additionally, one has to understand how elements (classes) are organized within modules - which packages / namespaces exist and which elements they contain.

Analysis of project elements

In Java and C# projects, elements of the following kinds/categories can be found:
  • class
    • interface
      • annotation (Java)
    • attribute (C#) - equivalent to annotation in Java
    • enum
    • throwable / exception 1)
    • array
    • delegate (C#)
  • struct (C#)
1) logical category

Analysis of element structure

Java and C# classes can contain the following members:
  • attribute
    • field
    • constant 1)
  • operation
    • method
    • constructor
    • finalizer / destructor 2)
  • property 1)
    • indexer (C#)
  • operator (C#)
  • event (C#)
  • nested element
1) logical category in Java, language-level category in C#
2) logical category in both languages - should not be used anymore

Analysis of element relations

An element can have any of the following relations (a small Java sketch follows the list):
  • outbound
    • generalization
    • abstractions
    • nestings
    • associations
      • uni-directional
      • bi-directional
      • aggregations
        • compositions
    • dependencies
  • inbound
    • specializations
    • realizations
    • nesting owner
    • association usages
      • aggregation usages
        • composition usages
    • dependency usages
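To make these relation types more concrete, here is a small, contrived Java sketch (all class names are made up for illustration) showing how the most common relations appear in code:

// Generalization: Car extends Vehicle
class Vehicle { }

class Car extends Vehicle {

    // Composition (a strong, owned association): the Engine lives and dies with the Car
    private final Engine engine = new Engine();

    // Uni-directional association: the Car merely references its Driver
    private Driver driver;

    // Dependency: Route is only used as a parameter, not stored as a field
    void drive(Route route) { /* ... */ }

    // Nesting: TrailerHitch is nested inside Car, which is its nesting owner
    static class TrailerHitch { }
}

// Abstraction / realization: Truck realizes the Repairable interface
interface Repairable { void repair(); }

class Truck extends Vehicle implements Repairable {
    public void repair() { }
}

class Engine { }
class Driver { }
class Route { }

Viewed from the other side, Vehicle sees Car and Truck as inbound specializations, Repairable sees Truck as a realization, and Engine, Driver and Route see inbound association and dependency usages.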

Structural analysis of a sample Java project

Now imagine that you start working on an existing project: ScrumToys, an application that supports running projects with Scrum. Below you can see two screenshots of the application.

Figure 1a. ScrumToys application - dashboard showing stories and tasks of sample project sprint.

Figure 1b. ScrumToys application - editing of sample task.


Figure 1a shows stories and tasks. Tasks are grouped into 3 categories, according to their status: TODO, DOING, DONE. Figure 1b shows the task edit dialog.

Your role is to modify the application. You've received the following assignment:

  • add a new task status category: APPROVED
  • remove the inconsistency in the task status name: DOING (Figure 1a) vs. WORKING (Figure 1b) - it should be WORKING on all screens across the system (including those not shown in the screenshots)
  • make the task status changeable - currently it cannot be changed (Figure 1b)

In order to complete your assignment correctly, you have to understand the structure of the application - at least of the parts related to the given tasks. Let's compare how popular software engineering tools can support you in this assignment.



Tools


The following software engineering tools were chosen to help us with the structural analysis (see Figure 2):
  • Eclipse IDE
  • NetBeans IDE
  • IntelliJ IDEA
  • Class Visualizer

1) officially not supported on Eclipse 4

Analysis


Project structure

First, we would like to get a general overview of the project structure:
Figure 2a. Project structure in Eclipse IDE.

Figure 2b. Project structure in NetBeans IDE.

Figure 2c. Project structure in IntelliJ IDEA.

Figure 2d. Project structure in Class Visualizer.

As shown in Figure 2, two out of four tools (IntelliJ IDEA and Class Visualizer) present graphical information about the kind of element (class, interface, enum, annotation), and one of them (Class Visualizer) presents the full list of project elements. All tools except Eclipse IDE show the structure of the element currently selected in the list.
The analysis of element structure and element relations will be discussed in the next parts of this series.

Tuesday, May 22, 2012

CDI Query Alpha 4 Released

It's been a while since my last post on CDI Query, but that doesn't mean we haven't been busy working on it! Last week I pushed another Alpha release to Maven Central, and I'm very curious about your feedback!

Summarizing the feature highlights since the last post:
  • Entities: Support for entities defined in orm.xml descriptors, @IdClass and composite primary keys: this should close the gap for all kinds of entities.
  • Method expressions have gotten some more love: first of all, validation. As those expressions are not very robust when it comes to refactorings, they are now validated at extension initialization. There is also support for ordering and nested properties.
  • Auditing: Keeping track of entity creation and change dates is often a requirement in enterprise applications. This is supported by simply annotating your entity's temporal properties with @CreatedOn or @ModifiedOn (see the sketch after this list).
  • Extensions in the Criteria Support API: it now allows selections. There has also been a major cleanup with regards to separation of API and implementation.
  • Postprocessing of query methods with the QueryResult class. This encapsulates the result of a method expression or a @Query annotation, and allows you to add ordering, paging and other post-processing dynamically.
  • The EntityHome API, inspired by the Seam 2 application framework, helps you easily connect your entities to the UI and implement CRUD pages with a few lines of code.
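As a small illustration of the auditing feature, the sketch below shows how an entity might declare audited timestamps. This is only a sketch based on the description above - the entity and field names are made up, and the imports for the auditing annotations are omitted, so check the project documentation for the exact API:

import java.util.Date;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;
// imports for @CreatedOn / @ModifiedOn omitted - they come from the CDI Query API

@Entity
public class Invoice {

    @Id @GeneratedValue
    private Long id;

    // set once by the extension when the entity is first persisted
    @CreatedOn
    @Temporal(TemporalType.TIMESTAMP)
    private Date created;

    // updated by the extension whenever the entity changes
    @ModifiedOn
    @Temporal(TemporalType.TIMESTAMP)
    private Date modified;

}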
A few people have already jumped on board, providing great feedback, bugfixes and new features! Special thanks to:
  • Jason Porter (bug fixes, Solder upgrade, support for PK related features and feature requests)
  • Marek Smigielski (bug fixes, feature requests)
  • Aaron Walker (API extensions)
  • and various others, providing some food for thought - thanks guys!

So what's next? We're still in Alpha, and some features have - frankly - just been hacked in to try them out, so there is a need for cleanups. Feature-wise, the various APIs need some polish, and larger items could be improved support for stored procedures and a Forge plugin (in case you don't know Forge - hurry up and install it!). Or whatever you feel is missing - we're looking forward to your issues on GitHub!

Tuesday, November 22, 2011

CDI Query Module First Alpha Released!


It's been some months now since we started exploring CDI extensions as a small exercise. As it turned out, the exercise evolved into something usable, which we're now pushing into the open as a first Alpha shot.

So I'm happy to announce the availability of the CDI Query Module on Maven Central! The module helps you create JPA queries with much less boilerplate, and of course leverages the CDI extension API as well as Seam Solder. Some of the feature highlights are:

Query By Name

Assuming you have a Person entity which probably looks similar to this:

@Entity
public class Person { 

    ... // primary key etc. skipped

    @Getter @Setter
    private String firstName;
    
    @Getter @Setter
    private String lastName;

}

You can simply create a DAO interface to query for Persons:

@Dao
public interface PersonDao extends EntityDao<Person, Long> {

    Person findByFirstNameAndLastName(String firstName, String lastName);

}

This interface does not need to be implemented. A client can just inject it and call the interface method, and in the background the JPA query is automatically created and executed. To create the query, the method name is analyzed and matched against entity properties.

public class PersonAction {

    @Inject
    private PersonDao personDao;

    private String firstName;
    private String lastName;
    private Person person;

    public void lookup() {
        person = personDao.findByFirstNameAndLastName(firstName, lastName);
    }

}

Note that the base interface also contains a couple of other methods which you would expect from an entity DAO. Ideally, you should not need to inject an EntityManager anymore.
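For illustration, a rough sketch of the kind of operations such a base interface could declare is shown below - the method names and signatures here are assumptions, not the module's actual API, so check the documentation on GitHub for the real thing:

import java.util.List;

// Hypothetical sketch of a generic DAO base interface - names are assumptions.
public interface EntityDao<E, PK> {

    E save(E entity);        // persist a new or merge an existing entity

    void remove(E entity);   // delete the entity

    E findBy(PK primaryKey); // look up a single entity by primary key

    List<E> findAll();       // fetch all entities of this type

    void flush();            // delegate to the underlying persistence context

}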

Query by Query Strings and Named Queries

Of course, matching property names is not very robust against refactorings (more validation support here is on the roadmap) - if you'd like to have more control over your JPA queries, you can also annotate the method with the query to execute:

@Dao
public interface PersonDao extends EntityDao<Person, Long> {

    @Query("select p from Person p where p.ssn = ?1")
    Person findBySSN(String ssn);

    @Query(named=Person.BY_FULL_NAME)
    Person findByFullName(String firstName, String lastName);

}

Criteria API Simplifications

If you're not a big fan of query strings but rather prefer using the JPA 2 Criteria API, we also provide a small utility API to simplify this:

public abstract class PersonDao extends AbstractEntityDao<Person, Long> {

    public List<Person> findAdultFamilyMembers(String name, Integer minAge) {
        return criteria()
                    .like(Person_.name, "%" + name + "%")
                    .gtOrEq(Person_.age, minAge)
                    .eq(Person_.validated, Boolean.TRUE)
                    .orderDesc(Person_.age)
                    .createQuery()
                    .getResultList();
    }

}

All the code is hosted and documented on GitHub. Please:

  • Give feedback! If you find this useful or actually not so, we're happy to hear what is still missing.
  • Participate! Forking and creating pull requests are really a breeze on GitHub :-)


Credits:

  • Bartek Majsak for improving the initial code, taking care of quality reports and soon finalizing the stuff on the validation branch ;-) (just kidding, check out Bartek's cool work on the Arquillian Persistence Module!)
  • Grails GORM for inspiring this Java implementation
  • The CDI folks for a really great specification
  • Last but not least the Arquillian guys, developing and testing this stuff is pure fun with Arquillian!

Friday, November 11, 2011

JBoss Forge JRebel Plugin - Video Tutorial

It's actually been out for quite a while now (and is therefore already slightly out of date), but I have hardly found time to blog about it: if you're using JRebel and JBoss Forge, have a look at this video tutorial on how to use both of them together.



Make sure to watch it in HD and full screen mode. Thanks to Chris "Kubrik" Reimann for putting the tutorial together!

As an addition, Forge now also supports checking out specific versions of a plugin, which is in our case version 1.0.0.Beta1 of the JRebel plugin. We've also added the plugin to the Forge plugin repository index so you can search for it.

By the way, if you're only using one of the tools or (even worse) none of them, this is the perfect opportunity to get a quick hands-on.

Monday, April 4, 2011

A CDI Extension for Query Generation

While the DAO pattern has gone out of fashion with Java EE 6, it can still be a useful approach to centralize query related logic when you have to do more than just delegate to the entity manager. Often this approach then leads to small frameworks containing reusable code within a project, or sometimes even to a bigger framework spread across entire IT departments within a company.

Inspired by features of Grails or the Spring Data JPA project, my plan was to learn more about CDI extensions by implementing a proof of concept of such a DAO framework based on CDI. CDI is part of Java EE 6 and, already a powerful addition to the platform programming model by itself, provides SPIs to extend it even further. A prominent example is the Seam framework, which in its latest version contains a lot of these extensions. Just drop them into your classpath and they are ready for use. Impressive enough to learn more about the technology.

While I was getting my hands dirty, it seemed to me the result was useful enough to share it here - and of course also to demonstrate the power and ease of creating CDI extensions. In this article I’ll give you a quick start on CDI extensions as well as (hopefully) an idea of how a portable framework based on this technology might look. All code presented here is available on GitHub.

The DAO Framework

Some common ingredients of a DAO framework are captured in the code snippet below:

@Dao
public interface SimpleDao extends EntityDao<Simple, Long> {

    Simple findByNameAndEnabled(String name, Boolean enabled);

    @Query(named=Simple.BY_NAME)
    List<Simple> findByNamedQuery(String name);

    @Query(named=Simple.BY_NAME)
    List<Simple> findByNamedQueryRestricted(String name,
            @MaxResults int max, @FirstResult int first);

    @Query(named=Simple.BY_ID)
    Simple findByNamedQueryNamedParams(
            @QueryParam("id") Long id,
            @QueryParam("enabled") Boolean enabled);

    @Query("select s from Simple s where s.name = ?1")
    Simple findByQueryString(String name);

}

Typically a DAO framework has a common interface that concrete DAOs extend. Using generics here allows a standard set of methods, such as saving an entity or retrieving all entities of a specific type, and the type information can also be used during query generation. Usually this is nothing that cannot be done easily with an entity manager. But once you have injected a DAO into your service class - do you really want to inject the entity manager as well? To keep the code leaner, a DAO base interface should provide these kinds of methods and of course implement them all automagically - nothing you would like to rewrite again and again.

Some other features are shown in the method declarations above. Automatic query generation from method names and parameters, as GORM method expressions do, or creating queries based on annotation metadata and parameters, will often allow you to leave the easy cases to the query generator and keep the code lean, so you can focus on the complex ones.

The CDI Approach

One way to implement such a framework is via a CDI extension. CDI allows extensions to listen to various lifecycle events:
  • before CDI starts discovering beans;
  • while it processes annotated types, injection targets, producers, beans and observers;
  • when it finishes with both discovery and validation.

As in our case we are dealing with plain interfaces, the easiest approach is to simply annotate the interface and then listen for annotation processing events. The sample above shows a Dao annotation on the interface, but this would be placed on the extended AbstractEntityDao interface so developers won’t have to worry about it.
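For orientation, a marker annotation like the sketch below might be all that is needed on the API side - this is a guess at the definition, the project's actual annotation may carry additional metadata:

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical definition of the marker annotation the extension looks for.
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface Dao {
}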

So in the extension we listen for annotated types, check whether our Dao annotation is present and register a proxy bean which implements the annotated type. Registering the extension is a matter of two things:
1. Implement the extension class and listen for the Dao annotation.

public class QueryExtension implements Extension {

    <X> void processAnnotatedType(@Observes ProcessAnnotatedType<X> event, BeanManager beanManager) {
        // all the required information on the type is found in the event
        // the bean manager is used to register the proxy
    }

}

2. Register the extension class as a service provider in the appropriate file (META-INF/services/javax.enterprise.inject.spi.Extension).
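For reference, that file contains nothing but the fully qualified name of the extension class on a single line - the package shown here is made up for this example:

META-INF/services/javax.enterprise.inject.spi.Extension:

com.ctp.cdiquery.QueryExtension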

Registering the proxy with the bean manager is slightly more work, but luckily someone has already done that. If you work with CDI extensions, make sure to include Seam Solder - the Swiss army knife for CDI developers. Solder has built-in support for so-called service handlers, where you annotate an abstract type with a reference to the handler class. The use case in the documentation probably looks kind of familiar ;-) All our extension has to do is override the handler lookup - and we're done with registering the proxy! The reason we don't use the ServiceHandlerExtension directly is to separate the handler reference from the annotation, and to have the chance to e.g. validate and process further metadata in the extension class.

public class QueryExtension extends ServiceHandlerExtension {

    @Override
    protected <X> Class<?> getHandlerClass(ProcessAnnotatedType<X> event) {
        if (event.getAnnotatedType().isAnnotationPresent(Dao.class)) {
            return QueryHandler.class;
        }
        return null;
    }

}

public class QueryHandler {

    @Inject
    private Instance<EntityManager> entityManager;

    @AroundInvoke
    public Object handle(InvocationContext ctx) throws Exception {
        ...
    }

}


The handler class simply has to provide a public method annotated with @AroundInvoke, where you get all the required information to build up your query dynamically. Note that in the handler class you will also be able to use CDI services like injection.
As a framework user, all you have to do is drop the JAR into the classpath and annotate the interface. Look mom, no XML! Well, almost - you still need an (empty) beans.xml somewhere to activate all the CDI magic.
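To make the handler's job more tangible, here is a minimal, hypothetical sketch of what handle() could do for the method-name case. It is not the extension's real implementation: the entity name is hard-coded, a single result is assumed, and the name parsing is deliberately naive:

import javax.enterprise.inject.Instance;
import javax.inject.Inject;
import javax.interceptor.AroundInvoke;
import javax.interceptor.InvocationContext;
import javax.persistence.EntityManager;
import javax.persistence.Query;

public class QueryHandler {

    @Inject
    private Instance<EntityManager> entityManager;

    @AroundInvoke
    public Object handle(InvocationContext ctx) throws Exception {
        // Illustrative only: turn findByNameAndEnabled into "name", "enabled"
        String[] props = ctx.getMethod().getName()
                .replaceFirst("findBy", "").split("And");

        // Build a JPQL string; the entity name "Simple" is hard-coded here,
        // while the real extension derives it from the DAO's generic type.
        StringBuilder jpql = new StringBuilder("select e from Simple e where ");
        for (int i = 0; i < props.length; i++) {
            if (i > 0) {
                jpql.append(" and ");
            }
            jpql.append("e.")
                .append(Character.toLowerCase(props[i].charAt(0)))
                .append(props[i].substring(1))
                .append(" = ?").append(i + 1);
        }

        // Bind the method arguments as positional parameters and execute
        Query query = entityManager.get().createQuery(jpql.toString());
        Object[] args = ctx.getParameters();
        for (int i = 0; i < args.length; i++) {
            query.setParameter(i + 1, args[i]);
        }
        return query.getSingleResult();
    }

}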

Testing the Extension

Arquillian is a relatively new testing framework which allows you to create dynamic deployment units and run them in a container. A great fit for an extension, as you can easily test it live in a unit test without leaving the IDE! This looks like the following:

@RunWith(Arquillian.class)
public class QueryHandlerTest {

    @Deployment
    public static Archive<?> deployment() {
        return ShrinkWrap.create(WebArchive.class, "test.war")
                .addClasses(QueryExtension.class)
                .addAsWebInfResource("test-persistence.xml", ArchivePaths.create("classes/META-INF/persistence.xml"))
                .addAsWebInfResource(EmptyAsset.INSTANCE, ArchivePaths.create("beans.xml"))
                .addAsWebInfResource("glassfish-resources.xml")
                .addClasses(SimpleDao.class)
                .addPackage(Simple.class.getPackage());
    }

    @Inject
    private SimpleDao dao;

    @Produces
    @PersistenceContext
    private EntityManager entityManager;

    @Test
    public void shouldCreateQueryByMethodName() {
        // given
        final String name = "testCreateQueryByMethodName";
        createSimple(name);

        // when
        Simple result = dao.findByNameAndEnabled(name, Boolean.TRUE);

        // then
        Assert.assertNotNull(result);
        Assert.assertEquals(name, result.getName());
    }

}

This creates a stripped down web archive deployment, in this sample for an embedded GlassFish. The test class itself is registered as a bean in the test and can therefore use injections (CDI or container injections, see the @PersistenceContext). Resources like persistence units can be added on demand as we see in the deployment method (persistence.xml for the persistence unit, glassfish-resources.xml contains a data source definition).

The container gets started and we can see our injected proxy in action. In the sample above we then create some test data and see if our generated query fetches the right data back.

Conclusion

Of course this article is a very simplified version of the whole setup - especially the Arquillian setup required some trial and error as the framework is still in Alpha (if anybody finds out how to create the data source without deploying the glassfish-resources.xml into a web archive - let me know).

Check out the project source on GitHub to get started. The module structure might look a little complicated, but it follows the common Seam module structure, separating the API (in our case the client annotations) from the implementation (the extension code). Documentation is still "basic", but looking at the unit tests should give an indication of the usage.

Once the project setup is done, things get extremely productive though. Seam Solder provides a rich tool set you can use to create an extension, and Arquillian lets you immediately test your code inside a container. The result is an easy-to-reuse and easy-to-distribute framework you will probably come back to in many of your following Java EE 6 projects.

Friday, March 18, 2011

Introduction to OSGi

As my first blog entry here I want to introduce you to the OSGi framework. It allows you to build Java applications with high modularity, and I believe this technology will become more and more important in the future. The goal of this introduction is to give the reader a quick overview of the OSGi framework and an understanding of its benefits. It also serves as an introduction to my planned series on Eclipse plug-in development.

1 What is OSGi?

The OSGi Alliance (formerly known as the Open Services Gateway initiative, a now obsolete name), founded in March 1999, originally specified and still maintains the OSGi specification. This specification allows you to write so-called bundles: self-contained modules that can provide functionality to other bundles. There are several implementations of this specification; the most common ones are:
  • Eclipse Equinox
  • Apache Felix
  • Knopflerfish

1.1 Why OSGi?

As mentioned before, the OSGi specification aims at creating Java applications with extremely high modularity. Of course, this is what every developer aims for, but OSGi encourages and supports you in doing so. Another great feature is that you can install or uninstall bundles at runtime without restarting the application. OSGi also supports lazy loading: a bundle can be configured to be loaded only when it is called for the first time, which can improve the startup performance of an application. The OSGi specification also defines some useful services that have to be provided by an OSGi implementation.
What does that mean for CTP? As high modularity is becoming more and more important, clients will need to extend their applications. In the sM-Client project, for example, the client wants to make small extensions like custom forms without packaging a whole new release - which is quite understandable. Currently a plug-in style mechanism is being evaluated, but there may be problems, especially with classloading, which are well solved in OSGi.

2 Architecture

The OSGi framework is based on several Layers:

Security layer: The OSGi Security Layer is an optional layer that underlies the OSGi Service Platform. The layer is based on the Java 2 security architecture. It provides the infrastructure to deploy and manage applications that must run in fine-grained controlled environments.

Module layer: The Module Layer is responsible for the packaging of the modules ("bundles"), which are simple JAR files that follow some requirements. I will explain more about bundles in section 2.1.

Life cycle layer: This layer gives you the possibility to install, uninstall, start and stop bundles inside an application framework without restarting the whole framework (e.g. if you contribute a bug fix inside a bundle). This also increases the availability of your application framework.

Service layer: The Service Layer enables you to provide OSGi services to other bundles. An OSGi service is nothing but a POJO that is registered in the Service Registry and can be referenced from anywhere outside the declaring bundle. The service has to implement an agreed interface.


2.1 Bundles

As the description of the Module Layer says, a bundle is a plain old JAR file, but it has some special characteristics. There has to be a MANIFEST.MF file in the META-INF directory, which is located at the root of the JAR file (see the JAR specification). This file provides meta information about the bundle, like dependencies, exported and imported packages, etc. Note that, in contrast to a normal JAR file, all packages are hidden from other bundles by default. So you have to define which packages are exported and are therefore accessible from other bundles. Additionally, you have to specify which packages you want to import from other bundles. You can also define required bundles, which imports all exported packages of the specified bundles. Let’s have a look at an example MANIFEST.MF file:

Bundle-Name: Hello World
Bundle-SymbolicName: com.ctp.helloworld
Bundle-Description: A Hello World bundle
Bundle-ManifestVersion: 2
Bundle-Version: 1.0.0
Bundle-Activator: com.ctp.Activator
Export-Package: com.ctp.helloworld;version="1.0.0"
Import-Package: org.osgi.framework;version="1.3.0"
Require-Bundle: com.ctp.other;bundle-version="2.5.0"
Bundle-RequiredExecutionEnvironment: JavaSE-1.6

These configuration elements are explained as follows:
  • Bundle-Name: Defines a human-readable name for this bundle; it simply assigns a short name to the bundle.
  • Bundle-SymbolicName: The only required header, this entry specifies a unique identifier for the bundle, based on the reverse domain name convention (also used by Java packages).
  • Bundle-Description: A description of the bundle's functionality.
  • Bundle-ManifestVersion: This little-known header indicates which OSGi specification to use for reading this bundle.
  • Bundle-Version: Designates a version number to the bundle.
  • Bundle-Activator: Indicates the class name to be invoked once a bundle is activated.
  • Export-Package: Expresses what Java packages contained in a bundle will be made available to the outside world.
  • Import-Package: Indicates what Java packages will be required from the outside world, in order to fulfill the dependencies needed in a bundle.
  • Require-Bundle: Indicates what OSGi bundles will be required from the application framework. This will import all exported packages from the specified bundle(s).
  • Bundle-RequiredExecutionEnvironment: Specifies the minimum execution environment(s) required to run this bundle.

2.1.1 Bundle life cycles

A bundle’s life cycle can be in the following states (managed by OSGi’s Life Cycle Layer); a small sketch for inspecting these states programmatically follows the list:
  • INSTALLED: The bundle has been successfully installed.
  • RESOLVED: All Java classes that the bundle needs are available. This state indicates that the bundle is either ready to be started or has stopped.
  • STARTING: The bundle is being started, the BundleActivator.start method will be called, and this method has not yet returned. When the bundle has an activation policy, the bundle will remain in the STARTING state until it is activated according to its activation policy.
  • ACTIVE: The bundle has been successfully activated and is running; its BundleActivator.start method has been called and returned.
  • STOPPING: The bundle is being stopped. The BundleActivator.stop method has been called but has not yet returned.
  • UNINSTALLED: The bundle has been uninstalled. It cannot move into another state.
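The following is a minimal sketch of how these states can be inspected programmatically, assuming you have a BundleContext at hand (for instance from the Activator shown later); the class name is made up:

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;

public final class BundleStates {

    // Returns a readable name for the current state of a bundle.
    public static String stateOf(Bundle bundle) {
        switch (bundle.getState()) {
            case Bundle.INSTALLED:   return "INSTALLED";
            case Bundle.RESOLVED:    return "RESOLVED";
            case Bundle.STARTING:    return "STARTING";
            case Bundle.ACTIVE:      return "ACTIVE";
            case Bundle.STOPPING:    return "STOPPING";
            case Bundle.UNINSTALLED: return "UNINSTALLED";
            default:                 return "UNKNOWN";
        }
    }

    // Prints the state of every bundle known to the framework.
    public static void printStates(BundleContext context) {
        for (Bundle bundle : context.getBundles()) {
            System.out.println(bundle.getSymbolicName() + ": " + stateOf(bundle));
        }
    }
}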

2.2 OSGi Services

OSGi Services are Java objects that implement agreed interfaces. This means that the service itself is defined by a Java interface, and the bundles which use the service don’t have to know where the specific implementation actually lives. The following is a simple OSGi service:

The service interface:
public interface HelloWorld {
    public String getMessage();
}

The service implementation:

package com.ctp.helloworld.service.impl;

import com.ctp.helloworld.service.HelloWorld;

public class HelloWorldImpl implements HelloWorld {

    @Override
    public String getMessage() {
        return "Hello World";
    }
}
Now, without any tools like Spring Dynamic Modules, iPOJO or Declarative Services, we have to register our service manually so that it can be accessed by other bundles. I will explain this in the following section.
Please note that the -Impl naming convention does not make much sense in OSGi, since it is possible to have multiple implementations of one service; it was just the easiest choice for this example.

3 Example

Let’s write a simple bundle that contains the HelloWorld service we created above. First, download an OSGi framework (I’m using Equinox 3.6, but this example should work on all implementations since it’s very basic). This example is designed so that you can get going without any special IDE. The source code can be downloaded here.

Copy the downloaded jar file to a location where you would like to start your application (further called $APP_HOME).

Now create a work folder, where your source files are (further called $SRC_HOME).
Create the following folders and files:
$SRC_HOME/com/ctp/helloworld/service/HelloWorld.java
$SRC_HOME/com/ctp/helloworld/service/impl/HelloWorldImpl.java
$SRC_HOME/com/ctp/helloworld/Activator.java
$SRC_HOME/META-INF/MANIFEST.MF

The MANIFEST.MF file is very basic and should have the following contents:

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Hello World
Bundle-SymbolicName: com.ctp.helloworld
Bundle-Version: 1.0.0
Bundle-Activator: com.ctp.helloworld.Activator
Bundle-Vendor: CTP
Bundle-RequiredExecutionEnvironment: JavaSE-1.6
Import-Package: org.osgi.framework;version="1.3.0"
Bundle-ActivationPolicy: lazy


Make sure you have a carriage return and/or newline character at the end of the last line as this is required (see manifest specification in the JAR specification).

Let’s create the OSGi Service as in the example in section 2.2:

$SRC_HOME/com/ctp/helloworld/service/HelloWorld.java:

package com.ctp.helloworld.service;

public interface HelloWorld {
    public String getMessage();
}

$SRC_HOME/com/ctp/helloworld/service/impl/HelloWorldImpl.java:
package com.ctp.helloworld.service.impl;

import com.ctp.helloworld.service.HelloWorld;

public class HelloWorldImpl implements HelloWorld {

    @Override
    public String getMessage() {
        return "Hello World";
    }
}


The defined activator ($SRC_HOME/com/ctp/helloworld/Activator.java) should have the following contents so that it is able to run:
package com.ctp.helloworld;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {

    private static BundleContext context;

    static BundleContext getContext() {
        return context;
    }

    public void start(BundleContext bundleContext) throws Exception {
        Activator.context = bundleContext;
    }

    public void stop(BundleContext bundleContext) throws Exception {
        Activator.context = null;
    }

}

At this point, the OSGi bundle should be ready to be compiled and deployed. But first, we want to register our service. We do this in the method com.ctp.helloworld.Activator#start:
import org.osgi.framework.ServiceReference;
import com.ctp.helloworld.service.HelloWorld;
import com.ctp.helloworld.service.impl.HelloWorldImpl;
...
public void start(BundleContext bundleContext) throws Exception {
    System.out.println("Registering HelloWorld service...");

    // this will register the service in the Service Registry
    bundleContext.registerService(HelloWorld.class.getName(), new HelloWorldImpl(), null);

    // Test the availability of the service
    ServiceReference ref = bundleContext.getServiceReference(HelloWorld.class.getName());

    if (ref == null) {
        System.out.println("Service is not registered...");
    } else {
        // it's also possible to get an array of services...
        HelloWorld service = (HelloWorld) bundleContext.getService(ref);
        System.out.println("HelloWorld#getMessage(): " + service.getMessage());
    }
    Activator.context = bundleContext;
}
...

Now we can compile the code and create a JAR file:
cd $SRC_HOME;
javac -cp $APP_HOME/org.eclipse.osgi_3.6.0.v20100517.jar com/ctp/helloworld/service/HelloWorld.java com/ctp/helloworld/service/impl/HelloWorldImpl.java com/ctp/helloworld/Activator.java;
jar cvfm example.jar META-INF/MANIFEST.MF com;
cp example.jar $APP_HOME/example.jar;

If your bundle was built correctly, you can start your test application:
cd $APP_HOME;
java -jar org.eclipse.osgi_3.6.0.v20100517.jar -console;

Now you can administrate the OSGi platform with the console. Test the bundle like the following:
osgi>install file:///$APP_HOME/example.jar
Bundle id is 4

osgi>start 4
Registering HelloWorld service...
HelloWorld#getMessage(): Hello World

osgi>

NOTE: If you want to use the service inside another bundle, you have to add the package com.ctp.helloworld.service to the Export-Package directive in the MANIFEST.MF file.
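For example, the following line added to the example's MANIFEST.MF would export the service package (a version attribute can optionally be appended, as shown in section 2.1):

Export-Package: com.ctp.helloworld.service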

4 Multiple Implementations

This example shows you how to use multiple implementations of a service. It’s a solution which I personally find quite elegant. I won’t go through all the steps that we already did in the previous example. The source code can be downloaded here. Let’s create a bundle with the symbolic name com.ctp.playground.multiple and the following classes:
com.ctp.playground.multiple.Activator
com.ctp.playground.multiple.math.OperationService
com.ctp.playground.multiple.math.OperationServiceFactory
com.ctp.playground.multiple.math.internal.Addition
com.ctp.playground.multiple.math.internal.Substraction

Also create the following Enum:
com.ctp.playground.multiple.math.Operation

The bundle activator should have the following contents (don’t forget the imports):
public class Activator implements BundleActivator {

    private static BundleContext context;

    public static BundleContext getContext() {
        return context;
    }

    public void start(BundleContext bundleContext) throws Exception {
        Activator.context = bundleContext;
    }

    public void stop(BundleContext bundleContext) throws Exception {
        Activator.context = null;
    }

}

The interface com.ctp.playground.multiple.math.OperationService has one method:
public Double doOperation(Double a, Double b);

The classes Addition and Substraction implement this interface and are our actual services:
@Override
public Double doOperation(Double a, Double b) {
    return a + b;
}

and:

@Override
public Double doOperation(Double a, Double b) {
    return a - b;
}
The Enum Operation looks like the following (don’t forget the imports):
public enum Operation {
    ADDITION {
        @Override
        public Class<?> getServiceClass() {
            return Addition.class;
        }
    },
    SUBSTRACTION {
        @Override
        public Class<?> getServiceClass() {
            return Substraction.class;
        }
    };

    public abstract Class<?> getServiceClass();
}

And now comes the part where we get our service implementations and decide which one to use. For that we created the class OperationServiceFactory. Now extend it with the following method:
public static OperationService getService(Operation o) {
    BundleContext bundleContext = Activator.getContext();
    ServiceReference[] refs = null;
    try {
        // get the service references
        refs = bundleContext.getServiceReferences(OperationService.class.getName(), null);

    } catch (InvalidSyntaxException e) {
        // this should never happen, because the second argument
        // of bundleContext.getServiceReferences() is null.
        // This parameter gives you the possibility to pass filter criteria.
    }
    OperationService service = null;
    if (refs != null) {
        for (int i = 0; i < refs.length; i++) {
            service = (OperationService) bundleContext.getService(refs[i]);
            // check if the service is the wanted one
            if (service.getClass().equals(o.getServiceClass())) {
                return service;
            }
        }
    }
    return null;
}

All we have to do now is to register the services in the Activator class and test them. Extend the com.ctp.playground.multiple.Activator:
...
public void start(BundleContext bundleContext) throws Exception {

    Activator.context = bundleContext;

    // register services
    bundleContext.registerService(OperationService.class.getName(), new Addition(), null);
    bundleContext.registerService(OperationService.class.getName(), new Substraction(), null);

    // test the services
    OperationService operation = OperationServiceFactory.getService(Operation.ADDITION);
    Double result = operation.doOperation(10.0, 20.0);
    System.out.println(result);

    operation = OperationServiceFactory.getService(Operation.SUBSTRACTION);
    result = operation.doOperation(10.0, 20.0);
    System.out.println(result);
}
...

The MANIFEST.MF file should have the following contents:
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Multiple Service implementations Example
Bundle-SymbolicName: com.ctp.playground.multiple
Bundle-Version: 1.0.0
Bundle-Activator: com.ctp.playground.multiple.Activator
Bundle-Vendor: CTP
Bundle-RequiredExecutionEnvironment: JavaSE-1.6
Import-Package: org.osgi.framework;version="1.3.0"
Bundle-ActivationPolicy: lazy
Export-Package: com.ctp.playground.multiple.math

Now build the bundle (compile and make jar file), put it in the $APP_HOME directory, run the platform and install / start it like we did in the previous example.
You should have the following output after starting the bundle:

osgi> 30.0
-10.0

5 Conclusion

Let’s summarize and see whether we reached the goals of this introduction:
  • OSGi is a framework specification for creating highly modular Java applications
  • A bundle is a simple jar file, defined through manifest headers
  • An OSGi Service is a simple POJO that needs to be registered in the OSGi Service Registry
  • You can create a simple bundle
  • You are able to create a simple OSGi Service
  • You can manage multiple service implementations


As I mentioned at the top of this post, this is the first part of my series about OSGi and Eclipse plug-in development. I plan to publish the posts in the following order (there may be changes):
  1. Introduction to OSGi
  2. OSGi in a declarative way
    How to use additional frameworks like Spring DM or Declarative Services
  3. OSGi and Eclipse
    The way how Eclipse uses OSGi and introduction to Eclipse plug-in development
  4. Eclipse UI
    How the Eclipse UI works and how to use it
  5. Eclipse SDK 4.0 - Model driven development
    All about the new Eclipse SDK and its model driven approach
  6. Eclipse SDK 4.0 - New features
    New features of the SDK like XWT and usage of dependency injection

I hope you enjoyed this little introduction to OSGi and I’m looking forward to reading your comments, so I can improve my planned follow-up posts.