All posts by steinarbang

Pluggable databases for apache karaf applications

When creating databases for my apache karaf based web applications, I want the following things:

  1. A database-independent schema creation using liquibase (i.e. supporting multiple relational database systems)
  2. Pluggable databases (each web application typically has an in-memory derby database for testing and evaluation and a PostgreSQL database for production use)
  3. The ability to run multiple applications in the same apache karaf instance without the application databases colliding
  4. The possibility to configure the JDBC connection information using the karaf configuration

The way I accomplish this, is:

  1. Create an application specific OSGi service for connecting to the database
  2. Create declarative services (DS) components implementing the service for each database system I wish to support
  3. Create a bundle implementing the calls to liquibase (and containing the liquibase scripts as resources) and use this bundle from all DS components implementing the database service
  4. Use high level karaf features to compose my webapps, with variants for each supported database system

OSGi 101: when I talk about an “OSGi service” I mean a service registered under a Java interface. Java interfaces form the basis of the OSGi plug-in model. When an OSGi bundle (a “bundle” is a JAR file with some OSGi-specific headers added to its MANIFEST.MF) is loaded, it can have dependencies on services provided by other bundles.  When those dependencies are satisfied, the bundle can go active and provide its own services to other bundles that may be listening.

I have created a Java interface called DatabaseService that defines the operations I need to do against a JDBC database, and they aren’t many:

  • getDatasource() that will return a DataSource connected to a relational database with the expected schema (Note: if connecting to the database fails, the DS component providing the service never goes active, so the only way code ever gets to call this method is if the connection was successful. This simplifies the code using the method)
  • getConnection() that returns a Connection to the database. This is a convenience method, because what the implementations actually do is call DataSource.getConnection().  It makes sense to have this method in the interface, because:
    1. Fetching the connection is the only thing done with the DataSource, so it makes all usages a little bit simpler and shorter
    2. The DS components providing the service have all ended up implementing this method anyway, for the calls to liquibase

The application specific DatabaseService definitions are mostly without methods of their own.
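
A hedged sketch of what such a definition can look like (the interface and method names are assumptions for illustration, not the actual source):

```java
import java.sql.Connection;
import java.sql.SQLException;
import javax.sql.DataSource;

// The generic database service described above.
interface DatabaseService {
    DataSource getDatasource();
    Connection getConnection() throws SQLException;
}

// An application specific service: usually no methods of its own, it only
// gives the application a distinct service type to depend on and inject.
interface UkelonnDatabase extends DatabaseService {
}
```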

However, I’ve had one case where I needed very different syntax for the different database systems supported, and then I added methods returning database specific versions of the problematic SQL queries to the application specific database service definition.

For the curious: the SQL with different syntax had to do with aggregation over months and years, where derby and PostgreSQL need different date extraction functions.
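
A hypothetical illustration (not the original queries) of how month/year aggregation can differ: Derby offers YEAR() and MONTH() functions, while PostgreSQL typically uses extract(). Table and column names here are made up:

```java
// Hypothetical table and column names, for illustration only.
class AggregationSql {
    static String derbySql() {
        return "select YEAR(transaction_time) as y, MONTH(transaction_time) as m, sum(transaction_amount) "
            + "from transactions group by YEAR(transaction_time), MONTH(transaction_time)";
    }

    static String postgresqlSql() {
        return "select extract(year from transaction_time) as y, extract(month from transaction_time) as m, sum(transaction_amount) "
            + "from transactions group by y, m";
    }
}
```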

The next step is to create a bundle holding the liquibase schema definition of the database.  This bundle neither expects OSGi service injection nor exposes any services.  It’s an OSGi library bundle, which is like using a regular jar, except that the exports and imports of packages are controlled by OSGi and the OSGi headers in the bundle’s MANIFEST.MF.


The liquibase bundles contain a single class and liquibase scripts as resources. In OSGi the resources are bundle local, so other bundles can’t load them directly. But they can instantiate a class from the liquibase bundle, and that class can in turn load the resources that reside in the same bundle as the class.

The classes in the liquibase bundles typically contain a method for initial schema setup and a method for schema modifications.  In addition to the scripts in the liquibase bundle, the database plugins contain their own liquibase scripts (the liquibase scripts of the in-memory derby test database plugins contain sets of hopefully realistic test data, and the liquibase scripts of the PostgreSQL plugins contain initial data, such as e.g. a single initial user).
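
A sketch of what such a class can look like, assuming the liquibase 3.x Java API (the class name and changelog paths are made up for the example):

```java
import java.sql.Connection;
import liquibase.Liquibase;
import liquibase.database.DatabaseFactory;
import liquibase.database.jvm.JdbcConnection;
import liquibase.resource.ClassLoaderResourceAccessor;

public class UkelonnLiquibase {
    public void createInitialSchema(Connection connection) throws Exception {
        applyChangelog(connection, "db-changelog/db-changelog-initial.xml");
    }

    public void updateSchema(Connection connection) throws Exception {
        applyChangelog(connection, "db-changelog/db-changelog-updates.xml");
    }

    private void applyChangelog(Connection connection, String changelog) throws Exception {
        // The ClassLoaderResourceAccessor uses this class's classloader, which is
        // what makes the bundle-local changelog resources loadable from other bundles.
        Liquibase liquibase = new Liquibase(
            changelog,
            new ClassLoaderResourceAccessor(getClass().getClassLoader()),
            DatabaseFactory.getInstance().findCorrectDatabaseImplementation(new JdbcConnection(connection)));
        liquibase.update("");
    }
}
```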

The declarative services (DS) components of the database plugins expect an injection of the DataSourceFactory OSGi service. The DataSourceFactory interface isn’t part of JDBC (like DataSource itself is), but is part of the OSGi service platform.

To ensure that the correct DataSourceFactory is injected, it is possible to filter on the osgi.jdbc.driver.name service property (service properties are a Map<String, String> associated with an OSGi service).

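In a DS component this filtering can be done with a target filter on the @Reference annotation. A sketch (the exact osgi.jdbc.driver.name values depend on how the driver bundles register themselves, so treat them as assumptions):

```java
// In the derby plugin:
@Reference(target = "(osgi.jdbc.driver.name=derby)")
public void setDataSourceFactory(DataSourceFactory factory) {
    this.dataSourceFactory = factory;
}

// In the PostgreSQL plugin (the value is whatever the driver bundle registered):
@Reference(target = "(osgi.jdbc.driver.name=PostgreSQL JDBC Driver)")
public void setDataSourceFactory(DataSourceFactory factory) {
    this.dataSourceFactory = factory;
}
```
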
Once all @Reference service injections are satisfied, the method annotated with @Activate is called. This method needs to create a successful connection to the database and run liquibase scripts. A method for a derby memory database can look like this (the full source of the UkelonnDatabaseProvider class is here):


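A hedged sketch of such a method for a derby in-memory database (field names and the changelog helper class are assumptions based on the description, not the actual UkelonnDatabaseProvider source):

```java
@Activate
public void activate() {
    try {
        Properties properties = new Properties();
        properties.setProperty(DataSourceFactory.JDBC_URL, "jdbc:derby:memory:ukelonn;create=true");
        datasource = dataSourceFactory.createDataSource(properties);
        try (Connection connection = datasource.getConnection()) {
            // Schema first, then test data, then schema modifications
            UkelonnLiquibase liquibase = new UkelonnLiquibase();
            liquibase.createInitialSchema(connection);
            liquibase.updateSchema(connection);
        }
    } catch (Exception e) {
        // If activation throws, the component never goes active and the
        // service is never offered to consumers.
        throw new RuntimeException("Failed to activate derby test database", e);
    }
}
```
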
What the method does is first create a DataSource object (stored in a field and accessible through the DatabaseService interface); it then creates a connection in a try-with-resources and runs the liquibase scripts using that connection before the try-with-resources releases the connection. The scripts are typically an initial schema, followed by initial data for the database, followed by modifications to the schema.

It is possible to create configuration for a DS component using the karaf console command line, and creating JDBC connection info for a PostgreSQL database can be done with the following commands:

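For example (the configuration PID and property names here are assumptions; use whatever your component actually expects):

```
config:edit no.priv.bang.ukelonn.db.postgresql.PGUkelonnDatabaseProvider
config:property-set jdbcUrl "jdbc:postgresql:///ukelonn"
config:property-set jdbcUser karaf
config:property-set jdbcPassword karaf
config:update
```
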
The resulting configuration will be injected into the @Activate method, e.g. like so (the full source of the PGUkelonnDatabaseProvider class is here):

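Roughly like this sketch (assumed shape, not the actual PGUkelonnDatabaseProvider source):

```java
@Activate
public void activate(Map<String, Object> config) {
    datasource = createDatasource(config);
}
```
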
The injected config is sent into the method creating the DataSource instance.

Important: the DataSource instance is cached and used to represent the connected database.  The Connection instances are not cached; they are created on demand in try-with-resources blocks, to ensure that they are closed and all transactions are completed.

The method creating the DataSource instance passes the config on to a method that picks the JDBC connection info out of the injected config and puts it into a Properties instance that can be used to connect to a JDBC database.
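
A minimal sketch of that method, assuming the config keys from the console example above (the "url"/"user"/"password" property names correspond to the DataSourceFactory JDBC_URL/JDBC_USER/JDBC_PASSWORD constants):

```java
import java.util.Map;
import java.util.Properties;

class DatabaseConnectionProperties {
    // Hypothetical config keys and defaults; the real keys are whatever
    // was set from the karaf console.
    static Properties createDatabaseConnectionProperties(Map<String, Object> config) {
        Properties properties = new Properties();
        properties.setProperty("url", (String) config.getOrDefault("jdbcUrl", "jdbc:postgresql:///ukelonn"));
        properties.setProperty("user", (String) config.getOrDefault("jdbcUser", "karaf"));
        properties.setProperty("password", (String) config.getOrDefault("jdbcPassword", "karaf"));
        return properties;
    }
}
```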

Using the pluggable database is done by having the DS component implementing the application’s business logic have the application specific database service injected, e.g. like so (full UkelonnServiceProvider source here):

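Roughly like this sketch (not the actual UkelonnServiceProvider source; names are assumptions):

```java
@Component(service = UkelonnService.class, immediate = true)
public class UkelonnServiceProvider implements UkelonnService {
    private UkelonnDatabase database;

    @Reference
    public void setDatabase(UkelonnDatabase database) {
        this.database = database;
    }

    // Business logic methods use database.getConnection()
}
```
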
Using the injected service means calling the getConnection() method in a try-with-resources, doing the SQL operations, and then letting the end of the try-with-resources release the connection.

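A sketch of such a usage, with a made-up table and query:

```java
// The connection is closed (and any transaction completed) when the
// try-with-resources exits.
int getAccountBalance(String username) throws SQLException {
    try (Connection connection = database.getConnection();
         PreparedStatement statement = connection.prepareStatement("select balance from accounts where username=?")) {
        statement.setString(1, username);
        try (ResultSet results = statement.executeQuery()) {
            return results.next() ? results.getInt("balance") : 0;
        }
    }
}
```
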
It probably helps this approach to pluggable databases that I’m using JDBC prepared statements directly instead of an ORM like e.g. hibernate (but that’s a story/rant for another day…).

Picking what database to use is done using karaf features.

The karaf-maven-plugin is used on all maven modules that create an OSGi bundle, to create a karaf feature repository file and attach it to the OSGi bundle artifact.

In addition I create a handwritten feature repository file where I include all of the feature files of the OSGi bundle artifacts using their maven coordinates, and then, in the same file, create high level features that start the application with either database.

Examples of hand-written feature repositories:

  1. aggregate feature repository for ukelonn (maven URL mvn:no.priv.bang.ukelonn/karaf/LATEST/xml/features )
  2. aggregate feature repository for authservice (maven URL mvn:no.priv.bang.authservice/authservice/1.6.0/xml/features )

Using the authservice as an example:

  1. First define a feature that loads the features required for the application, except for the database
  2. Then define a feature that combines that feature with the feature created when building the OSGi bundle for the derby test database
  3. Finally define a feature that combines the first feature with the feature created when building the OSGi bundle for the PostgreSQL database
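
A sketch of what such a hand-written feature repository can look like (the feature names here are illustrative, not the actual authservice feature names):

```xml
<features name="authservice" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
    <repository>mvn:no.priv.bang.authservice/authservice.web.security/1.6.0/xml/features</repository>

    <!-- 1. the application without a database -->
    <feature name="authservice-no-database">
        <feature>authservice-web-security</feature>
    </feature>

    <!-- 2. the application with the derby test database -->
    <feature name="authservice-with-derby">
        <feature>authservice-no-database</feature>
        <feature>authservice-db-derby-test</feature>
    </feature>

    <!-- 3. the application with the PostgreSQL database -->
    <feature name="authservice-with-postgresql">
        <feature>authservice-no-database</feature>
        <feature>authservice-db-postgresql</feature>
    </feature>
</features>
```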

This means that trying out the authservice application can be done by:

  1. Download and start apache karaf using the quick start guide
  2. Install authservice with derby by loading the feature repository using the maven coordinates, and then loading the feature that pulls in the application and the derby database
  3. All bundles required by the features are pulled in from maven central and the application is started and becomes available on http://localhost:8181/authservice
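
From the karaf console, the derby install can look roughly like this (the repository URL is the one listed above; the feature name is an assumption):

```
feature:repo-add mvn:no.priv.bang.authservice/authservice/1.6.0/xml/features
feature:install authservice-with-derby
```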

Installing and starting the PostgreSQL variant takes a little more work (but not much):

  1. Install and start PostgreSQL (left as an exercise for the reader)
  2. Create a PostgreSQL user named karaf with a password (in the examples “karaf”)
  3. Create a blank database named “authservice” owned by user karaf
  4. Download and start apache karaf using the quick start guide
  5. Create database connection configuration from the karaf console
  6. Install authservice with PostgreSQL by loading the feature repository using the maven coordinates, and then loading the feature that pulls in the application and the PostgreSQL database
  7. All bundles required by the features are pulled in from maven central and the application is started and becomes available on http://localhost:8181/authservice
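
Steps 5 and 6 from the karaf console can look roughly like this (the PID, property names and feature name are assumptions):

```
config:edit no.priv.bang.authservice.db.postgresql.PostgresqlDatabaseProvider
config:property-set jdbcUrl "jdbc:postgresql:///authservice"
config:property-set jdbcUser karaf
config:property-set jdbcPassword karaf
config:update
feature:repo-add mvn:no.priv.bang.authservice/authservice/1.6.0/xml/features
feature:install authservice-with-postgresql
```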

A Java programmers guide to delivering webapp frontends

When I started investigating (googling) web frontends, I found tutorials for various frameworks and I found tutorials using node.js to deliver frontends to web browsers on localhost.

What I did not find, was something telling me how I should pack up and deliver the webapp frontend from my production system, so that comes here:

  1. A web application frontend is a piece of javascript code that builds its user interface by rebuilding the DOM tree of the webpage that loaded it
  2. The javascript needed for a webapp is packaged up as a big javascript file, typically named bundle.js
  3. The initial HTML file references bundle.js
  4. The initial HTML file also contains an element that matches what the javascript code expects to be its start element (i.e. the point in the DOM of the parsed HTML file, where the javascript program starts rebuilding its own DOM)

Most react.js tutorials start with an example like this (this is the “main()” of the javascript program):

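Typically something like this (a generic example of the classic ReactDOM entry point, not taken from a specific tutorial):

```javascript
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App';

ReactDOM.render(<App />, document.getElementById('root'));
```
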
The above code example tries to find the element with id root in the DOM of the surrounding HTML file, and replaces it with the DOM of the <App/> react component (i.e. by running the render() method of the component).

This means that for anything to appear in the HTML page there must be an element in the HTML page with id “root”.  If not, react won’t find a place to start rewriting the DOM and won’t do anything.

A minimal HTML page loading bundle.js and with an element where javascript can start rebuilding the DOM, looks like this:

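A minimal version can look like this (title and paths are placeholders):

```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>webapp</title>
  </head>
  <body>
    <div id="root"></div>
    <script src="bundle.js"></script>
  </body>
</html>
```
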
This page will load the bundle.js and contains the element <div id="root"></div> where react can start rewriting the DOM.

It is possible to serve this two-part web application as static files, i.e. dropping the index.html and bundle.js files into a directory served by nginx or apache will be sufficient to make the webapp load and start rendering in a web browser.

The frontend people and most frontend tutorials use node.js to build the bundle.js and serve the index.html and the bundle.js. That’s fine for developing the actual frontend application, but not for delivering the application to the production environment.

Working in java my first thought was “how do I package up the frontend in a jar file that can be deployed to a maven repository?”. The rest of this article attempts to answer this question.

  1. The frontend-maven-plugin is used to build a bundle.js from the src/main/frontend/ directory of a project
  2. The bundle.js created by frontend-maven-plugin is included as a resource in the jar (handled by having frontend-maven-plugin drop the bundle.js in the target/classes/ directory)
  3. The index.html file is put into src/main/resources/ and ends up as a resource in the jar
  4. A servlet is added to the code.  The servlet will return “index.html” for the request path “/” and otherwise look through the classpath for a match

And that’s it, basically.
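
The servlet in step 4 can be sketched roughly like this (the class name and details are made up; real code would also need content-type handling):

```java
import java.io.IOException;
import java.io.InputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class FrontendServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
        String pathInfo = request.getPathInfo();
        // "/" (or no path) maps to index.html, everything else is looked up on the classpath
        String resource = (pathInfo == null || "/".equals(pathInfo)) ? "index.html" : pathInfo.substring(1);
        try (InputStream classpathResource = getClass().getClassLoader().getResourceAsStream(resource)) {
            if (classpathResource == null) {
                response.sendError(HttpServletResponse.SC_NOT_FOUND);
                return;
            }
            response.setStatus(HttpServletResponse.SC_OK);
            classpathResource.transferTo(response.getOutputStream());
        }
    }
}
```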

You will need something to wire up your servlet to a path in the servlet container.   This could e.g. be a WEB-INF/web.xml deployment descriptor of a war file, or it could be the annotations of servlet 3.0. Personally I use the OSGi web whiteboard: the servlet becomes a declarative services (DS) component that is started and plugs into a path (web context) in the servlet container.

Another thing to consider: if you use react-router or similar to rewrite the URL when moving to a different page, you will want to be able to enter the frontend application at all of the paths matching a page.

Some frontend examples (all using OSGi web whiteboard):

  1. The frontend-karaf-demo (a single maven module creating a declarative services component plugging into the web whiteboard on the web context “frontend-karaf-demo”, becoming available at http://localhost:8181/frontend-karaf-demo/ when loaded into apache karaf)
  2. The authservice user administration GUI
  3. The ukelonn weekly allowance web application

Use Jersey to provide REST APIs from karaf applications

The sample application https://github.com/steinarb/jersey-demo demonstrates how to use Jersey to provide a REST API from a Declarative Services (DS) component registering with the Pax Web Whiteboard Extender in apache karaf.

The way Jersey works is that it initially scans a package in the classpath for classes with JAX-RS annotations (e.g. @Path, @GET, @POST), and matches @Path annotations to possible endpoints in the REST API. When Jersey receives a REST call to an endpoint that matches a resource, Jersey will instantiate the appropriate resource class, call a method on the instantiated resource, and then release the resource to Java garbage collection.

To be able to do something useful in these resource objects we use HK2 to dependency inject OSGi services.

To be able to dependency inject OSGi services into Jersey resources the following is done:

  1. Create a whiteboard DS component exposing a Servlet service from Jersey ServletContainer, configuring the package to scan for API resources:
    @Component(
        service=Servlet.class,
        property={"alias=/jerseyinkaraf/api",
                  HttpWhiteboardConstants.HTTP_WHITEBOARD_SERVLET_INIT_PARAM_PREFIX+ServerProperties.PROVIDER_PACKAGES+"=no.priv.bang.demos.jerseyinkaraf.webapi.resources"
        } )
    public class CounterServiceServlet extends ServletContainer {
        ...
    }
  2. Make the DS component of ServletContainer depend on OSGi services Counter and LogService
    public class CounterServiceServlet extends ServletContainer {
        ...
        @Reference
        public void setCounter(Counter counter) {
            this.counter = counter;
        }
    
        @Reference
        public void setLogservice(LogService logservice) {
            this.logservice.setLogService(logservice);
        }
        ...
    }
  3. Override the ServletContainer.init(WebConfig) method, and:
    1. Call super.init(WebConfig) to make sure that a ServletConfig containing the information set up by the http whiteboard is created (contains the servletcontext, the servlet name and the package to scan for Jersey resources)
              super.init(webConfig);
    2. Copy the ResourceConfig of the ServletContainer (because the existing ResourceConfig is immutable after setup, and calling register() on it will throw an exception)
              ResourceConfig copyOfExistingConfig = new ResourceConfig(getConfiguration());
    3. On the ResourceConfig copy, register an anonymous inner class extending AbstractBinder that in its configure() method registers the OSGi services injected into the ServletContainer as JSR330 injections in the Jersey resources
              copyOfExistingConfig.register(new AbstractBinder() {
                      @Override
                      protected void configure() {
                          bind(logservice).to(LogService.class);
                          bind(counter).to(Counter.class);
                      }
                  });
    4. Call ServletContainer.reload(ResourceConfig) with the ResourceConfig copy as the argument
              reload(copyOfExistingConfig);

      Note: The copyOfExistingConfig object at this point contains both the initial configuration created by the ServletContainer itself, and the two added OSGi services for dependency injection.

  4. In the resources use JSR330 injections of the registered services
    @Path("/counter")
    @Produces(MediaType.APPLICATION_JSON)
    public class CounterResource {
    
        @Inject
        Counter counter;
    
        @GET
        public Count currentValue() {
            return counter.currentValue();
        }
    
        @POST
        public Count increment() {
            return counter.increment();
        }
    
    }
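
Putting steps 3.1 through 3.4 together, the overridden init method looks roughly like this (using the counter and logservice fields injected in step 2):

```java
@Override
public void init(WebConfig webConfig) throws ServletException {
    super.init(webConfig);
    // The existing ResourceConfig is immutable after init, so work on a copy
    ResourceConfig copyOfExistingConfig = new ResourceConfig(getConfiguration());
    copyOfExistingConfig.register(new AbstractBinder() {
            @Override
            protected void configure() {
                bind(logservice).to(LogService.class);
                bind(counter).to(Counter.class);
            }
        });
    // Swap in the copy that has both the original config and the OSGi services
    reload(copyOfExistingConfig);
}
```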

To try out the application:

  1. Clone this project and build it:
    git clone https://github.com/steinarb/jersey-demo.git
    cd jersey-demo
    mvn clean install
    
  2. Install apache karaf and start it  according to the karaf quick start guide (alternatively, see Develop OSGi applications using karaf)
  3. At the karaf command line give the following commands
    feature:repo-add mvn:no.priv.bang.demos.jerseyinkaraf/jerseyinkaraf/LATEST/xml/features
    feature:install jerseyinkaraf.webapi
    feature:install jerseyinkaraf.webgui
  4. Open http://localhost:8181/jerseyinkaraf in a web browser and click on the button to increment the counter value

The demo consists of the following maven project and modules:

  1. jerseyinkaraf The top project which in addition to containing common configuration, and the list of modules, also creates a karaf feature repository containing the features of all bundles created in the modules, and attaches the karaf feature repository to the maven artifact
  2. jerseyinkaraf.servicedef which is an OSGi bundle containing the interfaces and bean defining the OSGi service for doing the counting
  3. jerseinkaraf.services which is an OSGi bundle containing a DS component implementing the services
  4. jerseyinkaraf.webapi which is an OSGi bundle containing a DS component that defines a REST API that plugs into the Web Whiteboard Extender and exposes the OSGi services
  5. jerseyinkaraf.webgui which is an OSGi bundle containing a DS component that exposes an HTML and JavaScript application that plugs into the Web Whiteboard Extender

Deliver react.js from apache karaf

There’s a small barebones demo application/testbed that delivers a single-page web application, created with react.js from apache karaf, at https://github.com/steinarb/frontend-karaf-demo

Screenshot of the demo open on the “counter” page

The react.js application is webpack‘d into a single bundle.js file. The packing is done by the frontend-maven-plugin (i.e. no local node.js installation required).

The bundle.js and its wrapping index.xhtml page, are served as static resources from the OSGi bundle’s classpath by a DS component that registers a Servlet service with the web whiteboard extender on the path /frontend-karaf-demo

The react.js-related features used in this application are:

  • react.js to render the application’s page
  • redux to hold the application’s state
  • redux-saga and axios for REST API communication and asynchronous update of the application’s state (the REST service used by axios, is provided by an OSGi DS whiteboard Servlet)
  • react-router to route between different pages (however this requires hardcoding the ReactServlet’s path into the react app, and requires the ReactServlet to return the top level index.html from all routes)
  • react-bootstrap and bootstrap v3 to style the application in a responsive way (I haven’t made any effort to make it look good. I just include the bootstrap style and activate the responsive formatting in a meta header tag in the index.html page)

To try out the application:

  1. Clone this project and build it:
    git clone https://github.com/steinarb/frontend-karaf-demo.git
    cd frontend-karaf-demo/
    mvn clean install
  2. Install apache karaf and start it  according to the karaf quick start guide (alternatively, see Develop OSGi applications using karaf)
  3. At the karaf command line give the following commands
    feature:repo-add mvn:no.priv.bang.demos/frontend-karaf-demo/LATEST/xml/features
    feature:install frontend-karaf-demo
  4. Open http://localhost:8181/frontend-karaf-demo in a web browser and try it out

Faking a debian repository for package development

I use aptly to deliver my unofficial debian packages both to myself and others that might be interested.

However I’ve found that using aptly to do package development is a bad idea, because you can’t (by design, probably) overwrite packages in an aptly archive.  You can only create new versions.

For some installation tests it’s OK to use “dpkg --install”. But if your package needs to pull in dependencies, or if you wish to test a package upgrade, you need to use APT.

This article explains how to create a fake debian repository for use in package development.

Initial setup

Initial setup:

  1. Create the repository directory (this should be done as the same user you use to build the package)
    mkdir -p /tmp/repo
  2. Open /etc/apt/sources.list in a text editor and add the following (you need to be root to do this)
    # Test install apt archive
    deb file:///tmp repo/
    

Add new package to the repo

This is the development cycle:

  1. Build the package (the example builds karaf)
    cd ~/git/karaf-debian/
    dpkg-buildpackage
  2. Copy the package to the fake repo (this example uses karaf, replace with your own package and package version):
    cp ~/git/karaf_4.1.5-0~9.30_all.deb /tmp/repo
  3. Generate a Packages file
    (cd /tmp; dpkg-scanpackages repo >repo/Packages)

Upgrading an existing package

This has to be done as root:

  1. First update APTs database of packages and archives
    apt-get update
  2. Run an upgrade with the following command
    apt-get dist-upgrade
  3. There will be a question asking if you wish to continue; continuing is the default, so just press ENTER
    Do you want to continue? [Y/n]
  4. There will be a warning that the package to be installed cannot be authenticated; the default here is not to install, so press “y” (without the quotes) to continue
    Install these packages without verification? [y/N]

Installing a package

This is used to e.g. test that a package is able to install its dependencies.

These operations have to be done as root:

  1. First update APT’s database of packages and archives
    apt-get update
  2. Use the following command to install a package and its dependencies, this example installs apache karaf:
    apt-get install karaf
  3. If the package pulls in new dependencies there will be a prompt asking whether you wish to continue. The default is to continue, so just press ENTER
    Do you want to continue? [Y/n]
  4. There will be a warning that the package to be installed cannot be authenticated; the default here is not to install, so press “y” (without the quotes) to continue
    Install these packages without verification? [y/N]

Develop OSGi applications using karaf

Apache Karaf is a good platform for deploying OSGi based applications. Karaf is also a good platform for testing and debugging these applications. This article describes how to test and debug OSGi bundles and OSGi applications with karaf and eclipse.

The basic flow of development, is:

  1. Build the application with maven
  2. Start karaf as your own user in a way that makes it listen for remote debug connections
  3. Install the application from the karaf console
  4. Connect eclipse to karaf for a remote debug session
  5. Make karaf listen to updates to bundles with SNAPSHOT versions, and automatically load the updated bundles
  6. Make a change in eclipse
  7. Run a maven build of the project
  8. Observe that the change appears in karaf or in the debugger

For more detail, read on.

Preparations

Do the following:

  1. Install eclipse
    1. Open a web browser on https://www.eclipse.org/downloads/ and download the installer for your system
    2. Run the installer to the end, using defaults for everything
  2. Install apache maven
    1. On debian GNU/linux, just do
      apt-get install maven
  3. Download and unpack the binary karaf distribution. In bash on GNU/linux (or in git bash on Windows), do:
    mkdir -p ~/karaf
    cd ~/karaf/
    wget http://apache.uib.no/karaf/4.1.4/apache-karaf-4.1.4.tar.gz
    tar --strip-components=1 -xf apache-karaf-4.1.4.tar.gz

Clone and build the sample application

  1. Clone and build the sample application
    mkdir -p ~/workspaces/ws01
    cd ~/workspaces/ws01/
    git clone https://github.com/steinarb/hello-karaf-demo.git
    cd ~/workspaces/ws01/hello-karaf-demo/
    mvn clean install
  2. Import the sample application into eclipse:
    1. Start eclipse
    2. Open the workspace at ~/workspaces/ws01
    3. Import the existing maven project hello-karaf-demo into the workspace (File->Import… then select Maven->Existing Maven Projects and click the wizard through to the end)

Install the application into karaf

  1. Open a command line window and start karaf with an argument that makes karaf listen for remote debug connections
    cd ~/karaf/
    bin/karaf debug
  2. From the karaf console prompt:
    1. Install the run-time requirements for the test application
      feature:repo-add mvn:no.priv.bang.osgi.service.adapters/service-adapters-karaf/1.0.0/xml/features
      feature:install adapter-for-osgi-logservice
      feature:install pax-http-whiteboard
    2. Install the sample application with the following commands
      bundle:install mvn:no.priv.bang.demos/hello-karaf-demo/1.0.0-SNAPSHOT
      bundle:start mvn:no.priv.bang.demos/hello-karaf-demo/1.0.0-SNAPSHOT
  3. Open a web browser on the following URL http://localhost:8181/hello

Debug into the application running in karaf

  1. Set up a remote debug connection in karaf
    1. Open the menu Run->Debug Configurations…
    2. In the debug configurations dialog:
      1. Select “Remote Java Application”
      2. Click the “New launch configuration”
      3. In the Name field, write
        Remote karaf debug
      4. In the Port field, write
        5005
      5. Select the “Source” tab
      6. Click the button “Add…”
      7. In the “Add Source” dialog:
        1. Select “Java Project”
        2. Click the button “OK”
        3. In the “Project Selection” dialog
          1. Click the checkbox of “hello-karaf-demo”
          2. Click the button “OK”
      8. Click the button “Debug”
  2. Set a breakpoint in the code
    1. Open the HelloServlet.java file (press Ctrl-Shift-t to open the “Open Type” dialog, type HelloServlet and select the HelloServlet class)
    2. Go to line 60 (press Ctrl-l, then type 60, and press Enter), i.e. this line
                  response.setStatus(200);
    3. Press Ctrl-Shift-b to set a breakpoint on that line
  3. Reload the web page in the web browser
  4. Observe that the debugger stops at the breakpoint
  5. Press F8 to make the code continue to run

Make a modification in the code, picked up by karaf

    1. In the karaf console, give the following command to make karaf watch the local maven repository for new versions of bundles with SNAPSHOT versions
      bundle:watch *
    2. Modify the code in eclipse
      1. In the HelloServlet.java file, change
            private static final String TITLE = "Hello world!";
            private static final String PARAGRAPH = "This is sent via PAX Web Whiteboard Extender.";
        

        to

            private static final String TITLE = "Hello karaf world!";
            private static final String PARAGRAPH = "Powered by Apache Karaf.";
        
      2. Save the HelloServlet.java file
    3. Build the project with maven
      1. In eclipse, select the menu Run->Run Configurations…
      2. In the dialog Run Configurations:
        1. Select “Maven Build” in the list on the left
        2. Click the “New launch configuration” icon (top left in the dialog)
        3. In the “Name:” field, type:
          mvn install hello-karaf-demo
        4. In the “Base directory” field, type:
          ${project_loc:hello-karaf-demo}
        5. In the “Goals” field, type:
          install
        6. Click the button “Run”
    4. Reload the web page in the web browser and observe that the change now is present

Installing with a karaf feature

In the installation example earlier in this article the runtime requirements of the sample application were first installed into karaf (pax-http-whiteboard and adapter-for-osgi-logservice).

It is possible to install this example application in a way that also pulls in the dependencies:

  1. The first thing to do, is to remove the existing installation in karaf:
    1. Stop karaf
      logout
    2. Remove the “data” subdirectory
      rm -rf data
    3. Start karaf
      bin/karaf debug
  2. Removing the data directory removes all state in karaf, returning karaf to a fresh installation. This can be verified by:
    1. listing the bundles in the karaf console
      bundle:list
    2. Verifying that http://localhost:8181/hello results in a 404 Not Found
  3. Next, add the feature repository for the hello world application. The feature repository is an XML file containing one or more “karaf features”. A karaf feature can be a list of bundles to load, and it can also contain references to other features.  The feature file for the hello world application is attached to the maven bundle artifact and can be installed with the following command:
    feature:repo-add mvn:no.priv.bang.demos/hello-karaf-demo/1.0.0-SNAPSHOT/xml/features

    Note: This is the same karaf command as the first command of the dependencies install. The pax-http-whiteboard feature is built into karaf and doesn’t require a feature repository install

  4. Now the feature for the hello world application can be installed
    feature:install hello-karaf
  5. Verify that the feature install has pulled in dependencies
    bundle:list
  6. Verify that the application is running on http://localhost:8181/hello

Some things to try on your own:

  1. Remove the data directory again and try installing the bundle without the dependencies and see what happens
  2. Start karaf watching for bundle modifications in the local maven repository
  3. List bundles on different levels with the bundle:list command with the threshold argument (hint: all karaf commands take the "--help" argument)
  4. Try launching the “mvn install” for the bundle using eclipse hotkeys after modifying the java file and observe that karaf loads the rebuilt module

Installing apache karaf on debian

Until the RFP (Request For Packaging) bug for karaf in the debian bug tracker is resolved, here is an APT archive with a karaf package for debian (architecture “all”).  The package is created using native debian packaging tools, built from a source tarball, and the APT archive itself is created using aptly.

The package has been tested on Debian 9 “stretch” (the current stable), amd64.

Do the following commands as root on a debian GNU/linux system:

  1. Add the key for the APT archive
    wget -O - https://apt.bang.priv.no/apt_pub.gpg | apt-key add -
  2. Open the /etc/apt/sources.list file in a text editor, and add the following lines:
    # APT archive for apache karaf
    deb http://apt.bang.priv.no/public stable main
  3. Install karaf with apt-get
    apt-get update
    apt-get install openjdk-8-jdk karaf
  4. Log in with SSH (password is “karaf” (without the quotes)) and try giving some commands:
    ssh -p 8101 karaf@localhost