Sakila: Dagger 2 Dependency Injection

REST web server and dependency injection

The code from this article is available here.

I already described basic usage of Dagger 2 in this article; now we need to implement the dependency injection mechanism in a web application. A typical web application has at least two layers: one for the web server itself and another for client request processing.

The Spark Java web server internally uses an embedded Jetty web server. To set up and start the server we provide some central services such as configuration, authentication, and statistics. These services are usually instantiated once per application.

For each client request a new thread is allocated for the whole duration of request processing. Each client request triggers a method defined in the router.

Depending on design decisions, we usually want each request to be processed by a new object instance (an example is ActorResource in the picture). If we don't create a new instance each time a request is received, we will probably run into multi-threading problems or simply leak data from one user to another.

Some objects needed in a typical request-processing scenario have different lifecycle requirements. For example, a database transaction object must stay the same for the whole duration of the started transaction but be different for each user; sometimes we even need two independent transactions within one service request, and a transaction typically spans multiple service objects.

On the other hand, when we do not require a transaction (read-only processing), we are better off taking the first available connection from the connection pool and holding it for the shortest time possible.

As these use cases show, there are very different lifecycle scenarios, and dependency injection must support them all.

Scopes

We will need at least two scopes for our web application. The first is the application-level scope. In Dagger this scope exists by default: each time we tag a class with @Singleton, the object is instantiated at the application level and all subsequent requests for that object return the same instance. The singleton therefore represents the "application scope" by default; no specific scope definition is needed.

Classes without any scope annotation (no @Singleton or any other scope) are always provided as a new instance.

To manage injection at the application level we create an ApplicationComponent interface and an ApplicationModule class.
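A minimal sketch of what the application-level component could look like; MainWebServer stands in for whatever class receives the injected members, and the subcomponent factory refers to the request scope described below:

```java
import javax.inject.Singleton;
import dagger.Component;

// Application-level injector: lives as long as the application itself.
@Singleton
@Component(modules = ApplicationModule.class)
public interface ApplicationComponent {

    // Members-injection entry point for the class that boots the web server.
    void inject(MainWebServer webServer);

    // Factory method for the request-scoped subcomponent (see "Request scope" below).
    RequestComponent requestComponent(RequestModule requestModule);
}
```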

Application module class:

At the application level the example demonstrates the following use cases (a sketch follows the list):

  • creating an instance with a supplied constructor parameter (the ConfigProperties service)
  • instantiating objects from external dependencies with a provide method (Gson)
  • instantiating a specific implementation of an interface (the ResponseTransformer interface)
  • instantiating a String object with a @Named annotation (using the name as a differentiator)
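A hedged sketch of such a module; JsonTransformer, the ConfigProperties API, and the concrete property values are illustrative assumptions, and the HikariCP data source is added here because the request module below needs one:

```java
import javax.inject.Named;
import javax.inject.Singleton;
import javax.sql.DataSource;

import com.google.gson.Gson;
import com.zaxxer.hikari.HikariDataSource;
import dagger.Module;
import dagger.Provides;
import spark.ResponseTransformer;

@Module
public class ApplicationModule {

    // Instance created with a supplied constructor parameter.
    @Provides @Singleton
    ConfigProperties provideConfigProperties(@Named("configFile") String configFile) {
        return new ConfigProperties(configFile);
    }

    // Object from an external dependency provided by a factory method.
    @Provides @Singleton
    Gson provideGson() {
        return new Gson();
    }

    // Specific implementation bound to an interface.
    @Provides @Singleton
    ResponseTransformer provideResponseTransformer(Gson gson) {
        return new JsonTransformer(gson); // JsonTransformer is an assumed implementation
    }

    // String differentiated by its @Named qualifier.
    @Provides @Named("configFile")
    String provideConfigFileName() {
        return "application.properties"; // illustrative value
    }

    // Connection pool shared by the whole application (HikariCP).
    @Provides @Singleton
    DataSource provideDataSource(ConfigProperties config) {
        HikariDataSource ds = new HikariDataSource();
        ds.setJdbcUrl(config.get("db.url"));       // ConfigProperties API is assumed
        ds.setUsername(config.get("db.user"));
        ds.setPassword(config.get("db.password"));
        return ds;
    }
}
```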

Request scope

To create a "request scope" we write one annotation interface (@interface), one component (RequestComponent), and at least one module class (RequestModule). The component must carry the @Subcomponent annotation.

To manage DI at the request-scope level we create the RequestScope annotation type, the RequestComponent interface, and the RequestModule class.

To be clear: each class annotated with @RequestScope is instantiated exactly once per created instance of RequestComponent. In other words, scope annotations represent local singletons.
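A possible sketch of the scope annotation and the subcomponent; the ActorResource entry point assumes that class has an @Inject constructor, and each type would live in its own file:

```java
// RequestScope.java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import javax.inject.Scope;

// Custom scope: one instance per RequestComponent instance.
@Scope
@Retention(RetentionPolicy.RUNTIME)
public @interface RequestScope {
}

// RequestComponent.java
import dagger.Subcomponent;

// Request-level injector, created once per client request.
@RequestScope
@Subcomponent(modules = RequestModule.class)
public interface RequestComponent {
    ActorResource actorResource();
}
```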

Module class:

We use these localized singletons especially for transaction handling and jOOQ data access support.

Provide methods are optional; alternatively, we can decorate classes directly in the source code with the corresponding annotations.
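A sketch of the request module; the DataSource comes from the application scope and the jOOQ configuration details are simplified:

```java
import javax.sql.DataSource;

import org.jooq.DSLContext;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;

import dagger.Module;
import dagger.Provides;

@Module
public class RequestModule {

    // One DSLContext (and therefore one transaction context) per request.
    @Provides @RequestScope
    DSLContext provideDSLContext(DataSource dataSource) {
        return DSL.using(dataSource, SQLDialect.POSTGRES);
    }

    // ActorDao is generated by jOOQ, so it cannot be annotated in its source;
    // we provide it here instead.
    @Provides @RequestScope
    ActorDao provideActorDao(DSLContext ctx) {
        return new ActorDao(ctx.configuration());
    }
}
```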

Service classes

If we analyze the code in the consumer classes, it becomes ridiculously simple. All externalized requirements are created by the Dagger-generated code almost hassle-free.

In the ActorResource class, for example, we analyze the received request, extract parameters, and start the business logic. The transaction object is created in the request scope and passed down to all service objects that need to collaborate in the same database transaction.

In the ActorService class we receive all constructor parameters from Dagger automatically.

The ActorService class requires two objects in its constructor: the jOOQ DSLContext class and the ActorDao class.

DSLContext is part of the jOOQ data access library and is instantiated with the provider method provideDSLContext. It is annotated with @RequestScope, which means the RequestComponent keeps a single instance of it for the duration of one request cycle.

The ActorDao object is generated by the jOOQ library, so we cannot tag it with a scope annotation in its source; that is why we wrote the provideActorDao method in the request module.
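A sketch of how the consuming service could look; Actor and ActorDao are the jOOQ-generated POJO and DAO, and the method body is illustrative only:

```java
import java.util.List;
import javax.inject.Inject;
import org.jooq.DSLContext;

public class ActorService {

    private final DSLContext ctx;
    private final ActorDao actorDao;

    // Both parameters are supplied by Dagger through the RequestComponent.
    @Inject
    public ActorService(DSLContext ctx, ActorDao actorDao) {
        this.ctx = ctx;
        this.actorDao = actorDao;
    }

    // Illustrative read-only operation using the generated DAO.
    public List<Actor> findAll() {
        return actorDao.findAll();
    }
}
```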

Summary

Dagger calculates all dependencies at compile time, generates the required code for the whole dependency graph, and is able to instantiate the appropriate objects at the appropriate times really fast.

 

The code from this article is available here.

Other resources:

More about Dagger scopes and subcomponents.

 

Sakila: Sample app project setup

This sample application will integrate quite a few very nice open source tools available to every developer:

  • PostgreSQL – database
  • Flyway – database migration tool
  • jOOQ – Java object-oriented querying + HikariCP connection pool
  • Dagger 2 – dependency injection
  • SparkJava – fast and simple web server + Gson JSON serializer
  • Polymer – JavaScript SPA application framework
  • Vaadin Elements – UI components

The application will consist of many modules:

PostgreSQL – database

Initialize the sample database

To start, we will install the sample Sakila database in the local PostgreSQL instance: restore the downloaded file into a locally created database.

The Sakila sample database is an open source demo database that represents the data model of a DVD rental store. It consists of 15 relational tables, 7 views, and a few other database objects, and it is full of data.

That gives us a database full of test data for the development environment. If we need an empty database (for production, for example), we have to run an initialization DDL script to build it.

To create such a script from the existing database, use the pg_dump command, which can export a database in the form of SQL commands:

To export the database without any data (schema definitions only), we use the "schema only" (-s) option.

Flyway migrations

Create the Flyway config file and a "migrations" folder under the project root.

Add the "fw" command somewhere on the PATH.

Put the "V1.0.1__sakila_init.sql" file in the migrations folder. If everything works as expected, the "info" command should report the pending migration.

Flyway migration and the initial database state after a database restore

After restoring the database with test data in it, we need to "baseline" the initial migration. The initial SQL script that creates the empty database was bypassed by the restore, so the V1.0.1__sakila_init.sql migration script is still reported as pending.

With the baseline command we declare that this migration is not needed and mark the migration version as already migrated.

 

Set up the Java server project

In the IDE (IntelliJ IDEA Community 2017.2) create a new console-type project "sakilaweb/server".

Set up the git-bash terminal as the default IntelliJ terminal.

jOOQ – object-oriented querying

Create the jOOQ config file and add the jooq command somewhere on the PATH.

Bash command:

Add the "jooq-3.10.1.jar" library to the project dependencies. Add "postgresql-42.1.4.jar" if you use the same database.

Run the code generation tool with the "jooq" command in a terminal at the project root.

After the code is successfully generated in the "./database" folder, you get a bunch of ready-made database-related code (database schema, POJOs, and DAOs).

The project with the generated code now looks like this:

Set up Dagger 2

Configure IDEA for annotation processing.

Add the Dagger dependencies (dagger-compiler as "Provided" only, because it is used just for code generation).

Set up the SparkJava web server

Add a few references to the project dependencies and set up a "hello world" web sample, just to be sure everything works as expected before the real coding starts.

Create the main procedure as:
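A minimal sketch of a SparkJava hello-world entry point (the port and route are illustrative assumptions):

```java
import static spark.Spark.get;
import static spark.Spark.port;

public class Main {

    public static void main(String[] args) {
        port(4567); // SparkJava's default port, set explicitly here
        get("/hello", (request, response) -> "Hello world!");
    }
}
```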

If you now run the application, you should already get the first page:

Publish to GitHub

First enable VCS support in the local project and add a .gitignore file. Next, add the files to the local git repository created in the project.

If we want to push code to a remote repository, we first need to create one to commit to. Log in to GitHub and create a new empty repository.

The code for the server side project is available here.

 

 

Next: in the next installment I will put the generated database layer to use and expose the first REST service.

Dependency injection with Dagger 2

Prepare the project to use Dagger DI in the IntelliJ IDE

The IntelliJ IDEA Community version 2017.2.5 is used in this example.

Open the project structure and add Dagger as a project dependency from the Maven repository.

  • com.google.dagger:dagger:2.10
  • com.google.dagger:dagger-compiler:2.10

The Dagger compiler has to be available only during the development phase (annotation processing generates the code), so we add it as a "Provided" dependency.

Enable annotation processing in the project settings and select a content root for the generated code.

If you wish to inspect the generated code in the project tree, it is wise to open the "generated" folder manually and mark it as "Source root". After compiling, the project tree will look like this:

Create a simple service application

First we create a very simple application without DI. The service application consists of one Application class that starts everything up and a single Service class that does some work.
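A sketch of this starting point, before any DI is added (the method names are illustrative):

```java
// Service.java - does some work and has no external requirements.
public class Service {
    public String doWork() {
        return "work done";
    }
}

// Application.java - starts everything up and creates the service by hand.
public class Application {

    private final Service service = new Service();

    public void start() {
        System.out.println(service.doWork());
    }

    public static void main(String[] args) {
        new Application().start();
    }
}
```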

Implement Dagger dependency injection

To start as simply as possible, we will not change how the application is instantiated and started. Dagger is initialized in the Application object, and the only injected instance will be an instance of the Service class.

Now we create a "Module" class in which we implement a "provide" method that supplies a new instance of the service class. The module class must have the @Module annotation and the factory methods must have @Provides annotations.
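A sketch of the module (SimpleModule and provideService() are the names used later in this article):

```java
import dagger.Module;
import dagger.Provides;

@Module
public class SimpleModule {

    // Factory method: Dagger calls it whenever a Service instance is needed.
    @Provides
    Service provideService() {
        return new Service();
    }
}
```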

Next we need a "Component", an interface from which Dagger generates the injector class. The component must carry a @Component annotation listing its modules and declare the inject method signature.
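A sketch of the component (SimpleComponent is an assumed name; the inject target is the Application class from this example):

```java
import dagger.Component;

@Component(modules = SimpleModule.class)
public interface SimpleComponent {

    // Dagger generates DaggerSimpleComponent from this interface.
    void inject(Application application);
}
```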

The Service class is not changed because we do not inject anything into it; the service has no external requirements, so the code stays exactly the same as before:

Connect everything together

In the Application constructor we initialize Dagger with the build() and inject() calls. The injection itself is triggered by the @Inject annotation in front of the local "service" member declaration.
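A sketch of the wired-up Application class, assuming the SimpleComponent sketch above (DaggerSimpleComponent is the class Dagger generates from it):

```java
import javax.inject.Inject;

public class Application {

    // Field injection: must not be private, because Dagger generates plain Java code.
    @Inject
    Service service;

    public Application() {
        // Build the generated component and inject this instance's members.
        DaggerSimpleComponent.builder()
                .simpleModule(new SimpleModule())
                .build()
                .inject(this);
    }

    public void start() {
        System.out.println(service.doWork());
    }

    public static void main(String[] args) {
        new Application().start();
    }
}
```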

If Dagger is to inject a class member, the member must be accessible (not private), because Dagger does not use any reflection (which is good for speed and debugging!). Alternatively, we can keep fields private and use constructor injection: a constructor marked with the @Inject annotation that takes all required parameters.

Extending our knowledge

@Inject on the constructor instead of a provide method

The provideXXX methods in modules are not the only way to tell Dagger how to construct objects: instead of a provide method, we can simply annotate the object's constructor with @Inject.

Because only one constructor can have the @Inject annotation, Dagger knows exactly how the object should be constructed. The special provideXXX method in the module is no longer needed.

Try adding a constructor annotated with @Inject to the Service class and deleting the provideService() method from the SimpleModule class; rebuild the project and everything should work as before. If you inspect the generated classes, you will see that Dagger now generates a special factory object containing the construction mechanism.

The code after changes should look like:
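A sketch of the Service class with constructor injection:

```java
import javax.inject.Inject;

public class Service {

    // Dagger uses this constructor directly; no provide method is required.
    @Inject
    public Service() {
    }

    public String doWork() {
        return "work done";
    }
}
```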

and the module:
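A sketch of the module, which can now stay empty (or be removed from the component entirely if nothing else is provided):

```java
import dagger.Module;

@Module
public class SimpleModule {
    // provideService() deleted: Dagger constructs Service via its @Inject constructor.
}
```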

 

More about the topic

DI and the Dagger tool have many features not covered in this article. You can find more about them here.

 

Java Spring Boot project setup

You can get the source code of this project from the GitHub repository "SpringBootMyApp".

Create Spring Boot Maven project

Go to the Spring Boot project generator website and select a minimal project definition with these dependencies:

  • Web: dependency for the embedded Tomcat server
  • jOOQ: integrated SQL query language and data model code generator
  • Flyway: database migration tool
  • PostgreSQL JDBC driver

After downloading the project ZIP file, unzip it to a folder and open that folder in the IntelliJ IDE.

Startup class

The project automatically includes the embedded Tomcat server, and the Spring application starter configures the whole application at startup.
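A minimal sketch of the startup class (the class name is an assumption based on the project name):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Enables auto-configuration and component scanning, and starts the embedded Tomcat.
@SpringBootApplication
public class SpringBootMyAppApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBootMyAppApplication.class, args);
    }
}
```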

Set up project properties

Under src/main/resources find the application.properties file and add the database access and Flyway configuration. The application will run the migration procedure at every start and synchronize the schema to the correct version automatically.

Flyway migration files

You need a folder where your SQL migration files will reside:

Add the first migration file; for example, "create currency table" will be saved in the file V1_0_0__currency.sql:

Be extra careful with the names: the first part (V1_0_0) is the version of the migration file and the second part is a description, separated by two underscore characters.

The scripts must be correct SQL! Validate them first in a database environment. If a script is not valid, the migration procedure breaks and you will probably have a hard time figuring out what went wrong.

Now run the application: the database is updated automatically according to the migration scripts, because Flyway starts together with the application.

More information is available in this video tutorial about Flyway and Spring Boot.

Maven plugin settings

If you want more control over code generation and the migration life cycle, add the Flyway and jOOQ Maven plugins:

Add these properties to the properties section of the pom.xml file:

  • The database connection settings in pom.xml are used by the jOOQ code generator and the Flyway migration Maven plugins.

Flyway and jOOQ dependencies and versions

You probably already have them in the generated pom.xml; add them if you don't.

The versions were not required by Spring Boot, but I added them because I encountered differences between the plugin versions and the Spring Boot versions of the libraries.

Now you need to define the plugins as follows:

Flyway and jOOQ Maven commands

After you add the plugin definitions to the Maven pom.xml file, you get new lifecycle commands:

When the Flyway commands are used directly, they search for SQL files in the target "classes" folder and not in the source tree ("src"). When the project is compiled, the target folder is synchronized with the current version of the files in the source tree.

After compiling the project you can run the "flyway:migrate" command, for example. You can always check the files in the target folder and delete them if you are not sure you have the latest version.

jOOQ code generation

jOOQ generates data model code from your database. The code is added to the project in the package you defined in the plugin definition. It is wise to split the configuration into a properties part and a plugin definition part.

Every time you compile the project, the database model code is regenerated. Check here for more information about jOOQ.

You can verify that the database model code is regenerated simply by deleting a line from one of the generated files and recompiling the project.

More about jOOQ & Spring Boot can be found here.

Create JSON service

Now I need to test whether the environment is set up as it should be. I will create a simple currency service as a sample and test basic CRUD operations.

Create a simple data container class

To send a JSON payload back and forth between client and server, we create a simple transport class:
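A sketch of such a transport class (the field names are assumptions based on a typical currency table):

```java
// Simple transport (DTO) class serialized to and from JSON.
public class CurrencyDto {

    private Long id;
    private String code;
    private String name;

    public CurrencyDto() {
    }

    public CurrencyDto(Long id, String code, String name) {
        this.id = id;
        this.code = code;
        this.name = name;
    }

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getCode() { return code; }
    public void setCode(String code) { this.code = code; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```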

Service class and database interaction with jOOQ

Now we create a service class through which we interact with the database:
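A sketch of the service; the CURRENCY table reference, the generated record getters, and the package of the generated classes are all assumptions about the jOOQ output:

```java
import java.util.List;

import org.jooq.DSLContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import static com.example.myapp.database.Tables.CURRENCY; // assumed generated class

@Service
public class CurrencyService {

    private final DSLContext dsl;

    // Spring Boot auto-configures a DSLContext when jOOQ is on the classpath.
    @Autowired
    public CurrencyService(DSLContext dsl) {
        this.dsl = dsl;
    }

    public List<CurrencyDto> findAll() {
        return dsl.selectFrom(CURRENCY)
                  .fetch()
                  .map(r -> new CurrencyDto(r.getId(), r.getCode(), r.getName()));
    }

    public CurrencyDto insert(CurrencyDto dto) {
        Long id = dsl.insertInto(CURRENCY, CURRENCY.CODE, CURRENCY.NAME)
                     .values(dto.getCode(), dto.getName())
                     .returning(CURRENCY.ID)
                     .fetchOne()
                     .get(CURRENCY.ID);
        dto.setId(id);
        return dto;
    }
}
```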

REST controller with router mapping

In the end we need a controller class to expose the access points of the REST service:
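A sketch of the controller; the /api/currency base path follows the URL used in the interactive test below:

```java
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/currency")
public class CurrencyController {

    private final CurrencyService currencyService;

    // The service is instantiated and injected by Spring (DI via @Autowired).
    @Autowired
    public CurrencyController(CurrencyService currencyService) {
        this.currencyService = currencyService;
    }

    @GetMapping
    public List<CurrencyDto> list() {
        return currencyService.findAll();
    }

    @PostMapping
    public CurrencyDto add(@RequestBody CurrencyDto currency) {
        return currencyService.insert(currency);
    }
}
```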

Don't forget to let DI (dependency injection) instantiate the services via the @Autowired annotation. This simplifies development a lot!

Interactive testing

At this point the service should work as expected; just run the application and navigate to "localhost:8080/api/currency".

You can search for a specific currency by appending a path variable to the URL:

Postman application for HTTP API testing

To inspect the service in more detail on the client side, you can use the Postman application.

Look at an example of adding a currency with a POST message:

In the header I add a "content-type" field with the value "application/json", and in the payload a JSON message with the new currency structure. On the right side I received "200 OK" and the currency record with the new row identifier registered in the database.

 

Prepare JavaEE/JavaScript development environment

Environment

My current environment consists of Windows 10 on an HP Pavilion i7 notebook with 8 GB RAM and 2 x 500 GB SATA disks.

Java

Install the Java JDK and set the JAVA_HOME and JRE_HOME environment variables.

IDE: NetBeans 8.2

Download and install NetBeans for Java EE development.
The NetBeans IDE has a very small font for menus, the project browser, etc. You can change the base font size with a parameter at NetBeans startup:

You can add this parameter to the etc\netbeans.conf file like this:

NetBeans plugins

QuickOpener plugin

Opening an OS shell (Windows command window) at the location of the selected file is an important feature.

Open Tools/Plugins, then search for and install the QuickOpener plugin.

Select a file in the project tree and hit Alt+1; a command window will open.

Markdown

Markdown is a text-to-HTML conversion tool for web writers. Markdown allows you to write using an easy-to-read, easy-to-write plain text format, then convert it to structurally valid XHTML (or HTML).
In NetBeans 8.2 this plugin is installed from a downloaded "nbm" installation file.

NB MindMap Editor

The plugin keeps information in Markdown-compatible text files (*.MMD) and shows the content as graphs. It is installed from the NetBeans plugin repository.

 

Java EE Server: Apache TomEE Plume 7.0.2

Download the TomEE Plume server and unzip it to a folder, for example:

and install it as a service (with a batch file command).

Prepare the tomcat-users.xml file

The username and password are written in the tomcat-users.xml file under the conf folder.

Install the TomEE server as a Windows service

Run as administrator: service.install.as.admin.bat

After installation, check all service settings with the "tomee.exe" application. In the Java tab you should have:

Screen captures:

If the service does not start (Windows/Services), check the log files and react accordingly.

Correct IDE proxy setting in server properties

If the TomEE server will not start or stop properly from inside the NetBeans IDE, you need to change the proxy settings.

Go to:
Servers > (Apache TomEE server) > Platform > "Use IDE proxy settings"

Uncheck "Use IDE proxy settings"!

Why use the TomEE Plume server?

Hot deployment!

TomEE is highly compatible with other Java EE servers (JBoss, WildFly, etc.) but is simpler for development because of its built-in hot deployment support! After changes in resources (Java files, HTML files, JavaScript files, etc.) there is no need to rebuild and restart the server. A browser window opened in debug mode (in Chrome with the NetBeans extension) refreshes automatically.

You need to run the application in Debug mode to benefit from hot deployment, and unfortunately not all changes in Java files are supported.

 

WildFly/JBoss server for production

For the production application server I will use WildFly.
Download the ZIP file (wildfly-10.1.0.Final.zip) and unzip it to a working folder where your standalone server will run. You can also download the Red Hat JBoss EAP product; it is essentially the same but with some additions.

It is a good idea to have one instance of the production Java EE server installed so that applications can be tested against it.

Set up the Chrome browser

For debugging you will need the latest Chrome browser.

Add it to the PATH.
Add the folder of the "chrome.exe" program to the PATH environment variable (for example: "C:\Program Files (x86)\Google\Chrome\Application").
You can check whether the Chrome folder is already in the path with a Windows PowerShell command:

Debugging Java and JavaScript

When you run a web application with "Chrome with NetBeans Connector" for the first time and don't have the connector installed, you will get a dialog for installing the Chrome extension. After selecting the install option in the dialog, Chrome opens with the proper extension already prepared for installation. Just add the extension to Chrome and rerun the web application under debug.

The program will now stop at any breakpoint in Java or JavaScript files.

Gradle build environment

Download the Gradle binary distribution and unzip it to a folder (C:\Programs\gradle-3.2).

In NetBeans go to Tools/Plugins and install the Gradle plugin. After a NetBeans restart, go to Tools/Options/Miscellaneous/Gradle and, under Gradle Installation, change the Gradle installation directory to the local folder where you installed Gradle. Put the same folder in the Gradle home directory as well.

Flyway database migration tool

Go to the Flyway website and download it. Unzip it to some location (C:\Programs\flyway-4.0.3); you can configure it directly in the installed folder or do some extra work to configure it locally inside your Java project.

Cygwin environment

If you are using Windows, you will probably install Cygwin. After downloading and running the installation program, just check under the "Shell" group that the bash shell is selected. You can add additional packages, for example the "bash command completion" package.

You can always add additional packages to an existing Cygwin installation; just repeat the installation procedure.

Additional packages

All/Utils/tree: Display graphical directory tree

 

NetBeans opens a bash terminal via the Window/IDE Tools/Terminal menu command.

The home folder of your locally installed Cygwin is at the location of the "HOME" environment variable. Set this variable to the desired location on your disk, then create a "home" folder with a subfolder named after your user name.

Create an additional 'bin' folder inside the home folder and add it to the path. You can then write automated bash scripts, put them there, and use them as instantly accessible commands.

No spaces in PATH variable

You should check the PATH environment variable in Windows: Cygwin doesn't convert the path correctly if there is a blank space between folders. The path will still work in Windows, but not in Cygwin!

For example, the path in the example will be converted as:

If any part of the path is not properly converted to /cygdrive/ mappings, you will not be able to rely on the system path.

NetBeans – TerminalExtras plugin

This plugin enables a shortcut key that changes the current folder of an already open terminal to the selected node's directory (with Alt+.).

To change the current folder with the "Alt+." command you need to:

  • have a local "Terminal" window open
  • have a node selected in the project tree (any node)
  • press the "Alt" and "." keys

After the key command on the project's "Source Packages" node, a "cd" command is executed and focus moves to the terminal.