2. Assignment 2

In this task you will learn how to build and integrate services with different technologies. You will build Java EE services and a service facade that communicate via different remoting techniques. Furthermore, you will learn about inversion of control (IoC), aspect-oriented programming, and metaprogramming – mechanisms that are a key component of many modern application frameworks. Before you get started, don’t forget to pull the updates from the code template repository.

Points

Task                           Points
Service Oriented Architecture  13
Dependency Injection           5.75
Aspect Oriented Programming    6.25
Assignment Interview           25

2.1. Service Oriented Architecture

Fig. 2.1 shows the high-level architecture and interaction of our services. An authentication service provides credential validation and token management. The trip service is a backend service providing business logic. Additionally, a service facade provides a homogeneous interface to access the service backend. The services are integrated via Java EE CDI, JAX-RS REST web services, and gRPC remote procedure calls.


Fig. 2.1 Service and integration architecture

For this task, the template contains five modules that reflect a typical modularization scheme for similar applications.

ass2-service/api
    contains the specifications and interfaces for our application, which all other modules depend on.

ass2-service/auth
    contains the implementation of the authentication service and the gRPC service implementation to expose it.

ass2-service/auth-client
    contains the code used to call the authentication service.

ass2-service/trip
    contains the trip service implementation and the corresponding REST resource implementation.

ass2-service/facade
    contains the code and configuration for the service facade. It uses the auth-client to authenticate requests.

The two services and the facade are isolated applications that can be deployed individually into individual application containers. The automated tests deploy your applications into a Spring Boot container, although any Java EE compatible application container would work (e.g., GlassFish or TomEE). You should take a look at the automated tests and see how the application container is created and configured.

Hint

If you are already familiar with building applications using Spring Boot, then using Java EE in combination with Spring Boot may seem awkward. Please note that you will not be using the org.springframework facilities for implementing these tasks. Unbeknownst to many Spring users, Spring supports Java EE to a high degree. We use Spring Boot simply because it allows us to easily spawn an application container and deploy our applications there.

Testing

We provide unit and integration tests for individual components, and runnable applications for the auth service, the trip service, and the facade. Because of the relatively complex setup, there’s no single mvn test target you can execute for this task. Instead, to test the implementation of your components, run the unit tests in the respective module from the IDE, or run mvn test -Pass2-service -Dtest=<test>, where <test> specifies the class name of the test you want to execute, for example: mvn test -Pass2-service -Dtest=ProtoSpecificationTest.

To manually test the interplay of components, you can run the three Spring Boot applications individually (find them in the test directory of their respective module). The automated tests are configured to insert testdata on application startup. To insert testdata before running the application manually, run the Spring Boot application with the profile testdata. You can do this by passing --spring.profiles.active=testdata as a program argument (you will need to edit the run configuration in your IDE). Remember that if you run multiple applications, you should only let your first application insert testdata (i.e., either your TripApplication or the AuthenticationServiceApplication).

You are of course encouraged to add your own tests, but please do not change existing ones. Also, if you want to change configurations for development purposes that is fine, but please make sure the provided set of tests run with the provided configuration before you submit.

2.1.1. Services

In this task, you will implement the business logic of the services, and configure the application context via Java EE CDI. In later tasks, you will glue the services together by exposing REST resources and using gRPC.

2.1.1.1. TripService

The Trip Service is the backend service for other services to manage trips. It allows riders via the client application to create and plan trips, and provides an interface to other backend services to change the status of trips and matches.

Implement the ITripService and put your implementation in the ass2-service-trip module in the impl package. The expected behavior of the methods is described in the Javadoc of the respective methods. Use the appropriate Java EE CDI annotations to make your service available as a singleton to the application container, and to inject the required services. The application context already provides, for example, an IDAOFactory and IModelFactory instance (see the TripApplicationConfig in the test directory). The application container also creates a persistence context that provides an EntityManager. Furthermore, the application container provides a Java Transaction API (JTA) transaction manager that you should make use of via the javax.transaction API.

Users (riders) can initially create trips, which requires the create function that creates a Trip object in the database and returns a TripDTO. The TripDTO’s purpose is to hide database internals from any callers of the service. As long as the Trip is in the CREATED state, users can then continue to set additional stops via the addStop or removeStop method. Each time the route of a trip changes, you should update the fare estimate using the IMatchingService. The IMatchingService is provided by the application context and can be injected via CDI. The calculateFare method also checks whether the trip is valid (e.g., whether one of the stops is outside the mobility service zone), and will throw an exception if the trip is invalid. The create, addStop, and removeStop methods do not have to check the validity of a trip. You also do not have to use locks for addStop and removeStop to ensure that the trip’s state does not change. Instead, set the fare estimate of the TripDTO to null if calculateFare throws an exception.

The confirm function validates the trip (see the exceptions in the Javadoc for details) and indicates that a rider has accepted the fare estimate and wishes to initiate a matching to a driver. To that end, use the IMatchingService to put the trip’s ID into the matching queue via queueTripForMatching. The idea is that, once a match has been found, a service will call the match function of ITripService with a MatchDTO containing the necessary information to finalize the match. However, you need to transactionally create the Match entity in the database. Confirm that the selected Driver wasn’t assigned in parallel (e.g., by writing a query to check whether the driver has never been assigned, or whether the driver’s last trip is completed or cancelled), and that no other involved database entities (excluding locations) were removed by a different transaction. The service should only create the Match entity and change the Trip state to MATCHED if these conditions are met. Furthermore, if an exception occurs, you should re-queue the Trip object. You will also need to make sure that writes to the involved database entities are guarded against concurrent access (other distributed components could be reading or writing the entities in the database concurrently). To that end, make use of JPA locking mechanisms. You should be able to reason about the selected locking mechanism during the submission interview.

The complete method completes the Trip lifecycle and creates a TripInfo from the given DTO. All other methods should be straight-forward to implement from the Javadoc descriptions.

Hint

You are allowed to extend the DAO interfaces from Assignment 1. However, please do not modify existing method declarations.

To test this stage, run the TripServiceTest, TripServiceCDITest & TripStateManagementTest. These tests spin up the Spring Boot application, insert the test data, and execute your service implementation.

Hint

Make sure to use Java 11. Specifically, the TripStateManagementTest will otherwise fail because the mocks are incompatible with other Java versions.

2.1.1.2. AuthenticationService

Many web applications use token-based authentication, where a user supplies their credentials and receives in return a token that identifies them. The token is then passed to each subsequent service call. This type of authentication has become particularly popular for REST applications that are typically stateless (and hence don’t retain login session information themselves). To facilitate token-based authentication for our application, you will now build a service that validates user credentials, and manages authentication tokens. Later we will expose this service via gRPC.

The general interface IAuthenticationService in the API has an extension in the ass2-service-auth module that you should implement. Find the ICachingAuthenticationService in the ass2-service-auth module, and put your implementation in the subpackage impl. Implement the methods according to their specification, and make the service available to the container as a singleton. For the authentication, remember that the passwords are stored as (unsalted) SHA-1 hash sums in the database. As tokens, you can again use randomly generated UUIDs.
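For hashing the supplied password before comparing it against the stored value, the JDK’s MessageDigest is sufficient. A minimal sketch (the helper class name and the lowercase-hex output format are assumptions; check the test data for the exact stored format):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Sha1Util {

    // Computes the lowercase hex SHA-1 digest of the given string,
    // which can then be compared to the stored password hash.
    public static String sha1Hex(String input) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-1");
            byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            // SHA-1 is guaranteed to exist on every standard JRE
            throw new IllegalStateException("SHA-1 not available", e);
        }
    }
}
```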

To speed up access to the participant data, the service should load all emails and passwords from the database into memory once at startup (hence the additional ‘Caching’ name in the service). Use the appropriate CDI annotation to run the loadData method after the bean has been constructed by the container.

When changing a password, your service should update both the cache and the respective database entry. Make sure that the changePassword and authenticate methods are mutually exclusive. However, you should keep performance in mind: authenticate will be called quite often, whereas changePassword will not. A simple synchronized block is therefore not a good solution.
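One way to make authenticate and changePassword mutually exclusive without penalizing the frequent reads is a ReentrantReadWriteLock: many authenticate calls can hold the shared read lock concurrently, while the rare changePassword takes the exclusive write lock. A sketch with illustrative names (not part of the template):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class CredentialCache {

    private final Map<String, String> passwordHashesByEmail = new HashMap<>();
    private final ReadWriteLock lock = new ReentrantReadWriteLock();

    // Many threads may authenticate concurrently: take the shared read lock.
    public boolean matches(String email, String passwordHash) {
        lock.readLock().lock();
        try {
            return passwordHash.equals(passwordHashesByEmail.get(email));
        } finally {
            lock.readLock().unlock();
        }
    }

    // Password changes are rare: take the exclusive write lock, which
    // blocks until all in-flight reads have finished.
    public void updatePassword(String email, String newPasswordHash) {
        lock.writeLock().lock();
        try {
            passwordHashesByEmail.put(email, newPasswordHash);
            // in the real service, also update the database entry here,
            // so that cache and database stay consistent
        } finally {
            lock.writeLock().unlock();
        }
    }
}
```

A java.util.concurrent.locks.StampedLock would be another reasonable choice for such a read-mostly workload.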

When authenticating a user, feel free to implement and use the findByEmail method in the IRiderDAO.

To test this stage, run the AuthenticationServiceTest, AuthenticationServiceCDITest & AuthenticationServicePersistenceTest. These tests spin up the Spring Boot application, insert the test data, and execute your service implementation.

Hint

Make sure to use Java 11. Specifically, the AuthenticationServicePersistenceTest will otherwise fail because the mocks are incompatible with other Java versions.

2.1.2. Remote Procedure Call

Remote Procedure Call (RPC) is a very well-known remoting technology. There exist a large variety of implementations, some of which you are probably already familiar with (Java Remote Method Invocation for example). For this task, we will use Google’s cross-platform RPC framework gRPC to enable communication with our authentication service. Specifically, you will create a service definition using Protocol Buffers, and implement client and server classes for exposing the authentication and token validation mechanism.

Before you get started, we suggest you read at least the gRPC Java tutorial, which covers most of the aspects used for this task. Note that you will not have to start the actual gRPC client or server, as this is done by the application container instantiated by the automated tests.

2.1.2.1. Service Description

Before generating server or client code, we need to have a service definition. In gRPC, such service definitions are written via the Protocol Buffers platform. Implement your authentication service definition in the auth.proto file. The gRPC framework generates Java code from this definition during build-time. The build integration for gRPC is already configured. If you change the proto file, first build your application with Maven. You can then find the generated source files in target/generated-sources/protobuf. In IntelliJ IDEA, you will need to include the generated-sources directories to the class path. You can do this by simply running the action “Generate Sources and Update Folders” (either via the “Find Action” menu, or right click on the folder in the project view, select maven and select “Generate Sources And Update Folders”).

The gRPC service should be named AuthService, placed in the package dst.ass2.service.api.auth.proto and expose the following methods:

  • authenticate: takes as input an AuthenticationRequest that contains a user’s email and password as strings, and returns an AuthenticationResponse that contains, if the user was authenticated successfully, the authentication token issued by the service.

  • validateToken: takes as input a TokenValidationRequest that contains the token to validate as a string, and returns a TokenValidationResponse that contains information about the token validity.

Make sure you name the methods and messages just as described here, but how exactly you structure the messages is up to you. Also, make sure to enable the java_multiple_files option in the proto file. The unit tests for this task only do basic sanity checking of your description.
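As a starting point, a possible shape for auth.proto is sketched below. Only the service, method, and message names are prescribed by the assignment; the field layout, field numbering, and the package option are assumptions you may structure differently:

```protobuf
syntax = "proto3";

// generate one Java file per message, as required by the assignment
option java_multiple_files = true;

package dst.ass2.service.api.auth.proto;

service AuthService {
  rpc authenticate(AuthenticationRequest) returns (AuthenticationResponse);
  rpc validateToken(TokenValidationRequest) returns (TokenValidationResponse);
}

// field layout is illustrative; structure the messages as you see fit
message AuthenticationRequest {
  string email = 1;
  string password = 2;
}

message AuthenticationResponse {
  string token = 1;
}

message TokenValidationRequest {
  string token = 1;
}

message TokenValidationResponse {
  bool valid = 1;
}
```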

The ProtoSpecificationTest performs sanity checks of the generated class.

2.1.2.2. Authentication Server

Now that we have our service description, you should use the generated code to implement the service methods. Find the respective generated gRPC class for implementing the service, and extend it in the ass2-service-auth module. The implementation should make use of the IAuthenticationService, which is available through the application context. Once you make your implementation available to the container via CDI, you can execute the test application in the test folder, which will start a gRPC server and register your service. Furthermore, implement the IGrpcServerRunner interface (i.e., build & start the gRPC server) and make it available to the container via CDI.

The GrpcServerRunnerTest will test if a connection is possible, but the best way to test the server implementation is to implement the client and run its AuthenticationClientTest integration scenario.

2.1.2.3. Authentication Client

Now, implement a gRPC client for the authentication service, and hide the implementation behind the IAuthenticationClient interface. Extend the GrpcAuthenticationClient implementation in the ass2-service-auth-client module, and make use of the generated gRPC stubs to call the server.

To test your implementation, execute the AuthenticationClientTest. The test requires a functioning authentication server. The test will start up the authentication server you implemented previously, and use your client to call the server.

2.1.2.4. Exception handling

Make sure that the domain specific exception (i.e., NoSuchUserException or AuthenticationException) are propagated from the server to the client. Have a look at the gRPC Java tutorials and examples for solutions on how to handle exceptions, or think of your own solution.

While we do not provide tests that verify the proper handling of exceptions, we expect you to know why this is important and how you can implement it in Java during the assignment interview.

2.1.3. REST Web Services

Web services are an important mechanism for providing interoperability between computing systems on the Internet. In this task, you will implement a REST web service that acts as a service facade to our service backend. This allows controlled exposure of our backend services for use over HTTP by, for example, a frontend web application.

Specifically, you will use JAX-RS to create REST web services that expose your backend services. The tests deploy the application into a Jersey runtime. We suggest that you browse this great book: RESTful Java with JAX-RS 2.0.

2.1.3.1. Trip Web Service

We will now expose the trip service as a REST service. The service should provide the following methods via the specified HTTP verbs (where {id} is the trip id). Unless otherwise specified, the return body of methods can be left empty.

  • getTrip: GET /trips/{id} (returns a TripDTO as a JSON document).

  • createTrip: POST /trips (form parameter: riderId, pickupId, destinationId) (returns the ID of the created trip object).

  • deleteTrip: DELETE /trips/{id}

  • addStop: POST /trips/{id}/stops (form parameter: locationId) (returns the new fare).

  • removeStop: DELETE /trips/{id}/stops/{locationId}

  • confirm: PATCH /trips/{id}/confirm

  • match: POST /trips/{id}/match (body: MatchDTO)

  • cancel: PATCH /trips/{id}/cancel

  • complete: POST /trips/{id}/complete (body: TripInfoDTO)

Implement the ITripServiceResource (put your code into the ass2-service-trip module) and make use of your ITripService implementation. Expose the REST endpoints by annotating the ITripServiceResource with the correct JAX-RS annotations. Furthermore, make the JAX-RS runtime aware of your implementation with the appropriate JAX-RS annotation.

Return status

Make use of appropriate HTTP status codes. Your services should never return 500 errors for error cases that are part of the specification. For example, if a resource (e.g., an entity with a given id) cannot be found, return a 404 error. Think of appropriate response codes in other cases, e.g., on transactional errors.

Exception handling

Instead of catching the EntityNotFoundException and the other exceptions from the service interface’s package within your resource implementation, implement four respective ExceptionMapper classes (including one for IllegalStateException) to map the exceptions to an appropriate status code. Don’t forget to make the JAX-RS runtime aware of your mappers.

Testing

To test this stage, either run the TripServiceResourceTest and the TripServiceExceptionMapperTest, or execute the TripApplication manually. Note that the automated tests depend on a functioning instance of ITripService. If you start the application manually, the Spring Boot application will start a Web server on port 8091 and expose your resources. You can then test your REST interface by manually executing requests via a client library, e.g., Postman. Remember to use the Spring Boot profile testdata if you want to have test data inserted.

2.1.3.2. Service Facade

The service facade exposes services to the public network. It exposes the entire ITripServiceResource, and the IAuthenticationResource. Implement both resources (IAuthenticationResourceFacade, ITripServiceResourceFacade) and place them into the ass2-service-facade module.

  • The facade’s ITripServiceResource implementation should forward all requests to the actual service implementation implemented earlier. To that end, call the REST service you created in the previous task via a JAX-RS REST client (i.e., the delegate). The base URI of the remote service is available via the URL bean tripServiceURI.

    The straightforward approach would be to write an HTTP client that calls the remote REST endpoints, and then call the HTTP client from a wrapping REST controller. This would result in a lot of boilerplate code. Because we have previously written a clean specification of our resource via JAX-RS and an interface, the specification can be re-used by other frameworks to generate clients for it. Jersey provides an extension called jersey-proxy-client, which creates a dynamic proxy instance that exposes the resource interface but internally makes a REST call to a specified URL, which is exactly what we need, and we encourage you to use it. Check out these two online resources on how to use the library: [1], [2]

  • The IAuthenticationResource provides a REST resource:

    authenticate: POST /auth/authenticate (form parameter: email, password) (returns the auth token as string)

    For your implementation, inject and call the IAuthenticationClient (i.e., the delegate) which is made available by the tests. Again, make use of appropriate status codes.

  • Make all facade implementations managed via the @ManagedBean annotation. Only then the tests can verify proper usage of the delegates.

To test this stage, run the AuthenticationResourceTest, TripServiceResourceFacadeTest & AuthResourceFacadeTest.

You can also test the integration of the facade manually. Run the Spring Boot trip service application and the authentication service application (remember to run one of them with the testdata profile and the authentication service additionally with the profile grpc). Then, run the ServiceFacadeApplication, which starts a Web server with your resources on port 8090. You can then execute commands on the facaded resources via a HTTP client, or write your own integration tests.

Manual Testing Disclaimers/Caveats

In the following, we address known caveats, mostly revolving around the manual testing of the facade. Please be aware that the manual testing is not what our grading focuses on - make sure that the delivered unit tests are passing. However, there are some known caveats in case you want to manually verify the facade implementation (which we encourage!):

  • PATCH:

The PATCH method is not easy to configure/set up, as the Jersey implementation (which you use to create the WebClient) needs to be manually configured to enable this HTTP method. A solution can be found in the discussion, but here are the necessary properties that you need to set:

.property(HttpUrlConnectorProvider.SET_METHOD_WORKAROUND, true)
.property(ClientProperties.SUPPRESS_HTTP_COMPLIANCE_VALIDATION, true);

This does not necessarily affect the unit tests (which should run just fine), this just happens when you call the PATCH method when starting the application manually (as suggested in the assignment). We are aware of this issue and you therefore do not need to worry about that as long as the unit tests run.

  • Do not forget to set the @Consumes annotation.

  • In case of the DELETE request, you can choose an appropriate way to handle the request (e.g., implement a cascading remove).

  • In our implementation, the match endpoint does not have to be called manually but is invoked via the confirm method.

  • If you run into an UnsupportedOperationException: Not a resource exception while testing the facade, try moving the annotations from the class to the interface.

  • If you want to manually test the AuthService via the facade, you might have to modify the GrpcServerRunner.run method to prevent it from shutting down immediately.

2.1.3.3. Authentication Filter

We will now include an authentication layer into our service facade, by implementing access restriction based on our token-based authentication approach. To do this in a non-intrusive way (i.e., without having to change the interfaces or the service implementation), implement a custom ContainerRequestFilter to intercept incoming requests to the facade. The filter should:

  • Extract the HTTP Authorization header field from the request, and from that the auth token. The authorization scheme is Bearer, so an example header could look like Authorization: Bearer <token>.

  • Validate the token via the IAuthenticationClient.

  • Abort the request and return an appropriate HTTP status if no token was given or the token could not be validated.
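Extracting the token from the header value is plain string handling; a small helper like the following (the class name and the null-based signaling are illustrative) keeps the filter readable:

```java
public class BearerTokenParser {

    private static final String SCHEME = "Bearer ";

    // Returns the token from an "Authorization: Bearer <token>" header
    // value, or null if the header is missing, empty, or uses a
    // different authorization scheme.
    public static String extractToken(String authorizationHeader) {
        if (authorizationHeader == null || !authorizationHeader.startsWith(SCHEME)) {
            return null;
        }
        String token = authorizationHeader.substring(SCHEME.length()).trim();
        return token.isEmpty() ? null : token;
    }
}
```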

Put your implementation in the ass2-service-facade module, and make it available to the JAX-RS runtime. Note that, if the filter is made available you can inject beans (e.g., the IAuthenticationClient) managed by the runtime in the same manner as you did when implementing your services.

Only requests to the trip service resource should be filtered. A request to the authenticate method in the IAuthenticationResource does of course not require an authentication token. JAX-RS provides several ways to achieve this, e.g., by binding your filter to target components.

The AuthenticationFilterTest and AuthenticationFilterResourceTest will verify your implementation.

Note

The AuthenticationFilterResourceTest will internally fail with a server exception because no trip service has been spawned to which the facade’s delegate can forward the request. This is expected behavior; the important part is that the authentication token is properly verified.

2.2. Inversion of Control

Inversion of Control (IoC) is an important principle in application frameworks to achieve separation of concerns between general framework functionality, and application specific functionality. There are several IoC mechanisms which we will explore in the remainder of the assignment. The solution for this task has no direct relationship to the application scenario, as these tasks concern framework-level functionality.

2.2.1. Dependency Injection

Dependency Injection (DI) is a key IoC mechanism implemented by frameworks to achieve dependency inversion, e.g., in Spring, EJB, or OSGi. Many other IoC techniques rely on DI, because it allows the application container (the thing running your application) to retain control of the objects of the application to invoke them or return potentially modified versions of the objects. We often say that such objects are container managed. We will take a detailed look at the DI pattern and implement a simple custom DI container using Java’s metaprogramming facilities.

Make yourself familiar with the basics of the Java Reflection API. Your task is to implement the dst.ass2.ioc.di.IObjectContainer interface, and annotate the custom annotations in dst.ass2.ioc.di.annotation with the appropriate retention policies and target types. Return an instance of your IObjectContainer implementation from ObjectContainerFactory#newObjectContainer(Properties). Add all your implementations to the dst.ass2.ioc.di.impl package.
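Keep in mind that reflection only sees annotations whose retention policy is RUNTIME. A simplified sketch of suitable declarations (the real annotations in dst.ass2.ioc.di.annotation carry additional attributes such as scope, optional, or targetType):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// RUNTIME retention is required: CLASS or SOURCE retention would make
// the annotation invisible to a reflection-based container.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)      // @Component goes on classes
@interface Component { }

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)     // @Inject goes on fields
@interface Inject { }

@Component
class Example {
    @Inject
    Object dependency;
}
```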

Example

The following listing shows a small functioning example

public class Main {
    public static void main(String... args) {
        IObjectContainer container = ObjectContainerFactory.newObjectContainer(new Properties());
        container.getProperties().setProperty("my_date_format", "yyyy-MM-dd HH:mm:ss");

        DateLogger logger = container.getObject(DateLogger.class);
        logger.logNow(); // prints for example "1985-10-26 01:16:00"
    }
}

@Component
class DateLogger {

    @Inject(targetType = JavaTextDateFormatter.class)
    private DateFormatter formatter;

    public void logNow() {
        System.out.println(formatter.format(new Date()));
    }

}

interface DateFormatter {
    String format(Date date);
}

@Component
class JavaTextDateFormatter implements DateFormatter {

    @Property("my_date_format")
    private String dateFormatString;

    private Format format;

    @Initialize
    public void init() {
        format = new SimpleDateFormat(this.dateFormatString);
    }

    public String format(Date date) {
        return format.format(date);
    }

}

2.2.1.1. Creating and injecting objects

Annotating any instantiable class with @Component indicates that objects of this type can be managed by the IObjectContainer. There are two different types of components: singletons and prototypes (defined by the scope attribute of Component). If a component has the scope singleton, then the object container ensures that there is only one instance of the class within the container. In other words, calling IObjectContainer.getObject(MySingletonComponent.class) will always return the same object instance. In the case of a prototype component, getObject will always return a new instance. Each object container instance manages its own pool of singletons, so calling getObject on different IObjectContainer instances will return different object instances.

Annotating a field of a component class with @Inject indicates that an instance of the field’s type should be injected into the field by the container when creating the component. Annotating a method with @Initialize indicates that the container should run this method after the object has been created. Creating an object’s dependencies and then automatically injecting the dependencies into the respective fields is the process called autowiring.

The getObject function that implements this functionality should behave as follows:

  • Trying to create an object of a class that is not annotated with @Component should lead to an InvalidDeclarationException.

  • Our DI mechanism depends on default constructors. If, during the autowire process, a class cannot be instantiated, e.g., because it does not have a default constructor, throw an ObjectCreationException.

  • Every declared (public or private) and inherited field in a component’s class hierarchy should be processed.

  • You need to get or create all necessary dependencies in the entire object graph (i.e., across multiple levels of dependencies).

  • Injections can be optional, indicated by the optional flag in the @Inject annotation. If true, no exception is thrown if it is, for whatever reason, not possible to create the dependency or set the field.

  • The targetType attribute can specify the concrete subtype that should be instantiated to create the dependency. By default, the field’s declared type is used.

  • It is not required to deal with circular dependencies.

  • Wrap all checked exceptions from Java’s reflection module into an appropriate InjectionException.
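To illustrate the core of the autowiring loop, here is a deliberately simplified container sketch using plain reflection. It elides the optional flag, the targetType attribute, and the dedicated exception types of the actual interface, and reduces the annotations to the attributes used here:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.TYPE)
@interface Component { boolean singleton() default true; }

@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.FIELD)
@interface Inject { }

class MiniContainer {

    private final Map<Class<?>, Object> singletons = new ConcurrentHashMap<>();

    public <T> T getObject(Class<T> type) {
        Component component = type.getAnnotation(Component.class);
        if (component == null) {
            // the real container throws InvalidDeclarationException here
            throw new IllegalArgumentException(type + " is not a @Component");
        }
        if (!component.singleton()) {
            return type.cast(instantiate(type));
        }
        // double-checked creation so concurrent getObject calls for an
        // uninitialized singleton still yield exactly one instance
        Object existing = singletons.get(type);
        if (existing == null) {
            synchronized (singletons) {
                existing = singletons.get(type);
                if (existing == null) {
                    existing = instantiate(type);
                    singletons.put(type, existing);
                }
            }
        }
        return type.cast(existing);
    }

    private Object instantiate(Class<?> type) {
        try {
            Object instance = type.getDeclaredConstructor().newInstance();
            // walk the whole class hierarchy so inherited fields are wired too
            for (Class<?> c = type; c != Object.class; c = c.getSuperclass()) {
                for (Field field : c.getDeclaredFields()) {
                    if (field.getAnnotation(Inject.class) == null) {
                        continue;
                    }
                    field.setAccessible(true);                    // also reach private fields
                    field.set(instance, getObject(field.getType())); // recurse through the object graph
                }
            }
            return instance;
        } catch (ReflectiveOperationException e) {
            // the real container distinguishes ObjectCreationException
            // and InjectionException here
            throw new RuntimeException("cannot create " + type, e);
        }
    }
}

@Component
class Engine { }

@Component
class Car {
    @Inject Engine engine;
}
```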

2.2.1.2. Post-construct initialization

Often we want to execute code after the autowiring process has been completed. Our DI container facilitates this via the @Initialize annotation. The container should behave as follows:

  • After instantiating a component, methods (either declared or inherited) annotated with @Initialize should be executed by the container. If an execution of such a method fails, throw an ObjectCreationException.

  • Methods annotated with @Initialize should not have any parameters. If they do, throw an InvalidDeclarationException.

  • In cases where the object hierarchy has multiple methods annotated with @Initialize or overrides previously annotated methods, you should come up with your own behavior.

2.2.1.3. Property injection

The object container also allows the injection of values from java.util.Properties objects via the @Property annotation. The value of the @Property annotation, e.g. @Property("my_key") refers to the key my_key of the Properties object. You should inject the value of that key into the respective field. Properties are always required, so if the Properties object does not contain a requested key, then throw an ObjectCreationException with an appropriate message.

Because values in the properties may have a different type from the field it should be injected into, the autowire process needs to deal with type conversions. The value in the Properties object will be a string, as properties are typically read from files. To inject the value of myint into a field @Property("myint") Integer myInt;, you need to inspect the type of the field and convert the value accordingly. The supported type conversions are as follows: string values should be converted to boxed primitive types (Integer, Float, Boolean, etc.) or their primitive equivalents (int, float, boolean, etc.). Void types do not have to be considered. Propagate any string parsing errors using a TypeConversionException. Other objects or data structures such as Enums, Lists or Maps do not need to be considered, in which case you can throw a TypeConversionException.
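The string-to-primitive conversions can be centralized in a single helper. A sketch (in your implementation, wrap parse failures and unsupported target types in TypeConversionException instead of the generic exceptions used here):

```java
public class PropertyConverter {

    // Converts the string value from the Properties object to the
    // (possibly primitive) target field type; parse errors propagate
    // as unchecked exceptions from the parseXxx methods.
    public static Object convert(String value, Class<?> type) {
        if (type == String.class) return value;
        if (type == Integer.class || type == int.class) return Integer.parseInt(value);
        if (type == Long.class || type == long.class) return Long.parseLong(value);
        if (type == Short.class || type == short.class) return Short.parseShort(value);
        if (type == Byte.class || type == byte.class) return Byte.parseByte(value);
        if (type == Double.class || type == double.class) return Double.parseDouble(value);
        if (type == Float.class || type == float.class) return Float.parseFloat(value);
        if (type == Boolean.class || type == boolean.class) return Boolean.parseBoolean(value);
        if (type == Character.class || type == char.class) {
            if (value.length() != 1) {
                throw new IllegalArgumentException("not a single character: " + value);
            }
            return value.charAt(0);
        }
        // Enums, Lists, Maps etc. are out of scope per the assignment
        throw new IllegalArgumentException("unsupported target type: " + type);
    }
}
```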

It should be possible to change values of the Properties object returned by IObjectContainer.getProperties(), such that newly created objects are wired with the updated values. However, you do not need to update the values of already created objects. Simply look up the most current value when wiring an object.

2.2.1.4. Thread safety

Make sure your container implementation is thread safe in the following cases. Consider the case where multiple threads call getObject to obtain a singleton object that has not been initialized yet. Furthermore, consider that a property value is injected multiple times into the same object: make sure that concurrent modifications of the property values do not lead to inconsistent property values within an object.
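For the singleton case, one common approach (a sketch, not the only valid design; SingletonCache and its method names are made up for illustration) relies on ConcurrentHashMap:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

class SingletonCache {

    private final Map<Class<?>, Object> singletons = new ConcurrentHashMap<>();

    // computeIfAbsent runs the factory at most once per key and blocks
    // concurrent callers until the value is present, so two threads
    // requesting the same uninitialized singleton always get the same
    // instance without any explicit locking on our side.
    @SuppressWarnings("unchecked")
    <T> T getSingleton(Class<T> type, Supplier<T> factory) {
        return (T) singletons.computeIfAbsent(type, t -> factory.get());
    }
}
```

For the property injection case, guarding the read-all-values-and-wire step with a lock (or synchronizing on the Properties object) prevents a writer from updating values halfway through wiring a single object.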

To this stage, execute the following tests to verify your implementation: DependencyInjectionTest, HierarchyTest, InitializeTest, ObjectContainerGradingTest, ObjectContainerSingletonTest & PropertyInjectionTest.

2.2.2. Bytecode Instrumentation

Bytecode instrumentation refers to the act of modifying an already compiled program. It is a useful tool to dynamically inject behavior into compiled code, making it particularly useful for application frameworks. In this task you will implement container managed named locks.

Suppose you want to use a lock across multiple objects. This would typically involve either passing a lock instance around, or having a global table of locks and their names. For example:

@Component
class BarClass {

   public void bar() {
      Lock lock = LockManager.getInstance().getLock("myLock"); // returns a specific lock managed by LockManager
      lock.lock(); // acquire lock
      // do critical stuff that is mutually exclusive with 'FooClass.foo()'
      lock.unlock(); // release lock
   }
}

@Component
class FooClass {

   public int foo() {
      Lock lock = LockManager.getInstance().getLock("myLock"); // returns a specific lock managed by LockManager
      lock.lock(); // acquire lock
      try {
         // do critical stuff that is mutually exclusive with 'BarClass.bar()'
         return 42;
      } finally {
         lock.unlock(); // release lock
      }
   }
}

The functionality you will implement in this task facilitates writing the above code (or equivalent variants without try/finally) as:

@Component
class BarClass {

   @Lock("myLock")
   public void bar() {
      // do critical stuff that is mutually exclusive with 'FooClass.foo()'
   }
}

@Component
class FooClass {

   @Lock("myLock")
   public int foo() {
      // do critical stuff that is mutually exclusive with 'BarClass.bar()'
      return 42;
   }
}

To that end you need two things: the LockManager (which should be straightforward and can be implemented however you like), and a ClassFileTransformer that dynamically instruments all methods annotated with @Lock with the locking procedure shown in the first example. The specific goal is to modify the bytecode of these methods using the Javassist library (the required dependency is already part of the template project). Modify only classes that are annotated with our @Component annotation. Make sure that locks with the same name can be re-acquired by the same thread, which is necessary for nested calls using the same lock (simply use the appropriate reentrant lock type that Java already provides), and make sure locks are released correctly even if an exception occurs during method execution.
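The transformer skeleton below shows the general shape of such an agent. It is a sketch under several assumptions: the @Component and @Lock annotations are redeclared locally as stand-ins for the template's real types, and the package dst.ass2.ioc.lock.LockManager in the generated source snippet is a placeholder for wherever your LockManager actually lives:

```java
import java.io.ByteArrayInputStream;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.Instrumentation;
import java.security.ProtectionDomain;

import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtMethod;

// Stand-ins for the template's annotations.
@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.TYPE)
@interface Component { }

@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.METHOD)
@interface Lock { String value(); }

class LockAgent implements ClassFileTransformer {

    // Referenced by the Premain-Class manifest entry; the JVM calls this
    // before main() when started with -javaagent:...
    public static void premain(String args, Instrumentation inst) {
        inst.addTransformer(new LockAgent());
    }

    // Source snippet Javassist compiles into the method; LockManager is
    // assumed to hand out one ReentrantLock per name.
    static String lockCode(String lockName, String op) {
        return "dst.ass2.ioc.lock.LockManager.getInstance()"
                + ".getLock(\"" + lockName + "\")." + op + "();";
    }

    @Override
    public byte[] transform(ClassLoader loader, String className, Class<?> redefined,
                            ProtectionDomain domain, byte[] classfile) {
        try {
            CtClass ct = ClassPool.getDefault().makeClass(new ByteArrayInputStream(classfile));
            if (!ct.hasAnnotation(Component.class)) {
                return null; // returning null keeps the original bytecode
            }
            boolean changed = false;
            for (CtMethod m : ct.getDeclaredMethods()) {
                Lock lock = (Lock) m.getAnnotation(Lock.class);
                if (lock == null) continue;
                m.insertBefore(lockCode(lock.value(), "lock"));
                // asFinally = true wraps the body so unlock() also runs on exceptions
                m.insertAfter(lockCode(lock.value(), "unlock"), true);
                changed = true;
            }
            byte[] out = changed ? ct.toBytecode() : null;
            ct.detach();
            return out;
        } catch (Exception e) {
            e.printStackTrace(); // a transformer must never throw
            return null;
        }
    }
}
```

The asFinally flag of insertAfter is what makes the instrumented method equivalent to the explicit try/finally variant from the first example.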

Make yourself familiar with the java.lang.instrument package description and study the Javassist tutorial. Also, take a closer look at the pom.xml files in the root directory of the template, particularly the javaagent:... argLine and the Premain-Class manifest entry. To build the agent JAR run mvn package -DskipTests=true -Pass2-ioc. In your IDE, when you run the tests, make sure to explicitly activate the “ass2-ioc” profile, which is configured to enable the agent when running unit tests.

Note

During the discussion sessions, you should be able to explain the end-to-end process of bytecode injection in detail.

Hint

Make sure that you stick to the Javassist library. We had reports in the past that using the Java Reflection package can lead to Surefire Maven Plugin exceptions during testing.

To this stage, verify your code via the LockingTest.

2.3. Aspect Oriented Programming

Another important part of modern application servers is the dynamic loading and deployment of plugins or applications. This feature is also a nice demonstration for using reflection and class loading at runtime, so we will again implement a (simplified) custom solution of our own. Put the code for this task into the ass2-aop module.

The package dst.ass2.aop.sample in the template contains a couple of simple plugin examples for testing. You may optionally add your own plugin classes to test your solution more thoroughly.

Hint

Make sure to use Java 11!

2.3.1. Plugin Executor

Implement the IPluginExecutor interface provided in the template. To allow the test framework to create instances of this interface, implement the createPluginExecutor method in PluginExecutorFactory. IPluginExecutor is the main component responsible for executing plugins. It has to monitor (i.e., repeatedly list the contents of) several directories to detect whether new jar files were copied into them or existing jar files were modified.

In addition to monitoring for changes, also check the directories once when initializing your application. Monitoring of sub-directories is not required. In case you choose to employ the Java WatchService, be aware that on file creation the service (depending on the OS) might fire two events (ENTRY_MODIFY and ENTRY_CREATE) for the same file; make sure that you only start such a plugin once.
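A simple alternative to the WatchService is timestamp-based polling, which sidesteps the duplicate-event problem entirely because a file is only reported when its last-modified time actually differs from the previous snapshot. The helper below (ChangeDetector is a made-up name) is the pure core of such a poller; feeding it snapshots built from File.listFiles() and File.lastModified() in a scheduled task yields directory monitoring:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

class ChangeDetector {

    // Compares the previous snapshot (file name -> last-modified timestamp)
    // against the current directory listing and returns all files that are
    // new or have been modified since the last poll.
    static List<String> changed(Map<String, Long> previous, Map<String, Long> current) {
        List<String> result = new ArrayList<>();
        for (Map.Entry<String, Long> e : current.entrySet()) {
            Long old = previous.get(e.getKey());
            if (old == null || !old.equals(e.getValue())) {
                result.add(e.getKey());
            }
        }
        return result;
    }
}
```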

The executor then scans each detected jar file for classes that implement the IPluginExecutable interface. For every plugin executable found, the executor calls its execute method on a separate thread; use a thread pool for this.

Take care of class loading: the concurrent execution of different plugins containing classes with identical fully qualified names must not cause any problems.
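The standard way to achieve this isolation is one URLClassLoader per jar. PluginLoader below is a hypothetical helper illustrating the idea; in the real executor you would additionally enumerate the jar's entries (e.g. via JarFile) to find the IPluginExecutable implementations:

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

class PluginLoader {

    // Loads a class from a plugin jar through a loader dedicated to that jar.
    // Because every jar gets its own URLClassLoader, classes with identical
    // fully qualified names in different plugin jars never clash. The parent
    // is the application class loader, so shared interfaces such as
    // IPluginExecutable resolve to the same Class object everywhere.
    static Class<?> loadPluginClass(File jar, String className) throws Exception {
        URLClassLoader loader = new URLClassLoader(
                new URL[]{ jar.toURI().toURL() },
                PluginLoader.class.getClassLoader());
        // Remember to call loader.close() once the plugin has finished,
        // to release the handle on the jar file.
        return loader.loadClass(className);
    }
}
```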

Also make sure to free all acquired resources after the execution of a plugin has been completed. The second method in the interface (IPluginExecutable.interrupted()) will be important later on, you can leave this method body empty for now.

If you detect that a plugin .jar file has changed and the plugin is currently still executing (the code in the previous version of the .jar file), you do not need to terminate the existing plugin instance (i.e., you can simply start a new version of the plugin with the updated code, and let the old one terminate normally).

Verify your implementation with the PluginExecutorTest.

2.3.2. Logging Plugin Executions

Now we want to implement some (decoupled) logging facilities for our plugin executor framework. To this end, we will make use of Aspect-Oriented Programming (AOP) with AspectJ. The required AspectJ dependencies are already part of the Maven template. The tests in the template have been configured to use run-time weaving (i.e., objects are instantiated first, and the aspects are then dynamically woven into the underlying class definitions of these objects). One alternative would be load-time weaving (i.e., using the Java agent mechanism to weave aspects into classes as they are loaded), which we do not use for this assignment.

Before you get started, familiarize yourself with the concepts of aspects, advices, joinpoints and pointcuts. Then, refer to the AspectJ Development Kit Developer’s Notebook for support on how these concepts are implemented in AspectJ. Use the annotation-based development style to define your aspects.

Your first task is to write a simple logging aspect for plugins. Essentially, the aspect should write a single line of logging output before a plugin starts to execute, and another after the plugin has finished. The bare class definition of LoggingAspect is already included in the template; add the required methods and annotations to this class. The log message can be very brief, but it needs to contain the actual class name of the plugin:

[java] Plugin dst2.dynload.sample.PluginExecutable started to execute
[java] Plugin dst2.dynload.sample.PluginExecutable is finished

In some cases, users of the plugin framework might want to disable logging for some plugins. The template defines a method annotation Invisible. Whenever an IPluginExecutable.execute() is annotated as invisible, its execution should not be logged. Make sure that this condition is already considered in the pointcut definition of your logging advice (i.e., you should not match just any plugin method and filter out invisible plugins in your Java code).
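In annotation-style AspectJ, the exclusion can be expressed directly with a !@annotation designator. The sketch below assumes hypothetical package names (dst.ass2.aop.IPluginExecutable, dst.ass2.aop.logging.Invisible); substitute the template's actual fully qualified names. It also logs straight to System.out, so reusing a plugin-defined logger would still have to be added in the advice bodies:

```java
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.After;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.aspectj.lang.annotation.Pointcut;

@Aspect
class LoggingAspectSketch {

    // Matches execute() on any IPluginExecutable implementation, excluding
    // methods annotated with @Invisible -- the filtering happens inside the
    // pointcut, not in the advice body.
    @Pointcut("execution(void dst.ass2.aop.IPluginExecutable+.execute())"
            + " && !@annotation(dst.ass2.aop.logging.Invisible)")
    public void visibleExecution() { }

    @Before("visibleExecution()")
    public void logStart(JoinPoint jp) {
        System.out.println("Plugin " + jp.getTarget().getClass().getName() + " started to execute");
    }

    @After("visibleExecution()")
    public void logEnd(JoinPoint jp) {
        System.out.println("Plugin " + jp.getTarget().getClass().getName() + " is finished");
    }
}
```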

Additionally, your logging aspect should reuse the logger of the plugin, if the plugin defines one. That is, if the plugin has a member field whose type is java.util.logging.Logger (or a subclass thereof), your log statements should be written to that logger. If no such logger is defined, use System.out. Configure the logging system to print all log messages of level INFO or higher.

Verify your implementation with the LoggingPluginTest.

2.3.3. Plugin Performance Management

Plugin frameworks like the one we are implementing often need some way to influence the execution of the managed plugins. Hence, we now implement some means to interrupt plugins whose execution takes too long.

The template already defines a method annotation Timeout, which has one parameter of type Long. Users can use this annotation on IPluginExecutable.execute() methods to define the “normal” maximum execution time (in milliseconds) of their plugins. Then, annotate and implement the aspect class ManagementAspect (see template) to enforce this defined maximum execution time. Have the aspect hook into each invocation of IPluginExecutable.execute() to keep track of the currently running plugins and their start time. From the start time you can then derive the current execution duration, e.g., by polling in regular intervals or using a timer task. If a plugin is detected that takes longer than its maximum defined time, call this plugin’s interrupted() method. You do not need to take any further action (i.e., we can assume that the developer of the plugin actually terminates the plugin if this callback is invoked). However, keep in mind that the interrupted() method can itself take some time to execute, so do not block while the client is cleaning up. If no timeout is defined for a plugin, you can assume that the plugin can run for as long as it needs to.
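One way to realize the timeout handling inside the ManagementAspect is a small watchdog built on a ScheduledExecutorService. This is a sketch under assumptions, not the required design (polling a table of start times works equally well); PluginWatchdog and its method names are made up for illustration:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

class PluginWatchdog {

    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final ExecutorService callbackPool = Executors.newCachedThreadPool();

    // Schedules the timeout callback (e.g. plugin.interrupted()) for a plugin
    // that just started. The callback runs on a separate pool thread, so a
    // slow cleanup inside interrupted() cannot block the watchdog itself.
    // The aspect would cancel the returned future when the plugin finishes
    // within its time budget, and skip scheduling entirely when no @Timeout
    // is present.
    ScheduledFuture<?> watch(long timeoutMillis, Runnable onTimeout) {
        return scheduler.schedule(() -> callbackPool.submit(onTimeout),
                timeoutMillis, TimeUnit.MILLISECONDS);
    }
}
```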

Verify your implementation with the PerformancePluginTest.