
Friday, June 5, 2020

Auditing in MySQL

One of the important requirements in many RDBMS-based workloads is an audit log: every row in every table is stamped with who changed it and when it was changed. Doing this in every place where a table is modified is extremely cumbersome. In this blog post we look at how we can enable the Spring Framework auditing support to perform this activity for us.

Before we do that, we need to upgrade the versions of our dependencies, since I have been using this project for quite some time. I also decided to use the Lombok logging annotation and removed all the dependencies on log4j.

Here are the modifications to the pom.xml dependencies.
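The key addition looks roughly like the sketch below, assuming the standard Lombok coordinates; the log4j entries are simply removed, and logging flows through Spring Boot's default logging starter.

<!-- Lombok provides the @Slf4j logging annotation, replacing the direct log4j dependencies -->
<dependency>
  <groupId>org.projectlombok</groupId>
  <artifactId>lombok</artifactId>
  <scope>provided</scope>
</dependency>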


Now we look at our User entity. We need to add auditing fields to this table. Since a realistic project would have multiple entities, we create an abstract entity base class with all the audit fields. We also move the primary key to the abstract base class. When we do that, Hibernate provides multiple ways of mapping the entities to the database schema. We want all the parent attributes to be stored in a single table together with the child class attributes. To accomplish that, we add @MappedSuperclass to the base class.
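A sketch of such a base class is below; the class and field names are illustrative, not necessarily the ones used in the repository.

import java.time.Instant;
import javax.persistence.*;
import lombok.Getter;
import lombok.Setter;
import org.springframework.data.annotation.CreatedBy;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.LastModifiedBy;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.jpa.domain.support.AuditingEntityListener;

@MappedSuperclass
@EntityListeners(AuditingEntityListener.class) // hooks the entity into Spring Data auditing
@Getter
@Setter
public abstract class AbstractBaseEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    @CreatedBy
    private Long createdBy;           // userId of the creator

    @CreatedDate
    private Instant createdDate;      // timestamp of creation

    @LastModifiedBy
    private Long lastModifiedBy;      // userId of the last modifier

    @LastModifiedDate
    private Instant lastModifiedDate; // timestamp of the last modification
}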

As we can see, we have added five attributes in the base class: one is the id for all our entities, and the remaining four are used for auditing purposes.
At this time we also add another layer to our code. Currently all the endpoints call the repository layer directly, which becomes a problem when we want to write functions that can be reused across different endpoints. An example is the retrieveUser method, which takes an argument that could be either a username or an email. Currently this method lives in the endpoint layer as a private method. Since it is useful in many different contexts, we create a new UserService layer and move the method there.
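A rough sketch of the new service; the repository finder names here are assumptions.

import lombok.RequiredArgsConstructor;
import org.springframework.stereotype.Service;

@Service
@RequiredArgsConstructor
public class UserService {

    private final UserRepository userRepository;

    // Resolves a user from a string that may be either a username or an email.
    public User retrieveUser(String usernameOrEmail) {
        User user = userRepository.findByUsername(usernameOrEmail);
        if (user == null) {
            user = userRepository.findByEmail(usernameOrEmail);
        }
        return user;
    }
}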


Now, let's get to the original task of enabling auditing. First we define an auditing config as below.
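A sketch of that config; it assumes the currently logged-in user is available from a ThreadLocal-backed helper, called UserContext here purely for illustration.

import java.util.Optional;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.domain.AuditorAware;
import org.springframework.data.jpa.repository.config.EnableJpaAuditing;

@Configuration
@EnableJpaAuditing
public class EntityAuditConfig {

    // Supplies the userId of the currently logged-in user for @CreatedBy / @LastModifiedBy.
    @Bean
    public AuditorAware<Long> auditorAware() {
        return () -> Optional.ofNullable(UserContext.getCurrentUser()) // ThreadLocal populated by the security filter
                             .map(User::getId);
    }
}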

We had earlier defined a ThreadLocal that is used by the AuditorAware bean defined above to extract the currently logged-in user and return its userId. As we can see, the audit fields in the AbstractBaseEntity expect a Long for the @CreatedBy and @LastModifiedBy fields. The EntityAuditConfig also carries the @EnableJpaAuditing annotation, which is required.
At this point we also add a new endpoint called ProfileEndpoint, which can be used to manage the entity that represents a user profile. This entity currently only contains a URL.
Now if we perform any operation on any of the endpoints, we will see the auditing fields automatically populated. Give it a spin. It is a life saver in many production applications. I have had situations where users changed their passwords, forgot them, and then complained that they had been hacked.
The complete code for this and previous posts can be found at my github repository. The changes for this tutorial are under v1.5.

Friday, April 12, 2019

Spring and more Kafka

In the last post, we saw how to integrate Kafka with a Spring Boot application. That post was a very simple implementation of Kafka. The real world is much more complex: you have to deal with multiple topics, and you need multiple partitions. In this post, we explore more details of a Spring Boot application with Kafka.
Let's start with the fact that we are now dealing with multiple topics. The very first thing we need to do is to separate out the properties for each topic.
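For example, application.properties can be split along these lines; the property keys and topic names below are illustrative.

# shared broker address
kafka.bootstrap.address=localhost:9092

# topic specific settings
kafka.first.topic.name=FirstTopic
kafka.first.topic.group.id=first-topic-group
kafka.second.topic.name=SecondTopic
kafka.second.topic.group.id=second-topic-group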

We can no longer use the default KafkaTemplate; we will have to create our own. The best way is to define a Kafka producer, and to create one we need a KafkaProducerConfig. Here are two KafkaProducerConfig classes, one for each topic.
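A sketch of the config for the first topic follows; the second one is identical apart from the bean name. The bean names and property keys are assumptions, not necessarily those used in the repository.

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class FirstTopicKafkaProducerConfig {

    @Value("${kafka.bootstrap.address}")
    private String bootstrapAddress;

    private ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(props);
    }

    // Named bean so callers can @Qualifier the template for FirstTopic.
    @Bean(name = "firstTopicKafkaTemplate")
    public KafkaTemplate<String, String> firstTopicKafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}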


Looking at the producer configs, we can observe that we create two beans which are qualified by name and return an appropriate KafkaTemplate, which is later used to send the message. Similar to the producer config, we need to create a consumer config. The consumer config is used by the listener to listen for messages.
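Again a sketch for the first topic; the second config differs only in the group id and bean name.

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
@EnableKafka
public class FirstTopicKafkaConsumerConfig {

    @Value("${kafka.bootstrap.address}")
    private String bootstrapAddress;

    private ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "first-topic-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    // The listener refers to this factory by name in its @KafkaListener annotation.
    @Bean(name = "firstTopicListenerContainerFactory")
    public ConcurrentKafkaListenerContainerFactory<String, String> firstTopicListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}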


As we can see, in each of the consumer configs we create a bean that returns a ConcurrentKafkaListenerContainerFactory. The beans are qualified by name so that we can use the appropriate container factory for receiving messages.
We also modify MyTopicMessage and add a member variable, topicName, that helps us identify the topic to which the message needs to be sent.
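The message class then looks roughly like this:

public class MyTopicMessage {

    private String message;
    private String topicName; // name of the topic the message should be published to

    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }
    public String getTopicName() { return topicName; }
    public void setTopicName(String topicName) { this.topicName = topicName; }
}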

We also modify the endpoint so that the message can be sent to the appropriate topic.
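A sketch of the endpoint: the two templates are injected by their bean names, and the topicName from the request decides which one is used. The class name is an assumption, and the token-based security shown in the curl calls below is handled by the existing security filter and omitted here.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class KafkaMessageEndpoint {

    @Autowired
    @Qualifier("firstTopicKafkaTemplate")
    private KafkaTemplate<String, String> firstTopicKafkaTemplate;

    @Autowired
    @Qualifier("secondTopicKafkaTemplate")
    private KafkaTemplate<String, String> secondTopicKafkaTemplate;

    @PostMapping("/send")
    public String send(@RequestBody MyTopicMessage myTopicMessage) {
        // Route the message to the topic named in the request.
        if ("FirstTopic".equals(myTopicMessage.getTopicName())) {
            firstTopicKafkaTemplate.send("FirstTopic", myTopicMessage.getMessage());
        } else {
            secondTopicKafkaTemplate.send("SecondTopic", myTopicMessage.getMessage());
        }
        return "Success!";
    }
}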

Now we modify the listener to tie everything together so that the appropriate message can be received. Looking at the @KafkaListener annotation, we pass the listener container factory as an argument in order to receive messages from a particular topic.
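A sketch of the listener; the method names are assumptions, and the partition header is included only to match the log output shown below.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.messaging.handler.annotation.Payload;
import org.springframework.stereotype.Component;

@Component
public class KafkaMessageListener {

    private static final Logger log = LoggerFactory.getLogger(KafkaMessageListener.class);

    @KafkaListener(topics = "FirstTopic", containerFactory = "firstTopicListenerContainerFactory")
    public void listenFirstTopic(@Payload String message,
                                 @Header(KafkaHeaders.RECEIVED_PARTITION_ID) int partition) {
        log.info("Received FirstTopic message for partition {} {}", partition, message);
    }

    @KafkaListener(topics = "SecondTopic", containerFactory = "secondTopicListenerContainerFactory")
    public void listenSecondTopic(@Payload String message,
                                  @Header(KafkaHeaders.RECEIVED_PARTITION_ID) int partition) {
        log.info("Received SecondTopic message for partition {} {}", partition, message);
    }
}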
Now we can test the server. The flow of the service is as below.

  1. The message is posted to the endpoint as a POST request
  2. The @RestController receives the message and, based on the topicName in the request, sends the message to the topic with the same name.
  3. The listener receives the message from the appropriate topic.
$ curl -X POST \
>   'http://localhost:8081/send?token=3193fa24-a0ba-451b-83ff-eb563c3fd43b-cdf12811-7e41-474b-8fa6-e8fefd4a738c' \
>   -H 'Content-Type: application/json' \
>   -H 'Postman-Token: 15fbe075-9c80-4af9-a797-6b5e0979fd1b' \
>   -H 'cache-control: no-cache' \
>   -H 'token: 3193fa24-a0ba-451b-83ff-eb563c3fd43b-cdf12811-7e41-474b-8fa6-e8fefd4a738c' \
>   -d '{
> "message" : "This is my message!",
> "topicName" : "FirstTopic"
> }'
Success!

The message receipt is indicated in the server log.
2019-04-12 14:50:47.142  INFO 51396 --- [ntainer#0-0-C-1] i.s.b.t.listeners.KafkaMessageListener   : Received FirstTopic message for partition 0 This is my message!
Similarly, we can send a message to SecondTopic.
$ curl -X POST   'http://localhost:8081/send?token=3193fa24-a0ba-451b-83ff-eb563c3fd43b-cdf12811-7e41-474b-8fa6-e8fefd4a738c'   -H 'Content-Type: application/json'   -H 'Postman-Token: 15fbe075-9c80-4af9-a797-6b5e0979fd1b'   -H 'cache-control: no-cache'   -H 'token: 3193fa24-a0ba-451b-83ff-eb563c3fd43b-cdf12811-7e41-474b-8fa6-e8fefd4a738c'   -d '{
"message" : "This is another message!",
"topicName" : "SecondTopic"
}'
Success!

The message receipt is indicated in the server log.
2019-04-12 14:53:14.972  INFO 51396 --- [ntainer#1-0-C-1] i.s.b.t.listeners.KafkaMessageListener   : Received SecondTopic message for partition 0 This is another message!
The complete code base for this tutorial can be found at my github repository at v1.4.

Wednesday, April 10, 2019

Kafka and Spring

Kafka has become a very popular platform and is being used as a stream, a journal, and even an eventing system. In this post, we explore how to integrate Kafka with a Spring Framework application. First, we add the Kafka bootstrap server details to the application.properties file.
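One way to do this is with Spring Boot's standard property, assuming a local broker on the default port:

# Kafka broker used by the auto-configured KafkaTemplate and listener containers
spring.kafka.bootstrap-servers=localhost:9092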

Let's also add the required dependencies to pom.xml.
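The key addition is the spring-kafka dependency; its version is managed by the Spring Boot dependency management, so no explicit version is needed.

<dependency>
  <groupId>org.springframework.kafka</groupId>
  <artifactId>spring-kafka</artifactId>
</dependency>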

Now, for each Kafka topic, we create a listener class. The listener class provides a callback method that is called whenever a message arrives on that topic.
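A sketch of such a listener; the topic name and group id here are illustrative.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class MyTopicKafkaListener {

    private static final Logger log = LoggerFactory.getLogger(MyTopicKafkaListener.class);

    // Callback invoked by the listener container whenever a message arrives on the topic.
    @KafkaListener(topics = "MyTopic", groupId = "my-group-id")
    public void listen(String message) {
        log.info("Received message {}", message);
    }
}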

Now we create an endpoint through which we can inject a message into the topic. The message is sent to the topic and is picked up by the listener.

We autowire a KafkaTemplate instance that is used to send the message to the topic.
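Put together, the endpoint can look like the sketch below; the /send path matches the curl call that follows, while the class name and topic name are assumptions.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class KafkaMessageEndpoint {

    // Auto-configured by Spring Boot from the spring.kafka.* properties.
    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @PostMapping("/send")
    public String send(@RequestBody MyTopicMessage myTopicMessage) {
        kafkaTemplate.send("MyTopic", myTopicMessage.getMessage());
        return "Message sent successfully!.";
    }
}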

$ curl -X POST \
>   'http://localhost:8081/send?token=3193fa24-a0ba-451b-83ff-eb563c3fd43b-cdf12811-7e41-474b-8fa6-e8fefd4a738c' \
>   -H 'Content-Type: application/json' \
>   -H 'Postman-Token: e281e3c5-0dae-4bb7-ac8d-6555f66a18c6' \
>   -H 'cache-control: no-cache' \
>   -H 'token: 3193fa24-a0ba-451b-83ff-eb563c3fd43b-cdf12811-7e41-474b-8fa6-e8fefd4a738c' \
>   -d '{
> "message" : "This is my message!"
> }'
Message sent successfully!.
The receipt of the message is indicated in the Spring server log.

2019-04-10 14:28:03.969  INFO 31091 --- [ntainer#0-0-C-1] i.s.b.t.listeners.MyTopicKafkaListener   : Received Promise message This is my message!

Tuesday, February 12, 2019

12. Adding GIT release information

In the previous post, we saw how to enable actuator endpoints on our Spring server. Once we have done that, it is a good idea to add Git release information to the server so that we can find out, at runtime, exactly which build is currently deployed.
We add the git-commit-id-plugin to our pom.xml.

   <plugin>
    <groupId>pl.project13.maven</groupId>
    <artifactId>git-commit-id-plugin</artifactId>
    <version>2.2.1</version>
   </plugin>
The next thing to do is to create a git.properties file in the resources directory of the project.
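What goes into that file depends on the setup. A minimal sketch, assuming Maven resource filtering replaces the placeholders during the build (alternatively, the plugin can generate the whole file itself via its generateGitPropertiesFile option):

# placeholders filled in from the git-commit-id-plugin build properties
git.branch=${git.branch}
git.commit.id=${git.commit.id.abbrev}
git.commit.time=${git.commit.time}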

The git commit id is filled in by the Maven plugin, so the project needs to be built with Maven.

$ mvn clean package -DskipTests
Now we can run the server and check the /manage/info endpoint using curl command.
$ curl -X GET http://localhost:9091/manage/info
{"git":{"branch":"master","commit":{"id":"6fb94c0","time":1549868235.000000000}}

11. Spring Actuators

Spring provides actuators, a helpful set of tools to debug the application at runtime. Here is how to enable them. We first add the actuator dependency to pom.xml.
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>

Now we define a prefix for all the actuator endpoints by adding the following lines to the application.properties file. This enables all the actuator endpoints; we can enable only specific endpoints by providing a comma-delimited list instead. We also expose the management endpoints on a separate port so that access to them can be blocked at something like an ELB.
management.endpoints.web.base-path=/manage
management.server.port=9091
management.endpoints.web.exposure.include=*
Since we already have a security filter defined, we need to exempt the health and info endpoints from the security check. We add the following URLs in the SecurityConfiguration configure method.
@Override
    public void configure(WebSecurity web) throws Exception {
        web.ignoring().antMatchers("/manage/health");
        web.ignoring().antMatchers("/manage/info");
        web.ignoring().antMatchers("/webjars/**");
        web.ignoring().antMatchers("/error");
        web.ignoring().antMatchers("/swagger-ui.html");
        web.ignoring().antMatchers("/v2/api-docs/**");
        web.ignoring().antMatchers("/swagger-resources/**");
    }
Here we have added paths related to error, actuator, and swagger.
This enables the actuator endpoints for our server. We can now query these endpoints; the following is a sample response.

$ curl -X GET http://localhost:9091/manage/health
{"status":"UP"}