Week-7 Full Stack
Why Spring?
Java programs are complex and feature many heavyweight components. Heavyweight
means the components are dependent on the underlying operating system (OS) for their
appearance and properties. Spring is considered to be a secure, low-cost and flexible
framework. Spring improves coding efficiency and reduces overall application
development time because it is lightweight -- efficient at utilizing system resources -- and
has a lot of support. Spring removes tedious configuration work so that developers can
focus on writing business logic. Spring handles the infrastructure so developers can focus
on the application.
A typical application is organized into three layers:
o Presentation/view layer (UI) - The outermost layer, which handles the presentation of content and interaction with the user.
o Business logic layer - The central layer that deals with the logic of a program.
o Data access layer - The deep layer that deals with data retrieval from sources.
Each layer depends on the layer below it for the application to work. In other words, the presentation layer talks to the business logic layer, which talks to the data access layer.
Dependency is what each layer needs to perform its function. A typical application has
thousands of classes and many dependencies.
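This wiring can be pictured in plain Java. In the sketch below (all class names are invented for illustration), each layer receives the layer below it through its constructor; without a framework, the developer does all of this wiring by hand:

```java
// Illustrative only: three hand-wired layers, each depending on the layer below it.
class DataAccessLayer {
    String fetchGreeting() { return "Hello from the database"; }
}

class BusinessLayer {
    private final DataAccessLayer dao;
    BusinessLayer(DataAccessLayer dao) { this.dao = dao; }               // dependency on the data layer
    String buildGreeting() { return dao.fetchGreeting().toUpperCase(); }
}

class PresentationLayer {
    private final BusinessLayer service;
    PresentationLayer(BusinessLayer service) { this.service = service; } // dependency on the business layer
    String render() { return "<h1>" + service.buildGreeting() + "</h1>"; }
}

class LayersDemo {
    public static void main(String[] args) {
        // Manual wiring: with thousands of classes, this is exactly the
        // work that Spring's container takes over.
        PresentationLayer ui = new PresentationLayer(new BusinessLayer(new DataAccessLayer()));
        System.out.println(ui.render()); // <h1>HELLO FROM THE DATABASE</h1>
    }
}
```

Hand-constructing such chains for thousands of classes is unmanageable, which is the problem Spring's IoC container addresses.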
As shown in the figure, Spring modules are divided into Test, Core Container, AOP, Aspects, Instrumentation, Data Access / Integration, and Web (MVC / Remoting).
The Core Container is made up of four modules: Core, Beans, Context, and Expression
Language, with the following details:
1. The Core module contains the framework’s core components, such as the IoC and Dependency Injection capabilities.
2. The Beans module provides BeanFactory, a sophisticated implementation of the factory pattern for creating and managing beans.
3. The Context module provides a means for accessing any objects established and configured, and it builds on the robust foundation provided by the Core and Beans modules. The ApplicationContext interface is the Context module’s focal point.
4. The Expression Language module (SpEL) is a powerful expression language for querying and modifying an object graph at runtime.
The JDBC, ORM, OXM, JMS, and Transaction modules make up the Data Access/Integration
layer, as shown below.
1. The JDBC module contains a JDBC abstraction layer that eliminates the need for time-
consuming JDBC code.
2. JPA, JDO, Hibernate, and iBatis are just a few of the prominent object-relational mapping
APIs supported by the ORM module.
3. For JAXB, Castor, XMLBeans, JiBX, and XStream, the OXM module provides an
abstraction layer that supports Object/XML mapping implementations.
4. The JMS (Java Message Service) module offers facilities for sending and receiving messages.
5. For classes that implement special interfaces and all of your POJOs, the Transaction
module offers programmatic and declarative transaction management.
Web Layer:
The Web layer is made up of the Web, Web-MVC, Web-Socket, and Web-Portlet modules:
1. The Web module includes basic web-oriented integration features, including multipart file upload and IoC container setup via servlet listeners, and a web-oriented application environment.
2. The Web-Portlet module mirrors the functionality of the Web-Servlet module and provides an MVC implementation for use in a portlet context.
The remaining modules stand alone:
o The Aspects module integrates with AspectJ, a sophisticated and well-established AOP framework.
o The Instrumentation module includes support for class instrumentation and classloader implementations for use in specific application servers.
o Spring components can be tested with the JUnit or TestNG frameworks using the Test module.
Spring Boot is a project that is built on the top of the Spring Framework. It provides an easier
and faster way to set up, configure, and run both simple and web-based applications.
It is a Spring module that provides the RAD (Rapid Application Development) feature to the
Spring Framework. It is used to create a stand-alone Spring-based application that you can
just run because it needs minimal Spring configuration.
In short, Spring Boot is the combination of Spring Framework and Embedded Servers.
We can use Spring STS IDE or Spring Initializer to develop Spring Boot Java applications.
Along with the Spring Boot Framework, many other Spring sister projects help to build applications addressing modern business needs. The Spring sister projects are as follows:
o Spring Data: It simplifies data access from the relational and NoSQL databases.
o Spring Batch: It provides powerful batch processing.
o Spring Security: It is a security framework that provides robust security to
applications.
o Spring Social: It supports integration with social networking like LinkedIn.
o Spring Integration: It is an implementation of Enterprise Integration Patterns. It
facilitates integration with other enterprise applications using lightweight messaging
and declarative adapters.
Spring Boot offers the following features:
o It creates stand-alone Spring applications that can be started using java -jar.
o It tests web applications easily with the help of different embedded HTTP servers such as Tomcat, Jetty, etc. We don't need to deploy WAR files.
o It provides opinionated 'starter' POMs to simplify our Maven configuration.
o It provides production-ready features such as metrics, health checks, and externalized configuration.
o There is no requirement for XML configuration.
o It offers a CLI tool for developing and testing Spring Boot applications.
o It offers a number of plug-ins.
o It minimizes boilerplate code (code that has to be included in many places with little or no alteration), XML configuration, and annotations.
o It increases productivity and reduces development time.
The main goal of Spring Boot is to reduce development, unit test, and integration test time. By providing the features above, the Spring Boot Framework reduces development time and developer effort, and increases productivity.
To create a Spring Boot application, the following are the prerequisites. In this tutorial, we will use the Spring Tool Suite (STS) IDE.
o Java 1.8
o Maven 3.0+
o Spring Framework 5.0.0.BUILD-SNAPSHOT
o An IDE (Eclipse or Spring Tool Suite) is recommended.
o Spring Framework is a widely used Java EE framework for building applications, whereas Spring Boot is widely used to develop REST APIs.
o Spring aims to simplify Java EE development and make developers more productive, whereas Spring Boot aims to shorten the code length and provide the easiest way to develop web applications.
o The primary feature of the Spring Framework is dependency injection, whereas the primary feature of Spring Boot is autoconfiguration: it automatically configures the classes based on the requirement.
o Spring helps to make things simpler by allowing us to develop loosely coupled applications, whereas Spring Boot helps to create stand-alone applications with less configuration.
o To test a Spring project, we need to set up the server explicitly, whereas Spring Boot offers embedded servers such as Jetty and Tomcat.
o Spring does not provide support for an in-memory database, whereas Spring Boot offers several plugins for working with embedded and in-memory databases such as H2.
o In a Spring project, developers manually define the dependencies in pom.xml, whereas Spring Boot comes with the concept of starters in the pom.xml file that internally take care of downloading the dependency JARs.
References
1. Spring Framework, Rod Johnson, Juergen Hoeller, Keith Donald
2. https://spring.io/projects/spring-framework
3. https://docs.spring.io/spring-framework/docs/3.1.x/spring-framework-reference/html/
4. https://www.geeksforgeeks.org/introduction-to-spring-framework/
5. https://www.javatpoint.com/
3. @ComponentScan: It is used when we want to scan a package for beans. It is used with
the annotation @Configuration. We can also specify the base packages to scan for Spring
Components.
4. @Service: It is also used at the class level. It tells Spring that the class contains the business logic.
9. @Required: It applies to the bean setter method. It indicates that the annotated bean must be populated at configuration time with the required property; otherwise, it throws a BeanInitializationException.
@GetMapping: It maps HTTP GET requests onto the specific handler method. It is used to create a web service endpoint that fetches a resource. It is used instead of @RequestMapping(method = RequestMethod.GET).
@PostMapping: It maps HTTP POST requests onto the specific handler method. It is used to create a web service endpoint that creates a resource. It is used instead of @RequestMapping(method = RequestMethod.POST).
@PutMapping: It maps HTTP PUT requests onto the specific handler method. It is used to create a web service endpoint that creates or updates a resource. It is used instead of @RequestMapping(method = RequestMethod.PUT).
@DeleteMapping: It maps the HTTP DELETE requests on the specific handler
method. It is used to create a web service endpoint that deletes a resource. It is used
instead of using: @RequestMapping(method = RequestMethod.DELETE)
@PatchMapping: It maps the HTTP PATCH requests on the specific handler
method. It is used instead of using: @RequestMapping(method =
RequestMethod.PATCH)
@RequestBody: It is used to bind HTTP request with an object in a method parameter.
Internally it uses HTTP MessageConverters to convert the body of the request. When
we annotate a method parameter with @RequestBody, the Spring framework binds the
incoming HTTP request body to that parameter.
@ResponseBody: It binds the method return value to the response body. It tells the Spring Boot Framework to serialize the returned object into JSON or XML format.
@PathVariable: It is used to extract the values from the URI. It is most suitable for
the RESTful web service, where the URL contains a path variable. We can define
multiple @PathVariable in a method.
@RequestParam: It is used to extract the query parameters from the URL. These are also known as query parameters. It is most suitable for web applications. It can specify a default value to use when the query parameter is not present in the URL.
@RequestHeader: It is used to get the details of the HTTP request headers. We use this annotation as a method parameter. The optional elements of the annotation are name, required, value, and defaultValue. For each detail in the header, we should specify a separate annotation. We can use it multiple times in a method.
@RestController: It can be considered as a combination
of @Controller and @ResponseBody annotations. The @RestController annotation
is itself annotated with the @ResponseBody annotation. It eliminates the need for
annotating each method with @ResponseBody.
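A sketch of how several of these annotations combine in one controller (the class, the paths, and the Book type are invented for illustration, and the method bodies are elided):

```java
// Hypothetical controller sketch; not taken from the source material.
@RestController                  // = @Controller + @ResponseBody
@RequestMapping("/api/books")
public class BookController {

    @GetMapping("/{id}")         // instead of @RequestMapping(method = RequestMethod.GET)
    public Book get(@PathVariable Long id) { ... }          // id extracted from the URI

    @GetMapping
    public List<Book> search(@RequestParam(defaultValue = "") String title) { ... } // ?title=...

    @PostMapping                 // instead of @RequestMapping(method = RequestMethod.POST)
    public Book create(@RequestBody Book book) { ... }      // JSON body bound to a Book

    @DeleteMapping("/{id}")      // instead of @RequestMapping(method = RequestMethod.DELETE)
    public void delete(@PathVariable Long id) { ... }
}
```

Because the class is annotated with @RestController, every return value is serialized straight into the HTTP response body.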
Spring Initializer is a web-based tool provided by the Pivotal Web Service. With the help
of Spring Initializer, we can easily generate the structure of the Spring Boot Project. It offers
extensible API for creating JVM-based projects.
It also provides various options for the project that are expressed in a metadata model. The metadata model allows us to configure the list of dependencies supported by JVM and platform versions, etc. It serves its metadata in a well-known format that provides necessary assistance to third-party clients.
Generating a Project
Before creating a project, we must be familiar with the UI. The Spring Initializr UI has the following labels:
o Project: It defines the kind of project. We can create either Maven Project or Gradle
Project. We will create a Maven Project throughout the tutorial.
o Language: Spring Initializer provides the choice among three languages Java,
Kotlin, and Groovy. Java is by default selected.
o Spring Boot: We can select the Spring Boot version. The latest version is 2.2.2.
o Project Metadata: It contains information related to the project, such as Group, Artifact, etc.
Group denotes the package name; Artifact denotes the Application name. The default Group
name is com.example, and the default Artifact name is demo.
o Dependencies: Dependencies are the collection of artifacts that we can add to our project.
There is a Generate button. When we click on the button, it starts packaging the project, configured for the Jar or War packaging you have selected, and downloads it.
Step 1: Open Spring Initializr (https://start.spring.io).
Step 2: Provide the Group and Artifact name. We have provided the Group name com.javatpoint and the Artifact spring-boot-example.
When we click on the Generate button, it starts packaging the project into a .zip file and downloads the project.
File -> Import -> Existing Maven Project -> Next -> Browse -> Select the project -> Finish
It takes some time to import the project. When the project imports successfully, we can see the
project directory in the Package Explorer. The following image shows the project directory:
Dependency Injection:
Conventionally, developers have control over the code: they create the objects and inject them at run time. With Spring, the framework takes control of these activities at run time, which is why the term is called 'Inversion of Control' (IoC), i.e., the control is inverted. A dependency in programming is an approach where a class uses specific functionality of another class.
For example, consider two classes A and B. If class A uses functionality of class B, then class A has a dependency on class B. In Java, you must create an instance of class B before its objects can be used by class A.
So, the process of creating an object of some other class and letting the dependent class use it directly is called Dependency Injection.
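The idea can be sketched in plain Java (class names are invented for illustration): instead of Notifier constructing its own MessageService, the dependency is created outside and handed in:

```java
// Illustrative only: constructor-based dependency injection without any framework.
interface MessageService {
    String getMessage();
}

class EmailService implements MessageService {
    public String getMessage() { return "email sent"; }
}

class Notifier {
    private final MessageService service;
    // The dependency is injected from outside rather than created here
    // with `new EmailService()`, so Notifier stays loosely coupled.
    Notifier(MessageService service) { this.service = service; }
    String notifyUser() { return service.getMessage(); }
}

class DiDemo {
    public static void main(String[] args) {
        // The "injector" (in practice, Spring's IoC container) builds the
        // dependency and passes it to the dependent class.
        Notifier notifier = new Notifier(new EmailService());
        System.out.println(notifier.notifyUser()); // email sent
    }
}
```

Spring performs the same hand-off automatically, using the bean definitions described below.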
1. Constructor Injection: In this type of injection, the injector supplies the dependency through the client's constructor.
2. Property (Setter) Injection: In this type of injection, the injector injects the dependency through the setter method exposed by the client.
Let's see the simple example to inject primitive and string-based values. We have
created three files here:
o Employee.java
o applicationContext.xml
o Test.java
Employee.java
It is a simple class containing two fields id and name. There are four constructors and
one method in this class.
applicationContext.xml
Test.java
This class gets the bean from the applicationContext.xml file and calls the show method.
Output:10 null
If you don't specify the type attribute in the constructor-arg element, the String-type constructor will be invoked by default.
If you change the bean element as given above, the String-parameter constructor will be invoked and the output will be 0 10.
Output:0 10
Output:0 Sonoo
Output:10 Sonoo
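The constructor-resolution behaviour described above can be reproduced in plain Java with a simplified Employee that has one int-argument and one String-argument constructor (a sketch mirroring the example, not the file from the source):

```java
// Simplified model of the example: which constructor runs determines the output.
class Employee {
    private int id;       // defaults to 0 if never set
    private String name;  // defaults to null if never set

    Employee(int id) { this.id = id; }            // chosen for an int-typed argument
    Employee(String name) { this.name = name; }   // chosen for a String-typed argument

    String show() { return id + " " + name; }

    public static void main(String[] args) {
        System.out.println(new Employee(10).show());   // 10 null (int constructor)
        System.out.println(new Employee("10").show()); // 0 10   (String constructor)
    }
}
```

The XML type attribute plays the same role here as the Java argument type: it decides which constructor receives the value.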
We can inject the dependency by setter method also. The <property> sub element
of <bean> is used for setter injection.
Let's see the simple example to inject primitive and string-based values by setter
method. We have created three files here:
o Employee.java
o applicationContext.xml
o Test.java
o Step 1: Open your Eclipse IDE and create a Spring Boot Application by right-
clicking and choosing Spring Starter Project. Then mention the name of the project
and click on Finish.
o To get the Spring Starter Project, you have to install Spring Tool Suite from the
Eclipse Marketplace.
o You will automatically see that an application file is created as below.
Test.java
This class gets the bean from the applicationContext.xml file and calls the display method.
Step 2: Next, create a class in the same package. To do that right-click the file ->
choose Class and mention the class name. Then click on Finish. This will create a Class file.
Here I have created a Student class. Refer below.
Step 3: After that, let us put in some properties for the class. Say we include id and name. Add the code below.
package com.example.demo; //package name
Step 3.1: Once you are done with that, you have to generate Getter and Setter methods for these properties. To do that, select these properties and then right-click. Then choose Source -> Generate Getters and Setters.
It is a simple class containing three fields, id, name, and city, with their setters and getters and a method to display this information.
Now, consider a scenario where you have to create an object for Employee and you do not want to do it manually. In such a scenario, you can use Dependency Injection to get the objects whenever you require them.
So, next let us look into how we can achieve the same by using applicationContext.xml file
applicationContext.xml
We are providing the information into the bean by this file. The property element invokes
the setter method. The value sub element of property will assign the specified value.
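For instance, a bean definition for the Employee example might look like this (the bean id, package, and values are assumed to match the example):

```xml
<!-- Sketch: setter injection via <property>; Spring calls setId(20) and setName("Sonoo") -->
<bean id="employee" class="com.javatpoint.Employee">
    <property name="id" value="20"/>
    <property name="name" value="Sonoo"/>
</bean>
```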
This tag specifies that the dependency will be resolved by calling a method of a bean, and since the bean is of type prototype, we will get a new instance on every call.
SingletonBean.java
public abstract class SingletonBean {
    public SingletonBean() {
        System.out.println("Singleton Bean Instantiated !!");
    }
    // Spring overrides this abstract method via the lookup-method configuration.
    public abstract PrototypeBean getPrototypeBean();
}

PrototypeBean.java
public class PrototypeBean {
    private String message;
    public PrototypeBean() {
        System.out.println("Prototype Bean Instantiated !!");
    }
    public void setMessage(String message) { this.message = message; }
    public String getMessage() { return this.message; }
}

TestProgram.java
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class TestProgram {
    public static void main(String[] args) {
        ApplicationContext context = new ClassPathXmlApplicationContext("beans.xml");
        SingletonBean singleton = (SingletonBean) context.getBean("singletonBean");
        PrototypeBean prototypeBeanA = singleton.getPrototypeBean();
        PrototypeBean prototypeBeanB = singleton.getPrototypeBean();
        System.out.println(prototypeBeanA);
        System.out.println(prototypeBeanB);
        System.out.println("Are prototypeBeanA and prototypeBeanB the same? " + (prototypeBeanA == prototypeBeanB));
    }
}
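The beans.xml that TestProgram loads is not shown in the source; a sketch (fully-qualified class names assumed to be in the default package) would wire the prototype bean into the singleton's abstract method via lookup-method:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <bean id="prototypeBean" class="PrototypeBean" scope="prototype"/>

    <bean id="singletonBean" class="SingletonBean">
        <!-- Spring overrides getPrototypeBean() to return a fresh prototypeBean each call -->
        <lookup-method name="getPrototypeBean" bean="prototypeBean"/>
    </bean>
</beans>
```

Because the prototype bean's scope is prototype, the final comparison in TestProgram prints false.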
References
1. https://spring.io/projects/spring-framework
2. https://docs.spring.io/spring-framework/docs/3.1.x/spring-framework-reference/html/
3. https://www.geeksforgeeks.org/introduction-to-spring-framework/
4. https://www.javatpoint.com/
5. http://www.wideskills.com/spring/method-injection-in-spring
WEEK - 7
Apache Maven
Why Maven ?
There are many problems that we face during project development. They are:
1) Adding the right set of JARs: each project needs its dependency JARs, and collecting and managing them by hand is error-prone.
2) Creating the right project structure: We must create the right project structure in servlet, struts, etc., otherwise it will not be executed.
3) Building and deploying the project: We must build and deploy the project so that it may work.
What is Maven?
In a multiple-development-team environment, Maven can set up the way to work as per standards in a very short time. As most project setups are simple and reusable, Maven makes the life of a developer easy while creating reports, checks, and build and testing automation setups.
Maven simplifies the above-mentioned problems. It mainly does the following tasks:
1. It makes a project easy to build.
2. It provides a uniform build process (a Maven build can be shared by all Maven projects).
A build tool takes care of everything for building a process. It handles:
o Builds
o Documentation
o Reporting
o SCMs
o Releases
o Distribution
Features of Maven
Coherent site of project information − Using the same metadata as the build process, Maven is able to generate a website and a PDF including complete documentation.
Parallel builds − Maven analyzes the project dependency graph and enables you to build independent modules in parallel. Using this, you can achieve performance improvements of 20-50%.
Maven is a Java based tool, so the very first requirement is to have JDK installed on
your machine.
You can download and install Maven on Windows, Linux, and macOS platforms. Here, we are going to learn how to install Maven on Windows.
Steps to install Maven on Windows:
1) Download Maven
2) Add MAVEN_HOME to the environment variables
3) Add the Maven bin directory to the Path variable
4) Verify Maven
1) Download Maven
Download the latest Maven release from the Apache Maven download page.
2) Add MAVEN_HOME
Now add MAVEN_HOME as the variable name and the path of Maven as the variable value. It must be the home directory of Maven, i.e., the directory containing bin. For example: E:\apache-maven-3.1.1. It is displayed below:
3) Set the Path
Click New if the path is not set, then add the path of Maven. If it is already set, edit the path and append the path of Maven's bin directory. Here, we have installed JDK and its path is set by default, so we are going to append the path of Maven.
4) Verify Maven
To verify whether Maven is installed, open the command prompt and run:
mvn -version
It will display the version of Maven and the JDK, including the Maven home and Java home.
Before Maven 2, this file was named project.xml. Since Maven 2 (and also in Maven 3), it is named pom.xml.
Some of the configuration that can be specified in the POM are following −
project dependencies
plugins
goals
build profiles
project version
developers
mailing list
It should be noted that there should be a single POM file for each project.
All POM files require the project element and three mandatory
fields: groupId, artifactId, version.
1. project root − This is the project root tag. You need to specify the basic schema settings such as the Apache schema and w3.org specification.
2. modelVersion − The version of the POM model; for Maven 2 and 3 it should be set to 4.0.0.
3. groupId − The id of the project's group, generally unique among an organization or a project. For example, com.company.bank.
4. artifactId − The id of the project, generally the name of the project. For example, consumer-banking.
5. version − This is the version of the project. Along with the groupId, it is used within an artifact's repository to separate versions from each other. For example:
com.company.bank:consumer-banking:1.0
com.company.bank:consumer-banking:1.1
Sample POM.xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.companyname.project-group</groupId>
<artifactId>project</artifactId>
<version>1.0</version>
</project>
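Project dependencies are declared in the same file. A <dependencies> section could be added to the POM above, for example to pull in JUnit for tests (Maven downloads the JAR from a repository automatically):

```xml
<!-- Example addition to the POM above -->
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.13.2</version>
        <scope>test</scope>
    </dependency>
</dependencies>
```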
Maven Archetypes
Maven achieves the uniform directory structure according to the project templates
called "archetype".
What is Archetype?
An archetype is a template from which the directory structure and starter files of a new Maven project are generated.
Why Archetype?
Archetypes enable developers to stick to the best practices employed by the project/organization.
During project creation, Maven consults the repository to fetch the specified archetype's template files.
Any IDE which has support for Maven, by default, displays a few of the
archetypes supported by it. One amongst them can be chosen by the user to
create his/her project. Also, they can add an artifact to the existing list by
providing the following fields:
o artifactId
o groupId
o version
o repository URL
The following table provides the details of various kinds of Maven Archetypes that
are popular:
Maven Repository
A Maven repository is a directory of packaged JAR files together with pom.xml files. Maven searches for dependencies in the repositories. There are 3 types of Maven repository:
1. Local Repository
2. Central Repository
3. Remote Repository
If dependency is not found in these repositories, maven stops processing and throws
an error.
Maven local repository is located in your local system. It is created by the maven
when you run any maven command.
Maven central repository is located on the web. It has been created by the apache
maven community itself.
The central repository contains a lot of common libraries that can be viewed by this
url http://search.maven.org/#browse.
A Maven remote repository is located on the web. Some libraries, such as the JBoss libraries, may be missing from the central repository, so we need to define a remote repository in the pom.xml file.
You can search any repository from Maven official website mvnrepository.com.
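A remote repository is declared in pom.xml with a <repositories> section. A sketch (the id and URL below are invented placeholders, not a real repository):

```xml
<!-- Sketch: pointing Maven at an additional remote repository -->
<repositories>
    <repository>
        <id>company-repo</id>
        <url>https://repo.example.com/maven2</url>
    </repository>
</repositories>
```

When a dependency is not found locally or in the central repository, Maven then tries each repository listed here.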
Maven has three built-in build lifecycles:
o clean
o default (or build)
o site
Clean Lifecycle:
The Maven clean goal (clean:clean) is bound to the clean phase in the clean lifecycle. This goal deletes the output of a build by deleting the build directory. Thus, when the mvn clean command executes, Maven deletes the build directory.
To trigger the clean lifecycle for an application, run the command mvn clean.
Default (or Build) Lifecycle:
This is the primary lifecycle of Maven and is used to build the application. It comprises twenty-three phases, and the most important phases are listed below:
o validate − checks the correctness of the project and all the information that is required for a project build. For example, it verifies whether pom.xml is available in the project's root.
o compile − compiles the .java files of the project; the .class files are placed in the ${basedir}/target folder.
o test − executes the unit test cases available in the project and captures the results in .txt and .xml format, in addition to printing the results onto the console. These reports can be viewed inside the ${basedir}/target/surefire-reports folder.
o verify − runs checks on the integration test results to make sure that the quality criteria are met.
o install − pushes the package/distribution unit to the local repository so that the package can be used as a dependency for other local projects.
o deploy − pushes the build unit to the remote repository for sharing with other developers and projects.
Site Lifecycle:
The site lifecycle is responsible for creating a site for the purpose of documenting the project, creating reports, etc. The phases of the site lifecycle are as follows:
o pre-site − execute tasks needed prior to the actual project site generation.
o site − generate the project's site documentation.
o post-site − execute processes needed to finalize the site generation and to prepare for site deployment.
o site-deploy − deploy the generated site documentation to the specified web server.
In Eclipse, click on File menu → New → Project → Maven → Maven Project → Next → Next → Next → write the Group Id, Artifact Id, and Package as shown in the figure below → Finish.
Now you will see a Maven project with a complete directory structure. All the files, such as the Hello Java file, the pom.xml file, the test case file, etc., will be created automatically. The directory structure of the Maven project is shown in the figure below.
Now you can see the code of the App.java file and run it. It will be like the given code:
package com.javatpoint;

/**
 * Hello world!
 */
public class App {
    public static void main(String[] args) {
        System.out.println("Hello World!");
    }
}
If you right-click on the project → Run As, you will see the Maven options to build the project.
Output: Hello World!
References
1. https://infyspringboard.onwingspan.com/web/en/viewer/web-module/lex_21949054394119170000_shared?collectionId=lex_21491207397468033000_shared&collectionType=Course&pathId=lex_5106241350177772000_shared
2. https://www.javatpoint.com/maven-eclipse
3. https://www.tutorialspoint.com/maven/index.htm
WEEK - 7
Spring Boot
Spring Boot is a framework built on top of the Spring framework that helps the
developers to build Spring-based applications very quickly and easily. The main goal
of Spring Boot is to create Spring-based applications quickly without
demanding developers to write the boilerplate configuration.
Spring Boot follows a layered architecture in which each layer communicates with the
layer directly below or above (hierarchical structure) it.
Before understanding the Spring Boot architecture, we must know the different layers and classes present in it. There are four layers in Spring Boot, as follows:
o Presentation Layer
o Business Layer
o Persistence Layer
o Database Layer
Presentation Layer: The presentation layer handles the HTTP requests, translates the JSON parameters to objects, authenticates the request, and transfers it to the business layer. In short, it consists of the views, i.e., the frontend part.
Business Layer: The business layer handles all the business logic. It consists of
service classes and uses services provided by data access layers. It also
performs authorization and validation.
Persistence Layer: The persistence layer contains all the storage logic and translates
business objects from and to database rows.
Database Layer: In the database layer, CRUD (create, retrieve, update, delete)
operations are performed.
o Spring Boot uses all the modules of Spring, like Spring MVC, Spring Data, etc. The architecture of Spring Boot is the same as the architecture of Spring MVC, except for one thing: there is no need for DAO and DAOImpl classes in Spring Boot.
o The request goes to the controller, and the controller maps that request and
handles it. After that, it calls the service logic if required.
o In the service layer, all the business logic is performed. It performs the logic on the data that is mapped to JPA with model classes.
Spring Initializr
Spring Initializr is a web-based tool provided by the Pivotal Web Service. With the
help of Spring Initializr, we can easily generate the structure of the Spring Boot
Project. It offers extensible API for creating JVM-based projects.
The generated application file looks like the following:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) { SpringApplication.run(DemoApplication.class, args); }
}
Spring Boot automatically scans all the components included in the project by
using @ComponentScan annotation.
The IoC container is responsible for instantiating, configuring, and assembling the objects. The IoC container gets information from the configuration metadata (for example, an XML file) and works accordingly. The main tasks performed by the IoC container are:
The container gets its instructions on what objects to instantiate, configure, and
assemble by reading the configuration metadata provided. The configuration metadata
can be represented either by XML, Java annotations, or Java code. The Spring
container is responsible for instantiating, configuring and assembling objects known
as beans, as well as managing their life cycles.
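The container's jobs (instantiate, configure, assemble) can be pictured with a toy, framework-free registry; this is an illustration of the idea, not Spring's actual API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Toy "container": it owns object creation; client code only asks for beans by name.
class ToyContainer {
    private final Map<String, Supplier<Object>> definitions = new HashMap<>(); // "configuration metadata"
    private final Map<String, Object> singletons = new HashMap<>();

    void register(String name, Supplier<Object> factory) { definitions.put(name, factory); }

    Object getBean(String name) {
        // Instantiate lazily on first request, then cache: every later lookup
        // returns the same object, like Spring's default singleton scope.
        return singletons.computeIfAbsent(name, n -> definitions.get(n).get());
    }
}

class ToyContainerDemo {
    public static void main(String[] args) {
        ToyContainer container = new ToyContainer();
        container.register("greeting", () -> "Hello from the container");
        System.out.println(container.getBean("greeting"));
        System.out.println(container.getBean("greeting") == container.getBean("greeting")); // true
    }
}
```

Client code never calls new on the bean itself; that inversion is what "Inversion of Control" refers to.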
Spring provides two types of IoC container:
o BeanFactory
o ApplicationContext
Example
@Configuration
@ComponentScan(basePackages = "com.dte")
public class AppConfig { // class name is illustrative
    // beans in com.dte are discovered via scanning
}
AutoWiring
What is Autowiring ?
In Spring, if one bean class is dependent on another bean class, then the bean dependencies need to be explicitly defined in your configuration class. But you can let the Spring IoC container inject the dependencies into dependent bean classes without defining them in your configuration class. This is called autowiring.
To do autowiring, you can use @Autowired annotation. This annotation allows the
Spring IoC container to resolve and inject dependencies into your bean. @Autowired
annotation performs byType Autowiring i.e. dependency is injected based on bean
type. It can be applied to attributes, constructors, setter methods of a bean class.
In other words, this annotation instructs the Spring container to find a registered bean
of the same type as of the annotated type and perform dependency injection.
The @Autowired annotation can be used on setter methods. This is called Setter Injection.
package com.dte.service;

public class CustomerService { // hypothetical service class
    private CustomerRepository customerRepository;

    @Autowired
    public void setCustomerRepository(CustomerRepository customerRepository) {
        this.customerRepository = customerRepository;
    }
}
@Autowired on Constructor
The @Autowired annotation can also be used on the constructor. This is called Constructor Injection.
package com.dte.service;

public class CustomerService { // hypothetical service class
    private final CustomerRepository customerRepository;

    @Autowired
    public CustomerService(CustomerRepository customerRepository) {
        this.customerRepository = customerRepository;
    }
}
@Autowired on Properties
The @Autowired annotation can be used directly on a property (field), in which case no setter or constructor is needed.
package com.dte.service;

public class CustomerService { // hypothetical service class
    @Autowired
    private CustomerRepository customerRepository;
}
In the above code, the Spring container will perform dependency injection using the
Java Reflection API. It will search for the class which implements
CustomerRepository and injects its object. The dependencies which are injected using
@Autowired should be available to the Spring container when the dependent bean
object is created. If the container does not find a bean for autowiring, it will throw the
NoSuchBeanDefinitionException exception.
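That byType search can be imitated in plain Java with the Reflection API (the classes and the injector method below are invented for illustration; Spring's real implementation is far more involved):

```java
import java.lang.reflect.Field;
import java.util.List;

interface CustomerRepository { String findAll(); }

class JpaCustomerRepository implements CustomerRepository {
    public String findAll() { return "all customers"; }
}

class CustomerService {
    CustomerRepository customerRepository; // target for "byType" injection
}

class ByTypeDemo {
    // Toy injector: for each field of the target, find a registered bean whose type matches.
    static void autowire(Object target, List<Object> beans) {
        try {
            for (Field field : target.getClass().getDeclaredFields()) {
                for (Object bean : beans) {
                    if (field.getType().isInstance(bean)) { // the "byType" match
                        field.setAccessible(true);
                        field.set(target, bean);            // inject via reflection
                    }
                }
            }
        } catch (IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        CustomerService service = new CustomerService();
        autowire(service, List.of(new JpaCustomerRepository()));
        System.out.println(service.customerRepository.findAll()); // all customers
    }
}
```

The field's declared type (CustomerRepository) drives the lookup, which is why declaring two beans of the same type makes the match ambiguous, as discussed below with @Qualifier.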
@Autowired can be used in various places. The following example shows how to use it on a field. In this example we make use of three Java files, namely:
GreetingService.java
Greeter.java
AppRunner.java
Computer Science & Engineering Page 10
Full Stack Development - 20CS52I 2022-23
GreetingService.java

package com.logicbig.example;

import org.springframework.stereotype.Component;

@Component
public class GreetingService {
    public String getGreeting(String name) {
        return "Hi there, " + name;
    }
}

Greeter.java

package com.logicbig.example;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class Greeter {
    @Autowired
    private GreetingService greetingService;

    public void showGreeting(String name) {
        System.out.println(greetingService.getGreeting(name));
    }
}
AppRunner.java

package com.logicbig.example;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;

@SpringBootApplication
public class AppRunner {
    public static void main(String[] args) {
        ConfigurableApplicationContext context = SpringApplication.run(AppRunner.class, args);
        Greeter greeter = context.getBean(Greeter.class);
        greeter.showGreeting("Joe");
    }
}
OUTPUT
Hi there, Joe
References
1. https://infyspringboard.onwingspan.com/web/en/viewer/web-module/lex_auth_012971083724537856869_shared?collectionId=lex_auth_01296689056211763272_shared&collectionType=Course&pathId=lex_auth_012970446325956608743_shared
2. https://infyspringboard.onwingspan.com/web/en/viewer/web-module/lex_auth_013308189421461504151_shared?collectionId=lex_auth_01318928182131916893_shared&collectionType=Course&pathId=lex_auth_01322939740318105612349_shared
3. https://www.w3schools.blog/spring-boot
4. https://www.tutorialspoint.com/spring_boot/spring_boot_beans_and_dependency_injection.htm
5. https://www.javatpoint.com/spring-boot-annotations
WEEK - 7
Employee.java

package com.example.demo;

public interface Employee {
    void calculateSalary();
    void calculateDeductions();
}

There are two beans, SoftwareEngineer and QAEngineer, which implement the
Employee interface.

SoftwareEngineer.java

package com.example.demo;

import org.springframework.stereotype.Component;

@Component("softwareengineer")
public class SoftwareEngineer implements Employee {
    @Override
    public void calculateSalary() {
        System.out.println("Calculating salary of a software engineer");
    }

    @Override
    public void calculateDeductions() {
        System.out.println("Calculating deductions of a software engineer");
    }
}
QAEngineer.java

package com.example.demo;

import org.springframework.stereotype.Component;

@Component("qaengineer")
public class QAEngineer implements Employee {
    @Override
    public void calculateSalary() {
        System.out.println("Calculating salary of a QA engineer");
    }

    @Override
    public void calculateDeductions() {
        System.out.println("Calculating deductions of a QA engineer");
    }
}
EmployeeService.java

package com.example.demo;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class EmployeeService {
    @Autowired
    private Employee employee;

    public void process() {
        employee.calculateSalary();
        employee.calculateDeductions();
    }
}
To uniquely identify the different beans, we should use the @Qualifier annotation
along with @Autowired. So the above code needs to be changed as follows:

@Autowired
@Qualifier("softwareengineer")
private Employee employee;

public void process() {
    employee.calculateSalary();
    employee.calculateDeductions();
}
The @Autowired annotation can be used alone, in which case wiring happens by type.
A problem therefore arises when more than one bean of the same type is declared in the
container, because @Autowired does not know which bean to inject.
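The ambiguity can be illustrated without Spring at all. The sketch below is a hypothetical mini-registry (the salary figures and exception message are illustrative): looking a bean up by type fails when two implementations are registered, while looking it up by name, which is effectively what @Qualifier("softwareengineer") does, succeeds:

```java
import java.util.Map;

public class QualifierDemo {
    interface Employee { double calculateSalary(); }

    static class SoftwareEngineer implements Employee {
        public double calculateSalary() { return 50000.0; }
    }

    static class QAEngineer implements Employee {
        public double calculateSalary() { return 45000.0; }
    }

    // Beans registered under unique names, as a container does internally
    static final Map<String, Employee> beans = Map.of(
            "softwareengineer", new SoftwareEngineer(),
            "qaengineer", new QAEngineer());

    // By-type lookup: ambiguous when two beans share the interface
    static Employee byType() {
        if (beans.size() > 1) {
            throw new IllegalStateException(
                    "expected 1 bean of type Employee, found " + beans.size());
        }
        return beans.values().iterator().next();
    }

    // By-name lookup: what adding @Qualifier("softwareengineer") resolves to
    static Employee byQualifier(String name) {
        return beans.get(name);
    }

    public static void main(String[] args) {
        System.out.println(byQualifier("softwareengineer").calculateSalary());
        try {
            byType();
        } catch (IllegalStateException e) {
            System.out.println("Ambiguous: " + e.getMessage());
        }
    }
}
```

In real Spring, the analogous failure surfaces as a NoUniqueBeanDefinitionException at context startup.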
Bean Scope
The scope of a bean defines the life cycle and visibility of that bean in the contexts in
which we use it. In other words, bean scope determines when the bean object is
instantiated, how long that object lives, and how many objects are created for that bean.
It controls instance creation and is managed by the Spring container. Spring defines
the following scopes:
singleton
prototype
request
session
application
websocket
The last four scopes mentioned, request, session, application and websocket, are only
available in a web-aware application.
Singleton Scope:
If the scope is singleton, only one instance of that bean is created per Spring IoC
container, and the same instance is shared for each request. That is, when the scope of
a bean is declared singleton, then whenever a new request is made for that bean, the
Spring IoC container first checks whether an instance of that bean has already been
created. If it has, the container returns the same instance; otherwise it creates a new
instance of that bean, at the first request only. By default, the scope of a bean is
singleton.
Eg:
Step 1: To create a new Maven project in Eclipse, open the Eclipse IDE and go
to File > New > Other > Maven Project.
Step 3: Give the details of group Id, artifact Id, version and package, and click
Finish.
Step 4: Add following Spring dependency and properties in pom.xml file of your
project:
<dependencies>
    <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-context</artifactId>
        <version>5.2.7.RELEASE</version>
    </dependency>
</dependencies>
<properties>
    <maven.compiler.target>11</maven.compiler.target>
    <maven.compiler.source>11</maven.compiler.source>
</properties>
Step 5: Create the Customer bean class.

package com.example.demo;

public class Customer {
    public Customer() {
        System.out.println("Customer instance created");
    }
}
Step 6: Create the SpringConfig class and define singleton scope for the Customer bean.

package com.example.config;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;
import com.example.demo.Customer;

@Configuration
public class SpringConfig {

    @Bean
    @Scope("singleton")
    public Customer customer() {
        return new Customer();
    }
}
Step 7: Create the user interface class which will execute the application.

package com.infosys.demo;

import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import com.example.config.SpringConfig;
import com.example.demo.Customer;

public class UserInterface {
    public static void main(String[] args) {
        ApplicationContext context = new AnnotationConfigApplicationContext(SpringConfig.class);

        Customer customer1 = context.getBean(Customer.class);
        Customer customer2 = context.getBean(Customer.class);

        if (customer1 == customer2) {
            System.out.println("Both requests returned the same instance: singleton scope");
        } else {
            System.out.println("Different instances were returned");
        }
    }
}
Prototype Scope:
If the scope is declared prototype, the Spring IoC container creates a new instance of
the bean every time a request is made for that specific bean. A request can be made for
the bean instance either programmatically, using the getBean() method, or via XML
for dependency injection of a secondary type. Generally, we use the prototype scope
for beans that are stateful, while the singleton scope is used for stateless beans.
Let's understand this scope with an example:
The code for all Java files remains the same as above; we only change the scope in the
SpringConfig class from singleton to prototype and observe the output:
package com.example.config;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;
import com.example.demo.Customer;

@Configuration
public class SpringConfig {

    @Bean
    @Scope("prototype")
    public Customer customer() {
        return new Customer();
    }
}
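The behavioral difference between the two scopes can also be sketched in plain Java with a toy container (the MiniContainer class below is a hypothetical illustration, not Spring's implementation): a singleton-scoped bean is cached and reused, while a prototype-scoped bean is created fresh on every request.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public class ScopeDemo {
    static class Customer {}

    // Toy container: "singleton" caches one instance per bean name,
    // "prototype" invokes the factory on every request
    static class MiniContainer {
        private final Map<String, Object> singletons = new HashMap<>();

        Object getBean(String name, String scope, Supplier<Object> factory) {
            if ("singleton".equals(scope)) {
                return singletons.computeIfAbsent(name, k -> factory.get());
            }
            return factory.get(); // prototype: fresh instance every time
        }
    }

    public static void main(String[] args) {
        MiniContainer container = new MiniContainer();

        Object s1 = container.getBean("customer", "singleton", Customer::new);
        Object s2 = container.getBean("customer", "singleton", Customer::new);
        System.out.println("singleton same instance? " + (s1 == s2));

        Object p1 = container.getBean("customer", "prototype", Customer::new);
        Object p2 = container.getBean("customer", "prototype", Customer::new);
        System.out.println("prototype same instance? " + (p1 == p2));
    }
}
```

This mirrors what the Spring example above demonstrates: with singleton scope the identity check customer1 == customer2 is true, and with prototype scope it is false.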
References
1. https://infyspringboard.onwingspan.com/web/en/viewer/web-module/lex_auth_012970427491500032730_shared?collectionId=lex_auth_01296689056211763272_shared&collectionType=Course&pathId=lex_auth_012970446325956608743_shared
2. https://infyspringboard.onwingspan.com/web/en/viewer/web-module/lex_auth_013312239321874432394_shared?collectionId=lex_auth_01318928182131916893_shared&collectionType=Course&pathId=lex_auth_013214621763878912337_shared
3. https://www.baeldung.com/spring-bean-scopes
4. https://www.geeksforgeeks.org/singleton-and-prototype-bean-scopes-in-java-spring/
5. https://javarevisited.blogspot.com/2021/10/difference-between-autowired-and.html#axzz7lkXZEUJL
WEEK - 7
Introduction
In a typical monolithic application, all of the data objects and actions are
handled by a single, tightly knit codebase.
Data is typically stored in a single database or filesystem.
Functions and methods are developed to access the data directly from this
storage mechanism, and all business logic is contained within the server
codebase and the client application.
It is possible to migrate several monolithic applications and/or platforms, each
with its own data storage mechanisms, user interfaces, and data schema, into a
unified set of microservices that perform the same functions as the original
applications under a single user interface.
Migrating these applications to microservices offers the following advantages:
removing duplication of effort for manual entry
reducing programmatic development risks
providing a single, unified view of the data
improving the control and synchronization of these systems
Monolith
Many current legacy applications are implemented as monoliths, in which all data
storage and processing are controlled by the monolith, and all functions for all data
objects are processed using the same backend codebase.
Updating code for one set of data objects may break dependencies in other
areas.
Microservices
Microservices can be written in any programming language and can use whichever
database, hardware, and software environment makes the most sense for the
organization.
An application programming interface (API) provides the only means for users and
other services to access the microservice's data.
The API need not be in any particular format, but representational state transfer
(REST) is popular, in part because its human-readability and stateless nature make it
useful for web interfaces.
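As a minimal, self-contained illustration of a service fronted by a REST-style API, the sketch below uses the JDK's built-in com.sun.net.httpserver package (the /api/v1/status endpoint and JSON payload are hypothetical, not taken from any particular system):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class MiniMicroservice {
    // Start a tiny HTTP service exposing one read-only, stateless endpoint.
    // Returning the server lets callers query the bound port and stop it.
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/api/v1/status", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws IOException {
        start(8080);
        System.out.println("listening on http://localhost:8080/api/v1/status");
    }
}
```

The key property of the pattern is visible even at this scale: clients interact with the service only through HTTP requests to its API, never by touching its datastore directly.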
Data Locality
Although each microservice maintains its own datastore, the data need not be stored
on a single disk on a single machine. What is important is that the only method for the
system and users to access the data is through the API. The underlying storage
mechanism may be a single disk, or it may be a distributed, high-availability cluster
that is mirrored through data replication and some form of quorum mechanism. The
implementation details are up to the system owners and should reflect the needs of the
system based on expected use.
Similarly, there is no requirement that the data-storage location for each microservice
reside on separate hosts. The microservices pattern requires only that each microservice
directly read its own dataset. Since a service does not access other services' datasets, it
does not matter whether other co-located datasets share the same data-storage system.
Again, system owners should weigh the cost and maintenance benefits of co-locating
data with the risks associated with maintaining a single point of failure for their data.
The underlying mechanism and processes for storing, backing up, and restoring data
for the monolithic system, the interim macroservices, and the final microservices
remain the prerogative of the system owners.
Users want their interactions with a system to return the right data at the right level of
detail, usually as fast as that data can be acquired. The jobs for users each involve one
or more data objects, and each data object has a set of associated actions that can be
performed. The development team that designs and implements the system must
consider the collection of jobs, data objects, and data actions. A typical process for
migrating from a monolithic system to a microservices-based system involves the
steps described below.
There are three main information components within the data used in the system:
data objects
data actions
jobs to perform
The data objects are the logical constructs representing the data being used. The data
actions are the commands that are used on one or more data objects, possibly on
different types of data, to perform a task. The job to perform represents the function
the users are calling to fulfill their organizational roles. The jobs to perform may be
captured as use cases, user stories, or other documentation involving user input.
When combining multiple systems into a unified system, the data objects, data
actions, and jobs to perform for each individual system must be identified. All these
components are implemented as modules within the codebase with one or more
modules representing each data object, data action, and job to perform. These
modules should be grouped into categories for working with later steps.
During this part of the migration process, system architects should be asking the
following questions:
If two or more applications provide similar data, can this data be merged?
What should be done about data fields being different or missing in similar
objects?
The migration from a monolithic system to microservices does not typically affect the
user interface directly. The components that are best for migrating are thus determined
by how cleanly they and their data can be separated from the rest of the system.
After all the modules have been uniquely identified and grouped, it is time to organize
the groups internally. Components that duplicate functionality must be addressed
before implementing the microservice. In the final system, there should be only one
microservice that performs any specific function. Function duplication will most
likely be encountered when there are multiple monolithic applications being merged.
It may also arise where there is legacy (possibly dead) code that is included in a single
application.
Merging duplicated functions and data will require the same considerations as when
designing the ingestion of a new dataset:
Verify datatypes.
Identify outliers.
Since one of the effects of this migration is to have a single data repository for any
piece of data, any data that is replicated in multiple locations must be examined here,
and the final representation must be determined. The same data may be represented
differently depending on the job to be done. It is also possible that similar data may be
obtained from multiple locations, or that the data may be a combination from multiple
data sources. Whatever the source and however the data will be used, it is essential
that one final representation exists for each unique datatype.
After the components have been identified and reorganized to prepare for the
migration, the system architect should identify the dependencies between the
components. This activity can be performed using a static analysis of the source code
to search for calls between different libraries and datatypes. There are also
several dynamic-analysis tools that can analyze the usage patterns of an application
during its execution to provide an automated map between components.
After the dependencies have been identified, the system architect should focus on
grouping the components into cohesive groups that can be transformed into
microservices, or, at least, macroservices. The goal is to identify a small set of objects
and their constituent actions that should be logically separated in the final system.
The remote user interface is intended as the sole mode of communication between the
system, its components, and the system's users. The underlying interface must be
usable both during the migration and afterwards, so it is likely to change as components
are reworked from the monolithic system to macroservices and microservices.
The key output from this migration effort is a unified API that the user interface(s)
and applications can use to manipulate the data. After the API layer is in place, all
new functionality should be added through the API, not through the legacy
applications.
The design and implementation of the API is key to the success of the migration to
microservices. The API must be able to handle all data-access cases supported by the
applications that will use the API.
The API should provide a mechanism so that the application can check the API
version being used and warn users and developers about incompatibilities. The only
changes to the API should be those that add new data objects and functions and that
do not modify the format of the existing outputs or expected inputs. For microservices
to work properly, all data access must be provided through the API to the
microservices or, during the migration transition period, to the macroservices or legacy
application.
In particular, the API should be:
stateless
versioned
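One simple way to realize the version check described above, assuming purely for illustration a MAJOR.MINOR scheme in which additive changes bump only the minor number, is sketched below:

```java
public class ApiVersionCheck {
    // Illustrative policy: same major version = compatible (only additive
    // changes within a major), different major version = breaking change.
    static boolean isCompatible(String serverVersion, String clientExpected) {
        int serverMajor = Integer.parseInt(serverVersion.split("\\.")[0]);
        int clientMajor = Integer.parseInt(clientExpected.split("\\.")[0]);
        return serverMajor == clientMajor;
    }

    public static void main(String[] args) {
        System.out.println(isCompatible("1.4", "1.2")); // additive changes only
        System.out.println(isCompatible("2.0", "1.2")); // warn users/developers
    }
}
```

An application would typically fetch the server's version from a dedicated API endpoint at startup and warn the user when isCompatible returns false.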
The main reason for not moving directly to microservices is complexity. A monolithic
system is typically built with intertwined logic that may cause problems when
converting to microservices. If the monolith is continuously changing, then migrating
to microservices in a single step will be a continuously changing target as well.
The key goal at this step is to move component groups into separate projects and
make separate deployments. At a minimum, each macroservice should be
independently deployable from within the system's continuous integration
(CI) and continuous deployment (CD) pipeline.
The process of pulling the components, data objects, and functions out of the
monolithic system and into macroservices will provide insight into how these
components can be further separated into microservices. Remember, each
microservice maintains its own datastore and performs only a small set of actions on
the data objects within that datastore.
Once a macroservice or microservice is ready for deployment, the next step involves
integration testing and deployment. The monolithic system must be configured to use
the new service for its data needs rather than its legacy datastore.
All functions that access the migrated data should be tested in all user interfaces to
ensure that there is no function that still attempts to use the old datastore through a
previously undetected method. If possible, accesses to the old dataset on the old
datastore should be logged and flagged in case old or refactored code is still able to
access the legacy data. Access controls should be updated to prevent users from
accessing the old data directly from the datastore; if such direct access is attempted,
users can be notified how to access the data using the new API instead.
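The logging-and-flagging of legacy accesses described above can be sketched as a thin wrapper around the old datastore (the Datastore interface and the key name here are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class LegacyAccessGuard {
    interface Datastore { String read(String key); }

    static class LegacyDatastore implements Datastore {
        public String read(String key) { return "legacy:" + key; }
    }

    // Wrap the old datastore so any remaining access is logged and recorded,
    // revealing refactored code that still bypasses the new API.
    static class FlaggingDatastore implements Datastore {
        private final Datastore delegate;
        final List<String> flaggedAccesses = new ArrayList<>();

        FlaggingDatastore(Datastore delegate) { this.delegate = delegate; }

        public String read(String key) {
            flaggedAccesses.add(key);
            System.err.println("WARNING: legacy datastore accessed for key " + key);
            return delegate.read(key);
        }
    }

    public static void main(String[] args) {
        FlaggingDatastore store = new FlaggingDatastore(new LegacyDatastore());
        store.read("patient-7"); // any such call during testing indicates a missed migration
        System.out.println("flagged accesses: " + store.flaggedAccesses);
    }
}
```

During integration testing, an empty flaggedAccesses list after exercising all user interfaces is evidence that no code path still reaches the old datastore.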
Introduction
Let’s imagine we have a web application that is developed for a hospital OPD
(OutPatient Department). This is a standard Java-based web application running on
Tomcat. Everything was working fine until recently, when there were many outages of
the entire application due to increased patient influx during a pandemic. After a root
cause analysis (RCA), the development team identified that having the application as
a single, monolithic application causes a lot of trouble when fixing issues and rolling
out new updates. Some of the challenges they identified are:
Adding new features and improving existing features is a tedious task that
requires a restart of the entire application, with possible downtimes.
The failure of one function can cause the entire application to be useless.
If one particular function needs more resources, the entire application needs to
be scaled (vertically or horizontally).
Integrating with other systems is difficult since most of the functional logic is
baked into the same service.
It contains large code bases that become complex and hard to manage.
The development team has decided to go with an approach in which they divide the
functionality of the application into separate modules that can be developed, deployed
and maintained separately. They first looked at the existing application architecture
and identified a certain level of modularity by going through the user interface of the
web application. The below figure depicts the web application and its main components.
As per the preceding figure, we can identify several main components that execute
specific functions within the application.
Patient Registration — This is where the new patients coming into the OPD
section within the hospital are registered into the system. The patient or a
guardian comes to the registration desk and a hospital staff member collects
the patient details and updates the system
Patient Inspection — Once the patient is registered into the system, a token is
generated and the patient goes to the next step which is the inspection by a
doctor or a physician. Once the inspection is done, the patient will be admitted
to the temporary treatment unit or admitted to a ward for longer term treatment
depending on the patient's status.
Patient Release — Once the patient is ready to be released from the OPD
unit, the doctor will provide the final recommendation to send the patient back
home or admit the patient to the ward for longer-term treatments.
Database — This is where the patient details, treatment history, and other
health-related information are stored.
Once the main components were identified, the development team decided to break
down the application into separate microservices and develop them as independently
managed components. The team had been using a waterfall-type approach to deliver
applications in the past and identified several challenges with that approach,
including:
— delayed releases
— missing features
— low-quality products
This kind of approach was not helping the organization and the business leadership
has been questioning this process due to the lack of innovation and flexibility when it
comes to improving the software.
After considering all the aspects, the development team decided to break down the
application into separate microservices and build the user interface component as a
single page application (SPA) that communicates with these microservices. They
started with the already identified functional components (above) to implement the
new microservices. The below figure depicts the new architecture with the
microservices.
The preceding figure depicts how the team divided the application into separate
microservices based on functionality, and also the use of an API Gateway and a
Message Broker to build the application, along with a SPA for user interfaces.
Here, the authentication service and the common data service are optional. If you are
using the API Gateway for security, the authentication service can be omitted. The
common data service is useful when there is a requirement to provide a read-only,
materialized view of data for some applications: instead of accessing data from the
four major functional microservices, this common data service can provide a
read-only copy of the data managed by the individual microservices. This
microservice can also be omitted if there is no such demand for heavy read-only data
access.
References
1. https://medium.com/microservices-learning/breaking-down-a-monolithic-web-app-into-a-microservices-architecture-36c7bc1cf098
2. https://insights.sei.cmu.edu/blog/8-steps-for-migrating-existing-applications-to-microservices/