
School of Computer Science & Engineering

Department of Computer Science and Applications


2024-2025

A
PROJECT REPORT
ON
Full-Time Industrial Internship

BY

Rahul M Sonawane
PRN NO- 1132230085

IN PARTIAL FULFILLMENT OF
MASTER OF COMPUTER APPLICATION (SCIENCE)
DR. VISHWANATH KARAD MIT WORLD PEACE
UNIVERSITY, PUNE-411038

DEPARTMENT OF COMPUTER SCIENCE
AND APPLICATIONS
Certificate
This is to certify that Rahul M Sonawane, a student of MSc. (CS) Semester IV, has successfully / partially completed Industrial Training at Iauro Systems Pvt Ltd in partial fulfilment of MCA (Science) under Dr. Vishwanath Karad MIT World Peace University, for the academic year 2024-2025.

Internal Guide: ______________

Dr. Jalindar R. Gandal
Head, MCA (Science)

Dr. Rajeshree Khande
Program Director

Prof. Dr. Lalit Kane
Associate Dean, Academics

Prof. Dr. Shubhalaxmi Joshi
Associate Dean, Faculty Affairs

Date: ___/___/______

External Examiners:

1. _________________________ 2. __________________________

Internship Offer Letter

ACKNOWLEDGEMENT

I would like to take this opportunity to express my heartfelt gratitude to all those who have supported and guided me throughout the course of this project.

First and foremost, I wish to extend my sincere thanks to Prof. Dr. C. H. Patil, School of Computer Science, MIT-WPU, Pune, for their invaluable guidance, insightful feedback, and constant encouragement. Their expertise and mentorship were instrumental in completing this project successfully.

A special thanks to Iauro Systems for providing me with the opportunity to work in a professional environment. The experience has been extremely valuable and has helped me grow in knowledge and confidence.

I also wish to thank Dr. Vishwanath Karad MIT World Peace University for providing me with the necessary resources and facilities to carry out this project seamlessly. The conducive environment and access to tools were integral to achieving the project goals.

Lastly, I would like to express my profound gratitude to my family and friends for their unwavering support, encouragement, and motivation, which helped me overcome challenges and stay focused.

Rahul M Sonawane.

MCA (Science)

Place: Pune

Date:

DECLARATION

I hereby declare that the project work entitled “Full-Time Industrial Internship” submitted to MIT-WPU, Pune is a record of original work done by me under the guidance of Prof. Dr. C. H. Patil, and that this project work is submitted in partial fulfilment of the requirements for the project and the award of the degree of Master of Computer Application.

The results embodied in this project report have not been submitted to any other University or Institute for the award of any degree or diploma. I have personally contributed to this ongoing project at the company.

Sign

Rahul M Sonawane

PRN- 1132230085

INDEX

Chapter 1: Abstract

Chapter 2: Introduction
  2.1 About iauro Systems Pvt Ltd
  2.2 Duration and goals of internship
  2.3 Technologies used

Chapter 3: Node.js Training
  3.1 Overview of the training program
  3.2 Training outline and topics covered

Chapter 4: Expense Tracker API
  4.1 Introduction
  4.2 Key features
  4.3 Technology stack
  4.4 Folder structure & key code snippets
  4.5 Screenshots

Chapter 5: Blog-Post API
  5.1 Introduction
  5.2 Key features
  5.3 Technology stack
  5.4 Folder structure & key code snippets
  5.5 Screenshots

Chapter 6: Documentation of IO Flow
  6.1 Introduction
  6.2 Tools used for documentation
  6.3 Objectives
  6.4 Outcomes and importance
  6.5 Challenges and key insights

Chapter 7: HRMS Project
  7.1 Introduction
  7.2 Objectives
  7.3 Technology stack
  7.4 Impact on the overall project
  7.5 Sample code snippets

Chapter 8: Twitter (X) Sentiment Analysis
  8.1 Introduction
  8.2 Key features
  8.3 Project structure
  8.4 Technology stack
  8.5 Key code snippets

Chapter 9: Current Work
  9.1 Introduction
  9.2 Purpose and scope
  9.3 Services under study
  9.4 Approach to understanding
  9.5 Impact on the overall project

Chapter 10: EncureIT Internship Overview
  10.1 Introduction
  10.2 Topics covered
  10.3 Tools & platforms used
  10.4 Learning and outcomes

Chapter 11: Conclusion
1. Abstract

This report summarizes my ongoing six-month internship at iauro Systems Pvt. Ltd., a
company in the IT and AI domain that delivers smart, scalable digital solutions across
industries such as Edutech, Travel & Hospitality, Financial Services, and more. The
internship began on February 17, 2025, and will conclude on August 17, 2025. It has
offered me extensive hands-on experience in full-stack development, backend
architecture, documentation practices, and real-world software engineering processes.

In the initial phase, I completed a structured Node.js training program, during which I
developed two RESTful APIs—an Expense Tracker API and a Blog Post API—to
solidify my understanding of backend development concepts such as authentication,
CRUD operations, and RESTful principles.

Following the training, I contributed to active company projects. One key contribution
has been to the HRMS project, where I developed the Role-Based Access Control
(RBAC) module using Java, MySQL, and Spring Boot, implementing scalable and secure
role and permission management.

I also built a Twitter sentiment analysis module that fetches tweets tagging specific
handles, analyzes their sentiment using Python-based NLP libraries, and stores the results
in the database. This helped me gain experience with third-party APIs, text processing,
and real-time data handling.

Another major responsibility has been preparing detailed documentation for the IO Flow
service, using tools like Swagger and Postman. Additionally, I explored and analyzed
internal services like the Widget Service, Page Service, and User Management Service,
further enhancing my ability to work with production-level codebases.

As part of my internal upskilling, I also completed a Prompt Engineering course under the
Consultant 1.0 track, gaining foundational skills in crafting effective AI prompts.

The technologies and tools I have worked with include: Node.js, NestJS, Java, Redis, gRPC, MongoDB, TypeScript, Swagger, Postman, Python, JavaScript, and Git.

Key learning outcomes:

• Building production-ready backend services

• Designing enterprise-grade modules such as RBAC

• Developing a Twitter-based sentiment analysis tool using external APIs and Python NLP

• Mastering API development and documentation tools like Swagger and Postman

• Understanding service-oriented architecture and version control

• Gaining real-world experience with large codebases and agile collaboration

This internship continues to be a transformative phase in my academic journey, equipping me with practical skills and professional-level exposure essential for a career in software development.

2. Introduction

This report outlines my ongoing six-month internship at iauro Systems Pvt. Ltd., Pune, a
forward-thinking company operating in the IT and Artificial Intelligence (AI) domain.
The internship commenced on February 17, 2025, and will conclude on August 17, 2025.
This internship opportunity has allowed me to work closely with production-grade
systems, real-world backend architectures, internal service documentation, and end-to-
end module development across various business domains. The experience has helped me
build essential technical skills, understand industry workflows, and apply theoretical
knowledge in practical, enterprise environments.

2.1 About iauro Systems Pvt. Ltd.

Established in 2009, iauro Systems Pvt. Ltd. is a technology-driven organization specializing in delivering smart, scalable digital solutions across multiple domains including Edutech, Travel & Hospitality, Financial Services, and Digital Engineering.
Founded by Nilesh Ratnaparkhi and Anupam Kulkarni, the company began by
challenging the conventional education system with their first project, BenchPrep. Since
then, iauro has evolved into a full-scale digital engineering company driven by innovation
and adaptability.

At its core, iauro fosters a culture of continuous evolution and agile development. The
organization emphasizes a product mindset, cross-functional collaboration, and client
empathy. Their teams integrate AI, Machine Learning, Cloud-Native technologies, and
Open Source tools to build cutting-edge, user-centric solutions.

iauro also leverages internal accelerators such as IO Craft, IO Flow, IO Data, and IO
Market—platforms designed to streamline design-to-code workflows, data management,
and reusable asset deployment. The company’s values are built on pillars of integrity,
accountability, empathy, and an evolving mindset, ensuring a transparent, ethical, and
inclusive work environment.

2.2 Duration and Goals of the Internship

The internship spans a duration of six months, from February 17, 2025 to August 17,
2025. The key goals of this internship include:

• Gaining hands-on experience in real-world enterprise projects

• Enhancing backend development capabilities using industry standards

• Working with production-grade architectures and APIs

• Learning to use tools for testing, documentation, and deployment

• Understanding cross-functional teamwork and agile practices

• Strengthening problem-solving and debugging skills in live environments

• Developing a mindset aligned with scalable, maintainable software development

2.3 Technologies Used

Throughout the internship, I have worked with a diverse set of technologies that support
scalable, enterprise-grade application development. These include:

• Node.js and NestJS (backend development for services like IO Flow)

• Java, Spring Boot, and MySQL (for HRMS RBAC module)

• JavaScript and TypeScript (application and service logic)

• Python (sentiment analysis module using NLP libraries)

• Postman and Swagger (for API testing and documentation)

• Redis and gRPC (for performance and inter-service communication)

• MongoDB (for non-relational data storage)

• Git (version control and collaboration)

This exposure to modern tools and platforms has equipped me with practical skills that
are crucial for professional software development and collaborative team projects.

3. Node.js Training

3.1 Overview of the Training Program

As part of my internship, I undertook a structured and intensive one-month Node.js training program aimed at building foundational and practical skills in backend development. The training was designed to provide a comprehensive, hands-on understanding of key technologies such as Node.js, Express.js, MongoDB, and the creation of RESTful APIs. Throughout the program, I focused not only on the theoretical concepts but also on implementing industry best practices, studying real-world examples, and adhering to development standards commonly used in the field.

This training served as a critical stepping stone in my transition towards full-scale project
work at iauro, ensuring that I became familiar with the complete backend development
stack and the essential tools required for professional development.

3.2 Training Outline and Topics Covered

The Node.js training followed a progressive structure, beginning with core computer
networking concepts and gradually advancing to modern JavaScript backend frameworks
and tools. The comprehensive list of topics covered includes:

3.2.1 Fundamentals and Core Concepts

• Client-server architecture

• IP address, DNS

• Operating system processes

• Ports and network communication

• Multiprocessing, multithreading, and multitasking

• Inter-process communication (IPC)

• Remote procedure calls (RPC)

• APIs and gRPC

3.2.2 JavaScript & TypeScript Foundations

• Basics of JavaScript/TypeScript (Promises, Callbacks)

• Functional programming in JavaScript/TypeScript

• Server-side JavaScript/TypeScript

• Object-Oriented Programming (OOP) in JavaScript/TypeScript

3.2.3 Node.js Core Topics

• What is Node.js?

• Node.js process model

• Synchronous vs. asynchronous execution

• Blocking vs. non-blocking I/O (see the sketch after this list)

• Event loop and event queue

• Installation of Node.js (using NVM)

• Node.js console and script execution

• Creating a “Hello World” application

• Understanding Node.js modules and module exports
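
To make the blocking vs. non-blocking distinction above concrete, the following minimal sketch (illustrative, not taken from the training material) contrasts the two styles using Node's built-in fs module:

```js
// Blocking vs. non-blocking file reads with the built-in fs module.
const fs = require("fs");

// Blocking: the event loop waits until the whole file has been read.
const dataSync = fs.readFileSync(__filename, "utf8");
console.log("sync read finished, length:", dataSync.length);

// Non-blocking: the read is delegated to the OS; the callback runs later.
fs.readFile(__filename, "utf8", (err, data) => {
  if (err) throw err;
  console.log("async read finished, length:", data.length);
});

// This line prints before the async callback fires, which is the event
// loop at work: the script keeps running while I/O is in flight.
console.log("this line runs before the async callback");
```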

3.2.4 Node Package Manager (NPM)

• NPM usage

• Semantic versioning with NPM

3.2.5 Networking and System Modules

• HTTP and timers

• File System (FS) module

• Path module

• Environment variables

• Joi module (input validation)

• Buffers, streams, and events

• Process and clusters in Node.js

3.2.6 Best Practices in Backend Development

• API design principles (RMM – Richardson Maturity Model)

• Error handling and coding standards

• Code structure and modularity

• Naming conventions and status codes

• Standard response messages

3.2.7 MongoDB and Database Integration

• MongoDB basics

• Installation (native and Docker-based)

• CRUD operations using MongoDB console

• Connecting Node.js with MongoDB using Mongoose (sketched after this list)

• Building RESTful CRUD APIs
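
As a hypothetical illustration of the Mongoose pattern practised here (the model name and fields are assumptions, not code from the training exercises), connecting, defining a schema, and performing basic CRUD looks roughly like this:

```js
// Connect to MongoDB, define a model, and run simple CRUD calls.
const mongoose = require("mongoose");

const itemSchema = new mongoose.Schema({
  name: { type: String, required: true },
  createdAt: { type: Date, default: Date.now },
});
const Item = mongoose.model("Item", itemSchema);

async function main() {
  // A local MongoDB instance is assumed here (native or Docker-based).
  await mongoose.connect("mongodb://localhost:27017/training");
  const item = await Item.create({ name: "first record" }); // Create
  const found = await Item.findById(item._id);              // Read
  console.log(found.name);
  await mongoose.disconnect();
}

main().catch(console.error);
```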

3.2.8 Authentication, Tools, and DevOps Practices

• Authentication and authorization (JWT-based; see the sketch after this list)

• Use of Postman for API testing

• Swagger for API documentation

• Git basics and version control practices

• Environment-based development (Dev vs. Prod)

• Introduction to Redis
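
The JWT-based flow covered in this part of the training can be sketched as follows. This is an illustrative example (the function names and claims are assumptions): a token is signed at login and then verified by an Express-style middleware on protected routes.

```js
// Sign a JWT at login time and verify it on protected routes.
const jwt = require("jsonwebtoken");

const SECRET = process.env.JWT_SECRET || "dev-only-secret";

// Issued after a successful login.
function issueToken(user) {
  return jwt.sign({ sub: user.id, email: user.email }, SECRET, {
    expiresIn: "1h",
  });
}

// Express-style middleware guarding protected routes.
function identify(req, res, next) {
  const header = req.headers.authorization || "";
  const token = header.startsWith("Bearer ") ? header.slice(7) : null;
  if (!token) return res.status(401).json({ message: "Token missing" });
  try {
    req.user = jwt.verify(token, SECRET); // throws if invalid or expired
    next();
  } catch {
    res.status(401).json({ message: "Invalid or expired token" });
  }
}

module.exports = { issueToken, identify };
```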

3.2.9 Tools and Practices Followed

During the training period, several industry-standard tools and techniques were used to
simulate a real-world development environment. These included:

• Node Version Manager (NVM) for handling multiple Node.js versions

• Visual Studio Code as the primary development IDE

• Postman for testing APIs

• Git and GitHub for version control and code management

• Swagger for documenting RESTful APIs

• Docker for running MongoDB locally in isolated containers

• Linting tools for maintaining code quality

• Environment files (.env) for secure configuration management

4. Expense Tracker API

4.1 Introduction

As part of my backend training and to demonstrate practical understanding of RESTful API development, I developed an Expense Tracker API. This project served as a hands-on application of the concepts learned during the Node.js training, allowing me to implement authentication, routing, middleware, and database integration using modern tools and technologies. The API was designed to manage and track personal expenses efficiently, with features that allow users to perform CRUD operations and filter expenses by category.

4.1.1 Primary Objectives

The primary objective of the Expense Tracker API was to design a simple yet functional
backend service capable of handling real-world use cases such as expense management.
The system enables users to add, retrieve, update, and delete expense entries, while also
providing the ability to categorize and filter them. The inclusion of JWT-based
authentication ensures that the API is secure and follows current industry best practices.

4.1.2 Purpose and Scope of the API

The purpose of this project was to implement a full-featured REST API with proper
routing, data validation, secure authentication, and error handling. The scope was limited
to individual user use, focusing on simplicity, clean structure, and usability for personal
financial tracking. In addition to RESTful endpoints, the API architecture was extended to
support gRPC for inter-service communication, reflecting industry trends toward high-
performance, scalable microservices. The gRPC implementation was achieved using
protocol buffers (.proto files), with dedicated grpcClient.js and grpcServer.js modules to
enable future expansion of the system into a distributed environment.
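
To give a sense of what this wiring involves, the following is a hypothetical reconstruction of a grpcServer.js (the service name, RPC method, and message shapes are assumptions; the actual .proto contract is not reproduced here):

```js
// grpcServer.js (sketch) -- serves one RPC defined in expenses.proto.
//
// Assumed expenses.proto:
//   syntax = "proto3";
//   service ExpenseService {
//     rpc GetExpense (ExpenseRequest) returns (ExpenseReply);
//   }
//   message ExpenseRequest { string id = 1; }
//   message ExpenseReply { string id = 1; double amount = 2; }

const grpc = require("@grpc/grpc-js");
const protoLoader = require("@grpc/proto-loader");

const definition = protoLoader.loadSync("expenses.proto", { keepCase: true });
const proto = grpc.loadPackageDefinition(definition);

const server = new grpc.Server();
server.addService(proto.ExpenseService.service, {
  // Each handler receives the request message and a callback for the reply.
  GetExpense: (call, callback) => {
    callback(null, { id: call.request.id, amount: 0 });
  },
});

server.bindAsync(
  "0.0.0.0:50051",
  grpc.ServerCredentials.createInsecure(),
  (err) => {
    if (err) throw err;
    console.log("gRPC server listening on :50051");
  }
);
```

A matching grpcClient.js would load the same .proto file and invoke GetExpense over the channel, which is what makes the contract-first .proto file the single source of truth for both sides.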

4.2 Features of the Expense Tracker API

Retrieve All Expenses – GET /

Fetches a list of all expense records stored in the system.

Add a New Expense – POST /

Allows a user to create a new expense entry by providing details like amount, description,
category, and date.

Retrieve Expense by ID – GET /:id

Returns details of a specific expense using its unique identifier.

Update an Existing Expense – PATCH /:id

Updates one or more fields of an existing expense by its ID.

Delete an Expense – DELETE /:id

Permanently removes an expense record from the database.

Add Multiple Expenses – POST /addMany

Enables bulk insertion of expenses in a single request.

Filter by Category – GET /category?category=value

Returns expenses that match a specific category name provided as a query parameter.
4.3 Technology Stack

The API was built using the following technologies:

• Node.js – Runtime environment for executing JavaScript on the server.

• Express.js – Web framework used for routing and middleware integration.

• MongoDB – NoSQL database for storing expense records.

• Mongoose – ODM (Object Data Modeling) library for MongoDB.

• Postman – API testing and automation tool.

• JWT (JSON Web Tokens) – For secure user authentication.

4.4 Folder Structure and Key Code Snippets

The project is organized in a modular format to maintain clarity and separation of concerns. Below is a breakdown of key directories and their responsibilities:

Figure 1: Folder Structure 1
Figure 2: Folder Structure 2

• controller/controller.js: Contains core logic for handling CRUD operations on expenses.

• controller/authController.js: Handles user login, registration, and JWT token generation.

• middlewares/identification.js: Validates JWT tokens for securing protected routes.

• models/expenses.js: Defines the schema for storing expenses in MongoDB.

• routes/expenses.js: Maps HTTP methods to controller functions for expenses.

Sample Controller Snippet – Adding a new expense:
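
The snippet itself appeared as a screenshot in the original report. As an indicative sketch (the field names and response shape are assumptions based on the model described above), such a controller typically looks like this:

```js
// controller/controller.js (sketch): create one expense document.
const Expense = require("../models/expenses");

exports.addExpense = async (req, res) => {
  try {
    const { amount, description, category, date } = req.body;
    const expense = await Expense.create({ amount, description, category, date });
    res.status(201).json({ success: true, data: expense });
  } catch (err) {
    // Validation failures from the Mongoose schema end up here.
    res.status(400).json({ success: false, message: err.message });
  }
};
```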

Sample Route Snippet
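
As with the controller, the original route snippet was a screenshot. The sketch below maps the endpoints listed in section 4.2 to hypothetical handler names, with the JWT middleware shown on every route for illustration:

```js
// routes/expenses.js (sketch): endpoint-to-controller mapping.
const express = require("express");
const router = express.Router();
const controller = require("../controller/controller");
const { identification } = require("../middlewares/identification");

router.get("/", identification, controller.getExpenses);
router.post("/", identification, controller.addExpense);
router.post("/addMany", identification, controller.addManyExpenses);
// "/category" is registered before "/:id" so Express does not
// mistake the word "category" for an expense id.
router.get("/category", identification, controller.getByCategory);
router.get("/:id", identification, controller.getExpenseById);
router.patch("/:id", identification, controller.updateExpense);
router.delete("/:id", identification, controller.deleteExpense);

module.exports = router;
```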

4.5 Postman/Swagger Test Case Screenshots

The API was tested using Postman and documented with Swagger UI. Screenshots
include the Swagger endpoint overview and a Postman test of the Signup API with
request and response details.

Figure 3: Swagger UI for Authentication

Figure 4: Swagger UI for Expenses

Figure 5: Postman tool testing

Figure 6: Signup feature testing using Postman

Figure 7: Get expenses feature testing using Postman

5. Blog-Post API

5.1 Introduction

As part of my self-paced learning and to reinforce the backend development concepts covered during the Node.js training, I independently developed a Blog Post API. This project was designed to simulate a real-world content management system (CMS) use case, enabling the creation, retrieval, updating, and deletion of blog posts. The project emphasized modular coding practices, RESTful API design principles, and secure authentication mechanisms using modern development tools.

This project served as both a learning milestone and a proof of concept for building
scalable, maintainable APIs using the Node.js ecosystem. It allowed me to apply practical
knowledge in a structured way, gaining confidence in designing backend services from
scratch.

5.1.1 Project Objectives

The primary goal of the Blog Post API was to build a functional REST API for managing
blog posts, focusing on clean design, modularity, and secure access control. Through this
self-guided build, I aimed to strengthen my understanding of:

• RESTful route handling

• Secure authentication using JWT

• Data modeling using MongoDB

• Proper error handling and status code management

• Postman-based API testing

5.1.2 Purpose and Scope of the API

The purpose of the Blog Post API is to provide a structured and secure backend service
for managing blog content in a scalable and modular manner. This API enables users to
perform essential content management operations such as creating, reading, updating, and
deleting blog posts. It serves as a foundational learning project, allowing practical
implementation of RESTful principles, middleware integration, authentication, and
database interaction.

The scope of the API is intentionally focused on core CRUD functionalities within a
single-user or small-scale blogging platform context. Security is enforced through JWT-
based authentication to ensure protected access to write operations. While the current
implementation is streamlined for simplicity and educational clarity, it lays the
groundwork for potential future enhancements such as user roles, commenting systems,
content categorization, and public-facing read-only endpoints.

5.2 Key Features Implemented

The Blog Post API was designed with a focus on both user authentication and content
management functionality. Below are the core features that were implemented as part of
the API development:

5.2.1 User Authentication and Security

• POST /signup – Registers a new user in the system.

• POST /login – Authenticates user credentials and issues a JWT token.

• POST /logout – Logs the user out by invalidating the session using an identifier.

• PATCH /send-verification-code – Sends a verification code to the user for account validation.

• PATCH /verify-verification-code – Verifies the code provided by the user to confirm identity.

• PATCH /change-password – Allows authenticated users to change their password securely.

• PATCH /send-forgot-password-code – Sends a password reset code to the user's registered email.

• PATCH /verify-forgot-password-code – Verifies the code for the password reset flow.

5.2.2 Blog Post Management

• GET /all-posts – Retrieves a list of all blog posts available.

• GET /single-post – Fetches details of a specific blog post.

• POST /create-post – Allows an authenticated user to create a new blog post.

• PUT /update-post – Enables authenticated users to update an existing post.

• DELETE /delete-post – Deletes a specific post authored by the authenticated user.

5.3 Technology Stack

• Node.js – Server-side JavaScript runtime

• Express.js – Web framework for routing and middleware

• MongoDB – NoSQL database for storing blog data

• Mongoose – ODM for defining schemas and interacting with MongoDB

• JWT (JSON Web Token) – Secure authentication mechanism

• Postman – Used for testing and validating all API endpoints

5.4 Folder Structure and Key Code Snippets

The Blog Post API project is organized following a modular and scalable folder structure,
which promotes separation of concerns, code maintainability, and clarity. Below is an
overview of the key components and their responsibilities:

• controllers/authControllers.js: Handles user authentication, including signup, login, verification, and password reset functionalities.

• controllers/postsControllers.js: Manages CRUD operations related to blog posts.

• middlewares/identification.js: Authenticates requests by validating JWT tokens for protected routes.

• middlewares/validator.js: Validates request data to ensure proper format and structure.

• models/userModel.js: Defines the schema for storing user information in the database.

• models/postsModel.js: Defines the schema for storing blog post data.

• routes/authRoutes.js: Maps authentication-related endpoints to their controller functions.

• routes/postsRouter.js: Maps blog post-related endpoints to the respective controllers.

• utils/hashing.js: Provides utility functions for password hashing.

• utils/sendEmail.js: Sends verification and password reset emails to users.

• index.js: Initializes the server and applies middleware and routing.

Sample Controller Snippet – Create Post
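
The original snippet was embedded as an image. An indicative sketch of a create-post handler (schema fields and the user-id claim are assumptions) is:

```js
// controllers/postsControllers.js (sketch): create a blog post.
const Post = require("../models/postsModel");

exports.createPost = async (req, res) => {
  try {
    const { title, description } = req.body;
    // The identification middleware is assumed to attach the decoded
    // JWT payload to req.user.
    const post = await Post.create({ title, description, userId: req.user.userId });
    res.status(201).json({ success: true, data: post });
  } catch (err) {
    res.status(400).json({ success: false, message: err.message });
  }
};
```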

Sample Route Snippet
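
Again as a sketch rather than the original screenshot, the router below wires the endpoints from section 5.2.2, protecting the write operations with the JWT middleware as described in section 5.1.2 (whether the read routes were also protected is not specified; handler names are assumptions):

```js
// routes/postsRouter.js (sketch): write routes guarded by JWT middleware.
const express = require("express");
const router = express.Router();
const posts = require("../controllers/postsControllers");
const { identification } = require("../middlewares/identification");

router.get("/all-posts", posts.getAllPosts);
router.get("/single-post", posts.getSinglePost);
router.post("/create-post", identification, posts.createPost);
router.put("/update-post", identification, posts.updatePost);
router.delete("/delete-post", identification, posts.deletePost);

module.exports = router;
```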

5.5 Postman test case screenshots

The Blog Post API was tested using Postman. The following screenshots showcase the
API endpoints and demonstrate request and response details, including a test of the
signup, login, get all posts and delete post functionality.

Figure 8: Signup feature testing using Postman

Figure 9: Login feature testing using Postman

Figure 10: Get all posts feature testing using Postman

Figure 11: Delete -Post feature testing using Postman

6. Documentation of IO Flow

6.1 Introduction

The documentation was prepared for an internal enterprise service named IO Flow. IO
Flow is a proprietary platform developed to automate and manage workflow processes
within the organization. It plays a critical role in orchestrating tasks, streamlining
communication between services, and ensuring the consistency and reliability of business
operations.

6.2 Tools Used for Documentation

6.2.1 Tools Used for Understanding the Service

To gain a comprehensive understanding of the IO Flow service, the following tool was
utilized:

• Swagger UI: Employed to explore available API endpoints, examine request and
response structures, and analyze the overall functionality of the service.

6.2.2 Tools Used for Preparing Documentation

The following tools were used to create and organize the technical documentation:

• Swagger UI Snapshots: Used to include visual references of endpoint specifications and sample API responses.

• Microsoft Word: Utilized to compile and format the documentation into a structured and readable document for internal distribution.

6.3 Objective of the Documentation

The primary objective of the documentation was to produce a clear, structured, and
comprehensive reference guide for the IO Flow service. The documentation was intended
to:

• Improve understanding of complex workflows and service interactions across modules.

• Serve as a technical reference for developers, testers, and stakeholders.

• Enhance team communication and alignment by providing a shared understanding of the system.

6.4 Outcome and Importance

The documentation provided the following benefits to the project and team:

• Enabled team members to understand the service architecture, endpoints, and workflows efficiently.

• Reduced onboarding time for new developers by offering a consolidated and easily accessible guide.

• Improved consistency in service usage and integration by clearly defining endpoint behaviors and data formats.

• Supported effective collaboration between development, QA, and product management teams.

• Contributed to the long-term maintainability and scalability of the service by preserving critical knowledge.

6.5 Challenges and Key Insights

The documentation process involved navigating initially undocumented or partially documented endpoints. This challenge was addressed through extensive testing and exploration using Swagger UI.

Key insights gained during the process include:

• The significance of standardized API design, including naming conventions, response structures, and status codes.

• The importance of structured and well-maintained internal documentation in supporting agile development and cross-functional collaboration.

• The value of clear technical documentation in reducing dependency on verbal knowledge transfer and improving development efficiency.

7. HRMS Project (RBAC Authentication Service)

7.1 Introduction

Role-Based Access Control (RBAC) is a method of managing access to a system based on the roles assigned to users. This security model is essential for organizations that need to control which users can access specific parts of a system based on their assigned role. Instead of granting permissions to individual users, roles are created, and users are assigned to these roles. Each role then has permissions that define what users can and cannot do within the system.

For the HRMS (Human Resource Management System), the development of an RBAC
authentication service was essential. It ensures that users access the system securely and
in alignment with their responsibilities. This authentication service helps maintain a
structured, secure, and scalable system by ensuring that users can only interact with
relevant features based on their roles.

7.2 Objectives

The RBAC authentication service was developed with the following objectives:

• Secure Access Control: The primary purpose of the RBAC authentication service
was to implement a secure way of managing user access. By creating different
roles (Admin, HR, Employee, etc.), each role is associated with specific
permissions, ensuring that only authorized users can access sensitive parts of the
HRMS system.

• Simplify Role and Permission Management: The service provides a structured approach to managing user access by grouping permissions under specific roles, making it easier for administrators to assign access rights rather than managing individual user permissions.

• Ensure Compliance and Security: The RBAC service helps the organization
comply with security standards and regulations by preventing unauthorized access
to sensitive data and ensuring that users only have access to the parts of the
system that are necessary for their job.

• Enhance Scalability: As the HRMS system grows and more users are added, the
RBAC authentication service is designed to scale efficiently. New roles and
permissions can be easily integrated into the system as the organization's needs
evolve.

7.3 Technology Stack

The RBAC authentication service was developed using the following technical stack:

• Java Spring Boot: Spring Boot was chosen for its robust, scalable, and secure
features, making it the ideal choice for building the authentication service. Spring
Security was integrated to handle user authentication and authorization, ensuring
users are properly authenticated and their access rights are assigned based on their
roles.

• MySQL: The database layer is built using MySQL, which provides reliable and
efficient relational data storage for user, role, permission, and module information.
MySQL’s relational structure supports the entities and relationships necessary for
a dynamic and scalable RBAC system.

7.4 Impact on the Overall HRMS Project

The development of the RBAC authentication service has had a substantial impact on the
HRMS project by laying the foundation for secure user management and role-based
access control:

• Enhanced Security: By implementing RBAC, the service ensures that each user
has access only to the parts of the HRMS system relevant to their role. This
reduces the risk of unauthorized data access and strengthens the security of the
system.

• Simplified User Management: The RBAC service simplifies the process of managing user access by grouping permissions into roles. This allows administrators to easily assign and manage access at a higher level, rather than managing permissions individually for each user.

• Scalability and Flexibility: As the HRMS system continues to develop and grow, the RBAC authentication service allows for easy addition of new roles and permissions without disrupting the system’s functionality. This flexibility ensures the HRMS can evolve as the organization’s needs change.

• Compliance and Auditing: The service helps the HRMS meet security and
compliance standards by providing a structured method for controlling access and
ensuring that sensitive data is protected. It also facilitates auditing by tracking
which users have access to specific features within the system.

• Foundation for Further Development: The RBAC authentication service serves as a crucial foundational element for the HRMS system. As other modules (such as payroll, leave management, etc.) are developed, the authentication service ensures that the appropriate roles and permissions are enforced across the system.

Through the development of the RBAC authentication service, the project has gained a
secure, scalable, and flexible user management system, which is essential for the further
development of the HRMS platform.

7.5 Sample Code Snippets

Login Controller

This snippet demonstrates the controller responsible for handling user login requests. It
includes endpoint mappings for authenticating users and issuing tokens based on their
credentials and roles. This forms the entry point to the secured HRMS system, ensuring
only authorized users can access protected resources.

Role Request DTO

This snippet represents the Data Transfer Object (DTO) used for handling role creation or
modification requests. It encapsulates role data passed from the client and ensures a clean
separation between the API layer and internal data models, improving code
maintainability and clarity.

8. Twitter (X) Sentiment Analysis

8.1 Introduction

This chapter describes the development of a Twitter Sentiment Analysis service created
as part of a larger project. The objective was to automate the process of fetching tweets
mentioning a specific handle, analyze the sentiment of each post using Natural Language
Processing (NLP) techniques, and store the processed data in a structured database. The
system was designed to operate on a scheduled basis to ensure continuous sentiment
monitoring.

8.1.1 Purpose and Scope

The primary purpose of the Twitter Sentiment Analysis service was to build an automated
system capable of:

• Fetching live tweets where a particular Twitter handle is mentioned.

• Performing sentiment analysis on each fetched tweet.

• Categorizing each post as positive, negative, or neutral.

• Storing the original tweet content along with its sentiment result into a local
database.

• Executing this process periodically, without manual intervention.

This service was developed as a supporting component for a larger system, with an
emphasis on automation, accuracy, and efficiency.

8.2 Key Features

• Automated Tweet Retrieval: Tweets mentioning a specific handle were fetched programmatically using the Twitter API.

• Sentiment Analysis: Implemented using transformer models, ensuring deep contextual understanding of tweet content.

• Database Storage: All tweet content and sentiment outcomes were stored in a SQLite database for easy access and traceability.

8.3 Project Structure and Key Code Snippets

• main.py: Entry point of the application. Initializes and coordinates the execution of tweet fetching, sentiment analysis, and database storage.

• twitter/fetcher.py: Connects to the Twitter API, retrieves tweets that mention a specific handle, and passes them on for analysis.

• twitter/sentiment.py: Performs sentiment analysis on fetched tweets using transformer-based models and NLP libraries.

• scheduler/job.py: Contains the scheduled task logic that periodically triggers the entire data fetching and analysis pipeline.

• db/models.py: Defines the database schema for storing tweet content and corresponding sentiment labels using SQLite.

• view_results.py: A utility script to display or query results stored in the SQLite database for verification or reporting.

• requirements.txt: Lists all Python dependencies required by the project, including libraries like transformers, torch, fugashi, unidic-lite, and ipadic.

• tweets.db: The SQLite database file where tweet data and sentiment analysis results are stored.

• .env: Holds environment-specific variables such as Twitter API credentials and other configurations.

8.4 Technology Stack

• Programming Language: Python

• Database: SQLite3

• Libraries & Frameworks:

o transformers – for sentiment model inference

o torch – PyTorch deep learning backend

o fugashi, unidic-lite, ipadic – for Japanese language tokenization

• Job Scheduling: Built-in Python scheduler using time and loop logic, or APScheduler

• External Integration: Twitter API v2 via tweepy

8.5 Code Snippets

Fetcher snippet: Fetches posts from Twitter (X).

Sentiment snippet: Performs sentiment analysis.

Job snippet: Runs the pipeline at fixed intervals.

9. Current Work

9.1 Introduction

As part of the ongoing responsibilities during the internship engagement, current efforts
are focused on comprehensively understanding three important internal microservices
within the enterprise system architecture — the User Management Service, the Page
Service, and the Widget Service. These services are essential components of the broader
product ecosystem, playing distinct roles in user handling, dynamic content generation,
and visual composition within the platform.

9.2 Purpose and Scope

The primary objective of this phase is to develop a clear and practical understanding of
how these services function individually and interact with each other. This includes
reviewing the codebase, exploring the API structure, and participating in real tasks under
the guidance of an industry mentor. This preparation is essential for contributing
effectively to the ongoing development and maintenance of the system.

9.3 Services Under Study

9.3.1 User Management Service

This service is responsible for handling user-related data and functionality. It manages information such as:

• Full Name (including middle name, nickname, and local name)

• Personal attributes like likes and dislikes

• Structured user profile data used across various modules in the application

9.3.2 Page Service

This service is responsible for managing the creation and configuration of dynamic
"pages" within the platform. These pages often act as containers that host various
functional or visual components.

9.3.3 Widget Service

Closely linked with the Page Service, the Widget Service handles the creation,
configuration, and rendering logic for individual widgets. Widgets are the modular
elements that populate pages, providing both content and interactivity.

9.4 Approach to Understanding

To ensure a systematic and effective learning process, the following methods are being
utilized:

• Swagger UI: Reviewing and experimenting with available endpoints to understand API flows and data structures.

• Codebase Analysis: Going through the service repositories to understand internal logic, models, and business rules.

• Mentor-Led Discussions: Regular sessions with the industry mentor for clarification of complex logic, design rationale, and architectural decisions.

• Documentation and Notes: Maintaining structured notes to document the understanding of services, endpoints, and workflows for future reference.

9.5 Impact on Overall Project

Understanding these services in depth enables better alignment with the organization’s
product goals. Since they represent critical infrastructure components, mastering them
enhances the ability to take on future implementation and debugging tasks more
confidently. It also ensures that any feature development or customization work is built
on a solid understanding of the system’s architecture and data flows.

10. EncureIT Internship Overview

10.1 Introduction

Before beginning the current engagement at Iauro, I completed a one-month structured training program under EncureIT, which provided a solid foundation for entering the tech industry. This program was designed to establish fundamental knowledge in front-end development, programming logic, and databases. The training covered essential web technologies and development tools through theoretical sessions and practical, hands-on assignments.

10.2 Topics Covered

Over the course of the training, I acquired skills in the following key areas:

10.2.1 HTML and CSS

• HTML Topics: Introduction to HTML, tags, forms, media embedding, tables, and
webpage structure.

• CSS Topics: Types of CSS (inline, internal, external), layout management using
Flexbox and Grid, and application of various style properties.

10.2.2 Bootstrap Framework

• Gained an understanding of Bootstrap 5 classes for responsive design.

• Implemented UI components such as accordions, cards, modals, navbars, buttons, and carousels.

• Built real-world web page replications including the EncureIT and Netflix
websites using responsive techniques.

10.2.3 JavaScript and jQuery

• Explored JavaScript syntax, variables, data types, functions, objects, loops, and
events.

• Integrated form validations using both JavaScript and jQuery.

• Created interactive web pages with basic dynamic behavior.

10.2.4 SQL and MySQL

• Learned database fundamentals and how to interact with MySQL using DDL,
DML, DCL, TCL, and DQL commands.

• Applied clauses such as WHERE, GROUP BY, HAVING, ORDER BY, and
JOINs (INNER, LEFT, RIGHT, etc.).

• Practiced with views, stored procedures, transactions, and triggers.

10.2.5 PHP (Basics)

• Understood how PHP works in a request-response cycle.

• Implemented PHP features like variables, data types, loops, functions, arrays,
and file handling operations.

• Practiced creating simple server-side applications.

10.3 Tools and Platforms Used

• VS Code – for code editing and project development

• XAMPP – for running local PHP and MySQL server environments

• phpMyAdmin – for database interaction

• Bootstrap CDN – for responsive design

• Browser DevTools – for inspecting and testing UI changes

10.4 Learning and Outcomes

This training program played a crucial role in strengthening my understanding of the core
building blocks of web development. I gained:

• A strong foundation in full-stack basics

• Confidence in writing clean and structured HTML/CSS

• Proficiency in using Bootstrap for responsive designs

• Functional knowledge of JavaScript and jQuery

• Hands-on experience in MySQL database operations

• An introductory experience in PHP for backend scripting

Overall, this experience enhanced my problem-solving skills, improved my coding discipline, and made me more prepared for real-world development tasks, laying a solid base for my ongoing work at Iauro.

11. Conclusion

Throughout the course of this internship, I experienced significant professional and technical growth. The opportunity to engage in real-time projects exposed me to the practical side of software development, bridging the gap between academic knowledge and industry expectations. It allowed me to understand how real-world applications are structured, developed, and maintained in collaborative environments.

I learned how to navigate live codebases, understand API documentation, and work with
development tools in a production-like setting. Exposure to modular software design,
version control practices, and regular code reviews enhanced my ability to write cleaner,
maintainable code and understand software systems at a broader level.

Additionally, participation in team discussions, sprint planning, and technical brainstorming sessions helped me improve my communication and problem-solving skills. It cultivated a sense of ownership, accountability, and professionalism in how I approached each task.

This internship has not only expanded my technical toolkit but also shaped my mindset
toward continuous learning and collaboration. It has played a pivotal role in aligning my
career direction with full-stack software development and has given me the confidence to
contribute meaningfully in a professional software engineering environment.

