Janus Henderson Investors - Tax & Statements, Covance Softsol Limited
ETL Developer | Sep 2025 to Feb 2026
Description: Janus Henderson Investors is a global asset management firm managing investment funds for institutional and retail clients worldwide. The company provides portfolio management, investment advisory, and financial services with a strong focus on regulatory reporting, investor statements, and tax reporting. Built configuration-driven pipelines using AWS services such as Amazon DynamoDB, AWS Lambda, Amazon S3, and AWS Glue to process client files and prepare reports.
Roles & Responsibilities:
- Designed and maintained dynamic configuration-driven data pipelines using Amazon
DynamoDB to control processing logic based on client and file types.
- Implemented event-driven file processing workflows triggered by file uploads to Amazon
S3.
- Developed AWS Lambda functions to read configuration from DynamoDB and determine the
sequence of processing steps for different file types.
- Configured environment variables in Lambda to dynamically trigger appropriate workflows
in AWS Step Functions.
- Built and maintained AWS Step Functions orchestration workflows to coordinate execution
of multiple Lambda functions and AWS Glue jobs.
- Developed AWS Glue Spark jobs to process large-scale datasets, perform data
transformations, and generate client-specific reports.
- Implemented distributed data processing using Apache Spark to efficiently handle
high-volume financial and transactional data.
- Automated report generation workflows based on processed data to support tax reporting
and investor statements.
- Created Lambda-based utilities for sending automated email notifications upon job
completion, failure, or status updates.
- Maintained workflow tracking and execution logs in DynamoDB, storing file name,
processing status, step details, and unique execution IDs.
- Implemented error handling, retry mechanisms, and status tracking across Lambda, Step
Functions, and Glue jobs to ensure reliable pipeline execution.
- Monitored and troubleshot data processing pipelines, resolving failures and optimizing
performance for large data workloads.
- Collaborated with business stakeholders to translate client reporting requirements into
scalable ETL workflows.
- Ensured data quality and validation checks before processing files to maintain accuracy
in financial reporting outputs.
- Monitored and debugged serverless ETL pipelines using Amazon CloudWatch logs and
metrics.
- Implemented event-driven workflow monitoring using Amazon EventBridge.
- Troubleshot pipeline failures by analyzing CloudWatch logs, Step Functions execution history, and event triggers.
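The configuration-driven dispatch described above (a Lambda reading per-client config and starting the matching Step Functions workflow) can be sketched as follows. The table contents, key convention, and workflow names are illustrative assumptions; the real DynamoDB and Step Functions calls are shown as boto3 comments so the routing logic itself stays runnable:

```python
# Sketch of config-driven routing: given the S3 key of an uploaded client
# file, look up the ordered processing steps and the Step Functions state
# machine to start. In the real pipeline the config lives in a DynamoDB
# table (boto3 get_item); an in-memory dict stands in for it here.

PIPELINE_CONFIG = {
    # (client_id, file_type) -> processing recipe (illustrative values)
    ("clientA", "tax_lot"): {
        "steps": ["validate", "transform", "generate_report"],
        "state_machine": "tax-reporting-workflow",
    },
    ("clientA", "positions"): {
        "steps": ["validate", "transform"],
        "state_machine": "statement-workflow",
    },
}


def resolve_workflow(s3_key: str) -> dict:
    """Derive client and file type from the uploaded key and return the recipe."""
    # Assumed key convention: <client_id>/<file_type>/<filename>
    client_id, file_type, _ = s3_key.split("/", 2)
    config = PIPELINE_CONFIG.get((client_id, file_type))
    if config is None:
        raise ValueError(f"No pipeline config for {client_id}/{file_type}")
    # The Lambda handler would then start the orchestration:
    #   sfn = boto3.client("stepfunctions")
    #   sfn.start_execution(stateMachineArn=..., input=json.dumps(config))
    return config


print(resolve_workflow("clientA/tax_lot/2025-09-30.csv")["state_machine"])
```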
Environment: Python, AWS Lambda, Amazon DynamoDB, AWS Step Functions, Amazon S3, AWS Glue (Spark Jobs), Amazon CloudFront, Amazon EventBridge, JIRA, Agile and CI/CD.
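The workflow-tracking records kept in DynamoDB (file name, processing status, step details, unique execution ID) can be sketched like this. Attribute names and status values are illustrative assumptions; the actual DynamoDB writes are left as comments so the sketch runs without AWS credentials:

```python
# Sketch of the per-file execution-tracking record stored in DynamoDB,
# and of how step statuses roll up into an overall pipeline status.

import uuid
from datetime import datetime, timezone


def new_tracking_item(file_name: str, steps: list) -> dict:
    """Create a fresh tracking record for one processed file."""
    return {
        "execution_id": str(uuid.uuid4()),  # unique ID, partition key
        "file_name": file_name,
        "status": "IN_PROGRESS",
        "steps": {step: "PENDING" for step in steps},
        "started_at": datetime.now(timezone.utc).isoformat(),
    }


def mark_step(item: dict, step: str, status: str) -> dict:
    """Record one step's outcome; flip the overall status when done."""
    item["steps"][step] = status
    if status == "FAILED":
        item["status"] = "FAILED"
    elif all(s == "SUCCEEDED" for s in item["steps"].values()):
        item["status"] = "SUCCEEDED"
    # Real code would persist this:
    #   dynamodb.update_item(Key={"execution_id": item["execution_id"]}, ...)
    return item


item = new_tracking_item("clientA/tax_lot/2025-09-30.csv", ["validate", "transform"])
mark_step(item, "validate", "SUCCEEDED")
mark_step(item, "transform", "SUCCEEDED")
print(item["status"])  # SUCCEEDED
```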
SECHUB - Vulnerability Management Tool, Publicis Groupe (TLG India Pvt Ltd.)
Full Stack Developer | Dec 2022 to Jan 2024
Description: SECHUB is an aggregator tool that maintains vulnerability management data and prepares reports for various clients based on user groups. SECHUB ingests data from applications such as Rapid7 InsightVM, CyCognito, Wiz, AlertLogic, Ermetic, Bitsight, Pentest, and ASV scans. Data is ingested from these applications via Azure Data Factory pipelines, transformed, and loaded into MS SQL Server. The front end is developed using Microsoft Power Apps and Power Automate flows. The main modules in this project are Assets, Vulnerabilities, Solutions, Asset Tags, and Vulnerability Instances.
Roles & Responsibilities:
- Analyzed and prepared requirements and design documents as per client requirements.
- Worked with the REST API documentation of the different source applications to analyze the data to be implemented in the ingestion.
- Prepared and implemented Data Factory pipelines for ingesting required data and performing data cleaning.
- Transformed data in dataflows as per the requirements and ran batch processing of pipelines.
- Designed and implemented tables in MS SQL Server per the requirements documents.
- Wrote complex procedures and functions to prepare data for end-user reports.
- Wrote triggers on tables to ingest required data from other tables and maintain data integrity.
- Wrote complex queries based on end-user requirements and prepared views from them.
- Created front-end pages using Microsoft Power Apps and bound them to data from MS SQL Server.
- Prepared Power Automate flows that run automatically based on data changes and user actions, such as button controls or sending emails.
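The per-source normalization done in the dataflows before loading into MS SQL Server can be sketched as below. Each scanner reports severity on its own scale, so records are mapped onto one common Vulnerabilities schema; the field names and severity mapping here are illustrative assumptions, not the actual SECHUB schema:

```python
# Sketch of normalizing raw scanner records onto a common schema,
# as the Data Factory dataflows do before the SQL Server load.

SEVERITY_MAP = {
    # Illustrative per-source severity scales mapped to one numeric scale
    "rapid7": {"Critical": 4, "Severe": 3, "Moderate": 2},
    "wiz": {"CRITICAL": 4, "HIGH": 3, "MEDIUM": 2, "LOW": 1},
}


def normalize(record: dict) -> dict:
    """Map one raw scanner record onto the common Vulnerabilities schema."""
    source = record["source"]
    return {
        "asset_id": record["asset"].lower(),  # unify asset keys across sources
        "vuln_id": record["id"],
        "severity": SEVERITY_MAP[source][record["severity"]],
        "source": source,
    }


rows = [
    {"source": "rapid7", "asset": "HOST-01", "id": "CVE-2024-0001", "severity": "Severe"},
    {"source": "wiz", "asset": "host-01", "id": "CVE-2024-0001", "severity": "HIGH"},
]
normalized = [normalize(r) for r in rows]
# Both sources now agree on the asset key and numeric severity:
print(all(r["severity"] == 3 for r in normalized))  # True
```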
Environment: Azure Data Factory, MS SQL Server, MySQL, REST API, Azure DevOps, ServiceNow, Microsoft Power Apps, Power Automate, CI/CD
Decision Desk HQ - Reporting Platform, Skilldrive Infotech
Full Stack Developer | Jun 2021 to Dec 2022
Description: Decision Desk HQ is an American website that focuses on reporting election results in the United States. Decision Desk HQ uses an application programming interface (API) to get election results at the same time they are published on websites provided by election officials. This is a web application with various REST APIs. We worked on the design and implementation of the UI and APIs. The main modules in this project are Places, Parties, and Voting Results.
Roles & Responsibilities:
- Prepared design requirements and models for the system, covering the Software Requirements Specification (SRS) document.
- Worked on front-end, system administration, and database architecture tasks, including the following.
- Designed and developed APIs (RESTful web services) and optimized back-end performance.
- Used various third-party tools for preparing reports and results.
- Converted images and MS Office files to PDF.
- Built custom Django and PostgreSQL ORM models and queries, allowing users to see and download detailed requests.
- Fixed bugs for issues raised based on client requirements.
- Remodelled and updated the existing models according to the new SRS document.
- Prepared UI inputs as per client requirements and implemented report download options.
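The aggregation behind the Places / Parties / Voting Results modules can be sketched as follows. In the project this was a Django ORM query over PostgreSQL (roughly `VotingResult.objects.values("place", "party").annotate(total=Sum("votes"))`); the pure-Python equivalent below keeps the sketch runnable, and all field names are illustrative:

```python
# Sketch of summing reported votes per (place, party) as results come in,
# mirroring the Django/PostgreSQL ORM aggregation used in the project.

from collections import defaultdict


def totals_by_place_and_party(results: list) -> dict:
    """Aggregate vote totals per (place, party) across incoming reports."""
    totals = defaultdict(int)
    for r in results:
        totals[(r["place"], r["party"])] += r["votes"]
    return dict(totals)


reports = [
    {"place": "County A", "party": "Party X", "votes": 1200},
    {"place": "County A", "party": "Party Y", "votes": 900},
    {"place": "County A", "party": "Party X", "votes": 300},  # later precinct report
]
print(totals_by_place_and_party(reports)[("County A", "Party X")])  # 1500
```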
Environment: Django, MySQL, JavaScript, PostgreSQL, Windows, REST API,
Docker, JIRA, GIT, SCRUM
Customer 360 - Support Application, Barclays Bank
Backend Developer | Nov 2019 to May 2021
Description: Customer 360 is a web application for bank customer support agents to support and validate customer information. This application manages data related to customers. We managed both REST APIs and UI functionalities as part of this application. The major modules in this application support customers' debit cards, account details, service requests, etc., along with reports on other Barclays Bank products and services prepared for supporting agents.
Roles & Responsibilities:
- Prepared data models for storing customers' data and supporting information, and implemented constraints on the data.
- Designed data models for the system covering the Software Requirements Specification (SRS) document.
- Implemented user interface guidelines and standards throughout the development and maintenance of the website using HTML5, CSS3, JavaScript, jQuery, and AngularJS.
- Created methods (GET, POST, PUT, and DELETE) to make requests to the API server and tested RESTful APIs using Postman.
- Performed joins, group-by, and other operations for back-end development.
- Updated and altered existing data models for changes in the architecture and design.
- Implemented Django ORM methods to query data from the database.
- Created various functions, views, and procedures for querying, performing complex joins, and query optimization.
- Developed and optimized complex database queries using the Django ORM, including filters, grouping, joins, and aggregations to efficiently retrieve and process application data.
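The join-and-group query pattern in the last bullet can be sketched like this: join customers to their service requests and count open requests per customer. In the project this was a Django ORM query (roughly `Customer.objects.annotate(open_count=Count("requests", filter=Q(requests__status="OPEN")))`); the pure-Python version below keeps the sketch runnable, and all names are illustrative:

```python
# Sketch of joining customers to service requests and counting OPEN
# requests per customer, mirroring the Django ORM filter/join/aggregate.

from collections import Counter


def open_requests_per_customer(customers: list, requests: list) -> dict:
    """Left-join customers to requests, counting OPEN requests per customer."""
    open_counts = Counter(
        r["customer_id"] for r in requests if r["status"] == "OPEN"
    )
    # Customers with no open requests still appear, with a count of 0.
    return {c["id"]: open_counts.get(c["id"], 0) for c in customers}


customers = [{"id": 1, "name": "A"}, {"id": 2, "name": "B"}]
requests = [
    {"customer_id": 1, "status": "OPEN"},
    {"customer_id": 1, "status": "CLOSED"},
    {"customer_id": 2, "status": "OPEN"},
]
print(open_requests_per_customer(customers, requests))  # {1: 1, 2: 1}
```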
Environment: Django, Postman, MySQL, SQL, RESTful Web Services, JIRA, Agile.