S HARIKRISHNA

Web Application Developer

SUMMARY

5+ years of experience as a Web Application Developer using Python, Django and Django REST Framework.
Hands-on experience in web design using HTML5, CSS3, JavaScript, Bootstrap, jQuery and AngularJS.
Hands-on experience in the design and development of web applications using Python, Django and MySQL, and in data modeling using Django ORM modeling and mapping techniques.
Hands-on experience in API testing using Postman, Swagger and manual testing.
Good experience in writing SQL and PL/SQL queries based on the functional specification documents provided by the client.
Hands-on experience in data processing and ETL using Azure Data Factory, Azure Synapse Analytics and Azure Data Lake Storage.
Efficient in preparing and managing pipelines in Azure Data Factory and transforming data using Data Flow activities.
Experience with Microsoft Power Apps and Power Flows to create web applications per client requirements.
Experience in maintaining CI/CD pipelines in Azure Data Factory and Azure DevOps.
Good knowledge of Django management commands and shell commands.
Experience using IDEs such as Visual Studio, Visual Studio Code and PyCharm.
Experience with Django REST Framework for creating RESTful API services (a brief sketch follows this summary), and good experience with django-allauth for various authentication and login methods.
Experience with Git commands and Git-based workflow version control processes.
Developed and maintained serverless ETL pipelines using AWS Lambda, AWS Glue (Spark), and AWS Step Functions to process large datasets and generate client-specific financial reports.
Implemented dynamic, configuration-driven workflows using Amazon S3 and DynamoDB for automated file ingestion, processing orchestration, and job status tracking.
Monitored and debugged serverless ETL pipelines using Amazon CloudWatch logs and metrics, and troubleshot workflow failures by analyzing Amazon EventBridge events and Step Functions execution history.
Good written and oral communication and interpersonal skills, with a strong ability to excel through collaboration with team members.
Extensive experience working in Agile environments and participating in Scrum sessions covering analysis, design, implementation and production support.
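
As a brief illustration of the Django REST Framework experience noted above, the following is a minimal sketch of a serializer and viewset; the app, model, and field names are hypothetical and not taken from any specific project.

    # Minimal Django REST Framework sketch: a serializer plus a model viewset.
    # The app, model, and field names (myapp, Report, title, created_at) are illustrative only.
    from rest_framework import routers, serializers, viewsets

    from myapp.models import Report  # hypothetical app and model


    class ReportSerializer(serializers.ModelSerializer):
        class Meta:
            model = Report
            fields = ["id", "title", "created_at"]


    class ReportViewSet(viewsets.ModelViewSet):
        """Exposes list/retrieve/create/update/delete endpoints for Report."""
        queryset = Report.objects.all()
        serializer_class = ReportSerializer


    # Typical URL wiring: the router generates /reports/ and /reports/<pk>/ routes.
    router = routers.DefaultRouter()
    router.register(r"reports", ReportViewSet)
    urlpatterns = router.urls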

CERTIFICATIONS

Microsoft Certified: Azure Data Fundamentals
Microsoft Certified: Azure Fundamentals
Microsoft Certified: Fabric Data Engineer Associate
Microsoft Certified: Azure Data Engineer Associate

TECHNICAL SKILLS

Primary Languages
Python 3.7, Core Java, C, C++
Frameworks
Django, Bootstrap, jQuery, AngularJS, Django REST Framework, ReportLab
Azure Services
Azure Data Factory, Microsoft Power Apps, Power Flows, Azure DevOps
AWS Services
AWS S3, AWS Lambda, AWS Glue (Spark), Amazon EventBridge, Amazon CloudWatch, AWS Step Functions
Databases
Oracle 10g/11g, MS SQL Server, MySQL, SQLite3, Firebase, MS Access, AWS DynamoDB
IDEs & Servers
Visual Studio, Visual Studio Code, PyCharm, Eclipse PyDev, IIS, Apache Tomcat, XAMPP
Web Technologies & SDLC Methods
HTML5, CSS3, DHTML, XML, JavaScript, Scrum, Agile
Testing Frameworks & Other
Robot Framework, Postman, Swagger, Windows 7/10, GitHub, Git, Bitbucket, JIRA

EXPERIENCE

Janus Henderson Investors - Tax & Statements, Covance Softsol Limited

ETL Developer | Sep 2025 to Feb 2026

Description: Janus Henderson Investors is a global asset management firm managing investment funds for institutional and retail clients worldwide. The company provides portfolio management, investment advisory, and financial services with a strong focus on regulatory reporting, investor statements, and tax reporting. The project builds configuration-driven pipelines using AWS services such as DynamoDB, AWS Lambda, AWS S3 and AWS Glue to process client files and prepare reports.

Roles & Responsibilities:

  • Designed and maintained dynamic configuration-driven data pipelines using Amazon DynamoDB to control processing logic based on client and file types.
  • Implemented event-driven file processing workflows triggered by file uploads to Amazon S3.
  • Developed AWS Lambda functions to read configuration from DynamoDB and determine the sequence of processing steps for different file types (a sketch follows at the end of this role).
  • Configured environment variables in Lambda to dynamically trigger appropriate workflows in AWS Step Functions.
  • Built and maintained AWS Step Functions orchestration workflows to coordinate execution of multiple Lambda functions and AWS Glue jobs.
  • Developed AWS Glue Spark jobs to process large-scale datasets, perform data transformations, and generate client-specific reports.
  • Implemented distributed data processing using Apache Spark to efficiently handle high-volume financial and transactional data.
  • Automated report generation workflows based on processed data to support tax reporting and investor statements.
  • Created Lambda-based utilities for sending automated email notifications upon job completion, failure, or status updates.
  • Maintained workflow tracking and execution logs in DynamoDB, storing file name, processing status, step details, and unique execution IDs.
  • Implemented error handling, retry mechanisms, and status tracking across Lambda, Step Functions, and Glue jobs to ensure reliable pipeline execution.
  • Monitored and troubleshot data processing pipelines, resolving failures and optimizing performance for large data workloads.
  • Collaborated with business stakeholders to translate client reporting requirements into scalable ETL workflows.
  • Ensured data quality and validation checks before processing files to maintain accuracy in financial reporting outputs.
  • Monitored and debugged serverless ETL pipelines using Amazon CloudWatch logs and metrics.
  • Implemented event-driven workflow monitoring using Amazon EventBridge.
  • Troubleshot pipeline failures by analyzing CloudWatch logs, Step Functions execution history, and event triggers.
Environment: Python, AWS Lambda, DynamoDB, AWS Step Functions, AWS S3, AWS Glue (Spark jobs), AWS CloudFront, Amazon EventBridge, JIRA, Agile and CI/CD.
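
The configuration-driven dispatch described in this role can be summarized with a minimal sketch, assuming a hypothetical DynamoDB table keyed by file type with a state-machine ARN stored in each configuration record; the actual table schema, environment variables, and workflow names are not reproduced here.

    # Sketch of a config-driven, S3-triggered Lambda handler: look up the processing
    # configuration for the uploaded file's type in DynamoDB, then start the matching
    # Step Functions workflow. The table name, key names, and the CONFIG_TABLE
    # environment variable are illustrative assumptions.
    import json
    import os
    import urllib.parse

    import boto3

    dynamodb = boto3.resource("dynamodb")
    sfn = boto3.client("stepfunctions")
    config_table = dynamodb.Table(os.environ.get("CONFIG_TABLE", "pipeline_config"))


    def lambda_handler(event, context):
        # S3 put event: extract the bucket and object key of the uploaded client file.
        record = event["Records"][0]["s3"]
        bucket = record["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["object"]["key"])

        # Derive a file-type identifier from the key prefix (e.g. "tax/...").
        file_type = key.split("/")[0]

        # Fetch the processing configuration for this file type.
        item = config_table.get_item(Key={"file_type": file_type}).get("Item")
        if not item:
            raise ValueError(f"No configuration found for file type '{file_type}'")

        # Start the Step Functions workflow named in the configuration record.
        response = sfn.start_execution(
            stateMachineArn=item["state_machine_arn"],
            input=json.dumps({"bucket": bucket, "key": key, "steps": item.get("steps", [])}),
        )
        return {"executionArn": response["executionArn"]}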

SECHUB - Vulnerability Management Tool, Publicis Groupe (TLG India Pvt Ltd.)

Full Stack Developer | Dec 2022 to Jan 2024

Description: SECHUB is an aggregator tool to maintain vulnerability management data and prepare reports for various clients based on user groups. SECHUB ingests data from different applications such as Rapid7 InsightVM, CyCognito, Wiz, Alert Logic, Ermetic, Bitsight, Pentest and ASV scans. Data is ingested from these applications via Azure Data Factory pipelines, transformed, and loaded into MS SQL Server. The front end is developed using Microsoft Power Apps and Power Flows. The main modules in this project are Assets, Vulnerabilities, Solutions, Asset Tags, and Vulnerability Instances.

Roles & Responsibilities:

  • Analyzed and prepared requirements and design documents per client requirements.
  • Worked with the REST API documentation of the different source applications and analyzed their data to implement the data ingestion (a conceptual sketch follows this role).
  • Prepared and implemented pipelines in Azure Data Factory for ingesting the required data and performing data cleaning.
  • Transformed data in Data Flows per requirements and ran batch processing of pipelines.
  • Designed and implemented tables in MS SQL Server per the requirements documents.
  • Wrote complex procedures and functions to prepare data for end-user reports.
  • Wrote triggers on tables to ingest required data from other tables and maintain data integrity.
  • Wrote complex queries based on end-user requirements and prepared views from them.
  • Created the front end using Microsoft Power Apps pages and aligned the data from MS SQL Server.
  • Prepared Power Flows to run automatically based on data and user actions, such as button controls or sending emails.
Environment: Azure Data Factory, MS SQL Server, MySQL, REST API, Azure DevOps, ServiceNow, Microsoft Power Apps, Power Flows, CI/CD
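
The ingestion described above ran entirely in Azure Data Factory pipelines; purely as a conceptual illustration of the paginated source-API pull pattern those pipelines implement, here is a minimal Python sketch. The endpoint URL, authentication scheme, and field names are hypothetical.

    # Conceptual sketch of paginated REST ingestion from a vulnerability source API.
    # The real pipelines ran in Azure Data Factory; the URL, auth, and fields are invented.
    import requests

    BASE_URL = "https://api.example-scanner.com/v1/vulnerabilities"  # hypothetical endpoint
    API_TOKEN = "..."  # normally read from a secret store, never hard-coded


    def fetch_all_vulnerabilities(page_size=500):
        """Yield vulnerability records page by page until the API is exhausted."""
        headers = {"Authorization": f"Bearer {API_TOKEN}"}
        page = 0
        while True:
            resp = requests.get(
                BASE_URL,
                headers=headers,
                params={"page": page, "size": page_size},
                timeout=30,
            )
            resp.raise_for_status()
            records = resp.json().get("data", [])
            if not records:
                break
            yield from records
            page += 1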

Decision Desk HQ - Reporting Platform, Skilldrive Infotech

Full Stack Developer | Jun 2021 to Dec 2022

Description: Decision Desk HQ is an American website that focuses on reporting election results in the United States. Decision Desk HQ uses an application programming interface (API) to get election results at the same time they are published on websites provided by election officials. This is a web application with various REST APIs. We worked on the design and implementation of the UI and APIs. The main modules in this project are Places, Parties and Voting Results.

Roles & Responsibilities:

  • Prepared design requirements and models for the system, covering the Software Requirements Specification (SRS) document.
  • Worked on front-end, system administration and database architecture tasks, including the following.
  • Designed and developed APIs (RESTful web services) and optimized back-end performance.
  • Used various third-party tools for preparing reports and results.
  • Converted images and MS Office files to PDF.
  • Built custom Django ORM models and queries on PostgreSQL, allowing users to view and download detailed requests (see the sketch after this role).
  • Fixed bugs for issues raised based on client requirements.
  • Remodelled and updated the existing models according to the new SRS document.
  • Prepared UI inputs per client requirements and implemented the report download options.
Environment: Django, MySQL, JavaScript, PostgreSQL, Windows, REST API, Docker, JIRA, Git, Scrum
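
A minimal sketch of the kind of Django ORM models and aggregation query behind the result reporting described above; the model and field names are illustrative, not the project's actual schema.

    # Illustrative Django models for places, parties, and voting results, plus an
    # aggregation query for per-party totals. Names are assumptions, not the real schema.
    from django.db import models
    from django.db.models import Sum


    class Place(models.Model):
        name = models.CharField(max_length=200)
        state = models.CharField(max_length=100)


    class Party(models.Model):
        name = models.CharField(max_length=200)


    class VotingResult(models.Model):
        place = models.ForeignKey(Place, on_delete=models.CASCADE, related_name="results")
        party = models.ForeignKey(Party, on_delete=models.CASCADE, related_name="results")
        votes = models.PositiveIntegerField()
        reported_at = models.DateTimeField(auto_now_add=True)


    def party_totals_for_state(state_name):
        """Total votes per party within a state, ordered for report output."""
        return (
            VotingResult.objects.filter(place__state=state_name)
            .values("party__name")
            .annotate(total_votes=Sum("votes"))
            .order_by("-total_votes")
        )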

Customer 360, Barclays Bank, Support Application

Backend Developer | Nov 2019 to May 2021

Description: Customer 360 is a web application for bank customer support agents to support and validate customer information. The application manages data related to customers; we managed both REST APIs and UI functionality as part of this application. The major modules in this application support customers' debit cards, account details, service requests, etc., as well as other Barclays products and services, with related information and reports prepared for support agents.

Roles & Responsibilities:

  • Prepared data models for storing customer data and supporting information, and implemented constraints on the data.
  • Designed data models for the system covering the Software Requirements Specification (SRS) document.
  • Implemented user interface guidelines and standards throughout the development and maintenance of the website using HTML5, CSS3, JavaScript, jQuery and AngularJS.
  • Created methods (GET, POST, PUT, and DELETE) to make requests to the API server and tested the RESTful API using Postman.
  • Performed joins, group-by and other operations for back-end development.
  • Updated and altered the existing data models for any changes in the existing architecture and design.
  • Involved in implementing Django ORM methods to query data from the database.
  • Created various functions, views and procedures for querying, performing complex joins and optimizing queries.
  • Developed and optimized complex database queries using the Django ORM, including filters, grouping, joins, and aggregations to efficiently retrieve and process application data (see the sketch after this role).
Environment: Django, Postman, MySQL, SQL, RESTful Web Services, JIRA, Agile.
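
A minimal sketch of the ORM query patterns referenced in the last bullet: joins via select_related, filtering, grouping with values()/annotate(), and conditional aggregation. The app, model, and field names are hypothetical.

    # Illustrative Django ORM query patterns; ServiceRequest and its fields are assumptions.
    from django.db.models import Count, Q

    from support.models import ServiceRequest  # hypothetical app and model


    def open_requests_for_customer(customer_id):
        """Open service requests for one customer, with related rows joined in one query."""
        return (
            ServiceRequest.objects.select_related("customer", "debit_card")
            .filter(customer_id=customer_id)
            .exclude(status="CLOSED")
            .order_by("-created_at")
        )


    def request_counts_by_status():
        """Per-status counts, with a separate conditional count of high-priority requests."""
        return (
            ServiceRequest.objects.values("status")
            .annotate(
                total=Count("id"),
                escalated=Count("id", filter=Q(priority="HIGH")),
            )
            .order_by("status")
        )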

EDUCATION

MSc - Master of Computer Science

SV University, Tirupati (71%) - 2015

BSc (MPCs)

Sri Venkateswara University, Tirupati (68%) - 2013

INTERMEDIATE

Govt. Jr. College, Gudupalle (75%) - 2009

SSC

Govt. High School, Gutharlapalle (78%) - 2007