Durga D Nagisettygari

Data Engineer & Cloud Infrastructure Specialist

Building scalable, AI-powered solutions with experience spanning cloud infrastructure, DevOps automation, and machine learning platforms.

About Me

I'm a passionate Data Engineer and Cloud Infrastructure Specialist with over 3 years of combined experience building scalable, secure, and innovative solutions. Currently pursuing my Master's in Computer Science at Indiana State University, I specialize in AI-powered automation, MLOps platforms, and cloud-native architectures.

My expertise ranges from developing autonomous AI agents to implementing federated learning systems, always with a focus on security, scalability, and real-world impact. I'm especially drawn to the intersection of AI and cloud infrastructure, building solutions that push the boundaries of what's possible.

AI/ML Engineering
Cloud Architecture
DevOps Automation
Data Engineering

Master's in Computer Science

Indiana State University

Aug 2023 - May 2025

B.Tech Computer Science

SRM Institute of Science and Technology

Jul 2019 - Jun 2023

Location

Terre Haute, Indiana, USA

Open to relocation & remote work

Professional Experience

Computer Support Analyst
Indiana State University
Terre Haute, IN
Aug 2023 - May 2025
Part-Time

Automated system workflows using Python, cutting manual effort and downtime by over 70%

Deployed and managed Dockerized AWS infrastructure using Terraform for scalable app delivery

Created Power BI dashboards and optimized SQL queries to accelerate data insights by 50%

Conducted IAM audits and enforced RBAC policies, reducing security vulnerabilities by 40%

Resolved 95% of Tier-2 support tickets for 200+ users, improving response and resolution times

Streamlined onboarding with automated provisioning scripts, reducing setup time by 65%
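
To give a concrete flavor of the provisioning automation mentioned above, here is a minimal Python sketch; the endpoint URL, CSV columns, and field names are placeholders rather than the actual production script, which also handled logging, retries, and credential management.

    """Illustrative onboarding-provisioning sketch (hypothetical endpoint and fields)."""
    import csv
    import requests

    PROVISIONING_URL = "https://example.internal/api/accounts"  # placeholder endpoint

    def provision_accounts(csv_path: str) -> None:
        """Read new-hire records from a CSV and create accounts via a REST call."""
        with open(csv_path, newline="") as handle:
            for row in csv.DictReader(handle):
                payload = {
                    "username": row["username"],
                    "department": row["department"],
                    "role": row.get("role", "staff"),
                }
                response = requests.post(PROVISIONING_URL, json=payload, timeout=10)
                response.raise_for_status()  # fail loudly if provisioning is rejected
                print(f"Provisioned {payload['username']}")

    if __name__ == "__main__":
        provision_accounts("new_hires.csv")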

Key Technologies & Tools:

Python
Docker
Terraform
AWS
Power BI
SQL
IAM
RBAC
DevOps

Featured Projects

Agentic AI Workflow Automation

Built an autonomous agent pipeline integrating GPT-style LLMs with LangChain for chained decision-making, orchestrated with Docker containers and Kubernetes CronJobs. Achieved a 95% task completion rate in simulated environments.

Key Highlights:

  • Multi-step pipelines with callbacks and state management
  • Serverless inference endpoints via AWS Lambda
  • Dynamic self-assessment module for iterative output refinement
  • Optimized prompt structures reducing latency and token usage
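
A simplified, framework-agnostic sketch of the multi-step pipeline with state management and iterative self-assessment described above; call_llm stands in for the actual LangChain-backed model call and is purely illustrative.

    """Conceptual sketch of a chained agent pipeline with iterative self-assessment."""
    from dataclasses import dataclass, field

    def call_llm(prompt: str) -> str:
        """Placeholder for the real LLM call (wired through LangChain in the project)."""
        raise NotImplementedError

    @dataclass
    class PipelineState:
        task: str
        history: list = field(default_factory=list)  # records each step's output

    def run_step(state: PipelineState, instruction: str) -> str:
        """Run one chained step, passing prior outputs forward as context."""
        context = "\n".join(state.history)
        output = call_llm(f"Task: {state.task}\nContext: {context}\n{instruction}")
        state.history.append(output)
        return output

    def run_pipeline(task: str, max_revisions: int = 2) -> str:
        state = PipelineState(task=task)
        draft = run_step(state, "Propose a step-by-step plan and execute it.")
        # Self-assessment loop: the model critiques and refines its own output.
        for _ in range(max_revisions):
            critique = run_step(state, f"Critique this result:\n{draft}")
            if "no issues" in critique.lower():
                break
            draft = run_step(state, f"Revise the result using this critique:\n{critique}")
        return draft

Capping the number of revision passes is what keeps latency and token usage bounded while still allowing the agent to correct itself.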

Technologies:

Python
LLMs
LangChain
Docker
Kubernetes
AWS Lambda

End-to-End GenAI Document Summarizer

Created a serverless web app to ingest PDFs, transcripts, and web pages for generative summarization using GPT-4 and Azure Cognitive Services. Reduced summarization time by 40% compared to baseline models.

Key Highlights:

  • OCR and text extraction before GPT-4 processing
  • Role-based access control with Azure Key Vault
  • Optimized NLP throughput using batching and prompt tuning
  • Usage analytics and feedback loop for quality refinement
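
A condensed sketch of the summarization step, assuming OCR and text extraction have already run; it uses the standard OpenAI Python SDK (v1+) as a stand-in for the project's Azure-hosted GPT-4 endpoint, and the chunk size is an arbitrary placeholder.

    """Chunk long documents, summarize each piece, then summarize the summaries."""
    from openai import OpenAI  # stand-in for the Azure OpenAI configuration used in the project

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def chunk_text(text: str, max_chars: int = 8000) -> list[str]:
        """Naive chunking so long documents fit within the model's context window."""
        return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

    def summarize(text: str) -> str:
        partials = []
        for chunk in chunk_text(text):
            resp = client.chat.completions.create(
                model="gpt-4",
                messages=[
                    {"role": "system", "content": "Summarize the document excerpt concisely."},
                    {"role": "user", "content": chunk},
                ],
            )
            partials.append(resp.choices[0].message.content)
        if len(partials) == 1:
            return partials[0]
        # Recursively condense the partial summaries into one final summary.
        return summarize("\n".join(partials))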

Technologies:

GPT-4
Azure Cognitive Services
Streamlit
Terraform
Azure Functions

Real-time Edge-Cloud IoT AI Pipeline

Architected a hybrid edge-cloud system for real-time computer vision inference on IoT devices, reducing latency by 80% through model optimization and achieving 99% object-detection accuracy.

Key Highlights:

  • Object detection models optimized for edge devices
  • MQTT streaming to AWS Greengrass cores
  • 80% latency reduction through model quantization
  • Secured device communication using TLS and AWS IAM
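
An edge-side sketch of the MQTT publishing path with mutual TLS, using the paho-mqtt client (1.x API) as one possible implementation; the endpoint, topic, and certificate paths are placeholders for the actual AWS IoT / Greengrass configuration.

    """Publish detection results from an edge device over MQTT secured with mutual TLS."""
    import json
    import time
    import paho.mqtt.client as mqtt

    ENDPOINT = "your-iot-endpoint.amazonaws.com"   # placeholder Greengrass core / IoT endpoint
    TOPIC = "devices/camera-01/detections"         # placeholder topic

    client = mqtt.Client(client_id="camera-01")
    client.tls_set(ca_certs="AmazonRootCA1.pem",   # mutual TLS: CA cert, device cert, private key
                   certfile="device.pem.crt",
                   keyfile="private.pem.key")
    client.connect(ENDPOINT, port=8883)
    client.loop_start()

    def publish_detection(label: str, confidence: float) -> None:
        """Send one inference result; QoS 1 gives at-least-once delivery to the core."""
        payload = json.dumps({"label": label, "confidence": confidence, "ts": time.time()})
        client.publish(TOPIC, payload, qos=1)

    publish_detection("person", 0.97)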

Technologies:

Raspberry Pi
MQTT
AWS Greengrass
SageMaker Neo
Edge Impulse

MLOps Platform for Multimodal Model Training

Developed an end-to-end MLOps stack deployed in Kubernetes with automated model testing, canary deployment pipelines, and drift detection. Reduced model deployment time by 60%.

Key Highlights:

  • Experiment tracking and version control with MLflow
  • CI/CD workflows for model retraining and deployment
  • Multimodal data ingestion from S3 and Snowflake
  • Automated drift detection and alerting system
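
A minimal sketch of the experiment-tracking and drift-flagging flow with MLflow; the tracking URI, metric names, and the mean-shift drift heuristic are illustrative placeholders rather than the platform's production logic.

    """Log a training run to MLflow and attach a simple drift signal for alerting."""
    import mlflow
    import numpy as np

    mlflow.set_tracking_uri("http://mlflow.internal:5000")  # placeholder tracking server
    mlflow.set_experiment("multimodal-training")

    def mean_shift_drift(reference: np.ndarray, live: np.ndarray, threshold: float = 0.2) -> bool:
        """Crude drift signal: relative shift in a feature's mean between datasets."""
        shift = abs(live.mean() - reference.mean()) / (abs(reference.mean()) + 1e-9)
        return shift > threshold

    with mlflow.start_run(run_name="baseline"):
        mlflow.log_param("learning_rate", 1e-4)        # placeholder hyperparameter
        mlflow.log_metric("val_accuracy", 0.91)        # placeholder metric value
        # Log a drift flag so the downstream alerting job can pick it up.
        drifted = mean_shift_drift(np.random.normal(0, 1, 1000), np.random.normal(0.5, 1, 1000))
        mlflow.log_metric("feature_drift_detected", float(drifted))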

Technologies:

Kubernetes
MLflow
Docker
Terraform
Argo CD
S3
Snowflake

Federated Learning Platform

Designed a privacy-conscious federated learning system where clients train local models and share updates without exposing raw data. Maintained 90% model accuracy compared to centralized training.

Key Highlights:

  • Secure enclave support with Docker + FastAPI
  • Distributed training orchestration with Flower framework
  • Encrypted model updates with mutual TLS
  • Validated with 10+ simulated clients achieving near-centralized performance
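
A conceptual federated-averaging sketch in plain NumPy to illustrate the training flow; the real system used PyTorch models coordinated by the Flower framework, and the linear model and synthetic client data here are placeholders.

    """Federated averaging: clients train locally, only weight updates are shared and averaged."""
    import numpy as np

    def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray,
                     lr: float = 0.1, epochs: int = 5) -> np.ndarray:
        """One client's local training: a few gradient steps on its private data."""
        w = weights.copy()
        for _ in range(epochs):
            grad = 2 * X.T @ (X @ w - y) / len(y)   # least-squares gradient
            w -= lr * grad
        return w

    def federated_round(global_w: np.ndarray,
                        clients: list[tuple[np.ndarray, np.ndarray]]) -> np.ndarray:
        """Each client trains locally; the server averages the returned weights."""
        updates = [local_update(global_w, X, y) for X, y in clients]
        return np.mean(updates, axis=0)

    rng = np.random.default_rng(0)
    true_w = np.array([1.0, -2.0])
    clients = []
    for _ in range(10):  # ten simulated clients, as in the project write-up
        X = rng.normal(size=(50, 2))
        clients.append((X, X @ true_w + rng.normal(scale=0.1, size=50)))

    w = np.zeros(2)
    for _ in range(20):
        w = federated_round(w, clients)
    print("learned weights:", w)

Only the averaged weight vectors ever cross the network, which is what keeps each client's raw data private.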

Technologies:

PyTorch
Flower
Docker
FastAPI
PostgreSQL

Technical Skills

Programming Languages
Python
SQL
JavaScript
C

Cloud Platforms
AWS (EC2, S3, IAM, Lambda, Redshift)
Azure Data Factory
Google Cloud

DevOps & Infrastructure
Docker
Terraform
Kubernetes
GitHub Actions
Jenkins
CI/CD Pipelines

Data Engineering
Snowflake
Streamlit
Power BI
ETL
Airflow
Databricks

AI/ML & Analytics
LangChain
PyTorch
MLflow
SageMaker
Edge Impulse
Computer Vision

Security & Systems
IAM
RBAC
HIPAA Compliance
Network Protocols
Linux Administration

Certifications & Achievements

Certifications:

  • AI & High-Performance Computing Certification – FICE Education
  • Google UX Design Certification
  • AWS Solutions Architect (In Progress)
  • Terraform Associate (In Progress)

Achievements:

  • Runner-up at ISU Hackathon 2024
  • Active GitHub contributor
  • Led student DevOps group
  • Volunteer mentor for CS students

Get In Touch

Let's Connect

I'm always interested in discussing new opportunities, innovative projects, and collaborations in AI, cloud infrastructure, and data engineering. Feel free to reach out!

Location

United States of America

Open to relocation & remote work

Send a Message