Nikhitha

Hey! I'm Niki (yes, with one 'k' because why not). I build backends and AI systems that actually work in production, not just in demos. I've been doing this for 3+ years with Python, FastAPI, PostgreSQL, and AWS, and honestly? Still obsessed with making code run faster and smarter.

Right now I'm deep into autonomous agents, the kind that actually make decisions and don't just answer questions. My latest project is a multi-agent system coordinating 44 tools across 6 MCP servers, processing 500+ decisions daily. It crashed every 3 hours for two weeks until I figured out tool failure handling. That's the stuff they don't teach you in tutorials.
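That tool-failure lesson boils down to something like this, a minimal sketch with hypothetical tool names, not the actual project code: catch the failure, return it as structured data, and let the agent decide what to do next instead of crashing the loop.

```python
import json


def run_tool(tool, args, max_retries=2):
    """Call a tool and always return a structured result the agent can read.

    Instead of letting one failing tool bring down the whole agent loop,
    exceptions are caught and serialized so the model can retry or re-plan.
    """
    for attempt in range(max_retries + 1):
        try:
            return {"ok": True, "result": tool(**args)}
        except Exception as exc:  # broad on purpose: any tool failure counts
            last_error = f"{type(exc).__name__}: {exc}"
    return {"ok": False, "error": last_error, "attempts": max_retries + 1}


# A flaky illustrative tool: fails the first call, succeeds on retry.
calls = {"n": 0}

def flaky_search(query):
    calls["n"] += 1
    if calls["n"] == 1:
        raise TimeoutError("upstream took too long")
    return f"results for {query!r}"

print(json.dumps(run_tool(flaky_search, {"query": "MCP servers"})))
# → {"ok": true, "result": "results for 'MCP servers'"}
```

The design choice that matters: the agent never sees a raw traceback, only a JSON-shaped success or failure it can reason about.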

If you're building LLM features or need someone who can go from "this prompt works sometimes" to "this handles 500 requests/sec in prod," we should talk. I live in that space between AI research papers and real infrastructure, where rate limits, retries, and fallbacks matter more than model parameters.
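The retries-and-fallbacks point, sketched as an illustrative backoff policy (not any particular provider's SDK; the names and defaults here are my own):

```python
import random
import time


def with_backoff(call, *, retries=4, base=0.5, max_delay=8.0, sleep=time.sleep):
    """Retry a rate-limited call with exponential backoff plus jitter.

    `call` is any zero-arg function that may raise; `sleep` is injectable
    so the policy is testable without real waiting. Production code would
    catch the provider's specific rate-limit exception, not bare Exception.
    """
    for attempt in range(retries):
        try:
            return call()
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries: surface the error to a fallback layer
            delay = min(max_delay, base * (2 ** attempt))
            sleep(delay + random.uniform(0, delay / 2))  # jitter avoids thundering herd
```

Jitter is the part people skip and regret: without it, every client that hit the same rate limit retries at the same instant and hits it again.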

Skills

Languages

Python
Go
JavaScript
SQL

Backend

FastAPI
Flask
SQLAlchemy
Celery
Redis

Databases

PostgreSQL
MySQL
MongoDB
Supabase
Pinecone

Infrastructure

AWS
Docker
Terraform
GitHub Actions
Prometheus
Grafana

AI/ML Frameworks

OpenAI
Claude
LangChain
CrewAI
PyTorch
TensorFlow
Scikit-learn
Hugging Face
MLflow

Architecture

Microservices
Event-Driven
Async Programming
AI Agents

APIs & Integrations

Stripe
JWT

Security

OWASP

Experience

University at Buffalo - International Student Services

Software Engineer - Graduate Assistant

May 2025 – Dec 2025
Buffalo, NY
  • Engineered backend services and data pipelines using Python, FastAPI, and PostgreSQL with asynchronous processing, automating ETL workflows for 5,000+ international students and reducing manual processing time by 70%
  • Built and integrated LLM-powered automation tools using the OpenAI API and LangChain for data extraction, classification, and routine workflow tasks; developed Tableau dashboards for leadership insights.
  • Implemented CI/CD automation (Docker, GitHub Actions, AWS) and architected scalable microservices with asynchronous task processing for high-concurrency workloads, working in Agile/Scrum sprints.
Roche (Accenture)

Software Development Analyst

May 2023 – July 2024
Chennai, India
  • Owned and developed the patient and drug data pipeline module for a clinical data management platform, architecting secure REST APIs using FastAPI and PostgreSQL with role-based access control.
  • Implemented a comprehensive pytest testing framework covering unit, integration, and API tests, achieving 85% code coverage, and containerized applications with Docker for CI/CD quality gates, reducing production bugs.
  • Developed asynchronous task processing with Celery for background workflow execution, enabling non-blocking operations and improving system responsiveness for high-volume data processing tasks.
  • Designed normalized data models and optimized PostgreSQL queries with advanced indexing, improving response times by 75%, and implemented performance monitoring using Prometheus and Grafana.
Accenture

Software Development Associate

May 2021 – May 2023
Chennai, India
  • Developed RESTful API endpoints and backend services using Python and PostgreSQL, implementing CRUD operations, data validation logic, and error handling for web applications across client projects.
  • Debugged and resolved production issues across web applications, improved error handling and logging mechanisms, and contributed to system stability improvements that reduced incident response time by 40%.
  • Integrated third-party APIs and services to automate workflows across client applications, reducing manual processes while contributing to monitoring and logging infrastructure improvements.