Experience

My career path has been defined by a passion for pushing technological boundaries and delivering impactful solutions in Software Development, AI, and Data Science tooling.

Here is a story I like to tell about how my career started: I went from Jupyter user to Jupyter developer. During my PhD studies, I transitioned our lab from Mathematica to Python + Jupyter for data analysis and fitting. While working on a paper, I created an accompanying notebook on Colab that showcased equation rendering, translation to code, computation, and visualization of results. The paper was ultimately rejected, but the hiring manager was so impressed by the notebook that he offered me the chance to build a Jupyter-based Data Science platform for them!

As you’ll discover, each role in my journey has been more than just a position: it has been an opportunity to solve complex problems and create meaningful technological advancements. From developing algorithms for an FDA-approved medical device, to building AI assistants and tools, to designing, implementing, and leading entire Data Science platforms, each experience has built upon the last.

I was fortunate enough to have the opportunity to work with and contribute back to open-source tools, especially around Jupyter. The Python community is near and dear to my heart.

Take a look at my experience and reach out if you are interested in working with me.

40+
Technologies Used
35+
Projects Completed
3+
Recommendations Received
20+
Courses Completed
Axle Informatics

Development Manager

January 2025 - Present

Key Responsibilities:

  • Leading development of interactive AI-driven notebook environments and data portals for federal clients
  • Overseeing cross-functional teams of 7 engineers
Orange Bricks

Independent Consultant

January 2024 - Present

Key Responsibilities:

  • Developed AI-powered copilot JupyterLab extension for a YC-backed startup, including prompt engineering, intelligent code completions, debugging agents, and chat interfaces
  • Enhanced and supported JupyterLab extensions for a tech startup and set up their CI/CD pipeline
Axle Informatics

Technical Lead

January 2022 - Present

Key Responsibilities:

  • Led the development of Notebooks Hub - a collaborative Data Science platform combining JupyterLab, RStudio, and VSCode IDEs with dashboarding tools such as Streamlit and Shiny
  • Created and maintained custom JupyterLab extensions for Data Scientists
  • Designed system architecture and managed team of 4 developers through implementation
  • Created productivity tools used across the company, including an AI agent for generating monthly status reports from Slack/Jira/GitHub activity
  • Mentored 15 trainees through collaborations with Georgia Tech, Cornell's Breakthrough AI Program, and NIH internships
Axle Informatics

Data Engineer / Jupyter Developer

June 2021 - December 2021

Key Responsibilities:

  • Contributed to the development of the Notebooks Hub platform
  • Created custom JupyterLab extensions
  • Implemented backend services for collaborative data science workflows
De Novo Software

Algorithms Developer

August 2020 - May 2021

Key Responsibilities:

  • Worked in a regulated environment on an FDA-approved software medical device
  • Contributed to the development of a desktop application for analyzing flow cytometry data
Axle Informatics

Full Stack Developer

January 2019 - July 2020

Key Responsibilities:

  • Engineered high-performance cloud deployment with shared distributed storage for image processing pipelines used by Data Scientists at NIH/NCATS
  • Contributed to JupyterLab extensions now widely used in the developer community (20k+ downloads monthly)
  • Designed a new framework for scheduling containerized analysis workflows on Kubernetes and HPC clusters
Center for Molecular Study of Condensed Soft Matter

Research Assistant

2014 - 2018

Key Responsibilities:

  • Created a highly efficient GPU implementation of a stochastic molecular model, up to 300x faster than the previous version
  • Automated the pipeline for computational experiments, including job scheduling, data fitting, and analysis, using Python and JupyterLab