Atharva Tonape

Backend and AI Developer

I am a dedicated Backend and AI Developer with a strong foundation in designing and implementing robust back-end solutions, specializing in artificial intelligence integration. My experience spans complex backend architectures, AI model deployment, and large-scale data processing. I am skilled in Python, Flask, and Docker, with hands-on experience in container management. I have also built innovative software, including a freehand-drawing interface that uses state-of-the-art OpenAI models for AI image generation.

Master of Computer Science (AI)

Brandenburgische Technische Universität, Cottbus, Germany

Oct 2022 - Oct 2024 (Expected)

GPA: 2.0

Key Courses: Information Retrieval, Data Warehousing, Machine Learning

B.E. in Computer Science

Savitribai Phule Pune University, India

Apr 2018 - Apr 2022

GPA: 1.3

Key Courses: Computer Networks, Cybersecurity, Computer Architecture

T-Systems

AI and Backend Developer

Jan 2024 – Present | Bonn, Germany

  • Developed backend infrastructure for Large Language Models (LLMs) and contributed to AI-driven projects.
  • Designed and implemented an application that turns users' freehand drawings into AI-generated images using OpenAI's APIs.
  • Led full-stack development, integrating Python, Flask, Docker, and OpenAI models.
  • Gained hands-on experience with cloud deployment, RESTful APIs, containerization, and database management (PostgreSQL, MongoDB).
  • Collaborated across teams to deploy scalable back-end solutions using CI/CD pipelines.
Datametica

Data Engineer

Apr 2018 – Dec 2023 | Pune, India

  • Engineered data pipelines in Linux for processing and optimizing large datasets, ensuring efficiency and scalability.
  • Designed ETL processes using Python, Spark, and SQL for transforming and managing data across diverse systems.
  • Integrated data storage solutions (e.g., PostgreSQL, MongoDB, AWS S3) to enable smooth data workflows and real-time analytics.
  • Developed and maintained data models, ensuring high data quality and compliance with business requirements.
  • Leveraged tools like Docker and Airflow for automated workflows, collaborating with cross-functional teams to deploy data-driven solutions.

Languages

Python, C++, Linux (Bash)

Backend

Django, Flask, FastAPI, Express.js

Concepts

DBMS, OOP, DSA, Networking, RESTful APIs

Database Technologies

MySQL, Snowflake, Databricks, MongoDB

Cloud Technologies

AWS, GCP, Azure

Big Data

Apache Spark

Awards

  • Data Science Honours

    Awarded distinction in a 20-credit Data Science honours course at SPPU.