About Me#
As a data engineer with a background in analytics, I have a strong foundation in building data pipelines, data modelling, and process automation. I am constantly seeking out new opportunities to learn and grow, and I am always excited to pick up new tools and technologies. I am a problem-solver at heart and enjoy using my skills to help organizations and people automate the boring stuff.
In my free time, I am passionate about IoT, robotics, and everything tech. To further my knowledge in these areas, I am currently pursuing a degree in Humanoid Robotics at BHT Berlin.
Experience#
Data Engineer#
tonies GmbH
2021/12 - Present (2 yrs 2 mos +)
tonies is the world's largest interactive audio platform for kids.
Responsibilities
- Building and maintaining ELT pipelines using Python and Airflow/Prefect
- Writing quality and performance tests to ensure the accuracy and reliability of the data
- Processing Event Data via Confluent
- Deploying infrastructure on AWS, using tools such as Docker, CloudFormation and Terraform
- Transforming data using dbt
- Implementing, maintaining, and expanding the organization’s data lake by working with external databases, REST APIs, Elasticsearch
- Building data applications using Streamlit
- Ensuring data is GDPR-compliant
- Conducting proof-of-concept projects to evaluate new technologies, such as the Amundsen Data Catalog and Metabase
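The data-quality tests mentioned above could look something like this minimal, pure-Python sketch; the field names and sample records are invented for illustration, not taken from any real pipeline:

```python
# Illustrative sketch of a pipeline data-quality check.
# Field names ("event_id", "user_id") and sample rows are made up.

def check_no_nulls(rows, required_fields):
    """Return (row_index, field) pairs where a required field is missing or empty."""
    failures = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append((i, field))
    return failures

events = [
    {"event_id": "e1", "user_id": "u1"},
    {"event_id": "e2", "user_id": ""},  # this row fails the user_id check
]

failures = check_no_nulls(events, ["event_id", "user_id"])
print(failures)  # [(1, 'user_id')]
```

In practice a check like this would run as a pipeline task, failing the run (or raising an alert) when the failure list is non-empty.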
Data Engineer#
adviqo Group
2020/07 - 2021/12 (1 yr 6 mos)
adviqo is building platforms and apps that connect individuals with experts all over the world in the area of life coaching.
Responsibilities
- Building and Maintaining ETL/ELT Pipelines and Streaming Data Pipelines with Python, SQL, PDI, Snowflake and AWS services
- Ingesting Data from External Data Sources (Databases, APIs, Web, Webhooks) into Data Lake and Data Warehouse
- Building, Documenting and Maintaining REST APIs with Python, Lambda and API Gateway
- Writing, executing and debugging Python, Shell and SQL scripts
- Process scheduling with Airflow, Cron, Snowflake Tasks
- Data Manipulation with Python, SQL (Oracle, SnowSQL, PostgreSQL, MySQL, MariaDB), NoSQL (MongoDB)
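A REST endpoint built with Lambda and API Gateway typically boils down to a handler in this shape; the route, parameter, and payload here are hypothetical, not from an actual adviqo service:

```python
# Illustrative sketch of an AWS Lambda handler behind an API Gateway
# proxy integration. The "name" query parameter is a made-up example.
import json

def handler(event, context):
    """Handle GET /hello?name=... and return an API Gateway proxy response."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

response = handler({"queryStringParameters": {"name": "adviqo"}}, None)
print(response["statusCode"])  # 200
```

API Gateway expects exactly this dictionary shape (`statusCode`, `headers`, `body` as a string), which is why the body is serialized with `json.dumps` rather than returned as a dict.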
Business Intelligence & Automation Analyst#
HelloFresh SE
2017/01 - 2020/07 (3 yrs 7 mos)
HelloFresh is changing the way people eat, forever.
Responsibilities
- Data Visualization with Tableau and Jupyter Notebook
- Data Analysis with Python (Pandas, NumPy) and SQL (MySQL, Impala, Redshift)
- Building and Maintaining ETL pipelines
- Managing ETLs via AWS Lambda, Cron and Airflow
- Creating basic Time Series Forecasts with Python (statsmodels, scikit-learn)
- Process and Report Automation
- Explaining Insights and Processes to various stakeholders (with and without technical knowledge)
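A basic time series forecast of the kind listed above can be sketched without any libraries as a trailing moving average; this stands in for the statsmodels/scikit-learn models actually used, and the numbers are invented:

```python
# Illustrative sketch: a trailing-moving-average forecast in pure Python,
# standing in for proper statsmodels/scikit-learn models. Data is made up.

def moving_average_forecast(series, window, horizon):
    """Forecast `horizon` future points, each as the mean of the last `window` values."""
    history = list(series)
    forecast = []
    for _ in range(horizon):
        avg = sum(history[-window:]) / window
        forecast.append(avg)
        history.append(avg)  # feed forecasts back in for multi-step horizons
    return forecast

weekly_orders = [100, 110, 120, 130]
print(moving_average_forecast(weekly_orders, window=2, horizon=2))  # [125.0, 127.5]
```

This kind of naive baseline is also useful alongside real models: if ARIMA or a regression can't beat the moving average, the extra complexity isn't earning its keep.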
Education#
Bachelor of Engineering (B.Eng.)#
Berliner Hochschule für Technik (BHT)
2023 - Present
Fields of Study:
- Humanoid Robotics
Nanodegree - Data Streaming#
Udacity
Fields of Study:
- Data Ingestion with Kafka & Kafka Streaming (REST Proxy, Kafka Connect, KSQL, Faust Python Stream Processing)
- Apache Spark and Spark Streaming (Integrating Spark Streaming and Kafka, PySpark)
Nanodegree - Business Analytics#
Udacity
Fields of Study:
- Descriptive Statistics
- SQL for Data Analysis
- Tableau for Data Visualization
State Certified Business Economist#
Europäische Wirtschaftsfachschule Berlin
Fields of Study:
- Marketing
- Statistics
Courses and Certifications#
Agile Foundations#
Kanban Foundations#
Scrum Foundations#
Tableau Data Scientist#
Tableau Software Certification