🔹 A passionate Data Engineer specializing in Python, SQL, ETL development, and cloud data warehousing (Snowflake, Redshift).
🔹 Skilled in building scalable data pipelines and automation systems with Apache Airflow, Kafka, and RabbitMQ.
🔹 Focused on delivering production-ready data systems and generating actionable insights from large datasets.
- Programming: Python, PySpark, SQL
- Data Engineering: ETL Development, Data Modeling (Star and Snowflake schemas)
- Cloud Platforms: AWS (EC2, S3, Lambda, RDS), Databricks
- Orchestration: Apache Airflow, Shell scripting, Cron jobs
- Databases: Snowflake, Redshift, MongoDB, PostgreSQL, MySQL
- Other Tools: Apache Spark, IBM Cognos Analytics, GitHub, Generative AI
- Built ETL workflows integrating OLTP and NoSQL systems with Apache Airflow.
- Developed sales forecasting models using SparkML.
- Visualized KPIs using IBM Cognos Analytics.
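
The core of such an ETL workflow can be sketched as plain extract/transform/load callables, the kind an Airflow DAG would wire together as tasks. This is a minimal illustration only: the table names, fields, and in-memory sources below are made up, with `sqlite3` standing in for the OLTP system and a JSON dump standing in for the NoSQL store.

```python
import json
import sqlite3

def extract_oltp(conn):
    """Pull order rows from the OLTP database (sqlite3 stands in here)."""
    return conn.execute(
        "SELECT order_id, customer_id, amount FROM orders"
    ).fetchall()

def extract_nosql(raw_docs):
    """Parse customer documents from a NoSQL-style JSON dump."""
    return {doc["customer_id"]: doc["segment"] for doc in map(json.loads, raw_docs)}

def transform(orders, segments):
    """Join orders with customer segments into warehouse-ready rows."""
    return [
        (order_id, customer_id, amount, segments.get(customer_id, "unknown"))
        for order_id, customer_id, amount in orders
    ]

def load(conn, rows):
    """Write the joined rows into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders "
        "(order_id INTEGER, customer_id INTEGER, amount REAL, segment TEXT)"
    )
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?, ?)", rows)

# Demo with in-memory data standing in for the real source systems.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)", [(1, 10, 99.5), (2, 11, 20.0)])

docs = ['{"customer_id": 10, "segment": "retail"}']
rows = transform(extract_oltp(src), extract_nosql(docs))
dst = sqlite3.connect(":memory:")
load(dst, rows)
print(dst.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0])  # prints 2
```

In Airflow, each of these functions would become its own task (e.g. via `PythonOperator` or the TaskFlow API), so failures retry per stage rather than rerunning the whole pipeline.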
- Designed pipelines to process live TCP/IP stream data into structured databases (SQL/CSV).
- Used RabbitMQ for asynchronous message handling and MySQL stored procedures for optimized database-side processing.
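
A hedged sketch of the stream-ingestion side of this project: reassembling complete messages from raw TCP `recv()` chunks (which may split a message across packets), parsing each into a typed row, and serializing to CSV for loading. The `device_id,timestamp,reading` message format is purely illustrative.

```python
import csv
import io

def frame_messages(chunks):
    """Reassemble complete newline-terminated messages from raw recv() chunks."""
    buffer = b""
    for chunk in chunks:
        buffer += chunk
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            yield line.decode("utf-8")

def parse_message(line):
    """Turn one 'device_id,timestamp,reading' message into a typed row."""
    device_id, timestamp, reading = line.split(",")
    return {"device_id": device_id, "timestamp": timestamp, "reading": float(reading)}

def to_csv(rows):
    """Serialize parsed rows to CSV, ready for a bulk load into SQL."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["device_id", "timestamp", "reading"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

# Simulated recv() chunks: note the second message is split across packets.
chunks = [b"dev1,2024-01-01T00:00:00,21.5\ndev2,2024-0", b"1-01T00:00:01,19.8\n"]
rows = [parse_message(m) for m in frame_messages(chunks)]
print(len(rows))  # prints 2
```

In the full pipeline, parsed rows would be published to a RabbitMQ queue instead of collected in a list, letting a separate consumer batch them into MySQL asynchronously.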
(More projects will be added soon!)
- IBM Data Engineering Certificate (Coursera)
- Python (Basic) - HackerRank
- SQL (Intermediate) - HackerRank
- Email: charakirahul@gmail.com
- LinkedIn: Rahul_Charaki
Always eager to learn, build, and grow!