6 to 8 Years of Relevant Experience
We are seeking a highly skilled and motivated Data Engineer with 6–8 years of experience building scalable data pipelines and implementing robust data engineering solutions. This role involves working with modern data tools and frameworks such as Apache Airflow, Python, and PySpark to support the reliable delivery, transformation, and integration of data.
Responsibilities / Expectations from the Role
- 6–8 years of experience as a data engineer, with extensive development experience using Snowflake or a similar data warehouse technology
- Strong technical expertise with DBT, Snowflake, PySpark, Apache Airflow, AWS
- Strong hands-on experience designing and building robust ELT pipelines using DBT on Snowflake, including ingestion from relational databases, cloud storage, flat files, and APIs
- Enhance DBT/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design
- Hands-on experience with SQL and Snowflake database design
- Hands-on experience with AWS, Airflow, and Git
- Strong analytical and problem-solving skills
- Degree in Computer Science, IT, or similar field; a Master’s is a plus
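To make the ELT expectation above concrete: candidates should be comfortable landing raw data from sources such as flat files or APIs before transforming it downstream (e.g. in DBT on Snowflake). A minimal, stdlib-only Python sketch of the extract-and-load step is shown below; all names and the sample data are hypothetical, and a real pipeline would use Snowflake connectors orchestrated by Airflow rather than in-memory lists.

```python
import csv
import io


def extract_flat_file(raw_text: str) -> list[dict]:
    """Parse CSV flat-file content into row dicts (one of several ingestion sources)."""
    return list(csv.DictReader(io.StringIO(raw_text)))


def load_raw(rows: list[dict], raw_table: list[dict]) -> None:
    """ELT style: land rows untouched in a raw layer; transformation happens later."""
    raw_table.extend(rows)


# Hypothetical flat-file content standing in for a file from cloud storage.
sample = "id,amount\n1,10.5\n2,20.0\n"

warehouse_raw: list[dict] = []
load_raw(extract_flat_file(sample), warehouse_raw)
```

The key design point, as opposed to classic ETL, is that rows are loaded unmodified so the warehouse retains a faithful raw layer for downstream DBT models to transform.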