As a Data Engineer, you will manage and maintain Sylndr’s data warehouses, ensuring the efficient processing and storage of large-scale datasets and enabling relevant stakeholders to access the data they need to drive their decisions. You will be part of a multicultural, dynamic team that takes pride in and ownership of its work and encourages positive disruption!
What You’ll Do!
- Build and maintain data pipelines from multiple sources into a centralized data warehouse.
- Deploy and manage Kubernetes workloads, ensuring scalability and reliability.
- Work on data modeling and schema design for efficient data storage and retrieval in Google BigQuery (see the sketch after this list).
- Maintain documentation of data pipelines and system configurations.
- Own and maintain the integrity of Sylndr’s data.
- Troubleshoot data related issues and performance bottlenecks.
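To make the data modeling responsibility above concrete, here is a minimal, hypothetical sketch of defining a partitioned, clustered table with the google-cloud-bigquery Python client. The project, dataset, table, and column names are illustrative only, not Sylndr’s actual schema.

```python
# Hypothetical example: define a partitioned, clustered BigQuery table for
# efficient storage and retrieval. All names below are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials are configured

table = bigquery.Table(
    "my-project.analytics.car_listings",  # placeholder fully-qualified table id
    schema=[
        bigquery.SchemaField("listing_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("price", "NUMERIC"),
        bigquery.SchemaField("listed_at", "TIMESTAMP", mode="REQUIRED"),
    ],
)
# Partition by the event timestamp and cluster by the most common filter column
# so typical queries scan only the partitions and blocks they need.
table.time_partitioning = bigquery.TimePartitioning(field="listed_at")
table.clustering_fields = ["listing_id"]

client.create_table(table, exists_ok=True)
```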
Who You Are!
- Minimum of 3 years in a data engineering role.
- Excellent knowledge of the Python programming language.
- Excellent knowledge of Apache Airflow v2.0+ and experience with the TaskFlow API (a minimal sketch follows this list).
- Excellent knowledge of data warehousing concepts (experience with BigQuery is preferred).
- Excellent knowledge of data infrastructure concepts.
- Experience in managing and orchestrating containerized applications using Kubernetes.
- Background in automating deployment processes and integrating containerized applications into continuous integration and continuous delivery (CI/CD) pipelines.
- Knowledge of machine learning concepts is a PLUS.
- Familiarity with dbt (data build tool) is a PLUS.
- Having an automotive background is a PLUS.
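For context on the Airflow requirement above, below is a minimal sketch of a TaskFlow-style DAG. It assumes Airflow 2.4+ (for the `schedule` argument) and uses hypothetical task logic and table names; it is an illustration of the style, not a production pipeline.

```python
# Minimal, illustrative Airflow TaskFlow DAG: extract -> transform -> load into BigQuery.
# Table names and task logic are hypothetical placeholders.
import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@daily",  # Airflow 2.4+; earlier 2.x releases use schedule_interval
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
)
def example_source_to_warehouse():
    @task
    def extract():
        # In practice this would call a source API or read from an operational database.
        return [{"order_id": 1, "amount": 250.0}, {"order_id": 2, "amount": 99.9}]

    @task
    def transform(rows):
        # Light cleaning/typing before load; heavier modeling can live in the warehouse (e.g. dbt).
        return [{**r, "amount": round(float(r["amount"]), 2)} for r in rows]

    @task
    def load(rows):
        from google.cloud import bigquery  # assumes credentials are configured on the worker

        client = bigquery.Client()
        job = client.load_table_from_json(rows, "analytics.orders_raw")  # hypothetical table
        job.result()  # block until the load finishes so errors surface in the task log

    load(transform(extract()))


example_source_to_warehouse()
```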