Duties & Responsibilities
- Design, develop, and maintain robust data pipeline architecture.
- Ensure the availability, reliability, and scalability of the data platform.
- Build infrastructure for efficient data extraction, transformation, and loading (ETL).
- Enable efficient consumption of data from the warehouse by downstream users and applications.
- Ensure data pipelines meet their service-level agreements (SLAs).
- Perform data analysis to uncover insights and support business decisions.
- Create reports, dashboards, and visualizations.
- Apply statistical methods and data modeling for predictive insights.
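The extract/transform/load responsibilities above can be illustrated with a minimal sketch. The CSV source, column names, and in-memory SQLite target here are hypothetical placeholders for whatever sources and warehouse the role actually involves:

```python
import csv
import sqlite3
from io import StringIO

# Hypothetical raw feed; in practice this would come from an upstream system.
RAW_CSV = """order_id,amount,currency
1,19.99,usd
2,5.00,USD
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse CSV text into row dicts."""
    return list(csv.DictReader(StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize types and currency casing."""
    return [
        {
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"]),
            "currency": r["currency"].upper(),
        }
        for r in rows
    ]

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: write transformed rows into a warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :currency)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```

Real pipelines would swap the in-memory steps for an orchestrated workflow (e.g. scheduled tasks with retries and monitoring), but the extract/transform/load separation stays the same.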
Requirements
- Advanced SQL knowledge and experience with various databases.
- Proven experience building and optimizing data pipelines and datasets.
- Experience with data transformation, structures, and workload management.
- Familiarity with message queuing, stream processing, and big data stores.
- Track record of extracting value and insights from large datasets.
- Strong analytical skills.
- Proficiency with data analysis and visualization tools (e.g., Tableau, Power BI).
- Understanding of statistical analysis.
- Experience with:
- Big data tools (e.g., Hadoop, Spark, Kafka).
- Data pipeline and workflow management tools.
- Cloud services (e.g., AWS, Azure, GCP).
- Programming languages (e.g., Python, Java).