
Data Engineer
- Kuala Lumpur
- Permanent
- Full-time
- Use your data engineering experience to provide high-quality, robust datasets to underpin Engineering Operations across the business
- Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques
- Consume data directly from a wide array of sources (APIs, files, OData, graph databases, etc.) and take it from raw, through cleaned and validated states, to model-ready metrics
- Design and implement ETL pipelines that integrate with systems and support modern data workflows
- Handle both structured and unstructured data sources
- Collaborate with cross-functional teams to deliver comprehensive data solutions
- Drive innovation by identifying automation opportunities and experimenting with new tools and technologies
- 3-4 years of data engineering experience, with demonstrated expertise in cloud-based platforms (especially Databricks), Spark, and the Lakehouse concept
- Strong SQL and Python skills
- Experience with API development and integration for data ingestion and system connectivity
- Understanding of ETL processes and data pipeline design
- Knowledge of CI/CD workflows and DevOps practices for data engineering
- Understanding of pipeline orchestration and workflow management
- Experience with Delta Lake and modern data lake architecture
- Experience implementing thorough test processes to produce high-quality deliverables
- Strong communication and problem-solving skills
- Experience with Neo4j or other graph databases for complex relationship modeling
- Understanding of Generative AI
- Knowledge of MCP (Model Context Protocol) or similar AI communication protocols
- Experience with modern workflow orchestration tools
- An understanding of data modelling theory (preferably Kimball), or practical experience with Power BI data modelling (star schema)