Data Engineer
Job Duties:
- Translate data pipeline requirements into data pipeline designs; guide and direct the design by working closely with a range of stakeholders, including the rest of the architecture team, external developers, data consumers, data providers, and internal/external business users.
- Contribute to use case development, e.g. workshops to gather and validate business requirements.
- Model and design the ETL pipeline data structures, storage, integration, integrity checks, and reconciliation. Standardize all exception control and ensure good traceability during troubleshooting.
- Document and write technical specifications for the functional and non-functional requirements of the solution.
- Design data models/platforms that allow managed growth, minimizing the risk and cost of change for a large-scale data platform.
- Analyze new data sources with a structured Data Quality Evaluation approach and work with stakeholders to understand the impact of integrating new data into existing pipelines and models.
- Bridge the gap between business requirements and ETL logic by troubleshooting data discrepancies and implementing scalable solutions.
Skills / Knowledge:
- Bachelor's degree (or higher) in mathematics, statistics, computer science, engineering, or a related field.
- Proficient in both spoken and written English and Chinese (Mandarin/Cantonese).
- Strong technical understanding of data modelling, design, and architecture principles and techniques across master data, transaction data, and data warehouses.
- Experience with stored procedures (PL/SQL) and SQL DDL/DML.
- Proactive, with strong problem-solving, multitasking, and task-management skills.
- At least 8 years of IT experience, including 2 years on data migration and/or data pipeline projects.
Hong Kong