- Work closely with business stakeholders to develop and maintain a centralized data warehouse and data pipelines for data management
- Design, develop, maintain, and manage data pipelines, data quality, and data management processes to meet core system, project, and business requirements
- Oversee and manage the workflow of data quality management, data cleansing, and data exchange processes by designing and implementing rules and best practices
- Write SQL, PL/SQL, and Python to run queries and support data analysis work

Job requirements:
- Solid knowledge of databases, data modelling, data cleaning, transformation, data warehousing, and data lakes
- Good command in both Cantonese and English
- Solid experience in designing, developing, and maintaining Enterprise Data Warehouse, Business Intelligence, and data visualization environments
- Experience in designing and delivering solutions within big data environments such as AWS, MongoDB, etc.
- Experience in Pension Business is preferred
- Degree or above in Computer Science or related disciplines
- At least 3 years of working experience in Data Management
- Solid experience in Data Management or Data Engineering
- Passionate about data management, data analysis, and business intelligence
- Strong team player with a can-do attitude