Big Data Engineer (Agile/DevOps) - 60-75k
Location: Hong Kong
- Create standardized data engineering frameworks for data integration (ELT/ETL programs); design, plan, and handle platform upgrade activities
- Build, code, test, and maintain high-quality software in close collaboration with the QA team and country stakeholders
- Perform proofs of concept (POCs) with new big data technologies and deliver results in line with the team's strategy
- Participate in Agile sprints and ceremonies; support rapid iteration and development
- Develop, maintain, and test data pipelines, application frameworks, and infrastructure for data generation; work closely with information architects and data scientists
- Experience initializing and managing cloud platforms such as AWS and Azure
- Hands-on big data engineering skills, e.g. Apache NiFi, Hive, HBase, and HDFS, plus data streaming tools such as Kafka and Spark
- Familiarity with Agile and DevOps principles, test-driven development, continuous integration, and other approaches that accelerate the delivery of new features
- Familiarity with platform technologies such as HDP, HDF, or an equivalent big data platform
- Understanding of software development lifecycle
Contact Ms. Alexandra Leung at (852) 3103 4312 or APPLY NOW by clicking the button below.
Data provided is for recruitment purposes only.