IT - Data Lake Developer
A member of the Data Lake team, responsible for building, developing, and maintaining the data pipelines of our strategic Big Data platform, the Data Lake, which serves the bank's management needs and business operations while complying with Data Governance.
- Develop and maintain scalable, reliable data pipelines to ingest data from a variety of sources into the Data Lake, ensuring correct data formats, adherence to data quality standards, and fast data access for downstream users
- Develop and maintain a highly scalable and extensible Big Data platform that enables collection, storage, modeling, and analysis of massive data sets from numerous channels. Define and maintain data pipelines, data structures, and data formats to enable business solutions
- Develop and enable big data and batch/real-time analytical solutions that leverage emerging technologies.
- Work in a team to build next-generation Hadoop data lake and analytics applications on a group of core Hadoop technologies
- Evaluate new technologies and products, and conduct research to identify opportunities that affect business strategy, business requirements, and performance and can accelerate access to data.
- Work with the Advanced Analytics Team to plan and execute high-impact, actionable insight generation through big data advanced analytics, including predictive analytics and advanced Machine Learning technologies, that reduces cost and improves speed to insight by accelerating the pace of Big Data innovation at ACB.
- Ensure proper configuration management and change controls are implemented during code migration.
1. Educational Qualifications
Bachelor's Degree in Information Technology, Data Management, Big Data, or Information Systems.
2. Relevant Knowledge/Expertise
- At least 3 years' experience in any of the following data analysis programming languages is a must: Python, R, Java, Scala, Spark. Knowledge of at least one web programming language is a big plus
- At least 1 year's experience in any of the following NoSQL databases is a must: MongoDB, HBase, Google Bigtable, Couchbase
- At least 1 year's experience with any of the following Hadoop ecosystem components is a must: Flume, HDFS, Kafka, Oozie, Solr, Storm, NiFi. Knowledge of the Cloudera, Hortonworks, or MapR distributions is a plus.
- Experience in multinational projects and cross-functional engineering teams is a plus.
3. Skills
- Ability to read and understand IT materials in English
- Ability to communicate in English; IELTS 6.5 or above, or TOEIC 700 or above, is a plus
- Strong thinking, observation, and analytical skills
- Ability to present ideas to non-technical people
- Flexibility and adaptability in a fast-growing environment
- Knowledge of finance, accounting, business, e-commerce, insurance, or securities is preferred.
4. Relevant Experiences
- 1+ years of overall IT experience, including the following:
- Experience in evolving/managing technologies/tools in a rapidly changing environment to support business needs and capabilities
- Experience with backend microservices, cloud computing platforms such as Google Cloud and AWS, and big data projects in the financial industry is a plus
5. Personal Characteristics
- Hard-working, proactive, creative, and patient
- Strong compliance mindset and attention to detail.