AVP, System Analyst (Data Lake)

Singapore

Job Description

About the Department

Group Technology and Operations (GTO) provides software and system development, information technology support services and banking operations. We have centralized and standardized technology components in Singapore, creating a global footprint that supports our regional subsidiaries and branches around the world. We operate in and support 19 countries with this architecture, providing a secure and flexible banking infrastructure. Our Operations divisions provide transactional customer services for our businesses while also focusing on cost efficiency through process improvements, automation and straight-through processing.

Job Responsibilities

The System Analyst is a member of the Data Analytics Technology team. You will be responsible for end-to-end software development in the data analytics domain, covering projects, quarterly change requests and L3 production fixes. This includes requirement analysis, solution design, development, implementation, testing and support.

You will be responsible for quality assurance of the team's delivery, in conformance with the Bank-defined software delivery methodology and tools. You will partner with other technology functions to help deliver the required technology solutions.

  • Play a key role in the Data Lake, onboarding all critical upstream system data
  • Act as a gatekeeper, ensuring upstream systems and business users follow Data Lake best practices and principles
  • Take care of Data Lake registration and restoring masked production data into UAT

Job Requirements

Technical skillsets

  • Strong data warehouse working experience in the banking domain, with upstream system knowledge
  • Test methodologies and testing tools (e.g. TestNG, JUnit); full SDLC experience, having participated in large-scale live roll-outs as a developer
  • Demonstrated ability to solve complex problems
  • Experience in deployment, support and performance monitoring of Control-M
  • Experience loading data files into EDW staging using Teradata (optional)
  • Good communication and interpersonal skills are a must
  • Unix scripting experience
  • Good knowledge and working experience in Hadoop, ETL (Informatica BDM) and SQL
  • Strong ETL experience is a must (at least 5 years)
  • Database design, programming, tuning and query optimization
  • A great attitude and the flexibility to stretch and take on challenges will be key to success in this role

Functional skillsets
EDW, Data marts, Data Integration

  • Hands-on experience implementing large-scale data warehouse and analytics platforms in the financial services industry is highly advantageous
  • Good functional knowledge of products and services offered in Retail Banking, Wholesale Banking and Global Markets, covering relevant analytics domains