

AVP, Data Integration (pySpark, NiFi, Hadoop)



Job description

AVP, Data Integration (pySpark, NiFi, Hadoop)

Maybank

Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia


  • Implement ETL systems that are operationally stable, efficient, and automated.

    This includes technical solutions that are scalable, aligned with the enterprise architecture, and adaptable to business changes.

  • Collaborate with internal and external teams to define requirements for data integrations, specifically for Data Warehouse/Data Mart implementations.

Responsibilities of the Role

  • Review business and technical requirements to ensure the data integration platform meets specifications.

  • Apply industry best practices for ETL design and development.

  • Produce technical design documents, system testing plans, and implementation documentation.

  • Conduct system testing: execute job flows, investigate and resolve system defects, and document results.

  • Work with DBAs, application specialists, and technical support teams to optimize ETL system performance and meet SLAs.

  • Assist in developing, documenting, and applying best practices and procedures.

Requirements for the Role

  • Strong SQL writing skills are required.

  • Familiarity with ETL tools such as pySpark, NiFi, Informatica, and Hadoop is preferred (see the brief illustrative sketch after this list).

  • Understanding of data integration best practices, including master data management, entity resolution, data quality, and metadata management.

  • Experience with data warehouse architecture, source system data analysis, and data profiling.

  • Ability to work effectively in a fast-paced, adaptive environment.

  • Financial domain experience is a plus.

  • Ability to work independently and communicate effectively across various levels, including product owners, executive sponsors, and team members.

  • Experience working in an Agile environment is advantageous.
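
For illustration only (not part of the original posting): a minimal pySpark sketch of the kind of ETL job the requirements above describe: extract a raw file, apply simple data-quality rules, and load a warehouse table. The HDFS path, column names, and target table are hypothetical placeholders.

    # Minimal illustrative pySpark ETL job: extract, basic data-quality checks, load.
    # The HDFS path, column names, and target table are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("customer_dim_load").getOrCreate()

    # Extract: read a raw delimited extract landed on HDFS
    raw = spark.read.option("header", True).csv("hdfs:///landing/customers/")

    # Transform: drop records without a key, de-duplicate, stamp the load time
    clean = (
        raw.filter(F.col("customer_id").isNotNull())
           .dropDuplicates(["customer_id"])
           .withColumn("load_ts", F.current_timestamp())
    )

    # Load: write to a warehouse/data-mart table
    clean.write.mode("overwrite").saveAsTable("dw.customer_dim")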

Qualifications

  • Bachelor’s Degree in Computer Science, Information Technology, or equivalent.

  • Over 5 years of total work experience, including programming ETL processes using Informatica, NiFi, pySpark, and Hadoop.

  • At least 4 years of experience in data analysis, profiling, and designing ETL systems/programs.
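
For illustration only: a short pySpark sketch of the source-system data analysis and profiling mentioned above, reporting null counts and distinct values per column. The staging table name is a hypothetical placeholder, and the per-column loop favours clarity over efficiency.

    # Simple column-level profiling of a source/staging table (illustrative sketch).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("profile_source").getOrCreate()
    df = spark.table("staging.accounts")  # hypothetical staging table

    total = df.count()
    for c in df.columns:
        nulls = df.filter(F.col(c).isNull()).count()
        distinct = df.select(c).distinct().count()
        print(f"{c}: {nulls} nulls, {distinct} distinct values in {total} rows")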

Seniority level

  • Mid-Senior level

Employment type

  • Full-time

Job function

  • Information Technology

Industries

  • Banking




