
Developer - Hadoop, Teradata, Python (Luxoft, Kuala Lumpur)



Job description

Project description

Our client, a leading bank in Asia with a global network of more than 500 branches and offices across 19 countries and territories in Asia Pacific, Europe, and North America, is looking for consultants to be part of the project.

The Technology and Operations function comprises five teams of specialists with distinct capabilities: business partnership, technology, operations, risk governance, and planning support and services.

They work closely together to harness the power of technology to support the bank's physical and digital banking services and operations.

This includes developing, centralising, and standardising technology systems as well as banking operations in Malaysia and overseas branches.

The client has more than 80 years of history in the banking industry and is expanding its footprint in Malaysia.

You will be working in a newly set-up technology centre located in Kuala Lumpur as part of Technology and Operations to deliver innovative financial technology solutions that enable business growth and technology transformation.

Responsibilities

  • Design, develop, and maintain data pipelines and ETL workflows using the Informatica Data Integration Suite, Python, and R (a minimal pipeline sketch follows this list).

  • Build and optimize large-scale data processing systems on Cloudera Hadoop (6.x) and Teradata IntelliFlex platforms.

  • Implement data ingestion, transformation, and storage solutions integrating diverse data sources, including Oracle, SQL Server, PostgreSQL, and AS400.

  • Develop and deploy dashboards and analytics solutions using QlikSense, Microsoft Power BI, and other visualization tools.

  • Collaborate with business teams to deliver analytics and decision-support solutions across domains such as Credit Risk Analytics, Credit Scoring, Treasury & Wealth Management, and Trade Finance.

  • Leverage data science tools (Python, R Studio, Kafka, Spark) to support predictive modeling, scoring, and advanced analytics use cases.

  • Participate in code reviews, performance tuning, and data quality validation using tools such as QuerySurge, SonarQube, and JIRA.

  • Automate workflows, deployments, and job scheduling using Jenkins, Control-M, and Bitbucket.

  • Ensure scalability, security, and governance of data solutions in production environments across Linux, AIX, Windows, and AS400 platforms.
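
For illustration only, here is a minimal sketch of the kind of Spark-based ingestion pipeline these responsibilities describe: reading a relational source over JDBC, applying a light transformation, and persisting the result to Hive. Every identifier in it (the JDBC URL, table names, credentials) is a placeholder assumption, not a detail taken from this posting.

    # Hypothetical sketch: connection details and table names are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("ingest_sketch")
        .enableHiveSupport()   # assumes a Hive metastore is configured
        .getOrCreate()
    )

    # Read a source table over JDBC (Oracle shown; SQL Server or PostgreSQL
    # would differ only in driver and URL).
    raw = (
        spark.read.format("jdbc")
        .option("url", "jdbc:oracle:thin:@//db-host:1521/SERVICE")  # placeholder
        .option("dbtable", "CORE.CUSTOMER_TXN")                     # placeholder
        .option("user", "etl_user")                                 # placeholder
        .option("password", "***")
        .load()
    )

    # Light transformation: standardise a key column and stamp the load date.
    clean = (
        raw.withColumnRenamed("CUST_ID", "customer_id")
           .withColumn("load_dt", F.current_date())
    )

    # Persist to a date-partitioned Hive table for downstream analytics.
    clean.write.mode("append").partitionBy("load_dt").saveAsTable("staging.customer_txn")

In production, a job of this shape would typically be scheduled through Control-M or Jenkins and gated by data-quality checks (e.g. QuerySurge), in line with the automation and validation duties listed above.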

Must have

  • 3 to 5 years' experience in Big Data & Data Engineering: Cloudera Hadoop (6.x), Spark, Hive, HUE, Impala, Kafka
  • ETL & Data Integration: Informatica (BDM, IDQ, IDL), QuerySurge
  • Databases: Teradata IntelliFlex, Oracle, SQL Server, PostgreSQL
  • Data Visualization: QlikSense Discovery, Microsoft Power BI
  • Programming & Analytics: Python, R, R Studio
  • Version Control & Automation: Jenkins, Bitbucket, Control-M
  • OS: AS400, AIX, Linux, Windows
  • Domain Knowledge: at least one of the following:
    • Credit Risk Analytics
    • Credit Scoring & Decision Support
    • Treasury & Wealth Management (Murex)
    • Trade Finance & Accounts Receivable (FITAS, ARF)
    • Retail Banking & Cards (Silver Lake)
    • Data Modeling (FSLDM / Data Marts)

Nice to have

AS400, Experian PowerCurve, SAS



Required Skill Profession

IT & Technology


