
MLOps Data Engineer (MLOps DE)

Boost – Kuala Lumpur, Malaysia



Job description

Job Overview

The role bridges the gap between data science and IT operations, enabling seamless model lifecycle management and scalable ML infrastructure.

It involves designing, developing, and optimizing data pipelines for both structured and unstructured data to support reliable and efficient data workflows across the organization, underpinning Boost’s unified data modelling across AI, machine learning, and analytics projects.

The role collaborates with cross‑functional teams—including data scientists, analysts, software engineers, and DevOps—to optimize the production and deployment of machine learning solutions and enhance data‑driven decision‑making.

Scope & Authority

  • Develop and automate deployment pipelines for machine learning models, ensuring smooth transition from development to production.

  • Implement monitoring systems to track model performance and stability, addressing model drift or data drift issues.

  • Design, build, manage and maintain end‑to‑end model lifecycle processes including pipelines, CI/CD, performance evaluation, experiments, observability, deployments, versioning, and serving.

  • Manage and optimize ML infrastructure and cloud platforms (e.g., AWS, GCP, Azure, Databricks, Snowflake).

  • Support model scalability, reliability, and reproducibility in production by working with data scientists and DevOps.

  • Maintain documentation for model pipelines, infrastructure, and deployment procedures to ensure consistency and compliance.

  • Design, develop, manage, and maintain scalable ETL/ELT data pipelines to support ingestion, transformation, and integration of data from multiple sources.

  • Design and implement robust data architectures, models, and storage solutions for efficient processing and accessibility.

  • Optimize and manage data warehouses, ensuring high availability, reliability, and performance; implement data quality checks, monitoring, and governance protocols.

  • Identify and address performance bottlenecks in implemented solutions.

  • Maintain thorough documentation of data pipelines, catalogs, workflows, and lineage for transparency and reproducibility.

  • Support the ongoing Centralized Data Platform initiative between Boost entities, aiming to create a single view of Boost users and merchants.

  • Automate data pipelines to reduce manual effort and improve overall efficiency.

  • Follow best practices when designing and implementing solutions.
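One of the monitoring responsibilities above is catching data drift in production. Purely as an illustration (this is not part of Boost's described stack), here is a minimal pure-Python sketch of the Population Stability Index, one common drift metric; the bin count and the conventional 0.1/0.25 alert thresholds are assumptions, not requirements of the role.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline ('expected') sample
    and a live ('actual') sample. By convention, PSI < 0.1 suggests no
    significant drift and PSI > 0.25 suggests major drift."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # guard against a zero-width range

    def bucket_fractions(sample):
        counts = [0] * bins
        for x in sample:
            counts[min(int((x - lo) / width), bins - 1)] += 1
        # tiny smoothing term avoids log(0) for empty buckets
        return [(c + 1e-6) / (len(sample) + bins * 1e-6) for c in counts]

    e = bucket_fractions(expected)
    a = bucket_fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [x / 10 for x in range(1000)]      # scores seen at training time
drifted = [x / 10 + 40 for x in range(1000)]  # live scores shifted upward
print(psi(baseline, baseline))        # ~0.0: identical distributions
print(psi(baseline, drifted) > 0.25)  # True: clear drift
```

In practice a check like this would run as a scheduled task inside the pipeline orchestrator (e.g., an Airflow task) and feed an alerting system rather than print to stdout.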

Qualifications

  • Bachelor’s Degree in Computer Science, Data Engineering, Machine Learning, or a related field, with at least 3 years of experience designing, developing, and maintaining large data warehouse and analytics projects.

  • Strong problem‑solving skills, collaboration skills, adaptability to evolving technology, commitment to process improvement, attention to detail, and the ability to communicate technical concepts effectively to non‑technical stakeholders.

  • Strong knowledge of cloud platforms for data solutions (AWS, Azure, GCP, Databricks, Snowflake).

  • Strong knowledge of ETL/ELT tools (e.g., Apache Airflow, AWS Glue, Databricks Jobs/Pipelines).

  • Proficiency in data modeling and schema design.

  • Proficiency in programming languages such as Python or Bash scripting.

  • Proficiency in SQL and data warehouses (e.g., Redshift, BigQuery, Databricks, Snowflake, or similar).

  • Familiarity with ML/MLOps frameworks (e.g., MLflow, TensorFlow, PyTorch, scikit‑learn).

  • Familiarity with data governance frameworks and best practices.

  • Familiarity with data lake architectures.

  • Familiarity with big data processing frameworks (e.g., Apache Spark, Hadoop).

  • Familiarity with Infrastructure as Code (IaC) tools (e.g., Terraform, Terragrunt, Serverless Framework).

  • Experience with CI/CD tools (e.g., Jenkins, GitLab CI/CD).



Required Skill Profession

IT & Technology




