

Sr./Quality Engineer (Cloud Solutions)



Job description

Overview

Job Title: Quality Engineer (Cloud Solutions)

Working Arrangement: Hybrid

Location: Petaling Jaya

About Horizontal: Established in 2003 in the US, Horizontal solves complex challenges across two distinct businesses: Horizontal Digital and Horizontal Talent.

We are consistently recognized for being a top workplace and one of the fastest-growing private companies.

Horizontal Talent specializes in staffing for IT, Digital & Creative, and Business & Strategy markets.

We have global offices in the US, UAE, India, Malaysia, and Australia.

Role Summary: We are looking for a detail-oriented and passionate Quality Engineer to ensure the reliability, quality, and integrity of modern cloud-based data platforms and software components.

This role will test complex data workflows involving identity resolution, data tokenization, and privacy-compliant data sharing, with a strong focus on quality practices in cloud and data ecosystems like Databricks, Snowflake, AWS, Azure, and GCP.

You will play a key role in validating ETL/ELT pipelines, data transformations, and integrations that power high-scale marketing, analytics, and identity systems.

Responsibilities

  • Execute manual and automated QA test cases for data pipelines and validation processes.

  • Test batch and streaming data pipelines on Databricks, Snowflake, and Spark environments.

  • Validate ETL/ELT workflows using orchestration tools such as Airflow, dbt, or native cloud schedulers.

  • Validate data ingestion from various sources (S3, ADLS, GCS, Kafka) into staging and curated zones.

  • Perform field-level and schema-level validation between raw and transformed datasets.

  • Validate business logic transformations, aggregations, joins, and data quality rules.

  • Develop and maintain SQL- or Python-based test scripts to automate validation steps and data assertions.

  • Collaborate with Data Engineers to ensure pipeline coverage for edge cases, data anomalies, and lineage tracking.

  • Run data profile checks to detect duplicates, null values, incorrect mappings, and data drift.

  • Test and validate secure data-sharing setups such as Snowflake Clean Rooms and Databricks Delta Sharing; cross-cloud or multi-party identity joins and tokenization flows.

  • Perform QA of source-to-target mappings and documentation updates based on data lineage and metadata.

  • Integrate data validations into CI/CD pipelines using tools like GitHub Actions, Azure DevOps, or GitLab.

  • Develop and maintain test documentation, QA best practices, and automation frameworks.

  • Track and report on QA metrics and defects using modern bug-tracking tools.
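To make the data-profiling responsibility above concrete, here is a minimal sketch of the kind of check the role describes: detecting duplicates and null values in a batch of records. The helper name `profile_rows` and the sample fields are illustrative assumptions, not part of the posting; a real implementation would typically run against Databricks or Snowflake rather than in-memory rows.

```python
# Illustrative data-profile check: count nulls per field and duplicate rows
# by key. Names (profile_rows, user_id, email) are hypothetical examples.
from collections import Counter


def profile_rows(rows, key_fields):
    """Count null values per field and duplicate rows sharing key_fields."""
    null_counts = Counter()
    keys = Counter()
    for row in rows:
        for field, value in row.items():
            if value is None:
                null_counts[field] += 1
        keys[tuple(row.get(f) for f in key_fields)] += 1
    # Each key seen n times contributes n - 1 duplicate rows.
    duplicates = sum(count - 1 for count in keys.values() if count > 1)
    return {"nulls": dict(null_counts), "duplicates": duplicates}


# Example batch: two records share user_id 1, and one email is null.
batch = [
    {"user_id": 1, "email": "a@x.com"},
    {"user_id": 1, "email": "b@x.com"},
    {"user_id": 2, "email": None},
]
report = profile_rows(batch, key_fields=["user_id"])
```

In practice such a check would be wired into the CI/CD pipeline (GitHub Actions, Azure DevOps, or GitLab, as listed above) so that profile regressions fail the build.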

Qualifications

  • 3+ years of experience in QA, data validation, or data pipeline testing.

  • Strong SQL skills with the ability to write data validation queries across complex joins and aggregations.

  • Experience with scripting in Python for test automation.

  • Familiarity with data formats such as Parquet, Avro, JSON, and Delta Lake.

  • Familiarity with batch vs. streaming data processing concepts.

  • Familiarity with cloud storage and compute concepts in AWS (S3, Glue), Azure (Data Lake, Synapse), or GCP (BigQuery, Dataflow).

  • Familiarity with QA practices: test planning, exploratory testing, regression, and bug tracking.

  • Familiarity with one or more test frameworks (e.g., Mocha, Jasmine, Jest).

  • Good communication skills, attention to detail, and a team-oriented mindset.
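The "data validation queries" and Python scripting skills listed above can be illustrated with a small source-to-target reconciliation sketch: comparing per-key checksums between a raw and a curated dataset. The function names and sample fields are assumptions for illustration only; the same logic is typically expressed in SQL against staging and curated zones.

```python
# Hypothetical source-to-target validation: flag keys that are missing from
# the target or whose field-level checksums differ from the source.
import hashlib


def row_checksum(row, fields):
    """Stable checksum over the selected fields, tolerant of None values."""
    payload = "|".join(str(row.get(f)) for f in fields)
    return hashlib.sha256(payload.encode()).hexdigest()


def validate_source_to_target(source, target, key, fields):
    """Return keys missing from target and keys whose checksums mismatch."""
    src = {r[key]: row_checksum(r, fields) for r in source}
    tgt = {r[key]: row_checksum(r, fields) for r in target}
    missing = sorted(set(src) - set(tgt))
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return {"missing": missing, "mismatched": mismatched}


# Example: record id=2 was transformed incorrectly in the curated zone.
raw = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
curated = [{"id": 1, "amount": 10}, {"id": 2, "amount": 99}]
result = validate_source_to_target(raw, curated, key="id", fields=["amount"])
```

Frameworks such as Great Expectations (mentioned in the preferred qualifications) package this kind of assertion into reusable, declarative suites.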

Preferred Qualifications

  • Exposure to data orchestration tools like Airflow, dbt, or cloud-native schedulers (e.g., AWS Step Functions, Azure Data Factory).

  • Experience using Databricks notebooks or Snowflake SQL worksheets for test development.

  • Basic understanding of distributed compute concepts (Spark, Delta Live Tables).

  • Familiarity with data quality frameworks (e.g., Great Expectations, Deequ) or metadata management tools (e.g., Collibra, Alation).

  • Interest in learning identity resolution workflows, tokenization, and privacy-first data architecture.

  • Understanding of regulatory compliance frameworks (e.g., GDPR, CCPA) and secure data-sharing practices.

Seniority level

  • Mid-Senior level

Employment type

  • Full-time

Job function

  • Quality Assurance

Industries

  • Staffing and Recruiting



Required Skill Profession

IT & Technology


