Design, build, and maintain scalable data pipelines using Snowflake, Hadoop, Spark, NiFi, and related big data technologies.
Implement data architectures and optimize workflows for massive financial datasets.
Write high-quality, maintainable code in Python and SQL, following best practices.
Integrate data governance principles, metadata management, and lineage tracking into solutions.
Data Quality Assurance & Testing
Develop automated testing frameworks and validation scripts for ETL processes and data transformations.
Implement data quality checks, reconciliation processes, and regression testing suites to ensure accuracy, completeness, and timeliness.
Perform unit, integration, and end-to-end testing for data pipelines and schema changes.
Use tools such as dbt tests and custom Python utilities for automated validation.
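As a concrete illustration of the kind of custom Python validation utility described above (a minimal sketch only; function names, field names, and the financial-trade example are hypothetical, not part of this role's codebase):

```python
# Minimal sketch of two automated data-quality checks, assuming records
# arrive as lists of dicts (e.g. rows fetched from Snowflake or Spark).
# All names here are illustrative.

def check_completeness(rows, required_fields):
    """Return (row_index, field) pairs where a required field is null or empty."""
    failures = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                failures.append((i, field))
    return failures

def reconcile_counts(source_count, target_count, tolerance=0):
    """Compare source vs. target row counts after an ETL load."""
    return abs(source_count - target_count) <= tolerance

trades = [
    {"trade_id": "T1", "amount": 100.0},
    {"trade_id": "T2", "amount": None},  # missing amount: fails the null check
]
bad = check_completeness(trades, ["trade_id", "amount"])
counts_ok = reconcile_counts(source_count=2, target_count=2)
```

Checks like these are typically wired into the pipeline so a non-empty failure list or a failed reconciliation blocks promotion of the load.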
Collaboration & Agile Delivery
Work closely with Data Engineers, Product, and Data Science teams to embed testing into the development lifecycle.
Participate in agile ceremonies (sprint planning, backlog refinement, retrospectives) with a focus on quality and delivery.
Support production incident response with rapid data validation and root cause analysis.
Continuous Improvement
Stay current with emerging data engineering and testing technologies.
Contribute to team knowledge sharing, mentoring junior engineers, and improving technical standards.
Shape best practices for data reliability, testing automation, and CI/CD integration.
Skills & Qualifications
Core Technical Expertise
Advanced SQL and experience with relational and NoSQL databases.
Strong experience with Snowflake, Hadoop, Spark, Databricks, Kafka, and cloud data platforms.
Proficiency in Python for both data engineering and test automation.
Familiarity with orchestration tools and workflow management systems.
Testing & Quality
Proven experience in data testing methodologies, ETL validation, and automated testing frameworks.
Knowledge of data profiling, anomaly detection, and statistical validation techniques.
Experience integrating testing into CI/CD pipelines.
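For illustration only (the posting does not prescribe a specific tool), a pipeline check written as a plain pytest-style test is one common way such testing is integrated into a CI/CD pipeline; the transformation and names below are invented:

```python
# Hypothetical example of a pipeline test runnable in CI (e.g. via pytest).
# The transformation is a toy stand-in for a real ETL step.

def transform(rows):
    """Toy transformation: drop rows with non-positive amounts."""
    return [r for r in rows if r["amount"] > 0]

def test_transform_drops_invalid_rows():
    rows = [{"amount": 10.0}, {"amount": -5.0}, {"amount": 3.5}]
    out = transform(rows)
    assert len(out) == 2
    assert all(r["amount"] > 0 for r in out)
```

Running such tests on every commit means a schema or logic regression fails the build before the pipeline reaches production data.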
Professional Attributes
Strong problem-solving and analytical skills with attention to detail.
Excellent communication skills for cross-functional collaboration.
Ability to work independently and manage multiple priorities in fast-paced environments.
Job Type: Fixed term contract
Contract length: 6 months
Pay: 80,000.00-85,000.00 per day
Experience:
Data Engineering: 5 years (required)
Snowflake: 3 years (required)
Hadoop: 3 years (required)
Spark: 3 years (required)
NiFi: 2 years (required)
Python: 2 years (required)
SQL: 2 years (required)
Databricks: 3 years (required)
Kafka: 1 year (required)
Data testing methodologies: 3 years (required)
CI/CD pipelines: 2 years (required)