Design, build, and maintain scalable, secure, and efficient data pipelines on AWS.
Develop ETL/ELT workflows using services such as AWS Glue, AWS Lambda, EMR, Step Functions, and Kinesis.
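The ETL/ELT workflows named above are typically authored as PySpark scripts in AWS Glue. Purely as an illustrative, dependency-free sketch of the extract-transform-load shape of such a job (all names here are hypothetical, and a real Glue job would use the `awsglue`/PySpark APIs rather than the standard library):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV rows into dicts (stand-in for reading from S3)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast types and derive fields, dropping malformed rows."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip rows whose amount cannot be parsed
        out.append({"order_id": row["order_id"],
                    "amount": amount,
                    "is_large": amount >= 100.0})
    return out

def load(rows: list[dict]) -> dict[str, list[dict]]:
    """Load: bucket rows by a key (stand-in for writing partitioned Parquet)."""
    partitions: dict[str, list[dict]] = {}
    for row in rows:
        key = "large" if row["is_large"] else "small"
        partitions.setdefault(key, []).append(row)
    return partitions

raw = "order_id,amount\nA1,250.0\nA2,40.0\nA3,not-a-number\n"
result = load(transform(extract(raw)))
```

In a production pipeline the same three stages would read from and write to S3, with Glue's Data Catalog tracking the resulting partitions.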
Integrate data from APIs, databases, SaaS platforms, and streaming sources.
Build and manage data lakes using Amazon S3 and associated Lake Formation/Data Catalog components.
Develop optimized data models, dimensional models, and analytical datasets for downstream consumption.
Work with data warehousing technologies such as Amazon Redshift or Snowflake (optional).
Collaborate with architects to design scalable, cost-effective AWS data architectures.
Optimize performance of ETL pipelines, storage layers, and compute clusters.
Implement best practices for monitoring, logging, and cloud resource utilization.
Implement data quality checks, validation rules, and automated monitoring.
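One common shape for such data-quality checks is a table of per-field validation rules applied to each record, with failures collected for monitoring rather than raised; a dependency-free sketch (the rule names and thresholds are illustrative only):

```python
# Each rule maps a field name to a predicate; a record passes a rule when the
# field is present and the predicate returns True.
RULES = {
    "customer_id": lambda v: isinstance(v, str) and v != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(record: dict) -> list[str]:
    """Return the names of the rules this record violates (empty = clean)."""
    failures = []
    for field, predicate in RULES.items():
        if field not in record or not predicate(record[field]):
            failures.append(field)
    return failures

clean = {"customer_id": "C42", "amount": 19.99}
dirty = {"customer_id": "", "amount": -5}
```

Wiring the failure counts into CloudWatch metrics or alarms would turn this into the automated monitoring the role calls for.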
Ensure security compliance using IAM, encryption, data access control, and AWS governance tools.
Maintain documentation, lineage, and metadata for all data assets.
Partner with data analysts, data scientists, BI teams, and product stakeholders.
Translate business needs into technical design specifications.
Participate in sprint planning, design reviews, and technical workshops.
Required Skills & Experience (3-12 Years)
Candidates will be considered at junior-mid, mid-senior, or senior levels depending on experience.
Core Technical Skills
Hands-on experience with AWS data services, such as:
AWS Glue (ETL / PySpark)
Amazon S3
AWS Lambda
Amazon EMR / Spark
Amazon Redshift or Redshift Spectrum
AWS Step Functions
Amazon Kinesis or MSK
Strong SQL skills and proficiency in Python (PySpark experience preferred).
Experience designing and maintaining data pipelines for batch and/or streaming workloads.
Expertise in data modelling, data warehousing concepts, and data lake architectures.
Experience with version control (Git) and CI/CD pipelines (e.g., AWS CodePipeline, GitHub Actions).
Experience with Terraform or CloudFormation for IaC.
Knowledge of Snowflake, Databricks, or other cloud data platforms.
Exposure to MLops / data orchestration frameworks.
AWS certifications (e.g., AWS Data Analytics - Specialty, AWS Solutions Architect) are a plus.
Job Types: Full-time, Permanent
Pay: £55,000.00-£110,000.00 per year
Benefits:
* Work from home