Join Barclays as a Cloud Data Engineer - Python and become part of the Financial Metrics Reporting team within Private Bank. You will be responsible for building, deploying, and maintaining data pipelines on cloud platforms, ensuring data quality, security, and scalability, and collaborating with teams to deliver actionable insights.
To be successful, you should have:
Strong proficiency in Python or Scala programming languages.
Experience with Apache Spark and big data processing frameworks.
Hands-on experience with AWS development, including some of the following services: Lambda, Glue, Step Functions, IAM roles, Lake Formation, EventBridge, SNS, SQS, EC2, Security Groups, CloudFormation, RDS, and DynamoDB.
Other highly valued skills may include:
Designing, building, and deploying data pipelines and workflows using Databricks.
Proven ability to design and develop enterprise-level software solutions using tools and techniques such as source control, build tools (e.g., Maven), TDD, and GitLab.
Knowledge of streaming services such as Kafka, MSK, Kinesis, or Glue Streaming.
Experience with Databricks Delta Lake or MLflow.
You may be assessed on key critical skills relevant to success in this role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology skills, as well as job-specific technical expertise.
This role is based in Glasgow.