Data Engineer Scala Spark GCP

London, United Kingdom

Job Description

Data Engineer Scala Spark GCP

Data Engineer (Scala Spark GCP Big Data) *Remote Interview WfH*. A technologist Data Engineer is sought to work on new innovations for a technology-savvy media / publishing group.

As a Data Engineer you will create and maintain the data pipeline and its associated analytics tools, insights and business performance metrics; you'll be responsible for building the infrastructure for optimal extraction, transformation and loading of data. You'll be working with the latest technology on a modern cloud-based stack encompassing Scala, Python, Apache Spark, BigQuery, DBT, Apache Airflow, AWS and GCP, improving access to data and data insights as well as optimising huge datasets.

The company can currently offer a remote interview and onboarding process as well as the ability to work from home throughout the term of the contract (although occasional visits to the London office could be required).

Requirements:

  • Strong Scala development skills, ideally also Python
  • Strong Apache Spark experience
  • Strong knowledge of cloud technologies, primarily GCP and AWS
  • Experience with data warehouse technologies including BigQuery, DBT and Apache Airflow
  • Used to dealing with large datasets / Data Lakes
  • Excellent communication / collaboration skills

Apply now or call to find out more about this Data Engineer (Scala Spark Cloud Big Data) contract opportunity.

Job Role: Data Engineer (Scala Spark GCP); Location: Remote WfH; Rate: £600 to £650 p/day; Term: 6 months; Start: Immediate / ASAP

