As a Data Engineer, you'll design and maintain data scrapers and data pipelines, design and optimize analytics and relational databases, and build analytics models using dbt and bespoke aggregation engines. You'll work closely with business stakeholders, BI Developers, DataOps, and Systems Engineers to support both data and application integrations, using bespoke tools written in Python/Java alongside platforms such as Meltano, Airflow, MuleSoft/SnapLogic, Apache NiFi, and Kafka, ensuring a robust, well-modelled, and scalable data analytics infrastructure running primarily on MySQL and PostgreSQL databases.
Requirements:
-----------------
Essential Skills / Experience:
SQL, Data Modelling & Database Administration
Advanced SQL development and deep understanding of RDBMS concepts and engines
5+ years' experience designing warehouse schemas (Kimball-style dimensional/star models or Inmon-style normalized models)
Practical knowledge of Data Warehouse infrastructure architecture and best practices
Practical knowledge of database design, development, and administration (any RDBMS)
Strong understanding of data governance, quality, and privacy (e.g. GDPR compliance)
Data Transformation & Pipelines
Proficiency in ELT/ETL processes
Strong experience with data ingestion, transformation, and orchestration technologies, whether commercial ETL tools (Informatica, DataStage, SSIS, etc.) or open-source tools such as Meltano, Airbyte, and Airflow
Proven experience with dbt (data build tool)
Analytics & Dashboarding
Proficiency with business intelligence tools (Power BI, Tableau, SAP BI, or similar)
Integration & Programming
Hands-on experience with API development and integration (REST/SOAP)
Proficiency in at least one object-oriented, procedural, or functional language (e.g. Java, PHP, Python)
Familiarity with EAI tools such as MuleSoft/SnapLogic or Apache NiFi
Desirable Skills / Experience:
Experience with infrastructure-as-code tools such as Terraform and Ansible
Experience with version control (e.g. Git, SVN) and CI/CD workflows for deployment
Experience scraping external data sources using Beautiful Soup, Scrapy, or similar
Familiarity with Database Replication & CDC technologies such as Debezium
Familiarity with message- and event-driven architecture, including tools like Amazon MQ and Kafka
Exposure to cloud database services (e.g., AWS RDS, Snowflake)
Benefits:
-------------
25 days of holiday
Bonus
Pension contribution
Private medical, dental, and vision coverage
Life assurance
Critical illness cover
Wellness contribution program with access to ClassPass
Plumm Platform
Five volunteering days
Give as You Earn initiative
Learning and development programs
Electric Vehicle Scheme
Cycle to Work Scheme
Season Ticket Loan
Location:
London, United Kingdom