Principal GCP Data Engineer
<b>Requirements:</b>
<ul><li>Proven experience delivering production-ready data solutions on Google Cloud Platform</li><li>Strong knowledge of batch and streaming frameworks, data pipelines, and orchestration tools</li><li>Expertise in designing and managing structured and unstructured data systems</li><li>Experience translating business needs into technical solutions</li><li>Ability to mentor and coach teams and guide technical decision-making</li><li>Excellent communication skills, with the ability to explain technical concepts to technical and non-technical stakeholders</li><li>A pragmatic approach to problem solving, combined with a drive for technical excellence</li></ul>
<b>Responsibilities:</b>
<ul><li>Lead the design, development, and delivery of data processing solutions using GCP tools such as Dataflow, Dataproc, and BigQuery</li><li>Design automated data pipelines using orchestration tools like Cloud Composer</li><li>Contribute to architecture discussions and design end-to-end data solutions</li><li>Own development processes for your team, establishing robust principles and methods across architecture, code quality, and deployments</li><li>Shape team behaviours around specifications, acceptance criteria, sprint planning, and documentation</li><li>Define and evolve data engineering standards and practices across the organisation</li><li>Lead technical discussions with client stakeholders, achieving buy-in for solutions</li><li>Mentor and coach team members, building technical expertise and capability</li><li>Develop production-ready data pipelines and processing jobs using batch and streaming frameworks such as Apache Spark and Apache Beam</li><li>Apply expertise in data storage technologies including relational, columnar, document, NoSQL, data warehouses, and data lakes</li><li>Implement modern data pipeline patterns, event-driven architectures, ETL/ELT processes, and stream processing solutions</li><li>Translate business requirements into technical specifications and actionable solution designs</li><li>Work with metadata management and data governance tools such as Cloud Data Catalog, Collibra, or Dataplex</li><li>Build data quality alerting and data quarantine solutions to ensure downstream reliability</li><li>Implement CI/CD pipelines with version control, automated tests, and automated deployments</li><li>Collaborate in Agile teams, using Scrum or Kanban methodologies</li></ul>
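One of the responsibilities above is building data quality alerting and quarantine solutions. As a minimal, library-free sketch (the field names and checks are hypothetical, not from the role description): records failing validation are routed to a quarantine collection with a reason attached, rather than silently dropped, so downstream tables stay reliable.

```python
# Hypothetical quarantine step: split incoming records into valid rows
# (which flow downstream) and quarantined rows (kept with a failure reason
# for alerting and later reprocessing). Schema is illustrative only.

def quarantine_split(records, required_fields=("id", "amount")):
    """Split records into (valid, quarantined) using simple quality checks."""
    valid, quarantined = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) is None]
        if missing:
            quarantined.append({"record": rec, "reason": f"missing fields: {missing}"})
        elif not isinstance(rec["amount"], (int, float)) or rec["amount"] < 0:
            quarantined.append({"record": rec, "reason": "invalid amount"})
        else:
            valid.append(rec)
    return valid, quarantined

rows = [
    {"id": 1, "amount": 9.99},   # valid
    {"id": 2, "amount": None},   # missing amount -> quarantine
    {"id": 3, "amount": -5},     # negative amount -> quarantine
]
good, bad = quarantine_split(rows)
```

In a production GCP pipeline the same split is typically expressed as a multi-output transform (for example, Beam side outputs writing the quarantine branch to its own BigQuery table), but the routing logic is the same shape as this sketch.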
<b>Technologies:</b>
<ul><li>BigQuery</li><li>CI/CD</li><li>Cloud Composer</li><li>ETL</li><li>GCP</li><li>Kanban</li><li>NoSQL</li><li>Spark</li></ul>
<p><b>More:</b></p>
<p>We are an award-winning innovation and transformation consultancy known for our cutting-edge work in data engineering, cloud solutions, and enterprise transformation. With a culture that fosters the growth of technical specialists, we empower our team to turn complexity into opportunity. We are looking for a Principal GCP Data Engineer to join our data and analytics practice, leading the design and delivery of end-to-end data solutions on Google Cloud Platform. This role offers the chance to shape data strategy and drive technical excellence across complex programmes while enjoying a collaborative, inclusive, and learning-focused culture.</p>
<p>Last updated: week 8 of 2026</p>