GCP Engineer

Posted by Gravitai Ltd

Permanent
London, United Kingdom
Job Description

We're a small team with really big ambitions, both for what we want to achieve and also for the culture we're building. We want to create a company that remains people-focused, harnessing the power of empowered and engaged teams. As we scale, we want people to really own what they do and be given the autonomy and freedom to make mistakes, learn, and create something meaningful.

We all work incredibly hard because we really care about what Gravitai represents and stands for - we're looking for exceptional people who are proactive, hungry to learn, and take pride in our collective achievements.

We believe that having diversity in age, background, gender identity, race, sexual orientation, physical or mental ability, ethnicity, and perspective will make us an infinitely better company.

Health, Safety, and Well-being are at the heart of everything we do.

Purpose

The Google Cloud Platform (GCP) Data Engineer will be responsible for designing, developing, and maintaining scalable data solutions in the cloud.

The ideal candidate will have strong experience with GCP services, data pipelines, ETL processes, and big data technologies. You will work closely with data scientists, analysts, and software engineers to optimise data workflows and ensure the integrity and security of data within the GCP ecosystem.

You'll work closely with developers, end-users, and stakeholders to deliver projects smoothly and improve the system over time.

We're looking for someone who is not only technically strong but also enjoys working with people and solving real-world business challenges.

Main Duties and Responsibilities
  • Design and Develop Data Pipelines: Build and implement scalable data pipelines using GCP services, including Cloud Dataflow, Cloud Dataproc, Apache Beam, and Cloud Composer (Apache Airflow).
  • ETL/ELT Workflow Management: Develop, optimise, and maintain ETL/ELT workflows for structured and unstructured data.
  • Big Data Solutions: Manage and optimise big data environments leveraging BigQuery, Cloud Storage, Pub/Sub, and Data Fusion.
  • Data Integrity and Security: Ensure data quality, security, and governance by following industry best practices.
  • Database Expertise: Work with both SQL and NoSQL databases, such as BigQuery, Cloud SQL, Firestore, and Spanner.
  • Automation and Infrastructure as Code: Automate data workflows using Terraform, CI/CD pipelines, and Infrastructure as Code (IaC) methodologies.
  • Performance Monitoring and Troubleshooting: Identify and resolve performance bottlenecks, failures, and latency issues.
  • Cross-Functional Collaboration: Work closely with analytics, AI/ML, and business intelligence teams to integrate data solutions.
  • Real-Time and Batch Processing: Implement efficient data management strategies for both real-time and batch processing.
  • Technical Documentation: Maintain comprehensive documentation of technical specifications, workflows, and best practices.
Experience & Expertise
  • Education: Bachelor's degree in Information Systems or a related field (preferred).

Experience

  • 3+ years of hands-on experience in data engineering with GCP.
  • Strong proficiency in SQL, Python, and/or Java/Scala for data processing.
  • Practical experience with BigQuery, Cloud Dataflow, Cloud Dataproc, and Apache Beam.
  • Experience with event-driven streaming platforms such as Apache Kafka or Pub/Sub.
  • Familiarity with Terraform, Kubernetes (GKE), and Cloud Functions.
  • Strong understanding of data modelling, data lakes, and data warehouse design.
  • Knowledge of Airflow, Data Catalog, and IAM security policies.
  • Exposure to DevOps practices, CI/CD pipelines, and containerisation (Docker, Kubernetes) is a plus.

Skills

  • Strong analytical and problem-solving abilities.
  • Ability to thrive in an agile, fast-paced environment.
Preferred Qualifications
  • Certification: GCP Professional Data Engineer certification.
  • Machine Learning Integration: Experience with ML pipelines using Vertex AI or TensorFlow on GCP.
  • Cloud Architecture: Familiarity with multi-cloud and hybrid cloud environments.
Benefits

  • 28 days of holiday plus Bank Holidays.
  • Regular socials and team events, including Christmas events across all offices and staff (incl. remote).
  • Remote-first position, preferably for UK-based candidates, with the option of a contract-based role for non-UK staff.