Senior Data Engineer

Posted by Salt

Permanent
Barcelona, Spain
Job Description

We have partnered with a global logistics business that is building a new Data Team to transform how the company utilises data! This is an exciting time to join: the team is just starting its journey, building everything from scratch from a wide variety of data sources and striving for full automation across the business.

This is a huge undertaking, and they're looking for a skilled Senior Data Engineer who has experience working with high volumes of diverse data, delivering greenfield projects, driving automation, and using GCP.

Tech - Python | SQL | Spark | GCP | BigQuery | Data Warehouses | Lakehouses | Data Lakes

Location - Barcelona HQ - hybrid - 1 day a week in the office

Key Responsibilities as a Senior Data Engineer:

  • Design, develop, and optimize scalable data pipelines to ingest, process, and transform large datasets from various sources.
  • Build and maintain efficient and secure data storage solutions (lakehouses, data lakes, warehouses, etc.) using cloud-based platforms.
  • Collaborate with Data Scientists, Data Analysts, and stakeholders to ensure data is easily accessible and available for analysis and reporting.
  • Implement best practices for data governance, quality, security, and privacy compliance.
  • Optimize data processing performance and scalability by evaluating and improving ETL/ELT workflows.
  • Automate data flow processes and develop tools to enable self-service data access for business users.
  • Mentor junior engineers and contribute to the continuous improvement of data engineering practices and technology stack.

Qualifications for the role of Senior Data Engineer:

  • Degree in a STEM field
  • Solid experience as a Senior Data Engineer or in a related role, with a proven track record of building and maintaining data systems.
  • Strong proficiency in SQL and Python, plus hands-on experience with ETL/ELT frameworks and data pipeline orchestration tools (e.g. Airflow, Luigi, Prefect).
  • Experience with cloud platforms such as GCP (BigQuery, Dataflow).
  • Solid knowledge of data modelling techniques and experience with both relational and non-relational databases (PostgreSQL, MongoDB, etc.).
  • Expertise in optimizing data workflows and performance tuning for large datasets.

Would you be interested in hearing more? Reach me at (see below).