Description:
As a Senior Data Engineer, you will play a key role in designing, developing, and optimising data solutions on Google Cloud Platform (GCP). You will work closely with clients to implement scalable and reliable data pipelines, ensuring efficient data processing and analytics.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow (Apache Beam), and Cloud Storage.
- Implement ETL/ELT processes for data ingestion, transformation, and loading.
- Optimise data workflows for performance, scalability, and reliability.
- Ensure data governance, security, and compliance best practices.
- Collaborate with data scientists and analysts to enable seamless access to structured and unstructured data.
- Maintain code repositories and CI/CD pipelines to ensure efficient deployment processes.
- Troubleshoot and resolve data infrastructure issues, minimising downtime and performance bottlenecks.
- Stay up to date with industry trends and emerging technologies, continuously improving data engineering practices.
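The ETL/ELT responsibilities above follow a common ingest → transform → load pattern. As a minimal sketch only (the record schema, field names, and dead-letter handling here are illustrative assumptions, not part of the role), a dependency-free Python version of that pattern looks like this; in practice each stage would be a Dataflow/Beam transform writing to BigQuery:

```python
import json
from typing import Iterable


def extract(raw_records: Iterable[str]) -> list[dict]:
    """Parse newline-delimited JSON, skipping malformed rows (typical ingest step)."""
    rows = []
    for line in raw_records:
        try:
            rows.append(json.loads(line))
        except json.JSONDecodeError:
            continue  # in production, route bad rows to a dead-letter sink instead
    return rows


def transform(rows: list[dict]) -> list[dict]:
    """Normalise types and drop rows missing a required key (hypothetical schema)."""
    out = []
    for row in rows:
        if "user_id" not in row:
            continue
        out.append({
            "user_id": str(row["user_id"]),
            "amount_pence": round(float(row.get("amount", 0)) * 100),
        })
    return out


def load(rows: list[dict]) -> int:
    """Stand-in for a warehouse load job; returns the row count written."""
    return len(rows)


raw = ['{"user_id": 1, "amount": 9.99}', 'not json', '{"amount": 5}']
print(load(transform(extract(raw))))  # → 1 (one valid row survives)
```

The same shape scales up by swapping the plain functions for Beam `PTransform`s and the in-memory list for a Pub/Sub or Cloud Storage source.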
Key Skills & Experience:
- 5+ years of experience in data engineering, with a strong background in Google Cloud Platform (GCP).
- Expert knowledge of SQL for data querying and manipulation within BigQuery.
- Experience with Dataform, Pub/Sub, BigQuery, Cloud Storage, and other GCP services.
- Strong proficiency in Python for data processing tasks.
- Deep understanding of, and hands-on experience with, data warehousing, schema design, and data modelling.
- Experience with ETL tools and frameworks for data ingestion and transformation.
- Knowledge of containerisation technologies (Docker, Kubernetes) is a plus.
- Strong problem-solving skills and ability to thrive in a fast-paced environment.
- Excellent communication and collaboration skills.
- Bachelor’s degree in Computer Science, Engineering, or a related field (Master’s preferred).