Senior Data Engineer (EU-based, AWS & Big Data)

Description

START: ASAP

JOB TYPE: LONG-TERM, FULL-TIME, REMOTE

SENIOR DATA ENGINEER (EU-BASED, AWS & BIG DATA)

We are seeking a Senior Data Engineer based in the EU to join our fully remote team. If you are passionate about building scalable data pipelines, optimizing big data processing, and working with modern cloud-based data technologies, this role is for you!

RESPONSIBILITIES:

• Design, develop, and maintain scalable data pipelines and ETL processes for large datasets (a minimal batch ETL sketch follows this list).

• Work with AWS cloud services (Glue, Lambda, Redshift, S3, EMR, Athena) to manage and optimize data infrastructure.

• Implement streaming data processing using tools such as Apache Kafka, Kinesis, or Spark Streaming.

• Develop and maintain data lakes and data warehouses using Snowflake, Redshift, or BigQuery.

• Optimize query performance and storage efficiency for structured and unstructured data.

• Automate data ingestion, transformation, and processing using Python, SQL, or Scala.

• Implement data quality checks, validation processes, and governance frameworks.

• Collaborate with data scientists, analysts, and engineers to deliver clean, reliable, and well-structured datasets.

• Work with CI/CD pipelines for data workflows, ensuring automated deployments and monitoring.

• Ensure security, compliance, and GDPR alignment in all data processes.
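
To give a concrete flavour of the pipeline work above, here is a minimal sketch of a batch ETL job in PySpark, the kind of script an AWS Glue or EMR job might run. The bucket paths and column names (example-raw, order_id, amount) are hypothetical illustrations, not details of this role's actual stack:

# Minimal PySpark batch ETL sketch: raw CSV in S3 -> partitioned Parquet.
# All bucket names, paths, and columns below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Ingest raw CSVs landed in S3 by an upstream export or delivery stream.
raw = spark.read.option("header", "true").csv("s3://example-raw/orders/")

# Clean and type the data: deduplicate, parse timestamps, validate amounts.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write partitioned Parquet back to the curated zone of the lake,
# where Athena or Redshift Spectrum can query it directly.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated/orders/"))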

REQUIREMENTS:

• 6+ years of experience as a Data Engineer or in a similar role.

• Strong experience with AWS data services (Glue, Redshift, S3, EMR, Athena, Kinesis).

• Proficiency in Python and SQL for data processing and automation.

• Hands-on experience with ETL pipeline development, data modeling, and warehouse optimization.

• Experience with big data frameworks such as Apache Spark, Hadoop, or Flink.

• Familiarity with streaming data architectures such as Kafka, Kinesis, or Pulsar (a minimal streaming sketch follows this list).

• Strong understanding of data governance, GDPR, and compliance best practices.

• Experience with Docker and Kubernetes for deploying data processing workloads.

• Hands-on knowledge of CI/CD tools for automating data pipeline deployments.

• Fluency in English, with excellent communication skills and the ability to work in a remote team.
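
As a sketch of the streaming side of the role, here is a minimal Spark Structured Streaming job that consumes a Kafka topic and appends Parquet files to S3. The broker address, topic name, paths, and event schema are hypothetical:

# Minimal Spark Structured Streaming sketch: Kafka topic -> Parquet in S3.
# Broker, topic, schema, and paths below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Hypothetical schema for JSON messages on the topic.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read the Kafka topic and parse each message's JSON payload.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Append micro-batches as Parquet; the checkpoint makes restarts safe.
query = (
    events.writeStream.format("parquet")
          .option("path", "s3://example-curated/events/")
          .option("checkpointLocation", "s3://example-checkpoints/events/")
          .start()
)
query.awaitTermination()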

NICE TO HAVE:

• Experience with Snowflake, Databricks, or Google BigQuery.

• Knowledge of ML pipelines and AI-driven data processing.

• Familiarity with Terraform or Pulumi for infrastructure automation (a short Pulumi sketch follows this list).

• Experience in FinTech, AI, or cybersecurity data environments.
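
For the infrastructure-automation item above, a Pulumi program in Python can be as small as the sketch below; the resource name and tags are hypothetical:

# Minimal Pulumi (Python) sketch: declare an S3 bucket for a data-lake raw zone.
# Resource name and tags are hypothetical.
import pulumi
import pulumi_aws as aws

raw_bucket = aws.s3.Bucket(
    "raw-data",
    tags={"project": "data-lake", "zone": "raw"},
)

# Export the generated bucket name so other stacks or jobs can reference it.
pulumi.export("raw_bucket_name", raw_bucket.id)

Running pulumi up against such a program provisions the bucket; the same pattern extends to Glue jobs, Kinesis streams, and Redshift clusters.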

WHY JOIN US?

• 100% remote: work from anywhere in the EU.

• Long-term, stable projects with modern tech stacks.

• Competitive salary and benefits package.

• Collaborative, innovative environment with a strong data engineering culture.

 

Applications close on 04-03-2035.
