Senior Big Data Engineer
dunnhumby Limited
All India • 1 month ago
Experience: 5 to 9 Yrs
Job Description
As a Senior Big Data Engineer at dunnhumby, you will design end-to-end data solutions, architect scalable data pipelines, develop automation frameworks, and ensure data integrity and availability. The role involves leading architectural reviews, implementing data governance standards, and mentoring technical teams. Required technical expertise includes proficiency in data pipeline tools, experience with cloud platforms, an understanding of API design, and familiarity with modern data stack tools. Strong problem-solving skills, effective communication, collaboration with cross-functional teams, and adaptability to new technologies are essential soft skills for this role.
**Key Responsibilities:**
- Design end-to-end data solutions, including data lakes, data warehouses, ETL/ELT pipelines, APIs, and analytics platforms.
- Architect scalable and low-latency data pipelines using tools like Apache Kafka, Flink, or Spark Streaming.
- Design and orchestrate end-to-end automation using frameworks like Apache Airflow.
- Develop intelligent systems for anomaly detection, alerting, and automated process maintenance.
- Define data architecture strategies supporting advanced analytics, machine learning, and real-time processing.
- Implement data governance, metadata management, and data quality standards.
- Lead architectural reviews and technical design sessions.
- Translate business needs into data architecture requirements.
- Explore tools, platforms, and technologies aligned with organizational standards.
- Ensure security, compliance, and regulatory requirements in data solutions.
- Evaluate and recommend improvements to existing data architecture and processes.
- Provide mentorship and guidance to data engineers and technical teams.
**Qualification Required:**
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
- 5+ years of experience in data architecture, data engineering, or a related field.
- Proficiency in data pipeline tools such as Apache Spark, Kafka, Airflow, or similar.
- Experience with data governance frameworks and cloud platforms.
- Strong understanding of API design, data security, and modern data stack tools.
- Knowledge of high-level programming languages such as Python, Java, and Scala.
- Experience with Hadoop/Spark Toolsets, relational database management systems, and data flow development.
In your role at dunnhumby, you can expect not only to meet but exceed your expectations. You will enjoy a comprehensive rewards package, personal flexibility, and thoughtful perks such as flexible working hours and your birthday off. You will also benefit from an investment in cutting-edge technology that reflects the company's global ambition, combined with a small-business feel that fosters innovation and growth.
Skills Required
Apache Kafka
Machine Learning
Data Governance
Metadata Management
Data Security
Agile
dbt
Snowflake
Python
Java
Scala
Hive
Oozie
HBase
Spark
Hadoop
Git
Flink
Spark Streaming
Apache Airflow
Data Quality Standards
Cloud Platforms
API Design
DevOps
Databricks
MapReduce
Relational Database Management Systems (RDBMS)
Data Flow Development
Posted on: March 13, 2026