Technical Lead - Data Engineering
Sigmoid Analytics
All India, Delhi • 1 month ago
Experience: 9 to 13 Yrs
Job Description
You will be joining Sigmoid as a Technical Lead - Data Engineering, where your primary responsibility will be to build a scalable and extensible big data platform for collecting, storing, modeling, and analyzing massive data sets from various channels. You will report to the Engineering Manager.
**Responsibilities:**
- Align Sigmoid with key client initiatives
- Interface daily with customers from leading Fortune 500 companies to understand strategic requirements
- Engage with VP and Director level clients regularly
- Travel to client locations
- Understand business requirements and translate them into technology solutions
- Design, develop, and evolve highly scalable, fault-tolerant distributed components using big data technologies
- Apply experience in application development, support, integration development, and data management
- Provide day-to-day technical leadership for the team
- Guide developers in design and coding tasks
- Play a key role in hiring technical talent for Sigmoid's future
- Stay updated on the latest technology trends for maximizing ROI
- Write hands-on code with a good understanding of enterprise-level codebases
- Design and implement APIs, abstractions, and integration patterns to solve distributed computing problems
- Define technical requirements; handle data extraction and transformation; automate and productionize jobs; and explore new big data technologies in a parallel-processing environment
**Qualifications:**
- 9+ years of relevant work experience in computer science or a related technical discipline
- Experience in the architecture and delivery of enterprise-scale applications, developing frameworks, design patterns, etc.
- Proven track record of building and shipping large-scale engineering products; knowledge of cloud infrastructure such as Azure/GCP preferred
- Experience with large, complex data sets from various sources
- Experience with Hadoop, Spark, or similar stack is a must
- Experience with functional and object-oriented programming in Python or Scala is a must
- Effective communication skills (both written and verbal)
- Ability to collaborate with a diverse set of engineers, data scientists, and product managers
- Technical knowledge of the Spark, Hadoop, and GCS stack
- Comfortable working in a fast-paced start-up environment
**Preferred Qualifications:**
- Experience in agile methodology
- Development and support experience in the Big Data domain
- Experience architecting, developing, implementing, and maintaining big data solutions
- Experience with database modeling and development, data mining, and warehousing
- Experience with the Hadoop ecosystem (HDFS, MapReduce, Oozie, Hive, Impala, Spark, Kerberos, Kafka, etc.)
Skills Required
Big Data
Data Engineering
Predictive analytics
Application development
Integration development
Data management
APIs
Parallel Processing
Hadoop
Spark
Python
Scala
Azure
GCP
Agile methodology
Database modeling
Data mining
Warehousing
AI consulting
Cloud data modernization
Generative AI
DataOps
Abstractions
Integration patterns
GCS Stack
Posted on: March 5, 2026