Lead Data Engineer (Snowflake/dbt)
Tide
All India, Delhi • 1 month ago
Experience: 7 to 11 Yrs
Job Description
As part of the team at Tide, you will be responsible for building and running the data pipelines and services that support business functions, reports, and dashboards. The company is heavily reliant on tools such as BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for business intelligence, with a focus on making data-driven decisions to help SMEs save time and money.
**Key Responsibilities:**
- Developing end-to-end ETL/ELT pipelines in collaboration with Data Analysts
- Designing and implementing scalable, automated processes for data extraction, processing, and analysis
- Mentoring junior engineers on the team
- Troubleshooting technical issues and providing hands-on diagnosis
- Translating business requirements into technical specifications
- Performing exploratory data analysis to ensure data quality
- Building Looker dashboards for business use cases as required
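As a rough illustration of the data-quality work described above, a pipeline step might gate extracted rows before loading them into the warehouse. This is a minimal sketch, not Tide's actual code; the function, field names, and rules are hypothetical assumptions:

```python
# Minimal sketch of a data-quality gate in an ELT pipeline.
# Field names ("id", "amount") and rules are illustrative assumptions.

def validate_rows(rows, required_fields=("id", "amount")):
    """Drop rows missing required fields; de-duplicate on 'id'."""
    seen = set()
    clean = []
    for row in rows:
        if any(row.get(f) is None for f in required_fields):
            continue  # reject incomplete records
        if row["id"] in seen:
            continue  # reject duplicate keys
        seen.add(row["id"])
        clean.append(row)
    return clean

raw = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 10.0},   # duplicate of the first row
    {"id": 2, "amount": None},   # missing required field
    {"id": 3, "amount": 7.5},
]
print(validate_rows(raw))  # only ids 1 and 3 survive
```

In practice checks like these are typically expressed as dbt tests or Airflow task assertions rather than hand-rolled functions, but the shape of the logic is the same.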
**Qualifications Required:**
- 7+ years of extensive development experience using Snowflake or similar data warehouse technology
- Working experience with dbt, Snowflake, Apache Airflow, Fivetran, AWS, git, and Looker
- Experience in agile processes like SCRUM
- Proficiency in writing advanced SQL statements and performance tuning
- Expertise in data ingestion techniques and data modeling
- Experience with data mining, data warehouse solutions, ETL, and databases in a business environment
- Familiarity with architecting analytical databases in Data Mesh architecture
- Strong technical documentation skills and clear communication abilities
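One common example of the "advanced SQL" the qualifications call for is the latest-record-per-key pattern built on window functions, which appears constantly in warehouse modeling. The sketch below runs the pattern against an in-memory SQLite database for portability; the table and column names are hypothetical, and on Snowflake the same query would use `QUALIFY` or an identical subquery:

```python
import sqlite3

# Latest record per key via ROW_NUMBER(): a staple warehouse pattern.
# Table and column names are illustrative assumptions, not Tide's schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (account_id INTEGER, ts TEXT, balance REAL);
INSERT INTO events VALUES
  (1, '2026-01-01', 100.0),
  (1, '2026-02-01', 150.0),
  (2, '2026-01-15', 80.0);
""")
latest = conn.execute("""
SELECT account_id, balance FROM (
  SELECT account_id, balance,
         ROW_NUMBER() OVER (
           PARTITION BY account_id ORDER BY ts DESC
         ) AS rn
  FROM events
)
WHERE rn = 1
ORDER BY account_id;
""").fetchall()
print(latest)  # [(1, 150.0), (2, 80.0)]
```

Performance tuning of such queries on Snowflake would typically involve clustering keys and pruning, which SQLite does not model; the SQL itself, however, is portable.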
At Tide, you'll benefit from:
- Competitive Compensation
- Generous Time Off
- Parental Leave
- Sabbatical options
- Health Insurance
- Life & Accident Cover
- Mental Wellbeing support
- Volunteering & Development Days
- Learning & Development budget
- Work Outside the Office policy
- Home Office Setup assistance
- Laptop Ownership program
- Snacks & Meals at the office
Tide fosters a flexible workplace model that supports both in-person and remote work to cater to the needs of different teams. The company values diversity and inclusivity, celebrating individuals from various backgrounds and experiences. Tide is committed to transparency, ensuring that every voice is heard and respected.
Skills Required
Snowflake
Airflow
dbt
Tableau
AWS
GCP
SQL
Python
Data Mining
ETL
Unit Testing
Continuous Integration
Data Quality
Data Governance
BigQuery
Stitch
Fivetran
Looker
Agile processes
Data Ingestion
Data Modelling
Data Warehouse Solutions
Data Mesh Architecture
Agile Cross-functional Delivery Team
Code Quality
Posted on: March 15, 2026