GCP & Bigdata Engineer
NucleusTeq
All India, Gurugram
Experience: 4 to 8 Yrs
Job Description
As a skilled GCP & Big Data Engineer, you will design, build, and optimize scalable data pipelines using Google Cloud services and Big Data technologies. You will work with GCS, Dataproc, BigQuery, and Composer (Airflow) for data ingestion, processing, orchestration, and analytics; develop and maintain PySpark-based data processing jobs; and ensure data quality, reliability, and performance. The role also involves collaborating with stakeholders to deliver data solutions, implementing data engineering best practices, troubleshooting performance bottlenecks, and maintaining documentation for pipelines and system architecture.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines on Google Cloud Platform.
- Work extensively with GCS, Dataproc, BigQuery, and Composer (Airflow) for data ingestion, processing, orchestration, and analytics.
- Develop and optimize PySpark-based data processing jobs for large datasets (a minimal sketch follows this list).
- Ensure data quality, reliability, and performance through monitoring and optimization.
- Collaborate with data analysts, architects, and business stakeholders to deliver data solutions.
- Implement best practices in data engineering, including data governance, security, and cost optimization.
- Troubleshoot performance bottlenecks and provide scalable solutions.
- Maintain documentation for pipelines, workflows, and system architecture.
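To make the PySpark responsibility above concrete, here is a minimal sketch of such a job: it reads raw Parquet files from GCS, deduplicates and aggregates them, and writes the result to BigQuery via the spark-bigquery connector (bundled on recent Dataproc images). The bucket, dataset, and column names are hypothetical and not part of this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical names: bucket, table, and schema are illustrative only.
SOURCE_PATH = "gs://example-raw-bucket/events/*.parquet"
TARGET_TABLE = "example_project.analytics.daily_events"
STAGING_BUCKET = "example-staging-bucket"

spark = (
    SparkSession.builder
    .appName("daily-events-pipeline")
    .getOrCreate()
)

# Read raw event files from GCS (Dataproc clusters include the GCS connector).
events = spark.read.parquet(SOURCE_PATH)

# Example transformation: deduplicate, then count events per user per day.
daily = (
    events
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "user_id")
    .agg(F.count("*").alias("event_count"))
)

# Write to BigQuery using the spark-bigquery connector; indirect writes
# stage data in a temporary GCS bucket before loading.
(
    daily.write.format("bigquery")
    .option("table", TARGET_TABLE)
    .option("temporaryGcsBucket", STAGING_BUCKET)
    .mode("overwrite")
    .save()
)
```

On Dataproc, a job like this would typically be submitted with `gcloud dataproc jobs submit pyspark`.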
Qualifications Required:
- 4-7 years of experience in Big Data engineering and cloud-based data platforms.
- Strong hands-on experience with Google Cloud Platform (GCS, Dataproc, BigQuery, Composer/Airflow).
- Proficiency in PySpark and distributed data processing frameworks.
- Solid understanding of ETL/ELT processes, data warehousing, and data modeling.
- Experience with workflow orchestration tools and pipeline automation (see the Composer DAG sketch after this list).
- Good knowledge of SQL, scripting languages, and performance tuning.
- Strong analytical, problem-solving, and communication skills.
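As an illustration of the orchestration requirement, below is a minimal Composer (Airflow) DAG that submits the PySpark job sketched earlier to an existing Dataproc cluster. The project, region, cluster, and GCS paths are assumptions; `DataprocSubmitJobOperator` comes from the `apache-airflow-providers-google` package, which Composer installs by default.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

# Hypothetical identifiers; replace with real project/cluster values.
PROJECT_ID = "example-project"
REGION = "us-central1"
CLUSTER_NAME = "example-dataproc-cluster"

PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {
        "main_python_file_uri": "gs://example-code-bucket/jobs/daily_events.py"
    },
}

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark job to the existing Dataproc cluster each day.
    run_daily_events = DataprocSubmitJobOperator(
        task_id="run_daily_events",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )
```

Submitting to a long-lived cluster keeps the example short; the same operator family also supports creating and tearing down ephemeral clusters per run.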
In addition, it is preferred to have experience with CI/CD pipelines and DevOps practices, exposure to other cloud platforms or modern data stack tools, and knowledge of data security, governance, and compliance standards.
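One way the preferred CI/CD experience shows up in practice is automated testing of pipeline logic. Below is a hedged pytest sketch that exercises a hypothetical transformation on a local SparkSession, the kind of check a CI pipeline could run before deploying a job; the function and schema are illustrative, not taken from the posting.

```python
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def aggregate_daily(events_df):
    """Hypothetical transform under test: daily event counts per user."""
    return (
        events_df.dropDuplicates(["event_id"])
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date", "user_id")
        .agg(F.count("*").alias("event_count"))
    )


@pytest.fixture(scope="module")
def spark():
    # A local session is enough for CI; no Dataproc cluster required.
    session = (
        SparkSession.builder.master("local[1]").appName("ci-tests").getOrCreate()
    )
    yield session
    session.stop()


def test_aggregate_daily_deduplicates(spark):
    rows = [
        ("e1", "u1", "2026-03-01 10:00:00"),
        ("e1", "u1", "2026-03-01 10:00:00"),  # duplicate event_id
        ("e2", "u1", "2026-03-01 11:00:00"),
    ]
    events = spark.createDataFrame(rows, ["event_id", "user_id", "event_ts"])

    result = aggregate_daily(events).collect()

    # Duplicates collapse, so one user-day row with two distinct events.
    assert len(result) == 1
    assert result[0]["event_count"] == 2
```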
Posted on: March 1, 2026