
Cloud Data Engineering Specialist - Python/SQL

SRM360 CONSULTING

All India, Hyderabad • 1 month ago

Experience: 3 to 7 Yrs


Job Description

As a Data Architect & Platform Developer, your primary responsibilities will involve designing and implementing robust cloud-based data architectures, including data lakes, data warehouses, and real-time streaming systems. You will develop scalable ETL/ELT pipelines using cloud-native tools such as AWS Glue, Azure Data Factory, GCP Dataflow, Databricks, or Apache Airflow, and integrate structured, semi-structured, and unstructured data from various sources into centralized platforms.

Your key responsibilities will include:
- Designing and implementing cloud-based data architectures
- Developing scalable ETL/ELT pipelines using cloud-native tools
- Integrating data from various sources into centralized platforms
- Building, automating, and optimizing high-performance data pipelines
- Ensuring data availability, reliability, and integrity
- Implementing data quality checks, validations, and monitoring frameworks
- Working with cloud-native services and managing infrastructure as code
- Supporting big data analytics and enabling analytics platforms
- Implementing data governance practices and ensuring data security
- Collaborating with stakeholders and communicating technical concepts clearly

In terms of qualifications, you are required to have:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field
- Expertise in one major cloud platform: AWS, Azure, or GCP
- Strong programming skills in Python, SQL, Scala, or Java
- Proficiency with ETL/ELT tools, data orchestration tools, and workflow scheduling frameworks
- Experience with relational and NoSQL databases
- A solid understanding of data modeling, warehousing concepts, and distributed systems

This job will provide you with the opportunity to work on cutting-edge technologies and collaborate with various teams to drive impactful data solutions.
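To illustrate the data quality checks and validations mentioned among the responsibilities, here is a minimal, library-free Python sketch. The function, field names, and rules are hypothetical examples, not part of the role's actual stack; a production pipeline would typically use a framework such as Great Expectations or dbt tests instead.

```python
# Hypothetical data-quality check: validate rows before loading them
# into a warehouse, returning a list of violations for monitoring.

def validate_rows(rows, required_fields, non_negative_fields):
    """Return a list of (row_index, issue) tuples for rows that fail checks."""
    issues = []
    for i, row in enumerate(rows):
        # Required fields must be present and non-empty.
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing required field '{field}'"))
        # Numeric fields must not be negative.
        for field in non_negative_fields:
            value = row.get(field)
            if isinstance(value, (int, float)) and value < 0:
                issues.append((i, f"negative value in '{field}': {value}"))
    return issues

rows = [
    {"order_id": "A1", "amount": 120.0},
    {"order_id": "", "amount": -5.0},
]
print(validate_rows(rows, ["order_id"], ["amount"]))
```

In a real pipeline such a check would run as a task in the orchestrator (for example an Airflow task), with failures routed to a monitoring or alerting system rather than printed.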

Posted on: March 6, 2026

Relevant Jobs

Information Security Engineering

NTT DATA Global Delivery Services Limited

All India, Chennai

View Job →

Vice President, Infrastructure Operations

Oaktree Capital Management, L.P.

All India, Hyderabad

View Job →
