Cloud Data Infrastructure Engineer
KeyValue
All India, Kochi • 1 month ago
Experience: 3 to 7 Yrs
Job Description
Role Overview:
As part of KeyValue, you will join a team dedicated to unlocking the potential of start-ups and scale-ups by developing innovative ideas and creating value for all stakeholders. You will play a crucial role in designing and building scalable data pipelines, extracting and loading data into data warehouses, analyzing data, and creating visualizations to support business decisions. You will also be responsible for troubleshooting and resolving issues in data processing pipelines, staying current with new technologies, and setting up CI/CD processes.
Key Responsibilities:
- Work closely with the product and engineering team to understand domains, features, and metrics.
- Design and build scalable data pipelines to handle data from different sources.
- Extract data using ETL tools and load it to data warehouses such as AWS Redshift or Google BigQuery.
- Implement batch processing for both structured and unstructured data.
- Analyze data and create visualizations using tools like Tableau, Metabase, or Google Data Studio.
- Collaborate with the core data team to design and maintain the data warehouse.
- Anticipate problems and build processes to avoid them.
- Learn new technologies quickly.
- Set up CI/CD processes.
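The extract-transform-load responsibility above can be sketched as a minimal pipeline. This is an illustrative example only, not part of the posting: the raw events, the `transform`/`load` helpers, and the in-memory SQLite database (standing in for a warehouse such as Redshift or BigQuery) are all hypothetical.

```python
import sqlite3

# Hypothetical raw records, standing in for data extracted from a source API.
raw_events = [
    {"user_id": 1, "event": "signup", "ts": "2024-01-05"},
    {"user_id": 2, "event": "signup", "ts": "2024-01-06"},
    {"user_id": 1, "event": "purchase", "ts": "2024-01-07"},
]

def transform(events):
    """Keep only well-formed rows and flatten them into tuples for loading."""
    return [
        (e["user_id"], e["event"], e["ts"])
        for e in events
        if {"user_id", "event", "ts"} <= e.keys()
    ]

def load(rows, conn):
    """Batch-insert transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events (user_id INT, event TEXT, ts TEXT)"
    )
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

# In-memory SQLite as a local stand-in for the warehouse target.
conn = sqlite3.connect(":memory:")
load(transform(raw_events), conn)
count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # 3
```

In a real pipeline the extract step would pull from an API or ETL connector and the load step would target the warehouse's bulk-load path rather than row inserts, but the extract/transform/load split stays the same.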
Qualifications Required:
- Proficiency in database design and writing SQL queries.
- Experience with data warehouse solutions like AWS Redshift, Google BigQuery, or Snowflake.
- Knowledge of platforms such as Segment, Hevo Data, Stitch, Amplitude, or CleverTap.
- Hands-on experience with Apache Spark, Python, R, Hadoop, or Kafka.
- Familiarity with working on connectors (REST, SOAP, etc.).
- Experience with BI platforms like Metabase, Power BI, Tableau, or Google Data Studio.
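As an illustration of the SQL proficiency the role calls for, here is a small analytic query of the kind a BI dashboard might run: a daily-active-users rollup. The table and data are hypothetical, and in-memory SQLite is used only so the snippet is self-contained.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INT, event TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        (1, "signup", "2024-01-05"),
        (2, "signup", "2024-01-05"),
        (1, "purchase", "2024-01-06"),
    ],
)

# Daily active users: distinct users per day, most recent day first.
dau = conn.execute(
    """
    SELECT ts AS day, COUNT(DISTINCT user_id) AS active_users
    FROM events
    GROUP BY ts
    ORDER BY day DESC
    """
).fetchall()
print(dau)  # [('2024-01-06', 1), ('2024-01-05', 2)]
```

The same query shape (aggregate over a grouped time column) carries over directly to Redshift, BigQuery, or Snowflake, which is where it would normally feed a Metabase or Tableau visualization.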
Posted on: March 9, 2026