Architect
Quantiphi Analytics Solution Private Limited
All India • 1 month ago
Experience: 5 to 15 Yrs
Job Description
You will be working as a Data Architect within the healthcare domain, designing and delivering big data pipelines for structured and unstructured data running across multiple geographies, helping healthcare organizations achieve their business goals through data ingestion technologies, cloud services, and DevOps. You will work with Architects from other specialties such as Cloud Engineering, Software Engineering, and ML Engineering to create platforms, solutions, and applications that cater to the latest trends in the healthcare industry, such as digital diagnosis, software as a medical product, and AI marketplaces, amongst others.
- More than 15 years of experience in Technical, Solutioning, and Analytical roles.
- 5+ years of experience in building and managing Data Lakes, Data Warehouse, Data Integration, Data Migration and Business Intelligence/Artificial Intelligence solutions on Cloud (GCP/AWS/Azure).
- Ability to understand business requirements, translate them into functional and non-functional areas, define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience etc.
- Experience in architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
- Experience working in distributed computing and enterprise environments such as Hadoop and GCP/AWS/Azure Cloud.
- Well versed in various cloud-based Data Integration and ETL technologies such as Spark (PySpark/Scala), Dataflow, DataProc, EMR, etc.
- Experience working with traditional ETL tools like Informatica, DataStage, OWB, Talend, etc.
- Deep knowledge of one or more Cloud and On-Premise Databases like Cloud SQL, Cloud Spanner, Big Table, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, graph databases, etc.
- Experience in architecting and designing scalable data warehouse solutions on the cloud using BigQuery or Redshift.
- Experience working with one or more data integration, storage, and data pipeline toolsets like S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub-Sub, Kinesis, Dataflow, DataProc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
- Preferred: experience working with Machine Learning frameworks such as TensorFlow, PyTorch, etc.
- Good understanding of Cloud solutions for IaaS, PaaS, SaaS, Containers and Microservices Architecture and Design.
- Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.
- Good understanding of BI Reporting and Dashboarding and one or more toolsets associated with it like Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Understanding of Security features and Policies in one or more Cloud environments like GCP/AWS/Azure.
- Experience in business transformation projects migrating on-premise data solutions to clouds such as GCP/AWS/Azure.
You will lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehousing, and business intelligence. You will interface with multiple stakeholders within IT and business to understand data requirements and take complete responsibility for the successful delivery of all allocated projects on the parameters of schedule, quality, and customer satisfaction. Additionally, you will be responsible for the design and development of distributed, high-volume, multi-threaded batch, real-time, and event-processing systems. You will implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available to the key stakeholders and business processes that depend on it. You will work with the Pre-Sales team on RFPs and RFIs, helping them create data solutions. Furthermore, you will mentor young talent within the team, define and track their growth parameters, and contribute to building assets and accelerators.
Other Skills:
- Strong Communication and Articulation Skills.
- Good Leadership Skills.
- Should be a good team player.
- Good Analytical and Problem-solving skills.
Skills Required
Data Integration
Business Intelligence
Artificial Intelligence
Cloud Services
DevOps
ETL
Spark
Scala
EMR
Informatica
DataStage
OWB
Talend
DynamoDB
Oracle
MySQL
DB2
SQL Server
MongoDB
CouchDB
Cassandra
Cloud Storage
Athena
Glue
Sqoop
Flume
Hive
Kafka
Airflow
Presto
IaaS
PaaS
SaaS
BI Reporting
Dashboarding
Tableau
Power BI
SAP BO
Cognos
RFI
RFP
PreSales
Business Transformation
Data Architect
Data Lakes
Data Warehouse
Pyspark
Dataflow
DataProc
Cloud SQL
Cloud Spanner
Big Table
RDS
Aurora
Teradata
NoSQL databases
Graph DB
BigQuery
Redshift
S3
PubSub
Kinesis
Composer
Spark SQL
EMRFS
Machine Learning Frameworks
TensorFlow
PyTorch
Containers
Microservices Architecture
Looker
Superset
Security features
Posted on: March 12, 2026