Snowflake / Databricks Developer
Optimum Solutions
All India, Chennai • 1 month ago
Experience: 5 to 9 Yrs
Job Description
You will be responsible for working on a Banking Application project as a Data Migration / Snowflake / Databricks Developer in Chennai. Your main responsibilities will include:
- Creating, testing, and implementing enterprise-level applications with Snowflake
- Designing and implementing features for identity and access management
- Developing authorization frameworks for better access control
- Implementing query optimization and security competencies with encryption
- Solving performance and scalability issues in the system
- Implementing transaction management using distributed data-processing algorithms
- Owning the project right from start to finish
- Building, monitoring, and optimizing ETL and ELT processes with data models
- Migrating solutions from on-premises setup to cloud-based platforms
- Applying modern delivery approaches aligned with the data architecture
- Documenting projects and tracking progress against user requirements
- Integrating data with third-party tools across the architecture, design, coding, and testing phases
- Documenting data models, architecture, and maintenance processes
- Reviewing and auditing data models for enhancement
- Maintaining data pipelines built on ETL tools
- Coordinating with BI experts and analysts for customized data models and integration
- Performing code updates, new code development, and reverse engineering
- Providing performance tuning, user acceptance training, and application support
- Ensuring confidentiality of data
- Conducting risk assessment, management, and mitigation plans
- Engaging with teams for status reporting and routine activities
- Performing migration activities from one database to another or on-premises to the cloud
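As an illustration of the on-premises-to-cloud migration work above, a first step is often generating Snowflake DDL and a bulk-load statement for a staged extract. The sketch below builds those statements as strings; the table name, stage name, and columns are hypothetical examples, not part of this role's actual schema.

```python
# Sketch: generate Snowflake DDL and a bulk-load statement for a migrated table.
# All names (CUSTOMERS, migration_stage, the column list) are hypothetical.

def create_table_sql(table: str, columns: dict[str, str]) -> str:
    """Render a CREATE TABLE statement from a column-name -> Snowflake-type map."""
    cols = ",\n    ".join(f"{name} {dtype}" for name, dtype in columns.items())
    return f"CREATE TABLE IF NOT EXISTS {table} (\n    {cols}\n);"

def copy_into_sql(table: str, stage: str, file_format: str = "CSV") -> str:
    """Render a COPY INTO statement that bulk-loads staged files into the table."""
    return (
        f"COPY INTO {table}\n"
        f"FROM @{stage}\n"
        f"FILE_FORMAT = (TYPE = {file_format} SKIP_HEADER = 1);"
    )

columns = {"customer_id": "NUMBER", "full_name": "VARCHAR", "joined_on": "DATE"}
print(create_table_sql("CUSTOMERS", columns))
print(copy_into_sql("CUSTOMERS", "migration_stage"))
```

In practice the rendered SQL would be executed through a Snowflake session (for example via the Snowflake Python connector); generating it separately keeps the migration scripts reviewable and repeatable.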
Qualifications required for this role include:
- At least 5 years of intermediate-level experience
- Bachelor's degree in computer science or equivalent practical experience
- Knowledge of SQL language and cloud-based technologies
- Expertise in data warehousing concepts, data modeling, and metadata management
- Familiarity with data lakes, multi-dimensional models, and data dictionaries
- Experience migrating to Snowflake on AWS or Azure
- Proficiency in performance tuning and setting up resource monitors
- Skills in Snowflake roles, databases, schemas, data modeling, SQL performance measurement, query tuning, and database tuning
- Familiarity with ETL tools and cloud integration, and experience building analytical solutions and models in languages such as Python, Java, and JavaScript
- Experience with Hadoop, Spark, and other big-data and warehousing tools
- Ability to manage XML, JSON, and CSV data sets from disparate sources
- Knowledge of SQL-based databases like Oracle, SQL Server, Teradata, etc.
- Understanding of Snowflake warehousing, architecture, processing, and administration
- Experience in data ingestion into Snowflake
- Exposure to enterprise-level technical applications of Snowflake
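The qualification about handling XML, JSON, and CSV from disparate sources can be sketched with the Python standard library alone: parse each format into a common list-of-dicts shape before loading. The sample records and field names below are made up for illustration.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def records_from_csv(text: str) -> list[dict]:
    """Parse CSV text (header row assumed) into a list of dicts."""
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def records_from_json(text: str) -> list[dict]:
    """Parse a JSON array of objects into a list of dicts."""
    return json.loads(text)

def records_from_xml(text: str, row_tag: str = "row") -> list[dict]:
    """Parse XML shaped like <data><row><field>value</field>...</row></data>."""
    root = ET.fromstring(text)
    return [{child.tag: child.text for child in row} for row in root.iter(row_tag)]

# Three disparate sources normalized into one uniform record list.
csv_src = "id,name\n1,Asha\n"
json_src = '[{"id": "2", "name": "Ravi"}]'
xml_src = "<data><row><id>3</id><name>Mina</name></row></data>"

records = (records_from_csv(csv_src)
           + records_from_json(json_src)
           + records_from_xml(xml_src))
print(records)  # three dicts, each with "id" and "name" keys
```

A uniform record shape like this is what typically feeds the downstream staging and COPY-based ingestion into Snowflake.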
Please note that this role requires immediate joiners or candidates with a notice period of 1 month. You will be working from the office in Chennai (Ramanujam IT Park, Tharamani) from Monday to Friday, 9:00 AM to 6:00 PM.
Skills Required
SQL
Data warehousing
AWS
Azure
Snowflake
ETL
Python
Java
JavaScript
Hadoop
Spark
XML
JSON
Oracle
SQL Server
Identity and access management
Query optimization
Encryption
Transaction management
BI
Risk assessment
Data migration
CSV
Teradata
Data ingestion
Enterprise-level technical exposure
ETL and ELT processes
Cloud-based platforms
Posted on: March 3, 2026