Senior Data Scientist
GlobalLogic
All India, Pune • 3 weeks ago
Experience: 3 to 8 Yrs
Job Description
You will be responsible for designing, developing, and deploying AI/ML models to enhance data mapping, anomaly detection, and reconciliation automation within large-scale projects. Your role will involve combining data science expertise with engineering skills to create intelligent systems that optimize the Telecom Platform, streamline manual processes, and ensure high-quality results.
**Key Responsibilities:**
- Design, develop, and deploy ML models for automated data mapping, anomaly detection, reconciliation, fraud detection, and churn prediction.
- Perform data profiling, feature engineering, and exploratory analysis to enhance accuracy and performance.
- Choose suitable algorithms (supervised, unsupervised, reinforcement learning) based on business requirements.
- Construct end-to-end ML pipelines for data ingestion, preprocessing, training, validation, and deployment.
- Integrate models into Telecom Platform and Automation frameworks for seamless execution in production.
- Monitor model performance, implement retraining strategies, and optimize for scalability and reliability.
- Collaborate with cross-functional teams to align AI solutions with Telecom Platform and enterprise needs.
- Translate business requirements into clear technical specifications, user stories, and acceptance criteria.
- Contribute to platform innovation by adopting the latest AI/ML advancements in anomaly detection and reconciliation automation.
- Document AI models, frameworks, and best practices for reusability.
- Mentor junior engineers/data scientists to create a collaborative and learning-oriented environment.
**Qualifications Required:**
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- 5 to 8 years of experience in developing and deploying machine learning models in production.
- Hands-on experience in classification, anomaly detection, or reconciliation automation is highly preferred.
- Proficiency in Python and ML libraries (scikit-learn, TensorFlow, PyTorch).
- Experience with data pipelines, ETL/ELT, Delta Lake, and data lakehouse architectures.
- Cloud-based ML experience (Azure Data Factory, Azure Databricks, AWS SageMaker, GCP AI/ML).
- Skilled in PySpark for large-scale data processing.
- Familiarity with containerization (Docker, Kubernetes) for scalable deployment.
- Strong grounding in data reconciliation frameworks and automation techniques.
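For a flavor of the anomaly-detection work described above, here is a minimal, illustrative sketch using scikit-learn's IsolationForest. The data is synthetic and the feature layout and contamination rate are hypothetical, not part of this posting or GlobalLogic's actual stack.

```python
# Illustrative only: a tiny anomaly-detection pipeline of the kind this role
# involves. Synthetic "usage records" stand in for real telecom data.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(seed=0)
# Mostly normal traffic, plus a handful of extreme outliers appended at the end.
normal = rng.normal(loc=100.0, scale=10.0, size=(500, 2))
outliers = rng.normal(loc=300.0, scale=5.0, size=(5, 2))
records = np.vstack([normal, outliers])

# Scale features, then isolate anomalies; contamination=0.01 assumes roughly
# 1% of records are anomalous (a hypothetical choice for this sketch).
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("detect", IsolationForest(contamination=0.01, random_state=0)),
])
labels = pipeline.fit_predict(records)  # -1 = anomaly, 1 = normal

n_flagged = int((labels == -1).sum())
print(n_flagged)
```

In production such a model would sit inside the ingestion/preprocessing/training/validation pipeline the responsibilities describe, with the contamination rate and features tuned to the actual reconciliation data.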
By joining GlobalLogic, you will be part of a high-trust organization that values integrity and prioritizes a culture of caring. You can expect continuous learning and development opportunities, interesting and meaningful work, and balance and flexibility in your work-life integration.
Skills Required
Machine Learning
Data Mapping
Anomaly Detection
Data Science
Data Engineering
Classification
Fraud Detection
Data Profiling
Unsupervised Learning
Reinforcement Learning
Python
Docker
Kubernetes
Reconciliation Automation
Cloud-based AI/ML Workflows
Telecom Domain
Churn Prediction
Feature Engineering
Exploratory Analysis
Supervised Learning
Scikit-learn
TensorFlow
PyTorch
Data Pipelines
ETL/ELT
Delta Lake
Data Lakehouse Architectures
Azure Data Factory
Azure Databricks
AWS SageMaker
GCP AI/ML
PySpark
Data Reconciliation Frameworks
Cloud Computing Platforms
Posted on: April 12, 2026