30 Mar | LTIMindtree | Nellore
Apply on Kit Job: kitjob.in/job/44t8xy
Skill: Python, PySpark, SQL (Senior Data & AWS Cost Optimization Engineer)
Exp: 8 to 12 years
Loc: Noida, Pune, Mumbai, Chennai, Bangalore, Hyderabad, Kolkata
NP: Immediate to 30 Days
Job Title: Senior Data & AWS Cost Optimization Engineer
Engagement Context
Following the Oracle EBS → Fusion migration, Broadridge has observed a significant increase (≈4x) in AWS costs. Initial analysis has identified 8–9 key hotspots driving compute and processing expenses. The immediate focus is to optimize data queries, stored procedures, batch jobs, and Glue pipelines, followed by a comprehensive review of end‑to‑end processes and AWS infrastructure setup for sustained cost reduction and performance improvements.
Role Objective
The resource will play a hands‑on role in identifying, analyzing, and resolving AWS cost hotspots by optimizing data processing workloads, SQL logic, and AWS services, while also providing architectural guidance on infrastructure efficiency.
Key Responsibilities
- Analyze AWS cost drivers and investigate identified hotspots post EBS → Fusion migration
- Optimize SQL queries, stored procedures, and batch execution patterns
- Review and tune Python and PySpark-based data pipelines
- Analyze, configure, and optimize AWS Glue jobs for performance and cost efficiency
- Use AWS CloudWatch and monitoring metrics to identify bottlenecks, retries, over‑provisioning, and inefficiencies
- Review AWS infrastructure configuration (compute sizing, execution model, orchestration, job concurrency) and recommend cost‑effective improvements
- Collaborate closely with Broadridge data, platform, and application teams
- Assist with validation of improvements and ensure measurable cost reduction
- Document findings, recommendations, and implemented optimizations clearly
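To illustrate the kind of hotspot analysis the responsibilities above describe, here is a minimal Python sketch that ranks Glue job runs by estimated cost. All job names and run statistics are hypothetical, and the $0.44 per DPU‑hour rate is an assumed Glue ETL list price that varies by region; verify against current AWS pricing before using figures like these in a real review.

```python
# Minimal sketch of Glue cost hotspot ranking.
# Assumptions: illustrative job stats; $0.44/DPU-hour is an assumed,
# region-dependent Glue ETL rate -- check current AWS pricing.

GLUE_DPU_HOUR_USD = 0.44  # assumed list rate; varies by region

def glue_run_cost(dpus: float, runtime_seconds: float) -> float:
    """Estimated cost of one Glue job run: DPUs x hours x hourly rate."""
    return dpus * (runtime_seconds / 3600.0) * GLUE_DPU_HOUR_USD

def rank_hotspots(runs):
    """Return (job_name, estimated_cost) pairs, most expensive first."""
    return sorted(
        ((name, round(glue_run_cost(dpus, secs), 2)) for name, dpus, secs in runs),
        key=lambda item: item[1],
        reverse=True,
    )

# Hypothetical daily run stats: (job_name, allocated_DPUs, runtime_seconds)
runs = [
    ("fusion_gl_extract", 10, 7200),  # 2 h on 10 DPUs
    ("ap_invoice_batch", 20, 1800),   # 0.5 h on 20 DPUs
    ("daily_recon", 2, 3600),         # 1 h on 2 DPUs
]
print(rank_hotspots(runs))
```

In practice the run statistics would come from CloudWatch metrics or the Glue job-run history rather than hard-coded values; the point is that ranking spend per job quickly surfaces which pipelines deserve tuning attention first.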
Required Skills & Experience
Core Technical Skills
- Strong, hands‑on experience in:
  - Python
  - PySpark
  - SQL (query optimization and performance tuning)
  - AWS Glue
  - AWS CloudWatch
- Experience working with large-scale batch processing and data workloads
- Strong understanding of AWS cost drivers and execution patterns (compute, memory, retries, parallelism, scaling)
Infrastructure & Architecture
- Ability to review and assess AWS infrastructure setup
- Experience providing architectural guidance for cost optimization and tuning
Soft Skills
- Strong analytical and problem‑solving skills
- Ability to work in a fast‑paced, high‑urgency environment
- Clear communicator with strong stakeholder and team interaction skills
- Ability to work independently and take ownership of outcomes
Experience Level
- 8–12+ years of overall experience
- Prior experience in AWS cost optimization, post‑migration stabilization, or performance tuning engagements is strongly preferred
Offshore Expectations
Offshore Resources
- Execute hands‑on optimization across SQL, Glue jobs, and PySpark/Python code
- Support analysis, tuning, validation, and documentation activities
- Work closely with onsite or client stakeholders as required