Apply on Kit Job: kitjob.in/job/42qfgt
Company Description
Codem Inc. is a leader in scalable, AI-driven eCommerce solutions designed to transform businesses and reduce software engineering and operational costs. With a proven track record of building a $3.5B eCommerce business and launching platforms in over 50 countries, Codem delivers customized, efficient, and cost-effective services. By leveraging expertise in cutting-edge technologies like AI, machine learning, and cloud-based systems, Codem empowers clients to tackle complex digital challenges and scale efficiently. With dedicated teams located in the US, Singapore, and India, the company provides turnkey eCommerce solutions, capability assessments, and innovative AI applications to foster growth and decision-making. Codem operates globally from its offices in San Francisco, Singapore, and Chennai.
Role Overview
We are looking for a Data Engineer focused on metrics, logs, and analytics to build dashboards, perform data analysis, and generate actionable insights from observability datasets.
This role is not infrastructure or DevOps-focused. Instead, it requires solid capability in:
- querying and analyzing time-series, log, and trace data
- building Grafana dashboards and reports
- developing custom reporting pipelines using Python or Node.js
Key Responsibilities
Dashboard & Data Visualization
- Build and maintain Grafana dashboards using:
  - Prometheus / VictoriaMetrics (time-series data)
  - VictoriaLogs (LogsQL-based log analysis)
  - Jaeger (trace data)
  - ClickHouse / PostgreSQL (structured datasets)
- Design dashboards that provide clear business and operational insights, not just raw metrics
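Dashboards like these can also be defined as code rather than clicked together by hand. A minimal sketch of generating a Grafana dashboard JSON body with a single Prometheus time-series panel; the panel title, metric name, and datasource UID are illustrative placeholders, not values from this posting:

```python
import json

def make_dashboard(title, promql_expr, datasource_uid="prometheus"):
    """Build a minimal Grafana dashboard body with one time-series panel.

    The datasource UID and PromQL expression are placeholders; real values
    depend on the Grafana instance being provisioned.
    """
    return {
        "dashboard": {
            "title": title,
            "panels": [
                {
                    "type": "timeseries",
                    "title": "Request rate (5m)",
                    "datasource": {"type": "prometheus", "uid": datasource_uid},
                    "targets": [{"expr": promql_expr, "refId": "A"}],
                    "gridPos": {"h": 8, "w": 12, "x": 0, "y": 0},
                }
            ],
        },
        "overwrite": True,
    }

body = make_dashboard(
    "Checkout service overview",
    'sum(rate(http_requests_total{job="checkout"}[5m]))',
)
print(json.dumps(body, indent=2))
```

Keeping dashboards in version control this way makes them reviewable and reproducible across environments.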
Querying & Data Analysis
- Write and optimize queries using:
  - PromQL (metrics)
  - LogsQL (logs)
  - SQL (ClickHouse / PostgreSQL)
- Analyze data to:
  - identify trends and anomalies
  - correlate logs, metrics, and traces
  - derive meaningful KPIs
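For illustration, a KPI such as error rate might be expressed against these backends roughly as follows; the metric names, labels, and table schema are hypothetical examples, not part of this role's actual stack:

```python
# Hypothetical PromQL: share of 5xx responses over all requests (5m windows).
PROMQL_ERROR_RATE = (
    'sum(rate(http_requests_total{status=~"5.."}[5m]))'
    " / sum(rate(http_requests_total[5m]))"
)

# Hypothetical ClickHouse SQL: hourly p95 latency from a traces table.
CLICKHOUSE_P95_LATENCY = """
SELECT
    toStartOfHour(timestamp) AS hour,
    quantile(0.95)(duration_ms) AS p95_latency_ms
FROM traces
GROUP BY hour
ORDER BY hour
"""

def error_rate(total_requests, error_requests):
    """Derive the same error-rate KPI from pre-aggregated counts."""
    if total_requests == 0:
        return 0.0
    return error_requests / total_requests

print(error_rate(2000, 25))  # → 0.0125
```

The same KPI definition should give consistent numbers whichever backend serves the raw data.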
Reporting & Insights
- Develop daily / weekly / monthly KPI reports
- Build custom reporting solutions using:
  - Python
  - TypeScript / Node.js
- Automate report generation and distribution
- Translate technical data into business-relevant insights
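As a sketch of what the automation step can look like, a weekly KPI report rendered as CSV text; the KPI values here are inline stand-ins for query results, and distribution is reduced to returning a string:

```python
import csv
import io
from datetime import date

def render_weekly_report(week_start, kpis):
    """Render a {kpi_name: value} mapping as CSV text for one week.

    In a real pipeline the values would come from PromQL/SQL queries and
    the output would be mailed or pushed to a shared location on a schedule.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["week_start", "kpi", "value"])
    for name, value in sorted(kpis.items()):
        writer.writerow([week_start.isoformat(), name, value])
    return buf.getvalue()

report = render_weekly_report(
    date(2024, 1, 1),
    {"error_rate": 0.012, "p95_latency_ms": 340.0},
)
print(report)
```

Scheduling such a script (cron, CI job, or Grafana reporting) turns a one-off query into a recurring deliverable.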
Alerting & Anomaly Detection
- Configure Grafana alerting based on:
  - dashboard panels
  - thresholds
  - anomaly patterns
- Improve alert quality by reducing noise and false positives
- Implement basic anomaly detection logic using query-based approaches
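One common query-based approach is a rolling z-score against a trailing baseline. A minimal sketch (the traffic values are made up; the same idea can be written directly in PromQL with `avg_over_time` and `stddev_over_time`):

```python
from statistics import mean, stdev

def zscore_anomalies(series, window=5, threshold=3.0):
    """Return indices whose z-score vs. the trailing window exceeds threshold.

    A simple baseline for anomaly detection: a point is flagged when it is
    more than `threshold` standard deviations from the trailing-window mean.
    """
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Steady traffic with one spike at index 8.
values = [100, 102, 99, 101, 100, 98, 101, 100, 400, 101]
print(zscore_anomalies(values))  # → [8]
```

Tuning the window and threshold against historical data is what keeps this kind of alert low-noise.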
Data Integration & Processing
- Work with multiple data sources including:
  - time-series data
  - logs
  - trace data
  - structured databases
- Combine and normalize data across sources for unified analysis
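A minimal sketch of that normalization step, assuming metric samples arrive as (timestamp, value) pairs and error logs as raw timestamps: both are aligned onto common minute buckets so they can be analyzed side by side.

```python
from collections import defaultdict
from datetime import datetime, timezone

def to_minute_bucket(ts):
    """Truncate a Unix timestamp to a minute-resolution ISO-8601 key (UTC)."""
    dt = datetime.fromtimestamp(ts, tz=timezone.utc)
    return dt.replace(second=0, microsecond=0).isoformat()

def merge_sources(metric_samples, log_events):
    """Join time-series samples and log-event timestamps on a time bucket.

    metric_samples: iterable of (unix_ts, value), e.g. from a range query.
    log_events: iterable of unix_ts for error-level log lines.
    Returns {bucket: {"avg_value": float | None, "error_logs": int}}.
    """
    merged = defaultdict(lambda: {"values": [], "error_logs": 0})
    for ts, value in metric_samples:
        merged[to_minute_bucket(ts)]["values"].append(value)
    for ts in log_events:
        merged[to_minute_bucket(ts)]["error_logs"] += 1
    return {
        bucket: {
            "avg_value": sum(r["values"]) / len(r["values"]) if r["values"] else None,
            "error_logs": r["error_logs"],
        }
        for bucket, r in merged.items()
    }

samples = [(1700000000, 120.0), (1700000030, 130.0), (1700000065, 90.0)]
errors = [1700000010, 1700000070, 1700000075]
print(merge_sources(samples, errors))
```

The shared time bucket is what makes cross-source correlation (e.g. latency vs. error-log volume) a simple join instead of a manual comparison.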
Required Skills
Core Skills
- Strong experience in data engineering / data analysis
- Strong SQL skills
- Strong experience with Grafana dashboards
Query Languages
- Hands-on experience with:
  - PromQL
  - LogsQL
  - SQL (ClickHouse / PostgreSQL)
Programming
- Strong experience with Python or TypeScript / Node.js
- Experience building:
  - data pipelines
  - reporting scripts
  - analytics workflows
Data Concepts
- Understanding of:
  - time-series data
  - log data structures
  - trace correlation
  - KPI and metrics design
Nice to Have
- Experience with:
  - VictoriaMetrics / VictoriaLogs
  - Jaeger tracing data
- Experience working with high-volume observability datasets
- Basic understanding of:
  - anomaly detection techniques
  - performance metrics interpretation
Candidate Profile
The ideal candidate is:
- Data-focused, not infrastructure-focused
- Strong in querying and analysis, not just setup
- Able to convert raw data into clear insights and reports
- Comfortable working across metrics, logs, and structured data
- Practical and results-driven in reporting and analytics use cases
Deliverables / Success Criteria
📌 Data Engineer – Metrics, Logs & Analytics (Grafana, PromQL, SQL) (Nellore)
🏢 Codem
📍 Nellore