Systems Engineer - Database (Thiruvananthapuram)

11 Apr | Arch Capital | Thiruvananthapuram

About Arch:
Arch Capital Group Ltd. (Arch) is a leading global insurer with operations in more than a dozen countries. We write insurance, reinsurance and mortgage insurance on a worldwide basis, and our customers value us as an innovative partner and dependable risk manager with decades of fresh ideas and solid results. Part of the S&P 500, Arch has the size and capital position to remain a market-leading specialty insurer.

Job Summary:

As a Senior Database Administrator, you will design, implement, and optimize cloud-native, highly scalable data platforms, with a primary focus on Snowflake and Databricks. You will be responsible for ensuring the availability, performance, security, and cost efficiency of enterprise data environments operating across hybrid and multi-cloud architectures.

Leveraging strong expertise in modern cloud data services, including Snowflake and Databricks (must have) as well as PostgreSQL, MongoDB, Cosmos DB, and Azure SQL (nice to have), you will drive high availability, disaster recovery, automation, and operational excellence in alignment with industry best practices and Well-Architected Framework principles. Working closely with global engineering, analytics, and business teams, you will enable enterprise analytics, AI/ML workloads, and mission-critical applications across on-premises and cloud ecosystems.

Brief Description:

- Manages and supports enterprise-scale, cloud-first data platforms with a focus on Snowflake, Databricks, and Azure SQL, ensuring performance, scalability, and business continuity across hybrid and multi-cloud environments.
- Designs and optimizes warehouse and distributed database architectures using Snowflake for analytics and PostgreSQL/MongoDB for transactional and semi-structured workloads.
- Drives modernization, cloud migration, and platform consolidation initiatives to Snowflake, Databricks SQL Warehouses, Azure SQL MI, and other cloud-native services with a focus on automation and FinOps.
- Automates database and platform operations using Python, Terraform, PowerShell, CI/CD, and infrastructure-as-code patterns.
- Implements and enforces security, compliance, and governance controls across data ecosystems, including encryption, auditing, data masking, and Entra ID (Azure AD) integration.
- Leverages performance monitoring and tuning tools such as Snowflake Query Profile, Databricks SQL Analytics, PostgreSQL pg_stat_statements, Azure Monitor, MongoDB Atlas Metrics, and Cosmos DB Insights for proactive optimization.
- Collaborates with Data Engineering, Application, and Infrastructure teams to design and deliver resilient, scalable, and compliant data platforms supporting analytics, AI, and operational workloads.
- Develops and maintains disaster recovery (DR), replication, and backup strategies across heterogeneous environments, including Snowflake replication, Databricks Unity Catalog replication, PostgreSQL streaming replication, and MongoDB backups.
- Supports data integration and transformation pipelines leveraging Databricks, Azure Data Factory, and Snowflake Streams/Tasks for different workloads.
- Continuously evaluates emerging capabilities in Snowflake, Databricks, PostgreSQL, and MongoDB to enhance performance, scalability, and data governance maturity.

Responsibilities:

- Design, deploy, and operate modern cloud data platforms with a primary focus on Snowflake and Databricks, supporting enterprise-scale analytics, AI/ML, and operational workloads; provide secondary support for Azure SQL, PostgreSQL, MongoDB, and Cosmos DB.
- Own Snowflake account administration and Databricks workspace management, including performance optimization, cost governance, workload isolation, clustering strategies, and resource scaling.
- Optimize data platforms for performance, reliability, and cost efficiency, applying best practices such as query tuning, partitioning, indexing, clustering, caching, and storage optimization.
- Build, operate, and optimize data pipelines and integrations using Databricks notebooks, Azure Data Factory, and Snowflake tasks, streams, and external tables for scalable ingestion and transformation.
- Implement and manage high availability and disaster recovery solutions, including Snowflake cross-region failover, Databricks job resiliency, PostgreSQL streaming replication, Cosmos DB multi-region writes, and MongoDB Atlas replica sets.
- Enforce data security, governance, and compliance across platforms using RBAC, encryption, access policies, auditing, masking, and lineage capabilities aligned with enterprise and regulatory standards.
- Perform advanced query optimization and workload tuning across Snowflake, Databricks, relational, and NoSQL platforms using native monitoring, query profiles, and telemetry insights.
- Automate database and data-platform operations, including provisioning, monitoring, scaling, and maintenance, using Python, PowerShell, CI/CD pipelines, and infrastructure as code (IaC).
- Partner with Data Engineers, Data Scientists, Developers, and BI teams to design efficient data models, resolve performance bottlenecks, and improve data quality and reliability.
- Define and manage backup, recovery, retention, and archival strategies across Snowflake, Databricks, PostgreSQL, MongoDB, and Cosmos DB environments.
- Maintain clear and comprehensive documentation covering architectures, configurations, operating procedures, security controls, and disaster recovery plans.
- Lead capacity planning, platform upgrades, patching, and lifecycle management to ensure stability, compliance, and alignment with cloud and data-platform roadmaps.
- Enable analytics and reporting integrations by connecting Snowflake, Databricks, and PostgreSQL to Power BI, Synapse, and downstream analytics platforms.
- Provide production support and root cause analysis for incidents across cloud data platforms, ensuring timely resolution and continuous improvement.
- Evaluate, pilot, and adopt new platform capabilities such as Snowflake Native Apps, Databricks Unity Catalog, PostgreSQL extensions, and MongoDB Atlas features to enhance governance, performance, and automation.

Knowledge and Skills:

Must-Have Knowledge and Skills:

- Robust hands-on experience administering and optimizing Snowflake, including warehouse sizing and scaling, query performance tuning, clustering, caching, storage optimization, and cost governance using Resource Monitors and Account Usage views.
- Hands-on Databricks administration experience, including SQL Warehouses, cluster configuration and policies, workload isolation, and orchestration of ELT/ETL pipelines integrating with Snowflake.
- Proficiency in SQL and automation/scripting using Python, PowerShell, and/or Terraform to automate data platform provisioning, access control, monitoring, and operational tasks.
- Solid experience with Azure data services, particularly Azure Data Factory and Databricks, including secure connectivity, authentication, pipeline orchestration, and monitoring.
- Strong understanding of data platform security and governance, including RBAC, encryption in transit and at rest, data masking, auditing, and Microsoft Entra ID (Azure AD) integration.
- Advanced skills in performance tuning and workload optimization across Snowflake and Databricks, leveraging query profiles, execution plans, and resource utilization metrics.
- Experience implementing backup, recovery, and business continuity strategies, including Snowflake Time Travel and replication, and database recovery best practices.
- Proven ability to support production environments, perform root cause analysis, and resolve complex data platform incidents.
- Strong analytical and documentation skills, with experience creating SOPs, runbooks, and architecture documentation.
- Ability to translate business and analytics requirements into scalable, secure, and cost-optimized Snowflake and Databricks platform designs.
