Nagarro

Staff Engineer - DataOps

Staff Engineer - DataOps at Nagarro, Guadalajara. Manages data pipelines, ETL processes, and analytics platforms. Requires expertise in SQL, Python, Datadog, Terraform, ServiceNow, and CI/CD, plus 6+ years of DataOps/DevOps experience.

ServiceNow Modules:

  • DevOps
  • Governance, Risk, and Compliance
  • IT Service Management
  • Incident Management
  • Problem Management

Job description

Posted on: November 20, 2025

We are seeking a DataOps Engineer to join our Tech Delivery and Infrastructure Operations teams, playing a key role in ensuring the reliability, automation, and performance of our analytics and data platforms. This role is primarily DataOps-focused, combining elements of DevOps and SRE to sustain and optimize data-driven environments across global business units.

Requirements

  • Manage and support data pipelines, ETL processes, and analytics platforms, ensuring reliability, accuracy, and accessibility
  • Execute data validation, quality checks, and performance tuning using SQL and Python/Shell scripting
  • Implement monitoring and observability using Datadog, Grafana, and Prometheus to track system health and performance
  • Collaborate with DevOps and Infra teams to integrate data deployments within CI/CD pipelines (Jenkins, Azure DevOps, Git)
  • Apply infrastructure-as-code principles (Terraform, Ansible) for provisioning and automation of data environments
  • Support incident and request management via ServiceNow, ensuring SLA adherence and root cause analysis
  • Work closely with security and compliance teams to maintain data governance and protection standards
  • Participate in Agile ceremonies within Scrum/Kanban models to align with cross-functional delivery squads

Requirements Summary

  • 6+ years in DataOps, Data Engineering Operations, or Analytics Platform Support, with strong exposure to DevOps/SRE practices
  • Proficiency in SQL and Python/Shell scripting
  • Experience with cloud platforms (AWS mandatory; exposure to Azure/GCP a plus)