ServiceNow

Staff Engineer - Hadoop Big Data DevOps - Federal - 2nd Shift

Join ServiceNow as a Staff Engineer - Hadoop Big Data DevOps in Santa Clara. Leverage your 6+ years of Hadoop expertise and Linux skills to support federal cloud infrastructure. Enjoy benefits like health plans and flexible time off.

Department: DevOps

Job description

Posted on: September 21, 2025

ServiceNow is seeking a Staff DevOps Engineer to support the US Federal Government Cloud Infrastructure on the Federal Big Data team. This role involves deploying, maintaining, and supporting large-scale data infrastructure, leveraging AI/ML, and ensuring operational stability. The position requires passing a ServiceNow background screening.

Requirements

  • Experience in integrating AI into work processes
  • Deep understanding of Hadoop / Big Data Ecosystem
  • 6+ years of experience with systems like HDFS, Yarn, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL
  • Hands-on experience with Kubernetes
  • Strong knowledge of querying and analyzing large-scale data using VictoriaMetrics, Prometheus, Spark, Flink, and Grafana
  • Proficient with Git and version control systems
  • Strong Linux Systems Administration skills
  • Strong scripting skills in Bash and Python for automation and task management
  • Proficient with Cloudera Data Platform (CDP) and its ecosystem
  • Ability to learn quickly in a fast-paced team environment
  • GCS-23 (certification based on testing)

Benefits

  • Health plans
  • Flexible spending accounts
  • 401(k) Plan with company match
  • ESPP
  • Matching donations
  • Flexible time away plan and family leave programs

Requirements Summary

6+ years of experience with the Hadoop/Big Data ecosystem. Strong Linux administration and scripting skills. Working knowledge of Kubernetes and Cloudera Data Platform (CDP).