ServiceNow

Senior Hadoop Admin - Big Data - Federal - 2nd Shift

Join ServiceNow as a Senior Hadoop Admin in Santa Clara, CA. Leverage your 6+ years in Big Data and Kubernetes to enhance federal cloud infrastructure. Benefits include health plans and stock options.

Department: DevOps

Job description

Posted on: October 2, 2025

ServiceNow is seeking a Senior Hadoop Admin on its DevOps team to support the US Federal Government Cloud Infrastructure. This 2nd-shift position covers the creation, operation, and maintenance of critical infrastructure, including monitoring, analytics, and AI-powered insights for large data streams. The team focuses on delivering state-of-the-art services across various IT functions while ensuring system availability and performance. ServiceNow offers a stable, AI-enhanced cloud platform.

Requirements

  • 6+ years of experience working with systems such as HDFS, YARN, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL.
  • Deep understanding of Hadoop / Big Data Ecosystem.
  • 6+ years of experience using Kubernetes in a production environment.
  • Strong knowledge of querying and analyzing large-scale data using VictoriaMetrics, Prometheus, Spark, Flink, and Grafana (a brief illustrative sketch follows this list).
  • Proficiency with Bash and Python for automation and task management.
  • Strong Linux Systems Administration skills.
  • Familiarity with Cloudera Data Platform (CDP) and its ecosystem.
  • Ability to learn quickly and collaborate effectively in a fast-paced team environment.
  • GCS-23 certification.
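
As a brief, non-authoritative illustration of the Python automation and monitoring-query work described above, the sketch below runs an instant PromQL query against a Prometheus-compatible HTTP API (VictoriaMetrics exposes the same query endpoint). The endpoint URL and the metric/label names are hypothetical placeholders, not details from this posting.

    #!/usr/bin/env python3
    """Minimal sketch: poll a Prometheus-compatible API for a cluster health metric."""
    import json
    import urllib.parse
    import urllib.request

    PROMETHEUS_URL = "http://prometheus.example.internal:9090"  # hypothetical endpoint
    QUERY = 'up{job="hdfs-datanode"}'  # hypothetical job label

    def instant_query(base_url: str, promql: str) -> list:
        """Run an instant PromQL query and return the result vector."""
        params = urllib.parse.urlencode({"query": promql})
        with urllib.request.urlopen(f"{base_url}/api/v1/query?{params}", timeout=10) as resp:
            payload = json.load(resp)
        if payload.get("status") != "success":
            raise RuntimeError(f"query failed: {payload}")
        return payload["data"]["result"]

    if __name__ == "__main__":
        for series in instant_query(PROMETHEUS_URL, QUERY):
            instance = series["metric"].get("instance", "unknown")
            timestamp, value = series["value"]  # instant vector sample: [timestamp, value]
            print(f"{instance}: up={value}")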

Benefits

  • Health plans
  • 401(k) Plan
  • Employee Stock Option Plan
  • Employee Paid Time Off

Requirements Summary

6+ years of experience with Hadoop/Big Data systems, 6+ years of Kubernetes experience in production, and strong Linux administration and Python skills.