ServiceNow

Senior Hadoop Admin - Big Data - Federal - 2nd Shift

Join ServiceNow as a Senior Hadoop Admin in San Diego, CA. Leverage your 6+ years of Big Data expertise and Kubernetes skills to support federal cloud infrastructure. Enjoy competitive benefits!

Department: DevOps

Job description

Posted on: October 2, 2025

ServiceNow is seeking a Staff DevOps Engineer to support its US Federal Government cloud infrastructure. This 2nd-shift position involves deploying, monitoring, maintaining, and supporting Big Data infrastructure, applications, and pipelines. ServiceNow is a global market leader in AI-enhanced technology, serving over 8,100 customers, including a significant portion of the Fortune 500.

Requirements

  • Experience leveraging AI, or thinking critically about how to integrate it into work processes.
  • Deep understanding of Hadoop / Big Data Ecosystem.
  • 6+ years of experience with systems such as HDFS, Yarn, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL.
  • Hands-on experience with Kubernetes in a production environment.
  • Strong knowledge of querying and analyzing large-scale data using tools such as VictoriaMetrics, Prometheus, Spark, Flink, and Grafana.
  • Strong Linux Systems Administration skills.
  • Strong scripting skills in Bash and Python.
  • Familiarity with Cloudera Data Platform (CDP) and its ecosystem.
  • Ability to learn quickly in a fast-paced environment.
  • GCS-23 qualification.

Benefits

  • Health plans
  • 401(k) Plan with company match
  • ESPP
  • Matching donations
  • Flexible time away plan
  • Family leave programs

Requirements Summary

6+ years of Hadoop/Big Data experience, Kubernetes proficiency, and knowledge of big data query tools are required, along with the GCS-23 qualification and strong scripting skills.