ServiceNow

Staff Hadoop Admin & Tableau Admin - Big Data - Federal

Join ServiceNow in Kirkland, WA as a Staff Hadoop & Tableau Admin. Leverage your 6+ years in Big Data, Hadoop, and Kubernetes to support federal cloud infrastructure. Benefits include health plans, 401(k), and flexible time off.

Department: DevOps

Job description

Posted on: October 2, 2025

ServiceNow is seeking a Staff Hadoop & Tableau Administrator to help deliver 24x7 support for its Big Data Federal Cloud Infrastructure. The role involves deploying, monitoring, maintaining, and supporting Big Data applications to ensure the availability and performance of customer instances powered by the ServiceNow Platform. It requires a strong understanding of Hadoop, the Big Data ecosystem, and Kubernetes.
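
As a rough illustration of the day-to-day monitoring this role implies, here is a minimal sketch (not from the posting; the "bigdata" namespace is a hypothetical placeholder) that uses the official Kubernetes Python client to flag pods that are not fully ready:

    # Minimal sketch only: the "bigdata" namespace is a hypothetical example.
    from kubernetes import client, config

    def report_unhealthy_pods(namespace: str = "bigdata") -> None:
        """Print pods that are not Running with every container ready."""
        config.load_kube_config()  # use config.load_incluster_config() inside a cluster
        v1 = client.CoreV1Api()
        for pod in v1.list_namespaced_pod(namespace=namespace).items:
            statuses = pod.status.container_statuses or []
            all_ready = bool(statuses) and all(cs.ready for cs in statuses)
            if pod.status.phase != "Running" or not all_ready:
                print(f"{pod.metadata.name}: phase={pod.status.phase}, ready={all_ready}")

    if __name__ == "__main__":
        report_unhealthy_pods()

In practice a check like this would feed an alerting pipeline rather than print to stdout, but it shows the level of Kubernetes fluency the role assumes.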

Requirements

  • Experience integrating AI into work processes
  • 6+ years of experience working with systems such as HDFS, YARN, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL
  • Deep understanding of the Hadoop / Big Data ecosystem
  • Hands-on experience with Kubernetes in a production environment
  • Strong knowledge of querying and analyzing large-scale data using VictoriaMetrics, Prometheus, Spark, Flink, and Grafana (see the sketch after this list)
  • Experience with CI/CD pipelines for automated application deployment to Kubernetes
  • Strong Linux Systems Administration skills
  • Strong scripting skills in Bash and Python
  • Proficient with Git and version control systems
  • Familiarity with Cloudera Data Platform (CDP) and its ecosystem
  • Experience as a Tableau administrator
  • Familiarity with Tableau Services Manager (TSM)
  • Ability to learn quickly in a fast-paced environment
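
To make the querying requirement above concrete, here is a minimal sketch under the assumption of a reachable Prometheus-compatible endpoint (VictoriaMetrics exposes the same /api/v1/query API); the URL and metric name below are placeholders, not details from the posting:

    # Minimal sketch; PROM_URL and the PromQL expression are hypothetical placeholders.
    import requests

    PROM_URL = "http://prometheus.example.internal:9090"

    def instant_query(expr: str) -> list:
        """Run an instant PromQL query and return the result vector."""
        resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": expr}, timeout=10)
        resp.raise_for_status()
        body = resp.json()
        if body.get("status") != "success":
            raise RuntimeError(f"query failed: {body}")
        return body["data"]["result"]

    if __name__ == "__main__":
        # Example: per-CPU busy rate over 5 minutes (a standard node_exporter metric).
        for series in instant_query('rate(node_cpu_seconds_total{mode!="idle"}[5m])'):
            print(series["metric"], series["value"])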

Benefits

  • Health plans
  • 401(k) Plan
  • ESPP
  • Flexible time away plan
  • Family leave programs
  • Competitive salary

Requirements Summary

6+ years of experience with Hadoop and Big Data systems such as HDFS, YARN, Hive, and Spark; hands-on Kubernetes experience; and expertise in querying and analyzing large-scale data.