ServiceNow

Staff Hadoop Admin & Tableau Admin - Big Data - Federal

Join ServiceNow in San Diego as a Staff Hadoop & Tableau Admin. Leverage Big Data skills to enhance federal cloud infrastructure. 6+ years in Hadoop, Kubernetes, and Linux required. Enjoy competitive benefits!

ServiceNow Role Type:
Sales
ServiceNow Modules:
DevOps
Service Portal
ServiceNow Certifications (nice to have):

Job description

Posted on: October 2, 2025

ServiceNow is seeking a Staff Hadoop Admin & Tableau Admin to support its US Federal Government Cloud Infrastructure, ensuring availability, performance, and a seamless user experience for customer instances powered by the ServiceNow Platform. The role involves deploying, monitoring, and maintaining infrastructure, and leveraging Big Data systems and AI methodologies to improve efficiency and deliver actionable insights across key functions within the company. This position requires a strong understanding of the Hadoop and Big Data ecosystems.

Requirements

  • 6+ years of experience with systems like HDFS, Yarn, Hive, HBase, Kafka, RabbitMQ, Impala, Kudu, Redis, MariaDB, and PostgreSQL.
  • Deep understanding of Hadoop / Big Data Ecosystem.
  • Hands-on experience with Kubernetes.
  • Deep understanding of Kubernetes architecture and concepts.
  • Strong knowledge in querying and analyzing large-scale data using VictoriaMetrics, Prometheus, Spark, Flink, and Grafana.
  • Experience supporting CI/CD pipelines.
  • Strong Linux Systems Administration skills.
  • Strong scripting skills in Bash, Python.

Benefits

  • Health plans
  • 401(k) Plan
  • ESPP
  • Matching donations
  • Flexible time away plan
  • Family leave programs
  • Competitive On Target Earnings (OTE) incentive compensation structure (typically offered for sales positions)
  • Base pay is a guideline; individual total compensation varies based on qualifications, skills, and work location

Requirements Summary

6+ years of experience with the Hadoop ecosystem. Strong knowledge of Kubernetes. Proficiency in Linux administration.