Synechron

Data Engineer with ITIL

Data Engineer with ITIL at Synechron, NYC. Design scalable pipelines using SQL, Python, PySpark & Snowflake. 10+ yrs required. Competitive pay, benefits, training.

Job description

Posted on: December 9, 2025

We are seeking a skilled Data Engineer to join our dynamic team. The ideal candidate will have expertise in data pipeline development and data warehousing, along with a solid understanding of ITIL processes to support operational efficiency.

Responsibilities

  • Design, develop, and maintain scalable data pipelines using SQL, Python, and PySpark.
  • Build and optimize data warehouses leveraging Snowflake for efficient data storage and retrieval (a brief illustrative sketch follows this list).
  • Collaborate with cross-functional teams to understand data requirements and deliver solutions.
  • Monitor and troubleshoot data workflows, ensuring data quality and performance.
  • Document data processes and procedures following ITIL best practices.
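
For illustration only, below is a minimal sketch of the kind of PySpark-to-Snowflake pipeline described above. The source path, column names, table name, and connection values are placeholders rather than details of the role, and the write step assumes the Snowflake Spark connector.

```python
# Minimal illustrative sketch: paths, columns, and credentials are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Read raw source data (placeholder location).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Basic cleansing and aggregation with PySpark.
daily_totals = (
    orders
    .filter(F.col("order_status") == "COMPLETE")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Write to Snowflake via the Spark connector; option names follow the
# connector's documented settings, values here are placeholders.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "LOAD_WH",
}

(
    daily_totals.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "DAILY_ORDER_TOTALS")
    .mode("overwrite")
    .save()
)
```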

Benefits

  • A highly competitive compensation and benefits package.
  • 10 days of paid annual leave (plus sick leave and national holidays).
  • Maternity & paternity leave plans.
  • A comprehensive insurance plan including medical, dental, vision, life insurance, and long-/short-term disability (plans vary by region).
  • Retirement savings plans.
  • Commuter benefits (varies by region).
  • Extensive training opportunities, focused on skills, substantive knowledge, and personal development.
  • On-demand Udemy for Business for all Synechron employees with free access to more than 5000 curated courses.
  • Coaching opportunities with experienced colleagues from our Financial Innovation Labs (FinLabs) and Centers of Excellence (CoE) groups.
  • Cutting-edge projects at the world’s leading tier-one banks, financial institutions, and insurance firms.
  • A flat and approachable organization.
  • A truly diverse, fun-loving, and global work culture.

Requirements Summary

10 years of experience in data engineering or related roles, proficiency in writing complex SQL queries, and hands-on experience with Python and PySpark frameworks.