Site Reliability Engineer/Support Analyst at Westpac Banking Corporation
Westpac Banking Corporation · Sydney, Australia · Hybrid
- Professional
- Office in Sydney
Create your best future and join Westpac as a Site Reliability Engineer/Support Analyst.
We are looking for a highly skilled Site Reliability Engineer/Support Analyst with expertise in Data Engineering and PySpark to join our dynamic team. The ideal candidate will be responsible for ensuring the reliability, scalability, and performance of our data infrastructure and applications.
Open to Sydney and Melbourne locations.
Key Responsibilities
- System Health & Reliability: Monitor, maintain, and proactively enhance the performance, availability, and reliability of data processing systems and Spark platforms through automated failure detection, proactive mitigation strategies, and disaster recovery planning.
- Collaboration & Integration: Work closely with data engineers, software developers, and stakeholders to support seamless integration, deployment, and optimisation of data solutions and infrastructure upgrades.
- Automation & Efficiency: Drive system efficiency by identifying automation opportunities, implementing scripts and tools to reduce manual intervention, and enhancing operational workflows.
- Incident & Problem Management: Provide end-to-end production support—including smoke checks, change control, on-call rotations, and incident response—while conducting root cause analysis to resolve recurring issues and maintain robust documentation.
- Governance & Compliance: Ensure adherence to ITIL frameworks, data security protocols, and backup and recovery standards by maintaining detailed runbooks and aligning with IT compliance controls.
What do I need?
Education
Bachelor’s degree in Computer Science, Data Science, or a related field.
Experience
Proven track record as an SRE or in a similar role within Data Engineering, particularly in managing Spark platforms.
Technical Expertise
- Proficiency in PySpark and related big data technologies (e.g., Spark, Hadoop, Hive).
- Strong understanding of data pipelines built using PySpark.
- Debugging expertise for Spark issues at both platform and application levels.
- Experience with data processing orchestration and scheduling tools.
- Good knowledge of the Spark ecosystem and distributed computing principles.
- Strong Linux, networking, CPU, memory, and storage fundamentals.
Soft Skills
- Excellent problem-solving skills with a keen eye for detail.
- Strong communication and collaboration abilities.
- Ability to work efficiently under pressure in a fast-paced environment.
What’s in it for me?
- Special offers on banking products and discounts from top brands, including generous employee-only mortgage rates!
- Flexible work arrangements to help you achieve a greater work/life balance, and a variety of leave options including Culture, Lifestyle and Wellbeing leave.
- Tailored learning and development opportunities to help you grow your career within the bank.
- Lots of opportunities to ‘give back’ to the Community by getting involved in our many volunteering initiatives.