About the position
We are seeking a Data Engineer to support the delivery of enterprise data solutions aligned with the bank’s Data Architecture Roadmap.
This role focuses on building, maintaining, and optimising data pipelines and infrastructure, enabling advanced analytics, machine learning, and AI use cases across the organisation.
The successful candidate will play a key role in ensuring high-quality, reliable, and accessible data to support Nedbank’s journey toward becoming a data-driven organisation.
Key Responsibilities
1. Data Pipeline Development & Support
- Maintain and support data pipelines across:
  - Data ingestion
  - Data provisioning
  - Data streaming
  - API-based data services
- Monitor pipeline performance and ensure successful execution
- Implement minor enhancements and fixes to pipelines
- Support senior Data Engineers within data delivery initiatives (Epics)
2. Data Engineering Operations
- Perform day-to-day data-related tasks, including:
  - Data profiling
  - Data cleaning and transformation
  - Data validation and quality assurance
  - Data configuration and support
- Assist in building and maintaining basic data pipelines
3. Data Infrastructure Support
- Support and maintain data platforms and infrastructure
- Ensure systems are secure, available, and reliable
- Monitor and support data warehouse environments
4. Data Warehouse Monitoring & Support
- Provide first-line support for data warehouse issues
- Monitor pipeline jobs and ensure SLA adherence
- Troubleshoot failures and ensure data availability
- Run daily operational checks and reporting
5. Cloud Data Platform Support
- Monitor and manage cloud-based data environments (compute & storage)
- Ensure cloud pipelines execute successfully
- Support cloud operations aligned to enterprise standards
6. Data Visualisation & Access
- Create and manage virtual databases
- Assist with generating data extracts for business users
- Support self-service data capabilities
7. Data Analysis & Documentation
- Collaborate with Data Analysts on:
  - Data profiling
  - Data validation
  - Data documentation
- Ensure proper documentation of pipelines and data assets
8. Stakeholder Collaboration
- Work closely with business stakeholders to:
  - Understand data requirements
  - Improve query performance
  - Optimise data usage
- Contribute to continuous improvement of data solutions
Minimum Requirements
Experience
- 3–6+ years’ experience in Data Engineering or a related field
- Experience working in data warehouse or big data environments
- Exposure to banking or financial services (preferred)
Technical Skills
- Strong SQL skills
- Experience with:
  - Data pipelines (ETL/ELT)
  - Data ingestion and transformation
  - Data quality and validation
- Exposure to:
  - Cloud platforms (Azure / AWS preferred)
  - Data warehousing technologies
  - APIs and data integration
Nice to Have
- Experience with:
  - Streaming technologies (e.g., Kafka)
  - Data virtualisation tools
  - Big data ecosystems
- Exposure to machine learning / AI data pipelines
- Understanding of data governance and security
Key Competencies
- Strong analytical and problem-solving skills
- Attention to detail and data quality focus
- Ability to work in Agile delivery environments
- Strong collaboration and stakeholder engagement skills
- Ability to operate in a fast-paced, enterprise environment
Environment
- Agile, squad-based delivery model
- Enterprise-scale data platforms
Why This Role?
- Exposure to enterprise data transformation initiatives
- Opportunity to work on AI, ML, and advanced analytics enablement
- Contribute to building a modern, scalable data ecosystem within a leading bank
Desired Skills:
- Data Engineer
- Data Engineering
- ETL