

Data Engineer (Advanced) 1053

Mediro Application Consulting

  • R Undisclosed
  • Contract, Senior position
  • Pretoria
  • Posted 09 Apr 2026 by Mediro Application Consulting
  • Expires in 29 days
  • Job 2636845 - Ref AH_503363643988
Apply Now

About the position

Job Description



  • Design, build and maintain scalable data pipelines and ETL workflows to ingest and transform data for analytics and reporting.
  • Implement and optimize data storage solutions, including data lakes and data warehouses, on cloud platforms.
  • Develop PySpark and Python applications for large-scale data processing and transformations.
  • Ensure data quality, consistency and integrity through testing, validation and the use of data quality tools.
  • Collaborate with stakeholders to translate business requirements into technical specifications and data models.
  • Propose and review system and solution designs and evaluate technical alternatives.
  • Maintain and operate cloud infrastructure and CI/CD pipelines for data platform components.
  • Create and maintain technical documentation, runbooks and artefacts for developed solutions.
  • Support production troubleshooting, monitoring and incident management for data services.
  • Work closely with BI teams to prepare and optimize data for reporting tools such as Business Objects or Tableau.
  • Coach and support fellow engineers, and help improve team capability through knowledge sharing and training.
  • Participate in Agile ceremonies and contribute to continuous improvement of delivery processes.


Minimum Requirements:

Skills Requirements:

Qualifications/Experience:

  • Minimum 3-5 years’ experience as a data engineer, with demonstrated hands-on experience in Python, PySpark and cloud data services (AWS and/or OCI).
  • Relevant IT/Computer Science/Engineering degree or equivalent proven experience; advanced degrees advantageous.
  • Certifications such as AWS Certified Cloud Practitioner, Oracle Cloud certifications or other relevant cloud/data engineering certifications preferred.



Essential Skills Requirements:



  • Strong experience with Python (Python 3.x) and PySpark for developing data processing jobs.
  • At least 3 years’ experience with AWS services commonly used by data engineers, such as Athena, Glue, Lambda, S3 and ECS.
  • Hands-on experience with NoSQL databases such as DynamoDB and with relational databases (Oracle/PostgreSQL), including strong Oracle SQL skills.
  • Experience with Oracle Cloud Infrastructure (OCI) services and tooling for databases, storage and data processing.
  • Expertise in data formats and schema design, including Parquet, AVRO, JSON, XML and CSV, and technical data modelling (“not drag and drop”).
  • ETL and data pipeline development experience, including building pipelines with AWS Glue or similar platforms.
  • Experience with containerization and orchestration technologies such as Docker (Kubernetes/OpenShift advantageous).
  • Proficiency with scripting for automation (Bash, PowerShell) and familiarity with Linux/Unix environments.
  • Experience with data quality tooling and validation (e.g. Great Expectations) and performing thorough data testing and validation.
  • Familiarity with cloud infrastructure as code and DevOps tools such as Terraform, CloudFormation, CI/CD pipelines, Git and Jenkins.



Advantageous Skills Requirements:



  • Knowledge of Kafka or other streaming technologies and AWS Kinesis for real-time data ingestion.
  • Experience with AWS Redshift, EMR and other analytics/warehouse technologies.
  • Familiarity with the client’s Cloud Data Hub (CDH) or similar organizational cloud data blueprints.
  • Java/JEE experience and an understanding of Java application servers.
  • Experience with monitoring and observability tools such as CloudWatch and Grafana.
  • AWS solution architecture experience and certifications (e.g. AWS Certified Cloud Practitioner).
  • Familiarity with REST APIs and building integrations with external systems.
  • Experience with schema design for BI and data warehousing, and preparing specifications for development.
  • Experience with MongoDB or other NoSQL stores.
  • Familiarity with Agile/Scrum delivery models and working within cross-functional teams.


Desired Skills:

  • Python (Python 3.x) and PySpark
  • AWS services
  • NoSQL databases


Mediro Application Consulting

About the agency

Quality Placements Built on Trust

Whether you are looking for a job or need to acquire top talent, Mediro IT RECRUIT is here to assist. We are technical recruiters who care. Our strength lies in fostering connections between candidates seeking employment and companies looking to employ. Our team consists of high achievers, strong individual contributors, and leaders who change lives through personal connections. With a community of professional recruiters, talent pooling and an internal referral programme, we provide multiple candidate recommendations within seven days, regardless of industry and across South Africa.

The world of work is rapidly changing; people want to learn and grow, yet work-life balance, equity and flexibility continue to play a major role. Companies across industries constantly require modernised and specialised skills. As a result, we strive to be your most valued IT recruitment partner by understanding individual and company needs and delivering the right resource solutions to build a workplace for the future.

CONNECT with us: www.itrecruit.co.za #ResourceSolutions #talentacquisition #ITskills #itrecruitment #ITplacements #ITJobs
