Data Engineer - Johannesburg
Salary Negotiable
Johannesburg, Gauteng
Our client is a leading insurance and financial services provider dedicated to serving the diverse needs of its customers. Its Data Office is responsible for managing the company’s data needs, ensuring that the business makes informed decisions based on accurate and timely data. As a Data Engineer with DevOps expertise in the Data Engineering Department, you will play a key role in the design, development, and deployment of data pipelines and applications in a data-driven environment. You will work closely with other data engineers and the DevOps team to ensure seamless integration with AWS services using CodePipeline. Your expertise in Docker, Kubernetes, Apache Spark, and Kafka deployments will be essential to the team’s success.
Key Responsibilities
Design, develop, and deploy data pipelines that meet the Client’s data needs while ensuring scalability, reliability, and efficiency.
Collaborate with the DevOps team to build and manage application pipelines using AWS CodePipeline and other relevant tools.
Utilize Docker and Kubernetes for containerization and orchestration of data processing applications and services.
Manage Apache Spark and Kafka deployments, optimizing performance and ensuring reliability in data processing tasks.
Monitor and troubleshoot data pipeline performance, providing insights and solutions to improve efficiency and reduce errors.
Implement security best practices and maintain compliance with relevant regulations for data engineering tasks.
Continuously evaluate and implement new technologies and tools to enhance the team’s capabilities and efficiency in data engineering.
Provide technical support to the Data Engineering team, assisting in the resolution of complex issues and challenges related to data processing and DevOps.
Essential Competencies
Critical thinking.
Verbal and written communication.
Adaptability.
Curiosity and creativity.
Location & Type
Johannesburg (Hybrid).
Non-Negotiables
Strong knowledge of AWS services, including CodePipeline, CodeBuild, CodeDeploy, and CodeStar.
Proficiency in Docker and Kubernetes for containerization and orchestration.
Experience with Apache Spark and Kafka deployments, including configuration, monitoring, and troubleshooting.
Familiarity with data pipeline tools and technologies such as Hadoop, Apache NiFi, or Apache Beam.
Excellent problem-solving skills and the ability to work both independently and as part of a team.
Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders.
Minimum Requirements
Bachelor’s degree in Computer Science, Engineering, or a related field.
3+ years of experience in data engineering, with a strong focus on DevOps and pipeline development.
AWS DevOps Engineer or Data Engineering certification (Preferred).
Experience working in the insurance or financial services industry (Preferred).