ETL Data Senior Developer (Work From Home) - Circana Careers

ETL Data Senior Developer



Job Description

At Circana, we are fueled by our passion for continuous learning and growth. We seek and share feedback freely, and we celebrate victories both big and small, in an environment that is flexible and accommodating to our work and personal lives.

We are a global company dedicated to fostering inclusivity and belonging.

We value and celebrate the unique experiences, cultures, and viewpoints that each individual brings.

By embracing a wide range of backgrounds, skills, expertise, and beyond, we create a stronger, more innovative environment for our employees, clients, and communities.

With us you can always bring your full self to work.


Job Summary

What is the role & team?

Join our Global Professional Services team as a Data Engineer, where you will be instrumental in bridging the gap between client needs, business objectives, and cutting-edge technology solutions, particularly in support of our new Private Cloud clients.

This role is about more than just providing a service: we are seeking a consultant-minded go-getter who is client-obsessed, a liquid thinker who thrives at pace, and who is bold and brave in challenging the status quo.

We are a team that values being part of a successful business and building partnerships based on trust, deep knowledge, respect and unwavering support.

We are passionate about technology, data, and AI, and about how they can be leveraged to solve our clients' most pressing business challenges.

You will be a key player in understanding our clients’ world, becoming their trusted partner and proactively identifying opportunities for innovation and efficiency.

Curiosity, accountability, and a positive, questioning mindset are at the core of our team's DNA.

We don’t just aim to meet expectations; we strive to exceed them.

You will work collaboratively with global Circana teams, acting as the vital link to ensure seamless, unified support and strategic technological partnership for our clients.

For this role, we are seeking a skilled and motivated Data Engineer to join a growing global team based in the UK.

You will be responsible for designing, building and maintaining robust data pipelines and infrastructure on the Azure cloud platform.

You will leverage your expertise in PySpark, Apache Spark, and Apache Airflow to process and orchestrate large-scale data workloads, ensuring data quality, efficiency, and scalability.

If you have a passion for Data Engineering and a desire to make a significant impact, we encourage you to apply!

Key Responsibilities

  • ETL/ELT Pipeline Development:
    • Design, develop, and optimize efficient and scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow (a minimal PySpark sketch follows this list).
    • Implement batch and real-time data processing solutions using Apache Spark.
    • Ensure data quality, governance, and security throughout the data lifecycle.

  • Cloud Data Engineering:
    • Manage and optimize cloud infrastructure (Azure) for data processing workloads, with a focus on cost-effectiveness.
    • Implement and maintain CI/CD pipelines for data workflows to ensure smooth and reliable deployments.

  • Big Data & Analytics:
    • Develop and optimize large-scale data processing pipelines using Apache Spark and PySpark.
    • Implement data partitioning, caching, and performance-tuning techniques to enhance Spark-based workloads.
    • Work with diverse data formats (structured and unstructured) to support advanced analytics and machine-learning initiatives.

  • Workflow Orchestration (Airflow):
    • Design and maintain DAGs (Directed Acyclic Graphs) in Apache Airflow to automate complex data workflows.
    • Monitor, troubleshoot, and optimize job execution and dependencies within Airflow.

  • Team Leadership & Collaboration:
    • Provide technical guidance and mentorship to a team of data engineers in India.
    • Foster a collaborative environment and promote best practices for coding standards, version control, and documentation.
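
To make the pipeline responsibilities above concrete, here is a minimal PySpark ETL sketch: read a raw file, apply simple cleaning transformations, and write a partitioned Parquet output. The file paths, column names, and transformation logic are illustrative assumptions, not details taken from the role description.

```python
# Minimal PySpark ETL sketch (illustrative paths and columns, not from the job spec).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read a raw CSV input.
raw = spark.read.option("header", True).csv("/data/raw/sales.csv")

# Transform: drop malformed rows, normalise types, and derive a load date.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_date", F.current_date())
)

# Load: write Parquet partitioned by load_date. Repartitioning on the same
# column keeps file counts reasonable and lets downstream reads prune
# partitions, one of the performance-tuning techniques mentioned above.
(
    clean.repartition("load_date")
         .write.mode("overwrite")
         .partitionBy("load_date")
         .parquet("/data/curated/sales")
)

spark.stop()
```

In practice a job like this would typically be parameterised (dates, paths) and triggered from an orchestrator such as Airflow rather than run ad hoc.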

Required Skills & Experience

  • Client-facing role: strong communication and collaboration skills are vital (ideally gained in a consulting organisation).
  • 5 years of proven experience in data engineering, with hands-on expertise in Azure Data Services, PySpark, Apache Spark, and Apache Airflow.

  • Strong programming skills in Python and SQL with the ability to write efficient and maintainable code.

  • Deep understanding of Spark internals including RDDs, DataFrames, DAG execution, partitioning and performance‑optimization techniques.

  • Experience designing and managing Airflow DAGs, including scheduling and dependency management (a minimal DAG sketch follows this list).

  • Knowledge of CI/CD pipelines, containerization technologies (Docker, Kubernetes), and DevOps principles applied to data workflows.

  • Excellent problem‑solving skills and a proven ability to optimize large‑scale data processing tasks.

  • Prior experience leading teams and working in Agile/Scrum development environments.

  • A track record of working effectively with global remote teams.
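
As a companion to the Airflow items above, here is a minimal sketch of an Airflow DAG with two dependent tasks on a daily schedule. The DAG id, task names, commands, and paths are illustrative assumptions, and the `schedule` argument assumes Airflow 2.4 or later.

```python
# Minimal Airflow DAG sketch (illustrative names and commands, not from the job spec).
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_sales_etl",          # hypothetical DAG id
    default_args=default_args,
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:

    run_spark_etl = BashOperator(
        task_id="run_spark_etl",
        bash_command="spark-submit /opt/jobs/sales_etl.py",     # hypothetical job script
    )

    run_quality_checks = BashOperator(
        task_id="run_quality_checks",
        bash_command="python /opt/jobs/check_sales_output.py",  # hypothetical check script
    )

    # Dependency management: quality checks run only after the Spark job succeeds.
    run_spark_etl >> run_quality_checks
```

Monitoring and troubleshooting, as described above, would then happen through the Airflow UI and task logs for each run of this DAG.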

Bonus Points

  • Experience with data modelling and data‑warehousing concepts.

  • Familiarity with data‑visualization tools and techniques.

  • Knowledge of machine‑learning algorithms and frameworks.

What are we looking for?

  • An authentic optimist with a genuine passion for solving problems and driving progress.

  • Intensely curious—you love to understand the why and how behind things, especially when it comes to technology and client challenges.

  • Highly accountable for your actions, deliverables and the success of your clients.

  • Comfortable and adept at building strong professional relationships with both external clients and internal technical and business teams.

  • Deeply committed to giving your best for your team and ensuring our technology solutions exceed client expectations.

  • Confident in making informed decisions, weighing options and considering the technical and business implications.

  • Highly organised with meticulous attention to detail in managing documentation, communication and time.

  • Passionate about technology, data, AI and their application in solving real‑world business problems.

  • Previous experience translating client requirements into technical specifications.

  • Previous experience building relationships with clients.

#LI‑KM1

Required Experience: Senior IC

Key Skills: SQL, Pentaho, PL/SQL, Microsoft SQL Server, SSIS, Informatica, Shell Scripting, T-SQL, Teradata, Data Modeling, Data Warehouse, Oracle

Employment Type: Full‑Time

Department/Functional Area: Software Engineering

Experience: 5 years

Vacancy: 1



Required Skill Profession

Database, Analytics & BI


