4 Oct 2024

Data Engineer at BURN Manufacturing


Job Description

BURN designs, manufactures, and distributes aspirational fuel-efficient cooking products that save lives and forests in the developing world. BURN has revolutionized the global cookstove sector by proving the business case for selling high-quality, locally manufactured, and unsubsidized cookstoves.

About the Role:

BURN is seeking a skilled and experienced Data Engineer. The role involves designing and deploying an end-to-end data pipeline system that centralizes data from various sources and enables data professionals to query it easily. The system should also allow users to pull up all relevant information for a product or customer through a consumer-friendly user interface.

Duties and Responsibilities:

  • Develop, test, and maintain data pipelines using Python, AWS cloud services, and SQL/NoSQL databases.
  • Implement and manage workflow orchestration tools such as Prefect, Airflow, and Airbyte.
  • Design and optimise data models to support efficient data storage and retrieval.
  • Collaborate with cross-functional teams to understand data requirements and deliver ETL processes.
  • Integrate and manage APIs and databases, including PostgreSQL, MySQL, and Microsoft SQL Server.
  • Ensure data quality and integrity through rigorous testing and validation.
  • Monitor and troubleshoot data pipeline issues to ensure smooth operation.
  • Contribute to the continuous improvement of data engineering practices and processes.
  • Conduct process improvements and automate mundane and repetitive tasks.

Skills and Experience:

  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • At least 5 years of experience in designing and deploying end-to-end data pipelines.
  • Strong knowledge of SQL, ETL tools, and data warehousing concepts.
  • Experience working with APIs, batch exports, and SQL queries to extract data from various sources.
  • Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform.
  • Strong data analysis and problem-solving skills.
  • Experience working with Microsoft Dynamics, open-source data systems like KOBO, and Call Center platforms would be an added advantage.
  • Excellent communication skills and ability to work in a team environment.


Method of Application

Submit your CV and application on the company website.

Closing Date: 31 October 2024




