27 Aug 2024

Data Engineer (Contractor) at GiveDirectly



Job Description

GiveDirectly is the first – and largest – nonprofit that lets donors like you send money directly to the world’s poorest. We believe people living in poverty deserve the dignity to choose for themselves how best to improve their lives – cash enables that choice.

About this role

The mission of the Central Data team is to empower GiveDirectly with a rigorous, data-driven culture to maximize our efficiency, effectiveness, and scale of delivering dollars to recipients. Building datasets that help us fully understand our donors, improve fundraising interventions, and ultimately raise more money is key to this mission. However, we currently lack many necessary datasets because we do not have pipelines to extract data from key fundraising platforms or combine data across platforms.

We have been awarded a $100,000 grant to fund a contractor role that will work full-time (40 hours / week) within Central Data for around one year to address these needs. You will work closely with the Senior Data Architect who oversees our infrastructure (AWS, Databricks, Tableau) and data pipeline development; the Fundraising Data Manager who will turn these datasets into dashboards and insights; and fundraising stakeholders who will use these insights to raise more money for recipients.

Reports to: Graham Tyler (Director of Data)

Level: Manager

Travel Requirement: There are no travel requirements for this role.

What you’ll do:

Success in this role is determined by meeting these key objectives:

[20%] Ingest all relevant data sources into our data lakehouse (AWS, Databricks). 

  • We have existing pipelines, metrics, and dashboards leveraging data from our donor CRM and email provider. You will build new pipelines and jobs to ingest data from our website analytics platform and donor ticketing system.

[40%] Build unified datasets to fully understand donors and fundraising interventions. 

  • Once all data sources are in Databricks, you will work with stakeholders to define metrics, facts, and dimensions necessary for new dashboards, analysis, ML models, and experimentation that will drive fundraising strategy. Then you will build pipelines to clean, transform, and combine data from all platforms into these actionable datasets.

[30%] Reduce data quality incidents with automated data quality tests and monitoring. 

  • Implement data quality tests and alerting for key donation, donor, and donor engagement variables.
  • Monitor job and pipeline performance.
  • Proactively identify and implement improvements to our existing pipelines.

[10%] Make it easy to maintain your pipelines and tools. 

  • Create comprehensive, easy-to-understand documentation to ensure effective knowledge transfer of your work.
  • Build within our existing configurable pipeline framework and identify ways to improve this process.
  • Leave our systems and processes better than you found them.

What you’ll bring:

  • Exceptional alignment with GiveDirectly Values and active demonstration of our core competencies: emotional intelligence, problem solving, project management, follow-through, and fostering inclusivity. We welcome and strongly encourage applications from candidates who have personal or professional experience in the low-income and/or historically marginalized communities that we serve.
  • Language Requirement: English
  • Language Preferences: No additional language preferences
  • Critical thinking and analytical approach necessary to develop technical solutions that scale and are resilient to changes over time
  • Entrepreneurial mindset and stakeholder management skills required to identify, design, and execute technical solutions that solve important, ambiguous organizational problems
  • Python, SQL, and Spark expertise, along with the core competencies required to ship high-quality data pipelines and data tools quickly
  • Extensive experience with Databricks preferred; experience with Tableau is a plus
  • Intellectual humility, curiosity, and a commitment to being part of an exceptional team


Method of Application

Submit your CV and Application on Company Website : Click Here

Closing Date : 6 September 2024
