Data Engineer - Operations (Python + SQL)
Dataro
About Dataro
Dataro is an ethically minded SaaS startup using machine learning to help not-for-profits raise more money and do more good. Our platform powers fundraising for organisations around the world, helping them run smarter campaigns and improve donor engagement using data-driven insights.
Every day we ingest, transform, and analyse hundreds of millions of donor and campaign records from a wide range of fundraising systems. This data powers our predictive models, analytics tools, and core product features. If you want to solve meaningful data problems that create real social impact — while learning from a modern, supportive engineering team — we’d love to meet you.
The Role
We’re looking for a Data Engineer to join our Operations team and help build, maintain, and support the data pipelines that power Dataro’s products. This is a practical, hands-on role with full training provided. You’ll work across data ingestion, transformation, troubleshooting, and customer integrations, ensuring our platform continues to run smoothly as we scale.
The role is suitable for graduate, junior, or mid-level engineers — what matters most is your curiosity, problem-solving ability, and willingness to learn.
What You’ll Do
- Build and maintain data pipelines that transform data from hundreds of fundraising and CRM systems.
- Develop integrations across dozens of third-party platforms used by non-profit organisations.
- Investigate data issues and support customers by diagnosing pipeline failures or transformation anomalies.
- Collaborate with data scientists and software engineers to understand data requirements and improve data quality.
- Work closely with clients to understand their source systems, data structures, and onboarding needs.
- Contribute to internal tooling and documentation that improves reliability and reduces manual operations work.
- Help us scale our data systems to handle increasing volume, complexity, and global usage.
What You’ll Bring (Day One)
You do not need to tick every box. We’re looking for strong learners.
- Solid SQL skills and the ability to reason about data quality and transformations.
- Experience with Python (or strong interest and comfort learning it quickly).
- Understanding of databases and basic query optimisation principles.
- Some exposure to data pipelines, ETL concepts, or backend scripting — through work, projects, or study.
- Clear communication skills and the ability to collaborate with internal and external stakeholders.
- Degree in Computer Science, Engineering, Mathematics, Information Systems — or equivalent practical experience (bootcamps, home projects, hobby coding, on-the-job learning).
- Must be Sydney-based.
Nice to Have (But Not Required)
We’ll help you learn any of these on the job:
- Experience with AWS (S3, Lambda, Athena, Batch, Step Functions).
- Familiarity with analytical query engines such as DuckDB.
- Git, CI/CD workflows, or basic DevOps practices.
- Exposure to the non-profit / fundraising sector.
- Experience working in a startup or operational engineering role.
Why You’ll Love Working With Us
- Work on socially meaningful technology that directly helps charities raise more money.
- Learn modern data engineering practices from an experienced, supportive team.
- Real ownership over pipelines and integrations used by organisations worldwide.
- Modern data stack (Python, Serverless AWS, S3, Postgres, DuckDB, Athena, etc.).
- A team that values smart, curious engineers and encourages using modern AI tools to learn faster and solve problems more effectively.
- Flexible working arrangements (WFH + office in Sydney).
- Friendly, transparent, mission-driven culture where your work truly matters.
Ready to code for impact?
Send your CV and your answers to these three questions to careers@dataro.io:
- Why do you want to work at Dataro?
- What is something you've built (outside of work) that you are really proud of?
- What is the hardest problem you've ever solved?
You must be based in Sydney and be able to demonstrate competence with Python and SQL.