Azure Data Engineer – Data Factory, Data Modelling, Azure Data Services, DevOps

Azure DevOps Data Engineer - Cybersecurity Data - Azure Data Factory, Data Modelling, Azure Data Services, Logic Apps, Data Lakes - WFH Hybrid
Listed 1 December 2025

By Evolut

  • Great rates: $1,000 to $1,200 per day;
  • Initial term until end of June 2026, with strong prospect of renewal;
  • Well-funded contract with a view to long-term extension;
  • Key position within Cybersecurity Delivery;
  • WFH hybrid; candidate must be based in NSW;
  • End-user organisation: NSW Government.

  
Are you a highly skilled Azure Data Engineer with deep expertise across modern Microsoft data platforms? This role offers the opportunity to design and deliver enterprise-grade data solutions that directly support strategic decision-making across a complex organisation.
  
About the Role

We’re looking for an experienced data professional with a strong background in Data Engineering, Azure cloud services, and end-to-end ETL/ELT pipeline development. You’ll work in a mature cloud environment using best-practice Azure DevOps processes and collaborate with architects, analysts, and senior stakeholders to build scalable, secure data solutions.

Essential Experience

  • 8+ years in Data Analytics / Business Intelligence
  • Minimum 3 years working in Azure cloud data environments
  • Strong hands-on capability with:
    • Azure Data Factory
    • Azure Data Lake
    • Azure Logic Apps
    • Azure DevOps (backend and pipelines)
    • Data modelling & Azure data services
    • SQL / T-SQL
  • Proven experience designing and building enterprise-scale ETL/ELT pipelines
  • Excellent communication skills across technical and business audiences
  • Relevant tertiary qualifications (Computer Science, IT or related)

Nice to Have

  • Experience with erwin (ERwin) data modelling tools
  • Exposure to cybersecurity datasets or platforms, e.g., Qualys, Prisma, ServiceNow
  • Microsoft Azure certifications

What You’ll Be Doing

  • Designing and implementing robust, metadata-driven ETL/ELT pipelines using Azure Data Factory, Azure Data Lake, Logic Apps and SQL
  • Building out semantic layers and dimensional models to support enterprise reporting and analytics
  • Developing high-impact Power BI dashboards and reports built on optimised BI models
  • Working hands-on with Azure DevOps for version control, CI/CD and deployment
  • Integrating and analysing complex, disparate datasets to drive insights for senior stakeholders
  • Applying strong T-SQL skills, including stored procedures, views, indexing and performance tuning
  • Collaborating with technical and non-technical stakeholders and translating complex ideas into clear outcomes

  
Why You’ll Love This Role

  • Work with modern Azure cloud technologies
  • Large data estate with complex datasets to engineer and optimise
  • Hybrid working environment
  • High-impact, enterprise-wide projects

  
The best way to apply is via the application button on this advert. We can be contacted on (02) 9687 1025 for a confidential discussion, but please ensure your resume has been submitted first.
  

Please ensure all documents are sent in Microsoft Word format.