Posted 30 days ago
About the Company
Our client is a well-known organisation with a fantastic culture and a real, tangible positive impact on Australians. They have a wide array of compliance, business, and technology transformation initiatives, and operate in a complex environment with a multitude of stakeholders.
About the Role
Reporting to the Data Delivery & Governance Manager, you'll work closely with internal teams to deliver data engineering solutions that enable data-driven insights for customers and commercial outcomes. The role is responsible for the smooth running of machine learning pipelines, including automated testing and monitoring, with a view to continual improvement of deployed models.
You will also provide technical expertise to the Data Ops and Data Commercialisation teams and be an active participant in deployment and implementation activities for data engineering solutions, working closely with data operations to ensure a smooth transition of projects to BAU.
- Must have strong experience with cloud computing and GCP services
- Must be fluent in data manipulation and visualisation using enterprise tools, including Tableau and ETL software
- Strong proven experience in Python, scripting languages, and pipeline automation tools such as Airflow and Dataflow
- Excellent understanding of DevOps best practices
- Experience with database administration, including permission management, as well as writing SQL
- Strong understanding of Agile methodology and intermediate understanding of software development principles
- Bachelor's degree in Engineering, Computer Science or related IT qualification (preferred)
To apply, click the link and upload your current resume in Microsoft Word format only (.doc or .docx). If you would like to have a confidential discussion, please contact Simone Wilson at email@example.com, quoting ref no. JO-2101-102696. Want to know more about Davidson? Visit us at www.davidsonwp.com