IT Industry News. Daily.
The purpose of the Data Engineer role is to leverage data expertise and data-related technologies, in line with the company's Data Architecture Roadmap, to advance technical thought leadership for the enterprise, deliver fit-for-purpose data products, and support data initiatives.
In addition, Data Engineers enhance the data infrastructure of the company to enable advanced analytics, machine learning, and artificial intelligence by providing clean, usable data to stakeholders. They also create data pipelines (ingestion, provisioning, streaming, self-service, and APIs) and big data solutions that support the company strategy to become a data-driven organisation. They are responsible for the maintenance, improvement, cleaning, and manipulation of data in the company's operational and analytics databases.

Data Infrastructure: Build and manage scalable, optimised, supported, tested, secure, and reliable data infrastructure, e.g. infrastructure and databases, data lake storage, cloud-based solutions, and data platforms. Ensure data security and privacy in collaboration with Information Security, the CISO, and Data Governance. Create data pipelines for data integration, utilising both on-premise and cloud data engineering tool sets. Efficiently extract data from Golden Sources, Trusted Sources, and Writebacks, integrating data from multiple sources, formats, and structures. Provide data to the respective Lines of Business Marts, Regulatory Marts, and Compliance Marts through self-service data virtualisation. Transform data to a common data model for reporting and data analysis, and provide data in a consistent, usable format to the company's data stakeholders. Drive utilisation of data integration tools and cloud data integration tools.

Data Modelling and Schema Build: In collaboration with Data Modellers, create data models and database schemas on the Data Reservoir, Data Lake, Atomic Data Warehouse, and Enterprise Data Marts of the company's data warehouse.

Automation: Automate, monitor, and improve the performance of data pipelines.
Collaboration: Collaborate with Data Analysts, Software Engineers, Data Modellers, Data Scientists, Scrum Masters, and Data Warehouse teams as part of a squad to contribute to detailed data architecture designs, take ownership of Epics end to end, and ensure that data solutions deliver business value.

Data Quality and Data Governance: Ensure that reasonable data quality checks are implemented in the data pipelines to maintain a high level of data accuracy, consistency, and security.

Performance and Optimisation: Ensure the performance of the company's data warehouse, integration patterns, batch and real-time jobs, streaming, and APIs.

API Development: Build APIs that enable the data-driven organisation, collaborating with Software Engineers to ensure that the data warehouse is optimised for APIs.

Certifications: Cloud, DevOps, or data engineering certification. Any data science certification will be an added advantage, e.g. Coursera, Udemy, SAS Data Scientist, or Microsoft Data Scientist.

Type of experience: Experienced at working independently within a squad, with the demonstrated knowledge and skills to deliver data outcomes without supervision. Experience with big data technologies such as Hadoop, Spark, and Hive. Experience with relational and NoSQL databases. Experience with cloud computing platforms such as AWS, Azure, and GCP. Experience with data visualization tools. Results-driven, analytical, creative thinker with a demonstrated ability for innovative problem solving.
Similar News: You can also read news stories similar to this one, collected from other news sources.
Data Engineer (Remote) - IT-Online
Data Engineer (Remote) - Western Cape, Century City
Mulesoft Integration Engineer - Remote
Senior Software Architect / Senior Full Stack C# Developer (Remote, anywhere in South Africa)
Junior Data Engineer - Remote
Software Quality Engineer II - Remote
