Systems Engineer (Data Pipelines/Engineering)

at AutoZone, Inc. in Memphis, Tennessee, United States

Job Description

The Data & Cloud Engineer (hybrid position) will collaborate with AutoZone IT Operations, data scientists, analytics experts, and BI professionals. The Data & Cloud Engineer will support our data science software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that an optimal data delivery architecture remains consistent across ongoing projects in an Agile environment.

Key Responsibilities: The Data & Cloud Engineer will be responsible for expanding and optimizing our data and data pipeline architectures, optimizing data collection and data flow into the Google Cloud Platform, and providing exceptional technical guidance and support throughout our adoption of Google Cloud Platform (GCP) services in migrating, building, modernizing, and maintaining cloud applications. The ideal candidate is an experienced data pipeline builder, a data wrangler who enjoys optimizing data systems and building them from the ground up, and a continuous-improvement engineer always seeking a better way to deliver services. As the title implies, this is a multifaceted role involving the following responsibilities:

+ Architecture design

+ Design both data platforms and Cloud-based technical architectures, migration approaches, and application optimizations that enable business objectives

+ Implement the architecture and design of database systems and applications, and of data warehouse and Business Intelligence systems

+ Recommend ways to continuously improve data reliability and quality

+ Development of data-related instruments/instances

+ First and foremost, your role as a data engineer is that of a developer. You will use your programming skills to develop, customize, and manage integration tools, databases, warehouses, and analytical systems, employing an array of programming languages and tools to connect systems together

+ Integrate up-and-coming data management and software engineering technologies into existing data structures

+ Develop standardized processes for data mining, data modeling, and data production

+ Data pipeline development/maintenance/testing

+ During the development phase, you will collaborate with the data science and testing teams to test the reliability and performance of each part of a system

+ You will be responsible for the development, deployment, and maintenance of several tools/systems (e.g., Kafka, Talend, GCP Cloud Storage (GCS), Snowflake Data Exchange) used in the transport and sharing of data

+ Data pipeline stability monitoring

+ Monitor the overall performance and stability of the data pipeline system(s), including the automated parts of a pipeline, to ensure the uninterrupted flow of data from internal/external systems into the data warehouse, and modify the pipeline as data, models, and requirements change

+ Analytics/Machine Learning algorithm and model deployment

+ Implement analytics and Machine Learning models and deploy them into production in on-premise and Cloud environments. This also entails providing the model with data stored in a warehouse or coming directly from sources, configuring data attributes, managing computing resources, setting up monitoring tools, etc.

+ Manage data and metadata

+ Play a key role in maintaining the integrity of data, whether from the data warehouse or from Data Governance or Master Data Management initiatives

+ Manage metadata and ensure that data is structured properly via database management systems

+ Manage data access, data management, and analytics tools

+ Set up, configure, and maintain the BI, data warehousing, and ML tools used by the team

Other responsibilities will include:

+ Working with AutoZone IT to deliver best practices and recommendations

+ Serving as a technical advisor and performing troubleshooting to resolve technical challenges

+ Creating tutorials, blog articles, and sample code for the rest of ALLDATA

+ Designing, constructing, installing, testing, and maintaining data management and other production systems

+ Ensuring that all systems meet business/company requirements as well as industry best practices

+ Collaborating with members of your team on the project’s goals

+ Continuously updating disaster recovery procedures for systems under our direct control

+ Ensuring end-to-end security at all levels in accordance with company policy

+ Working independently with limited supervision, and with other department personnel

+ Participating in the product team scrums

+ Managing work using Agile methodologies

Role Requirements:

+ Bachelor’s degree in Computer Science, CIS, or related field required

+ 5+ years of experience with Java and/or Python

+ 2+ years of Unix Shell Scripting

+ 3+ years of experience on project(s) implementing solutions using the software development life cycle (SDLC)

+ 5+ years of experience in software development or a related field


Job Posting: JC182277368

Posted On: Apr 26, 2021

Updated On: Sep 25, 2022