Big Data Engineer

at Wal-Mart Associates, Inc. in Bentonville, Arkansas, United States

Job Description

Duties: Responsible for defining data elements and business requirements for the Enterprise Data Lake; analyzing and strategizing programs to collect, store, and visualize data from various sources and designing data elements into a structured data model; designing data models at the enterprise level using modeling tools such as Erwin, Visio, and Lucidchart; preparing and documenting metadata definitions for the Enterprise Data Lake; designing and developing Big Data applications using Core Java and Scala; deploying applications and workflows into cloud environments; utilizing real-time streaming technologies (Spark) and applying competent knowledge of the Hadoop ecosystem; implementing cloud-based systems, including Infrastructure and Platform as a Service; working with NoSQL databases (Cassandra or MongoDB); creating data pipelines using ETL technologies; interpreting business requirements and converting them to technical designs; writing shell scripts in Linux to invoke and schedule applications; utilizing version control systems (Git or SVN); writing complex SQL and database-level programs, including procedures, functions, and triggers, using Oracle SQL and PL/SQL; and monitoring and tuning applications in real time using database tools including SQL Explain Plan, SQL Profiler, AWR reports, and SQL Analyzer.

Minimum education and experience required: Bachelor's degree or the equivalent in Computer Science, Information Technology, Engineering, Business Administration, or a related field, plus 5 years of software engineering or related experience.

Skills required: Must have experience with: implementing RESTful web services using Java, Spring Boot, Apache Maven, SonarQube, Tomcat, XML beans, Eclipse IDE, and OOAD concepts; creating and deploying batch scripts using COBOL, JCL, DB2 SQL, and Easytrieve; Mainframe COBOL and utilities, CA7 Scheduler, and file transmission using SFTP; providing enterprise solutions for the implementation of REST-based CICS web services; creating report programs using Adabas, DB2, and IMS; developing applications using Sencha Ext JS, JavaScript, CSS, HTML, Azure Cloud, microservices, Eureka, Elastic, and WebLogic; extracting business rules by analyzing legacy applications; creating and managing application environments and DevOps platforms using OneOps; creating DB2 and SQL stored procedures; using Scrum Agile methodology and performing ceremonies including the Daily Scrum Meeting, Sprint Planning, Sprint Backlog meeting, and Sprint Retrospective; scheduling ETL and Spark jobs using Control-M and cron; working on incident management with Remedy and ServiceNow; and conducting bug tracking and project management with JIRA. Employer will accept any amount of experience with the required skills.

To apply for this position: Send your resume to and reference the following Job ID number: R-1110001


Job Posting: 3430751

Posted On: Jun 21, 2022

Updated On: Jul 22, 2022