JPMorgan Chase & Co.
Wilmington, DE 19801
Software Engineering

Job Description

Duties: Gather requirements and analyze data sources. Build scalable, distributed data solutions using Hadoop and Spark. Analyze the Hadoop cluster and other Big Data analytic tools, including Pig, Hive, the HBase database, and Sqoop. Develop real-time analytics using Spark Streaming and Scala. Assist with project development to ensure projects are executed in a timely manner. Coordinate end-to-end delivery of modules. Design and document process flows.
Minimum education and experience required: This position requires a Bachelor's degree in Computer Engineering, Computer Science, Information Technology, or a related field of study, plus seven (7) years of experience in the job offered or seven (7) years of experience as a Programmer Analyst, Consultant, Lead Engineer, or in a related occupation.

Skills Required: This position requires seven (7) years of experience with the following skills: Hadoop; Hive; Spark; Java; Python; SQL; and Unix.
Req #: 200025318
Location: Wilmington, DE US
Job Category: Technology
Employment Type: Full Time
Potential Referral Amount: US Dollar (USD)


Posted: 2020-03-26 Expires: 2020-04-25
