
Systems Developer/Data Engineer in San Francisco, CA at APEX Systems

Job Snapshot

  • Employee Type: Contractor
  • Job Type: Other
  • Experience: Not Specified
  • Date Posted: 2/15/2018

Job Description

Job #: 822855

Job Overview:

Our systems development team is looking for a Developer who is passionate about working with data and building solutions that support our Analytic Systems Solution stack, which includes the Hortonworks Hadoop distribution, SAS (Linux environment), and Tableau Server/Desktop. The successful candidate will have extensive experience as a data engineer or ETL developer building and automating data transformation and loading procedures. Strong knowledge of and experience with Hive, SQL, SAS, and Hadoop for data profiling/discovery, data modeling, and process automation is required. The candidate must be comfortable working with data from multiple sources: Hadoop, DB2, Oracle, and flat files. The projects are detail intensive, requiring accurate capture and translation of data requirements (both tactical and analytical) and validation of the working solution. We work in a highly collaborative environment, partnering closely with cross-functional team members: Business Analysts, Product Managers, Data Analysts, and Report Developers. Perform other duties as assigned.
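
The stack described above lends itself to a brief illustration. Below is a minimal sketch, assuming a Hortonworks cluster with a configured Hive metastore, of the kind of transform-and-load step this role would build and automate, written in PySpark (Python and SQL are both named in the qualifications). The database, table, and column names (raw.sales, curated.sales_daily, and so on) are hypothetical placeholders, not details from the posting.

    # Minimal sketch, assuming a Hadoop cluster with Hive metastore access from Spark.
    # All database/table/column names below are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("daily_sales_load")   # hypothetical job name
        .enableHiveSupport()           # lets Spark read and write Hive tables
        .getOrCreate()
    )

    # Profile and aggregate a raw landing table for a single processing date.
    daily = spark.sql("""
        SELECT store_id,
               CAST(txn_ts AS DATE) AS txn_date,
               COUNT(*)             AS txn_count,
               SUM(amount)          AS total_amount
        FROM   raw.sales
        WHERE  CAST(txn_ts AS DATE) = DATE '2018-02-14'
        GROUP BY store_id, CAST(txn_ts AS DATE)
    """)

    # Load the result into a curated Hive table; a production job would typically
    # overwrite only the partition for the processing date rather than the whole table.
    daily.write.mode("overwrite").saveAsTable("curated.sales_daily")

    spark.stop()

In practice a job like this would be parameterized by run date and wired into the scheduling and error-handling pattern sketched after the Essential Functions list below.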

Essential Functions:

• Design, develop, and implement end-to-end solutions on the Hortonworks Hadoop distribution; translate business requirements into a technical design plan.

• Automate, deploy, and support solutions scheduled via crontab or Control-M; deployment includes proper error handling, dependency controls, and the necessary alerts (a wrapper sketch follows this list). Triage and resolve production issues and identify preventive controls.

• Build rapid prototypes or proofs of concept to assess project feasibility.

• Document technical design specifications explaining how business and functional requirements are met. Document operations runbook procedures with each solution deployment.

• Identify and propose improvements to the analytics ecosystem's solution design and architecture.

• Participate in Hadoop and SAS product support, such as patches and release upgrades. Provide validation support for Hadoop and SAS products, including any changes to other infrastructure, systems, or processes that impact the Analytics infrastructure.

• Participate in full SDLC framework using Agile/Lean methodology.

• Support non-production environments with the Operations and IT teams.

• Regular, dependable attendance & punctuality.
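
The scheduling, dependency-control, and alerting duties above can be illustrated with a thin wrapper around a load job. This is a sketch only: the crontab schedule, script paths, log location, and the HDFS _SUCCESS flag used as a dependency check are assumptions, and a Control-M job definition would take the place of cron in an environment that uses it.

    #!/usr/bin/env python
    # Sketch of a cron/Control-M wrapper with basic error handling and alerting hooks.
    # Example crontab entry (hypothetical path and schedule):
    #   30 2 * * * /opt/etl/run_daily_load.py >> /var/log/etl/daily_load.log 2>&1
    import logging
    import subprocess
    import sys

    logging.basicConfig(
        filename="/var/log/etl/daily_load.log",   # hypothetical log location
        format="%(asctime)s %(levelname)s %(message)s",
        level=logging.INFO,
    )

    def main():
        # Dependency control: confirm the upstream extract finished before loading.
        # (A real deployment might check a Control-M condition instead of an HDFS flag.)
        ready = subprocess.run(
            ["hdfs", "dfs", "-test", "-e", "/data/raw/sales/_SUCCESS"]  # hypothetical flag
        )
        if ready.returncode != 0:
            logging.error("Upstream extract not ready; aborting this run.")
            return 1

        # Run the transform-and-load step (hypothetical spark-submit invocation).
        job = subprocess.run(["spark-submit", "/opt/etl/daily_sales_load.py"])
        if job.returncode != 0:
            logging.error("daily_sales_load failed with exit code %s", job.returncode)
            return job.returncode

        logging.info("daily_sales_load completed successfully.")
        return 0

    if __name__ == "__main__":
        # A non-zero exit status is what lets cron mail the output or a monitor raise an alert.
        sys.exit(main())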

Qualifications:

Education/Experience:

• Bachelor's degree in Computer Science/Engineering, Analytics, Statistics or equivalent work experience.

• 4+ years of work experience in Data Engineering, ETL Development and Data Analytics.

• 4+ years of hands-on experience using SQL and a scripting language such as Unix shell or Python.

• 3+ years of hands-on experience developing on a Linux platform.

• 2+ years of hands-on experience working with traditional RDBMSs such as Oracle and DB2.

• 1+ years of hands-on experience working in Hadoop using Hive, HDFS, Tez, MapReduce, and Sqoop.

• 1+ years of hands-on experience with a scripting language such as Python, or with SAS using Base SAS, SAS Macro, and SAS/STAT.

• Experience with Spark, PySpark, Zeppelin, and Jupyter Notebook is a plus.

• Demonstrated experience implementing and automating ETL processes on large data sets.

• Experience with report development and supporting data requirements for reporting.

• Strong knowledge of Hadoop / Big Data architecture and operational workings.

Communication Skills:

• Strong communication skills.

• Ability to collaborate and negotiate in a cross-functional team environment.

Reasoning Ability:

• Ability to multi-task and meet deadlines.

• Ability to work with diverse teams and multiple technologies.

Work Hours:

• Ability to work a flexible schedule based on department and store/company needs.