Title: Big Data Engineer
Location: Jericho, NY (source locally first; candidates from outside the state may be considered if willing to relocate at their own expense)
Duration: C2H
Rate: $70.00 - $80.00 per hour C2C
Visa Type: H1, GC, US Citizen
Interview: 1 or 2 rounds of phone interviews followed by an on-site interview (a Skype or Webex interview may be available for out-of-state candidates)
Travel: no
Description:
The Big Data Engineer is responsible for the design, architecture, and development of projects powered by Google BigQuery and the MapR Hadoop distribution.
Must-Have Skills/Experience:
* Bachelor's degree required
* 5+ years of solution architecture experience with Hadoop
* Demonstrated experience in architecture, engineering, and implementation of enterprise-grade production big data use cases
* Extensive hands-on experience in MapReduce, Hive, Java, HBase, and the following Hadoop ecosystem products: Sqoop, Flume, Oozie, Storm, Spark, and/or Kafka
* Extensive experience in Shell Scripting
* Solid understanding of different file formats and data serialization formats such as ProtoBuf, Avro, and JSON
* Hands-on delivery experience working on popular Hadoop distribution platforms such as Cloudera, Hortonworks, or MapR (MapR preferred)
* Excellent communication skills
Nice to have:
* Experience coordinating the movement of data from original data sources into NoSQL data lakes and cloud environments
* Hands-on experience with Talend used in conjunction with Hadoop MapReduce/Spark/Hive.
* Experience with Google Cloud Platform (Google BigQuery)
* Source control (preferably GitHub)
* Knowledge of agile development methodologies
* Experience with notebook/IDE tools such as Hue, Jupyter, and Zeppelin
* Solid experience with ETL technologies and data warehousing concepts