Senior Hadoop Architect/Developer - 8463 Internet & Ecommerce - Denver, CO at Geebo

Senior Hadoop Architect/Developer - 8463

Company Name: UpStream Global Services
Senior Hadoop Architect/Developer
We have an immediate long-term contract opportunity for a Hadoop Architect/Developer with one of our Fortune 500 clients in Denver, CO.
Core Responsibilities:
o Design and develop Big Data solutions on a Hadoop platform, leveraging current ecosystem tools
o Design and develop solutions for real-time and batch-mode event/log collecting from various data sources
o Coach and cross-train team members in designing for and developing with Hadoop ecosystem tools
o Analyze massive amounts of data and help drive prototype ideas for new tools and products
o Develop enterprise-grade integration solutions, leveraging 3rd-party and custom integration frameworks
o Design, build and support APIs and services that are exposed to other internal teams
o Actively participate in team Agile planning and sprint execution
Requirements:
o Bachelor's or Master's degree in Computer Science or equivalent
o Proven track record of delivering backend systems that participate in a complex ecosystem.
o 8+ years designing and developing Enterprise-level data, integration, and reporting solutions
o 3+ years' experience developing applications on Hadoop, utilizing Pig, Hive, Sqoop, or Spark
o Experience with Hadoop 2.0 and YARN applications
o Proven experience with data modeling, complex data structures, data processing, data quality, and data lifecycle
o Current knowledge of Unix/Linux scripting, along with solid experience in code optimization and high-performance computing.
o Strong communicator, able to analyze complex issues and technologies and articulate them clearly and engagingly.
o Great design and problem-solving skills, with a strong bias for architecting at scale.
o Good understanding of at least one of the following: advanced mathematics, statistics, or probability.
o Adaptable, proactive and willing to take ownership.
o Keen attention to detail and high level of commitment.
Additional Skills:
o Experience in messaging and collection frameworks like Kafka, Flume, or Storm.
o 2+ years of distributed database experience (HBase, Accumulo, Cassandra, or equivalent).
o Knowledge of Big Data-related technologies and open-source frameworks preferred.
o Experience developing software for large-scale distributed environments
o Experience with integration tools such as Pentaho or Informatica Big Data Edition
o Experience using Enterprise scheduling tools such as UC4, Tidal, or Autosys
For immediate consideration, please contact:
Prescilla
UpStream Global Services
Estimated Salary: $20 to $28 per hour based on qualifications.
