
Hadoop Developer in Denver, CO (12+ Months) - 8647

Company Name:
UpStream Global Services
Hi,
We are looking for a Hadoop Developer to work with one of our major clients in the Denver, CO area. Please review the details below and let me know if you are interested.
Hadoop Developer
Core
Responsibilities:
o Develop Big Data solutions on a Hadoop platform, leveraging current ecosystem tools
o Develop solutions for real-time and batch-mode event/log collecting from various data sources
o Analyze massive amounts of data and help drive prototype ideas for new tools and products.
o Developing enterprise-grade integration solutions, leveraging 3rd party and custom integration frameworks
o Build and support APIs and services that are exposed to other internal teams
o Actively participate in team Agile planning and sprint execution

Requirements:
o Bachelor's or Master's in Computer Science or equivalent
o Proven track record of delivering backend systems that participate in a complex ecosystem.
o 5 years designing and developing Enterprise-level data, integration, and reporting solutions
o 3 years' experience developing applications on Hadoop, utilizing Pig, Hive, Sqoop, or Spark
o Experience with Hadoop 2.0 and YARN applications
o Proven experience with data modeling, complex data structures, data processing, data quality, and data lifecycle
o Current knowledge of Unix/Linux scripting, as well as solid experience in code optimization and high-performance computing
o Good communicator, able to analyze complex issues and technologies and articulate them clearly and engagingly
o Great design and problem-solving skills, with a strong bias for architecting at scale
o Good understanding of any of the following: advanced mathematics, statistics, or probability
o Adaptable, proactive, and willing to take ownership
o Keen attention to detail and high level of commitment.

Additional Skills:
o Experience with messaging and collection frameworks such as Kafka, Flume, or Storm
o 2 years of distributed database experience (HBase, Accumulo, Cassandra, or equivalent)
o Knowledge of Big Data-related technologies and open source frameworks preferred
o Experience in software development of large-scale distributed environments
o Experience with integration tools such as Pentaho or Informatica Big Data Edition
o Experience using Enterprise scheduling tools such as UC4, Tidal, or Autosys
For immediate consideration please contact:
Deepu
UpStream Global Services.
Reply to:
www.upstreamgs.com

Estimated Salary: $20 to $28 per hour based on qualifications.
