Position: Data Analyst - HADOOP
Location: Mountain View, CA
Duration: 6 months
**MUST have strong Hadoop and MapReduce experience.**
To elaborate on the role and expectations: the person should be strong in data warehousing, ETL, SQL, scripting, and big data (MPP + Hadoop).
The candidate need not be a Java programmer (though that is nice to have) but should be comfortable working in Hadoop and writing Pig/Hive scripts for data processing.
The client is not looking for people focused on ETL tools (Informatica, Ab Initio, etc.) - tool experience/expertise is fine to have, but it does not count toward this role.
Data Warehouse professional with proven experience in delivering business solutions at scale.
Must be familiar with big data technologies with demonstrable experience in building solutions with large data sets.
Work closely with architecture and engineering teams to design end-to-end data solutions and implement them in a timely fashion.
Work with business analytics/business operations and other partners (marketing, sales and finance) to understand and translate their business requirements.
Document business logic along with data lineage for solutions provisioned.
Required skills:
- Data Warehousing, ETL, relational databases, SQL, data modeling
- Hadoop scripting (Pig/Hive)
- Hadoop programming (Java MapReduce)
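For candidates gauging fit, the MapReduce experience called out above boils down to thinking in map and reduce phases. A minimal sketch of that model - a word count in pure Python, with no Hadoop dependency, using illustrative function names - might look like:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    # Mapper: emit a (word, 1) pair for every word in every input line
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle/sort: group pairs by key, as Hadoop does between phases
    for key, group in groupby(sorted(pairs), key=itemgetter(0)):
        # Reducer: sum the counts emitted for each word
        yield (key, sum(count for _, count in group))

lines = ["big data big", "data pipelines"]
counts = dict(reduce_phase(map_phase(lines)))
# counts == {"big": 2, "data": 2, "pipelines": 1}
```

In Hadoop proper, the same mapper/reducer split would be expressed as Java MapReduce classes or generated from a Pig/Hive script, with the framework handling the shuffle and sort across the cluster.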