Monday, February 9, 2015

ECSI-6034-DevOps Engineer-contract-9 to 12 months-San Ramon, CA-SV

DevOps Engineer

Job Overview
Responsible for the activation, deployment, and daily operations of our industrial big data platform and partner technologies. The ideal candidate is a top-notch, hands-on engineering leader who thrives on delivering world-class systems and has a proven history of running a 24x7 technical operation in the cloud at scale. This individual deals with the complex, interrelated applications and systems being designed for today’s big data platforms. Leading an internal team of highly skilled technical resources as well as vendor engineering relationships, the selected individual will participate in big data platform projects from inception through the daily maintenance phase.

Responsibilities
The selected individual will:
•    Serve as Hadoop system administrator and DevOps engineer
•    Apply expert knowledge of Pivotal Hadoop and Greenplum
•    Define and evolve workflow processes with a focus on scale for data management and cloud enabled platforms
•    Maintain 24/7/365 client-facing production systems
•    Scale data access in a growing, geographically distributed system, driving availability up and driving latency down
•    Ensure uptime and performance SLAs are achieved while improving operability, maintainability, stability, and performance
•    Apply expert-level Linux experience
•    Lead disaster recovery and business continuity planning; protect the business by identifying and mitigating security risks and concerns
•    Work collaboratively with all levels of business stakeholders to architect, implement, and test Big Data-based analytical solutions drawing on disparate sources
•    Implement security and encryption for Big Data environments
•    Build monitoring, alerting, automated recovery, and scripting
•    Participate in an Agile SDLC to deliver new cloud platform services and components
•    Champion best practices for Linux administration and security in the delivery of cloud data services

Qualifications / Requirements

•    MS in Computer Science, Engineering, Math, or Physics.
•    10+ years' experience as a technology leader, with more than 5 years specializing in big data architecture at the senior level.
•    Operational experience managing Hadoop clusters, data warehouse/NoSQL systems, and large database implementation and support.
•    5+ years of hands-on experience with UNIX/Linux system administration.
•    Equal comfort in Microsoft and Unix/Linux environments, in terms of both tools and administration.
