PwC Hadoop Architect (PwC Labs) in Tampa, Florida
PwC Labs is focused on standardizing, automating, and delivering tools and processes, and on exploring emerging technologies that drive efficiency and enable our people to reimagine the possible. Process improvement, transformation, effective use of innovative technology and data & analytics, and leveraging alternative delivery solutions are key areas of focus to drive additional value for our firm.
The Data Lab will provide you with the opportunity to redesign, redefine, and redeploy how PwC uses data and analytics as a strategic asset across the enterprise. The focus is on assisting teams as they incorporate increased automation, machine learning, analytics, and open source technologies into their processes to deliver better quality output and contribute more strategically to organizational decision making.
To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be an authentic and inclusive leader, at all grades/levels and in all lines of service. To help us achieve this we have the PwC Professional; our global leadership development framework. It gives us a single set of expectations across our lines, geographies and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.
As a Manager, you’ll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
Pursue opportunities to develop existing and new skills outside of your comfort zone.
Act to resolve issues which prevent effective team working, even during times of change and uncertainty.
Coach others and encourage them to take ownership of their development.
Analyse complex ideas or proposals and build a range of meaningful recommendations.
Use multiple sources of information including broader stakeholder views to develop solutions and recommendations.
Address sub-standard work or work that does not meet the firm’s/client’s expectations.
Develop a perspective on key global trends, including globalisation, and how they impact the firm and our clients.
Manage a variety of viewpoints to build consensus and create positive outcomes for all parties.
Focus on building trusted relationships.
Uphold the firm’s code of ethics and business conduct.
We are seeking experienced, talented, and passionate Hadoop / Spark / Big Data software engineers with a deep understanding of large-scale distributed data processing applications and Java technologies to join our Data Lab team.
Job Requirements and Preferences:
Basic Qualifications:
Minimum Degree Required:
Additional Educational Requirements:
In lieu of a Bachelor's Degree, 12 years of professional experience involving technology-focused process improvements, transformations, and/or system implementations
Minimum Years of Experience:
Preferred Qualifications:
Degree Preferred:
Preferred Fields of Study:
Analytics, Artificial Intelligence and Robotics, Business Analytics, Computer and Information Science, Computer Engineering & Accounting, Management Information Systems, Mathematics
Preferred Knowledge/Skills:
Demonstrates thorough knowledge and/or a proven record of success in the following areas:
Administering ETL solutions focused on moving data from a highly diverse data landscape into a centralized data lake;
Performing Enterprise Data Lake administration, application scaling, performance evaluation, and architecture suggestions;
Administering Big Data pipelines in a Hadoop ecosystem using Kafka, Hive, Hive with Tez, HDFS, HBase, and Spark on YARN (Hortonworks HDP preferred);
Working with attribute-based security systems such as Atlas/Ranger, Collibra, etc.;
Establishing scalability, high availability, fault tolerance, and elasticity within a big data ecosystem;
Managing a Data Lake using Hortonworks HDP, Cloudera, or AWS EMR/S3;
Understanding Data Management, Data Lakes, Data Warehousing, and IaaS through SaaS (in essence, the complete stack), as well as working with HDFS, Zookeeper, Ambari, Ranger, Kerberos, and Apache NiFi data pipelines, and managing cost/elasticity in the cloud;
Applying full stack development with comfort and familiarity with continuous integration tooling such as Docker, Jenkins, Chef, and Puppet;
Creating physical architecture documentation, assisting in project planning, and leading staff in the operations of complex systems;
Building and orchestrating multiple clusters across a distributed enterprise cloud architecture built for scalability and performance;
Executing in Agile development, including Scrum and other lean techniques, as well as performance optimization for Spark and Hive SQL workloads;
Understanding the “Build, Ship, Monitor” philosophy.
All qualified applicants will receive consideration for employment at PwC without regard to race; creed; color; religion; national origin; sex; age; disability; sexual orientation; gender identity or expression; genetic predisposition or carrier status; veteran, marital, or citizenship status; or any other status protected by law. PwC is proud to be an affirmative action and equal opportunity employer.