Job # 546645
What you’ll be doing...
We're looking for a big data developer to join a small team responsible for building, deploying, and supporting a solution that enables financial operations for a large enterprise environment. You will design, build, and maintain enterprise-level data pipelines using the tools available within the big data ecosystem, and you will work on advanced analytics using technologies such as Hadoop and data warehousing.
- Build the end-to-end data pipeline for the Enterprise Planning Platform (EPP).
- Partner with the business units, data governance, data scientists, and infrastructure teams to translate potential data-driven ideas into fully developed analytical solutions.
- Improve existing data pipelines by streamlining them and increasing their performance.
- Design, build, and deploy the new data pipelines required to support the Digital Capital Allocation Tool (DCAT) in Oracle Cloud.
- Apply proper architecture guidelines to ensure highly available services.
- Develop and automate processes for the maintenance of the environment.
- Implement security measures for all aspects of the cluster (SSL, disk encryption, role-based access).
- Ensure proper resource utilization between the different development teams and processes.
- Design and implement a toolset that simplifies provisioning and support of a large cluster environment.
- Review performance stats and query execution/explain plans; recommend changes for tuning.
- Create and maintain detailed, up-to-date technical documentation.
- Keep management updated on projects and assigned work.
What we’re looking for...
You thrive in a collaborative environment and work well with multiple stakeholders at all levels.
You'll need to have:
- Bachelor’s degree or four or more years of experience.
- Six or more years of relevant work experience.
- Experience with the Hortonworks Data Platform for Hadoop.
- Experience with relational databases such as Oracle, Teradata, and SQL Server.
- Demonstrated design and development experience with data platforms such as Hadoop and Oracle.
Even better if you have:
- Master’s degree in Computer Science or related field.
- Strong understanding of ETL design patterns, best practices, and tools.
- Experience with Oracle PL/SQL, Hive, Python, and Spark.
- Experience with Linux, Unix, and shell scripting.
- Experience implementing security measures for all aspects of the cluster (SSL, disk encryption, role-based access).
- Strong analytical and problem-solving skills.
- Ability to communicate clearly and succinctly, both in writing and verbally, and to present products and ideas in a businesslike manner.
When you join Verizon...
You’ll have the power to go beyond – doing the work that’s transforming how people, businesses and things connect with each other. Not only do we provide the fastest and most reliable network for our customers, but we were first to 5G - a quantum leap in connectivity. Our connected solutions are making communities stronger and enabling energy efficiency. Here, you’ll have the ability to make an impact and create positive change. Whether you think in code, words, pictures or numbers, join our team of the best and brightest. We offer great pay, amazing benefits, and the opportunity to learn and grow in every role. Together we’ll go far.
Equal Employment Opportunity
We're proud to be an equal opportunity employer - and celebrate our employees' differences, including race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, and Veteran status. Different makes us better.