Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us! Learn more at dnb.com/careers.
About Us:
Our global community of colleagues brings a diverse range of experiences and perspectives to our work. You'll find us working from a corporate office or plugging in from a home desk, listening to our customers and collaborating on solutions. Our products and solutions are vital to businesses of every size, scope and industry. And at the heart of our work you'll find our core values: to be data inspired, relentlessly curious and inherently generous. Our values are the constant touchstone of our community; they guide our behavior and anchor our decisions.
KEY RESPONSIBILITIES:
Design and Develop Data Pipelines:
Architect, build, and deploy scalable and efficient data pipelines within our Big Data ecosystem using Apache Spark and Apache Airflow.
Document new and existing pipelines and datasets to ensure clarity and maintainability.
Data Architecture and Management:
Demonstrate familiarity with data pipelines, data lakes, and modern data warehousing practices, including virtual data warehouses and push-down analytics.
Design and implement distributed data processing solutions using technologies like Apache Spark and Hadoop.
Programming and Scripting:
Exhibit expert-level programming skills in Python, with the ability to write clean, efficient, and maintainable code.
Cloud Infrastructure:
Utilize cloud-based infrastructures (AWS/GCP) and their various services, including compute resources, databases, and data warehouses.
Manage and optimize cloud-based data infrastructure, ensuring efficient data storage and retrieval.
Workflow Orchestration:
Develop and manage workflows using Apache Airflow for scheduling and orchestrating data processing jobs.
Create and maintain Apache Airflow DAGs for workflow orchestration.
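The orchestration work described above can be pictured with a minimal sketch of an Airflow DAG. This assumes Apache Airflow 2.x; the DAG id, schedule, and the extract/transform/load task functions are illustrative placeholders, not taken from this posting.

```python
# Minimal sketch of an Airflow DAG orchestrating a daily three-step pipeline.
# Assumes Apache Airflow 2.x is installed; task names and schedule are
# illustrative, not from the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw records from a source system (placeholder)."""
    print("extracting...")


def transform():
    """Clean and aggregate the extracted data (placeholder)."""
    print("transforming...")


def load():
    """Write results to the warehouse (placeholder)."""
    print("loading...")


with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Declare the dependency chain: extract -> transform -> load
    t_extract >> t_transform >> t_load
```

The `>>` operator sets task ordering, so the scheduler runs `extract` before `transform` and `transform` before `load` on each daily run.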
Big Data Architecture:
Possess strong knowledge of Big Data architecture, including cluster installation, configuration, monitoring, security, resource management, maintenance, and performance tuning.
Innovation and Optimization:
Create detailed designs and proofs of concept (POCs) to enable new workloads and technical capabilities on the platform.
Collaborate with platform and infrastructure engineers to implement these capabilities in production.
KEY REQUIREMENTS:
Minimum of 8 years of hands-on experience with Big Data technologies, e.g. Hadoop, Spark, Hive.
Minimum of 3 years of experience with Spark.
Hands-on experience with Dataproc is a huge plus.
Minimum of 6 years of experience in cloud environments, preferably GCP.
Any experience with NoSQL and graph databases.
Hands-on experience managing solutions deployed in the cloud, preferably on AWS.
Experience working in a global company; experience working in a DevOps model is a plus.