At Dun & Bradstreet, we believe data has the power to create a better tomorrow. As a global leader in business decisioning data and analytics, we help companies worldwide grow, manage risk, and innovate. For over 180 years, businesses have trusted us to turn uncertainty into opportunity. We’re a diverse, global team that values creativity, collaboration, and bold ideas. Are you ready to make an impact and help shape what’s next? Join us! Explore opportunities at dnb.com/careers.
The Role:
The Data Quality Automation Engineer is responsible for automating and scaling existing Data Quality Insights processes across D&B’s global platforms and products. This role focuses on transforming manual, rule-based, and analyst-driven workflows into reliable, repeatable, and observable automated solutions.
The engineer partners closely with Data Quality Insights leadership, product teams, and data engineering to improve the consistency, timeliness, and accuracy of how data quality is measured, monitored, and reported across the enterprise.
Key Responsibilities:
Design, build, and maintain automation solutions that replace or augment existing manual data quality processes.
Translate existing data quality rules, checks, and validation logic into scalable, automated, production-ready pipelines and services.
Automate recurring Data Quality Insights workflows such as:
Accuracy and completeness measurement
Rule execution and exception handling
Monitoring, alerting, and reporting
Partner with Data Quality Insights leadership to ensure automated solutions preserve business intent and data meaning.
Employ advanced data analysis and profiling techniques to support automation and monitoring efforts.
Automate data quality monitoring solutions and internal processes to improve scale and reliability.
Implement and enhance a robust data validation framework with automated testing processes.
Create or update data models to ensure data is stored in an organized and usable structure.
Improve reliability and efficiency of DQ processes through orchestration, scheduling, and error handling.
Generate regular reports on data quality metrics and review data to identify trends or patterns that may indicate errors in processing.
Implement logging, metrics, and dashboards to support ongoing monitoring and auditability of data quality outputs.
Collaborate with platform, product, and analytics teams to integrate automation into existing data ecosystems.
Develop and maintain documentation of data quality processes, procedures, and automation assets.
Recommend improvements to data quality team internal processes.
Comply with data governance policies and procedures.
Key Requirements:
Bachelor’s degree in computer science or engineering, or equivalent experience.
Strong experience with data engineering or data automation in large-scale environments.
Experience implementing automated data validation and testing frameworks.
Proficiency in SQL and at least one programming language commonly used for data processing (e.g., Python, Java, or similar).
Familiarity with Airflow, GCP Composer, and Terraform.
Experience working with:
Data pipelines, ETL/ELT frameworks, or workflow orchestration tools
Structured and semi-structured data
Solid understanding of data quality dimensions (accuracy, completeness, consistency, timeliness).
Ability to translate business-defined rules into technical implementations.
Strong analytical, problem‑solving, and process‑improvement skills.
Strong communication skills and the ability to articulate data issues and solutions.
Ability to work independently while collaborating effectively across teams and time zones.
Commitment to meeting deadlines and supporting release schedules.
Preferred Qualifications:
Experience with DevOps best practices including CI/CD, automation, monitoring, and observability.
Experience automating data quality checks at enterprise scale.
Experience with cloud-based data platforms and distributed systems.
Understanding of ETL processes and their impact on data quality.
Experience working with monitoring, observability, or reporting tools.
Experience using AI‑assisted development tools such as Copilot Studio, Gemini Code Assist, or Claude Code.
Experience supporting global data products or platforms.
Additional Information:
All Dun & Bradstreet job postings can be found at https://jobs.lever.co/dnb. Official communication from Dun & Bradstreet will come from an email address ending in @dnb.com.
Notice to Applicants: Please be advised that this job posting page is hosted and powered by Lever, a subsidiary of Employ Inc. Your use of this page is subject to Employ's Privacy Notice and Cookie Policy, which governs the processing of visitor data on this platform.