Principal Data Engineer | L'Oréal Careers

L'Oréal · Mexico City, Mexico · Onsite

Apply Now
About the Role

We seek a highly skilled and motivated Principal Data Engineer with a strong focus on cloud-native data platforms and modern DevOps practices to join our dynamic team. As the Principal Data Engineer, you will be pivotal in shaping the technical direction of our data-centric products and features, leveraging the power of AWS cloud services and the Databricks unified data analytics platform. This position requires deep technical expertise, a passion for data, and a proven ability to guide and mentor engineering teams in building scalable, reliable, and automated data solutions. You will be the subject matter expert for all data-related aspects of our product development lifecycle, ensuring operational excellence through DevOps principles. Experience in the commerce domain is a significant advantage. If you're excited by the challenge of building and optimizing high-impact data systems within a global beauty leader, we encourage you to apply.

Key Responsibilities:

Technical Leadership: Provide technical guidance to a team of data engineers, fostering best practices in AWS data services, Databricks development, and DevOps methodologies for the successful delivery of high-quality, scalable data solutions.

Architecture & Design: Lead the design and implementation of robust data architectures, pipelines, and systems that support our product strategy. This includes selecting appropriate AWS services (e.g., S3, API Gateway, Lambda, DynamoDB, IAM, Kinesis, EC2), designing end-to-end pipelines for the Databricks Lakehouse (Delta Lake), and ensuring data integrity and performance.

Data Strategy: Collaborate with product management and business stakeholders to define and execute our data strategy, aligning technical solutions with business objectives, particularly through the strategic adoption of cloud and data platform capabilities.

Code Quality & Performance: Champion best practices in code quality, testing, and performance optimization within Databricks notebooks/jobs and other data applications. Implement and advocate for CI/CD pipelines for data solutions, conduct thorough code reviews, and ensure the maintainability and scalability of our data infrastructure.

DevOps & Automation: Drive the adoption of DevOps principles across the data engineering lifecycle, including infrastructure as code (IaC) using tools like CloudFormation or Terraform, automated testing, deployment automation for Databricks workloads, and robust monitoring and alerting for data pipelines and services on AWS.

Innovation: Stay abreast of emerging technologies and features in the cloud ecosystem, the Databricks platform, and the broader data/DevOps space. Explore innovative solutions to enhance our products and data capabilities, promoting a culture of continuous improvement.

Collaboration: Work closely with cross-functional teams, including product management, data science (especially around AI models), and business intelligence, to deliver impactful data-driven solutions.

Technical Roadmap: Contribute to the development and execution of the technical roadmap for our data products, aligning with overall business goals and leveraging strategic cloud and data platform advancements.

Mentorship & Development: Mentor and guide junior team members, fostering their technical growth and development in cloud data engineering, Databricks, and DevOps practices.

Documentation: Create and maintain comprehensive technical documentation, including architecture diagrams for cloud data flows, design specifications for Databricks Lakehouse implementations, data dictionaries, and operational runbooks for automated pipelines and deployments. This documentation is essential for knowledge sharing, onboarding new team members, and ensuring the long-term maintainability of our data systems.

Qualifications:

Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Proven experience as a Technical Lead or Principal Engineer, with a strong focus on data-intensive applications and cloud environments.
Deep understanding of data warehousing, ETL/ELT processes, and data modeling techniques, particularly in a cloud-native context.
Expertise in designing, building, and optimizing data solutions on AWS, utilizing services such as S3, Lambda, Kinesis, EC2, DynamoDB, IAM, API Gateway, Step Functions, and DMS.
Extensive hands-on experience with the Databricks platform, including Spark, Delta Lake, Databricks SQL, MLflow, and workspace management.
Proficiency in SQL and at least one programming language (e.g., Python, Java, Scala) for data engineering tasks.
Demonstrated experience with DevOps principles and tools, including CI/CD pipelines (e.g., GitHub Actions), infrastructure as code (Terraform, CloudFormation), containerization (Docker), and monitoring tools.
Strong communication and collaboration skills, with the ability to articulate complex technical concepts to diverse audiences.
Passion for data and its potential to drive business value.

Bonus Points:

Experience with data visualization tools and techniques, integrating with Databricks and AWS data sources.
Familiarity with machine learning concepts and algorithms, and practical experience with MLOps on Databricks.
Experience in the cosmetics or beauty industry.
AWS certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified DevOps Engineer – Professional).
Experience designing and maintaining CI/CD pipelines within an agile methodology.
Databricks certifications (e.g., Associate Data Engineer, Professional Data Engineer).

The L'Oréal Group is convinced that difference is a deep source of wealth that allows everyone to grow, to challenge themselves, and to go further. We strongly encourage everyone to always dare and never censor themselves. Skills can always be learnt. We will be delighted to exchange with you!

Ready to build the future of L'Oréal together? Apply now!