Senior Enterprise Data Engineer at Sandisk

Sandisk · Bengaluru, India · Onsite

Apply Now
Company Description:

Sandisk understands how people and businesses consume data and we relentlessly innovate to deliver solutions that enable today’s needs and tomorrow’s next big ideas. With a rich history of groundbreaking innovations in Flash and advanced memory technologies, our solutions have become the beating heart of the digital world we’re living in and that we have the power to shape.

Sandisk meets people and businesses at the intersection of their aspirations and the moment, enabling them to keep moving and pushing possibility forward. We do this through the balance of our powerhouse manufacturing capabilities and our industry-leading portfolio of products that are recognized globally for innovation, performance and quality.

Sandisk has two facilities recognized by the World Economic Forum as part of the Global Lighthouse Network for advanced 4IR innovations. These facilities were also recognized as Sustainability Lighthouses for breakthroughs in efficient operations. With our global reach, we ensure the global supply chain has access to the Flash memory it needs to keep our world moving forward.

Job Description:
  • Hands-on involvement in designing, developing, and supporting low-latency data pipelines using Oracle Data Integrator (ODI) and Informatica, while ensuring data quality, accuracy, reliability, and efficiency
  • Provide operations support for the data warehouse, data pipelines, and ETL processes
  • Design and build scalable ETL/ELT pipelines for structured and unstructured data
  • Optimize SQL queries and data warehouse structures to improve overall performance and query efficiency
  • Collaborate with cross-functional teams, including enterprise architects, DBAs, business analysts, solution architects, and IT and business teams, to understand and implement a cross-functional data strategy and architecture
  • Implement data quality, lineage, and governance frameworks for reliability and compliance
  • Partner with BSAs and other data engineers to ensure compliance with data integration and data management standards, and review data solutions
  • Drive architecture reviews and long-term data strategy, ensuring alignment with business goals
  • Stay current with industry trends and best practices, and share knowledge and expertise to modernize the data platform

Qualifications:

REQUIRED

  • Bachelor’s degree or higher in Computer Science, Engineering, or a related field
  • 4+ years of experience in data management, including data integration/ETL (ODI, Informatica, Talend, etc.), data warehousing (Oracle), data lakes, master data management, data quality, data modeling, data analytics/BI, data enrichment, security, and governance
  • Minimum of 3 years of experience focused specifically on solution architecture for large, complex data management programs
  • Strong understanding of and experience building ELT/ETL solutions
  • Expertise in SQL and Python
  • Demonstrable experience working with data structures from a variety of SAP, ERP, CRM, and other data sources
  • Experience working with at least one major cloud data platform, such as AWS, Azure, or GCP
  • Experience and certifications in modern data platforms such as Databricks and Snowflake
  • Strong data modeling experience
  • Design and implement conceptual, logical, and physical data models to support business processes and reporting needs
  • Tune complex solutions, monitor system performance, and provide recommendations and means for improvement
  • Prototype new technologies and implement innovative solutions to enable teams to consume and understand data faster
  • Responsible for metadata management of the data domain, including data definitions, the data catalog, data lineage, documentation of data flows for critical processes, and SOX compliance
  • Partner with Data Governance analysts and Business Data Stewards
  • Maintain an in-depth understanding of business functions, processes, and relationships as they relate to data

SKILLS

  • Design and build scalable ETL/ELT pipelines for structured and unstructured data
  • Data modeling and data warehousing
  • Ability to analyze, communicate, and solve complex problems
  • SQL – advanced querying and optimization
  • Python – expert-level proficiency with data-focused libraries such as Pandas, NumPy, and PySpark
  • ETL/ELT concepts – scalable pipeline design and orchestration (ODI, Informatica, Talend)
  • Data visualization – Power BI
  • Cloud fundamentals – AWS, GCP, or Azure basics
  • Data engineering and analytics – data modeling, pipeline optimization, APIs and frameworks
  • Collaboration and communication – work cross-functionally with teams
  • Analytical thinking and troubleshooting – diagnose and resolve data issues effectively

Additional Information:

You’ll drive the evolution of the enterprise data warehouse strategy, enabling the organization to harness data at scale for smarter decision-making. This role offers the opportunity to lead transformative, enterprise-wide data initiatives, optimize data integration and governance, and shape how the company leverages advanced analytics and AI to deliver innovation, efficiency, and business growth.
