Founded in 2019, Level AI is a Series C startup headquartered in Mountain View, California. Level AI revolutionises customer engagement by transforming contact centres into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organisations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.
Competencies:
Data Modelling: Skilled in designing data warehouse schemas (e.g., star and snowflake schemas), with experience in fact and dimension tables, as well as normalization and denormalization techniques (a schema sketch follows this list).
Data Warehousing & Storage Solutions: Proficient with platforms such as Snowflake, Amazon Redshift, Google BigQuery, and Azure Synapse Analytics.
ETL/ELT Processes: Expertise in ETL/ELT tools (e.g., Apache NiFi, Apache Airflow, Informatica, Talend, dbt) to facilitate data movement from source systems to the data warehouse.
SQL Proficiency: Advanced SQL skills for complex queries, indexing, and performance tuning.
Programming Skills: Strong in Python or Java for building custom data pipelines and handling advanced data transformations.
Data Integration: Experience with real-time data integration tools like Apache Kafka, Apache Spark, AWS Glue, Fivetran, and Stitch.
Data Pipeline Management: Familiar with workflow automation tools (e.g., Apache Airflow, Luigi) to orchestrate and monitor data pipelines.
APIs and Data Feeds: Knowledgeable in API-based integrations, especially for aggregating data from distributed sources.
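To make the star-schema terminology above concrete, here is a minimal sketch of a fact table joined to two dimension tables. It uses sqlite3 purely as a stand-in engine, and every table and column name (dim_customer, fact_interactions, sentiment_score, and so on) is a hypothetical example, not a schema from Level AI's platform.

# Minimal star-schema sketch; sqlite3 stands in for a warehouse engine,
# and all names below are hypothetical placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes, often denormalised for query speed.
cur.execute("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region TEXT
)""")
cur.execute("""
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    full_date TEXT,
    month TEXT,
    year INTEGER
)""")

# The fact table stores measures plus foreign keys into each dimension.
cur.execute("""
CREATE TABLE fact_interactions (
    interaction_id INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    handle_time_seconds INTEGER,
    sentiment_score REAL
)""")

# A typical dashboard query joins the fact table to its dimensions and aggregates.
cur.execute("""
SELECT d.year, c.region, AVG(f.sentiment_score) AS avg_sentiment
FROM fact_interactions f
JOIN dim_customer c ON f.customer_key = c.customer_key
JOIN dim_date d ON f.date_key = d.date_key
GROUP BY d.year, c.region
""")
conn.close()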
Responsibilities -
Design and implement analytical platforms that provide insightful dashboards to customers.
Develop and maintain data warehouse schemas (e.g., star schemas with fact and dimension tables) to support efficient querying and data access.
Oversee data propagation processes from source databases to warehouse-specific databases and tools, ensuring data accuracy, reliability, and timeliness (see the orchestration sketch after this list).
Ensure the architectural design is extensible and scalable to adapt to future needs.
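As an illustration of the orchestration work described above, below is a minimal Apache Airflow sketch (Airflow is one of the workflow tools named in the competencies). The dag_id, the schedule, and the extract/load callables are hypothetical placeholders, not Level AI's actual pipeline.

# Minimal Airflow DAG sketch: propagate data from a source database to the warehouse.
# All identifiers here are assumptions for illustration only.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_from_source(**context):
    # Placeholder: pull incremental rows from the source database.
    print("extracting changed rows since the last run")

def load_into_warehouse(**context):
    # Placeholder: upsert the extracted rows into warehouse fact/dimension tables.
    print("loading rows into the warehouse")

with DAG(
    dag_id="source_to_warehouse_propagation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_source)
    load = PythonOperator(task_id="load", python_callable=load_into_warehouse)
    extract >> load  # load runs only after extraction succeeds

Keeping extraction and loading as separate tasks lets the scheduler retry or backfill each step independently, which is the usual reason to orchestrate propagation this way rather than running it as one script.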
Requirements -
Qualification: B.E/B.Tech/M.E/M.Tech/PhD from a tier 1 engineering institute, with relevant work experience at a top technology company.
3+ years of backend and infrastructure experience, with a strong track record in development, architecture, and design.
Hands-on experience with large-scale databases, high-scale messaging systems, and real-time job queues.
Experience navigating and understanding large-scale systems, complex codebases, and architectural patterns.
Proven experience in building high-scale data platforms.
Strong expertise in data warehouse schema design (star schema, fact tables, dimensions).
Experience with data movement, transformation, and integration tools for data propagation across systems.
Ability to evaluate and implement best practices in data architecture for scalable solutions.
Nice to have:
Experience with Google Cloud, Django, Postgres, Celery, Redis.
Some experience with AI Infrastructure and Operations.