Senior Snowflake Production Support - Bangalore at Capco Poland (Hybrid)
Capco Poland · Bengaluru, India · Hybrid

Job Title: Senior Snowflake Production Support - Bangalore
About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.
MAKE AN IMPACT
We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.
Location - Sarjapur, Bangalore
Experience - 8+ years
Job description – Snowflake production support
Job description – Snowflake Applications Senior Developer with Production Support – Offshore – Jefferies

Description:
1. Candidate with 8+ years of strong hands-on experience as a Senior Snowflake Developer with production support, specializing in the Snowflake cloud data warehousing platform on AWS/Azure.
2. Responsible for designing, implementing, and managing complex data solutions within Snowflake, including data modeling, optimizing data pipelines, ensuring data integrity, and providing strategic guidance for complex data challenges.
3. Strong at building custom web applications using Python/Java, programmatically connecting to Snowflake databases, Data Exchanges, and Snowflake internal Marketplace objects, and able to expose the data on UI pages built with Python UI or Streamlit libraries.
4. Good data engineering skills and thorough knowledge of data sharing on the Snowflake platform.
5. Good knowledge of Snowflake SQL/PL SQL and Snowpark function development, grounded in data warehousing principles.
6. Good understanding of Snowflake organizations, accounts, and history tables/views for capturing history information, as well as query tuning and caching.

Skills & responsibilities include:
• Implementation & production support: designing scalable data models and schemas within Snowflake, considering data access patterns and performance optimization.
• Data pipeline development: building and managing efficient data ingestion pipelines using Snowflake's data loading capabilities to integrate data from various sources.
• Snowflake knowledge: understanding of Snowflake features such as data warehousing concepts, storage options, data sharing, and security mechanisms.
• SQL/PL SQL optimization: writing and optimizing complex SQL queries to ensure efficient data retrieval and analysis within the Snowflake environment.
• Custom web/native applications with Snowflake: developing Python or Java web applications deployable to AWS, with good CI/CD skills (specifically Snowflake CI/CD), integrating with Snowflake tables/views/objects across various accounts, and integrating Snowflake with other data processing tools and applications through Snowflake SQL/REST APIs and connectors.
• Proficiency in SQL, specifically with the Snowflake database: extensive experience writing complex SQL queries, including advanced functions and optimization techniques.
• Data engineering skills: experience building data pipelines and ETL processes using Snowflake's data loading capabilities.
• Cloud computing understanding: familiarity with cloud infrastructure (AWS, Azure, GCP) and how it integrates with Snowflake; building web applications on AWS containers and exposing them through endpoints.
• Programming skills (Python/Java): proficiency in Python or other scripting languages for data manipulation and automation; good experience using IDEs such as VS Code/Anaconda and integrating with cloud tools from AWS/Azure.
• Communication and collaboration: excellent communication skills to collaborate effectively with data analysts, business users, and other technical teams.
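For candidates unfamiliar with the Python-to-Snowflake integration pattern the list above refers to: Snowflake's Python connector exposes a standard PEP 249 (DB-API 2.0) connection, so application code can be written against that generic interface. The sketch below is illustrative only (the table name and columns are hypothetical); it demonstrates the pattern against an in-memory SQLite database, since a real Snowflake connection would require account credentials via `snowflake.connector.connect(...)`.

```python
import sqlite3  # stand-in DB; snowflake.connector.connect() also returns a PEP 249 connection

def fetch_rows(conn, query):
    """Run a query on any DB-API 2.0 connection and return all rows."""
    cur = conn.cursor()
    try:
        cur.execute(query)
        return cur.fetchall()
    finally:
        cur.close()

# With Snowflake this would be something like:
#   conn = snowflake.connector.connect(account=..., user=..., password=...)
# Demonstrated here against an in-memory SQLite database with a hypothetical table:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?)", [("SNOW", 10), ("AAPL", 5)])

rows = fetch_rows(conn, "SELECT symbol, qty FROM trades ORDER BY symbol")
```

A Streamlit front end would then render `rows` (for example with `st.dataframe`), which is the "expose the data on UI pages" part of the role.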
If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
Apply now