Job Description
Location: Mumbai/Pune
Experience: 8 – 12 Years
Notice Period: Immediate to 30 Days
Type: Work From Office
Educational Qualification: Bachelor’s degree or equivalent experience.
Preferred Experience: Snowflake, ETL, JSON, AWS (S3), and any other cloud implementation expertise
Do you love finding innovative solutions to challenging issues? Do you have a digital worldview that allows you to dream, plan, design, develop and maintain the data architecture? If so, we have the ideal position for you!
We are looking for a Snowflake Senior Data Engineer to join our team and help us navigate our constantly evolving Data Integration and Data Warehouse projects in Snowflake.
We want to hear from you if you’re up for the challenge of being a Snowflake Senior Data Engineer!
Join our team of data engineering experts on a mission to tackle the most challenging data problems and help us implement the company’s data strategy while adhering to standard procedures and techniques. Submit your application today.
Responsibilities:
- Lead and architect the migration of the data analytics environment from RDBMS to Snowflake, ensuring performance and reliability.
- Assess and understand the ETL jobs, workflows, BI tools and reports.
- Responsible for the end-to-end architecture of a delivery, including its assembly and integration into the IT architecture principles defined with the client.
- Responsible for designing and implementing API integrations as well as native Snowflake capabilities (Snowpipe, Snowpark, Spark, Snowflake Streams) to load data from various sources into the Snowflake data warehouse.
- Responsible for optimizing Snowflake’s performance by designing and implementing efficient data processing, storage, and retrieval processes. This includes optimizing queries, indexing, partitioning, and other performance-related features.
- Must communicate the benefits and limitations of Snowflake in the interest of the organization and help stakeholders understand how best to leverage its capabilities.
- Lead and manage the cloud infrastructure where the Snowflake data warehouse is hosted, including monitoring performance, managing scalability, and optimizing costs.
- Responsible for the management and mitigation of technical risks, ensuring that delivery services can realistically be supported by the underlying technology components.
- Demonstrated ability to successfully complete multiple complex technical projects and to create high-level design and architecture of the solution, including class, sequence, and deployment/infrastructure diagrams.
- Collaborate with other teams and stakeholders, such as data engineers, data analysts, and business users, to understand their data requirements and design solutions that meet their needs.
- Experience with gathering end-user requirements and writing technical documentation
- Time management and multitasking skills to effectively meet deadlines under time-to-market pressure
- Must suggest innovative solutions to teams based on new technologies and the latest trends.
- Review the architectural/technological solutions for ongoing projects and ensure the right choice of solution.
Skills Required
- Must have 8–12+ years of total IT experience, including 3+ years working as a Snowflake Data Architect and 8+ years in Data Warehouse, ETL, and BI projects. Understanding of Snowflake-compatible ELT tools such as dbt, Fivetran, etc.
- In-depth understanding of Snowflake architecture, including SnowSQL, performance tuning, and compute and storage
- Experience in creating tables, partitioning, bucketing, loading and aggregating data using Snowflake
- Must have experience with at least two end-to-end implementations of the Snowflake cloud data warehouse and three end-to-end on-premises data warehouse implementations, preferably on Oracle.
- Expertise in Snowflake data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
- Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, and an understanding of how to use these features
- Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, and Snowpark, including techniques using Python
- Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
- Experience with data security and data access controls and design
- Strong analytical and problem-solving skills; strong communication skills
- Proven and routine attention to detail, organization, quality and deadlines
What We Offer:
- Career and competence support.
- Clearly defined career paths
- Personal Accident Policy
- Paid Maternity Leave and Paternity Leave
- Employee Assistance Program
- Gratuity
- Relocation Assistance
- Open Door Policy
- Disability Income Protection
- Equal Employment Opportunity