Data Engineer
Bengaluru, IN
About SKF
SKF started its operations in India in 1923. Today, SKF provides industry leading automotive and industrial engineered solutions through its five technology-centric platforms: bearings and units, seals, mechatronics, lubrication solutions and services. Over the years the company has evolved from being a pioneer ball bearing manufacturing company to a knowledge-driven engineering company helping customers achieve sustainable and competitive business excellence.
SKF's solutions provide sustainable ways for companies across the automotive and industrial sectors to achieve breakthroughs in friction reduction, energy efficiency, and equipment longevity and reliability. With a strong commitment to research-based innovation, SKF India offers customized value-added solutions that integrate all its five technology platforms.
To know more, please visit: www.skf.com/in
SKF India has been recognized as a "Top Employer 2024" by the Top Employers Institute, acknowledging excellence in People Practices.
To know more, please visit: https://www.skf.com/in/news-and-events/news/2024/2024-01-18-skf-india-recognized-as-top-employer-2024-by-top-employers-institute
About Automotive
SKF is a leading global supplier in the automotive industry, providing solutions for various applications such as wheel end, powertrain, driveline, and suspension. SKF's automotive business is focused on delivering innovative solutions to OEMs and the aftermarket. The company has a strong presence in the automotive market, with a range of products and services designed to improve performance, reduce friction, and increase efficiency. SKF's automotive business is constantly evolving to meet the changing needs of the industry, with a focus on sustainability, electrification, and digitalization.
SKF Purpose Statement
Together, we re-imagine rotation for a better tomorrow.
By creating intelligent and clean solutions for people and the planet
JOB DESCRIPTION
Job Title: Data Engineer
Role Reports To: Product & Data Engineering Manager, VA
Role Type: Individual Contributor
Location: Bangalore
Role Purpose:
Design, build, and maintain the data infrastructure and systems that support SKF VA's data needs. Leveraging skills in data modeling, integration, processing, storage, retrieval, and performance optimization, this role helps VA manage and utilize its data more effectively.
Key Responsibilities
- Build a scalable, secure, and compliant VA data warehouse using Snowflake technologies, including designing and developing Snowflake data models
- Work with central data warehouses such as SDW, MDW, and OIDW to extract data and enrich it with VA-specific customer groupings, program details, etc.
- Data integration: Responsible for integrating data from ERPs, BPC, and other systems into Snowflake and SKF standard data warehouses, ensuring that data is accurate, complete, and consistent.
- Performance optimization: Responsible for optimizing the performance of Snowflake queries and data loading processes. This involves tuning SQL queries, defining clustering keys, and optimizing data loading pipelines.
- Security and access management: Responsible for managing the security and access controls of the Snowflake environment, including configuring user roles and permissions, managing encryption keys, and monitoring access logs.
- Maintain existing databases and warehouse solutions, addressing support needs, enhancements, troubleshooting, etc.
Metrics
- Technical metrics: data quality across the VA business unit, data processing time, data storage capacity, and system availability
- Business metrics: data-driven decision making, data security and compliance, cross-functional collaboration
Competencies
- Should have a good understanding of data modeling concepts and be familiar with Snowflake's data modeling tools and techniques
- Specialized in native data application development, with a proven track record of expertise in Snowflake, Snowpark packages, Streamlit, and more
- Experienced in creating streamlined and efficient data applications, leveraging modern open-source packages for user interface development and Snowflake for seamless data integration
- SQL: Should be an expert in SQL, able to write complex queries and optimize SQL performance in Snowflake
- ETL/ELT: Should be familiar with Snowflake's ETL/ELT tools and techniques.
- Should have a good understanding of cloud computing concepts and be familiar with the cloud infrastructure on which Snowflake operates.
- Good understanding of data warehousing concepts and be familiar with Snowflake's data warehousing tools and techniques
- Familiar with data governance and security concepts
- Able to identify and troubleshoot issues with Snowflake and SKF’s data infrastructure
- Experience with Agile solution development
- Good to have: knowledge of SKF ERP systems (XA, SAP, PIM, etc.) and data related to sales, supply chain, and manufacturing
Candidate Profile:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 5-8 years of overall experience, with a minimum of two years of experience in Snowflake