FTC Data Engineer

June 14, 2021
Application ends: August 10, 2021
Apply Now

Job Description

Position Description  

To scale its ability to deliver consistently and globally, the Workplace Experience Services (WES) team under WSS is seeking individuals to contribute to the development of a large-scale data warehouse (soon to move to the cloud), leveraging that warehouse to cluster employees into personas and generate recommendations for hardware, software, training, and more. We are looking for a hands-on Data Engineer to lead the way for our Experience Analytics environment. The ideal engineer will bring experience, best practices, and a collaborative attitude to help drive all data engineering efforts, and must have a deep technical understanding of the underlying framework, application functionality, programming, and APIs.  

Job Functions/Duties and Responsibilities  

• We are looking for an individual who will take ownership of, and expand the opportunities within, the Data Engineer role  

• Work closely with the development team, architects, and the Product Owner to build efficient data pipelines and lifecycle processes leveraging the appropriate technology stack  

• Follow all Agile ceremonies without compromise, from backlog grooming and sprint planning through daily standups and retrospectives  

Skills Required  

• Bachelor’s degree in Computer Science, Master’s degree in Computer Applications, or a related technical field  

• At least 5 years of experience working with geographically distributed teams and infrastructure, preferably in a high-release-velocity environment, with relevant experience across the product lifecycle  

• A minimum of 3 years of strong experience in Python and SQL  

• Experience building schemas and data models to scale and automate workflows  

• Experience with and knowledge of modern data stores, including but not limited to Hadoop, relational databases, and cloud services (AWS, GCP, Azure)  

• Good understanding of creating and automating data pipelines in a Big Data environment using Sqoop, Hive, Spark, and Airflow  

• Knowledge of reporting/analytics techniques and tools such as Tableau and/or Power BI 

• Ability to coordinate with the Product Owner and development team to deliver a stable enterprise software product  

• Self-motivated, with exceptional oral and written communication and presentation skills: able to communicate clearly and concisely across all stakeholder groups and with staff from junior to senior levels  

Skills Desired  

• Strong knowledge of financial markets, Institutional Banking, and Wealth Management businesses  

• Working knowledge of JIRA or another workflow management tool  

• Strong SQL, HiveQL, Spark, ETL experience  

• Working knowledge of scheduling tools such as Autosys  

• Knowledge of working with a collaborative data science platform such as Dataiku  

• A strong drive to use analytics proactively, leveraging the right tools to be data-driven in the financial services domain  

• A good eye for detail; values quality and polishes work collaboratively with the team to produce great results  

Additional Details  

This role is based out of Bangalore, India. Work timings: 11 AM to 9 PM IST