Data Engineer – ETL

ProViso Consulting

Business group:

• The purpose of this role is to support the Director of Analytics at the bank’s Global Contact Centres in developing advanced analytics (AI/ML) capabilities through data engineering.
• The primary focus of the role is to work with the Customer Insights & Analytics Team and the Global Data & Analytics Team in building and maintaining data pipelines, and to assist the MIS & Reporting team with designing and evolving departmental reporting platforms and infrastructure.

Candidate Value Proposition:

• The successful candidate will have the opportunity to work within the bank. We are technology partners who help the business transform how our employees around the world work. You will get to work with and learn from diverse industry leaders who come from top technology organizations.

Typical Day in Role:

• Collaborate with stakeholders to deliver data models to address operational needs
• Combine multiple data sources across all contact center platforms and applications to support advanced analytics products
• Ingest massive volumes of structured and unstructured data; model, transform, and store it in a variety of data stores
• Support the Senior Data Engineer in defining data quality metrics and processes to monitor data in production environment
• Assist MIS & Data Analytics team with infrastructure development
• Develop ETL/ELT for analytics solutions using Python, Spark, SQL and Power BI
• Produce ad hoc analyses, deep-dives, and drill downs on specific issues, topics, or areas of opportunity (e.g. process improvements)
• Support the Senior Data Engineer in preparing reports and presentations to communicate findings to stakeholders
• Assist in mentoring and up-skilling peers for advanced analytics
• Streamline, enhance, and automate existing products to create capacity for the team to develop new solutions

Candidate Requirements/Must Have Skills:

• 8–10+ years using Python or other programming languages, including package management, dependencies, and deployment
• 8–10+ years using SQL for ETL and data analysis, with flexibility across syntaxes (SQL Server, PostgreSQL)
• 8–10+ years of data engineering experience working with cross-functional data teams
• 8–10+ years of experience with data modelling, data warehousing, and database design
• 8–10+ years of experience designing and building ETL/ELT, data pipelines, or data engineering solutions
• Strong experience with Linux tools and shell scripting

Nice-To-Have Skills:

• Experience with cloud architecture and security (Azure, AWS, GCP)
• Experience and understanding of various ML techniques including NLP
• Hands-on experience with Big Data ecosystem tools (e.g. Hadoop, Hive, Spark, BigQuery) and object storage (e.g. blob, MinIO, GCS).
• Understanding of Agile and Scrum methodologies and experience working in a Scrum environment (Jira and Confluence)
• Experience with Docker, CI/CD tools, Airflow, and Kubernetes
• French and / or Spanish fluency an asset
• Contact center experience an asset
• Experience with telephony data (Avaya, Genesys) and WFM data (Verint, Aspect) an asset

Soft Skills Required:

• Professional development skills
• Strong technical navigation skills to determine data sources, interpret technical documentation, and communicate requests and project requirements to IT partners
• Strong collaboration skills with ability to translate technical knowledge into business value
• Effective communication skills with ability to prepare project documentation and presentation for both technical and non-technical audiences
• Advanced problem-solving skills, and able to navigate an uncertain environment
• Excellent written, presentation, and verbal communication skills to be able to work well with technical peers and business stakeholders at different levels within the organization.
• Strong decision making, forward thinking and creative problem-solving skills to anticipate and respond quickly to technological/market influences.
• Ability to work as part of a team, as well as work independently or with minimal direction.


Education:

• University degree in science, computer science, math, statistics, finance, economics, or another quantitative field, or equivalent experience.

Best vs. Average Candidate:

• The best candidate will have both the must-have and nice-to-have skills.
• The best candidate will have solid experience with Python and PostgreSQL; data modeling; pipeline design and implementation; SQL query skills; Linux shell scripting; cloud experience; and machine learning and advanced data analysis skills.

Candidate Review & Selection:

• 1st round – Technical interview with a coding test in Python – MS Teams video interview – 1 hour – with the hiring manager and another team member
• 2nd round – Director and hiring manager – MS Teams video interview – 30 minutes – attitude and behavioural questions and previous experience

Job Details

9 months


