Data Engineer – GCP & Python

ProViso Consulting

Story Behind the Need:

• Business group: Enterprise Platform Architecture & Enablement – builds data pipelines end to end
• Project: An initiative with an external vendor (fraud and know-your-customer technology) – ensuring data is properly ingested and structured, and coordinating with fraud, digital, and business-partner teams to ensure data is effectively shaped. Stage: past proof of concept; currently coordinating with vendor partners to align their technologies with the bank's; early stages and ramping up.

Candidate Value Proposition:

• The successful candidate will have the opportunity to gain exposure to advanced big-data analytics technologies such as Spark, GCP, Kafka, and MongoDB, while working at a top-5 Canadian bank.
• An inclusive and collaborative working environment that encourages creativity and curiosity and celebrates success
• Exposure to different business lines where analytics techniques are being applied
• Assignment to hands-on, practical projects that provide an opportunity to gain new knowledge and develop skills

Typical Day in Role:

• Collaborate with business lines and other stakeholders to identify opportunities to drive business value through data engineering.
• Efficiently handle large volumes of structured and unstructured data through ingestion, modeling, transformation, and storage across diverse data stores. Leverage distributed computing tools (e.g., Spark, cloud platforms) for analysis, data mining, and modeling.
• Collaborate with operations and other analytics teams to deploy models and automate pipelines/workflows in production across different channels and platforms.
• Work with the team to design the project architecture and road map.
• Create and apply model and algorithm testing strategies to measure the effectiveness of models, and make ongoing changes.
• Prepare detailed documentation to outline data sources, models, and algorithms used and developed.
• Present results to business line stakeholders and help implement real world data-driven changes.
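The ingest → transform → store flow described above can be sketched with nothing but the Python standard library. This is a minimal, illustrative example only – the feed format, table name, and column names are hypothetical, and a real pipeline in this role would use Spark, GCP services, or similar rather than sqlite3:

```python
import csv
import io
import sqlite3

# Illustrative raw feed: CSV records from an upstream (e.g., vendor) source.
RAW_FEED = """txn_id,amount,currency
t-001,125.50,CAD
t-002,80.00,USD
t-003,19.99,CAD
"""

def ingest(raw: str) -> list[dict]:
    """Parse the raw feed into structured records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> list[tuple]:
    """Normalize types and keep only the records of interest (CAD here)."""
    return [
        (r["txn_id"], float(r["amount"]), r["currency"])
        for r in records
        if r["currency"] == "CAD"
    ]

def load(rows: list[tuple]) -> sqlite3.Connection:
    """Store the transformed rows in a queryable table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE txns (txn_id TEXT, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO txns VALUES (?, ?, ?)", rows)
    return conn

# Run the pipeline end to end, then query the result.
conn = load(transform(ingest(RAW_FEED)))
total = conn.execute("SELECT SUM(amount) FROM txns").fetchone()[0]
```

The same three-stage shape (ingest, transform, load) carries over directly when the stages are Spark jobs orchestrated by Airflow.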

Candidate Requirements/Must Have Skills:

• 10+ years of progressive experience as a Data Engineer, Software Developer, or in similar technical roles.
• 5+ years’ experience with SQL/NoSQL and relational database technologies/concepts, for both querying/manipulating datasets and managing/controlling databases.
• 3+ years’ experience with Google Cloud Platform, focused on data ingestion and on-prem-to-cloud data replication.
• 5+ years’ combined experience programming in Python or equivalent languages (Scala, Java) and with tools such as Spark and Airflow (please list which).
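As a small illustration of the SQL querying/manipulation skills listed above, here is a sketch using Python's built-in sqlite3 module; the `events` table and its columns are hypothetical examples, not part of the role's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event_type TEXT, ts INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        ("u1", "login", 100),
        ("u1", "purchase", 110),
        ("u2", "login", 105),
        ("u2", "login", 120),
    ],
)

# Aggregate: event count per user, ordered by most recent activity.
rows = conn.execute(
    """
    SELECT user_id, COUNT(*) AS n_events, MAX(ts) AS last_seen
    FROM events
    GROUP BY user_id
    ORDER BY last_seen DESC
    """
).fetchall()
```

The same GROUP BY / aggregate pattern applies unchanged in BigQuery or any other SQL engine used on GCP.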

Nice-To-Have Skills:

• Experience with dbt (data build tool)
• Experience in creating Docker images
• Experience/knowledge of MongoDB
• Experience from banking/financial services

Soft Skills Required:

• Excellent written, verbal, and interpersonal skills for executive level communication and collaboration
• Able to work in a fast-paced, constantly evolving environment and manage multiple priorities.
• Pragmatic and capable of solving complex issues.
• Experience working with a variety of cross-functional teams


Degrees or Certifications:

• Bachelor’s Degree or equivalent in Computer Science, Engineering, or a relevant field.

Best VS. Average Candidate:

• The strongest candidates will have hands-on programming experience with GCP, Python, and Spark.

Candidate Review & Selection:

• MS Teams video interviews – 1st round: hiring manager interview – 45 min max
• 2nd round: panel technical interview – 45 min max – potential exercise/case study component

Job Details

6 months
