Data Engineer

ProViso Consulting

Story Behind the Need:

• Business group: GBT Risk & Resiliency
• You will be part of the Resilience Analytics team that is part of Technology Internal Controls and Regulatory Management (ICRM) for Global Technology & Enterprise Platforms (GTEP) at Client. The GTEP ICRM team plays an important role in the Bank’s Risk Governance Framework, providing First Line of Defense for GTEP and the Bank for all technology risk domains, including Cyber Security, Data Privacy, Software Currency, Disaster and Backup Recovery, Third Party Management, and Audit and Regulatory issue remediation.
• Project: Stable and resilient systems drive delivery of reliable business applications to our customers. The Resilience Analytics team’s mandate is to design metrics, gather data, analyze, and provide insights to executives, execution teams, board representatives, and external regulators. Many of our projects involve designing new metrics and gathering new data sources, where creativity and relationship building with potential sources are critical. We tell our story of resilience to a very senior audience, so a strong command of English is important, as is the ability to analyze data and find insights. We work with process owners, technology owners, application owners, and various risk, audit, and regulatory bodies across the globe.

Candidate Value Proposition:

• A 12-month contract position focused on delivering value within a data analytics framework using technology operations data.
• You will be part of the team transforming the way we do and measure IT services to build reliable and resilient systems for our customers.
• The role is mostly remote with occasional trips to work with team members (usually downtown). On-site access can be made available if required in several locations.
• We have an inclusive and collaborative environment encouraging creativity and curiosity.
• You’ll get to work with and learn from diverse industry leaders.
• We foster an environment of innovation and continuous learning.
• We care about our people, allowing them to design how they work to deliver amazing results.

Typical Day in Role:

• Reporting to the Director, Resilience Data Analytics, the Data Analytics Solution Expert (DASE) will be a senior contributor to the overall success of the Technology Internal Controls and Regulatory Management for GTEP by ensuring goals, plans, and initiatives are executed and delivered in support of the team’s business strategies and objectives. The DASE conducts all activities in compliance with governing regulations, internal policies and procedures.
• You are coming into this role to solve a single problem, but we hope to leverage your wide and varied skills to enhance our team’s overall performance.
• You love solving a problem from beginning to end and are not shy about contributing to all levels of that solution from presentations to facilitating discussions to designing and even developing the solution.
• You can find the pattern in the chaos to recommend areas of continued study for risk mitigation that will improve technology stability.
• You will innovate to streamline and automate the processes within Resilience Analytics with an eye to the dimensions of data quality including timeliness, completeness, and consistency.
• You have an eye for technical design to help guide the engineering team in architecting data solutions involving ETL from data sources and streamlining presentation in dashboards.
• You are comfortable talking infrastructure and technology on a wide variety of platforms (cloud such as GCP, private cloud, virtual machines, mid-range, AS/400, mainframe, etc.).
• You enjoy interfacing with and managing stakeholders and data contributors across multiple business units and technical areas of expertise.
• Your goal is to tell our resilience and reliability story around backups taken for our applications. Where are data backups taken and stored? Can they be used and accessed to meet our recovery time and recovery point objectives? Are they immutable, protecting us from cyber incidents? Help us identify gaps in our resilience and create a roadmap for improved performance.
• You will interface with multiple technology teams (back-up tool owners, application owners) to find data sources. You will work to normalize the data regardless of source to identify and measure key metrics. You will design and create ETL that can absorb large amounts of data to create the story. You will design and automate the discovery, ingestion, and normalization of data into SQL databases and then work to present them to tell the story in Power BI.
• When there are breaks or other priorities arise, you will drive a data-driven focus on technology resiliency across the enterprise. You will be a key contributor to designing metrics, finding data points, working with engineering teams on solutions, and designing the resulting dashboards. On a day-to-day basis, you may be asked to review existing solutions to provide insights into potential areas of risk, to streamline the data gathering process, or to help find new data sources.
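As an illustration of the kind of pipeline described above (ingest backup records from heterogeneous tools, normalize them into SQL, then measure against recovery objectives), here is a minimal sketch in Python with SQLite. All tool record schemas, field names, application names, and the 24-hour recovery point objective are hypothetical assumptions for illustration, not details from this role:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical records from two backup tools, each with its own schema.
netbackup_rows = [
    {"client": "app-payments", "finished": "2024-06-01T02:10:00", "status": "OK"},
    {"client": "app-ledger", "finished": "2024-05-28T02:05:00", "status": "OK"},
]
cohesity_rows = [
    {"object_name": "app-crm", "end_time_usecs": 1717207800000000, "result": "kSuccess"},
]

def normalize_netbackup(row):
    # Map a tool-specific record onto a common (app, completed_at, success) shape.
    return (row["client"], row["finished"], row["status"] == "OK")

def normalize_cohesity(row):
    # This tool reports epoch microseconds; convert to the same ISO-8601 string.
    ts = datetime.fromtimestamp(row["end_time_usecs"] / 1e6, tz=timezone.utc)
    return (row["object_name"], ts.strftime("%Y-%m-%dT%H:%M:%S"), row["result"] == "kSuccess")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE backups (app TEXT, completed_at TEXT, success INTEGER)")
rows = [normalize_netbackup(r) for r in netbackup_rows] + \
       [normalize_cohesity(r) for r in cohesity_rows]
conn.executemany("INSERT INTO backups VALUES (?, ?, ?)", rows)

# Metric: apps whose last successful backup is older than an assumed 24h RPO,
# measured from a fixed "as of" time so the example is reproducible.
cutoff = (datetime(2024, 6, 1, 12, 0) - timedelta(hours=24)).strftime("%Y-%m-%dT%H:%M:%S")
stale = conn.execute(
    "SELECT app, MAX(completed_at) FROM backups WHERE success = 1 "
    "GROUP BY app HAVING MAX(completed_at) < ?", (cutoff,)
).fetchall()
print(stale)  # apps breaching the assumed 24h RPO
```

In practice the normalized table would feed a Power BI model rather than a print statement, but the normalize-then-query shape is the same.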

Candidate Requirements/Must Have Skills:

• You have 10+ years as a technical engineer/architect resource in a data analytics environment.
• You have 7+ years of expertise in ETL and pipeline construction, ensuring efficient data flow and integration from various sources, and maintaining high standards of data quality, timeliness, completeness, and consistency.
• You have 7+ years of experience designing data warehouse and data modeling solutions that ensure data quality.
• You have 5+ years’ experience with solution architecture and enterprise architecture, including implementing data models and knowledge of backup requirements for applications.
• You have 5+ years of working experience and demonstrated ability using most of the tools we use: Python, SQL, Power BI. Your experience includes leveraging Python for data manipulation, analysis, and automation, and Power BI for building comprehensive data models and visualizations that drive business insights.
• You have 2+ years’ experience working with backups and restores using backup solutions, including some of the following: NetBackup, Tivoli, Veritas, Cohesity, HP Solutions.

Soft Skills Required:

• Critical thinking skills: using logic and reasoning to evaluate alternative solutions, summarize conclusions and approaches to mitigate risks. Excellent analytical, problem-solving, and communication skills.
• Strong leadership skills with a proven track record of mentoring and guiding data engineering teams to achieve project goals and foster a collaborative and innovative environment.
• Excellent problem-solving skills, with the ability to analyze complex problems, design innovative solutions, and implement them effectively, contributing to all levels of the solution from development to execution and presentation.
• Experience managing multiple projects, understanding the business needs, and prioritizing tasks effectively.
• Ability to develop strong working relationships with business and technology partners globally.
• You have proven interpersonal and negotiation skills that allow you to build consensus and obtain co-operation from both support teams and senior management.
• Experience writing for and presenting to a director-and-above audience, with a demonstrated ability to synthesize quantitative and qualitative information and content for this audience.

Nice-To-Have Skills:

• Project Management, Risk Management, or ITIL experience or certifications is an asset.
• Experience in a Google Cloud environment is an asset.
• Spanish is an asset.

Education:

• Post-secondary education or an advanced degree in computer science. Education in data analytics is an asset.

Best vs. Average Candidate:

• The ideal candidate can understand complex data problems to create an architecture that would solve the problem.
• This would include architecture, onboarding data pipelines, creating effective ETL processes and visualizations of solutions, and the ability to manage a small team so the objectives can be achieved within a specific timeline.
• Has IT operations experience, especially with backups, and recent solutioning experience in data analysis/data engineering. Also has great communication skills and Python, Power BI, and ETL expertise.

Candidate Review & Interview Selection:

• 1st round – 30 minutes technical and 30 minutes soft skills – panel of 4 – an in-person interview is a must.
• 2nd round – with a VP and a senior technical resource – 30 minutes on MS Teams.

Job Details

• Job number: 12777
• Type: Contract
• Duration: 10 months
• Location: Toronto
