Story Behind the Need
• Business group: The GWRT Compliance Technology group ensures ongoing compliance and optimization of trade platforms on a global scale.
• Project: The group is working on a Trade Surveillance Project that will leverage the bank's Data Lake for vendor applications, creating new data feeds using Kafka, Apache NiFi, and Spark.
Candidate Value Proposition
• The successful candidate will have the opportunity to use the latest Apache NiFi technology to optimize trade platforms on a global scale. The candidate will also join a flexible team that utilizes Agile best practices to ensure project deliverables continue to be met.
Typical Day in Role
• Analyze highly complex business requirements; generate technical specifications to design or redesign complex software components and applications
• Design, implement, automate and maintain large-scale enterprise data ETL processes
• Build high-performance algorithms, prototypes, predictive models and proofs of concept
• Leverage industry best practices to design, test, implement and support a solution
• Assure quality, security and compliance requirements are met for supported area
• Be flexible and thrive in an evolving environment
• Adapt to change quickly and adjust work accordingly in a positive manner
Candidate Requirements/Must Have Skills:
• 2+ years’ experience with the Python programming language
• 2–3 years’ experience with Apache Spark (recent, hands-on experience)
• 2+ years’ experience updating data lakes with S3/Hadoop
• 2+ years’ experience using Agile methodology throughout the SDLC
• Strong communication skills required to clearly articulate technical requirements with sprint teams in virtual settings
• Apache NiFi
• Elasticsearch
• Capital Markets experience in Regulatory or Derivatives technology is a plus
Degrees or certifications:
• Bachelor’s degree in a technical field such as computer science, computer engineering, or a related field is required