Story Behind the Need
• Business group: As part of the Enterprise Data Architecture team, the Data Architect translates requirements into desired end-state logical data models that are sustainable, adaptable, and aligned to business and enterprise needs. The role also conforms to and implements enterprise architecture best practices for conceptual, logical, and physical data modeling in support of the new Client enterprise data strategy, which is shifting from various source systems to Google Cloud Platform.
Candidate Requirements/Must Have Skills:
• 8+ years of experience as a data architect working with Data Quality Tools / Data Governance (with special emphasis on data implementation architecture principles and data governance guidelines) and enterprise data models, preferably with Google Cloud Platform/AWS/Azure, Hadoop, DataStage, SAS, Cloud Storage, Cloud Bigtable, Cloud SQL, Cloud Datastore, and Python (the more experience with each of these, the better)
• Experience with established and emerging data management and reporting technologies, such as columnar and NoSQL databases, predictive analytics, data visualization, and unstructured data (deep knowledge of all of these technologies is not required, but the candidate must understand them sufficiently to guide the organization in understanding and adopting them)
• Understanding of indexing, partitioning, and data design performance considerations for industry-standard DBMSs
• Familiarity with Big Data, in-memory databases, OLAP, and other emerging technologies
• 5+ years of expertise in leading, designing, developing, testing, maintaining, implementing, and documenting data architecture and data modeling (normalized, dimensional, logical, and physical) solutions for Enterprise Data Warehouses and Enterprise Data Marts
• 5+ years of experience in Data Lake / Data Warehouse implementation and transformation
• 5+ years of experience with structured and unstructured data
• 5+ years of proficiency with data modeling tools such as ER/Studio, erwin, or related tools
• 3+ years of experience in complex, large-scale data warehouse and data integration projects
• Financial services industry experience (North American)
• Open to convert to FTE
• Knowledge of Capital Markets and Risk Domain
• Knowledge of IBM Financial Services Data Model (FSDM)
• Good knowledge of Lambda architecture and patterns
• Hands-on experience deploying in GCP using a combination of BigQuery, Cloud SQL, Cloud Storage, and Cloud Dataflow
• Experience with Cloud Datalab, Cloud Dataproc, and Cloud Pub/Sub
• A certification such as Google Cloud Professional Cloud Architect, Google Professional Data Engineer, AWS Certified Solutions Architect / Big Data, or Microsoft Azure Architect
• Data management and business glossary mapping experience with data models
• Fluent in at least 2 programming languages, preferably Java and Python