Technical Lead-Master Data Management

Accenture


Date: 1 week ago
City: Calgary, Alberta
Contract type: Full time

We Are:

Accenture Data & AI, the people who love using data to tell a story. We're also the world's largest team of data scientists and experts in machine learning and AI. A great day for us? Solving big problems using the latest tech, serious brain power, and deep knowledge of just about every industry. We believe a mix of data, analytics, automation, and responsible AI can do almost anything: spark digital transformations, widen the range of what humans can do, and breathe life into smart products and services. Want to join our crew of sharp analytical minds?

You are:
A hands-on Consultant who knows how to develop, design, and maintain technologies that work for our clients. You are a next-generation thinker when it comes to what we call Data in the New. Your superpower? Using the latest data and analytics technologies to help clients glean the most value from their data. You know how to come up with strategies, solutions, and processes for managing enterprise-wide data throughout the data life cycle. You're a digital native who's right at home when the challenge calls for teamwork and originality.

What You’ll Do:

  • Design MDM solutions with Informatica tools and utilities, such as SaaS data integration and data quality components (e.g., IICS, CDQ)
  • Define Master Data Management strategies; design MDM services and capabilities, Metadata Management, Data Governance, and Data Quality solutions (e.g., IICS, Informatica CDQ, Informatica CDGC, Collibra)
  • Perform MDM hub data modeling and data mapping, data validation, and match-and-merge rule configuration; build and customize services with SaaS MDM tools such as Informatica C360/B360 and Reltio, and with enterprise data integration technologies and methods such as ETL, data replication, data quality validation, and golden record publication.
  • Perform enterprise-level data analysis and integration work and/or deliver data-focused systems integration solutions that demonstrate business acumen and the ability to apply technology to solve business problems.
  • Support installation and perform administration functions for industry-leading MDM architectures/platforms, including custom objects. Own technical deliverables, including the codebase.
  • Support development of data models and define MDM rules for the best matching algorithm.


What You Need:

  • A minimum of 4-5 years in a role that captured the current state of systems through processes such as data discovery, profiling, and inventories.

  • A minimum of 2 years defining the data architecture for Master Data Management, with expert-level knowledge of Informatica Intelligent Data Management Cloud, including:

    • Data Layers (e.g. raw, curated)

    • Integration design (e.g. APIs), ETL (Extract, Transform & Load), velocity (e.g. streaming)

    • Informatica, SnapLogic, Boomi, etc.

    • Data architecture design patterns

  • A minimum of 4 years' experience defining the data architecture for cloud-native data management, with a focus on Data Definitions & Metadata, including:

    • Reference Data Management

    • Master Data Management (core & critical business data entities)

    • Data Lineage

  • A minimum of 4 years' experience in Data Storage Tools & Techniques, such as:

    • Data Archiving Patterns & Techniques

    • Cloud Backup Architecture and Design

    • Data Warehouse Tools

    • Data Conversion & Migration

    • Master Data Management

    • Metadata Management

    • Data Management and Integration

    • Systems Development Lifecycle (SDLC)

    • Data Modeling Techniques and Methodologies

    • Database Management

    • Database Technical Design and Build

    • Extract Transform & Load (ETL) Tools

    • Cloud Data Architecture

    • Data Architecture Principles

    • SAS DataFlux ETL Tools

    • Online Analytical Processing (OLAP)

    • Data Processes

    • Data Architecture Estimation

Bonus Points:

  • Expertise using Hadoop, MapReduce, Spark, Pig, and Hive; data ingestion into data lakes; data storage on key cloud providers (AWS, Azure, GCP) and Redshift; data retrieval on big data platforms; and interfacing data science and data visualization tools with big data platforms

  • Experience in Data Modeling, Big Data Platforms (e.g. Cloudera, Hortonworks, AWS, Talend), Data Migration and Quality (using ETL tools such as Informatica), MySQL/NoSQL, and CQRS/Event Sourcing

  • Experience working as a consultant or directly with clients in a role that took ownership of an organization's data assets to provide users with high-quality data that is accessible in a consistent manner.
