Feedback on the implementation of Master Data Management: tool selection phase

Following the preparatory phase described in our previous article, we tackled the crucial stage of MDM tool selection. This phase proved particularly instructive, bringing to light some surprising market realities.

Selection methodology

We opted for a funnel-shaped approach, structured in four distinct phases:

  1. Pre-selection and initial contact
  2. Functional assessment using a requirements grid
  3. Quick demo with a small team
  4. In-depth demonstration with all stakeholders

This approach enabled us to progressively filter out solutions while optimizing the time invested by our teams.

Phase 1: Pre-selection and initial contacts

Identifying the players

Our first step was to identify all the key players in the market, mainly via:

  • Gartner reports
  • In-depth online research
  • Peer recommendations

Among the vendors contacted were Informatica, Semarchy, Tibco, Stibo, Ataccama, Blueway, Pimcore, Quable and 3C-e/Nextpage.

Surprises and initial lessons

This initial phase brought several surprises:

  1. Segmentation by company size
    Some of the market’s major players declined our request, considering our organization (€200m in revenue) too small. Their target: companies with revenue of at least €500 million, or even €1 billion.

  2. The language barrier
    A significant number of tools offered neither an interface nor project management in French. In our context, an educational institution where many business users were not comfortable with English, this was a deal-breaker. It was unthinkable to impose an English-language interface on users whose working habits we were already about to disrupt, and the same applied to project management.

Phase 2: The functional requirements grid

This phase proved less discriminating than expected. Our functional requirements, though precise, remained relatively standard in the MDM world. However, we did notice an interesting phenomenon: some solutions, even though positioned as market leaders, lacked the basic functionalities we took for granted.
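
As an aside for readers building a similar grid, here is a minimal sketch of how such a weighted requirements grid can be tallied. The criteria, weights, vendor names and scores below are purely illustrative and are not taken from our own grid.

```python
# Minimal sketch of a weighted requirements grid (criteria, weights and scores are illustrative).
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    weight: int    # relative importance of the requirement
    scores: dict   # vendor name -> score from 0 (absent) to 3 (fully covered)

criteria = [
    Criterion("Data model flexibility", 3, {"Vendor A": 3, "Vendor B": 2, "Vendor C": 3}),
    Criterion("Validation workflows",   2, {"Vendor A": 2, "Vendor B": 3, "Vendor C": 2}),
    Criterion("French user interface",  3, {"Vendor A": 3, "Vendor B": 0, "Vendor C": 3}),
]

def weighted_total(vendor: str) -> int:
    """Sum of weight * score for one vendor across all criteria."""
    return sum(c.weight * c.scores[vendor] for c in criteria)

for vendor in ("Vendor A", "Vendor B", "Vendor C"):
    print(vendor, weighted_total(vendor))
```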

Phase 3: Preliminary demonstrations

This stage of short demonstrations (30 minutes each) with a small team from the IT department revealed some situations that were surprising, to say the least:

  1. Fake MDMs
    Some tools presented as MDM solutions turned out to be frameworks for custom development. In other words, each project required near-complete bespoke development, an approach at odds with our search for a proven, packaged solution.

  2. Refusal to demonstrate
    Even more astonishingly, some vendors categorically refused any demonstration other than a paid POC, a commercial stance that is hard to understand in a tool selection process.

Phase 4: In-depth demonstrations

For this final phase, we selected three vendors, each of which took part in a two-hour demonstration session. These sessions were structured around:

  • Specific use cases, prepared by us
  • Test scenarios specific to our context
  • An assessment involving all stakeholders (technical and business)

Key lessons

  1. Functional equivalence
    On paper, the finalist solutions offered relatively similar functional coverage. The differences were mainly in terms of user experience.

  2. The price factor
    In the end, the financial criterion proved decisive, with spectacular price differences between the finalist solutions: a factor of three separated the cheapest from the most expensive. Even more surprisingly, the pricing models themselves varied considerably:

    • Some offered fixed prices
    • Others opted for variable pricing based on data volume

    These differences in business models greatly influenced our final decision, as the sketch below illustrates.
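
To make the difference between these models concrete, here is a small illustrative calculation in Python. Every figure (fees, record counts, growth rate) is hypothetical and does not reflect the actual offers we received.

```python
# Hypothetical comparison of a flat annual licence versus volume-based pricing.
# All figures are illustrative, not the quotes we actually received.

def fixed_cost(annual_fee: float, years: int) -> float:
    """Total cost of a flat annual licence over the evaluation period."""
    return annual_fee * years

def volume_cost(fee_per_1k_records: float, records: int, yearly_growth: float, years: int) -> float:
    """Total cost when the fee scales with the number of master records,
    assuming the record count grows by a fixed rate each year."""
    total = 0.0
    for _ in range(years):
        total += fee_per_1k_records * records / 1000
        records = int(records * (1 + yearly_growth))
    return total

# Example: 500,000 master records growing 20% per year, evaluated over 5 years.
print(fixed_cost(annual_fee=40_000, years=5))                                            # 200000.0
print(volume_cost(fee_per_1k_records=90, records=500_000, yearly_growth=0.20, years=5))  # 334872.0
```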

The chosen solution

At the end of this in-depth selection process, we chose the Maps System solution. This decision was motivated by:

  • Comprehensive functional coverage
  • A user interface adapted to our context
  • A pricing model consistent with our budget constraints
  • Proven ability to provide support

Conclusion and next steps

This selection process revealed a market that is more complex and varied than we had expected. Between players positioned on very specific segments, those who confuse MDM with custom development, and significant price differences, selecting an MDM solution requires a rigorous methodology and a good understanding of what is at stake.

In our next article, we’ll look at the project phase itself: how we deployed the chosen solution and the initial lessons learned from this implementation.

To be continued in the next article: “The project phase: implementation challenges and successes”.
