Master Data Management (MDM) looks at every area of an organisation, tracks and records the information for each division and stores it in one central location, i.e. a data warehouse.  MDM is the technology, tools and processes needed to create and maintain accurate master data lists, covering everything from financial information to stock levels to employee details.

When such systems are being developed there are many obstacles to clear, including changes to business processes, internal politics and data ownership; these are just some of the things that have to be addressed before you start looking at technical issues.  Data ownership is one of the biggest hurdles, as each area feels it owns its data and is unwilling to look at the bigger picture.  With an MDM system, only those who need the data have access to it.  This is one of the biggest advantages of centralised data sharing: permissions can be granted depending on the individual's role.  If you don't work directly with the data, why should you see it?  This feature is also very important for maintaining accurate data.  Editing rights can be granted to those who need them, which reduces the chances of good data being replaced with wrong data.

Even the smallest of companies can generate a large amount of data.  With each department, section or person collecting data, replicated, inaccurate and redundant data accumulates very quickly over time.  With a structured data management system in place, a business can run far more efficiently.

The following outlines just some of the benefits an MDM system can provide:

Redundant Data
This is one of the main advantages an MDM system offers: it eliminates the collection of redundant data, as everything is stored centrally.  This forces the creation of accurate and specific records.

Data Editing
A single edit eradicates inaccuracies in your data, as the change is reflected throughout the database.  Individual lists are no longer used; all data is recalled from the central database.
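The "one edit updates everything" idea can be sketched as a minimal single-source-of-truth store. This is an illustrative toy, not any particular MDM product; the class, record IDs and field names below are invented for the example.

```python
# Minimal sketch of a central "single source of truth" record store.
# Every consumer reads through the store, so one correction is
# immediately visible everywhere; no department keeps its own copy.

class MasterStore:
    def __init__(self):
        self._records = {}  # record id -> dict of fields

    def upsert(self, record_id, **fields):
        """Create the record if missing, then apply the field changes."""
        self._records.setdefault(record_id, {}).update(fields)

    def get(self, record_id):
        """Consumers always read the central copy, never a private list."""
        return self._records[record_id]

store = MasterStore()
store.upsert("cust-001", name="Acme Ltd", phone="01-555-0100")

store.upsert("cust-001", phone="01-555-0199")  # one correction...

# ...and every later read, by any department, reflects it.
print(store.get("cust-001")["phone"])  # 01-555-0199
```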

Better Analytics
This paves the way for better and more useful analytics, as you are no longer working with redundant or inaccurate data.

Data Consistency
Setting parameters means you only collect the data that is relevant to your organisation, which again speeds up processes.

Stronger Security
The data is not accessible to all; access can be granted based on each individual's role within the company.  The same applies to editing and deleting data.
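Role-based access of this kind can be sketched with a simple permission map. The roles and actions below are invented for illustration; a real MDM product would have its own scheme.

```python
# Toy role-based access control: each role maps to the set of
# actions it may perform on master data. Roles and actions here
# are illustrative only.

PERMISSIONS = {
    "viewer":  {"read"},
    "editor":  {"read", "edit"},
    "steward": {"read", "edit", "delete"},
}

def allowed(role, action):
    """Return True if the given role may perform the action."""
    return action in PERMISSIONS.get(role, set())

print(allowed("viewer", "edit"))     # False
print(allowed("steward", "delete"))  # True
```

Unknown roles fall back to an empty permission set, so access is denied by default, which is the safer choice for master data.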

Although the benefits largely outweigh the disadvantages, like any system there will be pitfalls, but these can be avoided with proper planning and research.  Here are some of the pitfalls to avoid when introducing a Master Data Management system:

Ignoring Data Governance
When choosing an MDM model to fit your company, it is essential that the system's data governance policy also fits your needs.  Being able to apply the correct controls and access to the data is essential.

Trying to do too much
Implementing an MDM system can be a seismic change for an organisation.  It addresses both the collection and management of company data and the changing of personal habits.  This will take time and careful planning; it won't happen overnight.

Collecting the wrong data
Your system is destined to fail if the information being stored is irrelevant.  The main concept behind an MDM system is that all the data is of importance to the company.  This is something that has to be agreed upon from the outset.

Cleaning the data outside the system
This spells disaster, as the data being stored is not up to date.  If your data is inconsistent, users lose faith in it and bad habits begin: personal lists grow and the business suffers.  It should always be the case that one change updates all, once it has been validated of course.

To me, one central system is an essential part of any company, as it can only be of benefit to all.






What is big data?

What exactly is big data?  Is this just the next "BIG" buzzword that will follow the same path as Y2K, or is it here to stay?

Big data is exactly what it says: large amounts of stored raw data.  Studies suggest that the data generated every two days equals the total amount of data created up to the year 2003, and that over 90% of all the data in the world was generated in the past two years.  So where does all this data go, who has it and what are they doing with it?  Here is a quick guide to how we generate so much data:

Every minute:

  1. We send 204 million emails
  2. Generate 1.8 million Facebook likes
  3. Send 278,000 tweets
  4. Upload 200,000 photos to Facebook

And this is only a drop in the big data ocean.

So with all this information floating around how is it being used?

Big data is being used every day across all industries, from retail to farming to health, and within government bodies.

In industry, big data helps businesses understand and target customers.  Yes, those little fobs your local Tesco, SuperValu or petrol station gives you are tracking everything you do within their store.  Was it a coincidence that you received the coupon for your favourite shampoo?

With big data improving our shopping experiences, improving health systems and aiding the fight against cancer and other serious illnesses, it has now turned its attention to the fight against crime and joined the Los Angeles Police Department; and no, it does not come in the form of Tom Cruise or Colin Farrell.  After all, police departments around the world have their customers too, and they need targeting.


Professor Jeff Brantingham and a team at UCLA studied over 13 million recorded crimes spanning 80 years and applied an algorithm used to predict the likelihood of aftershocks from earthquakes.  The original algorithm, which looks at the probability that aftershocks occur close by in space and time, was developed by Assistant Professor George Mohler.  Applying the theory that aftershocks happen in close proximity, the team took the same approach when looking at crime and human behaviour, using the data to see if there was any relation.  Strangely enough, the patterns were similar.  Although this couldn't prevent any of those crimes, as they had already happened, the team decided to build on the algorithm and use live data to see if it could predict potential crime hot spots.  With some tweaking, and by joining forces with the company PredPol, they can now predict where crime is likely to happen on a given day.
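The aftershock idea can be illustrated with a toy self-exciting intensity: each past event temporarily raises the expected rate of new events soon after it, and the boost decays over time. This is a heavily simplified sketch with made-up parameters, not PredPol's or Mohler's actual model.

```python
import math

# Toy self-exciting ("aftershock"-style) event rate: a constant
# background rate plus an exponentially decaying boost contributed
# by each past event. All parameter values are illustrative only.

MU = 0.5     # background event rate (events per day, assumed)
ALPHA = 0.8  # boost each past event contributes (assumed)
BETA = 1.2   # how fast the boost decays (per day, assumed)

def intensity(t, past_events):
    """Expected event rate at time t, given past event times."""
    boost = sum(ALPHA * math.exp(-BETA * (t - ti))
                for ti in past_events if ti < t)
    return MU + boost

events = [1.0, 1.5, 1.7]  # a recent cluster of events (in days)

# Just after the cluster the predicted rate is elevated; long
# afterwards it decays back towards the background rate.
print(intensity(2.0, events) > intensity(10.0, events))  # True
```

This is the intuition behind "predicting hot spots": places and times close to recent crimes get a temporarily higher predicted rate, which is where patrols are directed.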

The software breaks patrols into 12-hour shifts covering 500-foot-square geographical boxes where it predicts criminal activity might occur.  With what was effectively a new commanding officer, it took time for patrol officers to warm to their new partner; after all, who knows crime best, a computer or a police officer?  But attitudes changed as the city saw a 33 percent drop in burglaries, a 21 percent drop in violent crimes and a 12 percent decrease in crimes against property.  On Thursday, 13 February 2014, the LAPD's Foothill area recorded zero crime activity over a 24-hour period, "a day without (recorded) crime", the first in its fifty-year history.

So Big Data is making the world a healthier, safer and better place to shop.

BBC documentary on LAPD Big Data.