Tag Archives: data management

Three Data And Analytics Considerations Every Organisation Should Make In The Pursuit Of Digital Transformation

The tools exist – and are affordable – to utilise Procurement data throughout the organisation for actionable intelligence. So how do you make that transformation? IBM Procurement’s Laura Beth Hirt-Sharpe writes the definitive guide to clearing the myriad hurdles.


A Procurement organisation’s success relies on transformation from standard spend visualisation tools to a comprehensive strategy to monitor, maintain and utilise Procurement data throughout the organisation. With the advent of inexpensive, efficient and reliable data collection and curation capabilities, many Procurement executives have the opportunity to create actionable intelligence from their data. Though a myriad of tools, methods and services are available to support this work, a significant hurdle remains: leaders must determine the best tools and services, and curate an appropriate data strategy and data-driven culture, to drive the change necessary to remain competitive.  All this while cutting costs and reducing complexity.

As leaders embark on their Procurement analytics transformation, they face three major considerations: data and data governance strategy, data cleansing and curation, and skill gaps in core analytics and data science skills. In this blog, I will provide suggestions for each consideration based on my experience with global clients at various levels of maturity.

1. Data and data governance strategy 

Many Procurement organisations begin their digital transformation by thinking that data strategy and technology strategy are one and the same, when in reality these are two distinct but interdependent pillars. A best-in-class approach to data strategy is to begin with the outcomes you are looking to drive from your Procurement data.  These outcomes can range from traditional spend analytics and risk and compliance monitoring to AI-based trending of key metric behaviour within your environment, with many more in between. Once you have a clear view of the outcomes you want to drive, begin thinking through important questions like: 

·   What data needs to be captured and what level of structure is required within those elements? 

·   Is this data captured today, and if so, how?

·   What data gaps are present against target outcomes?

·   Does reasonably consistent master data exist across various source systems?

·   How can data completeness, accuracy, and meaningfulness be assured over time?

·   What is the best way to collect and curate data over time?  (This is not a “one and done” event!)

·   How can Procurement processes be optimised to ensure efficient and effective data capture?

These types of questions will help shape your data and data governance strategy. It is important to understand that there will always be a trade-off between speed of execution and granularity of data capture. Finding the right balance is key, and ensuring you have the right technology and innovation partners in place is crucial to optimising this balance. 
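
To make the last few questions concrete – particularly how completeness and accuracy can be assured over time – here is a minimal sketch of automated data-quality checks that could run on every refresh. The column names (supplier_name, amount, invoice_date) are assumptions for illustration, not a prescription:

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Basic completeness and validity checks on a spend extract.

    Column names are illustrative -- substitute whatever your
    source systems actually provide.
    """
    return {
        # Completeness: share of rows missing a supplier name
        "missing_supplier_share": float(df["supplier_name"].isna().mean()),
        # Validity: count of zero or negative invoice amounts
        "non_positive_amounts": int((df["amount"] <= 0).sum()),
        # Timeliness: days since the most recent transaction landed
        "days_since_last_load": (
            pd.Timestamp.today() - pd.to_datetime(df["invoice_date"]).max()
        ).days,
    }

spend = pd.DataFrame({
    "supplier_name": ["IBM", None, "Acme Ltd"],
    "amount": [25000.0, 1200.0, -50.0],
    "invoice_date": ["2024-01-15", "2024-02-03", "2024-02-20"],
})
print(run_quality_checks(spend))
```

Scheduled as a recurring job rather than a one-off, checks like these turn data quality from a “one and done” event into the ongoing curation the strategy demands.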

2. Data cleansing and curation 

There are two primary factors to consider with regard to data cleansing and curation: determining who from your organisation should be involved and maintaining value drivers in your dataset. 

Who should be involved?

Procurement data teams within an organisation typically lean toward one of two strengths: data science or Procurement. Some organisations focus on pulling data experts from other parts of the organisation into Procurement to help curate an accurate merge of their datasets into a “source of truth” dataset. However, through this method, Procurement subject matter experts (SMEs) have a limited stake in the data cleansing activities.  Procurement knowledge is essential to the tasks that rapidly increase the data return on investment, such as supplier name normalisation and logic flagging.  If those knowledge assets are not brought into the process early, the path to monetisation will be slow and spotty. 

Alternatively, some organisations choose to assemble a team of Procurement professionals who can educate themselves on data techniques and procedures and curate the source of truth data. For these organisations, technical issues and lack of repeatability of process steps mean the source of truth dataset will require a similar pruning process again in the future. This approach also has drawbacks in that data architecture is best left to data professionals – especially for data that will be used by AI and cognitive algorithms.  Merging Procurement SME talent with data design in a Procurement environment is tricky. Couple that with the reality that top data and Procurement talent have “day jobs”, and this investment in talent becomes critical, complex and expensive. 

What are the key value drivers?

Organisations that pull their data into a central repository and want to utilise it to its fullest should maintain two value drivers within their dataset: 1) Procurement-specific categorisation and 2) knowledge-infusion based on outside information. 

Cleansing data to support a Procurement taxonomy cannot rely solely on a set of off-the-shelf tools built for classification of natural language – sentences and paragraphs – but will need to be curated for terms and phrases specific to Procurement’s categories. Furthermore, high-accuracy categorisation of spend data hinges on multiple fields such as supplier name, GL classification and rich line-item text fields. 

Utilising these Procurement-specific fields in classification requires more advanced algorithms to decide between potentially disagreeing field content.  To further complicate categorisation and curation, data experts are regularly tasked with merging non-structured information into the source of truth dataset – for example, diversity supplier type, occupancy and building information, and market intelligence purchased from third-party providers. This work requires technical knowledge and industry acumen to execute, along with regular refreshes of data and terminology.  It also demands an in-depth knowledge of the source of truth dataset and of supporting datasets, which may be unstructured, and these fields must be updated and verified with Procurement stakeholders. Categorisation work and additional field inclusion require a significant investment by Procurement organisations to create and maintain.
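
To picture what “deciding between potentially disagreeing field content” might look like, here is a minimal sketch of a confidence-weighted vote across three illustrative signals. The rules, weights and categories are invented for illustration; production systems typically use trained models rather than hand-written tables:

```python
# Illustrative rule tables: each signal votes for a category with a confidence.
SUPPLIER_RULES = {"ibm": ("IT Services", 0.9)}
GL_RULES = {"6400": ("Cleaning", 0.6)}          # finance's GL mapping
TEXT_RULES = {"licence": ("Software", 0.8),
              "mop": ("Cleaning", 0.8)}

def categorise(supplier: str, gl_code: str, line_text: str) -> str:
    votes = []
    supplier_key = supplier.lower()
    if supplier_key in SUPPLIER_RULES:
        votes.append(SUPPLIER_RULES[supplier_key])
    if gl_code in GL_RULES:
        votes.append(GL_RULES[gl_code])
    for keyword, vote in TEXT_RULES.items():
        if keyword in line_text.lower():
            votes.append(vote)
    if not votes:
        return "Unclassified"
    # Disagreement is resolved in favour of the highest-confidence signal
    return max(votes, key=lambda v: v[1])[0]

# Supplier name (IT Services, 0.9) outvotes the GL code (Cleaning, 0.6)
print(categorise("IBM", "6400", "quarterly support licence"))
```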

Determining the right team and the value drivers within Procurement-specific data is a task that takes dedicated individuals, time and effort. However, the scale of this effort, and the forethought behind it, will determine the return on data initiatives.

3. Core analytics and data science skills 

A pervasive issue I see with organisations that hire data scientists from top schools at high salaries is that those hires struggle to extract value from the data that already exists in the organisation’s systems, because they lack Procurement acumen.

Another common issue is that an organisation cannot afford incremental budget for the aforementioned data resources, and therefore leans on its existing Procurement and IT staff to monitor, maintain and report using spreadsheets and visualisation tools.

Cross-collaboration

Both approaches leave a significant amount of value unrealised for the business. Instead, I recommend cross-collaboration across the organisation, designating analytics champions and emphasising grassroots training.  Without these, the value of your data will remain untapped and will require a significant amount of future investment to digitally transform your business.

A successful data-focused organisation is one that is fully integrated within your Procurement function. The data team cannot be a siloed organisation, building point solutions for the loudest stakeholder’s pain point. There needs to be an agile approach to daily activities, with a robust backlog and tasks prioritised for highest return to the business.

Analytics champions

Analytics champions are an important, yet often overlooked, position. “Data translators” is another name for this role: organisations need to treat data as another language, with fluent speakers of its database and statistics “dialects”. 

For example: if an executive has a short turnaround project that is important for continuing operations, they need to meet with their function’s analytics champion before they meet with the data team. The intent of this role and meeting is to vet, assess and format answers to the rudimentary questions that often derail otherwise productive data initiatives. Potential topics to cover include data availability, awareness of the project backlog, agreement on fair timelines and investment, and blockers. 

Organisational growth

Analytics champions need to be cultivated internally first as functional experts, and grow as the organisation evolves. There are positives to hiring over training but, as discussed earlier, without the proper functional understanding and structure in place you will likely see a lack of results. 

Your current functional team knows your business, processes, industry, and supply base best, so enable them to make decisions and give structured guidance to the data experts, even if a data translator is required.

Meaningful transformation through modern Procurement

Analytics is at the forefront of high-impact Procurement organisations as a trusted business advisor, a supplier relationship reference source, and the foundation of effective compliance management. Through analytics, modern Procurement can be predictive in its actions and trusted throughout the broader business. To produce the granular-level data required for actionable intelligence, source data has to expand beyond basic accounts payable and purchase order elements.  New sources of information, such as demand, consumption and compliance data from a variety of internal and external sources, must be linked. This process appears daunting, but we have seen meaningful transformation happen through small, structured, prioritised steps with data as the foundation.

To meet this complex need, Procurement Business Process Outsourcing services are innovating through AI-based technology infused with an influx of new and re-purposed Procurement talent skilled in data science, mathematics, statistics and computer science. Ensuring the correct mix of skilled data resources and Procurement experts has proven to be an expensive challenge for CPOs, and an opportunity for market-leading specialists such as IBM Procurement Services. These services help Procurement organisations meet their analytics demands while empowering their sourcing practitioners to focus on taking action based on the analytically discovered opportunities. Incorporating knowledge built across clients and industries, these services allow Procurement to adjust focus around high-yield data and statistically verified opportunities.

The Dangers Of Dirty Data

Is your organisation working with ‘dirty data’? How would you know? And, what impact is it having? This article has everything you need to know about doing a quick spot check, spotting procurement problems, identifying savings, and more importantly, making sure your data has its COAT on.


We all think we know what dirty data is, but it can mean very different things depending on who you speak to.  At its most basic level, dirty data is anything incorrect.  Within procurement specifically, it could be misspelled vendors, incorrect invoice descriptions, missing product codes, lack of standard units of measure (e.g. ltr, L, litres), currency issues, duplicate invoices or incorrect/partially classified data.
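
As a quick spot check for two of those symptoms – duplicate invoices and inconsistent units of measure – a few lines of analysis go a long way. This is a minimal sketch; the column names are assumptions for illustration:

```python
import pandas as pd

invoices = pd.DataFrame({
    "invoice_no": ["INV-001", "INV-001", "INV-002", "INV-003"],
    "supplier":   ["Acme",    "Acme",    "Acme",    "CleanCo"],
    "uom":        ["ltr",     "ltr",     "L",       "litres"],
    "amount":     [500.0,     500.0,     750.0,     120.0],
})

# Potential duplicate invoices: same number, supplier and amount
duplicates = invoices[
    invoices.duplicated(subset=["invoice_no", "supplier", "amount"], keep=False)
]
print(duplicates)

# Unit-of-measure variants that should be one standard (ltr, L, litres)
print(invoices["uom"].str.lower().value_counts())
```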

Dirty data can affect the whole organisation, and we all have an impact on, and responsibility for, the data we work with.  Accurate data should be everyone’s responsibility, but currently across many organisations data is the sole responsibility of one person or department, and everyone trusts them to make sure the data is accurate.

But they tend to be specialists in data, analytics and coding, not procurement.  They don’t have the experience to know when a hotel should be classified as accommodation or as venue hire, or what direct, indirect or tail spend is, and its importance or priority.

How many times have you been working with a data set and noticed a small error but not said anything, or just manually corrected something from an automated report to get it out the door on time?  It feels like too much of an inconvenience to find the right person to notify, so you just correct the error each time yourself, or you raise a ticket for the issue but never get round to checking if it’s resolved. 

These small errors that you think aren’t that important can filter all the way up to the top of an organisation through reports and dashboards where critical decisions are being made.  It happens almost every day.

How does this affect my organisation?

There are many ways, but one of the most widespread and noticeable impacts is around reporting and analytics.  If you’re in senior management, you will most likely receive a dashboard from your team that you could be using to review cost savings, supplier negotiations, rationalisation, forecasting or budgets.

What if within that dashboard was £25k of cleaning spend under IBM?  I can already hear you saying “that’s ridiculous” – well, it is obvious when pointed out, but I have seen with my own eyes IBM classified as cleaning.  It can happen easily and occurs more frequently than you might think.

Back to that dashboard you are using to make decisions: you’ll see increased spend in your cleaning category and a decrease in your IT spend, which could affect discounts with your supplier, your forecast for the year, monitoring of contract compliance and so on.  It could even affect inventory reporting – it appears you need more laptops, and unnecessary purchases are made. 

When there are tens or hundreds of thousands of rows of data, errors will occur multiple times across many suppliers.  And for the wider organisation, this could affect demand planning, sales, marketing and financial decisions.

And then there are technology implementations.  Rarely is data preparation considered before the implementation of any new software or systems.  There can even be an assumption that the software supplier will do this, which may not be the case – and if they do provide that service, it might not be good enough.

It can be very far into the process of implementation before this is uncovered, by which time staff have lost faith in using the software, are disengaged, claim it doesn’t work, or they don’t trust it because “it’s wrong”.  

At this point, it either costs a lot of money to fix and you have to hope staff will engage again, or the project is abandoned.  In either case, this can take months and cost thousands, if not millions, of pounds/euros/dollars in abandoned software or reparation work.

You might also be considering using, or engaging with, a third-party supplier that uses AI, machine learning or some form of automation.  I can’t emphasise enough the importance of cleansing and preparing your data before using any of these tools. 

Think back to the IBM example: each quarter the data is refreshed automatically with the cleaning classification, so that £25k becomes £50k, then £75k the following quarter.  It’s only when the value becomes significant that someone notices the issue.  By this stage, how many decisions have been based on this incorrect information?

How can this be resolved?

Truthfully, it’s with a lot of hard work.  There’s no magic bullet or miracle solution out there to improve the accuracy of your data: you have to use your team or an experienced professional to get the job done. Get your team to familiarise themselves with the data. If they are reviewing and maintaining it regularly they will soon be able to spot errors in the data quickly and efficiently.

If you think about data accuracy in terms of COAT, this will help to manage your data.

It should always be Consistent – everyone working to the same standards; Organised – categorised properly; and Accurate – correct.  Only when you have these things will it also be Trustworthy – you wouldn’t drive around in a car without a regular inspection, would you?

How to spot procurement problems and identify savings

Accurate data is important, but in its raw state, it’s not the whole story.  As a procurement professional you’re tasked with ensuring the best prices for products or services, as well as ensuring contract compliance on those prices, along with cost reductions and monitoring any maverick spend … to name but a few!

Accurate data alone will not help achieve this, so I strongly recommend supplier normalisation and spend data classification to help quickly and efficiently manage spend and suppliers, monitor pricing and spot any potential misuse of budgets.

How do I get started?

With a spreadsheet of spend transactions over a period of time such as 12 to 24 months, the first step should be Supplier Normalisation, where a new column is added to consolidate several versions of the same company to get a true picture of spend with that one supplier.  For example, I.B.M, IBM Ltd, I.B.M. would all be normalised to IBM.
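
A minimal sketch of that normalisation column is shown below, assuming a simple spreadsheet-style table. The punctuation-stripping rule and the lookup table are illustrative; real projects maintain a much larger, manually verified mapping:

```python
import re
import pandas as pd

CANONICAL = {"ibm": "IBM"}                     # verified name mappings
LEGAL_SUFFIXES = r"\b(ltd|limited|inc|plc|llc)\b"

def normalise(raw: str) -> str:
    key = re.sub(r"[^a-z0-9 ]", "", raw.lower())   # 'I.B.M.' -> 'ibm'
    key = re.sub(LEGAL_SUFFIXES, "", key)          # 'ibm ltd' -> 'ibm '
    key = re.sub(r"\s+", " ", key).strip()
    return CANONICAL.get(key, raw)                 # unknown names pass through

spend = pd.DataFrame({"supplier": ["I.B.M", "IBM Ltd", "I.B.M.", "Acme Ltd"]})
spend["supplier_normalised"] = spend["supplier"].map(normalise)
print(spend)
```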

Data can be classified using minimal information, such as Supplier Name, Invoice/PO line description and value. To get more from the data, other factors can then be added in, such as unit price. Where unit price information is not available, it can be derived by dividing the overall value by the quantity.

A suitable taxonomy will then need to be found to classify the data.  It can be an off-the-shelf product such as ProClass, UNSPSC or PROC-HE, or a taxonomy can be customised so it’s specific to your organisation or industry.

This initial stage may take months if you are working with large volumes of data. It might be worth considering outsourcing this initial task to professionals experienced in this area, who will be able to complete the project in a shorter time, with greater accuracy.

Avoiding common pitfalls

There are a number of ways to classify the data. To get started, look for keywords in the Supplier Name and then the Description column.  The description of services could include ‘hotel’, ‘taxi’, ‘cleaning services’, ‘cleaning products’, etc. However, it’s important to carefully check the descriptions before classifying, or errors could be introduced.  A classic example is “taxi from hotel to restaurant”: depending on which keyword you search for first, it could end up misclassified as transport or as venue costs.

I wouldn’t advise classifying row by row, as it could take more than twice as long to complete the file using this method.  Start with keywords, followed by the highest-value suppliers, which you can get from a pivot table of the data if you’re working in Excel.
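
Here is a minimal sketch of the keyword approach, including why the order of the rules matters for the “taxi from hotel to restaurant” case. The keyword-to-category rules are invented for illustration:

```python
# Rules are checked in order, so the more specific signal comes first.
RULES = [
    ("taxi", "Travel - Transport"),
    ("cleaning", "Facilities - Cleaning"),
    ("hotel", "Travel - Accommodation"),
]

def classify(description: str) -> str:
    text = description.lower()
    for keyword, category in RULES:
        if keyword in text:
            return category
    return "Unclassified"

# 'taxi' is matched before 'hotel', so this lands in Transport, not Accommodation
print(classify("Taxi from hotel to restaurant"))
```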

Identifying opportunities

Once classified, charts can be built to analyse the data.  The analysis could include, ‘top 80% of suppliers by spend’; ‘number of suppliers by category’; ‘unit price by product by month’;  ‘spend by category’; or ‘spend by month.’

Patterns should start to emerge which could reveal unusually high or low spend in a category, irregular pricing, higher than expected use of services, or a higher than expected number of suppliers within a category. 
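
As an example, the ‘top 80% of suppliers by spend’ analysis is a short calculation once the data is normalised and classified. This sketch assumes a simple table of normalised supplier names and invoice values:

```python
import pandas as pd

spend = pd.DataFrame({
    "supplier": ["IBM", "Acme", "CleanCo", "IBM", "TaxiCo"],
    "amount":   [40000, 25000, 9000, 20000, 6000],
})

by_supplier = spend.groupby("supplier")["amount"].sum().sort_values(ascending=False)
cumulative_share = by_supplier.cumsum() / by_supplier.sum()

# Keep every supplier needed to reach 80% of total spend,
# including the one that crosses the threshold
top_80 = by_supplier[cumulative_share.shift(fill_value=0) < 0.80]
print(top_80)
```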

Why you should strive for data accuracy and classification

Data accuracy is an investment, not a cost.  Address the issues at the beginning: while it might seem like a costly exercise, you will undoubtedly spend less than if you have to resolve an issue further down the line with a time-consuming and costly data clean-up operation.  And by involving the whole team or organisation, it will be much easier to manage and maintain the most accurate data possible.

Spend data classification shows you the whole picture, as long as it’s accurate.  You can get a true view of your spend, allowing improved cost savings, better contract compliance and possibly the most important – preventing costly mistakes before they happen.

So, does your data have its COAT on? What does ‘dirty data’ mean to you? Let me know below!

Susan Walsh is the founder of The Classification Guru, a specialist in spend data classification, supplier normalisation and taxonomies.  You can contact her at [email protected] https://www.procurious.com/professionals/susan-walsh

Why Knowing your Market Can be the Key to Success

Are you dooming yourself to failure in procurement by not knowing your market before you start? Market research and analysis is a key component of the procurement process – but it needs to be done right.


When Martin Luther King Junior stood on the steps of the Lincoln Memorial in Washington, D.C., in front of 250,000 civil rights supporters, he knew his audience. He knew that the people he was addressing supported his cause and agreed with his words. The speech was a success and helped pave the way for the Civil Rights Act proposed by President John F. Kennedy.

This is not intended to be a crass use of what is one of the finest speeches in global history, but an example of how success can be tied to knowing how an audience will react to words, proposals and actions. 

Conversely, not taking the time to understand the audience or the market can lead to painful rejection (though in fairness, sometimes the failure to understand the market lies on the other side of the table). Steve Jobs and Steve Wozniak were rejected by Atari and HP when they presented the concept of the personal computer. Perhaps just as famously, record label Decca rejected The Beatles, stating that “guitar groups were on their way out”. 

Both of these cases, and many more, are prime examples of organisations not understanding their market and ending up without that all-important ‘win’ in the column.

Criticality of Analysis 

You’re probably wondering how this relates to public procurement. The examples here show how critical it is to know your audience and market, and that hard work needs to be put in up front to provide the foundation for success. 

Take a look back at your own career in procurement. How many times have you gone to market on the back of flawed or non-existent market research and analysis? When you have laid your hands on the final draft of a specification, did you always trust that the input was from a good cross-section of the market? 

You may think you lack the time or resources to carry out market analysis as part of your tender process, but the business case for doing it well is there for all to see. Market research can be critical in ensuring that the goods, services or works being procured meet the needs of the taxpayers, at a cost that is acceptable and provides best value. 

Public sector organisations can use market research and analysis to get a greater understanding of their customers (usually the end-users of the services), to analyse the market and the competition in a particular area, and then to test before launching services.

Informed Decision Making

The same applies in procurement, just from a different angle. Procurement gets to understand the supply market, its competitiveness, how mature it is and the key suppliers, some of whom may already be supplying to the public sector.

It creates a level of informed decision-making, rather than approaching every tender in the same way. As it’s put in the Procurement Journey, analysis of key trends and market dynamics and how the goods or services in question sit within this can help to shape a specification, tender and route to market. 

It can also help procurement to understand the role of SMEs in the market and how a tender could be better set out to increase SME involvement. Market analysis can even be used to understand how commercial models can be set up, and what Community Benefits suppliers would or could offer as part of tender submissions. 

Market Research Favourites

There are a variety of methods available to procurement too, some of which are desktop-based, while others require direct interaction with the market itself.

A few of the most common are listed below:

  • Prior Information Notice (PIN) – The PIN can be used to gather information on almost any aspect of a tender and allows procurement to understand and gauge the interest in the supply market. The added benefit is that, depending on the type of PIN used, they can also be used as a call for competition and reduce procurement timescales. 
  • Soft Market Engagement – This doesn’t have the formality of a PIN, but can be just as useful. It can be done via email or phone calls and is particularly useful if there is a smaller, known supply market, and the engagement is being done to test the water on a specification or aspect of the Technical or Commercial Evaluation.
  • SWOT, PESTLE, Kraljic – Old favourites for anyone who has ever done courses in procurement! These can provide a picture of the suppliers (SWOT), market conditions (PESTLE) and product category (Kraljic), better informing decision-making and strategy.
  • Applied Analytics – The likes of Dun & Bradstreet and Spikes Cavell provide information on the supply market, from spend analytics to market analysis. All of this data is presented in a usable form, saving procurement from having to carry this out themselves. 

Paralysis by Analysis 

While market analysis is a critical part of the procurement process, it’s important to remember that it’s only one part of a much wider whole. Perfection is the enemy of progress – striving to capture all the information possible, to speak to every supplier and put this all together can lead to stagnation in the process and actively hinder decision making. 

Avoid decision-making by committee at all costs and decide where you, as procurement, will draw a line under the analysis and move to the next stage. Mark this out at the start of the process and stick to the timelines. After all, you don’t want to spend so long analysing the market that you never actually go to market. 

Know your audience, pick your method and crack on! 

I’d love to hear your thoughts on this article and the series of articles on the challenges facing public sector procurement in 2019. Leave your comments below, or get in touch directly, I’m always happy to chat!

Data Is The Alpha And Omega Of The Future

Do you feel like your procurement team is in good shape when it comes to your existing e-procurement solutions? Are you sure you won’t get “stuck” with an obsolete system? Eric Wilson talks about the importance of data. The event might be over, but you can still register for The Big Ideas Summit Chicago to access footage from the event. 

It’s not an exaggeration to say 90 per cent of today’s procurement technologies will be obsolete in the coming years. While much of today’s tech has some great functionality, set against the backdrop of a world where the big value is in the data more than in the tactical functionality, it’s clear that these tools will simply be left behind!

Don’t believe me? Think this is “out there”? Let me elaborate…

Why can’t Alexa answer my questions?

Nowadays, many of us use Amazon’s intelligent personal assistant, Alexa, or similar AI applications. If you do, you’ll know that they’re not always adept at answering the questions we ask of them. Why? It’s simply because they don’t have enough data…yet!

Imagine machines that could:

  • Manage all your discrepancies for you
  • Detect fraudulent procurement
  • Code your non-PO invoices

This is the point at which technology gets a lot more exciting, and we’re not far from reaching these dizzying heights. The question procurement teams must ask is whether their organisation has the volume, quality and completeness of data to allow these machines to learn, provide accurate predictions and take accurate actions on the organisation’s behalf.  And to be sure of that, we need to look ahead…
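
As one illustration of that data dependency, “coding your non-PO invoices” can be framed as supervised learning: historical, correctly coded invoice descriptions train a model that suggests codes for new ones. This is a minimal sketch with invented data and illustrative GL codes, not a production design – and it only works if the organisation has captured enough clean, labelled history:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Historical, correctly coded invoice lines (invented for illustration)
descriptions = [
    "annual software licence renewal",
    "office cleaning services march",
    "laptop docking stations x10",
    "window cleaning quarterly",
]
gl_codes = ["6100-IT", "6400-FAC", "6100-IT", "6400-FAC"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(descriptions, gl_codes)

# Suggest a code for a new, uncoded invoice line (expected: 6400-FAC)
print(model.predict(["carpet cleaning for head office"]))
```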

The here and now won’t help you tomorrow!

I’ve spoken before about the downfall of Siebel as an example of what happens when organisations only live in the here and now, solving the problems of today without looking ahead to tomorrow.

In 2017, the situation hasn’t changed. But this time, it’s not just about Software-as-a-Service (SaaS). Procurement technologies, and technologies in general, have fully embraced SaaS, and the big tech shift that’s coming next is data.

If the system you’re looking to install is not capable of actually capturing all your transactional data – and doing so in a centrally architected manner such that you can get more value from data beyond just the data that your organisation itself generates – then all those snazzy pieces of functionality, all that beautiful user interface, all those pretty little graphs aren’t worth a dime!

Not only will your existing business case completely fail.

Not only will you not receive the ROI you planned on today.

Tomorrow the system will be obsolete, and you might as well have selected Siebel!

When it comes to selecting SaaS procure-to-pay systems, business cases are built on the ability to:

  • Eliminate maverick spend
  • Identify opportunities for strategic sourcing
  • Consolidate the supply base
  • Automate approval processes
  • Automate matching
  • Eliminate paper
  • Take advantage of terms discounts.

Indeed, organisations build up very detailed business cases based on these factors. But the basic assumptions and prerequisites for those components of the business case to actually generate real ROI are based on three things:

  1. You get 100 per cent of your suppliers connected to the system
  2. You get 100 per cent of your end users actually using the system – all the time (not just some of the time)
  3. You run all your invoices through the system – 100 per cent of them – not just the indirect invoices, but also direct, facilities, vertical specific invoices, non-PO invoices, the whole gamut!

In procure to pay, if you don’t have those three things, not only does today’s business case fall apart, but more critically for this conversation, you can’t leverage the power of all that data in the future.

There’s no two ways about it: You can’t use artificial intelligence if you don’t have the centralised data for those machines to learn from. It is data that feeds AI and other emerging technologies – you need data more than anything else for success in the future.

And so, my key takeaway now and always is: when you are putting together your RFPs for systems, data better be first and foremost on your mind.

Want to see more from The Big Ideas Summit Chicago? Register now (it’s FREE!) to gain access to all of the day’s action, including video interviews with our speakers and attendees. 


No More Guessing Games! Time To Use Innovative Data Leveraging

There’s no longer a need for guessing games when it comes to driving value! Innovative data leveraging is possible in any environment and can help to lead organisations towards an analytics-enabled procurement.  


Join BravoSolution’s webinar, Innovative Data Leveraging for Procurement Analysis, which takes place on 28th March.

Many purchasing executives are looking to drive procurement transformation, but this is reliant on three major factors:

  1. Level of stakeholder engagement
  2. Ability to align with the overall business strategy
  3. Use of advanced tools and technologies

My research suggests there exists a noticeable gap between procurement executives’ explicit intentions of driving value for the business, and documented results in these three areas.

These gaps can be attributed to a lack of critical data and analytical insight that can support a truly meaningful conversation with the business about spend, supply base, and supplier performance.

Annual budgeting becomes a guessing game, with little input solicited or provided by procurement. It might be due to a lack of data. Or, it could be procurement’s inability to take the lead in order to anticipate and gather the data required. This disconnect is causing significant challenges for businesses.

BravoSolution is running a webinar on the 28th of March, Innovative Data Leveraging for Procurement Analysis.  I will be discussing a common process that every executive we met with cited as critical for engaging stakeholders and building analytical insight. We call it “innovative data leveraging” (IDL).

Innovative Data Leveraging (IDL)

Innovative data leveraging is a fact-based, data-driven approach to driving change and influencing stakeholders to create procurement value for the business.

The IDL process was described in different contexts, but the common thread was that cross-functional engagement was powered by stakeholder influence through analysis and presentation of data. Of course, leveraging analytics is difficult without some prior investment in procurement systems such as transactional spend analytics, contract management, and supplier performance measurement. However, our analysis also showed that innovative data leveraging is possible in any procurement environment.

The process starts with procurement executives conducting working sessions with business stakeholders to develop a deep understanding of their business strategy, the challenges they face in executing this strategy, and the role that procurement can play in helping to shape and support this strategy. Successful procurement leaders are the ones who can effectively articulate the questions that need to be answered and pursue the data requirements to provide analysis, insight and advice in order to address stakeholders’ business concerns.

Several additional insights emphasize the importance of innovative data leveraging.

  1. IDL was found to be important during any stage of procurement transformation maturity.
  2. The development of IDL capabilities depends on successful initial business engagements, especially when reliable procurement systems and data are lacking.
  3. Advanced analytics in the form of predictive capability is the most highly evolved form of IDL.

What are the benefits of IDL?

At the earliest stages, preliminary insights on spend may provide opportunities for deeper involvement in functional sourcing initiatives, creating a platform for further engagement and integration. In emerging stages, organisations can drive significant insights into total cost of ownership and working capital improvements that go above and beyond simple price leveraging capabilities. In advanced stages, predictive analytics (using both structured and unstructured data) that produce insights into revenue forecasts, supplier risks, emerging market opportunities, and other value drivers begin to emerge.

The innovative data leveraging approach can help organisations at all maturity levels to build a solid path towards an analytics-enabled procurement, in their pursuit of value and excellence. This does more than bridge the gap between procurement’s goals and the overall business strategy.

When you start by leveraging data analytics, no matter what stage your organisation is in, you can build a foundation for innovative capabilities for procurement excellence, like predictive analytics and cognitive computing.

You’ll learn more about all of these issues in BravoSolution’s upcoming webinar!

Sign up to join BravoSolution’s webinar, Innovative Data Leveraging for Procurement Analysis, on 28th March

Data, You’re The One That I Want – I’m Just Not Sure Why!

When it comes to managing data, we all know we need it. But it’s important to note that the quality of your output is entirely dependent on the quality of the planning. 

criben/Shutterstock.com

Register for the free webinar, Innovative Data Leveraging for Procurement Analysis, on the 28th March. 

In the information age, data is everything. With our ability to store swathes of that binary gold, and to pull it from scores of different sources, we have access to more information than ever before. What’s more, by using analytical tools, we can blend datasets and create rich insights that were previously impossible (or at least incredibly arduous!) to produce.

At the heart of this utopia is the premise that data is ‘great’; if we’re not measuring something, then we’re missing out.  After all, data tells the ‘truth’…right?

Well actually, that depends on what you mean by ‘truth’. After all, the ‘truth’ can be subjective and open to interpretation – and the same goes for data; the conclusions you draw ultimately depend upon what you’re looking at and how you’re looking at it.

Have a roadmap before embarking on your analysis

An important consideration when working with data is that the quality of your output is wholly dependent on the quality of the planning at the start – specifically the aims of any analytical outputs.

Having a clear roadmap for the aims of your analysis in the first instance is important in providing direction for the project, allowing you to ask the right questions and draw on the appropriate datasets. There’s a lot of information out there and it’s easy to find yourself in a sinking quagmire of sources that bear little relevance to your intended analysis.

Whilst scoping the aims of a data analysis project may seem daunting, there are three simple steps that you can follow to ensure you give yourself the best hope of arriving at a meaningful outcome:

  1. Decide on a purpose – what, in a general sense, is it that you’re trying to achieve with any analysis?
  2. Pitch to the right audience – Who is going to consume the information? It may be at many levels of seniority (from Analysts to Executives), and each will require and expect different things.
  3. Define the questions to be answered (and then the supplementary questions that arise from those) – these are not just the pure data questions but the business questions, i.e. the reasons for conducting the analysis in the first place.

Leverage your data in innovative ways

With the above three areas documented and the information acquired, the next step is the exciting bit – making it work for you to answer your questions.

Again, there are three considerations to bear in mind for making the most of your data:

  • Create quality visualisations – Choose your visualisations carefully, with the audience and the questions to be answered in mind. Data visualisation, as with all visual communication, requires thought and discipline to present information in the most meaningful way (don’t just include a bubble chart or other fancy visuals because they look nice – they need more justification than that).
  • Make sure the data has context – Bring in the external metrics that help you make sense of it all. Having worked with data for my entire career, it’s fair to say I’ve seen good data, bad data and everything in between. When it’s bad (or anything short of ‘good’), you’re going to struggle to get any ‘truth’ from your analysis – remember, “garbage in, garbage out”. But a trend I’ve noticed more and more is that even with the good stuff, people are quick to reach for the most readily accessible context – normally that of their own business or organisation. Context can take many forms: it could be measuring your procurement against a commodity index, allowing for the impact of currency fluctuations, or benchmarking against many others (see the sketch after this list).
  • Blend your procurement data for greater insight – Data is an incredibly valuable resource for any procurement team and its wider organisation. By pooling your internal data for spend, sourcing, contracts and projects (to name a few) and combining that with external metrics and benchmarks, you suddenly open up another level of insight into your data. Better yet, that insight can then be used to inform strategy across the organisation, increasing efficiency, improving savings and identifying opportunities for further innovation that yields yet more value for your organisation.
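
To make the commodity-index point concrete, here is a minimal sketch of restating monthly category spend against a market index so that ‘real’ movement becomes visible. The index values and spend figures are invented for illustration:

```python
import pandas as pd

data = pd.DataFrame({
    "month": ["2024-01", "2024-02", "2024-03"],
    "steel_spend": [100_000, 108_000, 115_000],   # looks like runaway cost growth...
    "steel_index": [100.0, 109.0, 118.0],         # ...but the market rose even faster
})

# Deflate spend by the index (rebased to 100) to strip out market movement
data["index_adjusted_spend"] = data["steel_spend"] / (data["steel_index"] / 100.0)
print(data)
```

Adjusted for the index, spend in this invented example actually fell slightly in real terms – context that the raw numbers alone would never reveal.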

“If we have data, let’s look at data. If all we have are opinions, let’s go with mine.”

Jim Barksdale, former CEO of Netscape.

In the digital era, every procurement team has access to an invaluable source of strategic insight in the form of its data. By using technology to prod and probe that data, Procurement has the means to draw up informed action plans that deliver innovation and value to the function and, more importantly, the wider organisation. However, knowing the research questions to ask of your data, and applying the right context to it, is essential to realising this potential.

If you are interested in learning more about the kind of questions you need to be asking when looking to gain greater insight from your data, then please register for our free webinar, Innovative Data Leveraging for Procurement Analysis, on the 28th March. In it, distinguished US professor Dr Robert Handfield will take a more in-depth look at pooling datasets to perform innovative procurement data analysis.