August 22, 2007

CPGmatters: What’s next for data synchronization?

By Jack Grant

Through a special arrangement, what follows is an excerpt of a current article from CPGmatters, a monthly e-zine, presented here for discussion.



Data synchronization is key to the efficient operation of the supply chain. In recent years, progress in the vision and scope of data sharing has been driving a demand-based planning model, as opposed to the more traditional push-based model.

For example, Supervalu is synchronizing supply chain data with hundreds of suppliers using the 1SYNC data pool via the Global Data Synchronization Network. But much work remains, especially around data accuracy. Unless CPG manufacturers ensure that their product data accurately reflects the physical attributes of their products, the benefits of data synchronization will be lost.

“Unfortunately, many retailers are providing inaccurate on-hand information in their activity files to vendors,” said Chad Symens, president and CEO of Rainmaker Data Warehousing. “When a vendor is unable to trust the on-hand data, they are left with a large hole in their decision-making toolkit. Most vendors receive activity data from multiple retail customers, so it becomes a difficult exercise to determine the data quality of each file and then determine the most appropriate course of action to deal with the missing or inaccurate data.”
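
The per-file quality screening Mr. Symens describes might be sketched roughly as follows. This is illustrative only; the field names, record layout, and threshold are assumptions, not anything from the article.

```python
# Score each retailer's activity file for suspect on-hand data before
# trusting it in planning. Field names and thresholds are hypothetical.

def score_activity_file(records, max_on_hand=10_000):
    """records: list of dicts with 'store', 'sku', 'on_hand' (may be None)."""
    total = len(records)
    missing = sum(1 for r in records if r["on_hand"] is None)
    negative = sum(1 for r in records
                   if r["on_hand"] is not None and r["on_hand"] < 0)
    implausible = sum(1 for r in records
                      if r["on_hand"] is not None and r["on_hand"] > max_on_hand)
    bad = missing + negative + implausible
    return {"records": total, "missing": missing, "negative": negative,
            "implausible": implausible,
            "quality": 1 - bad / total if total else 0.0}

sample = [
    {"store": "S1", "sku": "A", "on_hand": 12},
    {"store": "S1", "sku": "B", "on_hand": None},  # missing value
    {"store": "S2", "sku": "A", "on_hand": -3},    # negative: bad perpetual inventory
    {"store": "S2", "sku": "B", "on_hand": 40},
]
print(score_activity_file(sample))  # quality score of 0.5 for this file
```

A vendor receiving files from many retailers could rank them by the quality score and decide, per retailer, whether the on-hand figures are usable.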

Rory Granros, the director of industry & product marketing for process industries at Infor, said data synchronization must move from a late-in-the-cycle activity done to meet retailer mandates to a holistic process at the beginning of the product lifecycle and integrated into new product development and introduction.

“Understanding the retailer’s requirements ensures the product is developed to meet specifications, and that mandated data is captured and integrated via a Product Information Management (PIM) solution with data synchronization,” said Mr. Granros.

Mr. Symens sees the need for more direct collaboration between buyer and vendor. Some progress has been made in this regard using CPFR (Collaborative Planning, Forecasting and Replenishment) processes, but the model tends to be too intensive for all but the largest vendors to engage in.

“In the future, retailers need to define a more lightweight collaborative process built around demand activity data sharing and simple yet measurable goals for each party,” he said. “The primary objectives will be to monitor sell-through and on-hand for the top items in a given category for each vendor. A simple scorecard and exception-based analysis tools must be created and agreed upon between the retailer and the vendor to achieve the full benefits of this model.”
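
The kind of lightweight scorecard Mr. Symens describes, monitoring sell-through and on-hand for top items and surfacing only exceptions, might look like this in outline. The metric names and thresholds are illustrative assumptions, not agreed industry values.

```python
# Exception-based scorecard: for each top item, compare sell-through and
# on-hand against agreed targets and report only the exceptions.

def scorecard(items, min_sell_through=0.6, min_on_hand=5):
    """items: dicts with 'sku', 'units_sold', 'units_received', 'on_hand'."""
    exceptions = []
    for it in items:
        received = it["units_received"]
        sell_through = it["units_sold"] / received if received else 0.0
        if sell_through < min_sell_through:
            exceptions.append((it["sku"], "low sell-through", round(sell_through, 2)))
        if it["on_hand"] < min_on_hand:
            exceptions.append((it["sku"], "low on-hand", it["on_hand"]))
    return exceptions

top_items = [
    {"sku": "A", "units_sold": 90, "units_received": 100, "on_hand": 10},  # healthy
    {"sku": "B", "units_sold": 30, "units_received": 100, "on_hand": 2},   # flagged twice
]
for exc in scorecard(top_items):
    print(exc)
```

The point of the exception-based design is that a healthy item produces no output at all, so both parties only discuss the handful of SKUs that breach the agreed targets.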

Tom Duffy, director of Business & Industry Partnerships for Nielsen’s TDLinx division, promotes the need for clean customer master data. Enhancing the process would be a unique identifier and the ability to track and manage company hierarchies, store openings, closings, changes of ownership and so on.

“Having the ability to use one number to aggregate to a common view of the customer, integrate disparate data and related activity, communicate seamlessly on a code-to-code basis and evaluate in any frame of reference delivers great return on investment.”

Discussion Questions: What do you think of the ideas presented in the article to improve data accuracy and overall data synchronization between retailers and suppliers? What hurdles still need to be overcome?

15 Comments
Nikki Baird

I definitely think that data sync considerations need to happen much, much earlier, as part of the product development process from the beginning. But then we’re talking about a master data strategy for a retailer or a manufacturer, and about basic data quality practices.

We’ve gone through the initial rush of external integration, and now I believe we are in the middle of an internal rush to improve how data is handled and managed internally–so as to reduce the amount of inaccurate information that is shared with partners. The next phase will take us back to external considerations, where manufacturers and retailers will link their data quality processes together as part of their data sync efforts. But that’s a ways down the road….

Ed Dennis

The mechanics of data synchronization are not the problem. The problem is that the systems are not open. Everyone wants it, but no one wants to pay for it. It’s like using a toll road: everyone likes the road, but everyone also thinks the government, or someone else, should pay for it. The software community has had working programs (internet-based, secure) for years, but no one wants to adopt a standard. Wal-Mart wants me to use theirs, Supervalu wants me to use theirs, Ahold wants me to use theirs. Wouldn’t you think that the merger of GNX and the World Wide Retail Exchange would give the retail and supplier communities an impartial, neutral platform to solve problems? The retailers, however, seem so resistant to adopting “better solutions” (this is a constant theme) that they prefer to take their old, underperforming systems and make them work. If we all used that mentality, we would be driving a 1950 Bel Air and signaling turns with our hands.

James Tenser

While most valuable, this discussion is constrained by retail implementation issues that thwart even the most earnest and sophisticated efforts to synchronize supply chain data. I call this “the elephant in the store”: When accurately measured and recorded, shelf conditions rarely match the plan or the forecast and, by the way, we hardly ever accurately measure or record shelf conditions in the first place.

So the dream of a demand-driven supply chain remains a bit of a fantasy because we do not possess a key element of the demand data that this goal requires. No matter how clever the inferential algorithm, it’s near impossible to generate an accurate store-level demand signal based solely on back-door and front-door measures. The results are upstream inventory voids, stock-outs, and sub-par planogram and promotion compliance. Yes, yes–the same old symptoms we’ve been tearing our hair out over for decades.

The shelf remains the deep, dark place in the supply chain–the place we dare not look for fear of confronting the horrible truth. Poor in-store compliance rates remain an embarrassment for our industry. They persist in large measure because we do not have an adequate plan and processes in place to communicate and implement the work, then track and measure it continuously in near real time.

If this sounds too critical, let me be clear: Our industry has made commendable leaps forward on supply chain issues in the past decade and a half, resulting in billions of dollars in cost savings and value to the consumer. Today, the plain fact is that in-store implementation issues are emerging as the greatest factor limiting further progress. Tackling them will require a new collaborative industry initiative.

Kai Clarke

The complexities of achieving this goal, that of sharing accurate logistical data, are mind-boggling. Most retailers have no true count of what is in their warehouses and back rooms, let alone what is on their shelves. Add to this the shrink concerns at each of these locations and the complexity of recording and reporting on each SKU along each step of the logistics cycle, and the potential for errors is tremendous. This is the key reason this type of shared data pooling has not worked. Add to this the different data recording and reporting software and equipment needed, and the exceedingly high cost of implementing such a system, and you have a recipe for disaster. This is why retailers have shied away from this process and do not embrace it as one of their future goals. It would be a great thing, but it is not reasonable to expect that we will see it any time in the near future.

Ron Margulis

The primary hurdle remains the requirement that individual retailers and manufacturers first get their internal data in order before trying to sync up with their trading partners. This has been a challenge because of rapidly changing technology and a rapidly changing marketplace. Companies have had to update their systems to get into the 21st century and while they were trying to do this, the Internet happened, causing additional tech issues. And, the blurring of channels happened, causing additional marketplace issues. While there may be unforeseen impediments blocking the road to near-full industry-wide data synchronization, the future looks good for all of the benefits the process promises.

Dan Gilmore

I guess I can only say: Another year, the same old story.

We’ve been hearing this data synchronization story for so long, and I don’t doubt the benefits. Yet it seems so little progress is ever really made.

I would liken it to this: there is a small treasure out there, nice to have but not one that is going to let you live in luxury. To get that small treasure, there is a huge, difficult mountain that must be crossed, a journey that will take years. You would like this small treasure, but the investment in gear and supplies you will require before you can start the journey is usually just beyond your means.

Even if you are prepared for the trip over the mountain, looking up, it just seems too high, and you decide to wait a bit longer. Other times, just as you are about to set forth, an opportunity for an even smaller treasure, but one that can be had for much less effort and time, presents itself, so you decide to go there first. And I suppose we should add to the story that there are actually two treasures over the mountain, one for a friend of yours as well as your own, but the two of you have to make the long journey together to retrieve them. If you get separated too long–poof!–you are back down at the start of the mountain.

Now that I have totally confused everyone, I will add that the idea of a “light” CPFR is quite sound, and one I have promoted myself.

mark douglass

At the end of the day, 98% of all the data required is provided, and should be managed, by suppliers. Retailers add about 2% to the total pool. Yet far too often retailers are seen as the bad guys for dragging suppliers along a supply chain process that embraces e-commerce and B2B protocols to eliminate days of delay and paper-based data entry.

At the heart of this is data change management. The synchronization component through a preferred exchange should be simple stuff, as long as suppliers understand the end game, which is selling to customers. Too often suppliers are so busy selling to retailers that they forget the needs of the end customer and consequently do not pay enough attention to data change management, and this is where things get out of sync: pack size changes, bonus packs, substitutions, duplicate GTINs, etc.

Retailers are not squeaky clean, but at least they are embracing and promoting common standards in a broad sense, either through an exchange or within their own business. Suppliers, for their part, need to embrace what they can control and manage it as though they are selling to the end customer.

Les McNeill

The issue remains a priority for the sector, and the comments thus far collectively reinforce the need while effectively illustrating the challenges. While clean, accurate data is the essential starting point for both suppliers and merchants, the longer-term driving force for GDS is, surprisingly, not supply chain optimization. It is the need for richer, more accurate product information at the consumer’s point of purchase. This is illustrated in recent research on the positive impact of richer product information on buying decisions. The more important issue, however, is that of emerging statutory compliance requirements around the type of product information visible to the customer in the food, health and wellness product categories. The PIM attributes synchronized between suppliers and merchants are the natural source of this descriptive information. The combination of statutory information compliance and the drive toward more customer-centric merchandising will keep the GDS momentum in place beyond the initial gains in the supply chain.

Ian Piddock

Product data quality is now the hot topic. The GDSN goes a long way toward improving the quality of data passed through the supply chain, because it enforces standards and validation rules on input, thus normalising it. However, we all know that if the wrong data is input at the first point of entry, that is the data that gets passed through the supply chain. Integrating back-end systems with a GDSN-certified PIM product, combined with collaborative, role-based workflow to control the creation and update of product information, will enforce better data quality because users will become accountable.

Accountability should ensure that humans take more care over data entry. Product data quality problems stem from human beings not being held accountable. Unfortunately, there is no computer system that can replace the human at the start of the process.
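
A toy illustration of the kind of input validation Mr. Piddock credits the GDSN with enforcing. The GS1 check-digit test is real, but the other rules and attribute names here are simplified assumptions, not actual GDSN validations.

```python
# Validate a few item attributes at the first point of entry, so bad
# data never enters the pool. Rules beyond the check digit are illustrative.

def gtin_check_digit_ok(gtin: str) -> bool:
    """GS1 check-digit test (applies to GTIN-8/12/13/14 strings)."""
    if not gtin.isdigit():
        return False
    digits = [int(c) for c in gtin]
    body, check = digits[:-1], digits[-1]
    # Weights alternate 3,1,3,1,... starting from the digit nearest the check digit.
    total = sum(d * (3 if i % 2 == 0 else 1) for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10 == check

def validate_item(item: dict) -> list:
    """Return human-readable problems; an empty list means the item passes."""
    problems = []
    if not gtin_check_digit_ok(item.get("gtin", "")):
        problems.append("bad GTIN check digit")
    if item.get("net_weight_g", 0) <= 0:
        problems.append("net weight must be positive")
    if not item.get("description"):
        problems.append("missing description")
    return problems

print(validate_item({"gtin": "4006381333931", "net_weight_g": 250,
                     "description": "Highlighter"}))  # passes: []
print(validate_item({"gtin": "4006381333932", "net_weight_g": 0,
                     "description": ""}))  # three problems reported
```

Rejecting the record at entry, with a named user in the workflow, is what creates the accountability the comment argues for: the error surfaces where it was made, not downstream.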

Chad Symens

I’ll add a point which I provided when I was interviewed for the article, but which sadly did not make it into the final version.

While the benefits of an accurate global data catalog are extremely attractive there are simply too many constituents with competing objectives and too many disparate technologies to make the vision a reality. At best, a few industry leaders will create a solution which will have only modest adoption, but more likely the project will fail. Industry trade journals document this to be true and numerous comments above lament this reality.

But vendors who cite the inaccuracy of on-hand data as justification not to use POS activity data for decision making and planning are missing a huge opportunity. Sure, data accuracy can be questionable, but if the current situation is no visibility into store-level on-hand and you are running 3 percent to 5 percent out of stock, isn’t some visibility better than none? I have personally worked with dozens of vendors who track store-level on-hand by comparing the quantities they have shipped to those stores with the unit sales at those stores, to estimate what they believe is an accurate on-hand. Even if that effort increases your in-stock by 1 percent, you have captured lost sales and grown your top line. The bottom line is this: if you are a vendor, don’t focus on the complexity of global data sync; focus on the small and inexpensive ways you can reduce your stock-outs and increase your sell-through.
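
The reconciliation Mr. Symens describes, inferring store-level on-hand from shipments into the store versus unit sales out of it, can be sketched roughly as below. The shrink allowance and reorder threshold are assumptions for illustration, not figures from the article.

```python
# Estimate store-level on-hand from an opening count, cumulative shipments
# into the store, and cumulative POS unit sales, less a shrink allowance.

def estimate_on_hand(opening, shipped, sold, shrink_rate=0.02):
    """shrink_rate: assumed fraction of shipped units lost to shrink."""
    est = opening + shipped - sold - shrink_rate * shipped
    return max(0.0, est)  # physical on-hand cannot be negative

# Flag likely out-of-stocks: stores whose estimate falls below a reorder point.
stores = {"S1": (10, 100, 95), "S2": (0, 60, 20), "S3": (5, 40, 44)}
for store, (opening, shipped, sold) in stores.items():
    est = estimate_on_hand(opening, shipped, sold)
    if est < 6:
        print(store, "likely low, estimated on-hand:", round(est, 1))
```

Even this crude estimate supports the comment’s point: a vendor with no on-hand visibility at all can still spot stores like S3, where nearly everything shipped has already sold through.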

Len Lewis

The ideal situation would be the development of a single global data pool in order to eliminate redundancies. However, this is unlikely to happen any time soon since that part of the data sync business is still highly fragmented and data quality and accuracy are pretty spotty. It’s been said that up to 80% of dimensional data on products alone is incorrect.

As such, the best hope for data sync is data accuracy and the development of true interoperability between data pools. Also, an operation like 1SYNC, which came about from the consolidation of Transora and UCCnet, is great for multinationals but may not be feasible for small and medium-sized companies.

Mark Lilien

Data synchronization might be more easily achieved if retailers using the same ERP software banded together with their merchandise suppliers and ERP solution providers. It’s costly to solve technical problems one retailer at a time.

Ed Dennis

Why don’t we get Homeland Security and the FDA to mandate a system in the interest of public safety?

Mike Bann

There is a way to sync data currently, but it involves deploying a loyalty program (full disclosure: this is a loyalty program we offer). CPGs would get the data relevant to their SKUs at a given store, while the store gets to view all SKU data regardless of CPG. The downside is that you only capture transaction data if our loyalty card is swiped at the time of purchase, but CPGs could readily incent consumers to do so by offering a higher rebate onto the consumer’s card. This also lifts the rebate burden from the retailer, whose margin may already be too thin. Bottom line: the CPG can garner transactional data and can even (if OKed by the retailer) open a direct channel of communication to the end user. Our current marketing position does not address this specific issue, but after reading this dialogue, maybe we need to rethink!

Ed Dennis

I don’t believe the cash value card Mike speaks of will synchronize data to the degree needed unless it allows data coordination between grower, processor, shipper, insurer, inspector, receiver, warehouse, distributor, headquarters, store, department, marketing, advertising, management, regional management, purchasing, store management, department management, cashier and consumer. Most card-based systems miss the entire front end of the supply chain, which happens to include the base product cost.
