Posted by Mark Brousseau
As hospitals seek to survive and thrive in the new world of bundled payments, ACOs, and medical home programs, many are actively seeking to employ more physicians and acquire community practices. In fact, a recent survey by the Medical Group Management Association (MGMA) shows a nearly 75 percent increase in the number of active doctors employed by hospitals since 2000.
This shift has intensified the perennial challenge of making employed providers revenue positive for the organization. A recent study published in The New England Journal of Medicine estimated that hospitals lose between $150,000 and $250,000 per year over the first three years of employing a physician (Kocher and Sahni, "Hospitals' Race to Employ Physicians," March 30, 2011).
Against this backdrop, hospitals must establish a corporate chargemaster file to standardize aspects of physician charging for greater operational efficiency, optimal reimbursement and reduced compliance risks, says Keith Neilson, CEO of Craneware, which is exhibiting at HFMA's ANI Conference this week in Orlando.
To this end, Craneware is launching its Physician Revenue Toolkit to help hospitals manage multiple physician operations.
What do you think?
Monday, June 27, 2011
10 strategies for reducing healthcare supply costs
Posted by Mark Brousseau
Faced with growing medical-surgical supply costs as reimbursements shrink and healthcare reform looms, healthcare providers can reduce their medical-surgical supply spending immediately—and GHX is recommending the top 10 ways to do it.
The healthcare technology company released its list today at HFMA's ANI Conference in Orlando.
The GHX Top 10:
1. Save an average of $12.00 to $27.00 per order by conducting as much of your purchasing as possible electronically, with as many of your trading partners as possible.
2. Automate the procurement process, from the point of contracting to the point of payment, to streamline operations and boost efficiencies.
3. Centralize purchasing across your organization to provide visibility into, and control over, as much of your supply spending as possible.
4. Develop a master data management strategy, including the use of global industry data standards, to ensure that you are keeping critical information as up to date as possible and that you have “one source of truth” to feed clinical and financial IT systems.
5. Understand the total cost of ownership of your supply chain; in addition to the price paid, consider the financial implications of procurement, logistics, inventory management, charge capture and reimbursement, among others.
6. Create visibility into both the total cost and the efficacy of the products being used in patient care, so that you can determine the role supplies play in both the cost and the quality of the care your organization provides.
7. Focus on bringing more non-file and off-contract spend under contract, especially high-cost physician preference items.
8. Save an estimated 1 to 3 percent in avoided overpayments by validating contract pricing and making sure you’re using the most up-to-date contract information.
9. View the supply chain as a function that operates across your organization; establish partnerships with clinical and financial departments to develop mutual objectives and work together to achieve them.
10. Collaborate with your trading partners for mutual benefit. Share insights into what happens to products once they arrive at your facility, and ask your suppliers how you can become a lower-cost customer to serve.
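Two of the items above come with dollar figures attached (items 1 and 8), which makes a quick back-of-the-envelope estimate possible. A minimal sketch follows; the order volume and contract spend below are hypothetical illustrations, not figures from GHX:

```python
# Rough annual-savings estimate from the two quantified GHX items.
# The hospital volumes here are hypothetical, for illustration only.

def electronic_ordering_savings(orders_per_year, savings_per_order):
    """Item 1: savings from moving purchase orders to electronic channels."""
    return orders_per_year * savings_per_order

def price_validation_savings(contract_spend, avoided_overpayment_pct):
    """Item 8: overpayments avoided by validating contract pricing (whole percent)."""
    return contract_spend * avoided_overpayment_pct / 100

# Hypothetical mid-size hospital: 50,000 purchase orders per year,
# $40 million in contracted supply spend.
low = electronic_ordering_savings(50_000, 12.00) + price_validation_savings(40_000_000, 1)
high = electronic_ordering_savings(50_000, 27.00) + price_validation_savings(40_000_000, 3)

print(f"Estimated annual savings: ${low:,.0f} to ${high:,.0f}")
# → Estimated annual savings: $1,000,000 to $2,550,000
```

Even at the low end of both ranges, the quantified items alone suggest seven-figure annual savings for an organization of this (assumed) size.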
What do you think?
Healthcare reform boosts importance of business process improvement
Posted by Mark Brousseau
Health reform-mandated revisions, productivity adjustments, and proposed documentation and coding offsets pose a huge challenge for hospitals, says Ken Perez, senior vice president of marketing for MedeAnalytics.
MedeAnalytics is exhibiting at HFMA's ANI Conference this week in Orlando.
“Our research and economic models indicate that a 300-bed hospital will be required to reduce costs by more than $6 million in the year ahead to avoid erosion of its Medicare margins," Perez says. "The sheer magnitude of the financial impact of these multiple, complex and mounting reductions indicates that hospitals should focus even more attention on improving the efficiency and effectiveness of their core activity—the process and delivery of care.”
“Many hospitals are dealing with complex internal processes and financial pressures,” adds MedeAnalytics Associate Vice President of Product Marketing Cole Hooper. “It’s evident that hospitals will need to focus on process workflow and key performance indicators to improve cash flow and identify areas of loss."
What do you think?
Saturday, June 25, 2011
Huge opportunity remains for AR automation
By Carrie Krell, campaign manager, Esker
Given the lack of market research data about accounts receivable (AR) automation, the Institute of Financial Operations (IFO) and Esker teamed up this spring on an AR automation study -- a fact-finding mission of sorts. The results are in, and they highlight key findings of qualitative and quantitative research conducted to gain insights into AR automation trends among companies in various industries.
Presented to several thousand AR professionals during the spring of 2011, the survey checked the pulse of the business world to discover how companies are sending their invoices to customers, the cost of sending those invoices, perceptions about the benefits of electronic invoice delivery, the key challenges of customer invoicing and the main obstacles to implementing solutions for automation within accounts receivable. In addition, the survey was intended to gauge trends in AR automation toward initiatives to reduce customer invoicing costs, improve invoice delivery and visibility, and facilitate customer adoption of electronic invoicing.
What the survey found is that invoice delivery processes remain paper-based at most companies, raising several key questions:
... How are companies sending invoices (postal mail, fax or email)?
... What is it costing companies to send out invoices?
... What are the key challenges of customer invoicing?
... What are the main obstacles to implementing an AR automation solution?
... How much time and money could e-invoicing save?
Based on the findings of this study, it is clear that AR departments have a long way to go in migrating from inefficient paper-based processes. The recommendation is for companies to keep a simple focus on delivery of customer invoices in assessing the value of technologies for AR automation. By focusing on delivery and investing in a solution that addresses their specific goals with regard to sending invoices, companies can take advantage of the opportunity to significantly reduce costs and improve efficiency.
On Tuesday afternoon, IFO and Esker will present a free webinar on the survey findings. For more information, visit: http://www.theiarp.org/ViewItem-563.do?parentCatId=271.
Wednesday, June 22, 2011
4 ways to use your customers to boost innovation
Posted by Mark Brousseau
The best technology. The best employees. The biggest budget. The strongest R&D department. Check, check, check, and check! If you think these are all the elements you need in order to build a consistently successful company, you’re wrong. Dan Adams says there is one other factor you’ll need to check off that list—an innovation strategy that works.
“The best way to ensure your company will be a success is to deliver more than your share of customer value,” says Adams, author of New Product Blueprinting: The Handbook for B2B Organic Growth. “Specifically, you need to develop differentiated products that provide benefits your customers crave. Products they can’t get anywhere else at a comparable cost. But you shouldn’t be guessing what they want. You should base your product innovation on what they say they want.”
Adams notes that back in 2007, Booz Allen Hamilton released an important study on innovation called “The Customer Connection: The Global Innovation 1000.” The company studied 84 percent of the planet’s corporate R&D spending and identified several distinct innovation strategies.
Most importantly, says Adams, the study highlighted one essential element of successful innovation that too many companies forget. Your employees aren’t the only people you should be engaging to create truly unique and profitable products. You should actually be focusing your efforts on engaging your customers!
The Booz Allen Hamilton study found that when it comes to innovation, customer engagement has a huge payoff. It noted, “Companies that directly engaged their customers had superior results regardless of innovation strategy.”
"And not just a little bit superior, a lot superior,” says Adams. “Those companies that used direct customer engagement while innovating versus indirect customer insight enjoyed great financial gains.”
In fact, the study found that the companies that based their innovation strategies on customer feedback experienced gains in the following key areas:
1) Profit Growth: Operating income growth rate that was three times higher.
2) Shareholder Return: Total shareholder return that was 65 percent higher.
3) Return on Assets: Return on assets that was two times higher.
“What should you do with this information?” asks Adams. “For starters, if you’re in a conversation about your company’s innovation and nobody’s talking about the customer, realize something might be very wrong. To put it in terms of the study, your company might be practicing ‘indirect customer insight’ instead of ‘direct customer engagement.’ This is a kind way of saying, ‘We’ve lost track of who our innovation is supposed to help.’”
If you think your company needs some innovation help, read on for a few words of advice from Adams.
Take it to the next level. For more than five years, Adams has been helping B2B suppliers engage their customers in the innovation process. In that time, he has seen almost everything, and he’s used what he’s seen to distinguish six levels of customer engagement during product development. What’s your level?
Level 1: Our Conference Room: At the lowest level, you decide what customers want around your conference room table. Internal opinions determine the design of your next new product.
Level 2: Ask Our Experts: At the next level, you poll your sales force, tech service department, and other internal experts to determine customer needs. Better—because more voices are heard—but still too “internal.”
Level 3: Customer Survey: Here you use surveys and polls to ask customers what they want. This begins to shake out internal biases…but doesn’t deliver much in the way of deep insight.
Level 4: Qualitative VOC Interviews: You send out interview teams that meet with customers to learn what they want. This is a quantum leap from VOO (voice of ourselves) to VOC (voice of the customer).
Level 5: Quantitative VOC Interviews: The problem with just qualitative VOC is that people hear what they want to hear. Quantitative feedback drives out assumptions, bias, and wishful thinking.
Level 6: B2B VOC Interviews: Unlike end-consumers, B2B customers are knowledgeable, rational, and interested. B2B-optimized interview methodology fully engages them to take advantage of this.
“If you aren’t happy with your level, don’t worry,” says Adams. “Through solid training and committed leadership, I’ve seen businesses leap from Level 1 to 6 in the space of a year.”
Remember who’s showing you the money. A successful company innovates for its customers, not itself. “That’s because nobody inside your company can pay for innovation,” notes Adams. “Only your customers can do that. So the more closely you engage those who pay…the more you learn what they’ll pay for.”
Make sure you’re asking the right questions. Too often, innovation is misunderstood as the process of coming up with the right answers. “The reality is that it is actually about asking the right questions,” explains Adams. “If the bright people in your company are focused on real customer needs, they’ll run circles around the bright people at competitors who are focused elsewhere.”
Learn to pre-sell. “I believe the Booz Allen Hamilton conclusions are especially potent for the B2B supplier serving a concentrated market,” says Adams. “If you interview the ten largest prospects in your target market correctly, you’ll engage them so they’ll be primed to buy when you launch that new product.”
“So the bottom line is if you want to boost your innovation, you should start by directly engaging your customers,” says Adams. “And do this in a way that allows you to understand their world, focus on their important, unsatisfied needs, and entice them to keep working with you.
"This innovation strategy is great because you are removing the guessing game aspect of new product development,” he concludes. “You won’t have to worry about whether or not your customers will like your new products because you’ll already know you are delivering exactly what they want.”
What do you think?
Tuesday, June 14, 2011
Where is the AP Automation?
David Johnson, AP solutions manager, Perceptive Software
In a day and age when the corporate mantra includes “do more with less,” “work smarter, not harder,” or “Kaizen,” we still see many organizations processing accounts payable invoices the old-fashioned way. That is, many organizations still receive the majority of their invoices on paper, route them through the organization via intercompany mail (or simply walk them from desk to desk), and move invoices from pile to pile (waiting to be matched, matched/waiting to be entered, entered/waiting to be paid, paid/waiting to be filed, filed/hopefully to be found again).
Perhaps it’s time to say, put your money where your mouth is. Better yet, put more money on your bottom line by investing in an enterprise content management system that will make your AP processing significantly more efficient.
The Institute of Financial Operations recently conducted a survey on accounts payable automation, with the results released at its annual Fusion Conference in Orlando, Florida. According to the study, more than 75% of respondents indicated that they receive the majority of their invoices on paper. Of those responding, 39% stated that paper invoices exceeded 90% of their total volume.
Looking at paper invoice volume over the past year, 32% of respondents indicated that their volume has not changed. Combining the “slightly lower,” “unchanged,” and “slightly higher” categories, over 80% reported that their paper invoice volume has essentially remained static. There appears to be no end in sight to paper invoices for these organizations.
There is a cost associated with the manual payment process, too. According to the same study, 41% of respondents indicated their processing cost per invoice was $5.00 or less, while 59% reported a per-invoice cost in excess of $5.00. And these figures do not include the potential savings from early payment discounts offered by vendors.
I attended the presentation by Disney’s Accounts Payable Department at Fusion on its world-class accounts payable processing. With the aid of automation, Disney reported a cost of $1.61 per invoice. It’s worth noting that Disney processes in excess of 5 million invoices annually, so a mere $0.01 change in cost per invoice affects its bottom line by $50,000.
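The Disney numbers make the sensitivity easy to check. A minimal sketch, using only the figures reported above (working in whole cents to keep the arithmetic exact):

```python
# Sensitivity of total AP processing cost to cost per invoice,
# using the volume and per-invoice cost reported from the Disney talk.
ANNUAL_INVOICES = 5_000_000

def annual_cost_dollars(cost_per_invoice_cents):
    """Total annual processing cost in dollars, computed in whole cents."""
    return ANNUAL_INVOICES * cost_per_invoice_cents // 100

baseline = annual_cost_dollars(161)       # $1.61 per invoice
per_cent_impact = annual_cost_dollars(1)  # effect of each $0.01 change

print(f"Annual cost at $1.61/invoice: ${baseline:,}")
# → Annual cost at $1.61/invoice: $8,050,000
print(f"Each $0.01 change moves the bottom line by ${per_cent_impact:,}")
# → Each $0.01 change moves the bottom line by $50,000
```

At that volume, even a penny of per-invoice savings is material, which is why high-volume shops chase automation so aggressively.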
So c’mon: do more with less, work smarter not harder, and let automation bring bottom-line results to your organization. Your competitors are.
What do you think? Post your comments below.
Social media, P2P, and mobile payments: disruptive for banks?
By Glenn Wheeler, president, Viewpointe Clearing, Settlement & Association Services
The payments landscape has been transforming. Jockeying for position are some major non-banking players – social networking sites, peer-to-peer payment services, mobile phone providers and credit card companies. As the competition heats up, the time horizon for establishing supremacy in the market will get shorter and shorter.
Some believe that banks, long the dominant facilitator of payments, will have problems reacting and adapting to this new paradigm. Certainly the evolving payments market presents new challenges for banks, but the situation might not be as dire as some have predicted.
There is no question customers are demanding convenience. According to a recent Bank Systems & Technology article, “Mobile Payment Users Expected to Surpass 375 Million by 2015,” market research firm In-Stat predicts that the number of mobile payment users globally will triple by 2015. Where will this demand be met? Well, there are a multitude of channels for customers to gain access to mobile payments; many do not require a financial institution. Consumers have many options: there is “virtual currency” offered by social networking mobile apps such as Facebook, "tap and pay" apps via smartphone using near-field communication (NFC) offered by Google and telecom providers, and mobile peer-to-peer services from PayPal, among others. Undoubtedly, there are additional mobile payment channels being innovated as I type this.
Banks are keenly aware of the risk mobile payments pose to a key part of their business. However, it is the security component of mobile payments that deserves a second look. Security is a differentiator, and one on which non-bank providers may have a hard time competing with banks. Payment capabilities that bypass financial institutions also circumvent the advanced systems that help secure and oversee transactions, which could put consumers at risk. Naive consumers might falsely believe that they have the same type of security and protection that they have with their bank, while at the individual level, payment data could travel without the security and structure it needs.
Certainly the mobile payment market is fascinating to watch. Who would have imagined 20 years ago that your wallet could very well be replaced by a mobile phone?
But will this technology disrupt or energize banks? While mobile payments pose some challenges, I believe banks will rise to the occasion, and many already have. Billions of dollars have been spent securing the existing payment ecosystem and new entrants have to play by the same rules. This is one area where banks clearly have a leg up, and are well-prepared for the challenges ahead. Banks, mobile payment providers and consumers should keep this in mind.
What do you think?
Monday, June 13, 2011
"Underserved" market is opportunity for banks
Posted by Mark Brousseau
The “underserved” market is considered one of the fastest growing segments in the United States and represents significant potential for banks willing to develop new products and services -- with the appropriate risk safeguards -- and channels to distribute them, according to a study from KPMG.
The KPMG study characterizes the underserved market -- the unbanked (consumers without a transaction account) and underbanked (those without access to incremental credit) -- as having grown significantly in the United States during the economic downturn. The market represents about 88 million individuals with nearly $1.3 trillion in income, according to the KPMG study. Based on forecasts, as many as six million people could be classified as "underserved" in the next two years.
"As banks transform their business models to address a new marketplace, they need to examine the potential of the underserved market as new revenue streams are necessary due to increasing compliance costs and various fees coming under pressure as a result of regulatory reform," said Carl Carande, national account leader of KPMG’s Banking and Finance practice. "In the current environment, we see heavy competition among banks chasing customers with high credit scores, with decreasing margins, leaving the underserved market for those willing to invest in it."
Carande also says that banks, before moving forward, need to ensure that appropriate risk protections are built in for the bank and customer. "Risk management is a key element of the early opportunity assessment phase, as banks review their current state and design a portfolio of business opportunities for both the near term and the long term," said Carande. "From there, it is a matter of creating a target operating model before moving to the end game of deploying a multi-generational plan."
According to the KPMG study, banks can pursue a range of key target segments among the underserved, ranging from those who do not use a bank to young adults with little knowledge of financial products.
"Customer segmentation is critical to serving the underserved market and each target segment requires a disciplined and strategic approach," said Timothy Ramsey, managing director in KPMG LLP’s Performance and Technology Advisory group. "Those banks that carve out a niche that makes sense -- and can successfully market and brand themselves accordingly -- will distinguish themselves from the competition."
"When serving this market, banks also have an opportunity to establish customer loyalty by helping these customers more effectively manage their personal finances and develop better saving and investing habits through educational, financial literacy programs," said Ramsey.
What do you think?
Thursday, June 9, 2011
Top trends in retail online banking
Posted by Mark Brousseau
Bank customers have started to demand that their banks’ online offerings keep up with the times, according to new research from Celent, LLC. And for good reason: the online banking space has stagnated for far too long, the research and advisory firm contends.
The evolution of the Internet has given consumers rich, interactive experiences online, but the banking industry has not kept pace. For the most part, financial institutions recognize their online shortcomings.
Celent says the questions are: Why haven't banks acted on these shortcomings, what can they do about them, and how can they keep up with ever-increasing customer demands? These questions become even more difficult to answer because financial institutions have just started to emerge from the impact of the financial crisis and are under extreme pressure to run sustainable businesses in the wake of increased regulatory pressures.
The good news is that next-generation online banking is on its way, according to Celent research. Some of these trends are in full swing; others are just emerging or expected to impact the space within the next three to five years.
"Online banking isn’t an alternative channel any more. It’s a mainstream channel," says Jacob Jegher, senior analyst with Celent’s Banking group. "This channel, however, requires a lot of attention. If banks don’t act swiftly, they risk critical customer relationships and revenue. It’s important that banks harness technology but don’t use it as their best foot forward."
What do you think?
Tuesday, June 7, 2011
Bank CFOs fretting health reform
Posted by Mark Brousseau
While CFOs are becoming bullish on the economy, they are concerned about health care reform, according to a biannual survey of banking and financial services Chief Financial Officers (CFOs) and senior controllers conducted by Grant Thornton LLP.
Nearly half (48 percent) of the banking/financial services CFOs said that they expect the U.S. economy to improve in the next six months and nearly two-thirds (65 percent) are optimistic about their own company; however, 55 percent also report that they plan to increase the prices or fees charged by their company in the next six months.
Regarding health care reform, 49 percent of banking/financial services CFOs said that it will decrease their hiring (compared to 37 percent nationally), 52 percent said that it would decrease their company’s growth (compared to 40 percent nationally) and 58 percent said that it would increase their product pricing (compared to 49 percent nationally).
“Although we are seeing increased optimism in the banking and financial services sectors, firms are also bracing for the increased compliance costs that accompany both financial reform and health care reform legislation,” says Nichole Jordan, Grant Thornton LLP National Banking and Securities Industry Leader. “Unfortunately, this means that increased costs from interchange fees to expanded health care will be passed along to the consumer or will affect how aggressively firms can hire.”
When asked about the business climate in their own state, 61 percent of banking/financial services CFOs said that they are seeing a negative impact on their business due to the financial condition of their state and 66 percent reported that the actions of the political leaders in their state have not created a business-friendly environment. In addition, an overwhelming majority (95 percent) support public-private partnerships that seek to reorganize and improve the function of state and local governments and public services as a means to overcome budget challenges at the state and local level.
“Although much of the industry has focused on the impact of national financial reform, banks also need to understand how the political and fiscal environments in their own states can affect their business,” adds Jordan.
What do you think?
8 steps to creating B2B products your customers will love
Posted by Mark Brousseau
There’s a famous quote from Henry Ford that Steve Jobs has been known to cite: “If I’d have asked my customers what they wanted,” Ford reportedly said, “they would have told me ‘a faster horse.’” Yes, it reflects a bold product development philosophy. And this closed-door, tell-customers-what-they-want-even-if-they-don’t-yet-know-it approach works well for our modern day King of Innovation (and his development team at Apple, of course). But if you’re tempted to adopt the Jobsian method yourself, Dan Adams urges you to think twice.
“Don’t start wearing black turtlenecks and imagining your blockbuster new product just yet,” advises Adams, author of New Product Blueprinting: The Handbook for B2B Organic Growth and founder of Advanced Industrial Marketing.
“The reality is that the average new product success rate—once the costly development stage begins—is only 25 percent,” he adds. “Generally speaking, for those of us who aren’t Steve Jobs, the practice of developing new products first and then waiting to see if customers buy them is a terribly inefficient use of resources.”
For B2B suppliers, in particular, Adams extols the virtues of first understanding market needs and then developing supplier solutions to meet them. In fact, his New Product Blueprinting—packed full of very practical methods, skills, and tools that have been finely tuned on six continents and in hundreds of industries—centers on this “ask before you innovate” philosophy.
“The good news is that you can conceptualize products you know your customers need before spending a bundle on development and launch,” explains Adams. “And even more good news, this approach does not prevent you from developing exciting, breakthrough products. What’s more, it’s unlikely your competitors are using this approach today, so your competitive advantage can be enormous.”
Here are the key steps to becoming a new product mastermind in your own right:
1. Remember, Steve Jobs deals in consumer goods—a whole different ballgame from B2B products. In describing his iTunes development team, Jobs said, “The reason that we worked so hard is because we all wanted one. You know? I mean, the first few hundred customers were us.”
In contrast, points out Adams, when DuPont developed Kevlar, they first experimented in applications such as tire cords. They went 10 years before implementing the first field trial in protective body armor, which ultimately became their main market. If you’re selling to other businesses, it’s unlikely you know enough about your customers’ worlds to hit the nail on the head with every product you develop for them.
“Unlike Steve Jobs, who can create successful products based on what he knows he wants and what his Apple employees want, you have to ask your customers what they want,” says Adams. “Otherwise, you risk spending tons of time and money on a product that you think is great, but that ultimately elicits a sleepy yawn from your customers.”
2. Compare your IQ (Innovation Quotient) to Steve’s and act accordingly. There’s no doubt that you and your team are smart. And in fact, you and your development team may just be as smart as Jobs and his team. But it’s unlikely you’ve worked as hard for as long at mastering the skills needed to develop blockbuster products.
“Just because Reinhold Messner—one of the world’s greatest mountain climbers—makes a solo climb of Mt. Everest without supplemental oxygen, doesn’t mean you can,” notes Adams. “But with training, oxygen, the right team, and an easier route, you might still enjoy the same view. My point is, if you want to win in the marketplace, tip the scales in your favor. Why not avoid unnecessary risks when you can?”
Because these risks can be costly. During the period Jobs was absent from Apple, the company had its share of new product flops. You might recall the Newton MessagePad. Or how about the Apple Bandai Pippin gaming console, or Cyberdog, the Internet browser Apple created back in the late ’90s?
“Sure, it would be great if your next three products were MacBook, iPod, and iPad,” says Adams. “But if they are Newton, Pippin, and Cyberdog, will you still even be working at the same company?”
3. Learn how to attack the right market. When Apple develops a new product for the global consumer electronics market, it can be assured it is pursuing a market that is large, growing, and open to change. Unfortunately, it’s possible—and all too common—for B2B suppliers to pursue far lesser markets.
“If you make adhesives, they could be used in window construction, aircraft interiors, solar panels, and so on,” notes Adams. “Smart B2B suppliers focus their scarce resources on just those market segments with the best prospects for growth, adequate size, reasonable competitive landscape, and so on. You can learn much of this information by doing solid secondary market research. But you often need to spend time interviewing customers in potential market segments as well. Sometimes you’ll find an ‘over-served’ market that is looking only for lower pricing. That’s a good time to ‘bail’ and pursue a different market.”
4. Uncover customer outcomes. Steve Jobs makes a good point when he says you can’t just ask customers for “the next big thing.” But the next big thing is the “solution,” which is supposed to be the supplier’s area of expertise. The customer’s area of expertise is the “outcome”—what they want to have happen or what they want a new product to do for them. They don’t know how to make it happen. They just know they need it to happen. When you find out what kind of outcome your customers want, you can provide their solution.
“Knowing that these are the outcomes his customers wanted, what kind of products should he develop?” says Adams. “Perhaps something that looks like iTunes and the iPod. I use this made-up scenario to illustrate how the outcomes you hear from your customers might translate into new products. Once you know what outcomes your customers want, you can begin to develop a product that delivers them.
“Research shows there are 50 to 150 customer outcomes for every job your product is hired to do,” he adds. “And the reality is that talking to customers and uncovering these outcomes actually helps your team be more creative. For example, it’s likely your customers will reveal an outcome they need that you and your team might never have thought of without their input.”
5. Don’t “just ask” customers. When you ask customers for their outcomes, get creative. You need to really get your customers thinking and talking. In-depth. One- or two-sentence answers will rarely give you the information you need—and that’s what you’re likely to get unless you know how to probe.
“You can encourage customers to dig deeper using interview methods similar to those we developed at Advanced Industrial Marketing,” says Adams. “For example, we have special ‘trigger methods’ to get them out of mental ruts. We have fresh ways for probing their responses. And we have unique observation and customer tour tools to let you see exciting new opportunities.
“When someone says, ‘Don’t just ask customers what they want,’ it doesn’t mean you should isolate yourself deep within the bowels of your company to guess what they want,” he adds. “It means you should get innovative about ways to enter your customers’ worlds and understand the needs they cannot easily articulate on their own.”
6. Prioritize customer outcomes. What will customers richly pay you for? Only for delivering outcomes that are important and currently unsatisfied. That’s why Adams advises clients to get quantitative—to ask customers to rate how eager they are for certain elements of a new product. For example, you might ask on a scale of 1-10 how important it is to “search for a broad range of music.” Then ask that same customer to rate, on a scale of 1-10, how satisfied they are today with their ability to “search for a broad range of music.” Then focus your product development on outcomes that scored high in importance and low in current satisfaction.
“Most suppliers fail to ask these quantitative questions,” says Adams. “The result is they miss two critical points: The first is that it’s a mistake to let your engineers and scientists work on answers to questions customers don’t care about; secondly, to a certain extent, we all ‘hear what we want to hear’ in customer interviews, so quantitative data is needed to drive out internal bias and wishful thinking.”
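The importance/satisfaction scoring described above lends itself to a simple prioritization routine. Here is a minimal sketch: the outcome names and 1-10 ratings are hypothetical, and the scoring rule (importance plus the unmet gap) is one common way to rank high-importance, low-satisfaction outcomes first—the source does not prescribe a specific formula.

```python
# Hypothetical survey data: (outcome, importance 1-10, satisfaction 1-10).
outcomes = [
    ("search for a broad range of music", 9, 3),
    ("transfer songs between devices", 7, 6),
    ("organize a large music library", 5, 5),
]

def opportunity(importance, satisfaction):
    # One common scoring rule: importance plus the unmet gap, so
    # important-but-unsatisfied outcomes float to the top.
    return importance + max(importance - satisfaction, 0)

# Rank outcomes from biggest opportunity to smallest.
ranked = sorted(outcomes, key=lambda o: opportunity(o[1], o[2]), reverse=True)
for name, imp, sat in ranked:
    print(f"{opportunity(imp, sat):>2}  {name}")
```

With these made-up ratings, “search for a broad range of music” (importance 9, satisfaction 3) scores highest, which is exactly the kind of outcome Adams says development should focus on.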
7. Take advantage of the profit motive. Many B2B suppliers completely overlook an enormous advantage they have over consumer-products companies such as Apple: the ability to measure value delivered to their customers. How do you measure the “coolness” of a tiny iPod, the convenience of a fast music download, or the bragging rights of owning the latest iPhone model?
But the B2B supplier’s world is different. “I’ve helped B2B suppliers in hundreds of industries,” says Adams, “and their customers are usually in the business of making money. B2B suppliers can help their customers make more money by improving their processes and/or their products. If suppliers are willing to work at this, they can often measure or predict how a new product will let customers a) reduce costs, b) sell higher volumes, or c) sell at higher prices.
“Tools such as value calculators allow attentive B2B suppliers to understand the value their customers will receive from their new product,” he adds. “This teaches the supplier how to precisely ‘tune’ the design of their new product, how to price it, and how to promote it. This may not be as much fun as a new touch-screen phone, but it’s great for the supplier’s bottom line.”
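A value calculator of the kind Adams mentions can be as simple as totaling the three levers he names: reduced costs, higher volumes, and higher prices. The sketch below is illustrative only; every figure and the adhesive scenario are invented, not drawn from the source.

```python
def annual_customer_value(cost_savings_per_unit, units_per_year,
                          extra_units_sold, margin_per_unit,
                          price_premium_per_unit):
    """Rough annual value a new product delivers to a B2B customer:
    (a) reduced costs, (b) higher volumes, (c) higher prices."""
    reduced_costs = cost_savings_per_unit * units_per_year
    higher_volume = extra_units_sold * margin_per_unit
    higher_prices = price_premium_per_unit * units_per_year
    return reduced_costs + higher_volume + higher_prices

# Hypothetical adhesive example: $0.50 saved per unit on 100,000 units,
# 5,000 extra units sold at a $4.00 margin, and a $0.25/unit price premium.
value = annual_customer_value(0.50, 100_000, 5_000, 4.00, 0.25)
print(f"${value:,.0f}")  # prints: $95,000
```

Even a back-of-the-envelope total like this gives the supplier a concrete anchor for tuning the design, setting the price, and promoting the product, which is the point Adams is making.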
8. Get creative with the solutions. Truly hearing the voice of the customer is necessary, but not sufficient. Here’s where you can and should emulate Jobs and his team at Apple—in the creativity department. Jobs doesn’t just encourage innovation; he requires it. He wants Apple employees to take risks, give feedback, and constantly think outside the box. Basically, creativity is a must.
“Once your team knows the outcomes customers care about, they need to focus all their creative energy on finding the solutions that result in those outcomes,” says Adams. “This is best done by engaging as many of the right minds as possible. But remember, this often means engaging those who work outside your company.”
“I leave you with a sort of caveat,” says Adams. “The new product development process that I’ve laid out might look neat and orderly, but in fact, it is often like a messy kitchen as the meal is being prepared. It won’t be unusual during the process for your scientists to invent great new technology before finding a home for it—think Post-it Notes or Scotchgard™. Do you just leave those products quivering on the lab bench since customers didn’t ask for them? Absolutely not.”
What do you think?
“Research shows there are 50 to 150 customer outcomes for every job your product is hired to do,” he adds. “And the reality is that talking to customers and uncovering these outcomes actually helps your team be more creative. For example, it’s likely your customers will reveal an outcome they need that you and your team might never have thought of without their input.”
Don’t “just ask” customers. When you ask customers for their outcomes, get creative. You need to really get your customers thinking and talking. In-depth. One- or two-sentence answers will rarely give you the information you need—and that’s what you’re likely to get unless you know how to probe.
“You can encourage customers to dig deeper using interview methods similar to those we developed at Advanced Industrial Marketing,” says Adams. “For example, we have special ‘trigger methods’ to get them out of mental ruts. We have fresh ways for probing their responses. And we have unique observation and customer tour tools to let you see exciting new opportunities.
“When someone says, ‘Don’t just ask customers what they want,’ it doesn’t mean you should isolate yourself deep within the bowels of your company to guess what they want,” he adds. “It means you should get innovative about ways to enter your customers’ worlds and understand the needs they cannot easily articulate on their own.”
Prioritize customer outcomes. What will customers richly pay you for? Only for delivering outcomes that are important and currently unsatisfied. That’s why Adams advises clients to get quantitative—to ask customers to rate how eager they are for certain elements of a new product. For example, you might ask on a scale of 1-10 how important it is to “search for a broad range of music.” Then ask that same customer to rate, on a scale of 1-10, how satisfied they are today with their ability to “search for a broad range of music.” Then focus your product development on outcomes that scored high in importance and low in current satisfaction.
“Most suppliers fail to ask these quantitative questions,” says Adams. “The result is they miss two critical points: The first is that it’s a mistake to let your engineers and scientists work on answers to questions customers don’t care about; secondly, to a certain extent, we all ‘hear what we want to hear’ in customer interviews, so quantitative data is needed to drive out internal bias and wishful thinking.”
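The importance-versus-satisfaction ratings Adams describes can be turned into a simple priority ranking. The sketch below is an illustrative assumption, not Adams' own method: it scores each outcome so that high importance combined with low satisfaction rises to the top (the formula and the sample outcomes are hypothetical).

```python
# Hypothetical sketch: rank customer outcomes by the gap between
# importance and current satisfaction (both rated 1 to 10 in interviews).
# The scoring formula is an illustrative assumption, not Adams' method.

def opportunity_score(importance: float, satisfaction: float) -> float:
    """Outcomes that are important but poorly satisfied score highest."""
    return importance + max(importance - satisfaction, 0)

# (importance, satisfaction) pairs from hypothetical customer interviews
outcomes = {
    "search for a broad range of music": (9.1, 3.2),
    "carry my whole library with me":    (8.4, 2.5),
    "show album art on screen":          (4.0, 6.8),
}

ranked = sorted(outcomes.items(),
                key=lambda kv: opportunity_score(*kv[1]),
                reverse=True)

for name, (imp, sat) in ranked:
    print(f"{opportunity_score(imp, sat):5.1f}  {name}")
```

Development effort then goes to the outcomes at the top of the list, which is exactly the "high importance, low satisfaction" focus Adams recommends.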
Take advantage of the profit motive. Many B2B suppliers completely overlook an enormous advantage they have over consumer-products companies such as Apple: the ability to measure value delivered to their customers. How do you measure the “coolness” of a tiny iPod, the convenience of a fast music download, or the bragging rights of owning the latest iPhone model?
But the B2B supplier’s world is different. “I’ve helped B2B suppliers in hundreds of industries,” says Adams, “and their customers are usually in the business of making money. B2B suppliers can help their customers make more money by improving their processes and/or their products. If suppliers are willing to work at this, they can often measure or predict how a new product will let customers a) reduce costs, b) sell higher volumes, or c) sell at higher prices.
“Tools such as value calculators allow attentive B2B suppliers to understand the value their customers will receive from their new product,” he adds. “This teaches the supplier how to precisely ‘tune’ the design of their new product, how to price it, and how to promote it. This may not be as much fun as a new touch-screen phone, but it’s great for the supplier’s bottom line.”
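A value calculator of the kind Adams mentions can be sketched in a few lines. All figures and the adhesive example below are hypothetical, and the three value levers simply mirror the a/b/c list above: lower costs, higher volumes, higher prices.

```python
# Minimal value-calculator sketch (all figures hypothetical).
# Estimates the annual dollar value a new B2B product delivers to a
# customer via (a) reduced costs, (b) higher volumes, (c) higher prices.

def annual_customer_value(cost_savings_per_unit: float,
                          units_per_year: float,
                          extra_units_sold: float,
                          margin_per_unit: float,
                          price_uplift_per_unit: float) -> float:
    cost_reduction = cost_savings_per_unit * units_per_year
    volume_gain    = extra_units_sold * margin_per_unit
    price_gain     = price_uplift_per_unit * units_per_year
    return cost_reduction + volume_gain + price_gain

# Example: a new adhesive that trims $0.40 of scrap per window,
# helps the customer sell 5,000 more windows at a $25 margin,
# and supports a $1.10 price premium on 200,000 units.
value = annual_customer_value(0.40, 200_000, 5_000, 25.0, 1.10)
print(f"Estimated annual value: ${value:,.0f}")
```

A number like this is what lets the supplier "tune" design, pricing, and promotion against the value actually delivered.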
Get creative with the solutions. Truly hearing the voice of the customer is necessary, but not sufficient. Here’s where you can and should emulate Jobs and his team at Apple—in the creativity department. Jobs doesn’t just encourage innovation; he requires it. He wants Apple employees to take risks, give feedback, and constantly think outside the box. Basically, creativity is a must.
“Once your team knows the outcomes customers care about, they need to focus all their creative energy on finding the solutions that result in those outcomes,” says Adams. “This is best done by engaging as many of the right minds as possible. But remember, this often means engaging those who work outside your company.”
“I leave you with a sort of caveat,” says Adams. “The new product development process that I’ve laid out might look neat and orderly, but in fact, it is often like a messy kitchen as the meal is being prepared. It won’t be unusual during the process for your scientists to invent great new technology before finding a home for it—think Post-it Notes or Scotchgard™. Do you just leave those products quivering on the lab bench since customers didn’t ask for them? Absolutely not.”
What do you think?
Thursday, June 2, 2011
Pre-paid cards set to skyrocket
Posted by Mark Brousseau
Pre-paid cards are primed for explosive growth in the coming year, according to a survey conducted by Firstsource Solutions.
Fifty percent of payment industry professionals surveyed expect wider adoption of pre-paid cards as more consumers move away from credit cards and cash. Nearly 30 percent of respondents said that more consumers would become “loaders” (i.e., loading more money onto their pre-paid accounts).
“We’re seeing a growing interest in pre-paid cards in consumer segments that weren’t originally drawn to using such a form of payment,” says Tim Smith, senior vice president, Banking Financial Services & Insurance, Firstsource. “Our findings support recent research about the upward trend in the pre-paid market, which shows that an estimated $37 billion was loaded onto pre-paid cards last year, compared to $18 billion in 2009 and $9 billion in 2008.”
Survey respondents indicated that there is a huge opportunity for the pre-paid market to expand its customer base beyond the most likely consumer targets. More than 40 percent indicated that increased scrutiny from regulators regarding loading and set-up fees will pose the greatest risk to the industry. Additionally, 47 percent said educating card holders on the nuances of a pre-paid card will be critical to successful adoption and overall growth in the market.
Firstsource’s survey also examined sentiment on the current regulatory climate in the payments industry. While Dodd-Frank was top-of-mind for 45 percent of payments professionals, the Consumer Financial Protection Act has fallen off the radar for most industry executives (only 9 percent of respondents indicated it was currently a priority issue).
What do you think?
Tuesday, May 31, 2011
Manual data entry top AP challenge
By Mark Brousseau
An eye-popping 88 percent of attendees at a recent accounts payable (AP) forum sponsored by Ricoh identified manual data entry as their top AP challenge, providing further evidence of the need for automated invoice processing solutions. Manual data entry came in a whopping 25 percentage points higher than the second biggest challenge identified by attendees at the Houston event.
Routing invoices for approval was identified as a top AP challenge for 63 percent of forum attendees, while 54 percent of attendees stated that resolving errors and exceptions was among their top challenges. Lost or missing invoices (42 percent) and overall payment costs (29 percent) rounded out the list of top AP challenges identified by forum attendees.
What is your top AP challenge?
Thursday, May 26, 2011
Avoid these document imaging mistakes!
By Mark Brousseau
Despite the lousy economy, document imaging solutions continue to enjoy strong adoption among organizations of all sizes. No wonder: the technology is proven to deliver tremendous operational and business benefits, including lower processing costs, streamlined storage and retrieval, and better information tracking and reporting.
But even the strongest business case for document imaging can be undermined by crucial errors during system deployment, says Brett Rodgers (brodgers@ibml.com), manager, Solution Consulting, Americas, at ibml (www.ibml.com), a Birmingham, AL-based document imaging solutions provider.
If you want to keep your document imaging business case on track (and who doesn't?), Rodgers suggests avoiding the following 10 all-too-common foul-ups during system deployment:
1. Incorrect sizing of the necessary number of document scanners.
2. Not including all stakeholders (business and IT) in the requirements definition.
3. Buying technology without first conducting a proof of concept.
4. Making decisions on front-end and back-end software separately.
5. Not coordinating software and hardware vendors during system deployment.
6. Not using a phased implementation approach (biting off too much at once).
7. Letting "fear of change" take over.
8. Not thinking LEAN.
9. Not cutting the paper cord.
10. Not "sharing" -- as in utilizing shared services.
What was your biggest mistake when deploying document imaging?
Wednesday, May 25, 2011
Cost versus value – who wins?
By Laura Knox, inside sales team leader, DataSource Mobility
In today’s ever-challenging economy, it’s no surprise that technology customers are focusing more and more on cost. And while I am all about getting a bargain and finding the right solution at the right price for my clients, I often find myself explaining that cost does not always equal value.
Much more goes into the concept of value than the upfront purchase price of any solution. You have to think about potential downtime if the equipment breaks, repair costs, replacement costs if a device fails or your workers refuse to use it because of poor performance, upgraded warranty fees (most low-cost solutions come with little or no warranty coverage), and the time IT staff must spend keeping the devices working properly. So if we are looking at overall value rather than upfront price, the emphasis moves from simply finding something cheap to finding something high quality.
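The cost drivers above can be rolled into a simple total-cost-of-ownership comparison. The sketch below is purely illustrative: the dollar figures and service lives are hypothetical assumptions, not real vendor data.

```python
# Hypothetical TCO comparison: a "bargain bin" device versus a rugged one.
# All dollar figures and lifespans are illustrative assumptions.

def total_cost_of_ownership(purchase_price: float,
                            annual_repairs: float,
                            annual_downtime_cost: float,
                            annual_it_support: float,
                            warranty_fees: float,
                            years_of_service: int) -> float:
    annual_costs = annual_repairs + annual_downtime_cost + annual_it_support
    return purchase_price + warranty_fees + annual_costs * years_of_service

cheap  = total_cost_of_ownership(600,  250, 400, 300, 150, 3)   # 3-year life
rugged = total_cost_of_ownership(1500,  50,  80, 100,   0, 5)   # 5-year life

print(f"Bargain device over 3 years: ${cheap:,.0f}  (${cheap / 3:,.0f}/yr)")
print(f"Rugged device over 5 years:  ${rugged:,.0f}  (${rugged / 5:,.0f}/yr)")
```

With these (made-up) numbers the bargain device costs more in total and far more per year of service, which is exactly the upfront-price-versus-value trap described above.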
Now, most people with a healthy knowledge of IT matters already understand that inferior parts, inferior quality, and a lack of service are what make possible the attractively low price point of generic industry devices, and they are anxious not to get stuck supporting devices that need constant attention and repair. But try explaining this to a person without IT experience who is tasked with finding a top-quality solution at a “bargain bin” price, and things get tricky.
So, for those of us who are not IT aficionados but need to make smart decisions for the companies we own or work for, the question becomes: how do I tell a high-quality device from all the other options? Below is a list of questions that I strongly encourage these folks to ask before purchasing any equipment from a potential vendor.
$ vs. ROI vs. TCO
1) What is it made out of?
2) What type of service and support is included in the cost being quoted (and what will you have to pay extra for)?
3) What is the typical lifespan of the device?
4) Are parts and labor outsourced or does the manufacturer actually make the product?
5) Has it passed any level of rugged certification?
6) What is the typical failure rate for the device?
What do you think?
Tuesday, May 24, 2011
Fusion attendees meet Kevin Nealon
Posted by Mark Brousseau
Customers and prospects of Brainware -- sponsor of the Fusion 2011 Wednesday night reception -- had an opportunity to meet actor and comedian Kevin Nealon before his performance.
Click below to see photos from the meet and greet.
http://www.iappnet.org/photo/fusion2011_vip/index.html
Enterprises will adopt layered fraud prevention techniques
Posted by Mark Brousseau
By 2014, 15 percent of enterprises will adopt layered fraud prevention techniques for their internal systems to compensate for weaknesses inherent in using only authentication methods, according to Gartner, Inc.
Gartner analysts say no single layer of fraud prevention or authentication is enough to keep determined fraudsters out of enterprise systems. Multiple layers must be employed to defend against today's attacks and those that have yet to appear.
"Malware-based attacks against bank customers and company employees are levying severe reputational and financial damage on their victims. They are fast becoming a prevalent tool for attacking customer and corporate accounts, and stealing sensitive information or funds," said Avivah Litan, vice president and distinguished analyst at Gartner. "Fighting these and future types of attacks requires a layered fraud prevention approach."
Litan explained that while the layered approach to fraud prevention tries to keep the attackers from getting inside in the first place, it also assumes that they will make it in, and that multiple fraud prevention layers are needed to stop the damage once they do. She said that no authentication measure on its own, especially when communicating through a browser, is sufficient to counter today's threats.
Gartner breaks down fraud prevention into five layers:
Layer 1
Layer 1 is endpoint-centric, and it involves technologies deployed in the context of users and the endpoints they use. Layer 1 technologies include secure browsing applications or hardware, as well as transaction-signing devices. Transaction-signing devices can be dedicated tokens, telephones, PCs and more. Out-of-band or dedicated hardware-based transaction verification affords stronger security and a higher level of assurance than in-band processes do. The technologies in this layer can typically be deployed faster than those in subsequent layers and go a long way toward defeating malware-based attacks.
Layer 2
Layer 2 is navigation-centric; this monitors and analyzes session navigation behavior and compares it with navigation patterns that are expected on that site, or uses rules that identify abnormal and suspect navigation patterns. It's useful for spotting individual suspect transactions as well as fraud rings. This layer can also generally be deployed faster than those in Layers 3, 4 and 5, and it can be effective in identifying and defeating malware-based attacks.
Layer 3
Layer 3 is user- and account-centric for a specific channel, such as online sales; it monitors and analyzes user or account behavior and associated transactions and identifies anomalous behavior, using rules or statistical models. It may also use continuously updated profiles of users and accounts, as well as peer groups for comparing transactions and identifying the suspect ones.
Layer 4
Layer 4 is user- and account-centric across multiple channels and products. As with Layer 3, it looks for suspect user or account behavior, but it also offers the benefit of looking across channels and products and correlating alerts and activities for each user, account or entity.
Layer 5
Layer 5 is entity link analysis. It enables the analysis of relationships among internal and/or external entities and their attributes (for example, users, accounts, account attributes, machines and machine attributes) to detect organized or collusive criminal activities or misuse.
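The five layers above can be pictured as a pipeline in which each layer contributes an independent risk signal and no single layer decides alone. The sketch below is an illustrative assumption, not Gartner's specification: the per-layer logic, weights, and thresholds are all made up for the example.

```python
# Illustrative sketch of a layered fraud-prevention decision:
# each layer adds an independent risk signal; the layer logic,
# weights, and thresholds are assumptions, not Gartner's model.

from dataclasses import dataclass

@dataclass
class Transaction:
    endpoint_trusted: bool      # Layer 1: secure browser / signing device
    navigation_anomaly: float   # Layer 2: 0..1 session-navigation score
    behavior_anomaly: float     # Layer 3: 0..1 vs. this user's profile
    cross_channel_alerts: int   # Layer 4: correlated alerts on other channels
    linked_to_fraud_ring: bool  # Layer 5: entity-link analysis hit

def fraud_score(txn: Transaction) -> float:
    score = 0.0
    if not txn.endpoint_trusted:
        score += 0.2
    score += 0.3 * txn.navigation_anomaly
    score += 0.3 * txn.behavior_anomaly
    score += min(0.1 * txn.cross_channel_alerts, 0.2)  # capped contribution
    if txn.linked_to_fraud_ring:
        score += 0.5
    return score

def decide(txn: Transaction, block_at: float = 0.6) -> str:
    s = fraud_score(txn)
    return "block" if s >= block_at else ("review" if s >= 0.3 else "allow")

suspicious = Transaction(False, 0.9, 0.8, 2, False)
print(decide(suspicious))
```

The point of the structure, as Litan notes, is that a transaction slipping past one layer (say, a trusted endpoint) can still be caught by behavioral or link-analysis layers downstream.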
Litan said that, depending on the size and complexity of the end-user institution, implementing the systems that support a layered fraud management framework can take at least three to five years, especially when it comes to the upper layers — Layers 3, 4 and 5. These efforts are continuous, because fraud prevention rules and models require ongoing maintenance, tuning and care.
"Organizations don't have years to wait to introduce fraud prevention while malware-based attacks proliferate. We recommend starting with the first layer of this fraud prevention framework, as well as the second layer, resources permitting, since these can be deployed relatively quickly," says Litan. "Enterprises that start by deploying lower levels of the layered stack can help to stave off immediate threats, with the assurance that these layers are part of an overall strategy that relies on basic fraud prevention principles, such as user and account profiling that have generally stood the test of time."
What do you think?
Monday, May 23, 2011
Where’s the automation?
By Mark Brousseau
Despite revenues in the billions of dollars and the document volumes inherent to that scale of operation, many—possibly even most—companies have not made the leap to automated data capture technology for invoice processing, a proven driver of efficiency and value in accounts payable (AP).
That’s the key takeaway of a survey of attendees of Fusion 2011, held May 8-12 at the Gaylord Palms Resort and Convention Center near Orlando, Florida. The survey polled AP professionals around the globe, working in numerous industries and for organizations ranging from less than $500 million in annual revenues to well in excess of $10 billion in revenues. It was conducted by The Institute of Financial Operations and sponsored by Brainware. Fusion 2011 brought together more than 1,800 financial operations professionals and 170 exhibiting companies.
With an increased focus on working capital management, many AP professionals are emphasizing a need for greater visibility into and reporting of invoice processing—a demonstrated strength of available data capture and extraction technologies such as optical character recognition (OCR) and intelligent document recognition (IDR). That’s what makes these survey findings so surprising.
More than half of the survey respondents (56.3 percent) indicated that their AP organization doesn’t use automated data capture technology. And, only 3.1 percent of respondents stated that their AP organization plans to implement automated data capture within the next six months, while 6.3 percent stated their AP organization plans to implement the technology within the next 12 months.
Why aren’t AP departments making greater use of automated data capture and extraction?
Tight capital budgets are undoubtedly a factor. But AP departments also may not see the need.
Despite their lack of data capture technologies, most of the respondents to the survey are doing a pretty good job of holding the line on invoice processing costs. A plurality of respondents (41.9 percent) indicated that their average invoice processing costs have not changed over the past 12 months, while 38.7 percent of respondents stated their invoice processing costs have dropped slightly. Only 12.9 percent of respondents indicated that their average invoice processing costs have increased either slightly (9.7 percent) or significantly (3.2 percent) over the past 12 months.
Similarly, a plurality of respondents (40 percent) indicated that their average cost to process an invoice is between $2 and $5 – in line with the costs published in surveys by industry research firms. Some 16.7 percent of respondents said their average invoice processing costs are less than $2.
But the survey results show that many AP departments could benefit from labor-saving technologies such as automated data capture. More than a quarter of respondents (26.7 percent) pegged their average invoice processing costs between $5 and $10. Worse, 13.4 percent of respondents stated their average invoice processing costs are between $10 and $20, while 3.3 percent of respondents indicated that their average invoice processing costs are an eye-popping $20 to $25.
“Among other findings, more than a third of respondents claim it still takes them more than twelve days to process an invoice, inhibiting their ability to take early payment discounts, creating backlogs, and often necessitating increased headcount,” notes Charles Kaplan, vice president of sales and marketing at Brainware. Twenty-five percent of respondents stated it takes their organization more than 15 days to pay invoices. “Automated data capture solves those problems and many others.”
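To put rough numbers on the discount point, consider hypothetical 2/10 net 30 terms (a 2 percent discount if paid within 10 days; these figures are illustrative, not from the survey). An AP department whose processing cycle alone exceeds the discount window forfeits the discount on every invoice:

```python
def missed_discount(invoice_total, discount_rate, discount_window_days, processing_days):
    """Return the early-payment discount forfeited if processing
    alone exceeds the discount window."""
    if processing_days > discount_window_days:
        return invoice_total * discount_rate
    return 0.0

# 2/10 net 30: 2 percent discount if paid within 10 days.
# A 12-day processing cycle misses the window on every invoice.
monthly_invoices = 1000
avg_invoice = 5000.00
lost_per_month = monthly_invoices * missed_discount(avg_invoice, 0.02, 10, 12)
print(f"Discounts forfeited per month: ${lost_per_month:,.2f}")
```

At these assumed volumes, a 12-day cycle leaves $100,000 a month on the table, which is why cycle time shows up alongside cost per invoice as a headline metric.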
To this point, a plurality of respondents (32.3 percent) believe that “better visibility and reporting” is the biggest benefit of the technology, followed by “faster turnaround” (29 percent), “lower costs” (12.9 percent), “better working capital management” (12.9 percent), and “fewer errors” (9.7 percent). Only 3.2 percent of survey respondents stated that they see “no benefit” to automated data capture.
The bottom line is that despite all the hype about automating invoice processing with data capture technology, vendors have a long way to go in convincing AP departments to deploy it.
What do you think?
Tuesday, May 17, 2011
The alligator and the vendor
Posted by Mark Brousseau
ibml Business Solution Consultant Curtis Williams sees an alligator up-close at Celebration, Florida, during downtime at last week's Fusion conference.
Checks in a 21st Century digital world
By Glenn Wheeler, president, Viewpointe Clearing, Settlement & Association Services, Viewpointe
Is the check dead? You might hear a near-unanimous “yes” to that question; or as others might say more accurately, check usage is simply on a long decline. While check usage has been dwindling in recent years, to paraphrase Mark Twain, the reports of its death are greatly exaggerated. A recent study shows a sizeable segment of the market still writes checks.
The 2010 Federal Reserve Payments Study, which looks at noncash payments in the U.S. from 2006 through 2009, indicates that electronic payments are quickly outstripping check payments; yet checks have remained a significant payment instrument – to the tune of $31.6 trillion in value paid in 2009. While businesses far outweigh consumers in the total dollar value of checks paid, consumers overall continue to write more checks, according to the findings. And the study found that while the number of checks written overall declined more than 7 percent from 2006 to 2009, the volume of consumer-to-consumer check payments actually grew in that same period, from 2.2 billion to 2.4 billion.
Where is the consumer-to-consumer check-writing trend heading? Despite its overall decline, there are those who continue to see the value in this traditional payment method. A January New York Times story, Social Security and Welfare Benefits Going Paperless, about the U.S. government’s decision to pay benefits electronically, chronicled how the elderly have continued to opt to receive old-reliable checks versus the government’s proposed electronic deposit of social security payments.
While this one segment of the population alone will not keep checks going indefinitely, technology might encourage some of the smartphone-wielding segment of the population to continue circulating them. According to a recent American Banker article, For Mobile Deposit, Banks Choose Speed-to-Market Over Simplicity, banks are rushing ahead with mobile check deposit technology at the behest of their customers who are using the technology to deposit checks without having to step foot in a bank.
As electronic payments technology continues to evolve – from mobile payment apps to “tap-and-pay” payments using near field communication (NFC) – financial institutions and their customers can easily move into a new payments world. Embracing the budding technology will, no doubt, bring new challenges, but with ease of use and the promise of potential growth to the financial institution’s bottom line, it could be a worthwhile investment.
Even in our digital age, the old-fashioned check may still stand up as a viable complement to the technologically advanced payment methods.
What do you think?
Labels:
Check 21,
check archive,
check imaging,
deposits,
Mark Brousseau,
NFC,
Viewpointe
Friday, May 13, 2011
Make information safekeeping part of your hurricane preparations
Posted by Mark Brousseau
Forecasters at Colorado State University recently announced that the 2011 Atlantic hurricane season will be very active. Before the season starts, it's time for businesses to get ready, while remembering their most valuable asset: information.
"No matter the size of a company, without access to information, clients could be lost and the owner may be at risk for losing the business altogether," said Marshall Stevens, co-owner of Stevens & Stevens Business Records Management, Inc., a Florida-based records management center. "To ensure business continuity, owners should develop a disaster recovery plan to assess how they're storing and managing information. These plans can help keep a business up and running so all business functions could be handled, even without access to your facility or network."
Get prepared by considering the following:
... Location and security of your storage facility – Store information off-site in a location that's been designed to withstand high sustained winds, is located in a non-flood zone, has a secure vault and is also secured with alarms, security cameras and pass codes.
... Accessibility – Be sure you can access your information no matter the time of day or day of week.
... Document back-ups – Whether you make copies or use external hard drives, backing up key documents is crucial. Keep back-ups in multiple locations, so if a disaster affects your office, another copy of your information is still available.
... Alternative records storage options – Consider utilizing technology that allows files to be converted to electronic images, which are then hosted on a secure, password-protected website. So, if files are destroyed or you couldn't access your facility, information wouldn't be gone for good.
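One small piece of the back-up advice above can be automated: copying key documents to more than one location and verifying each copy against a checksum of the original. The sketch below is illustrative only; the paths and verification policy are assumptions, and a real disaster recovery plan involves far more than file copies.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path):
    """Checksum a file in chunks so large documents don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def replicate(source, destinations):
    """Copy a file to each destination directory and confirm each copy
    matches the original byte for byte."""
    original = sha256(source)
    for dest_dir in destinations:
        dest = Path(dest_dir) / Path(source).name
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, dest)
        if sha256(dest) != original:
            raise RuntimeError(f"Backup at {dest} failed verification")
    return original
```

The checksum step matters: a copy that silently corrupted in transit is exactly the kind of surprise you don't want to discover after the storm.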
"Hurricane season can be an uneasy time, but by planning how you'll protect information now, if disaster does strike, you can focus on running your business rather than trying to pick up the pieces after the fact," said Stevens.
How does your company safeguard its information during a hurricane?
What CFOs are thinking and doing
By Mark Brousseau
Although the economic downturn caused CFOs to concentrate primarily on their steward role, the recovery has reemphasized the need to act as a catalyst and strategist, Jeff Bronaugh, senior manager at Deloitte Consulting LLP, told attendees of the Masters Session at Fusion 2011 at the Gaylord Palms Resort and Convention Center in Florida. Bronaugh and his colleagues Bob Comeau, national service line lead and principal at Deloitte Consulting, and Scott Rottman, principal at Deloitte Consulting, led a highly interactive discussion among attendees of the Masters Session.
During the depths of the credit crisis and recession, CFOs were spending roughly 60 percent of their time in the operator and steward roles, Bronaugh said. The increased time spent in the steward role reduced the amount of time CFOs spent in their preferred role as the strategist in the organization.
But in the wake of considerable capital-market and economic turmoil, CFOs are expected to take on broader and deeper strategic roles, Bronaugh said. CFOs are now routinely in charge of an expansive range of regulatory, governance, and strategy functions, especially investor and public relations, strategic planning, corporate development, and mergers and acquisitions.
As the economy begins to stabilize, focus for North America's top finance executives is shifting back to strategic initiatives, Bronaugh said. He pointed to a survey of CFOs conducted by Deloitte in the first quarter that showed that quality metrics, influencing strategies, and monitoring initiatives were the top three challenges of their finance organizations. CFOs cited strategic ambiguity, major change initiatives and regulatory change as their top three job stresses, Bronaugh said.
Against this backdrop, Bronaugh said there are 10 hot topics for CFOs:
1. Improving business decision support
2. Influencing business strategy and operational strategies
3. Major infrastructure and change initiatives
4. Prioritizing capital investments
5. Regulatory changes
6. Finance operating models
7. Cash is king
8. Managing finance department expectations
9. Finance talent management
10. Taxes
What are the hot topics in your finance department?
Labels:
CFOs,
deloitte,
Mark Brousseau,
recession,
regulations,
TAWPI,
tax,
working capital management
Best operations-improving strategies
By Mark Brousseau
During a pre-conference networking lunch at Fusion 2011 at the Gaylord Palms Resort and Convention Center in Florida, attendees were asked to share the best operations-improving strategy that their accounts payable (AP) department has implemented in the past 12 months. Here are the operations strategies that the luncheon attendees said were the most effective during the past year:
... Provided AP processors with two computer monitors, reducing errors and increasing efficiency
... Took the time to better understand AP processes (became "black belts" in evaluating processes) to weed out the processes that don't add value
... Migrated more payments from paper check and wire transfer to automated clearing house (ACH) transactions
... Separated straight-through and exceptions processors
... Deployed a new enterprise resource planning (ERP) solution
... Deployed a purchasing card program
... Became more open-minded to new ideas
... Started reimbursing via a debit card since some people won't take direct deposit and paper checks are too costly and inefficient
... Began measuring and improving input quality, in turn, increasing AP productivity without changing any processes
... Began e-mailing and faxing remittances to save time and postage associated with paper remittances
... Developed a proprietary travel and expense (T&E) reporting system
... Brought AP functions previously done in India back in-house, resulting in savings of $26,000 a month, largely from fewer mistakes
... Implemented an imaging and workflow solution, reducing processing time and enabling all staff to know where an invoice stands in the approval process
... Standardized on one system and one process whenever possible
... Implemented virtual card payments
... Automated accounts receivable (AR) refunds
... Consolidated various overnight shipping and cellular phone accounts into "master" accounts, allowing the company to negotiate discounts of 18 to 50 percent off list prices
... Automated payroll processing with ACH
... Eliminated duplicate vendors, in turn, eliminating many duplicate payments
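The last item, eliminating duplicate vendors, often comes down to normalizing vendor master records before comparing them, since "Acme Corp." and "ACME Corporation" look distinct to an exact-match system. A minimal sketch (the normalization rules and sample names here are illustrative, not a description of any attendee's system):

```python
import re
from collections import defaultdict

# Common suffix noise that hides duplicates in a vendor master file.
SUFFIXES = re.compile(r"\b(inc|incorporated|llc|ltd|corp|corporation|co|company)\b\.?")

def normalize(name):
    """Reduce a vendor name to a comparable key."""
    key = name.lower()
    key = SUFFIXES.sub("", key)
    key = re.sub(r"[^a-z0-9]+", " ", key)
    return " ".join(key.split())

def find_duplicates(vendors):
    """Group vendor records whose normalized names collide."""
    groups = defaultdict(list)
    for vendor in vendors:
        groups[normalize(vendor)].append(vendor)
    return [records for records in groups.values() if len(records) > 1]

vendors = ["Acme Corp.", "ACME Corporation", "Acme, Inc.", "Beta Supply LLC"]
print(find_duplicates(vendors))
```

Collapsing each collision group to a single vendor record is what closes the door on duplicate payments.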
What is the best operations-improving strategy your AP department has implemented in the past 12 months? Post it below.
Labels:
accounts payable,
ACH,
AP,
AR,
check imaging,
ERP,
Mark Brousseau,
TAWPI,
workflow
Bringing mobile banking to the masses
Posted by Mark Brousseau
The number of unbanked or underbanked mobile subscribers around the world is projected to reach approximately 2 billion by 2012, according to research from Oliver Wyman and PlaNet Finance Group. Today, only around 50 million subscribers use mobile money services. Most of these deployments have focused on first-generation mobile money products such as remittances, airtime top-up, bill payments and loan repayment.
The transformational impact of mobile money is expected to come from second-generation financial services such as micro-savings, micro-credit and micro-insurance, especially in countries with less than 10 percent retail banking penetration, according to Oliver Wyman. Both telcos and financial institutions should benefit from the take-up of these products, as they combine complementary skills and deliver more value to customers, the research firm says.
However, the formula for success is not straightforward.
Two distinct models are emerging:
... The distribution of microfinance through mobile money via existing microfinance banks
... The distribution of microfinance through a virtual microfinance bank, operating as a pure mobile player.
“The benefits of these models include a more than twofold increase in access to banking, 20-50 percent lower operational costs for the microfinance institution and revenue or market share benefits for the Mobile Network Operator,” says Arnaud Ventura, co-founder and vice president of PlaNet Finance Group.
In a report, PlaNet Finance Group and Oliver Wyman conclude:
... Mobile Microfinance can have a significant impact on increasing financial services access for unbanked subscribers by eliminating all the disadvantages of physical bank branches. The benefits of this service are both social and economic.
... It is a cost-effective way for banks and MFIs to reach the masses by capitalizing on the widespread penetration of telecom distribution networks. PlaNet Finance and Oliver Wyman also see a new breed of intermediaries emerging that allow partners on both sides to interact smoothly by playing the “interconnection” role, making money on transactions rather than the spread.
Greg Rung, partner at Oliver Wyman, said: “PlaNet Finance and Oliver Wyman are convinced that, agreeing on a long-term vision, all stakeholders, from banks to distributors to regulators, need to come together to design an adequate offer and build a win-win model that can address all challenges successfully.”
What do you think?
Friday, May 6, 2011
Alligators at Fusion 2011
Posted by Mark Brousseau
Folks arriving this weekend for Fusion 2011 may be surprised to see alligators(!) in the atrium of the Gaylord Palms Resort and Convention Center in Florida.
Labels:
document imaging,
document management,
FUSION,
Gaylord,
IAPP,
ICR,
Mark Brousseau,
Microsoft,
OCR,
page scanning,
TAWPI
Tuesday, May 3, 2011
Unlocking the value of enterprise content management
By Mark Brousseau
It’s one thing to have enterprise content management (ECM) technology; it’s another thing altogether to get value out of it, Gartner analyst Mark Gilbert told attendees at Systemware’s user conference, SWUC 11, last week at the Westin Galleria in Dallas.
Companies seem to be getting Gilbert’s message. Many are focusing like never before on ECM applications that provide more value, he told attendees. This trend is being driven by increased expectations from ECM buyers and users, new demands for faster and richer process management and information delivery, increased archiving, compliance and information governance requirements, and the evolution of social media into a tool targeting supply and value chain management.
“The ECM market is strong, and it is reshaping itself as the technology becomes more adaptive and customers make more demands on it,” Gilbert explained, noting that the ECM market now tops $4 billion a year in sales. He added, “Companies are now relying on ECM to drive business efficiency and achieve better results.”
When it comes to ECM, “ROI matters,” Gilbert said flatly.
Some key elements of the ECM business case that Gilbert identified include:
… Faster, better processes
… Better customer service
… Better, less costly regulatory compliance
… Better management decisions
… Better front-line decisions
… Better teamwork
“When we talk to customers, these things come up time and time again,” he said.
Gilbert offered several tips for ensuring your company meets its business case:
… Build a vision for how ECM can transform and drive your business.
… Survey ECM use-cases in your industry.
… Establish roles and an organizational structure to support your ECM vision.
… Set scope for your ECM initiatives by assessing risk and the value of information assets – across the breadth of the content continuum.
… Leverage existing technology and vendors and determine what you have and how it supports the ideals of information infrastructure.
… Accept the fact that technology alone will not succeed; policies and governance models are critical for long-term value.
What do you think?
Monday, May 2, 2011
AP professionals see benefits to cloud computing
By Mark Brousseau
Accounts payable (AP) professionals see "minimal IT involvement" as the biggest benefit of using Software-as-a-Service (SaaS) or cloud computing for AP processing, according to the findings of the 2011 AP Automation Study by International Accounts Payable Professionals. Nineteen percent of survey respondents identified "no capital investment" as the biggest benefit of cloud computing or SaaS, while 17.5 percent cited "lower cost per invoice" and 14.3 percent identified "fast start-up."
Some 12.7 percent of respondents identified "no software or hardware" as the biggest benefit.
Randy Davis, vice president of sales and marketing operations for eGistics, isn't surprised that these benefits would rank high in the minds of AP staff. "Cloud offerings have always touted minimal IT involvement, no capital investment, fast deployment, and no on-site software as benefits," he notes.
But Davis believes that the ability of cloud-based document processing solutions to remove paper management from AP processing could deliver even greater benefits to AP professionals. "Today's cloud-based AP solutions significantly improve on key usability factors such as electronic capture, structured indexing, search and retrieval, work allocation, data updates and corrections, and audit and tracking -- things that directly contribute to the smooth operation of an AP department," Davis says.
"eGistics believes that business users will increasingly appreciate and accept the benefits of SaaS and cloud computing for critical tasks such as AP processing and management, and that such benefits will soon be taken for granted. At the end of the day, AP departments are looking for solutions that help them do their jobs faster, more accurately and with better accountability," Davis concludes.
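The usability factors Davis lists (electronic capture, structured indexing, search and retrieval, data updates) can be pictured with a small sketch. This is purely illustrative: the class and field names below are hypothetical, not any vendor's actual schema or API.

```python
from dataclasses import dataclass

@dataclass
class Invoice:
    doc_id: str
    vendor: str
    amount: float
    status: str = "received"

class APIndex:
    """Toy index illustrating capture, retrieval, and updates."""
    def __init__(self):
        self.docs = {}        # primary index: doc_id -> Invoice
        self.by_vendor = {}   # secondary index: vendor -> [doc_id]

    def capture(self, inv: Invoice):
        # "electronic capture" + "structured indexing"
        self.docs[inv.doc_id] = inv
        self.by_vendor.setdefault(inv.vendor, []).append(inv.doc_id)

    def retrieve(self, vendor: str):
        # "search and retrieval" by a structured field
        return [self.docs[d] for d in self.by_vendor.get(vendor, [])]

    def update_status(self, doc_id: str, status: str):
        # "data updates and corrections"
        self.docs[doc_id].status = status

idx = APIndex()
idx.capture(Invoice("INV-001", "Acme", 125.00))
idx.capture(Invoice("INV-002", "Acme", 310.50))
idx.capture(Invoice("INV-003", "Globex", 42.00))
idx.update_status("INV-001", "approved")
print(len(idx.retrieve("Acme")))   # 2
```

A real cloud document service adds image storage, audit trails, and access control on top of this kind of structured index, but the retrieval pattern is the same.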
What do you think?
Labels:
AP,
cloud computing,
invoice processing,
invoice scanning,
Mark Brousseau,
SaaS
Friday, April 29, 2011
Document management best practices
Posted by Mark Brousseau
According to the International Association of Administrative Professionals (IAAP), there are more than 4.3 million secretaries and administrative assistants working in the United States. In honor of Administrative Professionals Day, celebrated on April 27, Cintas Corporation offered best practices to help administrative professionals implement a successful office-wide program to manage, maintain and protect confidential business documents.
“During the recession, downsizing has forced all office professionals to come together and work harder in the workplace,” said Marcia Peller, corporate office services manager, Cintas. “Ensuring business information remains secure, yet easily accessible is essential to the success of any business. This is best accomplished through an integrated program that involves all relevant stakeholders.”
Cintas’ best practices include:
1. Create a document retention schedule. All businesses have an abundance of documents and records to maintain, which can often be a daunting task. To maximize space efficiently and increase productivity, work with management and a legal consultant to identify a retention schedule based on legal requirements and internal company policies. Depending on the type of business, proactively learn and implement these retention guidelines to maintain an organized, uncluttered office.
2. Educate and engage employees. Once a retention schedule has been established, educate and train all employees to take a proactive role and follow protocol. Each year, update employees regarding any new legal requirements and encourage them to securely shred any documents that are no longer needed. This will save space to ensure current documents are easily accessible.
3. Store records offsite. If your company has a large volume of records with long retention periods but limited space, consider an offsite storage provider. This will free up space and keep confidential information out of the wrong hands. The ideal provider will offer a secure storage facility equipped with 24-hour security cameras, alarm systems and complete fire protection systems to protect records from catastrophes such as floods and fires.
4. Limit accessibility to records. Only personnel who require job-related access should be authorized to view records. Limiting accessibility is critical as every business retains some degree of confidential information regarding their employees and customers. Such information includes names, addresses, credit card numbers, Social Security numbers and other account information. By enforcing these rules, administrative professionals can greatly reduce the threat of data breaches from employees and other unauthorized sources.
5. Digitally image critical files. Converting paper files and records to electronic documents can help businesses increase productivity, improve processes and ensure compliance with regulatory requirements. From disaster recovery planning, to having immediate access to files, a digital imaging solution helps employees find what they need, when they need it. Consider working with a professional provider that provides secure document imaging and scanning services to gain immediate, real-time access to all critical files.
6. Implement a “shred-all” program. It is important to securely shred all unneeded documents. With identity theft and data breaches on the rise, doing so will protect confidential business data and customers’ sensitive information from falling into the wrong hands. In addition, encourage employees to shred their personal information at work to protect their identity as well. Recommend a shredding service that destroys documents on a scheduled basis. These companies place secure shredding containers in accessible and identifiable locations to make it safe and convenient for all employees to properly shred documents that have reached the end of their useful life. In addition, they will provide a certificate of destruction for a legal audit trail.
7. Create an office recycling program. Ensure that your office or department is doing its part to protect the earth by encouraging and promoting a paper recycling program. Many companies that offer shredding services recycle the paper into secondary paper products, such as paper towels, to reduce the impact on the environment. Recycling paper saves water, reduces greenhouse gas emissions and uses 25 percent less energy than manufacturing paper from trees.
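The retention schedule in practice #1 amounts to a simple lookup: each record type maps to a retention period, and a record's disposition follows from its age. The record types and periods below are made-up examples; actual periods must come from legal counsel and company policy, as the text says.

```python
from datetime import date

# Illustrative retention periods only -- not legal guidance.
RETENTION_YEARS = {
    "invoice": 7,
    "contract": 10,
    "correspondence": 3,
}

def disposition(record_type: str, created: date, today: date) -> str:
    """Return 'retain' or 'shred' for a record under the schedule."""
    years = RETENTION_YEARS[record_type]
    expiry = created.replace(year=created.year + years)
    return "retain" if today < expiry else "shred"

print(disposition("invoice", date(2003, 5, 1), date(2011, 4, 27)))   # shred
print(disposition("contract", date(2005, 5, 1), date(2011, 4, 27)))  # retain
```

Even a table this simple makes practice #6 (a "shred-all" program) enforceable: anything past its expiry date is flagged for scheduled destruction rather than left to individual judgment.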
“Administrative professionals work hard throughout the year to ensure their offices operate as efficiently as possible,” said Brittney Kirk, marketing associate, Cintas Document Management. “This Administrative Professionals Day, we want to recognize their efforts and provide them with best practices to help them securely protect and store confidential information to ensure business success.”
What do you think?
Wednesday, April 20, 2011
5 Questions For …
Shayne Magee, director, Client Solutions, Diversified Information Technologies
When you talk to prospects, what do they tell you is their biggest document processing challenge, and why?
Our prospects and clients typically have many customers, and those relationships require efficient management of inbound and outbound documents and data. The biggest challenge has been finding a partner with a complete solution, one that seamlessly handles capture, output, processing, and preservation in increasingly compliance-centric environments.
What is your company doing to address this challenge?
Diversified is continuing to develop our virtual mailroom and information lifecycle management solutions. Our solutions can be combined and interfaced seamlessly with our clients' infrastructure and systems. All of our offerings are specifically designed to deploy quickly and solve this previously unmet industry need for a single-source solution.
Additionally, we have been adding integrated document facilities across the country to support the requirements of our financial, healthcare, enterprise and government clients. We added five in the last 12 months and continue to invest in quality programs and certifications to support our clients' needs. Currently, Diversified holds the following certifications: NARA, ISO 9001, SAS 70 Type II, HIPAA, and, most recently, NAID.
What do you believe will be the major storyline in document processing over the next 12 months, and why?
We are in the middle of a swiftly moving trend toward SaaS technology, which allows organizations to collaborate and communicate in real time through streamlined processes that in many cases eliminate previous steps, antiquated systems, and documents. We believe SaaS-deployed applications, combined with the mobile Internet tsunami, will be the major ECM storyline over the next 12 months.
What’s the most interesting thing in the document processing space that you’ve read about recently (that wasn’t put out by your own company)?
Some new data points from recent AIIM research have been very interesting regarding the change in paper-based policies in a Facebook era. They forecast an evolution from systems of record to systems of engagement, and potentially the end of email, wet signatures, and paper-based transactions.
What do you think?
Monday, April 18, 2011
Healthcare data management
Wendi Klein, Director of Marketing & Communication, North America, A2iA
Since reform and regulation have stirred the industry, it has become an even more complex environment, though the goal is to streamline processes. In the wake of healthcare reform, healthcare IT has been forced to comply with new regulations, and healthcare IT needs have shifted. The U.S. government has set forth dates and deadlines by which providers and payors must meet certain milestones, placing emphasis on obtaining meaningful use of patient data, the availability and recovery of data to increase productivity and enhance patient care, as well as the industry’s transition from ICD-9 to ICD-10.
Given this complex and changing environment, healthcare IT providers must focus on implementing solutions that will meet users’ needs today and in the future, while maximizing existing spending to deliver an ROI. But with so many vendors trying to make a name for themselves, how can one stand out from the competition yet still deliver technology that meets government mandates?
CCHIT certified solutions, for example, are becoming more and more common since this is how the Department of Health and Human Services deems a system a "qualified EMR." However, many CCHIT solutions today still require manual document sorting and data entry because of the complex nature of healthcare documents. Hospitals and clinics alike are looking for ways around this, as it is no secret that manual document handling is a time consuming and expensive task, and even allows for breaches in privacy with the involvement of third-parties.
By partnering with technology companies that provide advanced indexing and data lifting capabilities, CCHIT certified solutions can address these pain points by removing the human interaction and allowing for higher levels of productivity, consequently differentiating themselves from the competition. By allowing complex and even handwritten documents such as provider notes, clinical documentation, lab results or prescriptions to enter the workflow, automatic routing to EHR, EMR or PM solutions can occur, and the data can be automatically located and lifted. Tangible results are seen almost immediately, and the CCHIT solution stands out from seemingly similar applications by providing a greater level of automation for all documents, regardless of their type or complexity.
Once these complex documents are incorporated into the EHR, EMR or PM solution, the next steps, like coding and billing, can occur. According to a recent study, between 5 and 15 percent of a coder’s time is spent reading health information, and 50 percent of a record clerk’s time is spent looking for information. ICD-9 is currently an accepted set of codes to be used for reporting diagnoses and procedures on healthcare transactions, although it must be replaced by ICD-10 no later than October 2013.
Because ICD-10 contains nearly five times as many codes and sub-codes, the conversion from ICD-9 to ICD-10 is predicted to decrease productivity by at least 25 percent for three to six months after the transition as coders adjust to the new methodology, and to become a more costly endeavor than Y2K in terms of both time and money.
Although the deadline to transition to ICD-10 is not until 2013, many have already started to look for solutions to counteract the predicted loss of efficiency. Computer Assisted Coding, or CAC solutions, can help, but many are still asking, “How will coders remain productive as they learn the new codes and sub-codes so that providers can submit and receive payments, and payors can process claims, at the same level of accuracy and speed that they are today?”
Because of these fears and the anticipated decrease in efficiency, there is a large opportunity for technology that can aid in the transition process. Newer, advanced solutions can bring greater levels of automation, help to increase processing speed and accuracy, and even save money. By enhancing CAC solutions with capabilities that can automatically locate and lift medical terms and diagnoses from both printed and handwritten documents, such as provider notes and clinical documentation, the research process is sped up and manual labor decreased as codes are automatically assigned. All coded documents can then be indexed and routed with virtually no human interaction to the appropriate EHR, EMR or PM solution, speeding productivity, improving automation, and aiding in the research and coding process.
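At its core, the computer-assisted coding idea described above is term spotting: lift recognized medical terms from free text and propose candidate codes for a coder to confirm. The sketch below is a deliberately tiny illustration of that idea; the term-to-code table is an invented fragment, not a real ICD-10 mapping, and production CAC systems use far more sophisticated language processing.

```python
# Hypothetical term-to-code fragment for illustration only.
TERM_TO_CODE = {
    "type 2 diabetes": "E11.9",
    "hypertension": "I10",
    "asthma": "J45.909",
}

def propose_codes(note: str) -> list[str]:
    """Return candidate codes for terms found in a clinical note."""
    text = note.lower()
    return [code for term, code in TERM_TO_CODE.items() if term in text]

note = "Patient presents with hypertension; history of type 2 diabetes."
print(propose_codes(note))
```

The value during an ICD-9 to ICD-10 transition is that the lookup table can be swapped or expanded without retraining coders from scratch; the human reviews proposed codes instead of recalling five times as many from memory.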
Healthcare data management is a complex world, and no one knows what changes are on the horizon. Current solutions can certainly aid in productivity, but combining them with the capabilities of newer, advanced technology, today’s pain points can be lessened, automation improved, and tomorrow’s fears calmed.
What do you think?
Thursday, April 14, 2011
4 ways to use your customers to boost innovation
Posted by Mark Brousseau
The best technology. The best employees. The biggest budget. The strongest R&D department. Check, check, check, and check! If you think these are all the elements you need in order to build a consistently successful company, you’re wrong. Dan Adams says there is one other factor you’ll need to check off that list—an innovation strategy that works.
“The best way to ensure your company will be a success is to deliver more than your share of customer value,” says Adams, author of New Product Blueprinting: The Handbook for B2B Organic Growth. “Specifically, you need to develop differentiated products that provide benefits your customers crave. Products they can’t get anywhere else at a comparable cost. But you shouldn’t be guessing what they want. You should base your product innovation on what they say they want.”
Adams notes that back in 2007, Booz Allen Hamilton released an important study on innovation called “The Customer Connection: The Global Innovation 1000.” The company studied 84 percent of the planet’s corporate R&D spending and identified several distinct innovation strategies.
Most importantly, says Adams, the study highlighted one essential element of successful innovation that too many companies forget. Your employees aren’t the only people you should be engaging to create truly unique and profitable products. You should actually be focusing your efforts on engaging your customers!
The Booz Allen Hamilton study found that when it comes to innovation, customer engagement has a huge payoff. It noted, “Companies that directly engaged their customers had superior results regardless of innovation strategy.”
“And not just a little bit superior, a lot superior,” says Adams. “Those companies that used direct customer engagement while innovating versus indirect customer insight enjoyed great financial gains.”
In fact, the study found that the companies that based their innovation strategies on customer feedback experienced gains in the following key areas:
1) Profit Growth: Operating income growth rate that was three times higher.
2) Shareholder Return: Total shareholder return that was 65 percent higher.
3) Return on Assets: Return on assets that was two times higher.
“What should you do with this information?” asks Adams. “For starters, if you’re in a conversation about your company’s innovation and nobody’s talking about the customer, realize something might be very wrong. To put it in terms of the study, your company might be practicing ‘indirect customer insight’ instead of ‘direct customer engagement.’ This is a kind way of saying, ‘We’ve lost track of who our innovation is supposed to help.’”
If you think your company needs some innovation help, read on for a few words of advice from Adams.
Take it to the next level. For more than five years, Adams has been helping B2B suppliers engage their customers in the innovation process. In that time, he has almost seen it all! And he’s used what he’s seen to distinguish six levels of customer engagement during product development. What’s your level?
Level 1: Our Conference Room: At the lowest level, you decide what customers want around your conference room table. Internal opinions determine the design of your next new product.
Level 2: Ask Our Experts: At the next level, you poll your sales force, tech service department, and other internal experts to determine customer needs. Better—because more voices are heard—but still too “internal.”
Level 3: Customer Survey: Here you use surveys and polls to ask customers what they want. This begins to shake out internal biases…but doesn’t deliver much in the way of deep insight.
Level 4: Qualitative VOC Interviews: You send out interview teams that meet with customers to learn what they want. This is a quantum leap from VOO (voice of ourselves) to VOC (voice of the customer).
Level 5: Quantitative VOC Interviews: The problem with just qualitative VOC is that people hear what they want to hear. Quantitative feedback drives out assumptions, bias, and wishful thinking.
Level 6: B2B VOC Interviews: Unlike end-consumers, B2B customers are knowledgeable, rational, and interested. B2B-optimized interview methodology fully engages them to take advantage of this.
“If you aren’t happy with your level, don’t worry,” says Adams. “Through solid training and committed leadership, I’ve seen businesses leap from Level 1 to 6 in the space of a year.”
Remember who’s showing you the money. A successful company innovates for its customers, not itself. “That’s because nobody inside your company can pay for innovation,” notes Adams. “Only your customers can do that. So the more closely you engage those who pay…the more you learn what they’ll pay for.”
Make sure you’re asking the right questions. Too often, innovation is misunderstood as the process of coming up with the right answers. “The reality is that it is actually about asking the right questions,” explains Adams. “If the bright people in your company are focused on real customer needs, they’ll run circles around the bright people at competitors who are focused elsewhere.”
Learn to pre-sell. “I believe the Booz Allen Hamilton conclusions are especially potent for the B2B supplier serving a concentrated market,” says Adams. “If you interview the ten largest prospects in your target market correctly, you’ll engage them so they’ll be primed to buy when you launch that new product.”
“So the bottom line is if you want to boost your innovation, you should start by directly engaging your customers,” says Adams. “And do this in a way that allows you to understand their world, focus on their important, unsatisfied needs, and entice them to keep working with you.
“This innovation strategy is great because you are removing the guessing game aspect of new product development,” he concludes. “You won’t have to worry about whether or not your customers will like your new products because you’ll already know you are delivering exactly what they want.”
Labels: customer service, Mark Brousseau, product management, TAWPI
Wednesday, April 13, 2011
NAPCP after hours
Posted by Mark Brousseau
A breathtaking view of the Las Vegas strip tonight from the Eiffel Tower Experience at the Paris Hotel after hours at the NAPCP Commercial Card and Payment Conference.
Tuesday, April 12, 2011
Interest in p-cards still going strong
By Mark Brousseau
Want more proof of the continued strength of purchasing cards (p-cards) as a key component of accounts payable (AP) programs? Look no further than this week’s NAPCP Commercial Card and Payment Conference at the Paris hotel in Las Vegas.
Some 657 people – representing 265 end-user organizations and 85 provider organizations – are in attendance at this year’s NAPCP event, up from 616 attendees last year (although still down from the 850 people that attended NAPCP’s event in 2008). Overall, 54 percent of the attendees are from end-user organizations and 46 percent of the attendees are from provider organizations, such as banks.
Among the end-users in attendance, 51 percent describe their experience level as “advanced,” while another 39 percent say their experience level is “intermediate.” Only 10 percent of attendees at this year’s NAPCP event say they are “beginning” with p-cards – further proof of the growing maturity of p-cards. In terms of the sectors represented, 61 percent are from corporations, while 23 percent are from government or primary education and 16 percent are from higher education.
By far, the hottest topic among attendees is how to grow their existing p-card programs to further reduce costs, increase productivity, and earn rebates. Many attendees also are looking for ways to better integrate p-cards with their purchase-to-pay (P2P) initiatives (it seems many more P2P professionals are in attendance).
Labels: accounts payable, AP, AR, commercial card, Mark Brousseau, NAPCP, p-cards, purchasing cards, TAWPI
Selling top execs on records management
By Mark Brousseau
Every organization, regardless of industry, needs to have a records management policy, and they must have a records manager. Nonetheless, it can be an uphill climb convincing top managers to embrace records management, Kevin Joerling, senior project manager, records management, Perceptive Software, said yesterday at the company’s Inspire 2011 user conference at the Wynn in Las Vegas.
Records and information management is defined as the systematic control of records and information throughout their lifecycle, encompassing creation, use, storage, retention, and disposition. “Retention is where we find many companies are not doing a very good job – knowing how long to keep documents,” Joerling said.
And this is an area where companies can waste a lot of money, Joerling said: for every $1 spent on disk storage, another $3 to $8 is spent on managing that storage, he explained. “In some cases, companies don’t even realize this,” he said.
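The multiplier Joerling cites adds up quickly. A quick worked example, using an assumed disk budget of $50,000 purely for illustration:

```python
# Illustration of the storage-management cost multiplier cited above:
# for every $1 spent on disk storage, $3-$8 more goes to managing it.
def total_storage_cost(disk_spend: float, mgmt_multiplier: float) -> float:
    """Disk spend plus the management overhead it drags along."""
    return disk_spend + disk_spend * mgmt_multiplier

# A $50,000 disk budget at the low (3x) and high (8x) ends:
print(total_storage_cost(50_000, 3))  # 200000.0
print(total_storage_cost(50_000, 8))  # 450000.0
```

In other words, a $50,000 storage line item can represent $200,000 to $450,000 in true cost, which is why retention policies that delete what no longer needs keeping pay for themselves.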
Joerling said records and information management is more important than ever because it reduces storage costs; organizes information for quick retrieval; facilitates litigation risk avoidance; helps protect information assets; and ensures compliance with recordkeeping laws and regulations. To this last point, Joerling said records management is a key part of an organization’s commitment to risk mitigation.
“Liability lawsuits are often decided on the basis of old records,” Joerling explained. What’s more, the loss of records can have more devastating consequences than the loss of a plant, Joerling said, noting that some companies based in the World Trade Center during 9/11 went out of business because they didn’t have backup records.
So why don’t more top execs embrace records management?
For starters, most top execs don’t understand records management. “Records management is not mainstream yet,” he said. “It’s not taught in too many colleges or universities, so business managers coming out of school don’t understand it.” Many organizations also don’t have a records management professional. “Who’s going to be that champion in your company to go to senior management and say ‘We need to look into this because we could get in trouble by not doing it?’” Joerling asked.
Additionally, records and information management benefits can be difficult to quantify, and with budgets tightened by the recession, that makes senior management even harder to persuade.
Joerling offered tips for selling top execs on the need for records management:
1. Describe how records management will solve issues facing your organization.
2. Propose a recommended solution, whether it’s hiring a records manager or records management consultant.
3. Detail what will happen if a records management program is not undertaken.
4. Explain when the records management program will be deployed, and how much money, how much time and how many people will be needed for the program.
5. Keep the discussions at a high level and targeted to c-level core concerns, such as how the program ties into the company’s strategic plan.
“It’s an uphill battle because you’re dealing with something that a lot of executives don’t understand and don’t recognize why it needs to be done,” Joerling admitted. But with the growing importance of records and information management, it’s critical that document professionals convince top execs on the need for a program.
What do you think?
Consolidating patient records
By Mark Brousseau
As they move to electronic medical records (EMRs), one challenge for healthcare providers is how to consolidate electronic access to all of a patient’s documents.
Citizens Memorial Healthcare, which is made up of a hospital, 25 clinics, five long-term care facilities and a cancer center, has licked this problem by using ImageNow from Perceptive Software to tie together documents not captured in its Meditech 5.6 EMR system, including EKGs, radiology reports, outside lab results, wound and surgical photos and registration photos. This integration has allowed Citizens Memorial Healthcare to create a shared medical record across its enterprise.
In its EMR environment, regardless of visit, a patient’s documents are organized under a common number. To access documents related to an account that were not captured at the point-of-service by the EMR, the provider launches ImageNow in the background, allowing documents from either application to be displayed in the EMR system. To speed retrieval, ImageNow indexes documents in several categories; for instance, physicians can view patient diagnostics without having to look through registration documents. Documents also can be retrieved across accounts.
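The retrieval scheme described above, where documents are filed under a common patient number and a category so a physician can pull diagnostics without wading through registration paperwork, can be sketched as a simple two-key index. The names below are illustrative only, not ImageNow's actual API.

```python
# Minimal sketch of category-based document indexing: documents are
# filed under (patient number, category) so retrieval by category is
# a direct lookup. Illustrative only; not any vendor's real interface.
from collections import defaultdict

index: defaultdict = defaultdict(list)  # (patient_no, category) -> docs

def file_document(patient_no: str, category: str, doc: str) -> None:
    """File a document under a patient number and a category."""
    index[(patient_no, category)].append(doc)

def retrieve(patient_no: str, category: str) -> list:
    """Pull only the documents in one category for one patient."""
    return index[(patient_no, category)]

file_document("P1001", "diagnostics", "EKG 2011-04-10")
file_document("P1001", "registration", "Intake form")
print(retrieve("P1001", "diagnostics"))  # ['EKG 2011-04-10']
```

The point of the second key is exactly what the article describes: a diagnostics query never touches the registration documents at all.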
Electronic documents are created at the point-of-registration or via scanning later.
For primary care physicians who don’t have a way to get their EMR records into Citizens Memorial Healthcare’s system, ImageNow provides the ability to “scrape” data off incoming faxes. As faxes arrive at Citizens Memorial Healthcare, they are placed in a workflow queue where an employee triggers the technology.
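The fax-intake flow described above, where arrivals queue up until an employee triggers the scrape, can be sketched as a simple work queue. The scrape function below is a stand-in for real OCR/data extraction, and all names are hypothetical, not ImageNow's actual interface.

```python
# Hedged sketch of the fax-intake workflow: incoming faxes land in a
# queue, and an employee-triggered step "scrapes" the oldest one.
# scrape() is a placeholder for real OCR; names are illustrative.
from collections import deque

fax_queue: deque = deque()

def receive_fax(image_ref: str) -> None:
    fax_queue.append(image_ref)  # fax arrives, waits for review

def scrape(image_ref: str) -> dict:
    # Placeholder for OCR/data extraction on the fax image.
    return {"source": image_ref, "fields": {}}

def process_next():
    """Employee-triggered step: take the oldest fax and scrape it."""
    if not fax_queue:
        return None
    return scrape(fax_queue.popleft())

receive_fax("fax_0001.tif")
result = process_next()
print(result["source"])  # fax_0001.tif
```

Keeping a human trigger in the loop, rather than scraping automatically on arrival, matches the article's description and gives staff a chance to reject junk faxes before extraction runs.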
“Our long-term goal is to have a true interface with Meditech for these documents. But we don’t have that in place now. ImageNow has provided that availability,” Tim Roberts, IS specialist, Citizens Memorial Healthcare, said yesterday during a presentation at Perceptive Software’s Inspire 2011 user conference in Las Vegas.
Inspire 2011 kicks off in Las Vegas
By Mark Brousseau
More than 900 document automation professionals have descended upon Las Vegas this week for Perceptive Software’s Inspire 2011 user conference at the Wynn.
This is the fifth year that Perceptive Software has held its Inspire event.
The attendance at this year’s Inspire user conference is a record, Jeremy McNeive, Perceptive Software’s public relations manager told me yesterday, topping the crowd of about 700 that attended Inspire 2010 in Kansas City. McNeive attributes the growth to the improving economy, the continuing pressure within organizations to improve efficiency, and the value that end-users perceive from the content. For instance, yesterday’s keynote address on the state of the company by Perceptive Software President and CEO Scott Coons was “very well received,” McNeive said.
Inspire 2011 includes more than 60 educational sessions (more than ever before at an Inspire event), with learning tracks dedicated to higher education, healthcare, financial services, the back-office, and product and platform information. There also is a large resource center where end-users can try out Perceptive Software products and get answers to technical and product questions from the company’s experts.
Eight of Perceptive Software’s partners also are exhibiting at the event: Lexmark (Perceptive Software’s parent company), Fujitsu, Napersoft, Scanning America, CSP Group, Docucon, Global Information Distribution, and HyBridge Solutions. Many of these partners provide various components for Perceptive Software’s solutions.
One of the hot topics among attendees is the anticipated release next year of ImageNow 6.7, which will offer “a little bit of everything” for end-users, McNeive said, including records information management and foldering capabilities. It will help end-users move towards collaboration in an even bigger way, McNeive added.
“There is lots of energy here and lots of excitement,” McNeive concluded.