Tuesday, May 31, 2011
What is your top AP challenge?
By Mark Brousseau
An eye-popping 88 percent of attendees at a recent accounts payable (AP) forum sponsored by Ricoh identified manual data entry as their top AP challenge, providing further evidence of the need for automated invoice processing solutions. Manual data entry came in a whopping 25 percentage points higher than the second-biggest challenge identified by attendees at the Houston event.
Routing invoices for approval was identified as a top AP challenge by 63 percent of forum attendees, while 54 percent stated that resolving errors and exceptions was among their top challenges. Lost or missing invoices (42 percent) and overall payment costs (29 percent) rounded out the list of top AP challenges identified by forum attendees.
Thursday, May 26, 2011
Avoid these document imaging mistakes!
By Mark Brousseau
Despite the lousy economy, document imaging solutions continue to enjoy strong adoption among organizations of all sizes. No wonder: the technology is proven to deliver tremendous operational and business benefits, including lower processing costs, streamlined storage and retrieval, and better information tracking and reporting.
But even the strongest business case for document imaging can be undermined by crucial errors during system deployment, says Brett Rodgers (brodgers@ibml.com), manager, Solution Consulting, Americas, at ibml (www.ibml.com), a Birmingham, AL-based document imaging solutions provider.
If you want to keep your document imaging business case on track (and who doesn't?), Rodgers suggests avoiding the following 10 all-too-common foul-ups during system deployment:
1. Incorrect sizing of the necessary number of document scanners.
2. Not including all stakeholders (business and IT) in the requirements definition.
3. Buying technology without first conducting a proof of concept.
4. Making decisions on front-end and back-end software separately.
5. Not coordinating software and hardware vendors during system deployment.
6. Not using a phased implementation approach (biting off too much at once).
7. Letting "fear of change" take over.
8. Not thinking LEAN.
9. Not cutting the paper cord.
10. Not "sharing" -- as in utilizing shared services.
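Mistake No. 1 above is ultimately a throughput calculation, and it's worth doing on the back of an envelope before signing a purchase order. The sketch below is illustrative only: the volumes, rated speed, and utilization factor are assumptions, not figures from Rodgers or ibml.

```python
import math

def scanners_needed(daily_pages, pages_per_hour, shift_hours, utilization=0.7):
    """Estimate how many scanners a site needs for a given daily volume.

    `utilization` discounts the vendor's rated throughput for jams,
    document prep, operator breaks, and rescans.
    """
    effective_capacity = pages_per_hour * shift_hours * utilization
    return math.ceil(daily_pages / effective_capacity)

# Hypothetical site: 250,000 pages/day, scanners rated at 20,000
# pages/hour, one 8-hour shift, 70 percent real-world utilization.
print(scanners_needed(250_000, 20_000, 8, 0.7))  # → 3
```

Sizing on the rated speed alone (utilization of 1.0) would suggest two scanners here; the discounted figure shows why undersizing is such a common foul-up.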
What was your biggest mistake when deploying document imaging?
Wednesday, May 25, 2011
Cost versus value – who wins?
By Laura Knox, inside sales team leader, DataSource Mobility
In today’s ever-challenging economy it’s no surprise that we see technology customers focusing more and more on cost ... and while I am all about getting a bargain and finding the right solution at the right price for my clients, I often find myself explaining that cost does not always equal value.
Much more goes into the concept of value than the upfront purchase price of any solution. You have to think about potential downtime if the equipment breaks, repair costs, replacement costs if a device fails or your workers refuse to use it because of poor performance, upgraded warranty fees (most low-cost solutions come with little or no warranty coverage), and the time IT staff must spend keeping the devices working properly. So, if we are looking at overall value rather than upfront price, the emphasis moves from simply finding something cheap to finding something that is high quality.
Now, most people with a healthy knowledge of IT matters already understand that inferior parts, inferior quality, and a lack of service are what make possible the attractively low price point of generic industry devices, and they are very anxious not to get stuck supporting devices that will need constant attention and repair. But try explaining this to a person without IT experience who is tasked with finding a top-quality solution at a “bargain bin” price, and things get tricky.
So, for those of us who are not IT aficionados but need to make smart decisions for the companies we own or work for, the question becomes: how do I tell a high-quality device from all the other options? Below is a list of questions that I strongly encourage these folks to ask before purchasing any equipment from a potential vendor.
$ vs. ROI vs. TCO
1) What is it made out of?
2) What type of service and support is included in the cost being quoted (and what will you have to pay extra for)?
3) What is the typical lifespan of the device?
4) Are parts and labor outsourced or does the manufacturer actually make the product?
5) Has it passed any level of rugged certification?
6) What is the typical failure rate for the device?
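The answers to those questions feed directly into the "$ vs. ROI vs. TCO" distinction. A rough way to make total cost of ownership concrete is the sketch below; every number in it is hypothetical, chosen only to show how a cheaper sticker price can lose once repairs, support, and failures are counted.

```python
def total_cost_of_ownership(purchase_price, annual_repair, annual_support,
                            failure_rate, replacement_cost, years):
    """Rough TCO: purchase price, plus recurring repair/support costs,
    plus the expected cost of replacing failed units over the period."""
    recurring = (annual_repair + annual_support) * years
    expected_replacements = failure_rate * years * replacement_cost
    return purchase_price + recurring + expected_replacements

# Hypothetical comparison: a $600 "bargain bin" device with high repair
# and failure costs versus a $1,200 rugged device with service included.
cheap  = total_cost_of_ownership(600, 150, 100, 0.30, 600, 5)
rugged = total_cost_of_ownership(1200, 20, 0, 0.05, 1200, 5)
print(cheap, rugged)  # cheap ≈ $2,750 over five years; rugged ≈ $1,600
```

Under these assumed numbers, the device that cost twice as much up front is the cheaper one to own, which is exactly the point about overall value versus upfront value.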
What do you think?
Tuesday, May 24, 2011
Fusion attendees meet Kevin Nealon
Posted by Mark Brousseau
Customers and prospects of Brainware -- sponsor of the Fusion 2011 Wednesday night reception -- had an opportunity to meet actor and comedian Kevin Nealon before his performance.
Click below to see photos from the meet and greet.
http://www.iappnet.org/photo/fusion2011_vip/index.html
Enterprises will adopt layered fraud prevention techniques
Posted by Mark Brousseau
By 2014, 15 percent of enterprises will adopt layered fraud prevention techniques for their internal systems to compensate for weaknesses inherent in using only authentication methods, according to Gartner, Inc.
Gartner analysts say no single layer of fraud prevention or authentication is enough to keep determined fraudsters out of enterprise systems. Multiple layers must be employed to defend against today's attacks and those that have yet to appear.
"Malware-based attacks against bank customers and company employees are levying severe reputational and financial damage on their victims. They are fast becoming a prevalent tool for attacking customer and corporate accounts, and stealing sensitive information or funds," said Avivah Litan, vice president and distinguished analyst at Gartner. "Fighting these and future types of attacks requires a layered fraud prevention approach."
Litan explained that while the layered approach to fraud prevention tries to keep the attackers from getting inside in the first place, it also assumes that they will make it in, and that multiple fraud prevention layers are needed to stop the damage once they do. She said that no authentication measure on its own, especially when communicating through a browser, is sufficient to counter today's threats.
Gartner breaks down fraud prevention into five layers:
Layer 1
Layer 1 is endpoint-centric, and it involves technologies deployed in the context of users and the endpoints they use. Layer 1 technologies include secure browsing applications or hardware, as well as transaction-signing devices. Transaction-signing devices can be dedicated tokens, telephones, PCs and more. Out-of-band or dedicated hardware-based transaction verification affords stronger security and a higher level of assurance than in-band processes do. The technologies in this layer can typically be deployed faster than those in subsequent layers and go a long way toward defeating malware-based attacks.
Layer 2
Layer 2 is navigation-centric; this monitors and analyzes session navigation behavior and compares it with navigation patterns that are expected on that site, or uses rules that identify abnormal and suspect navigation patterns. It's useful for spotting individual suspect transactions as well as fraud rings. This layer can also generally be deployed faster than those in Layers 3, 4 and 5, and it can be effective in identifying and defeating malware-based attacks.
Layer 3
Layer 3 is user- and account-centric for a specific channel, such as online sales; it monitors and analyzes user or account behavior and associated transactions and identifies anomalous behavior, using rules or statistical models. It may also use continuously updated profiles of users and accounts, as well as peer groups for comparing transactions and identifying the suspect ones.
Layer 4
Layer 4 is user- and account-centric across multiple channels and products. As with Layer 3, it looks for suspect user or account behavior, but it also offers the benefit of looking across channels and products and correlating alerts and activities for each user, account or entity.
Layer 5
Layer 5 is entity link analysis. It enables the analysis of relationships among internal and/or external entities and their attributes (for example, users, accounts, account attributes, machines and machine attributes) to detect organized or collusive criminal activities or misuse.
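To make the "rules or statistical models" of Layer 3 concrete, here is a deliberately minimal sketch: a z-score check that flags a transaction whose amount deviates sharply from an account's own history. The account data is invented, and real enterprise fraud engines use far richer profiles and models than this; it illustrates the shape of the idea, not Gartner's or any vendor's implementation.

```python
from statistics import mean, stdev

def is_anomalous(history, amount, threshold=3.0):
    """Toy Layer 3 check: flag `amount` if it sits more than `threshold`
    standard deviations from the account's historical mean."""
    if len(history) < 2:
        return False  # not enough history to profile the account yet
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu  # perfectly uniform history: any change is suspect
    return abs(amount - mu) / sigma > threshold

# Hypothetical account that usually transacts around $50-$60:
history = [52.0, 48.5, 55.0, 60.0, 51.5, 49.0]
print(is_anomalous(history, 54.0))    # typical amount → False
print(is_anomalous(history, 5000.0))  # sudden large transfer → True
```

Layers 4 and 5 extend the same profiling idea across channels and across linked entities, which is why Litan describes them as the slower layers to build.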
Litan said that, depending on the size and complexity of the end-user institution, implementing the systems that support a layered fraud management framework can take at least three to five years, especially when it comes to the upper layers — Layers 3, 4 and 5. These efforts are continuous, because fraud prevention rules and models require ongoing maintenance, tuning and care.
"Organizations don't have years to wait to introduce fraud prevention while malware-based attacks proliferate. We recommend starting with the first layer of this fraud prevention framework, as well as the second layer, resources permitting, since these can be deployed relatively quickly," says Litan. "Enterprises that start by deploying lower levels of the layered stack can help to stave off immediate threats, with the assurance that these layers are part of an overall strategy that relies on basic fraud prevention principles, such as user and account profiling that have generally stood the test of time."
What do you think?
Monday, May 23, 2011
Where’s the automation?
By Mark Brousseau
Despite revenues in the billions of dollars and the document volumes inherent to that scale of operation, many—possibly even most—companies have not made the leap to automated data capture technology for invoice processing, a proven driver of efficiency and value in accounts payable (AP).
That’s the key takeaway of a survey of attendees of Fusion 2011, held May 8-12 at the Gaylord Palms Resort and Convention Center near Orlando, Florida. The survey polled AP professionals around the globe, working in numerous industries and for organizations ranging from less than $500 million to well in excess of $10 billion in annual revenues. It was conducted by The Institute of Financial Operations and sponsored by Brainware. Fusion 2011 brought together more than 1,800 financial operations professionals and 170 exhibiting companies.
With an increased focus on working capital management, many AP professionals are emphasizing a need for greater visibility into and reporting of invoice processing—a demonstrated strength of available data capture and extraction technologies such as optical character recognition (OCR) and intelligent document recognition (IDR). That’s what makes these survey findings so surprising.
More than half of the survey respondents (56.3 percent) indicated that their AP organization doesn’t use automated data capture technology. And, only 3.1 percent of respondents stated that their AP organization plans to implement automated data capture within the next six months, while 6.3 percent stated their AP organization plans to implement the technology within the next 12 months.
Why aren’t AP departments making greater use of automated data capture and extraction?
Tight capital budgets are undoubtedly a factor. But AP departments also may not see the need.
Despite their lack of data capture technologies, most of the respondents to the survey are doing a pretty good job of holding the line on invoice processing costs. A plurality of respondents (41.9 percent) indicated that their average invoice processing costs have not changed over the past 12 months, while 38.7 percent of respondents stated their invoice processing costs have dropped slightly. Only 12.9 percent of respondents indicated that their average invoice processing costs have increased either slightly (9.7 percent) or significantly (3.2 percent) over the past 12 months.
Similarly, a plurality of respondents (40 percent) indicated that their average cost to process an invoice is between $2 and $5 – in line with the costs published in surveys by industry research firms. Some 16.7 percent of respondents said their average invoice processing costs are less than $2.
But the survey results show that many AP departments could benefit from labor-saving technologies such as automated data capture. More than a quarter of respondents (26.7 percent) pegged their average invoice processing costs between $5 and $10. Worse, 13.4 percent of respondents stated their average invoice processing costs are between $10 and $20, while 3.3 percent reported an eye-popping average of $20 to $25 per invoice.
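Folding those buckets together gives a rough sense of the average cost across all respondents. The shares below come from the survey as reported; the bucket midpoints are my own assumptions for the sake of the arithmetic, so treat the result as an illustration rather than a survey finding.

```python
# Survey shares are from the article; the dollar midpoints are assumed.
buckets = [
    (1.50, 0.167),   # under $2 (midpoint assumed)
    (3.50, 0.400),   # $2 to $5
    (7.50, 0.267),   # $5 to $10
    (15.00, 0.134),  # $10 to $20
    (22.50, 0.033),  # $20 to $25
]
total_share = sum(share for _, share in buckets)
weighted_avg = sum(mid * share for mid, share in buckets) / total_share
print(f"Rough average cost per invoice: ${weighted_avg:.2f}")  # ≈ $6.40
```

Even with generous midpoints, the blended figure lands well above the $2-to-$5 range the best-performing plurality reports, which is the gap automation vendors are pitching against.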
“Among other findings, more than a third of respondents claim it still takes them more than twelve days to process an invoice, inhibiting their ability to take early payment discounts, creating backlogs, and often necessitating increased headcount,” notes Charles Kaplan, vice president of sales and marketing at Brainware. Twenty-five percent of respondents stated it takes their organization more than 15 days to pay invoices. “Automated data capture solves those problems and many others.”
To this point, a plurality of respondents (32.3 percent) believe that “better visibility and reporting” is the biggest benefit of the technology, followed by “faster turnaround” (29 percent), “lower costs” (12.9 percent), “better working capital management” (12.9 percent), and “fewer errors” (9.7 percent). Only 3.2 percent of survey respondents stated that they see “no benefit” to automated data capture.
The bottom line is that despite all the hype about automating invoice processing with data capture technology, vendors have a long way to go in convincing AP departments to deploy it.
What do you think?
Tuesday, May 17, 2011
The alligator and the vendor
Posted by Mark Brousseau
ibml Business Solution Consultant Curtis Williams sees an alligator up-close at Celebration, Florida, during downtime at last week's Fusion conference.
Checks in a 21st Century digital world
By Glenn Wheeler, president, Viewpointe Clearing, Settlement & Association Services, Viewpointe
Is the check dead? You might hear a near-unanimous “yes” to that question, though others would say, more accurately, that check usage is simply on a long decline. While check usage has been dwindling in recent years, to paraphrase Mark Twain, the reports of its death are greatly exaggerated. A recent study shows a sizeable segment of the market still writes checks.
The 2010 Federal Reserve Payments Study, which looks at noncash payments in the U.S. from 2006 through 2009, indicates that electronic payments are quickly outstripping check payments; yet checks have remained a significant payment instrument – to the tune of $31.6 trillion in value paid in 2009. While businesses far outweigh consumers in the total dollar value of checks paid, consumers overall continue to write more checks, according to the findings. And while the number of checks written overall declined more than 7 percent from 2006 to 2009, the volume of consumer-to-consumer check payments actually grew in that same period, from 2.2 billion to 2.4 billion.
Where is the consumer-to-consumer check-writing trend heading? Despite its overall decline, there are those who continue to see the value in this traditional payment method. A January New York Times story, Social Security and Welfare Benefits Going Paperless, about the U.S. government’s decision to pay benefits electronically, chronicled how the elderly have continued to opt to receive old-reliable checks versus the government’s proposed electronic deposit of social security payments.
While this one segment of the population alone will not keep checks going indefinitely, technology might encourage some of the smartphone-wielding segment of the population to keep circulating them. According to a recent American Banker article, For Mobile Deposit, Banks Choose Speed-to-Market Over Simplicity, banks are rushing ahead with mobile check deposit technology at the behest of customers who are using it to deposit checks without having to set foot in a bank.
As electronic payments technology continues to evolve – from mobile payment apps to “tap-and-pay” payments using near field communication (NFC) – financial institutions and their customers can easily move into a new payments world. Embracing the budding technology will, no doubt, bring new challenges, but with ease of use and the promise of potential growth to the financial institution’s bottom line, it could be a worthwhile investment.
Even in our digital age, the old-fashioned check may still stand up as a viable complement to the technologically advanced payment methods.
What do you think?
Labels:
Check 21,
check archive,
check imaging,
deposits,
Mark Brousseau,
NFC,
Viewpointe
Friday, May 13, 2011
Make information safekeeping part of your hurricane preparations
Posted by Mark Brousseau
Forecasters at Colorado State University recently announced that the 2011 Atlantic hurricane season will be very active. Before the season starts, it's time for businesses to get ready, while remembering their most valuable asset: information.
"No matter the size of a company, without access to information, clients could be lost and the owner may be at risk for losing the business altogether," said Marshall Stevens, co-owner of Stevens & Stevens Business Records Management, Inc., a Florida-based records management center. "To ensure business continuity, owners should develop a disaster recovery plan to assess how they're storing and managing information. These plans can help keep a business up and running so all business functions can be handled, even without access to your facility or network."
Get prepared by considering the following:
... Location and security of your storage facility – Store information off-site in a location that's been designed to withstand high sustained winds, is located in a non-flood zone, has a secure vault and is also secured with alarms, security cameras and pass codes.
... Accessibility – Be sure you can access your information no matter the time of day or day of week.
... Document back-ups – Whether you make copies or use external hard drives, backing up key documents is crucial. Keep back-ups in multiple locations, so if a disaster affects your office, another copy of your information is still available.
... Alternative records storage options – Consider utilizing technology that allows files to be converted to electronic images, which are then hosted on a secure, password-protected website. So, if files are destroyed or you couldn't access your facility, information wouldn't be gone for good.
"Hurricane season can be an uneasy time, but by planning how you'll protect information now, if disaster does strike, you can focus on running your business rather than trying to pick up the pieces after the fact," said Stevens.
How does your company safeguard its information during a hurricane?
What CFOs are thinking and doing
By Mark Brousseau
Although the economic downturn caused CFOs to concentrate primarily on their steward role, the recovery has reemphasized the need to act as a catalyst and strategist, Jeff Bronaugh, senior manager at Deloitte Consulting LLP, told attendees of the Masters Session at Fusion 2011 at the Gaylord Palms Resort and Convention Center in Florida. Bronaugh and his colleagues Bob Comeau, national service line lead and principal at Deloitte Consulting, and Scott Rottman, principal at Deloitte Consulting, led a highly interactive discussion among attendees of the Masters Session.
During the depths of the credit crisis and recession, CFOs were spending roughly 60 percent of their time in the operator and steward roles, Bronaugh said. The increased time spent in the steward role reduced the amount of time CFOs spent in their preferred role as the strategist in the organization.
But in the wake of considerable capital-market and economic turmoil, CFOs are expected to take on broader and deeper strategic roles, Bronaugh said. CFOs are now routinely in charge of an expansive range of regulatory, governance, and strategy functions, especially investor and public relations, strategic planning, corporate development, and mergers and acquisitions.
As the economy begins to stabilize, focus for North America's top finance executives is shifting back to strategic initiatives, Bronaugh said. He pointed to a survey of CFOs conducted by Deloitte in the first quarter that showed that quality metrics, influencing strategies, and monitoring initiatives were the top three challenges of their finance organizations. CFOs cited strategic ambiguity, major change initiatives and regulatory change as their top three job stresses, Bronaugh said.
Against this backdrop, Bronaugh said there are 10 hot topics for CFOs:
1. Improving business decision support
2. Influencing business strategy and operational strategies
3. Major infrastructure and change initiatives
4. Prioritizing capital investments
5. Regulatory changes
6. Finance operating models
7. Cash is king
8. Managing finance department expectations
9. Finance talent management
10. Taxes
What are the hot topics in your finance department?
Labels:
CFOs,
deloitte,
Mark Brousseau,
recession,
regulations,
TAWPI,
tax,
working capital management
Best operations-improving strategies
By Mark Brousseau
During a pre-conference networking lunch at Fusion 2011 at the Gaylord Palms Resort and Convention Center in Florida, attendees were asked to share the best operations-improving strategy that their accounts payable (AP) department has implemented in the past 12 months. Here are the operations strategies that the luncheon attendees said were the most effective during the past year:
... Provided AP processors with two computer monitors, reducing errors and increasing efficiency
... Took the time to better understand AP processes (became "black belts" in evaluating processes) to weed out the processes that don't add value
... Migrated more payments from paper check and wire transfer to automated clearing house (ACH) transactions
... Separated straight-through and exceptions processors
... Deployed a new enterprise resource planning (ERP) solution
... Deployed a purchasing card program
... Became more open-minded to new ideas
... Started reimbursing via a debit card since some people won't take direct deposit and paper checks are too costly and inefficient
... Began measuring and improving input quality, in turn, increasing AP productivity without changing any processes
... Began e-mailing and faxing remittances to save time and postage associated with paper remittances
... Developed a proprietary travel and expense (T&E) reporting system
... Brought AP functions previously done in India back in-house, resulting in savings of $26,000 a month, largely from fewer mistakes
... Implemented an imaging and workflow solution, reducing processing time and enabling all staff to know where an invoice stands in the approval process
... Standardized on one system and one process whenever possible
... Implemented virtual card payments
... Automated accounts receivable (AR) refunds
... Consolidated various overnight shipping and cellular phone accounts into "master" accounts, allowing the company to negotiate discounts of 18 to 50 percent off list prices
... Automated payroll processing with ACH
... Eliminated duplicate vendors, in turn, eliminating many duplicate payments
What is the best operations-improving strategy your AP department has implemented in the past 12 months? Post it below.
Labels:
accounts payable,
ACH,
AP,
AR,
check imaging,
ERP,
Mark Brousseau,
TAWPI,
workflow
Bringing mobile banking to the masses
Posted by Mark Brousseau
The number of unbanked or underbanked mobile subscribers around the world is projected to reach approximately 2 billion by 2012, according to research from Oliver Wyman and PlaNet Finance Group. Today, only around 50 million subscribers use mobile money services. Most deployments to date have focused on first-generation mobile money products such as remittances, airtime top-up, bill payments and loan repayment.
The transformational impact of mobile money is expected to come from second-generation financial services such as micro-savings, micro-credit and micro-insurance, especially in countries with less than 10 percent retail banking penetration, according to Oliver Wyman. Both telcos and financial institutions should benefit from the take-up of these products, as they combine complementary skills and deliver more value to customers, the research firm says.
However, the formula for success is not straightforward.
Two distinct models are emerging:
... The distribution of microfinance through mobile money via existing microfinance banks
... The distribution of microfinance through a virtual microfinance bank, operating as a pure mobile player.
“The benefits of these models include a more than twofold increase in access to banking, 20-50 percent lower operational costs for the microfinance institution and revenue or market share benefits for the Mobile Network Operator,” says Arnaud Ventura, co-founder and vice president of PlaNet Finance Group.
In a report, PlaNet Finance Group and Oliver Wyman conclude:
... Mobile Microfinance can have a significant impact on increasing financial services access for unbanked subscribers by eliminating all the disadvantages of physical bank branches. The benefits of this service are both social and economic.
... It is a cost-effective way for banks and MFIs to reach the masses by capitalizing on the widespread penetration of telecom distribution networks. PlaNet Finance and Oliver Wyman also see a new breed of intermediaries emerging that allow partners on both sides to interact smoothly by playing the “interconnection” role, making money on transactions rather than the spread.
Greg Rung, partner at Oliver Wyman, said: “PlaNet Finance and Oliver Wyman are convinced that, agreeing on a long-term vision, all stakeholders, from banks to distributors to regulators, need to come together to design an adequate offer and build a win-win model that can address all challenges successfully.”
What do you think?
Friday, May 6, 2011
Alligators at Fusion 2011
Posted by Mark Brousseau
Folks arriving this weekend for Fusion 2011 may be surprised to see alligators(!) in the atrium of the Gaylord Palms Resort and Convention Center in Florida.
Labels:
document imaging,
document management,
FUSION,
Gaylord,
IAPP,
ICR,
Mark Brousseau,
Microsoft,
OCR,
page scanning,
TAWPI
Tuesday, May 3, 2011
Unlocking the value of enterprise content management
By Mark Brousseau
It’s one thing to have enterprise content management (ECM) technology; it’s another thing altogether to get value out of it, Gartner analyst Mark Gilbert told attendees at Systemware’s user conference, SWUC 11, last week at the Westin Galleria in Dallas.
Companies seem to be getting Gilbert’s message. Many are focusing like never before on ECM applications that provide more value, he told attendees. This trend is being driven by increased expectations from ECM buyers and users, new demands for faster and richer process management and information delivery, increased archiving, compliance and information governance requirements, and the evolution of social media into a tool targeting supply and value chain management.
“The ECM market is strong, and it is reshaping itself as the technology becomes more adaptive and customers make more demands on it,” Gilbert explained, noting that the ECM market now tops $4 billion a year in sales. He added, “Companies are now relying on ECM to drive business efficiency and achieve better results.”
When it comes to ECM, “ROI matters,” Gilbert said flatly.
Some key elements of the ECM business case that Gilbert identified include:
… Faster, better processes
… Better customer service
… Better, less costly regulatory compliance
… Better management decisions
… Better front-line decisions
… Better teamwork
“When we talk to customers, these things come up time and time again,” he said.
Gilbert offered several tips for ensuring your company meets its business case:
… Build a vision for how ECM can transform and drive your business.
… Survey ECM use-cases in your industry.
… Establish roles and an organizational structure to support your ECM vision.
… Set scope for your ECM initiatives by assessing risk and the value of information assets – across the breadth of the content continuum.
… Leverage existing technology and vendors and determine what you have and how it supports the ideals of information infrastructure.
… Accept the fact that technology alone will not succeed; policies and governance models are critical for long-term value.
What do you think?
Monday, May 2, 2011
AP professionals see benefits to cloud computing
By Mark Brousseau
Accounts payable (AP) professionals see "minimal IT involvement" as the biggest benefit of using Software-as-a-Service (SaaS) or cloud computing for AP processing, according to the findings of the 2011 AP Automation Study by International Accounts Payable Professionals. Nineteen percent of survey respondents identified "no capital investment" as the biggest benefit of cloud computing or SaaS, while 17.5 percent cited "lower cost per invoice" and 14.3 percent identified "fast start-up."
Some 12.7 percent of respondents identified "no software or hardware" as the biggest benefit.
Randy Davis, vice president of sales and marketing operations for eGistics, isn't surprised that these benefits rank high in the minds of AP staff. "Cloud offerings have always touted minimal IT involvement, no capital investment, fast deployment, and no on-site software as benefits," he notes.
But Davis believes that the ability of cloud-based document processing solutions to remove paper management from AP processing could deliver even greater benefits to AP professionals. "Today's cloud-based AP solutions significantly improve on key usability factors such as electronic capture, structured indexing, search and retrieval, work allocation, data updates and corrections, and audit and tracking -- things that directly contribute to the smooth operation of an AP department," Davis says.
"eGistics believes that business users will increasingly appreciate and accept the benefits of SaaS and cloud computing for critical tasks such as AP processing and management, and that such benefits will soon be taken for granted. At the end of the day, AP departments are looking for solutions that help them do their jobs faster, more accurately and with better accountability," Davis concludes.
What do you think?
Labels:
AP,
cloud computing,
invoice processing,
invoice scanning,
Mark Brousseau,
SaaS