Posted by Mark Brousseau
An interesting article from last week's Chicago Sun-Times:
Macy's increases 'no fuss' gadget vending machines
ELECTRONICS | Machines sell iPods, digital cameras, headphones
May 23, 2008
BY SANDRA GUY sguy@suntimes.com
Macy's is doubling the number of Chicago-area stores that sell consumer electronics in vending machines.
The machines, first called ZOOM and renamed e-Spot, use electronic arms to grab an iPod, earphones or digital camera and dispense the items.
A Macy's executive said shoppers want "one-stop, no-fuss shopping," and electronics that reflect their personal styles, and that's why the machines are in demand.
The electronics include the iPod touch, shuffle, nano and classic; Canon and Samsung digital cameras; MyVu personal media viewers, and Bang & Olufsen and Harman Kardon headphones. Prices range from $14.99 to $349.99.
New strategy
New machines have been installed at Macy's stores at Northbrook Court in Northbrook, Oakbrook Center in Oak Brook, Orland Square in Orland Park, Hawthorn Center in Vernon Hills, and River Oaks in Calumet City.
Macy's stores that already have the machines are at State Street, Water Tower Place, Woodfield in Schaumburg, Old Orchard in Skokie and Fox Valley in Aurora.
Macy's recently launched a "My Macy's" strategy in which each of 20 districts with 10 stores each is headed by a senior merchant, and assortments differ by region based on what shoppers want.
Tuesday, May 27, 2008
XML Brings Insurers Out Of Babel
Posted by Mark Brousseau
Interesting article in Insurance Networking News about XML in the insurance industry:
Bill Kenealy, May 1, 2008
Even in the case of technological standards, it seems there can be too much of a good thing. So it goes with XML, which, a decade following its inception, is now entrenched as the lingua franca of B2B and intra-company communications not just for the insurance industry, but also for the digital universe as a whole.
Insurance technologists, who have always needed to move information around in a way that was standard and convenient, can hardly be faulted for falling hard for the open standard. Indeed, the data-intensive nature of the industry, coupled with the ever-widening array of agents, vendors and regulators the typical insurer digitally interacts with, has made communication standards a necessity both internally and externally.
“Standardization was the primary driver for our adoption of XML,” says Peggy Scott, AVP for agency services, Liberty Mutual Agency Markets, a business unit of Boston-based Liberty Mutual Group. “Consistency across various applications reduces cost and time to market. The key driver for Agency Markets’ companies is quality and improving agents’ ease of doing business, allowing them to conduct much of their day-to-day business from within their agency management system. It also affords them the freedom of choice in selecting vendor partners for multi-carrier quoting options.”
Yet, the convergence around XML is hardly the industry’s first attempt at standardization. Many past attempts to standardize failed to gain widespread acceptance, due largely to the rigid, inflexible nature of the standards themselves. Not so with XML, which works across multiple languages and output formats, and is loosely-coupled, permitting insurers to mix and match applications of their choice. Moreover, XML is easily customizable, enabling companies to add information and alter code without affecting processes already in place.
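To make that "add without breaking" property concrete, here is a minimal Python sketch. The PolicyQuote message and its element names are invented for illustration (they are not an ACORD schema or any carrier's actual format); the point is simply that a consumer reading only the fields it already knows is unaffected when a new element appears.

    # Minimal sketch of XML's "add without breaking" property.
    # The PolicyQuote message is purely illustrative -- NOT an ACORD schema.
    import xml.etree.ElementTree as ET

    ORIGINAL = ("<PolicyQuote>"
                "<PolicyNumber>Q-1001</PolicyNumber>"
                "<Premium>1250.00</Premium>"
                "</PolicyQuote>")

    # A later version of the message carries an extra element.
    EXTENDED = ORIGINAL.replace(
        "</PolicyQuote>", "<AgencyCode>CHI-042</AgencyCode></PolicyQuote>")

    def read_quote(xml_text):
        # An existing consumer reads only the elements it already knows about.
        root = ET.fromstring(xml_text)
        return root.findtext("PolicyNumber"), root.findtext("Premium")

    # The unknown <AgencyCode> element is simply ignored, so the old
    # consumer keeps working unchanged.
    assert read_quote(ORIGINAL) == read_quote(EXTENDED) == ("Q-1001", "1250.00")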
“XML provides a way for companies to read the information easily, as well as add to it,” says Tana Sabatino, president of San Francisco-based Vallue Consulting Inc., who formerly headed up XML development efforts for ACORD, the Pearl River, N.Y.-based insurance association. “What we’ve seen is that instead of XML being phased out, it is being used as a base for everything else, such as SOA and Web services. XML as a base language has really held its ground across all industries, not just insurance.”
EASY DOES IT
XML’s ease of use, versatility and ability to overlay sundry computer languages and data standards make it a natural choice for insurers, especially those burdened by a tangle of legacy applications. Thus, XML has become the tactical weapon of choice for insurers in their quest to achieve strategic aims such as straight-through processing and service-oriented architecture.
A joint survey conducted by ACORD and Stamford, Conn.-based Gartner Inc. bears this out, finding that XML adoption continues to rise year-over-year among insurers and reinsurers. The survey, conducted in the third quarter of 2007 with a global sample of 176 respondents representing life insurance, property/casualty insurance and reinsurance, found XML is becoming a core technological component for both supporting application integration and interactions with external partners.
“As an industry, we need to start implementing standards from the very core of our systems and processes to achieve our overall strategic and operational objectives,” says Denise Garth, VP, membership and standards, ACORD. “The true benefits of standards can be best achieved when companies organize their efforts strategically with a governance structure that guides development and implementation of standards, which are a core component of projects and enterprise architectures. This ensures that standards are woven into all the business functions and systems across the enterprise, which will help achieve greatest value to organizations.”
GOVERN THYSELF
Yet ubiquity is not an unalloyed positive. The study found that although adoption rates continue to rise annually, many insurers pursuing XML do not follow best practices in XML governance, funding or oversight. It also found that many companies lack the governance and management structure to optimize XML use. Specifically, the study found that only 42% of life insurers, 33% of P&C insurers and 38% of reinsurers currently have a corporate strategy in place to guide their XML projects and investments. Also, less than half of respondents with XML projects in place have a corporate XML strategy to guide investments and project plans. Additionally, few insurers or reinsurers had dedicated leadership for their XML projects. Both of these are considered key governance best practices for promoting business and technical benefits from XML use.
Adherence to a well-designed data model, such as ACORD XML, is another prerequisite for good XML governance. “The essence of governance is managing and understanding how XML is being used across all the systems, and to promote the use of standards—whether it’s an external standard such as ACORD or an internal one,” Sabatino says.
Obviously, governance needs vary widely with the size and type of insurer. The XML governance challenges of a large multi-line insurer with 25 departments and 300 systems will exceed those of a small or monoline carrier. Few know this better than Gary Plotkin, VP and CIO of The Hartford. At an SOA summit held by Insurance Networking News in January, Plotkin said his company was adopting ACORD XML standards for internal usage as part of a larger enterprise-wide SOA undertaking. “We can’t afford to do things different ways for every line of business,” Plotkin said.
Whether big or small, proper XML governance may require a change in IT culture, Sabatino says. “It’s an organizational challenge but, in order to make it happen, there are technology needs,” she says, noting that while three years ago there wasn’t much technology available to aid in XML governance, solutions are now entering the marketplace. “When we were doing flat file data exchange, it was all system-to-system and nobody oversaw it. However, with XML, companies are seeing that they do need to oversee it because there are many advantages to be gained when they all start calling things the same way.”
Getting insurers to use commonly accepted terms has long been a priority of ACORD, which is no doubt aided by the critical mass generated by its large number of carrier participants.
“ACORD’s focus has been on defining the insurance vocabulary,” Sabatino says, noting insurers have little to gain by clinging to unique names for common terms. “Being different is not better in this case. Saving money is better.”
Hartford’s Plotkin says he too is a proponent of adhering to ACORD messaging standards, both within his organization and without, in order to lock in consistency. “I like to think ACORD standards will be used across the industry, but at the very minimum, I use the ACORD standard for internal messaging because then I know we have a single standard,” he says.
Yet, it is important that carriers remain cognizant that XML itself only provides so much functionality, and it is best to regard it as bedrock or a base on which to build. Carriers need to bear in mind that despite XML’s ease of use, it does add another, often complex, layer to an IT environment. “XML as a control layer makes a lot of sense. If you are doing large-scale analytics, or moving gigabytes of data around for data warehousing, having all that go through XML doesn’t make a lot of sense,” says Bill Miller, CTO of Colorado Springs, Colo.-based XAware Inc., a maker of open source data integration software.
Sabatino says that, instead of XML becoming antiquated, it is now part of everything else.
Thus, for all its strength and ubiquity, XML should not be seen as a panacea. “XML is not perfect for everything in terms of moving and manipulating data, but it does serve a large number of purposes,” Miller says. “You are adding a bunch of overhead by having all this metadata and description in XML, so there can be issues around performance, bandwidth, security and governance.”
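Miller's overhead point can be made concrete with a toy comparison. The record below is hypothetical; it simply contrasts a delimited flat-file row with the same three fields wrapped in XML tags.

    # The same three-field record as a delimited flat-file row and as tagged XML.
    # Field names and values are invented for illustration.
    flat = "Q-1001|1250.00|CHI-042"
    xml = ("<PolicyQuote>"
           "<PolicyNumber>Q-1001</PolicyNumber>"
           "<Premium>1250.00</Premium>"
           "<AgencyCode>CHI-042</AgencyCode>"
           "</PolicyQuote>")

    # Identical payload, but the tags make the XML record several times larger --
    # the kind of overhead that matters when moving bulk data.
    print(len(flat), len(xml))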
LITTLE CHUNKS
To help address these concerns, new XML-based tools continue to emerge in the marketplace. Some of the most compelling are based on a complementary, non-proprietary standard called Darwin Information Typing Architecture (DITA), which was developed by IBM but is maintained by an open-standards consortium, Organization for the Advancement of Structured Information Standards (OASIS), Billerica, Mass.
Rather than creating documents that are monolithic artifacts, DITA provides a framework for creating reusable, topic-based XML chunks. These modular components can be assembled and reassembled to create new documents. This bottom-up, loosely coupled approach streamlines the implementation of change, proponents say.
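To give a rough feel for how topic-oriented reuse works, here is a small Python sketch. The two claim-handling topics and the "maps" are invented and far simpler than real DITA topics, but they show the basic mechanic: the same chunk can be assembled into more than one deliverable.

    # Sketch of DITA-style reuse: small topic chunks assembled per a "map".
    # Topics and titles are hypothetical; real DITA topics carry richer markup.
    import xml.etree.ElementTree as ET

    TOPICS = {
        "claim-overview": "<topic id='claim-overview'><title>Filing a Claim</title>"
                          "<body><p>How a policyholder reports a loss.</p></body></topic>",
        "claim-documents": "<topic id='claim-documents'><title>Required Documents</title>"
                           "<body><p>Proof of loss, photos, police report.</p></body></topic>",
    }

    def assemble(topic_ids):
        # A "map" is just an ordered list of topic ids pulled into one deliverable.
        doc = ET.Element("document")
        for tid in topic_ids:
            doc.append(ET.fromstring(TOPICS[tid]))
        return ET.tostring(doc, encoding="unicode")

    # The same chunks serve a call-center guide and a shorter policyholder letter.
    print(assemble(["claim-overview", "claim-documents"]))
    print(assemble(["claim-documents"]))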
“Similar to SOA where you are decomposing monolithic business apps into reusable services, it’s the same concept just applied to the creation and management of content,” says Jake Sorofman, SVP of marketing and business development for Tokyo-based JustSystems Inc., which recently released a stepped DITA maturity model. “For a specification this young, it has tremendous market momentum, and it is seen as the way forward for authoring and reuse of content when management is a concern.”
One of the promises of DITA is that it helps blur the formerly distinct demarcation between an application and a document. This could pay dividends in a call center, where it could eliminate the need for a customer service representative to search both for a document and the pertinent information contained within it. “The document becomes the application,” says Sorofman. “You gain a level of control by breaking information down into smaller units. You are able to process it at a lower level and get the output that you want.”
Because DITA, like XML, is an open standard, those purchasing solutions based on it have a chance to participate in the future development of the standard. They also retain the option to specialize it to meet their needs without reliance on others.
However, this easy customization may come at a cost. The new tools, while accelerating development, imperil consistency and raise concerns about the introduction of errors. Some tools enable non-technical authors to work in XML without ever seeing an angle bracket, Sorofman notes.
MORE NEW TOOLS
While not discounting these concerns, insurers would be wise to educate themselves about the wealth of new technologies available to them. One of the other benefits of XML is that the supporting technologies are not necessarily insurance-specific.
“You’re seeing horizontal vendors filling this void across all industries,” Sabatino says, adding this may be somewhat novel to insurance industry CTOs conditioned to solutions being built specifically for them. “What we’re seeing is that because these products can be sold across all markets, we’re getting better products for governance and XML mapping tools.”
Sabatino is sanguine about intelligent transformation and mapping tools, which, although still in their infancy, exhibit great promise. “We’ve seen a shift where a lot of new tools have entered the market, but it’s going to take another three to five years until these tools are truly mature and integrated at an enterprise level.”
It’s not just new applications from new entrants altering the XML landscape. Core application vendors are amping up their support for industry XML standards. Kyle Blair, practice leader of insurance solutions for Westlake, Ohio-based Hyland Software Inc., says the company is developing an ACORD XML-based module for its popular OnBase offering in response to customer demand. Blair says that although Hyland’s efforts were enhanced by the groundwork ACORD has laid, bringing the module to market required a lot of work, owing largely to the size of the standard.
“The standard itself is so large,” he says. “The most challenging part is filtering through that all-encompassing standard to only leverage the XML statements or messages you would need to perform these communications with external sources.”
Monday, May 26, 2008
Diverse Image Sources Challenge Traditional Document OCR/ICR
I’ve seen two growing trends in document processing: the increasing use of decentralized and smaller scanners, and the loss of control over the precise printing of the source documents being scanned.
These two combined trends are causing lots of grief for long-established industries and solutions that depend on precisely measured field positioning to perform OCR/ICR.
For example, years ago, state tax agencies would lay out tax forms and contract with print shops to produce them; the completed paper forms were then sent back to the agency, where they were scanned on a few “big iron” scanners.
Today, each form is still printed, perhaps at more print shops due to competitive requirements. The “same” forms are also color or black-and-white photocopied by individuals, and printed (color or black-and-white), at various scaling factors, from downloaded PDF files.
Another example is insurance agencies, where desktop scanners are used at each agency to scan and send documents to an insurance company’s central processing. Again, images may be scanned at 200dpi, 300dpi, color, grayscale, bi-tonal, and output as TIF, PDF, JPEG, and who knows what other format-du-jour.
While no one has had to toss out their existing traditional OCR/ICR capture technology, all these variations take a huge amount of extra effort to deal with. I’ve seen traditional OCR/ICR systems where a single logical form has to be implemented 4 different times to handle these differences.
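One common first step, sketched below, is a normalization pass that pushes every incoming image toward a single nominal resolution and bit depth before recognition. The sketch uses the Pillow imaging library; the 300 dpi target and the fixed threshold are illustrative assumptions, not a recommendation for any particular product.

    # Sketch: normalize mixed scan sources (dpi, color, format) before OCR/ICR.
    # Uses the Pillow library; target dpi and threshold are illustrative.
    from PIL import Image

    TARGET_DPI = 300

    def normalize(path, out_path, source_dpi=None):
        img = Image.open(path)
        # Many files carry their resolution in metadata; otherwise fall back to a guess.
        dpi = source_dpi or (img.info.get("dpi") or (200, 200))[0]
        if dpi != TARGET_DPI:
            scale = TARGET_DPI / float(dpi)
            img = img.resize((int(img.width * scale), int(img.height * scale)))
        # Collapse color or grayscale to bi-tonal with a simple fixed threshold.
        img = img.convert("L").point(lambda p: 255 if p > 160 else 0).convert("1")
        img.save(out_path, format="TIFF", dpi=(TARGET_DPI, TARGET_DPI))

    # Example: normalize("agency_scan.jpg", "normalized.tif", source_dpi=200)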
Does anyone else have a story to tell here, especially one with a happy ending? I know there are different approaches to OCR/ICR capture that don’t have problems with these kinds of variations. Has anyone tried them?
Paul Traite, ICP
Friday, May 23, 2008
EDI Made Easy
Posted by Mark Brousseau
The ultimate goal of many small business owners is to see their products on the shelves of major retailers. In order to automate the transmission and receipt of any number of business documents—including purchase orders, remittance advice, invoices, advance shipping notices, hang tags, labels, and catalogue updates—many suppliers and retailers use EDI, or Electronic Data Interchange. However, EDI is Greek to most small businesses and very difficult to perform internally without significant expenditures of money and manpower.
In addition, suppliers that fail to submit automated business transactions according to retailer specifications are faced with pricey penalties known as “chargebacks.” For the supplier, these chargebacks can result in fines in the tens or even hundreds of thousands of dollars, and irrevocably damage business relationships. For the consumer, this botched paperwork could mean those red open-toe Ferragamo pumps won’t be available for summer and Tickle Me Elmo will be AWOL for Christmas.
Fortunately, dedicated EDI providers can help small businesses automate this complex process, solve compliance issues, improve trading partner relationships, and focus on core competencies, all while reducing business expenses.
“While the manual workload required for non-EDI vendors is prone to errors, delays, and steep overhead costs, a dedicated EDI provider can be an invaluable asset for small businesses,” says Thomas J. Stallings, CEO of EasyLink Services International Corporation.
Stallings offers the following recommendations for small businesses thinking about choosing an EDI provider:
- Educate yourself about EDI, what it is, how it functions, and what it means for your business.
- Research the depth of the provider’s retail relationships, and the number of pre-existing EDI templates they have on file. (Retailers use fields in different documents in different ways. These individual templates, called “maps,” are what differentiate automated business documents for Wal-Mart from, say, Sears.)
- Make certain your provider is equipped with redundant backup that achieves 100 percent compliance.
- Ensure your provider keeps abreast of industry changes. (For example, about four years ago, several retail giants gravitated away from VAN-based EDI transmissions in favor of a new, secure protocol called AS2.)
“Electronic Data Interchange helps bring small business products to the shelves of some of the nation’s largest retailers,” Stallings says.
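For readers who have never seen one, here is a stripped-down, illustrative skeleton of an ANSI X12 850 purchase order built as a plain string in Python. The values, the made-up retailer name and the sample UPC are placeholders, the ISA/GS interchange envelope is omitted, and a retailer's actual "map" determines which segments and qualifiers are really required.

    # Illustrative skeleton of an X12 850 purchase order (envelope omitted).
    # Segment values are placeholders; real retailer maps differ in detail.
    SEGMENT_TERMINATOR = "~"

    segments = [
        "ST*850*0001",                        # transaction set header: 850 = purchase order
        "BEG*00*SA*PO-12345**20080523",       # original, stand-alone PO, number, date
        "N1*ST*EXAMPLE RETAILER DC 42",       # ship-to party (name is made up)
        "PO1*1*24*EA*9.99**UP*012345678905",  # line 1: 24 each at $9.99, sample UPC
        "CTT*1",                              # transaction totals: one line item
        "SE*6*0001",                          # trailer: segment count and control number
    ]

    print(SEGMENT_TERMINATOR.join(segments) + SEGMENT_TERMINATOR)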
Labels: Brousseau, EDI, electronic, mobile payments, TAWPI
Thursday, May 22, 2008
A Virtual Safe-Deposit Box
Posted by Mark Brousseau
An interesting article from Wall Street & Technology about Wells Fargo's new vSafe offering:
Wells Fargo Introduces vSafe, a Virtual Safe-Deposit Box
May 21, 2008
Security and convenience -- that's what San Francisco-based Wells Fargo is promising customers with its new vSafe service. The Web-based virtual safe-deposit box allows users to archive almost any file format, from Microsoft (Redmond, Wash.) Word documents and PDFs, to audio and video files, according to Stephanie Smith, senior vice president of the Wells Fargo Internet services group.
"You can import any electronic or digitized document," Smith explains. Preestablished folders, such as "Medical" and "Family," help users organize their documents, and users can create or delete folders to personalize the storage solution, she adds. "Customers can just drag and drop their documents into those folders," Smith says, emphasizing the product's ease of use.
According to Smith, the product was conceived in response to Wells Fargo's ($575 billion in assets) research into how customers manage their financial lives, including document storage. A study of the bank's customers revealed that three-quarters of participants "weren't happy" with their own document management, she reports. From a battered shoebox stuffed with papers to unreachable manila folders on top of the refrigerator, customers' homemade storage solutions were neither secure nor convenient, Smith says.
From the catastrophic (accessing insurance policies after a house fire) to the more routine (getting copies of medical records while you're traveling), the Web-based vSafe service gives customers secure access to their files at any time, from any location, unlike hard drives and other archiving solutions, Smith contends. But, she cautions, vSafe is intended only as a backup storage solution. "We're not telling customers that it's as good as the original," Smith stresses. "It's a copy."
An Online Vault
The vSafe service, Smith adds, leverages the same security measures and stores customer data behind the same firewall that the bank uses for its online banking platform. "It's an extension of what we've been doing for 156 years -- safely and securely storing customer information," she comments, adding that customers who want additional protection can secure their accounts with two-factor authentication supported by RSA's hardware tokens, which can be purchased from the bank. Bedford, Mass.-based RSA is the security division of EMC (Hopkinton, Mass.).
Smith notes that the new offering, which will be integrated with Wells Fargo's other online banking services, was created without outside vendor support. "We built this ourselves," she says, adding that the solution employs "much of the same [technology] as we use for Wells Fargo online banking."
Javelin Strategy & Research (Pleasanton, Calif.) analyst Mary Monahan notes that the trust Wells Fargo has established with consumers may help adoption of the vSafe service. "Wells Fargo is a name known to them," she observes. The service, she adds, appears to meet a customer need and gives users a reason to stick with the bank, helping Wells Fargo's customer-retention efforts.
A Green Solution
In addition to helping customers organize their document storage, the vSafe product also can help the environment, Wells Fargo's Smith says, adding that the bank sees the service as an extension of its environmental initiatives. It will help customers "lessen their carbon footprints ... through online banking," she explains.
For example, customer statements will be uploaded to their vSafe accounts automatically. Though Javelin's Monahan says bank customers in general are "not really ready to get rid of their paper statements," she adds that the new Wells Fargo service is "very green -- and secure."
The cost of the vSafe service to Wells Fargo customers starts at $4.95 a month for one gigabyte of storage -- approximately 10,000 documents. The bank initially offered the vSafe service to its team members in March. A public rollout to personal banking customers and small business clients is scheduled for the summer.
Labels: Brousseau, check imaging, digital vault, Wells Fargo
Wednesday, May 14, 2008
Mike Leavitt On Electronic Medical Records
Posted by Mark Brousseau
The Hill caught up with Health and Human Services Secretary Mike Leavitt to ask him about the status of the president's electronic medical records initiatives. The full transcript appears below:
Q&A with Mike Leavitt
By Jeffrey Young
In 2004, President Bush introduced a sweeping initiative to promote the development of a nationwide system of health information technology. One key goal is to provide every citizen with access to an electronic medical record by 2014. Bush tasked Health and Human Services Secretary Mike Leavitt to helm the gargantuan effort, which represents an attempt to bring together the private and public sectors to create a technological infrastructure for the healthcare system of the future.
Q: Is President Bush’s 10-year plan for electronic medical records on track?
I believe that will be accomplished. I think the goal may be exceeded. There will be a point where this begins to happen quickly. That’s the way technology develops: There are some early adopters, the mechanism has to be put into place, people begin to catch the vision. Once the consumer begins to see its value, it spreads quickly.
Q: Other than electronic medical records, what are the key components of a fully wired healthcare system?
I think it’s important to remember that the goal here isn’t electronic medical records. The goal is to transform the sector of healthcare into a system of healthcare, a system that provides consumers with information about the quality and the cost of their care.
To accomplish that, we need to have information digitized rather than on paper and we need to have that information mobilized so that it can be assembled in many different ways that are useful, not just to consumers but to various parties. If you look at digital health information, the goal looks different depending on who you are and what your interests are.
If you’re a consumer, you’re interested in having all of your health information accessible to you in a timely, useful way. You’d like to have your pharmacy records for the three pharmacies that you do business with in one place. You’d like to have the information from your doctor, from your hospital, from your specialists, from the labs, accessible to you in one place. You don’t want to have to go to your doctor and pick up a big brown envelope and transport it to a specialist for them to see a test that you took three days before. You’d like that to be electronically transferred, like everything else can be in your life.
If you’re a clinician or a doctor, you’re interested in having a clinical record that has more information than likely a consumer is interested in. You’re interested in being able to have decision-support information that would provide backup for decisions that you need to make, and alternatives and options that would help you make better decisions.
If you are a public-health expert, you’d like to be able to gather a lot of data from many different places. Even though it doesn’t have anyone’s name attached to it, it would provide you with statistical backup to look for trends. If you’re a researcher, you’d like to see something quite similar, but you would like to see it from a wider area.
My point is that, if you’re a consumer or a clinician or a researcher, all of this looks different to you. To put it a different way, it will look different depending on the way you view the world.
The goal here is to, first of all, get information transferred from paper into a digital format, then create a means by which it can be mobilized so it can be assembled in various forms, and to provide benefit to consumers, to practitioners, to hospitals, to researchers, to public health, or to those in the business of health.
Q: What will be the biggest advantages to better incorporating information technology into the healthcare system?
In the long run, it will provide better health, lower costs, fewer medical mistakes and a lot less hassle.
Q: All three of the presidential candidates’ healthcare reform plans depend in part on health IT saving money. How much money could be saved through the efficiencies that should come from using these technologies?
The best source of that may be the RAND study. The RAND Corporation did a study that said … there was a 30 percent inefficiency in the costs. I’m not predicting a 30 percent reduction but I do believe that over time we can begin to have medical inflation more closely approximate regular inflation. If you could make that change in the glide pattern or the slope of growth, it would make a profound economic difference.
Q: Since the administration’s effort began, what’s done and what’s still to be accomplished?
Three years ago, there were 200 vendors who were producing electronic medical records for sale. None of them had the capacity to be interoperable because there were no standards for interoperability. Interoperability essentially means one computer system being able to talk to another.
We developed a process for developing standards that incorporated the medical family, developers of technology, government entities, insurance companies, et cetera, and have now instituted a means by which those standards are emerging.
We also created a process for certifying systems that meet those standards. Three years ago, there was no process and obviously no means of certifying that people were on the pathway to health IT or to interoperability. We have accomplished that: Some 75 percent of all the systems now available for sale are certified by the CCHIT, which is the Certification Commission for Healthcare Information Technology.
Let me tell you why that’s important. I bumped into, at a pathology bench at Stanford University, a young student who was ready to go out into practice. He said, “I heard your speech about electronic health records and I subscribe to your view. In fact, I’m going to set my practice up next year and I want to buy a system. I only have one question: What system should I buy? I can only afford to do this once and I can’t get it wrong.” It was not possible for me to recommend a system before. Now, I’m able to say to him, “Whatever system you buy, make sure it is CCHIT-compatible. If you do, you’ll be on a pathway to interoperability and your vendor will need to continue to update their system to meet those standards.” I think that’s a significant step forward.
If interoperability was two feet long, we’d be at about the six or eight inches mark. Next year, we will be at eight or 10 inches. Each year, we’ll get a little closer to interoperability. We won’t see full interoperability for some time but we will see functional interoperability beginning to develop real soon.
That’s on records themselves. The next challenge is to lay in a system for a national health information network, where information can be transported between systems. We will see live data transmitted over that system in September. We’ll see real data begin to flow early next year. All of this is being done at a fairly rudimentary level. The sophistication of it will increase as our experience grows.
A second area of real challenge is in adoption among small and medium-sized physician practices. If you talk to a small- or medium-sized physician today, they will ask the question, “Why should I buy this system for $40,000 or $50,000 when the benefit will go to insurance companies and/or consumers?” We have to change the macroeconomics of medical reimbursement so that everyone benefits.
We have just announced and will conduct a Medicare demonstration project that will help us learn how to do that. We’re choosing 1,200 small physician practices throughout the country and we’re going to start paying them more if they have an electronic medical record. The second year, we’ll pay them more if they report a series of quality measures over their system. And the third year and the fourth year and the fifth year, we’ll pay them more if they can demonstrate that they followed the quality measures and report them on their electronic medical record that’s certified. Over that period of time, we’ll get better and better at learning how to share the value of electronic medical records with everyone.
A third step will be actual implementation of certain portions of it on a national basis. The most logical next step is e-prescribing. I’m hopeful that the Congress will act with the SGR [physician sustainable growth rate payment formula] fix, which will likely be at the end of June, to enable HHS to use its leverage as a payer to motivate physicians to adopt and use e-prescribing.
Q: What have been the biggest obstacles that have kept the healthcare system so behind the curve on IT compared to the rest of the economy?
First of all, it’s far more complex than any other sector of the economy. If you look at banks, for example, they’re highly interoperable but they deal with a very basic measure, and that’s the dollar. They have a currency and once you’re managing where those dollars go, it’s easier than if you’re having to develop the means of managing all of the conditions and the information that goes into healthcare. So, it’s substantially more complex. It’s also far more segmented.
There is no such thing as a national healthcare market. It’s a lot of individual communities and the collective network of communities is the national health insurance market. You essentially have to deal with this one marketplace at a time.
Q: Why is it important the federal government be involved in this process? Why can’t the private sector be left to develop this system on its own?
The best illustration of that is the fact that we had 200 vendors producing electronic medical records, all of whom viewed it as in their interests to be separate from the others to create proprietary interest. I believe the government has a role, and it’s to be that of an organizer of the system.
We have a role not just in our capacity as regulators but in our capacity as a payer. Medicare is the largest single payer. Medicaid, through its affiliations with the states, is another very large payer. That gives us the responsibility, in my judgment, to motivate a collective action. There’s no one else who has that amount of influence.
Q: Who should be responsible for covering the costs of developing and implementing IT in the medical system?
I want to make sure that you and your readers understand what’s involved in interoperability. If you were to take five different systems that are different sizes and different complexities, they don’t need to do everything the same. They simply need to do a limited number of functions the same.
Just to give a rudimentary illustration, let’s assume that there were three systems: one that would do 100,000 different things, one that would do 50,000 different things, and one that would do 10,000 different things. We need those three systems only to do about 800 things the same in order for them to [be considered] interoperable.
We have identified those 800 things and we are aggressively working to develop standards that can be adopted by anyone who is developing a new technology, whether it is a device that could monitor health results outside the hospital or whether it’s a clinical system or a system at a pharmacy, and all of these need to converge.
Let me give you a good example. We know that if diabetics have their hemoglobin A1c tested every quarter, that the chances of them having a complication that would cost a lot of money is reduced because we see the complications as they develop and we can get ahead of them. Well, sometimes it’s difficult to get diabetics to test their blood sugar every day, let alone their hemoglobin A1c on a regular basis. I saw a cell phone that had the equipment necessary to check one’s blood sugar, and various other components, built into the cell phone. You could prick your finger, draw the blood with the end of your cell phone, it would then analyze the blood and use the telephone technology to send it back to an electronic health record. That would be a valuable way of managing the chronic illness of diabetes, but if the cell phone collected the information in a form that could not be transmitted through the telephone to the electronic health record, we would have missed out on that opportunity. So the standards not only apply to electronic medical records, they need to apply to devices that may be in the marketplace.
The private sector will be developing the technology. We simply need to provide the basic standards that they can build into the technology so that it can communicate with other technology that may be developed independent of them.
Q: What are the key legislative changes that Congress should consider in the near term?
Congress could do four things in the near term that would have a substantial impact on this vision maturing.
The first would be standing behind the standards process and developing all of our incentives to support that. Interoperability requires standards. A network, a system, requires standards. If you look at other industries that have been through this — you look at cell phones, you look at ATMs, you look at things as big as airports — they all require standards so that information can be used interchangeably as it develops. It’s vital that Congress support the standards-development process that we have in place that’s now working and that any legislation they do points toward that, as opposed to competing with it.
The second thing I would point to is e-prescribing. That’s a logical next step. We have the technology in place, the standards are in place, we know it saves time and money — and it’s time. The legislation that’s being proposed would provide the secretary with tools necessary to leverage our power as a payer in encouraging the rapid adoption of e-prescribing.
I think a third area would be in helping us leverage our power as a payer to encourage the adoption of electronic medical records. …
When they put [ATMs] into banks, people didn’t use them. They were accustomed to walking up to the counter and dealing with a teller who had become their friend. They first put people in the lobbies to try to bring people over to the machine and teach them how to use it. They would give them toaster ovens and lots of things to incentivize them to use the ATM because it was a more efficient way. At some point, the banks concluded, “We can’t afford to do business with you in the same way if you insist on using the teller at the counter for every transaction. Therefore, we’re going to charge you more if you go to the counter.” Once they did that, people started to move to the ATMs. The same thing was true when we went from dial telephones to touch-tones. There was a transition that had to take place in the way people used the telephone.
We’ll have to go through the same thing with electronic medical records. At first, we’ll have to give people incentives and reasons to use it. But there will be a point where we will have to say we can no longer afford to deal with you in the same way if you’re not going to use the most efficient way of doing business. So, we’ll continue to pay you, but we can’t pay you at the highest level unless you have electronic medical records. There may be, at first, some kind of incentive that helps them transition, but at some point, they have to recognize that they can’t be paid the same if they’re using a less efficient way of doing business.
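In payment terms, what Leavitt describes is a reimbursement rule conditioned on whether the provider uses a certified electronic medical record. A toy sketch of that logic, with invented rates that are not actual Medicare policy:

# Toy sketch of reimbursement conditioned on electronic medical record use.
# The rates and bonus are invented for illustration; they are not actual Medicare policy.

def reimbursement(base_claim, has_certified_emr, transition_bonus=0.0):
    """Pay the full rate (plus any transition incentive) with an EMR; a reduced rate without one."""
    if has_certified_emr:
        return base_claim + transition_bonus
    return base_claim * 0.95  # still paid, but not at the highest level

print(reimbursement(100.0, True, transition_bonus=2.0))  # 102.0
print(reimbursement(100.0, False))                       # 95.0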
The fourth one is, one of the real virtues of electronic medical records is the ability to define quality and cost in a way that’s usable for consumers. The most important tool in doing that is information from claims data. Right now, we have two federal courts that are conflicting on the use of Medicare claims data. A Florida court says we can’t give it to anybody to measure quality and a D.C. court says we have to give everything to everybody. Neither of those is the right answer. The right answer is to use the data in a way that will help define quality in a controlled and effective way. We need legislation to resolve that dispute. I might add that all of those are part of the SGR legislation. … This isn’t about having people have computers that keep electronic data. It’s about enabling the data to be in a form that’s usable to people and that can be mobilized and assembled in a lot of different ways.
One of those ways is giving consumers information about the cost and quality of their care so that they have choices and they can compare. We know when people have choices, they make decisions that drive the quality up and the costs down.
Health information technology is an enabler of better quality, lower costs, fewer mistakes and more convenience.
That’s why we push for electronic medical records. It isn’t just because it’s a tidier way to do business. It’s because it produces value. The actual implementation of the records is a necessary step toward that larger goal. The goal is the value that the records produce, not just the existence of the records.
Labels:
Brousseau,
electronic medical records,
healthcare,
TAWPI
Tuesday, May 13, 2008
Mobile Deposits Drive Mitek's Turnaround
Posted by Mark Brousseau
An interesting article on Mitek in yesterday's USA Today:
Mitek CEO pins turnaround on wireless check deposits
By Greg Farrell, USA TODAY
When Jim DeBello launched his technology career two decades ago, a mentor told him that on top of getting an education, he'd also get bloodied and bruised.
"He was right on all counts," says DeBello, CEO of Mitek Systems. "It's been a school of ups and downs and sideways."
Not that DeBello, a defensive end for his college football team, minded getting knocked around. Just the opposite: The lessons from his failures have improved his game immeasurably. "It's a lot of fun to be challenged by the unknown and untried," he says.
Into the unknown is where DeBello has brought Mitek, an image-recognition software company based in San Diego. After being installed as CEO five years ago, DeBello has pushed Mitek's image-recognition tools onto wireless platforms. In January, the company introduced an application that enables consumers to scan and deposit checks with their cellphone cameras.
Whether it works — and returns the company to profitability — remains to be seen. But for DeBello, who dabbles in oil painting in his spare time, technological innovation is inspiring.
"Innovation is the heart of technology start-ups," he says. "I'm not an artist, but it's the closest thing to art I can think of."
DeBello's first tech start-up, Solectek, married wireless technology to laptop computers. Great idea, right? Sure, but not in the early 1990s. Today, nearly every computer is configured for wireless operation, but back then, at a time when Internet connectivity was painfully slow and before the widespread adoption of cellphones, DeBello's wireless local area network concept was an idea ahead of its time.
Getting ahead of yourself can be costly
"If you're way too early, you're thinking too far ahead," says the 49-year-old San Diego native. "Sometimes it takes anywhere from eight to 10 years for technology to get adopted. We need to digest it."
The experience of being sacrificial pioneers was a painful one for DeBello and his colleagues.
After selling the company in 1996, DeBello moved to Qualcomm, where he continued to work in the wireless area, and eventually ran a joint venture. But he grew tired of corporate hierarchy. "I didn't want to spend all my time working on internal alignment, the political nature of the organization and such," he says. "It was just not inspiring or enjoyable."
By 1999, DeBello had accumulated enough experience to qualify as a "grown-up CEO" candidate in the world of dot-coms. He became chief executive of CollegeClub.com, an early social-networking site. But in 2000, just as the company was about to go public, the dot-com bubble burst and the game was over.
Through his mentor, technology investor John Thornton, DeBello had held a seat since 1994 on the board of Mitek Systems. During the Cold War, Mitek had been a major supplier to the U.S. government of security hardware products that helped prevent the Soviet Union from eavesdropping on electronic data transmissions through computers, faxes and printers.
When the Cold War ended, demand for its product disappeared, and the Mitek workforce dropped from 300 to 16. In the 1990s, Mitek used its recognition technology capabilities to help banks with their check-processing operations. But the financial results were disappointing.
Giving Mitek a new direction
In 2003, dissatisfied with the direction the company was taking, Thornton installed DeBello as Mitek's new CEO. Since then, DeBello has divested two products and redirected the company toward mobile imaging.
The result: In January, Mitek announced a new software application, Mobile Deposit, designed to allow consumers to scan and deposit checks into their bank accounts using the cameras on their mobile phones.
Although some banking experts believe consumers will embrace mobile banking in the near future, DeBello wants to market the product to small businesses that accept and deliver goods or services. Of the 32 billion checks written in the USA each year, DeBello says 20 billion are for business transactions.
For truck drivers who collect cash on delivery, Mitek's application would allow them to cash a customer's check instantly, instead of leaving the premises and hoping that the check doesn't bounce. It would also come in handy for anyone from the plumber to the Amway salesperson who accepts checks for payment.
"Mobile banking 1.0 was bill pay and balance transfers on the cellphone," DeBello says. "Mobile banking 2.0 is about payments. We have a real big piece of that in terms of the ability to deposit checks."
For Mitek, which lost $384,000 in fiscal 2007 on revenue of $5.6 million, the new product could transform red ink into black. DeBello's now working with several companies to test drive the product.
How Mitek's technology can be put to work
"This is a technology that will change the game," says Chris Cramer, CEO of Karl Strauss Brewing, a San Diego craft beer. California state law restricts how much credit a beer distributor can extend to restaurants and bars, and Cramer says he's considering putting Mitek's new application into the field.
"There's tremendous turnover in the restaurant business," Cramer says. "You need to keep people 30-days current. Here's an opportunity to know instantly if there are sufficient funds in an account, and to have that information routed through the accounting system and go to the (chief financial officer's) desk so he can make a decision."
Danny Jett, executive vice president at Georgian Bank in Atlanta, says Mitek's product could add greater efficiency to the banking process. "All banks are suffering from margin compression," Jett says. "You look for ways to do things more effectively. That's what I see with Mitek's product. Is it going to be accepted now? Who knows? But within 12 to 18 months, acceptance will increase. That's the way Internet banking was."
Labels:
Brousseau,
Check 21,
Mitek,
mobile banking,
TAWPI
Sunday, May 11, 2008
Business Intelligence 2.0
By Mark Brousseau
A new buzz in business intelligence (BI) is business intelligence 2.0. That’s according to Accenture. While traditional business intelligence and data warehousing are concerned with analyzing the past, BI 2.0 concentrates on the future. Put simply, it refers to drawing inferences from historical data, applying the resulting insights to events as they happen and then managing future events through predictive analysis, Accenture says.
Just a few years ago, it took weeks or even months to detect an unusual process, analyze the event, and formulate and take the required actions. BI 2.0 provides these capabilities in real time. BI 2.0 brings a burst of radical thinking and a fair number of promises that, when realized, will make a real difference to bottom lines and help companies move toward high performance. But what is really needed to embrace BI 2.0? Smart CIOs are examining their data management.
Accenture’s research has shown that 92 percent of CIOs include structured data in their information strategies and almost 60 percent see BI as a core component of competitive differentiation. These findings come as no surprise, but, disturbingly, traditional business intelligence is too often used in an undifferentiated way. Aggregated data from the past is often viewed outside of its context and compared with static key performance indicators. Knowledge workers receive standardized reports and then take time to interpret the data and make decisions.
Business intelligence 2.0 focuses on business events and how business processes and business users respond to them, Accenture says. For example, unusually high returns of a best-selling product would lead to the examination of many factors. Is that particular batch of product faulty, is there a pattern to the consumers who are returning this product, is there a problem with the packaging, or the sales staff, or even evidence of fraud? In this simplistic example, an "event" called "product return" triggers a series of responses that require access to information and should trigger intelligent decision-making, based on a variety of conditions.
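In software terms, the "product return" example amounts to watching an event stream and firing follow-up actions when a threshold is crossed. A minimal sketch, with an invented threshold and rule set:

# Minimal sketch of event-driven rules for the "product return" example.
# The threshold and the follow-up actions are invented for illustration.
from collections import Counter

RETURN_THRESHOLD = 50  # returns of one product in a period before someone investigates

def handle_return_events(events):
    """Count returns per product and list the follow-up actions for any that spike."""
    counts = Counter(e["product_id"] for e in events if e["type"] == "product_return")
    actions = []
    for product_id, count in counts.items():
        if count > RETURN_THRESHOLD:
            actions.append(f"check batch quality for {product_id}")
            actions.append(f"review packaging and sales process for {product_id}")
            actions.append(f"screen {product_id} returns for fraud patterns")
    return actions

events = [{"type": "product_return", "product_id": "SKU-7"} for _ in range(60)]
print(handle_return_events(events))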
The vast majority of applications and processes have limited ability to absorb changing business needs, are not explicitly defined and do not have comprehensive metadata management processes, Accenture notes. If we are to make BI 2.0 a reality, we first need to look at the assumptions and promises of BI 2.0 from a data management perspective.
How is your organization refining its use of business intelligence? Post your comments below.
Labels:
analytics,
BI,
Brousseau,
business intelligence,
TAWPI
Benchmarking "Do's" and "Don'ts"
By Mark Brousseau
One of the hottest topics at TAWPI’s Payments Capture & Clearing (PCC) Council meeting last month in Las Vegas was how organizations can get the greatest benefits from benchmarking and best practices.
That same question was put to a gathering of top supply chain executives at the Supply Chain Leadership Forum, an event hosted by the Supply Chain Consortium.
By comparing notes and taking a Consortium survey, forum attendees identified the Top 5 “Do’s” and “Don’ts” of benchmarking and best practices. The survey gathered information from participants’ real-life experiences, including supply chain areas that have achieved performance improvements and benchmarking information that has been used and interpreted by their companies.
The Top 5 Do’s:
1. Do align with key stakeholders.
2. Do succinctly summarize benefits for top management.
3. Do reduce your scope to actionable items.
4. Do maintain perspective of both your business and cultural model.
5. Do test multiple options before drawing conclusions.
The Top 5 Don’ts:
1. Don’t use competitors that match up poorly with your processes.
2. Don’t ignore your competition.
3. Don’t use the “boil the ocean” approach (focus, focus, focus).
4. Don’t use benchmarking and data analysis tools without understanding how they work.
5. Don’t work in a vacuum and think your organization knows it all.
Do you have any benchmarking "do's" and "don'ts" to share? Post them below.
Labels:
benchmarking,
best practices,
Brousseau,
metrics,
TAWPI
Choosing a Staffing Provider
By Mark Brousseau
Partnering with a staffing service can help supplement your existing workforce during peak periods, allow you to pre-screen applicants for full-time positions, and even reduce your labor costs. That’s according to Brian Devine, division vice president at ProLogistix. Many industries are embracing the use of a contingent workforce, which now accounts for 2 percent of the entire U.S. workforce, Devine notes.
But companies need to recognize that staffing is a strategic decision, Devine told attendees at the Warehousing Education and Research Council’s (WERC) Annual Conference in Chicago, May 4-7. That means choosing the right partner carefully, communicating effectively at the right levels, and being realistic in your expectations.
Devine said it’s a common misconception that all staffing companies are the same because they recruit from the same pool of people. However, there are several qualities companies should look for in a staffing provider, he noted. These include the thoroughness of the applicant screening process (including worker eligibility, drug testing, and criminal background checks), the competency of local recruiters (including industry knowledge and service-level expectations), competitive pricing, and a staffing provider’s financial stability.
And don’t turn a blind eye to undocumented workers. Devine said many companies mistakenly believe that it’s not their problem if the staffing service uses undocumented workers. He warned that the Department of Homeland Security’s revised penalties and automated e-Verify program make the risks associated with undocumented workers even greater. Also raising the stakes are stringent state laws, such as the one in Arizona.
Do you have tips for partnering with a staffing service? Post them below.
Businesses Want Supply Chain Services
Posted by Mark Brousseau
There is a market for integrated financial supply chain management services, according to Financial Insights’ 2007 North American Commercial Payments Study.
Areas of potential cooperation clearly emerge in accounts receivable (A/R) and accounts payable (A/P), Financial Insights reports. The full benefits of financial supply chain reengineering have yet to be recognized, the consulting firm notes, indicating prospects for businesses, bankers and vendors to work together more closely to realize joint benefits. Banks have not grappled with their own business processes, and they do not have an execution framework, Financial Insights found. Meanwhile, businesses do not see banks as supply chain providers or, if they do, they see them in the same position as other supply chain vendors.
“Businesses are receptive to supply chain services, and banks have everything to gain and lose. Transforming payments to business processes is the key,” says Maggie Scarborough, research manager, Financial Insights Corporate Banking Advisory Services.
Is your bank offering integrated financial supply chain management services? Tell us about it by posting your comments below.
Labels:
Brousseau,
financial services,
supply chain,
TAWPI
Monday, May 5, 2008
Bank On Mistakes
Posted by Mark Brousseau
An interesting article from yesterday's Mail Tribune newspaper:
When banks mess up, consumers often pay, and the costs can be steep
By Gail Liberman and Alan Lavine
MarketWatch
May 04, 2008
PALM BEACH GARDENS, Fla. — The price tag of one recent bank error: At least $2 million. The bank mixed up the account of 49-year-old Benjamin Lovell with the account of a different person with the same name.
Lovell, accused of spending the money without notifying the bank of the mistake, faces a hearing Thursday in Brooklyn's Kings County criminal court. The charge against him: Grand larceny.
Lovell's attorney argues that Lovell didn't intend to steal, but believed he was entitled to the funds.
The case is just one example of the growing problem of bank errors. While most consumers likely won't face charges of grand larceny, there may be other financial pitfalls in store for those who don't carefully monitor the accuracy of bank transactions, including:
Steep, ricocheting bounced check fees — not only charged by your bank, but also by merchants — if a bank error leads to an overdrawn checking account.
Late fees and default interest rates on credit cards if credit card payments aren't properly credited.
Undetected fraud
The Office of the Comptroller of the Currency, regulator of national banks, said complaints of bank errors rose to 2,217 in 2007, a 10 percent rise from 2006. By contrast, total complaints rose 7 percent to 28,362. Of course, the data likely are limited to those customers who detected banks' mistakes.
But how many errors go undetected by those who are too busy to check every detail of their account transactions? After careful scrutiny of her own accounts, one reader says she caught thousands of dollars in bank errors, including:
A check debit for $400 should have been a deposit.
A $3,000 credit card payment was applied to someone else's account.
Despite an ATM withdrawal of $40, no cash actually was provided.
"These items were entirely in my responsibility to fix," the reader complains. "The financial organizations provided little, if any, help, though it was their mistake and if I hadn't pursued it, would not have recovered the money."
More errors, or is it fraud?
Tomas Norton, a Princeton, N.J.-based fraud consultant, says the problem may not necessarily be more bank errors. (One sign that bank errors have been around for years lies in a comical "Beverly Hillbillies" video, dubbed "Before identity theft there were bank errors," at CrazyAboutTV.com/video.)
Rather, more of those errors may be due to fraud, Norton says. That problem is compounded by the fact that it's increasingly difficult to get bank errors fixed.
"The problem with the errors is that no matter how it occurs, whether it's an error or deliberate, the bank is always protected," Norton says.
"If your payment doesn't get to the bank on time, even though there's a plausible delay in the mail, they don't take those excuses," he says. With a credit card, not only could you lose your attractive 7.99 percent rate, but your account balance retroactively can be charged 31 percent!
Also, banks have come to view checking and savings account operations as ways to generate income, Norton says, so fees for customer missteps have escalated dramatically, and your bank may be less willing to quickly fix errors that trigger those fees. In addition, customers often must deal with frustratingly bureaucratic call centers.
Meanwhile, the time period for you to notify your bank of an error — often overlooked in deposit agreements — has been slashed. The latest deposit agreements give you only 60 days to notify your bank of an account error, Norton says. Fail to meet this deadline, and even though an error is your bank's fault, the price tag for the mistake, including accompanying fees, could be yours.
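That 60-day window is easy to blow through if you only reconcile statements occasionally. A small sketch of the deadline check, assuming the clock runs from the statement date (the article doesn't specify the starting point, so treat that as an assumption):

# Small sketch of the 60-day error-notification window described above,
# assuming the clock starts on the statement date (an assumption, not a rule from the article).
from datetime import date, timedelta

NOTIFICATION_WINDOW = timedelta(days=60)

def still_in_window(statement_date, notified_on):
    return notified_on <= statement_date + NOTIFICATION_WINDOW

print(still_in_window(date(2008, 2, 1), date(2008, 3, 15)))  # True: 43 days later
print(still_in_window(date(2008, 2, 1), date(2008, 4, 15)))  # False: 74 days later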
"Billing disputes and error resolution" represented the top consumer complaint among the 4,451 filed with the FDIC in 2007. The same problem also led the 2007 roster of complaints at the Office of Thrift Supervision.
Sunday, May 4, 2008
Online Bill Pay Concerns Reporter
Posted by Mark Brousseau
Below is an article from today's Baltimore Sun that illustrates the lingering FUD factor (fear, uncertainty and doubt) surrounding online bill pay:
Paying online can weave a web of problems
Dan Thanh Dang
Consuming Interests
May 4, 2008
My friends tell me it's quick and painless. They say doing it makes life feel so much easier. They also swear that once you start, you won't be able to live without it.
It sounds so enticing. But I still refuse to bank online or pay my bills online. I do feel like an oddball whenever I sit down a couple times a month to write checks. I still lick the envelopes, press a stamp on each and then walk all of it to a mailbox or post office that I trust to get my bills where they need to go, on time.
Scoff if you will, but I'm not really sure it's more secure to click a few buttons on a computer to digitally send a payment in bits to some numerical account somewhere.
It's not that I haven't seen the writing on the wall.
A recent report from the Federal Reserve showed that more than two-thirds of noncash payments are now done electronically. Studies have shown that the chances of someone stealing my data online are no greater than the chances of someone swiping my mail. I know most banks use some sort of encryption and cryptography to safeguard my transactions. I know online banking scored 82 out of 100 on the University of Michigan's American Customer Satisfaction Index this year. And, yes, I would likely save some money since I wouldn't have to buy stamps anymore.
And yet, I'm still leery about taking the electronic plunge.
Why? Because once every couple of weeks, I hear from someone I know or get a call or e-mail from someone like Laurie Hansen who scares the bejesus out of me about how some company withdrew two electronic payments in one month or how a payment never made it to a designated recipient.
And don't even get me started about automatic bill pay, which is the equivalent of giving someone else license to steal straight out of my bank account whenever they want.
In Hansen's case, the 46-year-old Catonsville lawyer signed up for online bill payment in 2003 when she bought her Dodge Durango. Her online payments to Chrysler were trouble-free for five years until last February and March, when she discovered that two $460 payments didn't make it. They were sent to some mortgage company instead. Stories like that make me cringe.
The fact of the matter is I'm a bit of a control freak about paying my bills. I don't like relying on anyone - be it my bank or the company I owe money to - to make sure I pay a bill on time.
When I write a check and mail it, I know I've done all I can to make sure my bill gets paid.
When I click a button, I'm relying on my bank to make sure that transaction goes through on time. What if the system crashes and my payment does not get sent? What if my bank sends two payments instead of one to my mortgage company?
I'd get whacked with late fees and overage penalties, which would ding my credit, and bad credit would make my interest rates go up, high interest rates would make it harder for me to pay my bills, and before you know it, my dog and I would end up living on a bench in Patterson Park.
OK, I exaggerate. But only a little. In Hansen's dilemma, butterfingers did her in. Someone typed in some wrong keys, and her money was sent to some company she didn't owe money to.
"When I signed up for online bill payment, I didn't have to type in an address," Hansen said. "Bank of America said it already had a relationship with Chrysler Financial, so all I had to do was fill in my account number. Things went fine until my payments went missing. When I called Chrysler, they told me they changed their billing address recently.
"When Bank of America went back to look at what happened, they found that an incorrect ZIP code was entered from their merchant list. It resulted in my money being sent to the wrong company. The biggest problem for me was not knowing when they would put my money back."
Hansen said one customer service rep told her it would take three to five days to resolve the case, another said five to seven, and a third told her seven to 10 days. Banks have 10 days to conduct an investigation, and if they can't come up with a determination in your case, they must provide you with a provisional credit, according to Craig Stone, deputy ombudsman for customer assistance for the Office of the Comptroller of the Currency, which regulates all national banks.
Bank of America declined to comment on Hansen's personal account because of privacy concerns, but spokeswoman Tara Burke said, "We capture payments by using P.O. boxes and ZIP codes. You have to enter both correctly in order for the payment to go through accurately. If one is wrong, the payment can be diverted to the wrong place." But, in any case, Burke said, "If a mistake or a late payment is made on our end, our customers are protected under a zero percent liability guarantee."
True, but it could still take 10 days to correct the problem. That's a long time for some people living from paycheck to paycheck.
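As for the mechanics Burke describes, here is a minimal, purely hypothetical sketch of the kind of lookup involved: the biller is identified by its remittance P.O. box and ZIP code together, so a single wrong ZIP on the merchant list points the payment somewhere else entirely. Every name, box number and ZIP code below is invented for illustration.

    # Hypothetical payment-capture lookup: a biller is keyed by P.O. box AND ZIP.
    # All entries are invented; no real merchant list is implied.
    MERCHANT_LIST = {
        ("PO Box 9001", "19101"): "Chrysler Financial",
        ("PO Box 9001", "19102"): "Acme Mortgage Co.",  # same box, different ZIP
    }

    def route_payment(po_box, zip_code, amount):
        biller = MERCHANT_LIST.get((po_box, zip_code))
        if biller is None:
            return "${:.2f} held: no biller matches {} / {}".format(amount, po_box, zip_code)
        return "${:.2f} sent to {}".format(amount, biller)

    # The correct ZIP routes the car payment where it belongs ...
    print(route_payment("PO Box 9001", "19101", 460.00))
    # ... while a mistyped ZIP on the merchant list quietly diverts it.
    print(route_payment("PO Box 9001", "19102", 460.00))

In a scheme like this, a keying error in either field sends the money to the wrong biller without any obvious failure, which is essentially what Hansen describes happening to her two $460 payments.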
Whether it was Hansen's fault, the bank's fault or both, the whole situation was resolved in one day after Hansen called the OCC, which put her in touch with Bank of America's executive office.
The $920 owed to her was placed back into her account on April 1 - less than 10 business days after she called the bank on March 19.
"If someone had explained to me that the money would be returned in 10 days, I would have felt a lot better," Hansen said. "I'm definitely going to pay closer attention to everything in the future and double-check that my payments are going through. I've only ever had one other problem come up in the eight years that I've been using it. I use online bill paying all the time, so I'm not sure I could live without it now."
I remain unconvinced.
Labels: Brousseau, online banking, online bill pay, TAWPI
Friday, May 2, 2008
Regulus Sale Just The Start?
By Mark Brousseau
The sale of Regulus Group LLC earlier this week to 3i Infotech, a global information technology company (see TAWPI Top Stories), could be the start of a wave of consolidation among lockbox providers.
Regulus is the largest independent remittance provider and one of the leading providers of document processing services in the United States – addressing the full document lifecycle from print and electronic bill presentment to remittance. Under the terms of the agreement, 3i Infotech has proposed to acquire 100 percent of Regulus, including the company’s products, trademarks and product brands.
John Mintzer, vice president at Citizens Bank, expects more consolidation among lockbox providers, some of it simply as a result of mergers and acquisitions among regional banks.
But the real driving factor, in Mintzer’s view, is the declining number of consumer checks. “The ability for lockbox processors to meet their fixed costs gets increasingly difficult as check volumes decline,” Mintzer told me. “The single biggest challenge that lockbox processors face is the cost of labor, including benefits. This is a significant expense, and one that is harder and harder to cover as customers require more exceptions-type processing.”
Serena Smith, senior vice president, Fidelity National Information Services, agrees with Mintzer. “All of us are facing economic pressures as volumes decline,” Smith told me, adding that consolidation among lockbox providers will be the biggest story in the market over the next 12 months. “Providers will need an aggressive approach to the market, which means expanding their product offerings or exiting the business altogether. Providers who have not embraced a complete payments offering will miss the boat.”
Smith said that many in-house processors already are looking to outsource, to find the best mix of price, quality and functionality. “Processors have to be creative to differentiate themselves on something other than price,” she noted.
Mintzer believes that survivors of the coming lockbox market shakeout will need to have significant automation that makes their operations less dependent on heads-down labor. “Surviving processors also will require the ability to combine inputs of information received from multiple sources into one concise file or report, essentially providing the customer with an information dashboard.”
Smith said survivors would need to demonstrate robust product offerings, a commitment to the business, sustainable market share, and ready capital for investment.
With consolidation on the horizon, the obvious question is why companies like 3i Infotech are entering the lockbox space. Smith said the trend of in-house processors outsourcing their volume is very compelling, and could deliver a large amount of volume to successful providers.
But Mintzer warns that the financials don’t seem to support new entrants: “The significant investment in plant and equipment is extremely hard to make up in this ‘penny’ business, and this is before factoring costs associated with disaster recovery.”
Do you foresee more consolidation in the lockbox market? Post your comments below.
Labels: Brousseau, lockbox, outsourcing, Regulus, retail lockbox
Electronic Deposit Fees Too High?
By Mark Brousseau
During last month’s Payments Capture & Clearing (PCC) Council meeting in Las Vegas, seemingly high bank fees for electronic deposits were cited by several participants as a stumbling block to their organization’s adoption of Check 21. The results of a recent TAWPI Question of the Week show that PCC Council members aren’t the only ones put off by their bank’s electronic deposit fees.
Sixty-five percent of respondents to the online survey said that their bank’s fees for electronic deposit were “too high.” Conversely, just 14 percent of respondents thought that their bank’s fees were “lower than those of other banks.” Twenty-one percent of respondents said their bank’s electronic deposit fees were “just right” – which would make Goldilocks proud.
What do you think of your bank’s electronic deposit fees? Post your comments below.
Labels: Brousseau, Check 21, electronic deposit, fees, Goldilocks
Tepid Interest in 'Digital Vault'
By Mark Brousseau
While a number of prominent financial institutions have recently introduced so-called digital vault solutions to archive images of various documents and records on behalf of their corporate and retail clients, the results of a recent TAWPI Question of the Week show lukewarm interest in the concept among visitors to the association’s Web site.
Seventy-four percent of respondents said they had no interest in the concept, while just 12 percent of respondents said they were interested. Fourteen percent of respondents weren’t sure what to make of the service (obviously, they weren’t on the banks’ news distribution list).
What do you think of the digital vault concept? Post your comments below.
Labels: archive, Brousseau, digital vault, document imaging, records
Remote Deposit Capture Saves Gas
Posted by Mark Brousseau
An interesting article on remote deposit capture in yesterday's edition of The Charlotte Observer:
Remote deposit saves steps, gas
CHRISTINA REXRODE
In an era of online banking and cash-back at the grocery store, depositing a check is one of the few tasks that forces people to journey to the nearest bank or ATM.
Remote deposit -- or depositing a check online -- could change that.
Half the country's banks offer the service to business customers, touting its convenience. Now some smaller banks, looking for creative ways to distinguish themselves, are considering the same service for consumers.
Remote deposit lets customers scan a check, submit it to the bank online, then destroy it a few days later.
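For a rough idea of what that scan-and-submit step can look like behind the scenes, here is a minimal, hypothetical sketch of a remote deposit submission. The endpoint URL, field names and response format are invented for illustration and do not describe any particular bank's actual service.

    # Hypothetical remote deposit submission: encode the scanned check images
    # and post them, with the deposit details, to the bank's capture endpoint.
    # The URL and field names below are placeholders, not a real bank API.
    import base64
    import json
    import urllib.request

    def submit_deposit(front_path, back_path, account, amount):
        with open(front_path, "rb") as f:
            front = base64.b64encode(f.read()).decode("ascii")
        with open(back_path, "rb") as f:
            back = base64.b64encode(f.read()).decode("ascii")
        payload = json.dumps({
            "account": account,
            "amount": amount,
            "front_image": front,
            "back_image": back,
        }).encode("utf-8")
        req = urllib.request.Request(
            "https://bank.example.com/remote-deposit",  # placeholder endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)  # e.g., a confirmation number and hold date

    # Example usage, once the check has been scanned to disk:
    # print(submit_deposit("check_front.png", "check_back.png", "123456789", 125.00))

In any arrangement along these lines, the bank never handles the paper item: the customer holds the original check until the deposit clears, then destroys it.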
USAA Federal Savings Bank, a Texas-based bank that caters to the military, pioneered remote deposit for consumers when it launched Deposit@Home more than a year ago.
Kerri Herring, a teacher's assistant and part-time photographer in Kannapolis, said she and her husband use it at least a couple of times a month.
People often pay Herring by check for photography work.
"We're checkless most of the time," said Herring, 23, "but there are always going to be grandparents who send birthday money."
Herring hates driving to the bank just to deposit a check. "It wastes time," she said. "I hate standing in line."
Some financial institutions are starting to pick up on that vibe. Massachusetts-based EasCorp, which sells a remote deposit service called DeposZip, says seven credit unions in Massachusetts, New Hampshire, Indiana and Oklahoma now offer it to consumers, and another nine throughout the country plan to do the same.
Charlotte-based NewDominion Bank, which already offers remote deposit for businesses, says consumer remote deposit is "on the drawing board." BB&T Corp. in Winston-Salem also says it's considering it.
Other companies say they're spreading the remote deposit concept, minus the scanner. In February, the Charlotte Metro Credit Union started advertising HomeDeposit, which lets "highly qualified" customers deposit checks by submitting information from the check to the credit union's Web site, then mailing it in.
"They've all had three or four checks laying around for $10, $15," said Nathan Tothrow, the vice president of marketing. "And who wants to get in the car and drive to the bank for that?"
Mitek Systems, a San Diego company, is advertising software that lets consumers deposit checks by photographing them and emailing the image -- all via cell phone.
Biggest thing since ATMs
Charlotte-based Wachovia Corp. and Bank of America Corp. started offering remote deposit for big corporate clients in 2004. At Bank of America, product executive Bob Johnston says the response from businesses has been "phenomenal," especially for global companies who don't want to mail checks across the ocean to their different offices. "Ground courier to an airline, back to a ground courier -- you can imagine the length of time and cost to do that," Johnston said.
Among smaller companies, banks are betting that remote deposit will appeal to niches that still deal often with checks, like property managers or nonprofits.
Akil Boston, the community development coordinator for Charlotte's Second Harvest Food Bank, says that using Wachovia's corporate remote deposit has cut out his almost-daily trip to the bank. "Seven miles roundtrip," said Boston, 27. "When you look at gas prices nowadays, it's pretty economical."
Remote deposit benefits the banks, too. It can enable them to expand their reach, serving customers who don't live near a branch. It can cut down on foot traffic at the branches, which can save money on staffing. Ninety percent of a teller's work involves checks, according to Celent, a financial services research firm.
"There hasn't been a financial services technology that has received so much attention since the adoption of the ATM," said Christine Barry, research director with The Aite Group, a financial services research firm in Boston.
Consumer option unlikely soon
Remote deposit for consumers is in its early stages. A survey of 157 banks, released in March by Celent, found that one-fifth offer or are planning to offer it.
But Wachovia and Bank of America say they have no such plans. Some doubt whether most people would go to so much trouble to deposit a check.
"It would be easier to drop it in an ATM (or) the mail, or walk it into the branch during lunch hour," says Jim Bruene, editor of the Online Banking Report, a trade publication.
Banks usually charge business customers per month and per check for remote deposit, and compatible scanners can cost at least $300. To convince consumers to use remote deposit, Bruene and Barry say, banks will have to drop the fees and make it compatible with low-end scanners.
USAA, the EasCorp credit unions and the Charlotte Metro Credit Union charge no fees for their programs, and USAA and EasCorp say their programs work with most any scanner.
Labels: Brousseau, Check 21, imaging, remote deposit capture, scanners