By Wally Vogel (wvogel@creditron.com)
When remote deposit capture (RDC) first burst onto the scene, it was billed as a way for companies to eliminate daily trips to the bank. Today, reducing the time and costs associated with depositing checks is still a key factor in the adoption of the technology. But the technology has also evolved into a complement to remittance processing, such as a way for far-flung sales agents to capture check images more quickly – helping to drive faster funds availability and enhanced service.
RDC is a product of The Check Clearing for the 21st Century Act (“Check 21”), a federal law enacted in 2003 and effective in 2004 that allows billers to electronically capture and transmit images of checks (via the X9.37 file format) to their financial institution for clearing. Integrated balancing and automated character recognition tools assist billers in building balanced deposits. Once deposits are transmitted, the original checks are truncated – retained by the biller – and eventually destroyed. Upon receipt of the file, the bank validates the items, performs any necessary corrections, and creates an image cash letter (ICL) for deposit.
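To make the balancing step concrete, the short Python sketch below reconciles captured item amounts against an operator’s declared control total before transmission. It is only an illustration: the field names and the routine itself are assumptions for this example, not part of the X9.37 format or any particular RDC product.

from decimal import Decimal

def balance_deposit(items, control_total):
    """Compare the sum of captured item amounts to the operator's declared control total."""
    captured_total = sum(Decimal(item["amount"]) for item in items)
    difference = control_total - captured_total
    return {
        "item_count": len(items),
        "captured_total": captured_total,
        "in_balance": difference == Decimal("0.00"),
        "difference": difference,  # a non-zero difference is routed to an operator for correction
    }

items = [{"amount": "125.00"}, {"amount": "89.50"}, {"amount": "240.25"}]
print(balance_deposit(items, Decimal("454.75")))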
A Complement to Remittance Processing
It didn’t take long for billers to recognize that RDC could be used to complement -- and streamline -- remittance processing, which has historically been a back-office task. By capturing payment images and data at the point of presentment, and integrating them with the back-office payments stream rather than redirecting the payments to the back office, billers can accelerate processing and funds availability; eliminate the costs of shipping or transporting payments to the back office; offload some -- or even all -- of the work from back-office staff, freeing them to perform other functions; and reduce paper handling and its associated costs. What’s more, capturing payments information sooner provides benefits such as faster posting, better visibility into receivables, and faster responses to customer inquiries.
The best part: remote deposit capture accomplishes all of this without requiring much upfront cost.
New RDC technologies are further expanding the applicability of the technology as a complement to back-office remittance processing. For instance, support for flatbed TWAIN scanners enables consumers or remote employees to capture payments without a specialized check scanner. Recognizing that users in this environment may not be trained in payment processing, the software guides them through the scanning process. Another advancement in RDC is the use of smartphones to capture payment images. This enables field agents to capture payment images without having to transport or ship them to the back office for processing – greatly speeding posting.
The Bottom Line
By complementing back-office remittance processing with RDC, billers make funds available sooner, reduce overall costs, and enhance customer service. And because of the low overhead associated with RDC, growth in a biller’s remittance volume can be accommodated without a corresponding growth in back-office infrastructure – meaning billers can avoid a lot of capital expense.
Friday, July 30, 2010
Thursday, July 29, 2010
Network like it’s your job
Posted by Mark Brousseau
Finding a job in today’s job market can be like conquering a new frontier. With the unemployment rate still over 9 percent, the market is flooded with competition, and many job seekers are experiencing culture shock when they send out their résumés. After all, the days of mailing in your résumé and receiving a phone call to set up an interview are over. Today, everything is done online, from submitting your résumé to setting up your first interview, and nine times out of ten you’re lucky to receive any kind of response, even an automatic one thanking you for your submission.
It doesn’t take long to discover that in a virtual world it can be very difficult to get noticed by the decision makers whom you need to impress in order to land the job. Maribeth Kuzmeski says there are three easy steps to getting noticed in today’s digitally dominated job market—networking, networking, networking.
“Today you need more than a résumé and a cover letter to get that dream job,” says Kuzmeski, author of The Connectors: How the World’s Most Successful Businesspeople Build Relationships and Win Clients for Life. “Think of yourself as CEO of Me, Myself, and I, Inc. You need to be doing everything you can to get the word out about your brand. That means networking.
“Great networkers are capable of leaving something behind with everyone they encounter—a thought, a memory, or a connection. This is exactly what you need to do if you are in the job market. You need to make strong connections, become a relationship builder. You want to be the first person who comes to mind when someone in your network hears about a great job opening.”
Kuzmeski offers advice for how you can network your way to a great new job:
• Rejuvenate your résumé.
• Build your online résumé using LinkedIn.
• Get face-to-face with potential employers!
• Make an impact by using video.
• Become a contrarian networker.
• Let them do the talking.
• Be prepared to pitch yourself in fifteen seconds.
• Network to the people you know.
• Get involved in organizations that are connected to your profession.
• Volunteer.
• Be a mover and a shaker.
• Always be networking.
“Trying to find a job in such an overcrowded job market can be a daunting task,” says Kuzmeski. “But by placing a renewed focus on networking, you open yourself up to many more opportunities than just the ones on the job boards or those being offered at your local job fair. I truly feel that there are only six degrees of separation between everyone in the world—or at the very least the U.S. Every time you make a new connection you get that much closer to a great new opportunity.”
What do you think?
Tuesday, July 27, 2010
Electronic invoicing gains momentum
Posted by Mark Brousseau
Accounts Payable (AP) functions are still drowning in paper, but that may be about to change. According to APQC’s Open Standards Benchmarking in accounts payable, on average 69.4 percent of invoices still require manual re-keying of line-item data, and only 20.1 percent of invoice line items are received electronically. Although myriad technologies perform an incredible array of tasks in successful companies around the globe, APQC (www.apqc.org) notes that AP departments are only now approaching the crucial tipping point where electronic invoicing will overtake manual, paper-based processes, a milestone expected to occur in 2011.
Electronic payment systems now on the market promise efficient communication, reliable audit trails, and faster/smoother data processing between internal departments and external suppliers. Differing systems offer various levels of transparency, approvals, and monitoring from procurement to payment. However, the common theme is less paper and less manual keying of data.
APQC says a typical transaction begins when a purchase order request is entered into the buyer’s system; once a supervisor provides approval, the appropriate vendor is notified. The vendor then generates an invoice while simultaneously arranging delivery of its goods or services. The invoice is routed electronically to the AP department, which matches the invoice to the purchase order and, often, other documents that prove the goods or services were received as expected. Once verification is complete, the payment transfer is triggered.
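As a rough illustration of the matching step APQC describes, the Python sketch below performs a simple three-way match of an invoice against its purchase order and receiving record. The field names and the 2 percent price tolerance are assumptions made for this example, not APQC benchmarks.

from decimal import Decimal

def three_way_match(invoice, purchase_order, receipt, price_tolerance=Decimal("0.02")):
    """Return a list of exceptions; an empty list means the invoice can be released for payment."""
    exceptions = []
    if invoice["po_number"] != purchase_order["po_number"]:
        exceptions.append("invoice references a different purchase order")
    if invoice["quantity"] > receipt["quantity_received"]:
        exceptions.append("billed quantity exceeds quantity received")
    if invoice["unit_price"] > purchase_order["unit_price"] * (1 + price_tolerance):
        exceptions.append("unit price exceeds PO price beyond tolerance")
    return exceptions

invoice = {"po_number": "PO-1001", "quantity": 10, "unit_price": Decimal("9.90")}
po = {"po_number": "PO-1001", "quantity": 10, "unit_price": Decimal("9.75")}
receipt = {"quantity_received": 10}
print(three_way_match(invoice, po, receipt) or "matched: release payment")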
The level of automation and sophistication can vary widely; a PDF of an invoice sent via email sits on one end of the automation spectrum, with a fully “touchless” integrated system that connects buyer and seller at the other end.
Automated payment technology has been in place at many large companies for years, but the systems were often large-scale customized initiatives, expensive both to build and maintain. As more advanced technology tools arrive on the market, the costs as well as the barriers to implementation continue to fall, APQC concludes.
What do you think?
Monday, July 26, 2010
Nailing Down Resource Allocation
Posted by Mark Brousseau
Resource allocation may be the key to IT project investment. Mike Kerrigan (mkerrigan@laurustech.com), vice president of business applications for Laurus Technologies (www.laurustech.com), explains:
The economic downturn may finally be changing direction, but it still puts a damper on every aspect of business, including IT departments. Even with some recent signs of recovery, businesses remain keen on cost savings, and spending is still prioritized around maintaining operations rather than new initiatives. With budgets and resources remaining limited, companies need to be mindful that projects of significant value can fall by the wayside.
Businesses can recover by working smarter with fewer resources while maintaining high levels of quality and service. This is no easy task, but if everyone – from the top down to the bottom up – carefully considers what they’re implementing, the dollars spent will go toward the most worthwhile programs. To best allocate available resources, you’ll need to break down the types of information you have, identify the tools needed to pull that information together, and focus on document management and workflow.
Breaking Down the 4-1-1
To start with, no matter what type of IT task it is, there is one common element – information. Information about what you want to do, information about how you are going to do it, information about how the plan is progressing (or not) and information about the end result. So a good place to start is a breakdown of the definitions associated with project information.
By Use
Project Governance: This type of data is used to steer individual projects at a high level, such as program and project portfolio management. It is typically referred to as “master data” or “status” information. It is mainly used by project owners or steering committee members at the single-project level, as well as by the portfolio manager, the portfolio owner/portfolio management team, or other stakeholders.
Project Collaboration: These pieces of information are mainly used to deliver expected results. Major interest groups are Project Managers or Project Team Members. This data enables the whole team to carry out program tasks.
By Type
Project Management: This contains everything used to keep things running smoothly and in an organized fashion. It is largely independent of the specific project and looks similar across the board. For example, it may include meeting minutes, action item lists, open issue lists, schedules with delivery status information, timesheets, etc.
Project Content: Items needed to reach goals and attain desired results fall into this category. This may include technical plans, construction plans, ingredient lists, recipes, letters to third party suppliers, contracts, etc.
Getting a Grip on the Data – Tools You’ll Need
Now, how do you facilitate the governance of all of this information? It has become essential to manage, monitor, and assess the status of all projects through Enterprise Project Management (EPM). This is a set of uniform processes, methods and application packages. Typically, organizations that adopt EPM set up a Project Management Office (PMO) and select and adopt a specific Project Management Methodology (or create a proprietary method). They might even select and implement software tools to support Enterprise Project Management and collaboration.
EPM Tools: Enterprise project management tools focus on supporting single projects, no matter what type of program or content is involved. This single-project information can also be used for multi-project management or project portfolio management, based on the master data and status information of all the work in an enterprise. Examples range from pure planning and control tools such as MS Project (Microsoft) to sophisticated solutions for managing the lifecycle of a single project – bringing idea management, approvals, and more to program and portfolio planning and control – such as Clarity (Computer Associates), MS Project Server (Microsoft), and Primavera. These tools include components such as a project master data database, a workflow engine, and a reporting engine.
Collaboration tools: These are often developed for various purposes, not just for project management. Facilitating collaboration within groups, these tools allow users to store documents, set up group folders and enable other features like group calendars and forums. Examples are eRoom/Documentum (EMC) and SharePoint (Microsoft).
De-clutter Your Documents
The next step to optimal resource allocation is to review your document management. Have you ever stopped to think about how much time is wasted searching for documents? It doesn’t matter if you use a shared drive or document management platform – unless standards are in place, you are wasting resources to find what you need. The same can be said for saving documents. You spend time sifting through folders to figure out where something belongs. In the end, you wind up creating a new folder to add to the rest. This is computerized clutter at its finest!
The purpose of document management is to move information from individual computers to a shared space for broad access. By learning from historical data, you can reuse instead of recreate, and stop flooding email inboxes with documentation. If project teams used a business process to create, access, and edit information in a centralized location, the need to constantly email documents would decrease tremendously, and only accurate versions of documents would be used. Best of all, team leaders and top management would know how and where to check on progress at all times.
Shared drives, which are just virtual filing cabinets, are very limiting and not so user-friendly. It is too easy to bury information in the multiple layers of folders. A document management platform (DMP) has many more features for easy navigation and searching. DMPs bring the ability to create wiki pages, a knowledge base, shared calendars and document version control. However, like the virtual filing cabinet, DMPs can be just as burdensome unless the following items are addressed:
• Blue Print: Plan a layout of how the tool will be used within the company
• Appoint: Name a project manager (or managers) to manage various areas within the tool
• Architect: Devise a structure and naming standard
• Instruct: Document and teach how to use the system
• Broadcast: Formally communicate the new way of saving and retrieving information
• Verify: Conduct reviews of the platform and hold people accountable
Workflow
Once you’ve taken care of the document stream, it’s time to work on overall project flow. You need to reach consensus on the methodology and process to be used for ALL efforts. If this is already defined, then review how well it is working, make any adjustments, and communicate the process to everyone. Next, decide on baseline criteria for project selection – if a project doesn’t meet the initial baseline, stop. If you do proceed, plan for continual assessments to ensure business alignment. We all know business needs, goals, and strategies continually change, and there’s no need to continue investing in a project that doesn’t fit your overarching business goals. Moving forward in the planning process requires many steps, but there are two critical elements to include:
1. Work Breakdown Structure
2. Risk Register
A Work Breakdown Structure (WBS) is a graphical representation of the entire project with a forecast from beginning to end. Because it is graphical, a WBS makes it easy to communicate all of the details. Think of a WBS like the instructions that come with a “do-it-yourself” kit: if you take time to review the instructions and lay out all of the components in advance, you minimize interruptions caused by hunting mid-project for parts or tools that were called out up front.
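As a simple illustration (not a prescribed methodology), a WBS can also be represented in code as a nested structure whose estimated effort rolls up from the leaves; the task names and durations below are invented.

def rollup_effort(node):
    """Total estimated days for a WBS node and all of its children."""
    return node.get("days", 0) + sum(rollup_effort(child) for child in node.get("children", []))

wbs = {"name": "Document management rollout", "children": [
    {"name": "Blueprint and naming standard", "days": 5},
    {"name": "Platform configuration", "days": 10, "children": [
        {"name": "Folder structure", "days": 3},
        {"name": "Version control setup", "days": 2}]},
    {"name": "Training and communication", "days": 4}]}

print(rollup_effort(wbs))  # 24 estimated days across the whole structure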
A risk register contains an ongoing list of anything, positive or negative, that might cause changes. An owner should be assigned to each risk, and this “ownership” should continue throughout the project’s lifecycle. With proper risk evaluation, the team can estimate the probability of the threats that would cause the greatest impact, and contingency plans can be created from that. If needed, the risk owner – not the project manager – takes action by putting the backup plan in motion. The impact should be minimal, since the risk was identified and built into the project timeline and budget.
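A minimal risk register might look like the sketch below; the 1-to-5 scoring scale, the fields and the sample entries are assumptions for illustration.

risk_register = [
    {"risk": "Key integration resource unavailable", "owner": "PMO lead",
     "probability": 3, "impact": 4, "contingency": "Pre-approve a contractor backfill"},
    {"risk": "Vendor delivery slips past cutover date", "owner": "Procurement lead",
     "probability": 2, "impact": 5, "contingency": "Stage a parallel run on the legacy system"},
]

# Rank risks by exposure (probability x impact) so the biggest threats surface first.
for entry in sorted(risk_register, key=lambda r: r["probability"] * r["impact"], reverse=True):
    print(entry["owner"], "owns:", entry["risk"], "- exposure", entry["probability"] * entry["impact"])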
Even in times of fiscal frugality, you can implement successful programs by applying a disciplined approach to all of your resources.
What do you think?
Thursday, July 15, 2010
New Survey -- Trends in Healthcare Payments Automation
Posted by Mark Brousseau
As an industry expert and healthcare provider, you are invited by TAWPI and the HIMSS Medical Banking Project to participate in a brief electronic survey designed to help healthcare organizations benchmark their payment operations.
The study will provide unbiased information to help healthcare organizations understand how their peers are using paper-based and electronic payments technologies and processes. Our objective is to help healthcare payments executives gain deeper insights about the adoption and effectiveness of these technologies and processes so they can make better informed strategic and tactical decisions.
Visit here to take the survey: http://www.surveymonkey.com/s/FHLFDN6
Full results on the survey will be published later this summer. The survey includes about three dozen questions and takes about 10 minutes to complete.
Participant Benefits:
Survey participants will receive a complimentary copy of the study, as well as special access to a Webinar on the results of the survey. In addition, you can choose one of the following incentives:
A $25 American Express Gift Card
A more than 50% discount on a full registration to TAWPI's Healthcare Payments Automation Summit (HPAS), scheduled for September 19-21 in Boston (a $395 savings!)
The study’s insights on how other healthcare organizations are using paper-based and electronic payments technologies and processes will be worth the 10 minutes or so you will spend completing the survey.
Confidentiality:
Participation in this survey is voluntary and anonymous. No individual results or information about individual organizations will be reported or recorded; only group/industry results will be reported.
How to Participate:
Click here: http://www.surveymonkey.com/s/FHLFDN6 to complete the survey. If you are not the correct person to fill out this survey, please forward it to the proper person within your department.
You may complete the survey any time between now and August 1, 2010.
Your input means a great deal to us and we want to thank you for your time.
Tuesday, July 13, 2010
Trends in ACH Dispute Management
Thursday, August 12 at 1 p.m. eastern
As ACH volumes have grown, so too has the number of ACH transaction disputes that processors must manage. Expensive to handle, these disputes are subject to a complex mix of rules and regulations, and can lead to hefty charge-offs if improperly managed. Just how big a problem are ACH disputes? This webinar will share the results of an exclusive survey of ACH processors on trends in ACH dispute management, including volumes, costs, levels of automation, future plans and more. Attendees will be able to benchmark their operations, gain actionable insights from our panelists, and learn what some processors are doing to automate their ACH dispute processing.
To register, e-mail Dave Nitchman of IAPP-TAWPI at dnitchman@tawpi.org.
Panelists:
Rossana Salaris, principal, Radix Consulting
Amer Khan, senior vice president, product and sales support, eGistics
Moderator:
Mark Brousseau, facilitator, IAPP-TAWPI Payments and Receivables Council
Monday, July 12, 2010
Economic risks of data overload
By Ed Pearce (epearce@egisticsinc.com) of eGistics (www.eGisticsinc.com)
When data pours in by the millisecond and the mountain of information builds continuously, professionals inevitably cut corners and go with their 'gut' when making decisions that can impact financial markets, medical treatments or any number of time-sensitive matters, according to a new study from Thomson Reuters. The study indicates that when faced with unsorted, unverified "raw" data, 60 percent of decision-makers will make "intuitive" decisions that can lead to poor outcomes.
Many government regulators have flagged increased financial risk-taking, which can be traced in some degree to imperfectly managed data, as a contributor to the recent financial crisis. Moreover, the world is awash with data -- roughly 800 exabytes -- and the velocity of information is increasing, Thomson Reuters says.
The challenge is that the staffing and investment needed to ensure that information and information channels are trusted, reliable and useful are not keeping pace. In fact, it is estimated that the information universe will increase by a factor of 44, the number of managed files by a factor of 67, and storage by a factor of 30 – but staffing and investment in careful management by a factor of only 1.4.
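To see how quickly that gap widens, the small calculation below applies those growth factors to a baseline; only the factors come from the estimates above, and the baseline index values are invented for illustration.

baseline = {"information (EB)": 800, "managed files (index)": 100,
            "storage (index)": 100, "staffing/investment (index)": 100}
growth = {"information (EB)": 44, "managed files (index)": 67,
          "storage (index)": 30, "staffing/investment (index)": 1.4}

# Project each metric forward and show how far staffing lags the data it must manage.
for metric, base in baseline.items():
    print(f"{metric}: {base} -> {base * growth[metric]:g}")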
"The solution to data overload is to provide decision makers with what Thomson Reuters calls Intelligent Information: better organized and structured information, rapidly conveyed to the users preferred device," says David Craig, executive vice president and chief strategy officer.
Fortunately, as the Thomson Reuters study notes, the same technological revolution that has resulted in the explosion of information also opens the way to new and improved tools for providing intelligent information: better organized and structured information, rapidly conveyed to the user's preferred device.
"We must use the benefits of the information technology revolution to minimize its risks. This is a joint task that the private sector and governments must closely focus on if we are to avoid systemic crises, in the future, whether we speak of finance, healthcare delivery, international security and a myriad of other areas," comments Craig.
How is your organization managing information overload?
Saturday, July 10, 2010
Same-day ACH settlement highlights need for better dispute management tools
By Ed Pearce (epearce@egisticsinc.com)
Last week's announcement by the Federal Reserve Board of posting rules for a new same-day automated clearing house (ACH) service brought the topic front and center. Everyone from industry analysts and bloggers to trade publications and associations has expounded on the pros and cons of same-day settlement. But virtually unmentioned in all the hubbub is the potential for more ACH disputes as a result of accelerated settlement -- a scenario most banks are ill-prepared to manage.
Starting next month, the Federal Reserve Banks will be offering a same-day settlement service for certain ACH debit payments through its FedACH service. FedACH customers may opt-in to the service by completing a participation agreement. The service will be limited to transactions arising from consumer checks converted to ACH and consumer debit transfers initiated over the Internet and phone. Same-day forward debit transfers will post to a financial institution's Federal Reserve account at 5 p.m. eastern time, while same-day return debit transfers will post at 5:30 p.m.
As a result of the faster settlement, banks undoubtedly will see more consumers coming into their branches complaining of unauthorized transactions. The limitations of traditional in-house ACH systems and the strict time constraints and complex processing requirements imposed by NACHA rules and Regulation E already have led to sharp increases in operations expenses and higher charge-offs associated with ACH disputes. A new influx of consumer disputes will require financial institutions to implement a more centralized, more streamlined approach to dispute management.
Several features will be critical:
• Real-time distributed data access to any authorized user, anywhere
• Intuitive search capabilities
• The ability to annotate comments to disputed transactions
• The ability to export data
• Expanded search capabilities
• Filtering capabilities to block and restrict access to certain transactions
• Unlimited data storage
It may be some time before same-day ACH settlement achieves critical mass. But the next generation of consumers will demand it. This means that banks must begin adapting their ACH infrastructure today or risk even higher operations costs, as well as falling behind the competition. And this includes deploying sophisticated solutions to manage the inevitable spike in ACH disputes.
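As one way to picture the kind of record such a solution might keep, the Python sketch below models a dispute with annotations, export, and room for upstream access filtering, in the spirit of the feature list above. The field names and statuses are assumptions, not NACHA or Regulation E requirements.

import csv, io
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class AchDispute:
    trace_number: str
    account_id: str
    amount: float
    received: date
    status: str = "open"
    notes: List[str] = field(default_factory=list)  # annotations from any authorized user

    def annotate(self, user, comment):
        self.notes.append(f"{date.today().isoformat()} {user}: {comment}")

def export_csv(disputes):
    """Export disputes for research or audit; access filtering would be applied upstream."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["trace_number", "account_id", "amount", "received", "status"])
    for d in disputes:
        writer.writerow([d.trace_number, d.account_id, d.amount, d.received.isoformat(), d.status])
    return out.getvalue()

dispute = AchDispute("091000012345678", "ACCT-0022", 145.10, date(2010, 8, 2))
dispute.annotate("branch_user_17", "Customer states the debit was not authorized")
print(export_csv([dispute]))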
Capture 2011: From Imaging to Archive
Coming in Early 2011 -- Actionable ideas for improving document-driven business applications!
TAWPI, IAPP and IARP have joined forces to create Capture 2011 – the premier event on complex data capture and transactional content management. This one-of-a-kind event focuses on emerging technologies and best practices for the automation of critical document-driven applications such as: invoice processing, order entry, application processing, loan processing, tax processing, healthcare payments processing, mailroom automation, payments and more!
Through end-user case study presentations, interactive panel discussions, and visionary keynote presentations, attendees will gain actionable strategies for improving business outcomes in document-driven applications. The event also will feature valuable networking opportunities, and an expo hall in which attendees can see data capture and transactional content management technologies and services firsthand.
Topics covered will include:
• Complex data capture
• Content management
• Workflow/decisioning
• Enterprise data capture
• Information archive/storage/delivery
• Data security/privacy/compliance
• SharePoint optimization
Vertical markets covered:
• Banking/financial services
• Insurance
• Government
• Healthcare
• Service bureaus
• Utilities/telcos
• Retail/mail order
• And more!
Capture 2011 will be an unparalleled event for professionals responsible for managing document-driven business applications. Don’t miss it!
For more details, visit www.tawpi.org or www.iappnet.org.
Wednesday, July 7, 2010
Putting the kibosh on soaring software maintenance and upgrade costs
By Randy Davis (rdavis@egisticsinc.com)
Finextra reports that in a recent speech to the Committee for Economic Development in Australia (CEDA), CBA Chief Information Officer Michael Harte lambasted legacy technology vendors for their slow embrace of cloud-based computing and their apparent preference for solutions that lock users into a "never-ending spiral" of costly maintenance and upgrades.
"We're saying that we will never buy another data center. We will never buy another rack or server or storage device or network device again," Harte said. "I will never let any organization that I work for get locked into proprietary hardware or software again. I'll never tell my teams in the business that it will be weeks to get them hardware provision. I'll never pay upfront for any infrastructure and certainly would never pay for any, or rent any, infrastructure that I would never use."
Harte concluded: "I will never implement an internal solution for a common problem that I could procure on subscription across the Web."
With increasing demand for cloud-based solutions, combined with a general reluctance to pay hefty upfront capital costs, Harte's comments would seem to reflect growing dissatisfaction with the traditional licensed software model -- and its “never-ending spiral” of ongoing expenses.
Are you as fed-up as Harte?
The state of storage
Randy Davis (rdavis@egisticsinc.com) of eGistics, Inc. (www.egisticsinc.com) finds several interesting trends in The 2010 State of Storage Report from Network Computing.
1. The top planned storage project for 2010 is improved allocation
2. Forty-seven percent of respondents say insufficient storage resources for mission-critical applications is their No. 1 concern
3. Storage area network (SAN) vendors are responding to demands for lower-cost storage
4. Storage virtualization is growing
5. Thin provisioning is catching on
6. There is a significant increase in interest in cloud-based storage
How do these trends reflect your storage strategy?
A welcome cloud during the economic recovery
By Ed Pearce (epearce@egisticsinc.com)
In spite of hopeful signs that the economy is on the mend, the 2010 State of Storage report from Network Computing finds that the fallout from the recession has left IT execs without the resources necessary to store the rising volume of information required to support their business applications.
Nearly half (47 percent) of the respondents to the survey say they have insufficient storage resources for their mission-critical applications, while 30 percent say they have insufficient tools for storage management. Another 30 percent of respondents say they have insufficient storage resources for departmental/individual use. Nineteen percent say they lack staff for their storage requirements.
And -- regardless of economic "green shoots" -- the situation isn't likely to change any time soon: 34 percent of respondents say they have an insufficient storage budget to meet their business demands.
Against this backdrop, it's little wonder that survey respondents are showing increased interest in cloud storage services (34 percent in 2010 versus 19 percent in the 2009 State of Storage report).
With a hosted, variable-cost storage model, if your business struggles and your volumes drop, your operations costs stay aligned with your usage, and you won’t pay for a “just-in-case” capital investment. The variable-cost model also eliminates the need for capital investment (software licenses and hardware) and maintenance contracts; customers typically are charged a one-time load fee to archive documents. And when an array fills up, or a server must be replaced, it’s your service provider’s problem. With a thin-client interface, there may not even be software to install, manage or maintain. In addition, variably priced storage solutions can facilitate more effective operations by providing scalability that would be cost-prohibitive in a traditional, licensed in-house system.
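As a rough way to see how the variable-cost model behaves, the quick comparison below contrasts a hypothetical upfront license against a hypothetical per-document load fee over three years; every figure is invented for illustration.

upfront_license = 250_000      # in-house capital outlay (software licenses and hardware)
annual_maintenance = 45_000    # yearly maintenance contract
hosted_fee_per_doc = 0.015     # one-time load fee per archived document

for yearly_volume in (1_000_000, 3_000_000, 6_000_000):
    in_house_3yr = upfront_license + 3 * annual_maintenance
    hosted_3yr = 3 * yearly_volume * hosted_fee_per_doc
    print(f"{yearly_volume:,} docs/yr: in-house 3-yr cost ${in_house_3yr:,.0f}, hosted 3-yr cost ${hosted_3yr:,.0f}")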
CBA Chief Information Officer Michael Harte spoke for many users when he recently told the Committee for Economic Development in Australia that, "I will never implement an internal solution for a common problem that I could procure on subscription across the Web."
With the economic recovery still gaining strength, the trend for 2010 will be the more efficient use of existing IT resources. That should make hosted solutions a welcome cloud during the turnaround.
Capture users eye data extraction and document classification improvements
By Mark Brousseau
When it comes to enterprise capture, end users cite data extraction as the most important process to automate. A whopping 60 percent of respondents to a recent TAWPI Question of the Week identified data capture as the most important step to automate, far outpacing classification/sorting and integration with business processes, which were each identified by 20 percent of respondents.
None of the survey respondents saw filing/archival or data validation as important to automate.
Derrick Murphy (dmurphy@ibml.com), president and CEO of ibml (www.ibml.com), a Birmingham, AL-based document scanning solutions provider, isn't surprised by the survey results. "Data extraction and classification probably offer the greatest opportunities for cost reductions in the enterprise capture process," Murphy says. "By automating data extraction, users can reduce the amount of manual intervention required for data entry and quality control. And the more you can classify automatically, the less pre-sorting of documents is required. This results in decreased document preparation, lower operations costs, and fewer delays in getting documents scanned."
Murphy also sees value in integrating enterprise capture with legacy business processes to reduce or eliminate exceptions and special handling, and to accelerate access to mission-critical information.
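As a rough illustration of the automatic classification Murphy describes, the sketch below assigns a document type based on keywords found in extracted text. The document classes and keyword rules are hypothetical; production systems typically rely on trained recognition engines, but the payoff is the same: less pre-sorting and less document preparation.

```python
# Minimal keyword-based document classification sketch (hypothetical rules).
# In production, classification is usually done by a trained recognition
# engine, but the goal is the same: avoid manual pre-sorting.

CLASS_RULES = {
    "remittance":     ["remittance", "payment coupon", "amount due"],
    "invoice":        ["invoice number", "purchase order", "net 30"],
    "correspondence": ["dear", "sincerely", "regards"],
}

def classify(extracted_text: str) -> str:
    """Return the first document class whose keywords appear in the text."""
    text = extracted_text.lower()
    for doc_class, keywords in CLASS_RULES.items():
        if any(kw in text for kw in keywords):
            return doc_class
    return "unclassified"   # routed to manual review

if __name__ == "__main__":
    sample = "Invoice Number 4471 -- Purchase Order 992 -- Net 30 terms"
    print(classify(sample))   # -> invoice
```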
What aspect of the capture process is the most important to automate in your operation?
Monday, July 5, 2010
Digitizing records is an untapped opportunity for many organizations
Posted by Mark Brousseau
In a recent document management industry survey sponsored by Oce Business Services, 83 percent of respondents indicated that their organization has a records management program in place. However, only 10 percent of the executives surveyed said their company's program included an integrated electronic records repository. This indicates that integrating digital processes and technology into their records programs could be an untapped opportunity for many companies.
Today's challenging business environment includes stringent statutory and regulatory mandates from a host of entities that require following specific processes. Digitizing business records – when implemented in an organized fashion using best practices – can help ensure that records are easily retrievable, storage costs are under control, legal discovery costs are mitigated and the organization is compliant.
An effective electronic document management program spans the capture, management, storage, preservation, and delivery of document images. Oce highlights the following key steps for developing and designing a thorough document capture process.
Sorting and Preparing
Sorting and preparing set the stage for efficiently digitizing hard-copy documents. In these steps, documents are sorted into a logical sequence to ensure that they are properly identified and routed for the capture process. Developing and implementing a logical sort sequence enables organizations to get the most out of the scanning process, ensure intelligent distribution of images, and support future retrieval requirements. During this process, technologies such as barcodes can be applied to reduce manual intervention and to automate workflow processes, record retention, and image retrieval. Document preparation helps ensure that nothing interferes with keeping documents moving through the scanner at full speed.
Scanning
The preparation process is designed to make sure that documents will be transported through the scanner cleanly and efficiently. Using the proper equipment is important to this step. Scanner manufacturers provide a rated processing speed for each model they produce. When determining the proper scanner to use, organizations should consider the number of separator sheets that will be inserted in order to estimate the true volume and speed of the scanning process. (Separator sheets are pre-printed sheets of paper that have codes on them. Each time the scanning software encounters a separator sheet, it creates a separate document containing the pages found under it.) Other determining factors include size of paper to be scanned, simplex (one-sided page) versus duplex (two-sided page) scanning, and color requirements.
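To see how separator sheets affect throughput, here is a minimal back-of-the-envelope sketch. The rated speed, document size, and separator ratio are assumptions for illustration only.

```python
# Rough estimate of effective scanning throughput once separator sheets
# are counted. The rated speed and batch sizes are illustrative assumptions.

def effective_throughput(rated_ppm, content_pages, pages_per_doc=5,
                         separators_per_doc=1):
    """Content pages scanned per minute after separator-sheet overhead."""
    separator_pages = (content_pages / pages_per_doc) * separators_per_doc
    minutes = (content_pages + separator_pages) / rated_ppm
    return content_pages / minutes

# A 90 ppm scanner with one separator sheet per 5-page document
# effectively delivers about 75 content pages per minute.
print(f"{effective_throughput(90, content_pages=10_000):.1f} content pages/min")
```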
Indexing
The final step of the capture process is indexing. This is the process of assigning metadata (defined as data about data) to each image file. This metadata can be applied in a multitude of ways. One is manual data entry, which entails viewing the images and manually typing in certain metadata residing on the electronic image. Another method is using OCR (optical character recognition) software, which extracts certain data elements from the images and then applies this metadata to the image file. This eliminates possible operator error and improves metadata accuracy. A third approach is to employ technology that reads a barcode, a unique label on the document containing data that can be read by a computer.
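The sketch below shows the general shape of an indexing step: metadata captured by hand, by OCR, or from a barcode is attached to each image file. The field names and the recognition helpers are hypothetical placeholders, not any specific product's API.

```python
# Hypothetical indexing step: attach metadata to each scanned image.
# read_barcode() and ocr_zone() stand in for whatever recognition engine
# is actually in use; the field names are illustrative only.

import json
from pathlib import Path

def index_image(image_path: Path, read_barcode, ocr_zone) -> dict:
    """Build and persist an index record (metadata) for one scanned image."""
    record = {
        "image": image_path.name,
        "account_number": read_barcode(image_path),          # from a barcode label
        "document_date": ocr_zone(image_path, zone="date"),  # from an OCR zone
        "document_type": "correspondence",                    # from classification
    }
    # Store the metadata next to the image so downstream systems can find it.
    image_path.with_suffix(".json").write_text(json.dumps(record, indent=2))
    return record
```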
How to communicate without saying a word
Posted by Tom Walker, portfolio manager, SAP Accounts Payable Solution, Open Text Corporation:
How do you communicate without saying a word?
This can be a difficult challenge in the world of Accounts Payable when working to post invoices accurately and quickly. Accurate and quick alone is a major task, but when you add “quietly”…is it really possible?
Think of all the people involved…Accounts Payable Professionals, Approvers, Corporate Procurement, Field Procurement, Receiving, Contract Management, Master Data Management, Tax Professionals…just to name a few. A number of vendors offer solutions that address the accurate and the quick…although in many cases you have to decide: do you want it accurate or quick? One or the other, but not both. Yet very few address the “quietly” issue.
Why is this important? For invoices that are received and posted immediately, without any human intervention for problem resolution or approval, communication is not a critical factor. Yet when the 80/20 rule kicks in, where 20% of your invoices cause 80% of the problems, the Accounts Payable Professional must reach out and communicate. They need to reach the individuals who have both the knowledge and the security authorization to resolve or approve invoices, as required by best-practice separation of duties.
As an example, in an ERP such as SAP this communication often starts with running a report such as MRBR to find invoices blocked for payment. Without a solution that includes “quietly” as a building block, that first communication triggers a barrage of activity, including but not limited to emails, phone calls, entries into spreadsheets for follow-up, follow-up calls, copies of invoices and pulled contracts.
So how do you add “quietly” to the process flow? You must examine the entire flow: how you receive the invoice, how you capture the metadata at the header and line-item level, how you determine whether there is a problem, and who must be involved to resolve or approve it. Equally important is anticipating what that person requires to complete the task…access to the invoice and related document images, the history of others who have worked on the item (including their comments), transactional data such as the purchase order, goods receipt and prior postings to the purchase order, and the available options for resolution or approval.
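A minimal sketch of what that kind of “quiet” routing might look like follows: each blocked invoice is packaged with the context the resolver needs and assigned to the role with the right authorization, so nobody has to ask for anything. The data structures and routing rules here are hypothetical assumptions, not the actual design of SAP Invoice Management or any other product.

```python
# Hypothetical sketch of "quiet" invoice exception routing: every work item
# carries the context the resolver needs, so the first communication is the
# assignment itself. Structures and rules are illustrative, not a vendor API.

from dataclasses import dataclass, field

@dataclass
class BlockedInvoice:
    invoice_id: str
    block_reason: str            # e.g. "price variance", "missing goods receipt"
    image_link: str              # link to the invoice image
    purchase_order: str
    goods_receipts: list = field(default_factory=list)
    history: list = field(default_factory=list)   # prior comments and actions

# Who is authorized to resolve which kind of block (illustrative only).
ROUTING_RULES = {
    "price variance": "procurement",
    "missing goods receipt": "receiving",
    "approval required": "approver",
}

def route(invoice: BlockedInvoice) -> dict:
    """Create a self-contained work item; no follow-up calls needed."""
    return {
        "assigned_to": ROUTING_RULES.get(invoice.block_reason, "ap_professional"),
        "invoice": invoice,  # image link, PO, receipts and history travel with it
    }

item = route(BlockedInvoice("4711", "missing goods receipt",
                            "https://example.invalid/img/4711", "PO-992"))
print(item["assigned_to"])   # -> receiving
```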
One excellent example of a “quiet” solution is provided by SAP with its SAP Invoice Management and optional OCR.
One last thought…quiet extends to reporting as well. You need to anticipate the need for information related to the invoice. While invoice payment status is certainly important, you must also anticipate that others will want to know trends, such as how many invoices are paid without a problem and, when there is a problem, which type is most common. Yet a truly quiet process goes beyond the expected reporting…the invoice occurred because of a purchase, and the purchase occurred as part of a larger business process, such as a building project, and so on. You must anticipate that others will need to see the invoice as part of that bigger picture.
This bigger picture is ECM. You would expect a large ERP vendor to anticipate this more holistic requirement, and SAP has done that by providing an ECM solution through its partnership with Open Text that takes the invoice and quietly makes it available as part of the ECM big picture. This allows you to see, for example, all the invoices from one vendor on one project in a single virtual view, or all the invoices related to the project regardless of vendor. No longer must you ask the Accounts Payable Professional to accumulate all the related information and then wait for a response…it is already waiting for you, accessible immediately and quietly.
So…accurate…quick…quiet…yes it is possible!
Labels: AP, document automation, document management, document scanning, ICR, invoice processing, Mark Brousseau, OCR, Open Text, SAP, TAWPI
Thursday, July 1, 2010
I can see clearly now
Posted by Mark Brousseau
Recent advances in technology are raising the bar for payments analytics capabilities. Some of these technology enablers include: higher-capacity, more affordable disk storage; database management systems with data partitioning; and evolving data warehouse capabilities. Leilani Doyle (ldoyle@usdataworks.com), product manager with US Dataworks (www.usdataworks.com), says these advances couldn't have come at a better time:
As a result of the recession, enterprises are under pressure to predict the internal performance of the organization more precisely than ever before. For organizations that do this well, the payoff includes lower costs, higher customer retention, increased responsiveness, a reduction in fraud, increased productivity and ultimately, increased profitability, explains David White of Aberdeen Group.
An Aberdeen Group benchmark report found that Best-in-Class companies used analytics solutions to improve their ability to detect risk by two and a half times, and that 76 percent of Best-in-Class organizations enjoyed a customer retention rate of 90 percent or better, thanks in part to analytics.
Drawn by these benefits, more organizations are evaluating analytics solutions, particularly for payments environments. Twenty-four percent of organizations plan to adopt analytics technologies within the next year, reports Aberdeen Group. "The fundamental drivers of adoption remain strong," says Dan Vesset, program vice president for IDC's Business Analytics Solutions Research service, adding that even during the recession, the business analytics software market continued to grow.
In payments, demands for information reporting and analysis have never been greater. Key decisions depend on the ability to accurately monitor, measure and predict operational needs and payment trends. Most organizations try to bridge legacy payment silos, only to find themselves left wanting -- despite a tremendous effort expended to integrate, report and analyze data across payment channels.
The Case for Analytics
Several trends are driving demand from payments processors for business analytics solutions:
• The need to predict the future. All operations managers, treasury executives and risk managers rely on historical data in order to properly plan for the future. The more timely and more complete the data these individuals have at their disposal, the better the predictive models can be. The same concept holds true for predicting consumer buying behaviors.
• Increasing complexity of payment types. New payment channels have emerged over the last few years, and we are certain to see the evolution of mobile payments during the next year or so. There's no question that consumers and businesses alike are rapidly changing the way they make payments. In the past, consolidating this data for analytical purposes was extremely difficult, if not impossible. Today's advanced analytics tools solve this problem.
• Fraud detection. When it comes to payments, fraudsters are becoming more sophisticated, and the economic downturn has made them more desperate than ever. Businesses need to protect their assets with superior fraud detection. A key element of any fraud detection program is the early and accurate identification of unusual patterns and behaviors. Today's advanced analytics tools give organizations the ability to analyze payments data across channels and produce actionable assessments that can prevent or stop fraudulent activity (a minimal sketch of this idea follows this list).
• Staff management. Using analytics to identify changes in peak processing windows enables organizations to better manage staffing levels, which have been stretched thin since the recession.
• Reduced cost. Consolidating data with an enterprise analytics solution enables organizations to decommission redundant systems, some of which are maintained only for their archived data; some large companies are sitting on old, out-of-production systems for this very reason.
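As a rough illustration of the unusual-pattern idea in the fraud-detection point above, the sketch below flags payments that deviate sharply from a payer's historical average. Real fraud engines use far richer models; the three-sigma threshold and sample data here are assumptions for illustration.

```python
# Minimal anomaly-flagging sketch for the fraud-detection idea above:
# flag a payment when it deviates sharply from that payer's history.
# The 3-sigma threshold and the sample data are illustrative assumptions.

from statistics import mean, pstdev

def flag_outliers(history, new_amounts, sigmas=3.0):
    """Return the new payment amounts that fall outside the historical band."""
    mu, sd = mean(history), pstdev(history)
    band = sigmas * sd if sd else 0.0
    return [amt for amt in new_amounts if abs(amt - mu) > band]

history = [120.0, 118.0, 125.0, 119.0, 122.0, 121.0]
incoming = [123.0, 980.0, 119.5]
print(flag_outliers(history, incoming))   # -> [980.0]
```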
The Bottom Line
Today, financial institutions, corporate billers, and government entities are challenged to manage images and data from multiple payments channels, and to provide a high level of data protection, data accessibility, and data tracking and reporting for compliance. Adding to these challenges are ever-increasing demands for real-time analytics to improve corporate agility and customer responsiveness. Traditional siloed payment archives are too costly, too inefficient and too fragmented to be effective. The answer lies in emerging payments analytics solutions.
Using analytics solutions, organizations can finally get a clear and timely view of their operations.
What do you think?