By Mark Brousseau
“ICD-10 is probably one of the biggest changes to occur in health IT in 30 years,” Dr. Joe Nichols, Edifecs medical director, told attendees at the Medical Banking Project Boot Camp this afternoon at HIMSS10 in Atlanta. “It is massive.”
ICD codes, which are used to classify diagnoses and institutional procedures, are maintained by the World Health Organization (WHO). Most developed countries other than the United States already use ICD-10, Nichols noted; the United States still uses ICD-9 codes. The international version of ICD-10 contains approximately 12,400 diagnostic codes. WHO approved the U.S. version of ICD-10, which contains approximately 69,000 codes.
As of October 1, 2013, all claims in the United States must use ICD-10.
Why is this so important? Because ICD-10 is a cornerstone of healthcare information, Nichols said. “It is the standard for defining the health state of the patient, and the institutional procedures that patients may receive to maintain or improve their health state,” he explained. “This is a big change in the coding system.” What was 14,300 codes under ICD-9 will rise to 69,000 codes under ICD-10, Nichols noted, with the number of procedure codes increasing from 3,800 to 72,000 under ICD-10.
With ICD codes pervasive throughout most health systems, and many business functions impacted by the codes, it is important that healthcare organizations have a plan for supporting ICD-10 codes.
“ICD-10 codes are used for a lot of things,” Nichols said. As examples, he mentioned: actuarial and financial risk; adjudication; outcomes; population health analysis; benefits design; fraud, waste and abuse analysis; quality and efficiency assessment; medical policies and clinical guidelines; payment rules; clinical history; utilization; and regulatory reporting. “We based a lot of our national policies on this,” Nichols said. “The implications are far-reaching. Imperfect mapping from ICD-9 to ICD-10 will affect processing and analytics in a way that impacts revenue, costs, risk and relationships.”
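The mapping problem Nichols flags comes from the fact that ICD-9-to-ICD-10 crosswalks are frequently one-to-many. A minimal sketch, using hypothetical crosswalk entries (the code values below are illustrative, not an authoritative GEMs extract):

```python
# Hypothetical GEMs-style crosswalk: one ICD-9 code can map to many
# ICD-10 codes, so translation is not guaranteed to be unambiguous.
CROSSWALK = {
    "821.01": ["S72.301A", "S72.302A", "S72.309A"],  # femur fracture: 1-to-many
    "250.00": ["E11.9"],                             # type 2 diabetes: simple 1-to-1
}

def forward_map(icd9_code):
    """Return all candidate ICD-10 codes for an ICD-9 code, or [] if unmapped."""
    return CROSSWALK.get(icd9_code, [])

def is_ambiguous(icd9_code):
    """True when a claim coded in ICD-9 cannot be translated without extra
    clinical detail (laterality, encounter type, etc.)."""
    return len(forward_map(icd9_code)) > 1
```

Any analytics or payment rule keyed to a single ICD-9 code has to decide how to handle the ambiguous cases, which is where the revenue and risk impacts Nichols describes come in.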
So how do you deal with this?
“If you haven’t started now, you’re going to be behind the gun,” Nichols said. But organizations need to look at their short-term goals with a long-term vision, to determine what solutions they need today, and whether those solutions will meet future needs. Organizations also need to be aware of ICD-10’s touch points with other initiatives, and the potential downstream impacts of the change. And they should collaborate with business and trading partners as they develop their ICD-10 plans. Finally, organizations should use ICD-10 to try to position themselves for competitive advantage. “There are huge competitive advantages to using ICD-10 better than your competitors,” Nichols explained.
Sunday, February 28, 2010
The ICD-10 Challenge
Unifying Information Channels
By Mark Brousseau
Healthcare organizations should look at the new ANSI 5010 standard as an opportunity to modernize their entire information processing infrastructure, Edifecs CEO Sunny Singh told attendees this afternoon at the Medical Banking Project Boot Camp at HIMSS10 in Atlanta.
By January 1, 2012, all covered entities must be able to send and receive all HIPAA transactions in the 5010 format. With more than 1,300 changes in 5010, compared to 4010A1, organizations must understand the new information contained in 5010 and how best to use this data. Combined with all of the other corporate strategies that healthcare organizations face, 5010 is a complex challenge – and its deadline is fast approaching. “Deadlines make everybody nervous,” Singh said, adding that organizations that haven’t started implementing a 5010 solution need to do it now.
But the new 5010 standard provides healthcare organizations with the chance to finally unify their information channels, to ensure that they process all of their information – incoming and outgoing – in a consistent way. This will help organizations achieve operational efficiencies and remain competitive, while easing their path to compliance with inevitable future regulations, Singh said.
“Healthcare organizations are getting information from various channels,” Singh explained. “When you make sure that every channel is unified, initiatives like 5010 become much easier to implement.”
In meeting the new 5010 standard, healthcare organizations face three options, Singh said: complete replacement of all systems (“rip and replace”); remediation of core processing systems; and step-up/step-down. Singh noted that the complete replacement of core systems is a very expensive proposition for which there doesn’t seem to be a lot of takers. Remediation takes less time, Singh said, but organizations must have a sizeable team working on the project, and, depending on the existing systems involved, may still require considerable time, money and resource investments.
The option that is gaining the most traction, he said, is the step-up/step-down approach. In this scenario, organizations would convert 5010 information to 4010 and pass it throughout their systems; they also can convert the 4010 information to 5010 format and pass that throughout their systems.
“This is the most pragmatic option if 5010 planning and implementation has not yet commenced,” Singh said. “It has the least impact on core processing systems and other ongoing projects.”
But most importantly, it helps to unify a healthcare organization’s information channels.
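The step-up/step-down approach Singh describes amounts to an edge gateway that translates between formats so legacy core systems never see 5010 directly. A rough sketch of the idea, with placeholder translation functions (a real X12 translator must remap the 1,300-plus segment and element changes between 4010A1 and 5010; the version-string swap below is purely illustrative):

```python
# Illustrative step-up/step-down gateway: inbound 5010 transactions are
# "stepped down" to 4010 before they reach a legacy core system, and
# outbound 4010 responses are "stepped up" to 5010 at the edge.

def step_down_5010_to_4010(x12_5010: str) -> str:
    # Placeholder: a real translator remaps every changed segment/element.
    return x12_5010.replace("005010", "004010")

def step_up_4010_to_5010(x12_4010: str) -> str:
    return x12_4010.replace("004010", "005010")

def process_inbound(x12_5010: str, core_system) -> str:
    """Edge gateway: the core system only ever sees 4010."""
    legacy_response = core_system(step_down_5010_to_4010(x12_5010))
    return step_up_4010_to_5010(legacy_response)
```

Because all translation happens at the edge, the same gateway pattern can later absorb other format changes without touching the core, which is the "unified channels" benefit Singh emphasizes.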
Start ARRA Awareness Training Now
By Mark Brousseau
If they haven’t done so already, companies in the healthcare space should conduct organizational awareness training on ARRA and HITECH, Mary Rita Hyland, AVP, regulatory affairs, The SSI Group, Inc., told attendees at the Medical Banking Project Boot Camp at HIMSS10 this afternoon.
Organizations also should conduct a HIPAA and HITECH gap analysis to identify any products, procedures and services that need to be updated and modified, Hyland told attendees. As part of this exercise, organizations need to identify and coordinate technical or product updates, as well as coordinate and implement policy and procedural updates. “Operationally, ensuring compliance with HITECH’s security and privacy provisions is, to a large degree, an IT function,” Hyland noted.
Once they’ve reviewed their systems, policies and procedures, organizations need to audit and assess their compliance. “You don’t want to wait for an audit to be done on you by a whistleblower or someone else in the industry who doesn’t believe you are in compliance,” Hyland warned. “Audits are going to be important in meeting the guidelines and maintaining your compliance.”
ARRA: A Whole New World
By Mark Brousseau
Last year was a year of transition for HIPAA, medical privacy and medical banking, Richard D. Marks of McLean, VA-based Patient Command, Inc. (www.patientcommand.com), told attendees this afternoon at the Medical Banking Project Boot Camp at the HIMSS10 conference in Atlanta.
“ARRA changes the rules for security of health information in the United States,” Marks said. “It creates an entirely new framework because it changes HIPAA so much and because it changes privacy in medical records. And, most significantly, it changes the whole approach to enforcement.”
“It’s fair to say that for the last decade, there has not been any real attempt on the part of the federal government to enforce HIPAA,” Marks explained. “ARRA changes that. What it brings into law, for the first time, is the hierarchy of diligence and culpability. There are increased, tiered civil and criminal monetary penalties, topping out at $50,000 per violation, with an annual limit of $1,500,000. These numbers are enough to get your attention. But the statute also includes civil and criminal liability for individuals, as well as organizations. Which individuals, you ask? Well, it could be you! And some people won’t figure this out, and you will see some prosecutions,” Marks predicted.
Integrated health information security is inherent in ARRA, Marks added.
References in business associate contracts now, by law, apply mutually to covered entities and business associates, Marks pointed out. “The impact of that is to rebalance all of the risk allocation that is in these agreements, and it creates a whole new set of possibilities for liabilities. Some folks will be less affected than others. But some of you will be affected enormously,” Marks said.
For instance, security is now an active responsibility of the board of directors and senior executives, if you are doing anything that touches healthcare, Marks said. “If you’re a public company you’ve really got to ask yourself how you do disclosure when you have to take on a much greater risk for your information systems,” Marks said. “What this all means is that you must have integrated, shared systems security that is comprehensive and fast, and upgraded from what you now have.”
Some of the changes in ARRA won’t go into effect until 2011. “But some of this is in effect now, because people, such as ambitious state attorneys general, are going to start enforcing HIPAA,” Marks said. “The bottom line is that ARRA makes it a whole new world in healthcare.”
Solving the Revenue Cycle
By Mark Brousseau
Banks are well positioned to help “solve” the healthcare revenue cycle, thanks to the keystone revenue cycle data that flows through banks every day, Benchmark Revenue Management CEO Tyson McDowell said at the Medical Banking Project Boot Camp in Atlanta this afternoon.
“Banks can solve operational improvement issues for hospitals, while solving transparency and risk management issues for themselves,” McDowell told attendees. He said banks should “grow-up” their healthcare revenue cycle solutions and extend into denial management, denial avoidance, and services that back up their healthcare customers’ revenue cycle workers with “on-demand” talent.
Today, many banks offer lockbox services, patient payment solutions and extended lockbox services.
“The revenue cycle improvement market is exploding due to permanent financial pressures,” noted McDowell. “While the official definition of the revenue cycle is all of the administrative processes related to collecting all fees owed for services to patients, a more practical definition would be: a near futile attempt to collect all the monies owed in a world with thousands of moving parts.”
“A hospital really has no idea how much money it’s going to get paid,” McDowell said. “Hospitals and, to a lesser extent individual doctors, are getting it from all sides. Healthcare providers need to protect themselves. And denial and payment data is the keystone for solving the revenue cycle.”
Banks have unique access to this information, McDowell said, and they offer value-added services like lockbox. “Banks are in a position to provide new services for healthcare. And it comes from the data. The hospital needs someone to tell them why they need to spend money on an improvement.”
McDowell concluded that banks are starting to move in the direction of new healthcare services.
Saturday, February 27, 2010
News from HIMSS: Saturday
Posted by Mark Brousseau
Some headlines from the HIMSS health IT conference in Atlanta:
iSOFT showcases health IT solutions
iSOFT Group Limited will showcase its suite of solutions that focus on interoperability at the HIMSS health IT conference in Atlanta in the US from March 1-4, 2010.
iSOFT, which last year entered the important US market through its acquisition of Boston-based technology developer BridgeForward Software (renamed iSOFT Integration Systems), will demonstrate at HIMSS its solutions designed to address the requirements for ‘Meaningful Use’ under the US Government’s US$34 billion health IT stimulus package.
iSOFT solutions to be showcased at HIMSS include:
Health Information Exchange
iSOFT’s Health Information Exchange (HIE) solution provides healthcare organizations with access to clinical, financial and administrative data from any hospital information system across the organization. iSOFT’s HIE supports clinicians’ decisions at the point of care, reduces preventable errors and duplicative testing, and encourages best-practice medicine.
Health Intelligence
Health Intelligence (HINT) provides healthcare organizations with insights into organizational trends and statistics that support informed decisions for future planning, the delivery of better-quality care and increased operational performance.
Integration
iSOFT Viaduct addresses the interoperability challenge faced by all organizations by providing a platform that enables software solutions to share information when needed and in the required form, ensuring seamless integration.
Solution Engineering
Health Studio provides a healthcare solution engineering environment to allow organizations to design, create and deploy their own solutions without needing to engage specialist vendors.
Patient Safety
iSOFT Patient Safety provides intelligence on safety and quality problems and best practices, empowering managers to make strategic improvements by providing an interactive evidence base.
Quest Software and HealthCast tout end-to-end clinical desktop and workflow solution
Quest Software, Inc. and HealthCast, Inc. will demonstrate an end-to-end clinical desktop and workflow solution for clinician access to protected healthcare information.
HealthCast’s eXactACCESS single sign-on and clinical workflow solution, coupled with Quest vWorkspace virtual desktop management solution, provides access to critical electronic health, order entry, and clinical documentation systems. As a result, these systems can be centrally managed to reduce costs and security concerns while increasing control of the clinical desktop environment.
“Our goal is to give physicians and clinicians the fast and easy access they need to their patient information while improving data security and reducing IT infrastructure and support costs,” said Simon Pearce, vice president and general manager of desktop virtualization, Quest Software.
vWorkspace and eXactACCESS automate clinician access to and management of virtual desktops and applications by eliminating the need to enter multiple passwords to disparate systems. HealthCast's unique proximity badge functionality automates the login to the virtual desktop and launches a clinician’s primary application based on who they are, and then navigates them to a default location within the application. When the badge is “tapped” again, the clinician’s virtual desktop and applications are disconnected so that clinicians can go to any other workstation in the enterprise, and securely pick up their desktop and applications exactly as they had left them with another “tap” of their badge.
California Health Information Exchange Networks interconnect
The Santa Cruz Health Information Exchange (HIE) is using Axolotl’s Elysium NHIN Gateway to connect to two California HIE networks - EKCITA in Tehachapi, CA and the Long Beach Network for Health in Southern CA, for exchange and sharing of critical clinical information.
This connectivity will be demonstrated at the HIMSS Interoperability Showcase, supported by the California Health and Human Services Agency (CHHS), the Office of the National Coordinator (ONC) and the Federal Health Architecture (FHA), to illustrate the nationwide progress toward health IT interoperability.
The Santa Cruz HIE, utilizing Axolotl’s Elysium Exchange solutions, will demonstrate the ability to query from and exchange data with other California HIEs. The demonstration will highlight how clinical data, based on national standards, is integrated into different physician workflows at the point of care - by the local systems that are chosen in each care setting.
“Patient care will be radically improved through this inter-HIE exchange capability,” said Bill Beighe, CIO of Santa Cruz HIE. “This demonstration will show that HIE-to-HIE information exchange is technically feasible and available now.”
The HL7 Continuity of Care Documents (CCD) being exchanged are standard electronic documents that include discrete data elements which can be extracted and incorporated into the receiving systems. Elysium is leveraging components of the IHE IT Infrastructure set of profiles, such as Cross Enterprise Document Sharing (XDS) and Cross Community Access (XCA) to enable the transfer of the clinical data between connected communities. Multiple records will be exchanged to show that the process is general and not a special case.
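A CCD is an XML document, which is what makes its data elements “discrete” and extractable by the receiving system. A minimal sketch of that extraction step, using a deliberately simplified fragment (real CCDs use the HL7 CDA namespace, `urn:hl7-org:v3`, and far richer structure than shown here):

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for one section of a CCD; real documents carry
# namespaced CDA markup, template IDs, and many more element types.
ccd_fragment = """
<section>
  <entry><observation><code code="233604007"
    codeSystemName="SNOMED CT" displayName="Pneumonia"/></observation></entry>
  <entry><observation><code code="38341003"
    codeSystemName="SNOMED CT" displayName="Hypertension"/></observation></entry>
</section>
"""

def extract_problem_codes(xml_text):
    """Pull the discrete coded elements out of a (simplified) CCD section."""
    root = ET.fromstring(xml_text)
    return [(c.get("code"), c.get("displayName"))
            for c in root.iter("code")]
```

Once extracted this way, the coded entries can be merged into the receiving EHR’s own problem list rather than arriving as an opaque blob of text.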
“Axolotl participated with the Northrop Grumman consortium in the NHIN I project and in July 2009 did a live NHIN demo connecting five HIEs in California. Axolotl’s Elysium Gateway products that enable inter-HIE information exchange are available in our latest production platform and are being implemented. These products connect HIE Networks seamlessly to any other HIE either directly or via the NHIN,” said Anand Shroff, Vice President, Engineering, Axolotl.
Friday, February 26, 2010
The Other Story at HIMSS
By Mark Brousseau
While Electronic Health Records (EHR) and the impact of the recent definition of the meaningful use requirements will be hot topics at next week's HIMSS Conference in Atlanta, HERAE CEO Jim Ribelin thinks a program underwritten by the new HIMSS Medical Banking Project bears watching.
The project, called Designing the Healthcare Financial Network of the Future, is "right on target," Ribelin says. "The program will assemble key stakeholders to discuss what a strong financial network for healthcare could look like. A future that doesn’t siphon 20 cents of every healthcare dollar spent, and works to advance the balance between responsible financial management and clinical needs of patients," Ribelin says. The program's objective is to determine how the healthcare system can enhance value, reduce costs, and empower the shift from simple disease management to improved health for consumers, while at the same time creating better business models for the healthcare providers.
"EHRs are receiving a lot of attention, but the payment system, where a lot of new processes are in place with standards and systems defined such as bank ACH transactions, HIPAA 835s and ERA files, provides a real opportunity for significant change. A chance to create a network that will reduce costs and create efficiencies without negative impact on patient care,” says Ribelin. “Fix the healthcare payment system, create a strong financial healthcare network and the industry would see a savings of resources without sacrificing quality healthcare.”
What do you think?
Wednesday, February 24, 2010
Not All Municipalities Outsourcing
By Mark Brousseau
While payments processing outsourcing has gained traction among government entities during the economic downturn, it's still not for everyone. A case in point: the Apache County Tax Collector (Arizona), which recently implemented a solution to automate the processing of its tax payments.
"We never really considered outsourcing," explains Apache County Tax Collector Chief Deputy Sandy Klinchock. She believes that in-house processing provides more control and better quality.
Known as the longest county in the country, Apache County runs 211 miles from the Utah border to just south of Alpine, Arizona. Two-thirds of the population, and over one-half of the land area, belongs to the Navajo Nation, the largest Native American tribe. Now home to 70,000 residents, Apache County is growing: new subdivisions have been approved, permanent jobs are being created, and the county is investing in the services required for an expanding population.
With the Apache County Tax Collector committed to keeping its payments processing in-house, it knew it needed a solution for automating its tax processing. Previously, three employees processed the county's roughly 68,000 tax payments in about a three-week window. By automating, the county also hoped to streamline its deposits to improve funds availability and working capital management.
"We wanted to become more efficient, while enhancing our service to constituents," Klinchock says.
In late 2009, the Apache County Tax Collector implemented an image-enabled remittance processing solution from Creditron. The system includes a 3000t check scanner from NCR, courtesy and legal amount recognition (CAR/LAR), and the ability to deposit funds electronically via Check 21.
The Apache County Tax Collector selected Creditron based on its implementation of a system at nearby Navajo County, and on its willingness to meet the county's fast installation schedule; the Apache County Tax Collector went into production a few weeks after signing its Creditron contract. Navajo County handles back-end tax accounting on behalf of the Apache County Tax Collector.
As a result of automating its tax processing, and depositing funds electronically, Apache County Tax Collector now gets all of its funds to the bank the same day they are processed. In its old manual environment, it took the county two to three days to turn around its deposits. "Now, we can keep our money invested for a longer period of time," Klinchock says. "We also have a clearer picture of how much money we have in the bank, and whether we have to pull from investments to pay warrants."
The county also is able to make electronic deposits for all eight of its departments. And depositing funds electronically eliminated daily courier runs to the bank, which cost the county $350 a month. Additionally, there are fewer calls from customers asking why their check hasn't cleared sooner.
With results like these, Klinchock is surprised that so many government entities are outsourcing their payments processing. "If they take some of these arguments to their board, I think they will find a receptive audience," she says. "When our board heard about the benefits, they were all for it."
Creditron Founder and CEO Wally Vogel adds that the experience of the Apache County Tax Collector shows that government entities don't have to outsource to reduce costs or gain efficiencies.
Saturday, February 20, 2010
Compliance and Outsourcing
By Mark Brousseau
While new compliance, security and privacy regulations are likely to take a bigger bite out of operations budgets this year, most organizations believe they can meet the stricter rules without having to outsource their payments and document processing. Just 20 percent of respondents to a recent TAWPI Question of the Week said new compliance, security and privacy regulations would force their organization to consider outsourcing. Sixty-five percent of respondents said the tougher regulations wouldn't force them to consider outsourcing, and 15 percent said they weren't sure.
The time and cost associated with meeting compliance, security and privacy regulations continues to rise -- giving pause to any company entrusted with sensitive data that must be stored and shared.
"Regulatory compliance is very expensive and extremely time-consuming," says R. Edwin Pearce (epearce@egisticsinc.com), executive vice president of sales and corporate development for eGistics, Inc. "Companies have two choices for meeting regulatory demands for privacy and security: assume the full expense of the resources and time associated with meeting each regulation, or work with an outsource provider that can spread the costs of meeting the regulations across its customer base."
Pearce also believes that organizations should ask themselves whether it makes sense to go through the cost and trouble of becoming compliant, when there are outsource providers that already are.
"Companies don't necessarily have to absorb the full capital burden of meeting various certification and compliancy tests," Pearce explains. "For example, organizations that store images and data for multiple years may have to meet PCI, SAS 70 and HIPAA regulations. Rather than engineer a data center environment that meets all of these requirements -- including policy and procedural standards -- it may make better sense for the organization to partner with a compliant outsource provider."
"The result is faster compliance, at a significantly lower cost," Pearce adds.
With new regulations on the horizon, this is a decision more organizations will have to make.
What do you think?
Friday, February 19, 2010
The Image Clearing Opportunity
Posted by Mark Brousseau
Image clearing networks provide an opportunity for bankers banks to grow their revenues and attract new customers. US Dataworks President and COO Mario Villarreal (mvillarreal@usdataworks.com) explains:
At a time when paper cash letter volumes are rapidly declining, and the banking industry approaches the "last mile" in its migration toward electronic clearing, forward-thinking banks are leveraging Check 21 and their existing IT infrastructure to offer new image clearing services to community banks. If there is one thing that bankers have learned from the credit crisis, it is that focusing on their core services can provide a huge payoff.
Bankers banks provide valuable services to community banks, allowing them to compete with their larger counterparts. Bankers banks that have invested in the best technology to clear image exchange items also can offer this as a value-priced service to other bankers banks, extending their ability to service additional community banks, reduce costs, and expand their clearing network for improved funds availability and returns processing.
The Business Case
Prior to Check 21, check clearing was an extremely labor and capital-intensive process. For this reason, most bankers banks provided only settlement services: most community banks deposited paper cash letters at their nearby Federal Reserve Bank, while community banks that weren't members of the Federal Reserve settled their items through a bankers bank.
With the adoption of image clearing reaching more financial institutions, and advances in technology eliminating the need for bankers banks to print substitute checks in order to clear all of their items, more bankers banks are discovering that they can offer check clearing services to their customers with a much smaller investment in staff and equipment than in the past.
The goal for these bankers banks is to provide a lower-cost clearing option for their community bank customers or members, while generating non-interest income through clearing fees. By establishing a regional clearinghouse for in-network financial institutions, bankers banks can help their customers achieve lower costs through direct exchanges, as well as additional discounts through the bankers banks' aggregated volume. In most cases, community banks see savings of up to 20 percent compared to Federal Reserve Bank fees. Similarly, working with a bankers bank lowers the fees community banks pay to receive image files from other banks. And bankers banks can offer value-added services such as item-level duplicate detection, which reduces errors and helps identify potential fraud. Other services include long-term payment archiving, expedited research and adjustments, and payment trend analysis.
In turn, bankers banks can strengthen their customer relationships, attract new bank customers, and increase the percentage of items cleared through their network. Bankers banks that have implemented this service have increased the rate of membership in the geographic region they serve, which results in a bigger network and more value to all participants.
Bankers banks also gain operational improvements by implementing an image exchange system that is not based on the old way of paper-check processing, but designed from the ground up to process today’s growing variety of payment types. In one case, a bankers bank reduced the time necessary to complete its check image processing from eight hours to less than 45 minutes using the new standard in image exchange processing. And, unlike traditional check systems, advanced image clearing solutions allow for configuration changes -- such as adding or disabling a clearing endpoint -- without a significant lead time or custom code.
Key Buying Criteria
When evaluating clearing solutions, there are four key criteria bankers banks should consider:
1. Scalability -- Image clearing requires the processing of large volumes in an extremely short processing window, as well as the effective management of a large number of image files.
2. Exceptions handling -- To minimize costly errors, an image clearing solution must have capabilities for duplicate detection.
3. Enhanced clearing capabilities -- In order to reduce fees, a clearing solution should have capabilities for in-network and consolidated decisioning, as well as bi-directional clearing.
4. Expandability -- Because of the fast-changing nature of the financial services industry, bankers banks should look for solutions that are designed to support future services.
Other evaluation criteria include startup and ongoing costs, time-to-market, the vendor's track record in high-volume operations, any value-added functionality (such as data analytics), and the level of control the bankers bank will have over product pricing, processing deadlines and service levels.
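The item-level duplicate detection named in the criteria above can be reduced to a simple idea: key each cleared item on its identifying fields, and flag any repeat of a key. The sketch below is illustrative only; the field names and in-memory set are assumptions, and a production system would persist keys and check them against a long-term archive.

```python
# Illustrative sketch of item-level duplicate detection for image clearing.
# Each item is keyed on its MICR-line fields plus the amount; a repeated
# key signals a potential duplicate presentment (error or fraud).
# Field names and in-memory storage are hypothetical simplifications.

seen_items = set()

def is_duplicate(routing, account, check_number, amount_cents):
    """Return True if this exact item was already seen in the session."""
    key = (routing, account, check_number, amount_cents)
    if key in seen_items:
        return True
    seen_items.add(key)
    return False

# First presentment clears; a re-presented identical item is flagged.
assert is_duplicate("021000021", "123456", "1001", 25000) is False
assert is_duplicate("021000021", "123456", "1001", 25000) is True
```

In practice the check runs per clearing window and against the archive, so a duplicate is caught whether it arrives minutes or months after the original.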
The Bottom Line
With fee income on the line, and community banks desperate for cost-effective clearing alternatives, more bankers banks will establish internal image clearing networks to harness the cost savings of in-network clearing and direct sends to major financial institutions. Returned items, research and adjustments are also simplified with the network approach. With these networks, bankers banks can provide the highest quality of correspondent services to community banks, as well as help their colleague bankers banks serve their customers. To provide even greater benefit to their customers, some of these bankers banks will create expanded same-day exchanges and exchanges with national and regional banks, and bundle their image clearing services with cash management products such as remote deposit capture (RDC) and automated clearing house (ACH) processing.
All of this will strengthen the role of bankers banks in the emerging financial services environment.
What do you think?
Thursday, February 18, 2010
Come Together
By Mark Brousseau
At a time when businesses are focusing like never before on operations efficiency and corporate responsiveness, respondents to a recent TAWPI Question of the Week cited data extraction as the most important part of the capture process to automate. Data extraction is the process of interpreting data on documents for further processing or storage. Sixty percent of respondents to the survey identified data extraction as the key area for automation, topping business process integration (20 percent), classification/sorting (20 percent), filing/archive (0 percent) and data validation (0 percent).
The results come as no surprise to ibml President and CEO Derrick Murphy, who notes that peaking recognition rates leave data extraction as the capture function with the most room for improvement.
But Murphy warns that achieving significant improvements in data extraction results will require organizations and their corresponding integrators to combine capabilities such as higher image quality, database lookups/verification, auto-classification, and physical sorting into an integrated business process. This is a big change from traditionally siloed capture functions that often resulted in simple picture-taking with downstream exceptions, Murphy notes.
Here's how Murphy sees the pieces fitting together:
• Image Quality -- There's no question that higher quality, cleaner images translate into higher read rates. While image quality can be affected by forms design (e.g. clear zones), the type of scanner an organization uses can also play a pivotal role. For instance, many scanners don't actually scan at their advertised resolution; the manufacturers are referring to their output resolution, not their true optical scan resolution. "Image enhancement can only do so much," Murphy notes. In this scenario, a fuzzy capture translates into a poor black-and-white image for recognition. To ensure high-quality images, Murphy recommends as a starting point that users look for scanners that truly scan at (not merely output at) the 300 dpi (dots per inch) level suggested by most recognition vendors.
• Database verification -- Murphy sees increasing demand for database verification -- the process of matching intelligent character recognition (ICR) results against existing data. These matches can be used to automatically populate data entry fields or correct ICR misreads. Database verification is gaining traction in applications such as invoice processing and remittance lockbox processing, Murphy notes. So what's the appeal of this technology? Murphy says that while recognition read rates can now top 90 percent, the remaining misreads may require the manual keying of many fields on a document. Database verification can automatically verify and then populate or correct these fields, reducing manual keying as well as the potential for errors that it introduces.
• Classification -- Combining character recognition with sophisticated logic, auto-classification groups documents, reducing the time necessary to organize information and fill in data gaps. Better document classification drives improvements in data extraction rates, Murphy said.
• Physical sorting -- Despite the industry's push towards electronification, there are times when physically sorting documents still makes sense, particularly in complex data extraction environments, Murphy said. "If a document can't pass your validation or quality assurance processes, chances are, you will need to rescan the document," Murphy said. "The time to determine this is early in the capture process, and not after the documents have been boxed up or moved to another location while you have batches of work awaiting completion."
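The database verification idea above can be sketched in a few lines: a low-confidence ICR read is fuzzy-matched against a lookup table, and a sufficiently close match auto-corrects the field instead of routing it to manual keying. This is a hypothetical illustration, not ibml's implementation; the account list and matching threshold are assumptions.

```python
# Hypothetical sketch of database verification of an ICR field read.
# A misread account number is matched against a customer master file;
# a close match auto-corrects the field, avoiding manual keying.
import difflib

# Assumed lookup table standing in for a customer master database.
customer_accounts = ["4417823", "9902157", "5530961"]

def verify_account(icr_read, threshold=0.85):
    """Return the best database match for an ICR read, or None."""
    best = difflib.get_close_matches(
        icr_read, customer_accounts, n=1, cutoff=threshold
    )
    return best[0] if best else None

# A one-character misread still resolves to the correct account;
# an unrelated string is rejected and would go to an exception queue.
assert verify_account("4417828") == "4417823"
assert verify_account("0000000") is None
```

The threshold is the key tuning knob: set it too low and the system "corrects" reads onto the wrong record, which is exactly the error-introduction risk manual keying has.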
Taken together, Murphy believes these functions will help provide the improvements in data extraction results that respondents to the TAWPI Question of the Week cited as being critical to their capture process.
What do you think?
At a time when businesses are focusing like never before on operations efficiency and corporate responsiveness, respondents to a recent TAWPI Question of the Week cited data extraction as the most important part of the capture process to automate. Data extraction is the process of interpreting data on documents for further processing or storage. Sixty-percent of respondents to the survey identified data extraction as the key area for automation, topping business process integration (20 percent), classification/sorting (20 percent), filing/archive (0 percent) and data validation (0 percent).
The results come as no surprise to ibml President and CEO Derrick Murphy, who notes that peaking recognition rates leave data extraction as the capture function with the most room for improvement.
But Murphy warns that achieving significant improvements in data extraction results will require organizations and their corresponding integrators to combine capabilities such as higher image quality, database lookups/verification, auto-classification, and physical sorting into an integrated business process. This is a big change from traditionally siloed capture functions that often resulted in simple picture-taking with downstream exceptions, Murphy notes.
Here's how Murphy sees the pieces fitting together:
• Image Quality -- There's no question that higher quality and cleaner images translate into higher read rates. While image quality can be affected by forms design (e.g. clear zones), the type of scanner an organization uses can also play a pivotal role. For instance, many scanners don't actually scan at their advertised rate; the manufacturers are referring to their output, not their real scan rate. "Image enhancement can only do so much," Murphy notes. In this scenario, a fuzzy image translates into a poor black and white image for recognition. To ensure high quality images, as a starting point Murphy recommends that users look for scanners that meet the 300 dpi (dots per inch) scan level (not output) suggested by most recognition vendors.
• Database verification -- Murphy sees increasing demand for database verification -- the process of using logic to utilize existing data to match up ICR results. These results can be used to automatically populate data entry fields or correct intelligent character recognition (ICR) misreads. Database verification is gaining traction in applications such as invoice processing and remittance lockbox processing, Murphy notes. So what's the appeal of this technology? Murphy says that while recognition read rates can now top 90 percent, the remaining misreads may require the manual keying of a lot of fields on a document. Database verification can automatically verify then populate/correct these fields, reducing manual keying as well as the potential for errors that it introduces.
• Classification -- Combining character recognition with sophisticated logic, auto-classification groups documents, reducing the time necessary to organize information and fill in data gaps. Better document classification drives improvements in data extraction rates, Murphy said.
• Physical sorting -- Despite the industry's push towards electronification, there are times when physically sorting documents still makes sense, particularly in complex data extraction environments, Murphy said. "If a document can't pass your validation or quality assurance processes, chances are, you will need to rescan the document," Murphy said. "The time to determine this is early in the capture process, and not after the documents have been boxed up or moved to another location while you have batches of work awaiting completion."
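The database verification step Murphy describes can be sketched as a simple fuzzy lookup: an ICR result is compared against values already held in a line-of-business database, and a close-enough match replaces the misread. The snippet below is a minimal illustration, not any vendor's implementation; the customer list, function name and similarity threshold are hypothetical stand-ins for real reference data and tuning.

```python
import difflib

# Hypothetical reference data, standing in for values already held
# in an existing line-of-business database.
KNOWN_CUSTOMERS = [
    "ACME INDUSTRIAL SUPPLY",
    "NORTHWEST MEDICAL GROUP",
    "RIVERSIDE UTILITIES",
]

def verify_against_database(icr_value, known_values, threshold=0.8):
    """Match a (possibly misread) ICR result against known database values.

    Returns the best database match if its similarity clears the
    threshold; otherwise returns None, and the field would be routed
    to manual keying instead.
    """
    best = difflib.get_close_matches(icr_value, known_values,
                                     n=1, cutoff=threshold)
    return best[0] if best else None

# A typical ICR misread: "1" recognized where the document showed "I".
corrected = verify_against_database("ACME INDUSTR1AL SUPPLY", KNOWN_CUSTOMERS)
print(corrected)  # → ACME INDUSTRIAL SUPPLY
```

A production system would tune the threshold per field type (payee names tolerate more fuzz than account numbers) and log low-confidence matches for review rather than accepting them silently.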
Taken together, Murphy believes these functions will help provide the improvements in data extraction results that respondents to the TAWPI Question of the Week cited as being critical to their capture process.
What do you think?
Monday, February 15, 2010
Healthcare Standardization
Posted by Mark Brousseau
Standardization of industry practices is critical to the strength of the healthcare market. Lee Barrett, executive director of the Electronic Healthcare Network Accreditation Commission (EHNAC) explains:
As the healthcare industry continues to evolve to meet regulations and requirements outlined in ARRA, HITECH and HIPAA, more than ever, there’s need for standardization of industry practices and optimization of stakeholder cooperation. Coupled with the complex issues surrounding interoperability, privacy, security and access is the fact that healthcare networks, financial service firms, payer networks, e-Prescribing and other solution providers and vendors need to overtly demonstrate their readiness, competence and capability to address these issues and comply with a complex web of regulations.
When any industry goes through the process of defining the standards to which industry participants should adhere, that industry becomes stronger in its own operations and earns greater respect from affiliated and external stakeholders. This is precisely the case with the electronic healthcare transaction industry.
EHNAC, or the Electronic Healthcare Network Accreditation Commission, is focused on establishing, developing, updating and refining the criteria that determine whether organizations operating in the healthcare electronic transaction industry receive accreditation. Through a dialogic process that builds on stakeholder recommendations, insights and comments, EHNAC develops and promotes criteria for best practices, which focus on simplifying administrative processes, maintaining open competition and enhancing operational integrity.
In January, EHNAC announced the finalization and adoption of program criteria for 2010. This announcement concluded a 60-day public comment period for the following programs:
1. ASPAP-EHR – Application Service Provider Accreditation Program for Electronic Health Records
2. ePAP – e-Prescribing Accreditation Program
3. FSAP EHN – Financial Services Accreditation Program for Electronic Health Networks
4. FSAP Lockbox – Financial Services Accreditation Program for Lockbox Services
5. HNAP EHN – Healthcare Network Accreditation Program for Electronic Health Networks
6. HNAP Medical Biller – Healthcare Network Accreditation Program for Medical Billers
7. HNAP TPA – Healthcare Network Accreditation Program for TPAs
8. HNAP-70 – Healthcare Network Accreditation Plus Select SAS 70 Criteria Program
9. OSAP – Outsourced Services Accreditation Program
In addition, the commission developed draft criteria for Health Information Exchange (HIE) entities. In February, these draft criteria were released for a 60-day public comment and review period; they will be finalized during the second quarter of 2010.
The issues addressed through the criteria review and approval process become increasingly complex, as the industry responds to specific provisions in the federal acts. Criteria for accreditation programs today address health data processing response times and security; privacy and confidentiality for financial service providers; and e-Prescribing timeliness and security. As regulatory guidelines become more complex, industry participants are called on to make sure their operations are simplified, secure and compliant.
Accreditation also simplifies the process of discerning between those who are adhering to industry standards, and those who are not.
Thursday, February 4, 2010
Don't Turn Cybersecurity into a Bureaucracy
Posted by Mark Brousseau
New legislation being discussed in Washington runs the risk of turning cybersecurity into a bureaucracy. Wayne Crews, vice president for policy at the Competitive Enterprise Institute, thinks a better solution is to enhance private sector practices. He explains:
The House of Representatives is considering HR 4061, the Cybersecurity Enhancement Act. A solid Cybersecurity Enhancement Act might read “Title I: Stop losing federal laptops.” That’s too flip, but consider that there are cybersecurity risks to cybersecurity legislation.
Vulnerabilities in the government’s information security policies and the need to “bring government into the 21st century” have long been noted. But given the constant temptation by politicians in both parties to meddle with cybersecurity policy by steering research and development in unnatural directions, any poor decisions made at this juncture could undermine both public and private information security.
Politicians, especially in frontier industries like information technology, often take the easy path of seeking massive sums to establish taxpayer funded research grants for politically favored cybersecurity initiatives, set up redundant cybersecurity agencies, programs, and subsidies. This is precisely what the Cybersecurity Enhancement Act will do, potentially steering cybersecurity research away from its natural, safer, course.
Vastly expanding federal grants, fleets of scholarships and government-induced Ph.D.s in computer security is not the same as actually bolstering security, nor is there any reason the private sector cannot fund the training of its own such personnel or provide application-specific training as needed. Moreover, many serious security problems are not matters of new training but simply of embracing security “best practices” that already exist.
The Cybersecurity Enhancement Act amounts to pork, and the private sector can and should fund the training of America’s security experts. Online security is an immensely valuable industry today, and there is no shortage of private research incentive and potential profit.
Taxpayer-funded scholarships have already been extended to universities in countless respects, and incentives already abound for students to pursue technology careers. These new programs can easily grow beyond the proposed, already-generous bounds.
It’s beyond doubt that online security problems exist. Yet the tendency of cybersecurity today to be seen as an increasingly government-spearheaded function is worrisome. The taxpayer-funding approach can benefit some sectors and companies at the expense of competition and of computer security itself. Federal spending and intervention may encourage market distortion by skewing private investment decisions, or by promoting one set of technologies or class of providers at the expense of others.
We need better digital equivalents of barbed wire and door locks, which private companies are constantly competing to improve. While government law enforcement agencies have a necessary role to play in investigating and punishing intrusions on private networks and infrastructure, government must coexist with, rather than crowd out, private sector security technologies. Otherwise we become less secure, not more.
A substantial government role invariably grows into an irresistible magnet for lobbyists and the creation of bloated “research centers” and could all too easily become the locus for establishing sub-optimal government authority over our most vulnerable frontier technologies and sciences.
The solution? Enhancing private sector cybersecurity practices.
Both suppliers and customers in the high-tech sector increasingly demand better security from all players. Improving private incentives for information sharing is at least as important as greater government coordination and investment to ensure security and critical infrastructure protection. That job will entail liberalizing critical infrastructure assets—like telecommunications and electricity networks—and relaxing antitrust constraints so firms can coordinate information security strategies and enhance reliability of critical infrastructure through the kind of “partial mergers” that are anathema to today’s antitrust enforcers.
The future will deliver authentication technologies far more capable than those of today. Like everything else in the market, security technologies—from biometric identifiers to firewalls to network monitoring to encrypted databases—benefit from competition. Private cybersecurity initiatives will also gradually move us toward thriving liability and insurance markets, to help address the lack of authentication and inability to exclude bad actors that are at the root of today’s vulnerabilities.
Security is an industry unto itself; let’s not turn it into a bureaucracy.
What do you think?
Tuesday, February 2, 2010
Kofax Enhances Capture Products
By Mark Brousseau
Kofax is seeing traction among small and medium-sized businesses for its Kofax Express solution. Now, the vendor hopes to continue this momentum with the release of Version 2.0 of the product.
Kofax Express is an all-in-one scan-to-archive software package for image capture applications.
Kofax Express 2.0 offers incremental enhancements, including an improved user experience and ease of use, extended support for PDF and PDF/A, and upgrades in the way users input documents, Andrew Pery, chief marketing officer at Kofax, told me during a product briefing yesterday afternoon.
"The traction for Kofax Express is coming through our channel," Pery said. "They are focusing on the small to medium-sized segment of the capture market." Kofax resellers are seeing interest across vertical markets, Pery explained, including healthcare, manufacturing, and local governments.
"Even though data capture technology has been around for a long time, there is still a lot of paper in smaller enterprises. Companies recognize the value of utilizing data capture to take costs out of the business," Pery said, predicting single-digital growth for Kofax Express. "Kofax Express is moving capture into the mainstream. This allows us to move down market and defend our market position."
In the higher volume market, Kofax has made significant improvements to the ease of administration and deployment of its Kofax Capture 9 solution. Kofax is positioning the product as the enterprise standard to support multiple instances. "Over the last eight months, we have seen increased adoption of Kofax Capture 9 as an enterprise standard," Pery said. "For instance, we're seeing a renaissance in mail room automation. Companies are trying to take significant costs out of the business."
Companies also are looking to streamline downstream business processes. The goal is to address the intricacies of enterprise capture to improve customer service and strengthen customer relationships. "If you take a look at invoice processing, the only thing that matters is exceptions processing," Pery explained. "That's where the cost is. That's where the service issues are. Once companies have addressed downstream business processes, there are significant strategic advantages," he added.