Thursday, September 30, 2010

Healthcare Payables: From Bad to Worse?

By Amer Khan (akhan@egisticsinc.com) of eGistics (www.egisticsinc.com)

Effectively managing the payables process is a big job for most companies, but for healthcare organizations, it is a particularly tall order -- and it's about to get a lot more challenging.

The problem in managing healthcare payables stems from the byzantine network of buyer and seller relationships employed by most healthcare organizations, combined with the increasingly complex procurement processes and contracts that healthcare organizations use to purchase goods and services. Every day, the typical healthcare organization receives a mountain of invoices from many different suppliers, most under different contracts with potentially different payment arrangements.

When you mix in the unusually high number of suppliers that most healthcare organizations use -- a hospital might have thousands of suppliers compared to a few dozen for a big law firm -- you can see how the payables process can quickly become complicated. For instance, on a given day, a hospital might receive invoices for everything from Band-Aids to the pricey cardiology equipment it leases.

The healthcare industry's attempts to address the inefficiencies of the payables continuum have delivered mixed results. Several years ago, group purchasing organizations (GPOs) started sprouting up, allowing healthcare organizations to buy a range of goods and services from a single entity, rather than dealing with multiple vendors. While GPOs have enabled their customers to maximize discounts and reduce the number of vendors they do business with, there are still many cases where healthcare providers must source goods and services directly (such as buying from local suppliers), meaning they still must maintain a high number of supplier relationships.

Here's the scary part: the problem is likely to get worse. Every innovation in the healthcare industry -- whether it's new technologies, new devices or new drugs -- may create more suppliers, generating more invoices, contracts, payment arrangements, and, in some cases, acquisition channels. With our nation focusing like never before on innovations in healthcare, providers have no time to waste.

And while healthcare organizations are focusing tremendous amounts of time and resources on "big issues" such as meeting new requirements for electronic health records (EHRs) and ICD-10, driving down the costs associated with payables can deliver significant benefits as well, and in short order.

So, how can healthcare organizations accomplish this?

Since manual processes don't scale, the healthcare industry will need to rethink its approach to payables. The answer starts with eliminating paper at the earliest point possible in the process.

Whether it's converting paper invoices to electronic images, or convincing business partners to provide electronic invoices in the first place, eliminating paper simplifies and automates the payables process. It allows healthcare providers to apply automated rules for processing, and to initiate an electronic payment with detailed remittance information so the supplier can automatically post the receivables. With these types of solutions, providers can solve their current business challenges and lay a solid foundation to manage the increasingly complex payable environment that is sure to come.
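To make the "automated rules" idea concrete, here is a minimal sketch in Python of how an electronic invoice might be routed once the paper is gone. The field names and thresholds are invented for illustration; a real AP system would pull contract terms from an ERP or contract-management system.

```python
# A minimal sketch of rules-based invoice routing. All field names and
# limits are hypothetical -- real systems match against contract terms
# stored in an ERP or contract-management system.

def route_invoice(invoice, contracts):
    """Decide how an electronic invoice should be processed."""
    contract = contracts.get(invoice["supplier_id"])
    if contract is None:
        return "manual-review"          # no contract on file
    if invoice["amount"] > contract["approval_limit"]:
        return "escalate"               # needs a manager's sign-off
    if invoice["amount"] <= contract["auto_pay_limit"]:
        return "auto-pay"               # straight-through processing
    return "standard-approval"

contracts = {"S-100": {"approval_limit": 50_000, "auto_pay_limit": 1_000}}
print(route_invoice({"supplier_id": "S-100", "amount": 250}, contracts))
# -> auto-pay
```

The payoff is that the thousands of invoices a hospital receives each day fall into a handful of automated paths, with only the exceptions touching human hands.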

What do you think?

Monday, September 27, 2010

It's OK to Look

Posted by Mark Brousseau

Companies pursue patents for a wide variety of reasons -- competitive advantage, company valuation, tax credits and others -- but in chasing patent claims, did you know that it actually benefits companies to look at patent applications filed by competitors, too?

That’s what one patent expert, attorney Roger Maxwell, believes. In his experience, searching competitors’ published patent applications and existing patents has actually helped his clients maximize their commercial advantages and follow the law.

“We can take a lesson from Hollywood,” he said. “Every aspiring screenwriter’s lament is that no producer will ever accept an unsolicited manuscript. There’s a reason for that. In copyright law, just as it almost always is in patent law, lack of intent is not a defense. In other words, if an agent or a producer even opens an unsolicited script, and 5 years later produces a film with elements from that script entirely by accident, they could be sued for copyright infringement. To prevent that, they simply don’t accept unsolicited material unless it comes from a licensed agent they trust.”

Maxwell says that the same principle can be applied for companies seeking patents, only in reverse. Whereas access in the copyright arena can lead to inadvertent copying and infringement, access in the patent arena can lead to a better understanding of the state of the art, better product design and better patents.

“It is very possible -- and even likely in some cases -- that one company’s patent may well duplicate elements from someone else’s application or existing patent,” Maxwell says.

“Because intent is not normally an issue, that company could actually be held liable for patent infringement, even if it didn’t know what the other company’s patents said. In the case of patents, it’s important to refrain from having your patent infringe on anyone else’s, even if it’s by accident,” he explains.

To avoid wasting time and money, unknowingly violating the law, and setting themselves up to pay damages, it is critical for companies that are preparing patent applications to search published patent applications as well as existing patents, according to Maxwell.

“In examining other applications, a company can ensure that no elements of their applications cross over with something already on the books, even if it’s in an application filed by someone else,” he adds. “If you know what your competitors are working on with regard to their patents, as the public records afford you the capability to do, then you can work to not use any of their claimable technology in the pursuit of your own patent. That adds more weight to your claim and helps maintain the integrity of the system."

"The whole reason for patents is to recognize the proprietary ownership of a company’s intellectual property and to ensure unique ideas and executions of concepts can be protected. If a company finds a patent that it feels may arguably cover its products, it can further investigate the issue. If the investigation leads them to believe that their design does not infringe or the patent is not valid, an opinion from counsel may shield the company from enhanced damages that could triple the actual damages," Maxwell explains. "If the investigation reveals that the patent is valid and infringed, the company can either redesign its product or negotiate either a license or a purchase of the patent."

"In my view, the best way to get a great patent is to check all available sources to make sure your application is verifiably unique and free from infringement of anyone else’s ideas already in the patent system,” Maxwell concludes.

What do you think?

Sunday, September 26, 2010

10 Steps to Innovation

Posted by Mark Brousseau

In these days of economic dislocation, the concepts of government, business and technical innovation are more important than ever. When it comes to entering, creating, or dominating markets, disruptive innovation is the most powerful tool available. Unfortunately, most companies find disruptive innovation difficult to achieve and virtually impossible to replicate.

In Innovate the Future, renowned innovator and executive David Croslin provides the structure and framework to analyze one's own innovation dilemma, and to beat competitors in generating new ideas and delivering those ideas to market first.

Drawing on his experience leading innovation in organizations ranging from start-ups to the Fortune 20, Croslin explains the top ten steps for optimizing the entire innovation lifecycle:

1. Isolate Drivers of User Value - Function, look, adaptability, price - or all of the above. Focus on triggering a decision to buy versus passive interest.

2. Define Good Enough - A product with more positives than negatives will sell. Focus on this goal before refining for enhanced features and niche crowds.

3. Understand Your Delivery Chain - Separate innovative, value-additive suppliers from passive ones. Focus on suppliers who make your products better.

4. Isolate Delivery Chain Pain Points - Underperforming suppliers may not improve. Find an alternative.

5. Align Viewpoints Within Your Organization - Evaluate team for ability/inclination to increase product potential. Pinpoint inhibitors, plan to minimize their impact.

6. Kill Assumptions - Base evaluation of success or failure upon known facts, not subjective interpretation.

7. Isolate Intellectual Property - Break products into components, so intellectual property behind each can be recombined into new products.

8. Identify New Markets for IP - Coke is not marketed strictly as soda - think Coke Slurpees for 7-Eleven and A&W Root Beer candy for Brach's.

9. Don't Confuse Invention for Innovation - A highly inventive product won't necessarily move markets. A highly innovative one will. Know your market.

10. "Cool" Is Not Key - Cost, availability, consistency and ability to integrate within consumers' lives matter much more.

What strategies have worked in your organization?

Friday, September 24, 2010

10 Pitfalls to Avoid When Going Social

If you were to make a list of up-and-coming business trends, social media strategies would probably be near the top. Actually, scratch that "up-and-coming" part—social media is already here. However, thousands of companies are rushing headlong into the profile-creating, news-tweeting, blog-posting frenzy...only to find that their valiant efforts are not getting the results they had hoped. If you're looking for fans, followers, and friends to build a Social Nation around your business, don't panic, says Barry Libert. There is simple advice that will help businesses avoid the pitfalls and make a strong online impact.

"It's true: there are countless benefits to joining what I call the Social Nation revolution—but just like any strategy for growth, social media isn't foolproof," points out Libert, author of the new book Social Nation: How to Harness the Power of Social Media to Attract Customers, Motivate Employees, and Grow Your Business. "If you don't want your company's social strategy to fall flat, there are some guidelines you'll need to follow."

Libert knows what he's talking about. After all, he's the Chairman and CEO of Mzinga, a company that provides social software to businesses. Quite literally, it's his job to be social media savvy. And he's adamant that before you start building your own Social Nation, you need to have a well-researched game plan.

"When it comes to building a successful social network for your company, you need to understand that there's a lot of prep work to be done," he explains. "You can't just set up a Facebook profile for your company, tweet once or twice a day, and expect public interest in your company to shoot through the roof. Far from it, actually."

Think about it this way: if you were in charge of your company's booth at a trade show or conference, you wouldn't just slap your company's logo onto a piece of poster board, place your business cards on the table, and hope for the best, would you? Of course not. Yet that's exactly how some companies approach social media—and that's why so many of these initiatives fail.

“If you want to become a meaningful part of social conversations and interactions," explains Libert, "you've got to know who your target 'fan base' is, where they spend their time online, and what sorts of content and programming are valuable and relevant to them and will foster their continued interest and participation. You also need to make sure you have the wherewithal to commit to growing and sustaining your Social Nation, and you've got to make sure that you have buy-in from within your company. And that's just for starters."

Sure, it may sound intimidating, but don't give up yet. Half the battle is knowing which mistakes not to make, and Libert, in the book Social Nation, is eager to share the top 10 social media pitfalls he's seen organizations fall victim to in the past. Read on to discover what they are:

Pitfall #1: Running a Social Nation like a traditional business. If you want to run a social company, you first need to understand that almost everything you do is a two-way street. That is to say, you're not going to prosper if your products and services are designed solely by folks on the inside. You need to embrace the perspectives and contributions of your employees, as well as those of customers and partners.

Pitfall #2: Underinvesting in social initiatives and abandoning them too soon. Understand that a Social Nation is organic—it won't materialize with a proverbial snap of the fingers. Early on, you'll need to invest a good deal of time, thought, and money in attracting fans and followers—and your efforts will need to be sustained. Only after you've built a firm foundation will your social network begin to sustain itself through participant contribution and recommendation.

Pitfall #3: Neglecting to find ways to encourage and inspire your Social Nation's followers and fans. When you stop to think about it, you'll realize that your fans and followers are essentially volunteering their time and energy to serve as developers, sounding boards, and advertisements for your company. So for goodness' sake, respect what they have to say and take their input to heart!

Pitfall #4: Relying on a "build-it-and-they-will-come" mentality. Ummm...you don't really think that launching a new website and firing off posts at various online networking hotspots will bring fans and followers flocking, do you? Of course not! To some extent—usually a large one—you'll need to purposefully reach out to potential community members and make it worth their while to accept your invitation.

Pitfall #5: Delaying the process of going social. Contrary to what you may wish, your company doesn't have the luxury of waiting until it's "convenient" to go social. Why? Well, you have competitors, right? And if you don't start gathering loyal followers and fans now, there's a good chance that some other company will woo them first.

Pitfall #6: Underestimating the power of a Social Nation. If you believe that social networking is just a window dressing that your company "needs" (but not really), then think again. Social media and community collaboration bring many benefits, including brand-building, customer loyalty and retention, cost reductions, improved productivity, and revenue growth.

Pitfall #7: Neglecting employees, partners, investors, or customers when building your Social Nation. Yes, set up a "focus group" of employees to serve as community leaders who will shepherd your company into the social networking world, but don't put all of the power in their hands. Social Nations are organic organizations, so the more people who are empowered to influence yours, the better.

Pitfall #8: Relying on traditional approaches when designing your Social Nation. A decade ago, you probably would have been horrified at the thought of releasing ideas and products into the hands of your customers before they were as complete as you could get them. With social networking, that monolithic approach is now becoming obsolete.

Pitfall #9: Developing your own social software and analytics solutions. You wouldn't dream of placing "remodeling the office" or "handling legal issues" in the Do It Yourself category, would you? Not too many would. Instead, you'd hire someone skilled in those areas. Do yourself a favor and use the same strategy when it comes to building your own Social Nation.

Pitfall #10: Getting caught without partners to help you succeed. Libert has alluded to this one before, but it bears specific emphasis: make sure that you truly treat your community members as partners, not just as fans or numbers. Yes, integrating into the social web (Facebook, Twitter, and other social networks) is key to your company's future success, but being connected to the social web is only a part of what you need to do. Shifting your business strategically, culturally, and operationally are key components to the equation.

What do you think?

Information, Please!

Posted by Mark Brousseau

After years of discussion, plans to expand the Automated Clearing House (ACH) Network to facilitate the electronic transfer of supplemental remittance information may finally gain traction.

Initiatives to use the ACH rails to transfer information associated with business-to-business payments and healthcare payments were among the hottest topics at WesPay's Payments Symposium this week at the Renaissance Hotel in Long Beach, California, notes Leilani Doyle (ldoyle@usdataworks.com), product manager at US Dataworks (www.usdataworks.com), a Houston-based solutions provider.

"The ACH Network has proven to be a stable and successful payments channel," Doyle explains. "But as electronic transactions continue to gain acceptance, it is clear that the ACH Network needs to be expanded to more efficiently carry remittance information, as well as payment instructions."

Doyle notes that several initiatives already are underway to allow information to be passed along with payment instructions. The most notable effort is the new International ACH Transaction (IAT) format. "It was necessary for OFAC [Office of Foreign Assets Control] reporting that international payments include enough information with ACH transactions for proper screening. To accomplish this, addenda records were added to accommodate the required information," Doyle explains.

Now, NACHA is extending this concept to business-to-business payments, hoping to eliminate one of the largest remaining obstacles in electronifying business checks: the need to communicate remittance information. "This is not a new concept," Doyle says. "NACHA's CTX [Corporate Trade Exchange] format was a start. But the ANSI standards it relies on are too complex to be effectively used by mid-sized businesses." As an alternative, a plan is under consideration to combine standardized addenda records with XML tags that could be interpreted by both sending and receiving ACH systems. "This approach may be a real solution to the B2B ACH trade payments problem."
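As a rough illustration of the idea (not NACHA's actual draft format), here is how XML remittance detail could be split across the 80-character payment-related-information payloads that CTX-style addenda records carry. The invoice numbers and element names are invented.

```python
# Illustration only: packing XML remittance detail into 80-character
# ACH addenda payloads. The CTX format allows thousands of addenda
# records per entry, each carrying 80 characters of payment-related
# information.

def to_addenda_payloads(xml_remittance, width=80):
    """Split a remittance string into fixed-width addenda payloads."""
    return [xml_remittance[i:i + width]
            for i in range(0, len(xml_remittance), width)]

xml = ('<remit><invoice num="INV-4401" amount="1250.00"/>'
       '<invoice num="INV-4402" amount="310.75"/></remit>')
payloads = to_addenda_payloads(xml)
print(len(payloads))   # remittance this short fits in two payloads
```

The receiving bank's ACH system would reassemble the payloads and hand the XML to the supplier's receivables application for automatic posting.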

Another big opportunity for an expanded ACH Network lies in the healthcare space. "Imagine an ACH standard record format that allows EOB [explanation of benefits] information to be linked from within the payment," Doyle says. "Using this approach, there would be no need to send all of the EOB information around on the ACH rails. Instead, an addenda record would provide secure and specific access to EOB information in an XML format," Doyle explains, adding that this "simple and extensible" solution is designed with the healthcare market's fast-changing requirements in mind.
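To sketch Doyle's "link, don't ship" approach for EOBs, imagine an addenda payload that carries only a compact XML reference and a one-time token for retrieving the hosted EOB detail. Everything here, including the element names and token scheme, is invented for illustration.

```python
# Hypothetical sketch: instead of sending the full EOB over the ACH
# rails, the addenda record carries a short XML pointer plus a one-time
# access token for the hosted EOB detail.

import hashlib
import secrets

def eob_reference(claim_id):
    """Build a compact XML reference that fits in one addenda payload."""
    token = secrets.token_hex(8)   # one-time access token (illustrative)
    digest = hashlib.sha256(claim_id.encode()).hexdigest()[:12]
    return f'<eobRef claim="{digest}" token="{token}"/>'

ref = eob_reference("CLM-2010-88431")
print(len(ref) <= 80)   # the pointer fits in a single 80-char payload
```

Keeping the bulky EOB data off the rails is what makes the approach "simple and extensible": the hosted XML can change with healthcare requirements without touching the ACH record format.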

NACHA is hardly standing still as it puts the finishing touches on its information initiatives. The organization introduced Secure Vault, a payment system that allows consumers to pay for goods and services over the Internet, without disclosing their bank account information. "The Secure Vault payment method connects directly to a bank's online banking application where the customer enters their ID and password, and money is then transferred to the merchant using the ACH Network," Doyle says. After a lengthy pilot, Secure Vault is now "ready for primetime," Doyle says. "The Secure Vault concept is sound, but only time will tell whether the adoption rate is high enough for it to become as ubiquitous as credit card and e-check payments for Internet transactions," Doyle adds.

What do you think?

The Hunt for "Orphan Storage"

By Rich Walsh, Viewpointe (www.viewpointe.com)

Storage professionals are now under pressure to find and use “orphan storage,” rather than buying or building more capacity. Orphan storage is unused or unallocated capacity lurking in everything from databases to disk drives and storage area networks. The problem seems so universal that I hear about it almost everywhere I go. I recently heard one executive say: “When we buy storage, we know where it is, but now our mandate has become finding unused storage, wherever it happens to be.”

Symantec’s CEO has even gone so far as to tell the market to "stop buying storage." I couldn’t agree more with this sentiment. Not being able to use your existing space or, worse, being unable to access the storage you already have – those seem to be the larger problems. IT executives are probably both gratified and mortified that this issue, which is hardly new to them, is finally getting some attention.

Recently, we asked IDC to take a deeper dive into this issue, and in the resulting whitepaper, IDC pointed to outsourced storage as a good solution to the growing capacity problem. In general, IDC concluded that outsourced systems work very well for providing easy access and appropriate amounts of storage. Moving data to a hosted repository allows companies to pay only for the capacity they actually need, as opposed to an in-house infrastructure that is generally built for future consumption. And this approach may be better suited for accessing the needed data at a later date.
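A back-of-the-envelope comparison shows why paying only for what you use can come out ahead of provisioning for future growth. The prices and growth figures below are purely hypothetical.

```python
# Hypothetical cost comparison: in-house capacity bought up front for a
# projected year-end peak, versus hosted storage billed on actual usage.

def in_house_cost(provisioned_tb, price_per_tb):
    """Up-front purchase: paid in full regardless of utilization."""
    return provisioned_tb * price_per_tb

def hosted_cost(monthly_usage_tb, price_per_tb_month):
    """Hosted repository: billed each month on actual usage."""
    return sum(u * price_per_tb_month for u in monthly_usage_tb)

usage = [10 + 2 * m for m in range(12)]   # starts at 10 TB, grows 2 TB/month
print(in_house_cost(60, 400))             # provision 60 TB for the peak
print(hosted_cost(usage, 30))             # pay only for what's used
```

With these made-up numbers the hosted approach costs a fraction of the up-front build-out, because the in-house buyer pays all year for capacity it only needs in December.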

Right now, IT executives want to make good use of all the equipment and devices that they have already purchased, and that is sound business judgment. Still, at some point, organizations are going to deplete the space they have and simply purging existing files may not be enough to keep up with the increased demand.

However, the question remains: What should companies do once they have determined just how much existing storage they have? Will they continue to buy ad-hoc, only to be faced with the exact same orphan storage problem in a few more months? Or, is it time for a fresh approach to this ever-growing problem?

"What are you doing here?"

By Greg Lusch (glusch@ibml.com), ibml (www.ibml.com)

With all of the banks and financial services companies participating in this week's Healthcare Payments Automation Summit (HPAS) in Boston, the healthcare providers and payers in attendance could be excused for momentarily thinking that they were in the wrong place. But they weren't, and neither were their fellow attendees from banking and financial services.

When a single market represents a whopping 17 percent of the country's Gross Domestic Product (GDP) -- as healthcare does -- lots of companies will be looking for ways to cash in. Banks and financial services companies are no exception. And based on my conversations at HPAS, more providers are open to help from banks in automating healthcare payments.

For instance, there was a lot of conversation at HPAS about adapting bank lockbox services to process explanations of benefits (EOBs) and other medical documents. According to the results of a survey released by IAPP-TAWPI at the event, 34 percent of healthcare providers already use a bank lockbox for healthcare payments. Undaunted by the increased security and privacy regulations under the HITECH Act, even more banks are clearly pushing forward with lockbox services aimed squarely at hospitals and physician practice groups. In fact, the participants on a panel at the event unanimously predicted that the percentage of providers that use a bank lockbox would climb, while a speaker in another session said he expected "slow but steady" growth for both bank and provider-based EOB solutions.

And if HPAS is any indication, banks also are making headway with remote deposit capture (RDC) solutions targeted at the healthcare space, driven by rising patient self-pay and co-insurance/co-payment obligations. According to the IAPP-TAWPI survey released at HPAS, 22 percent of providers currently use RDC. Several vendors of RDC solutions exhibited at the event (Creditron, EPSON and WAUSAU were among them), and a few providers shared case studies of their experiences with the technology as part of the conference agenda (faster funds availability was cited as a key benefit). A common refrain among healthcare providers at HPAS was that lower bank fees have greatly improved the business case for RDC, while banks have done a better job of adapting their solutions to the unique needs of providers.

The role of banks in the healthcare space also was a dominant -- and sometimes heated -- topic during the Healthcare Payments Council meeting that immediately followed HPAS.

The good news for banks looking to crack the healthcare space is that most HPAS attendees believe that while automated payment transactions (claims, remittances and payments) will continue to make gains, paper will be a fact of life in the industry for the foreseeable future.

And that is why banks were at HPAS.

Thursday, September 23, 2010

Bookies Pick iPad Killer

With rumors swirling that BlackBerry maker Research in Motion (RIM) could unveil its forthcoming tablet -- a.k.a. “the BlackPad” -- as early as next week, consumers and analysts are guessing what company will be next to officially announce a tablet to compete with Apple’s iPad.

Mickey Richardson and his team at Bookmaker.com, one of the leading sportsbooks, have calculated the odds on which company will be next to release a tablet to compete with Apple’s iPad in 2010.

LG +300 25%
HTC +100 50%
SHARP +300 25%
MOTOROLA +300 25%
NOKIA +150 40%
SANYO +300 25%

For those of you unfamiliar with the joys of wagering, the +/- indicates the return on the wager, and the percentage is the implied likelihood of that outcome. For example, betting on the company least likely to win would earn the most money, should it happen.
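In other words, these are American "moneyline" odds: a positive number is the profit on a $100 stake, and the percentage in the table is the implied probability the odds encode. A quick script confirms the table's math.

```python
# Convert American moneyline odds to the implied win probability.
# A positive line (+300) is the profit on a $100 stake; a negative
# line (-150) is the stake required to win $100.

def implied_probability(moneyline):
    """Return the implied win probability for an American moneyline."""
    if moneyline > 0:
        return 100 / (moneyline + 100)
    return -moneyline / (-moneyline + 100)

for company, line in [("HTC", 100), ("NOKIA", 150), ("LG", 300)]:
    print(company, f"{implied_probability(line):.0%}")
# HTC 50%, NOKIA 40%, LG 25% -- matching the table above
```

Note the six listed probabilities sum to well over 100 percent; the excess is the bookmaker's built-in margin.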

Where would you place your bet?

Online Storage and Privacy Laws

Posted by Mark Brousseau

If you store sensitive files on your personal computer which law enforcement authorities wish to examine, they generally cannot do so without first obtaining a search warrant based upon probable cause. But what if you store personal information online—say, in your Gmail account, or on Dropbox? What if you’re a business owner who uses Salesforce CRM or Windows Azure? How secure is your data from unwarranted governmental access?

Both the U.S. Senate and the House of Representatives are investigating these crucial questions in two separate hearings this week. Congress hasn’t overhauled the privacy laws governing law enforcement access to information stored with remote service providers since 1986. The Electronic Communications Privacy Act (ECPA), the key federal law governing electronic privacy, has fallen increasingly out of touch with reality as technology has evolved and Americans have come to rely on cloud services like webmail and social networking. As a result, the government can currently compel service providers to disclose the contents of certain types of information stored in the cloud without first obtaining a search warrant or any other court order requiring the scrutiny of a judge.

Against this backdrop, the Competitive Enterprise Institute has joined with The Progress & Freedom Foundation, Americans for Tax Reform, Citizens Against Government Waste, and the Center for Financial Privacy and Human Rights in submitting a written statement to the U.S. Senate and House Judiciary Committees urging Congress to reform U.S. electronic privacy laws to better reflect users’ privacy expectations in the information age. The groups also belong to the Digital Due Process coalition, a broad array of public interest organizations, businesses, advocacy groups, and scholars who are working to strengthen U.S. privacy laws while also preserving the building blocks of law enforcement investigations.

“The success of cloud computing—and its benefits for the U.S. economy—depends largely on updating the outdated federal statutory regime that currently governs electronic communications privacy,” the statement argues. “If Congress wants to ensure Americans enjoy the full benefits of the cloud computing revolution, it should simply reform ECPA in accordance with the principles proposed by the Digital Due Process coalition.”

What do you think?

Wednesday, September 22, 2010

From Healthcare to Baseball

Posted by Mark Brousseau



Chuck Garcia of BOK Financial, Kendall Brown and Gordon Sellers of Systemware, Serena Smith of FIS, Mark Brousseau of IAPP-TAWPI, and Alan Beaney of Systemware take in a Boston Red Sox game after attending the Healthcare Payments Automation Summit.

Health Reform’s Impact on AP Costs

Posted by Mark Brousseau

The new federal health reform law will drive accounts payable (AP) costs higher over the next two years, according to industry stakeholders who responded to a survey at this week’s IAPP-TAWPI Healthcare Payments Automation Summit (HPAS) in Boston. The survey was conducted during the conference by IAPP-TAWPI, APQC and PRGX. Survey respondents included healthcare payers and providers; third-party services providers (such as medical billing firms); banks; and IT vendors.

More than half (51.9 percent) of the HPAS attendees who responded to the survey predicted that health reform will result in higher AP costs over the next two years, while 48.1 percent of survey respondents said that AP costs will remain unchanged. None of the conference attendees that responded to the survey believe that short-term AP costs will decrease as a result of health reform.

HPAS attendees who responded to the survey were more divided on the long-term impact of health reform on AP costs. More than one-third (36.2 percent) of survey respondents believe that health reform will drive AP costs higher long-term (defined in the survey as over two years from now), while an equal percentage of respondents believe AP costs will remain unchanged. On the bright side, 27.7 percent of respondents predicted that health reform will result in lower AP costs long-term.

Among the other findings of the HPAS survey:

… Data integration, processing performance, and integration of physician data were the top healthcare AP challenges identified by respondents, followed by cost pressures, manual data entry (which drives costs up), and the ability to track and report evidence-based improvements in cost.

… Most survey respondents (57.7 percent) believe that health reform will have no impact on AP processing performance over the next two years, while a plurality of respondents (39.6 percent) predicted that health reform will result in lower AP processing performance long-term.

… Nearly two-thirds (64 percent) of survey respondents believe that health reform will have no impact on AP late payments and error rates. Long-term, survey respondents were more divided, with a plurality (38.3 percent) predicting that health reform will have no impact on AP late payments and error rates, 31.9 percent predicting that health reform will result in more AP late payments and errors, and 29.8 percent predicting that health reform will help decrease AP late payments and errors.

… HPAS attendees are not optimistic about health reform’s impact on IT systems costs. Nearly two-thirds (62.3 percent) of respondents believe that health reform will drive IT systems costs higher over the next two years, while 37.7 percent of respondents predicted that systems costs would remain unchanged. None of the respondents believe that health reform will result in lower systems costs over the next two years. Long-term, half of the survey respondents believe that health reform will result in higher overall IT systems costs, while 18.8 percent believe IT systems costs will decrease. About one-third (31.3 percent) of respondents predicted that systems costs will remain unchanged.

“Big changes are coming in healthcare, and AP organizations must ask themselves if they are ready,” APQC Analyst Neville Sokol told HPAS attendees. “At times like these, organizations are turning to data and best practices to help them solve problems, improve processes, or design something better. These tools can help make sense of a complex world, and provide a roadmap for moving forward.”

Growing Opportunity for Banks in Healthcare

Posted by Mark Brousseau

The opportunity for banks in the healthcare market is growing, Aaron McPherson, practice director, Payments and Security, Financial Insights, told attendees yesterday afternoon at the Healthcare Payments Automation Summit (HPAS) in Boston. “In the short run, healthcare reform hurt bank sales as providers were waiting to see what would happen. Now, patient payments, in particular, are an underdeveloped segment of the market that will become a key focus for banks,” McPherson said.

McPherson told attendees that several provisions of the healthcare reform legislation will provide a “big boost” to banks that are marketing payments processing services to healthcare providers:

… Greater operations complexity: Healthcare reform will create many more plans for healthcare providers to “deal with” -- each with different deductibles and co-pays.

… Electronic health records: The federal mandates for healthcare providers to implement electronic health records will sap limited resources for payments processing initiatives.

… Cost cutting: “Steep reductions in Medicare payments will force cost-cutting,” McPherson said, adding that this will drive some providers to partner with banks on payments processing.

… Higher patient payments volumes: “Healthcare reform will result in an increase in patient payment volumes, which, in turn, will stress the systems at many providers,” McPherson said.

But if banks are to take advantage of the growing opportunity in the healthcare market, McPherson said they should heed the lessons learned by their peers that were among the pioneers in the space.

Dedicated focus is critical: Three out of four banks that McPherson spoke with before the conference had a dedicated sales force for their healthcare remittance offerings.

Partners are important: “The banks I spoke with said their partners were critical to their success,” McPherson said. “Most banks will want to partner with a processor or specialty service provider. Experience and integration with clearinghouses, payers and such are important differentiators in the healthcare market. One bank bought their partner after a successful year-long collaboration. Another bank only found success in the healthcare space on their third partner.”

Prepare for sales challenges: “All of the banks I spoke with said the healthcare sales cycle was long and required significant subject matter expertise on the part of their salespeople,” McPherson said. “Banks can’t rely on their existing sales staff. They need people who understand the product and the market. Banks also should look for ways to leverage their existing relationships with providers.”

Patience and persistence do pay off: “The banks I spoke with have been at this for years,” he said.

HPAS Vendor Showcase

Posted by Mark Brousseau



Jim Wanner of KeyMark, Mark Brousseau of IAPP-IARP-TAWPI and Bo Minogue of MAVRO Imaging at the vendor showcase Tuesday at IAPP-TAWPI's Healthcare Payments Automation Summit at the Boston Sheraton.

Tuesday, September 21, 2010

The Mid-Term Elections and Healthcare Reform

Posted by Mark Brousseau


Even if Republicans win majorities in Congress this fall, it's unlikely that they will be able to repeal the recently passed healthcare reform legislation, Dennis G. Smith, managing director of the Medicaid practice at Leavitt Partners, said during a keynote presentation this morning at IAPP-IARP-TAWPI’s Healthcare Payments Automation Summit (HPAS) at the Boston Sheraton.

“Nobody is talking about the Republicans winning veto-proof majorities,” Smith said, adding that even if the Republicans did win big, repealing the legislation would only put the country “right back where we started, with the same problems. And when I travel around the country, employers are telling me that they are fed up with the current healthcare environment.”

Against this backdrop, whether there will be significant changes to the healthcare reform law “really depends on whether Obama pivots, and does what Clinton did in the 1990s,” Smith said. “But Obama is far more ideological than Clinton was.”

So what changes could Republicans push through Congress if they were to win majorities in the mid-term elections? One tool available to them is the Congressional Review Act, which allows Congress to veto regulations. They also can cut appropriations for certain mandates. “Even entitlements are subject to appropriations,” Smith noted. Congress can also demand a budget summit, which has occurred about every 12 years, Smith said.

But healthcare industry stakeholders shouldn't wait on Congress. “If you expect to be on the winning team when healthcare reform goes into effect, now is the time to prepare,” Smith concluded.

Monday, September 20, 2010

Tommy Thompson Addresses HPAS

Posted by Mark Brousseau

This morning, former Wisconsin Governor Tommy G. Thompson delivered the opening keynote address at IAPP-IARP-TAWPI's Healthcare Payments Automation Summit (HPAS) in Boston. Weaving his experiences as a four-term governor and the Secretary of the Department of Health and Human Services with his recent experiences in the private sector, Thompson told the standing-room-only crowd that healthcare reform is unlikely to achieve the cost-saving objectives advertised by the Obama Administration, and that hospitals are unlikely to be ready to meet new federal mandates for health information exchanges without financial support from the federal government.

WAUSAU Financial Systems Executive Vice President and member of the IAPP-IARP-TAWPI Board of Directors Kathy Strasser greets former Wisconsin Governor Tommy G. Thompson before his keynote presentation this morning at HPAS.

IAPP-IARP-TAWPI President and CEO Tom Bohn greets former Wisconsin Governor Tommy G. Thompson before his keynote presentation at HPAS this morning.

Eric Jones (right), chairman of the IAPP-IARP-TAWPI Board of Directors, and Kathy Strasser, executive vice president at WAUSAU Financial Systems and member of the IAPP-IARP-TAWPI Board of Directors, greet former Wisconsin Governor Tommy G. Thompson before his keynote address this morning at HPAS.

Former Wisconsin Governor Tommy G. Thompson addressing HPAS this morning.

SWIFT Service Bureaus & Corporate Connectivity

Theodore K. Baxter, Product Manager, EastNets

Implementing a global treasury system involves different channels of communication, via many different standards. SWIFT provides a globally recognized standard for communicating with your banks, reducing the workload of implementing different banking formats and methods. This gives you a single, secure, standardized global platform for conducting treasury management on an international level. And using a single format across your global banks allows you to easily integrate SWIFT into your Treasury Management Systems and ERP.

SWIFTNet provides the following functions as part of its FIN messaging service:

• Treasury payments and notifications (MT 101), notice to receive (MT 210)
• Intraday (MT 942)/end-of-day bank statements (MT 940) and credit/debit advices (MT 900/910)
• Deal confirmations for foreign exchange/interest rate/money market deals (MT 3xx)
• Instructions to deliver/receive securities and statements of holdings (MT 5xx)
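To make these message types a little more concrete, here is a minimal Python sketch that pulls the opening balance, closing balance, and transaction count out of an invented MT 940 customer statement. The sample data and parsing are illustrative only -- real statements carry many more tags and field variants (e.g. :60M:/:62M: intermediate balances) and warrant a proper MT parser.

```python
import re

# Invented MT 940 fragment: opening balance (:60F:), one debit entry (:61:)
# with its information-to-account-owner line (:86:), closing balance (:62F:).
SAMPLE_MT940 = """:20:STMT-0001
:25:12345678/000123456
:28C:147/1
:60F:C100901EUR250000,00
:61:1009010901D12500,00NTRFNONREF//8327000
:86:SUPPLIER PAYMENT INV 4711
:62F:C100901EUR237500,00"""

def parse_balance(line):
    # :60F:C100901EUR250000,00 -> (credit/debit, date YYMMDD, currency, amount)
    m = re.match(r":6[02]F:([CD])(\d{6})([A-Z]{3})([\d,]+)", line)
    dc, date, ccy, amount = m.groups()
    return dc, date, ccy, float(amount.replace(",", "."))

lines = SAMPLE_MT940.splitlines()
opening = parse_balance(next(l for l in lines if l.startswith(":60F:")))
closing = parse_balance(next(l for l in lines if l.startswith(":62F:")))
txn_count = sum(1 for l in lines if l.startswith(":61:"))
print(opening[3], closing[3], txn_count)  # prints: 250000.0 237500.0 1
```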

Using SWIFT in Liquidity Management
Effective liquidity management allows an organization to gain the maximum benefit from its money at minimal cost. As a corporate, you would send a Request for Transfer (MT 101) over FIN to move funds from one bank to another. Reporting on these treasury payments can again be done over FIN, using the intraday Interim Transaction Report (MT 942) or the end-of-day Customer Statement (MT 940) sent by each of your treasury banks. By integrating these end-of-day and intraday statements into your corporate treasury systems, you can better monitor accounts and obtain global visibility of cash, which allows for better control and better fund-management decisions.
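The cash-visibility idea above can be sketched in a few lines: once each treasury bank's end-of-day statement has been parsed, the closing balances can be rolled up into a single global position per currency. The bank codes and figures below are invented.

```python
# Hypothetical closing balances reported in each treasury bank's MT 940,
# rolled up into one global cash position per currency.
eod_balances = [
    {"bank": "BANKUS33", "ccy": "USD", "closing": 1_250_000.00},
    {"bank": "BANKGB2L", "ccy": "GBP", "closing": 430_000.00},
    {"bank": "BANKDEFF", "ccy": "USD", "closing": -75_000.00},  # overdraft
]

def global_position(balances):
    position = {}
    for b in balances:
        position[b["ccy"]] = position.get(b["ccy"], 0.0) + b["closing"]
    return position

print(global_position(eod_balances))  # {'USD': 1175000.0, 'GBP': 430000.0}
```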

The Business Case for Connecting to SWIFT
Corporates wanting to make the business case for connecting to SWIFT must take the costs into account and weigh them against the benefits. The costs include both SWIFT costs (registration, connectivity, and messaging) and non-SWIFT costs (project management, operations, and application integration). The benefits include operational gains -- fewer bank messaging systems to maintain, lower staffing costs (through reduction, reallocation, or growth avoidance), and improved automation -- as well as financial gains from working-capital optimization, better visibility of global cash, and transaction-processing efficiencies. Other benefits include security, standardization, and easier integration.
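As a rough illustration of how such a business case might be weighed, here is a sketch that nets estimated annual benefits against recurring costs and computes a simple payback period for the one-off costs. All figures are made up for illustration; a real case would use your own cost and benefit estimates.

```python
# Illustrative (invented) figures for a SWIFT connectivity business case.
one_off_costs = {"registration": 10_000, "integration_project": 60_000}
annual_costs = {"messaging": 12_000, "service_bureau_fee": 24_000}
annual_benefits = {"staff_reallocation": 55_000, "working_capital": 40_000}

def payback_years(one_off, annual_cost, annual_benefit):
    # Years of net annual benefit needed to recover the one-off outlay.
    net_annual = sum(annual_benefit.values()) - sum(annual_cost.values())
    if net_annual <= 0:
        return None  # never pays back on these assumptions
    return sum(one_off.values()) / net_annual

print(round(payback_years(one_off_costs, annual_costs, annual_benefits), 2))
```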

Three Options for SWIFT Connectivity
There are three options when it comes to SWIFT connectivity: direct, indirect, and Alliance Lite. With the direct method, the SWIFT connectivity infrastructure is owned and maintained by the company itself.

Indirect connectivity is achieved through a SWIFT Service Bureau or Member Concentrator, which hosts and maintains the technical infrastructure for the SWIFT connection and can provide additional services around it. This is the most popular way for corporates to connect to SWIFT: the Service Bureau guides the corporate through the SWIFT onboarding process, and the corporate can take advantage of the considerable knowledge and experience some of these Service Bureaus have built up.

Alliance Lite is the newest direct-connectivity option for corporates. It provides simplified, secure, internet-based connectivity to SWIFTNet and reduces the time it takes to get onto the SWIFT network.

Operational Benefits
One operational benefit of joining SWIFT is reducing the number of communication channels to your banks -- fax banking, e-banking, and so on -- to a single SWIFT channel. SWIFTNet provides a communication platform, along with products and services that let you exchange financial information across a highly secure and reliable network. Another major benefit of adopting SWIFT is the reduction in staff costs that comes with improved automation; staff can then be reallocated to more important core functions. Using a Service Bureau extends this benefit further, as it removes the need to keep SWIFT experts on staff.

Financial Benefits
The financial benefits of using SWIFT include faster communication with your banks, allowing for better working-capital optimization and better insight into what your money is doing. Reducing the number of cash management channels to a single SWIFT channel cuts operational and maintenance costs, and the use of a common standard makes integration easier.

The Service Bureau Option
One of the benefits of choosing a Service Bureau as your connectivity option is the ability to draw on the bureau's experience in guiding customers through the SWIFT onboarding process. SWIFT Service Bureaus such as ours help corporates join SWIFT by walking them through the process step by step. An experienced Service Bureau can help answer your questions about planning your SWIFT project, as in Figure 1, and about the process of going live, as in Figure 2. Another benefit of joining SWIFT through a Service Bureau is the reduced time it takes to get onto SWIFT -- from months with a network-provider installation, for example, to weeks with a company such as EastNets.

Providing a common language that banks understand, SWIFT allows you to use its standards-driven formats, such as FIN and ISO 20022, for liquidity and risk management. SWIFT provides the documentation to make sure you understand the information and services you are seeking to use, and the messages to send to obtain them.

What do you think?

Tuesday, September 14, 2010

4 Reasons Government Entities May Adopt Integrated Payments Hubs

Posted by Mark Brousseau

If you think your operations budgets are tight, try managing payments processing for a government entity. Badly stung by declining tax revenues, most state, county and municipal governments have squeezed their operations budgets dry. And it couldn't have come at a worse time for government operations managers: like their counterparts in the private sector, governments are struggling with how best to adapt their operations to declining check volumes and emerging payments channels.

Leilani Doyle (ldoyle@usdataworks.com), product manager at Houston-based US Dataworks (www.usdataworks.com), believes government entities may find a solution in so-called enterprise payments hubs (or integrated payments hubs), which consolidate paper-based and electronic payments into a single platform, in turn, streamlining processing and eliminating operations silos.

More than one-fifth (22.2 percent) of all government entities that responded to a recent IAPP-TAWPI survey indicated that they have implemented an enterprise payments hub to consolidate paper and electronic payments. The responses from state revenue agencies nearly mirror the overall findings for this question, with 21.4 percent indicating that they have implemented an enterprise payments hub. Non-revenue state agencies and county government entities have made a little more progress in this area, with 33 percent of (non-revenue) state agencies indicating that they have implemented an enterprise payments hub, and 40 percent of county government entities (by far the highest adoption rate among the groups tracked) stating that they have implemented an enterprise payments hub.

Doyle says 4 factors could drive faster growth of enterprise payments hubs among governments:

1. Declining paper volumes. As government agencies achieve success with electronic payments, their existing paper-centric infrastructure becomes obsolete. "Paper will not go away any time soon, but there's no need to maintain equipment and applications designed to manage large volumes of paper payments," Doyle explains. "Moving forward, government entities will need an integrated payments platform that can scale up or down as needed. This type of payments processing platform operates like a utility that can be easily adjusted to changing payment types and volumes."

2. Focus on serving constituents. Implementing an integrated payments hub enables government entities to provide better service to their constituents, Doyle explains. Research can be performed from a single location. Posting is more accurate. And check images can be retrieved instantly.

3. Push to reduce bank fees and operations costs. With an integrated payments hub, government entities can consolidate their bank deposit files, putting them in a stronger position for negotiating bank fees. Inside government operations, an integrated payments hub helps government entities increase overall staff productivity by not requiring them to learn different applications for processing each payment type. Similarly, reports for staffing and efficiency can be produced from a single system, streamlining the generation of Key Performance Indicators each agency must produce.

4. Lower capital expenditures and ongoing costs. With the emergence of enterprise payments solutions that offer a Software-as-a-Service (SaaS) or hosted delivery model, government agencies can replace their aging systems with little to no upfront cost. This is a creative way to allow agencies without the budgeted dollars to replace antiquated legacy systems, Doyle says. "SaaS services also provide an added layer of security and compliance protection, starting with PCI compliance and SAS-70. This can significantly reduce risks and audit costs for government agencies." What’s more, leveraging a SaaS or hosted delivery model means government entities can offload the management of their IT infrastructure. This not only saves money, but also allows government entities to better focus on their core competency -- serving taxpayers. And this may be the biggest benefit of all.
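The consolidation idea behind an enterprise payments hub can be sketched as follows: payments arriving through different channels (lockbox checks, ACH, a web portal) are normalized into one internal record type, so that posting, research, and deposit consolidation all work from a single stream. The feed formats and field names here are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Payment:
    channel: str
    payer: str
    amount: float

def normalize(raw):
    # Map each channel's native record shape onto the hub's common record.
    if raw["source"] == "lockbox":
        return Payment("check", raw["remitter"], raw["check_amount"])
    if raw["source"] == "ach":
        return Payment("ach", raw["originator"], raw["amount"] / 100)  # cents
    return Payment("web", raw["user"], raw["amount"])

feeds = [
    {"source": "lockbox", "remitter": "ACME CO", "check_amount": 125.00},
    {"source": "ach", "originator": "JONES LLC", "amount": 250_00},
    {"source": "web", "user": "jdoe", "amount": 42.50},
]
hub = [normalize(r) for r in feeds]
print(sum(p.amount for p in hub))  # one consolidated deposit total: 417.5
```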

It's for these reasons that Doyle thinks government entities may adopt integrated payments hubs.

What do you think?

Scanner Demonstration Video

Posted by Mark Brousseau

Below is a link to a brand new video demonstrating ibml's scanners.

http://www.ibml.com/demovideo.php

Monday, September 13, 2010

How Do Apple, Ford, and Microsoft Survive In The New Economy While Others Crash?

Posted by Mark Brousseau

Six out of ten new businesses fail. Unemployment isn’t getting any better. The housing market is set for another bump in the road next quarter. And as if the cake needed icing, the FDIC is reporting that about half of America’s banks -- including the four largest -- are on the bubble, and may fail by the end of the year.

As serious people at serious companies are looking for answers to the dilemma, one expert wants them to focus on a principle that is often overlooked in hard times: innovation.

“The equation is simple: innovate or perish,” says Robert Brands, a veteran corporate executive. “At every major crossroads in the history of American business, innovation has been the driving force behind the companies that made it through the bad times. After all, as we all look for the hot new product or the ‘killer app’ in our respective industries and professions, we tend to overlook the fact that someone has to create or invent it first.”

Brands believes that innovation is the governing philosophy behind companies that succeed.

“Whether it is a multinational corporation or an entrepreneurial startup, innovation can help a business launch, recover or overcome even the greatest of competitive pressures,” he adds. “If you are a manufacturer, distributor, service provider, supplier, retailer or even a not-for-profit, the pressures of the new economy are worse than anything the business world has seen for decades. So, how do you get through it? Look at the companies that are prospering, despite the economy. Apple, Ford, Microsoft and others didn’t stand pat as the economy crashed. They reinvented themselves and their product and service lines. After falling behind to Japanese competition amid the GM bailouts, Ford went back to the drawing board on their line of cars and emerged stronger than before, having one of their best quarters ever. It wasn’t layoffs or the mitigation of risk that accomplished that. It was innovation, creating something new to satisfy its customer base.”

Brands wants people to expand their notion of innovation.

“When people think of innovation, many of them think of simple brainstorming for ideas,” he adds. “This is a fallacy. Brainstorming is just one small element of a much larger process. Innovation is NOT a tactic. It is a process, and if businesspeople follow the right steps, they can achieve innovation regularly -- not just when someone slips on the soap in the shower and the next killer app just comes to them as they put ice on the bruise on their head.”

Brands recommends some rules to govern that process.

“For instance, everyone wants to achieve that ‘a-ha’ moment, when they think they’ve struck upon an idea that could be big for their company,” he says. “Part of it centers on recognizing a need in the market place, but then combining all the elements and resources within your company to see if you have the ability to leverage existing research, development, contacts and distribution to fill that need. For instance, the iPod was an innovation that came about from Apple’s examination of the consumer’s desire to buy single songs instead of whole CDs, and the record industry’s inability to leverage the Internet as a viable delivery medium. Now, in reality, the process was far more complex than that simple sentence, but the essence of the process is there. The key to making innovation a profit center is to be able to sustain it through the entire life cycle of a business."

Brands concludes that "innovation should not be a one-shot deal.”

Wednesday, September 8, 2010

Google giveth then taketh away

Google’s test of “streaming search” not so short-lived

Google has just announced that its “streaming search” service, Google Instant, is coming out of limited beta testing and going live for all users.

According to Adam Bunn, head of search at independent search and social marketing agency Greenlight, when it comes to search engine optimisation campaigns (SEO), some websites may now suffer a drop in traffic. This service could also potentially result in complications for rank checking software and impact on search demand figures given by Google’s keyword tools, Bunn says.

With regard to paid search, Matthew Whiteway, director of campaign management (paid search) at Greenlight, says it could play havoc with an advertiser's Google Quality Score. Whiteway also says Google’s motives for doing this must be questioned. The “longtail” of search queries is becoming increasingly important, yet the cost-per-click (CPC) Google can charge for “longtail” keywords is significantly lower than for one- or two-keyword queries, Whiteway says. Therefore, the more people search with “longtail” queries, the less money Google can charge the advertiser, he explains.

Google’s development uses AJAX to dynamically serve search results as you type, Greenlight notes. Each time a new recognizable word or phrase is typed that changes the results set in a meaningful way, Google will fetch the search results for that word – without you having to hit “search.” So, if you intend to search for ‘scary books suitable for children,’ Google might first fetch results when you’ve finished typing ‘scary,’ then ‘scary book,’ then ‘scary books,’ and finally ‘scary books suitable for children.’
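A toy simulation of that behavior makes the mechanics clear. This sketch is simplified to fire a query only at complete word boundaries (the real service also fetches on partial words such as 'scary book'):

```python
# Toy "streaming search": issue a query each time the typed text completes
# a new word, rather than waiting for the user to hit "search."
def streaming_queries(typed):
    queries = []
    words = []
    for token in typed.split():
        words.append(token)
        queries.append(" ".join(words))  # one fetch per word boundary
    return queries

print(streaming_queries("scary books suitable for children"))
```

For a five-word query, the engine ends up answering five searches instead of one, which is the processing-power point Bunn makes below.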

Bunn says this is a mightily impressive display of processing power on Google’s part. For every search you do, Google may now have to process anywhere from a couple to half a dozen different searches, and it has to do so fast enough to keep up with your average typing speed -- on top of the fact that retrieving and sorting thousands of documents in a split second is already a modern marvel, albeit one that few people spend much time thinking about, Bunn adds.

What of the impact for SEO?
According to Bunn, SEO campaigns including long multi-word keyword variants may see a drop in traffic for those keywords as a result of streaming search. Why? Users may now find something to click on before completely typing their originally intended search term (depending, of course, on Google being able to provide accurate enough results at an earlier stage in the search). Consequently, to be visible/show up in search results, it may become more important for websites to optimise for the shorter, constituent parts of longer keywords, Bunn says.

“For example, if a website has optimised for and holds good rankings for ‘cheap car insurance UK,’ that term may lose search traffic as UK users find that the shorter ‘cheap car insurance’ returns several relevant looking results, negating the need to finish their sentence,” Bunn says.

Bunn points out that the constituent parts of longer keywords are often the types of generic keywords that are typically dominated by big brands and powerful sites with the cash to maintain rankings in an extremely competitive keyword space.

“So for smaller websites, this could well be a case of first Google giveth (the 'May Day update') then it taketh away (streaming search results). We’ll have to hold tight for the exact repercussions, which could also extend to complications for rank checking software (if AJAX is involved in retrieving search results) and impacts on the search demand figures given by Google’s keyword tools (if each stage in the streaming search counts as an impression)," Bunn says.

Ramifications for paid search
In relation to paid search, the question is whether Google will count each refresh/change of the search engine results pages (SERPs) as an impression for the advertiser. While some advertisers will believe that increasing the number of impressions/eyeballs that see their ad will help improve brand awareness and brand recall, from a pay-per-click (PPC) marketing perspective, this increase in unwanted impressions could play havoc with an advertiser's Google Quality Score, Greenlight says.

“At Greenlight, we are constantly looking for ways of reducing wasted impressions for our clients with the objective being to improve click through rate (CTR) and therefore relevancy, one of the most important factors of Google’s Quality Score,” says Whiteway. “If Google is going to count these dynamic changes/refreshes to the SERP then should we also expect to see some fundamental changes to the Quality Score algorithm, the keyword Match Types, or do we simply need to increase the number of negative keywords in the account to several hundred thousand? Only time will tell.”

Whiteway says Google’s motives for doing this must also be questioned. It has been suggested that as users become more internet savvy, the number of keywords used in each search query is increasing, he adds. For example, users looking for low annual percentage rate (APR) credit cards historically may have simply searched for “credit cards” and then filtered the results manually, whereas in recent years “longtail” queries such as “credit cards with low APR” have grown in popularity and importance, Whiteway explains.

So why would the “Google financiers” not like this “longtail” trend? Money, says Whiteway.

“The CPC that Google can charge for ‘longtail’ keywords is significantly lower than that on more generic (one or two keyword search queries). Therefore the more people search for ‘longtail’ search queries, the less money Google can charge the advertiser," Whiteway says. "With ‘streaming search’ therefore, Google is potentially ‘helping’ users find relevant results with less search term queries, thus increasing the number of clicks on generic terms and therefore increasing the CPC for the advertiser.”

Many would argue Google Instant is an example of Google flexing its technological processing power and helping users get results quicker, Greenlight notes. However, there must also be some form of financial benefit for Google in making such a dramatic change to the way it finds and displays results. Which explanation is true? Greenlight says we are unlikely ever to really know.

What do you think?

Tuesday, September 7, 2010

eForms Checklist: Find the Right Fit for Your Business

By Laurel Sanders, Director of Public Relations and Communications, Optical Image Technology (lsanders@docfinity.com)

“Work faster.”

“Get everything right the first time.”

“Outperform our competitors with better service.”

In challenging economic times, these goals are imperative. Yet achieving all three simultaneously can be challenging.

According to AIIM’s State of the ECM Industry 2010 report, 41% of businesses aren’t confident their digital information (excluding email) is accurate, accessible, and trustworthy -- a severe obstacle to efficiency. Forms are a small part of the web of business information. Yet when content is captured accurately and managed properly, eForms address the challenges of accuracy, accessibility, and trustworthiness while transforming service, increasing profitability, and encouraging sustainability. How? By:

... Capturing data quickly, consistently, and cost-effectively.
... Offering self service.
... Making the information captured on forms useful instantly, enterprise-wide.
... Establishing a framework for information governance.

However, just implementing eForms software doesn’t guarantee results. You must understand your business objectives and ensure the solution you choose will meet those needs.

This checklist will help you develop a customized requirements list to ensure your eForms solution meets your unique needs.

Form Design
Useful forms begin with good design, so the design function must be user friendly, flexible, and adaptable. Examine your broader organizational needs so your selected product meets requirements as your installation expands.

Will your solution let you:

___ Design custom forms, or will you be restricted to templates?

___ Add unlimited components to the design canvas, with form length expanding according to your needs?

___ Align, match, select, delete, or erase form components during the design process?

___ Decide the order in which form-related actions will be completed?

___ Save unfinished form designs so you can complete them later?

Also, are online user guides and tool tips available to guide you through every step of the process?

Applying Form Controls
Form controls help designers to ensure standardized documents are completed correctly. Ideally, they enforce behind-the-scenes rules, dictating how each form should be completed and used.

Can you apply:

___ Form controls wherever you want them, to regulate how forms are completed?

___ Data entry controls for each field you create, so you can ensure quality data input?

___ Validations that compare data entered on forms with pre-specified criteria for each field?

___ Data sources to the eForm, so data captured in pre-specified fields can be extracted from (or pushed to) other databases and applications, recycling information meaningfully and eliminating errors?
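The kinds of per-field data-entry controls and validations described above can be sketched as a table of rules that input is checked against before a form is accepted. The field names and rules below are illustrative, not drawn from any particular product.

```python
import re

# Each field carries a rule: input must satisfy it before the form passes.
FIELD_RULES = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "zip": lambda v: re.fullmatch(r"\d{5}", v) is not None,  # 5-digit US ZIP
    "amount": lambda v: v.replace(".", "", 1).isdigit(),  # decimal number
}

def validate(form):
    # Return the names of fields whose entered values fail their rule.
    return [f for f, rule in FIELD_RULES.items()
            if f in form and not rule(form[f])]

print(validate({"email": "user@example.com", "zip": "1684", "amount": "99.50"}))
# ['zip']  -- the four-digit ZIP is rejected; the user can be alerted
```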

Data Management
Data — the specific fields of information that you collect, such as name, contact information, dates, terms, and more — drives processes forward, enabling smart and timely decision making.

Can you:

___ Apply validations that compare entered data on a form against specified criteria and alert users when entered data is invalid?

___ Extract pre-specified form data as variables within a business process, allowing specific content to be available without making entire forms visible to users? (This is essential if sensitive information is stored on the form, and some of the data needs to be processed by employees who shouldn’t see every detail.)

___ Configure your ECM system to store form data in a third-party database or other external location after a form has been completed or submitted?

Form Administration and Security
Regulations and internal policies demand controlled access to sensitive information. eForms that are part of an integrated electronic document management (EDM) solution let you secure sensitive information while making sure those who need it have appropriate access.

Will your solution let you:

___ Assign forms to specific groups, controlling who can access, view, delete, annotate, or sign them?

___ Hide specific text on forms so only specified user groups can view it?

___ Configure actions that should occur whenever a specific type of form is submitted (send an email, launch a business process, send data from a form to another source, convert a form to PDF), based on the needs for that form?

___ Configure validation criteria and specific datasources for each form type?

___ Apply hot keys to certain actions as desired, to make work more efficient?

Form Use
End user needs vary greatly, and an eForms product should be adaptable enough to meet diverse requirements.

Will your forms product:

___ Convert forms to PDF when desired?

___ Provide helpful online user guides and tool tips to guide you through form completion and submission?

___ Apply drag-and-drop design components to assist end users in completing forms?

___ Index completed forms as PDFs, forms, or both, as desired?

___ Add a form (or its PDF) into a package of documents and send the complete package into a business process?

___ Search for completed forms?

___ Resize your windows and workspaces so you can work efficiently?

Launch a Business Process
Ultimately, you will most likely want eForms to launch routine business processes and dramatically increase efficiency. Will your solution:

___ Automatically launch a pre-specified business process when a form is submitted?

___ Amend/manage form data within a business process?

___ Make specific forms viewable at the correct time within a business process?

___ Use a submission to trigger an email message, accompanied by a link to the form?

___ Create drop-down menus to guide data entry and ensure relevant, accurate data is collected?
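A rough sketch of what submission-driven automation looks like under the hood, using hypothetical handler and queue names (a real eForms solution wires these up through configuration, not code):

```python
# Hypothetical actions triggered when a form is submitted.
# Handler, queue, and form names are illustrative only.
process_queue = []
outbox = []

def launch_process(form):
    """Queue the form's data for a pre-specified business process."""
    process_queue.append({"process": "expense-approval", "data": form})

def email_link(form, recipient):
    """Send a notification email containing a link to the form."""
    outbox.append({"to": recipient,
                   "body": f"A form awaits review: /forms/{form['id']}"})

def on_submit(form, reviewer):
    # Submission is the single trigger; the configured actions run in order.
    launch_process(form)
    email_link(form, reviewer)

on_submit({"id": "F-1001", "amount": 250.00}, "approver@example.com")
print(len(process_queue), len(outbox))  # 1 1
```

The point is that one submission event can fan out to several configured actions, which is what makes eForms a natural starting gun for a business process.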

End the content chaos
If you are already implementing EDM, eForms will help you to gain control over your content at the point of capture, where benefits are greatest. If you’re evaluating EDM for the first time, consider eForms and business process automation as part of your strategy from the beginning so you can extract the full value of digital information. Understand short- and longer-term needs so the technology you select supports your goals. Make sure your vendor is as committed to your success as you are, since a strong partnership dramatically improves business outcomes.

Fast, accurate, and customer-friendly service is the aim of any business. eForms, EDM, and good planning can help you to achieve all three. Now…get started!

Make More of Your Data-rich Systems to Meet Dodd-Frank Requirements

By Laurel Sanders, Optical Image Technology (lsanders@docfinity.com)

Remember Aesop’s fable, The Miser and His Gold? A miser buries his cache of gold coins under a tree, periodically unearthing them and marveling at his lustrous collection before hiding them again. One day, an onlooker notices. Shortly afterward, the fortune disappears. The miser’s opportunity to use his treasure is gone. The moral: “Wealth unused might as well not exist.”

The lesson applies to the valuable information systems you own, too. If they aren’t integrated to enable efficient sharing of your content everywhere it has value, their potential is wasted. Idle information might as well not exist.

Dodd-Frank: implications for the enterprise
If you’ve followed the latest financial publications, you’re aware of the Dodd-Frank Wall Street Reform and Consumer Protection Act. Like other current legislation, the new laws are designed to:


· Reduce fragmentation and complexity in data management;

· Demand data consistency enterprise-wide; and

· Increase organizational transparency.


Similar to recent healthcare regulations, Dodd-Frank has significant implications for data management across the enterprise. The good news: if you already have quality information systems, you may be able to meet numerous challenges without starting over—by giving your systems a common foundation.

Building on what you have
When your institution chose its core financial systems, line-of-business applications, email application and other software, significant deliberation probably preceded each purchase. Unless your systems are ancient (or worthless), we’ll assume they capture quality data and achieve what they were designed to do.

Challenges in meeting recent regulatory requirements arise from demands that exceed what your systems were designed to accomplish. Many solutions were intended to address departmental needs without a vision for enterprise-wide communication. Now, legislation is demanding an enterprise approach that:


· Unifies data classification practices;

· Certifies information accuracy and consistency;

· Standardizes reporting; and

· Creates uniform data governance frameworks.


Enterprise content management (ECM) software, when integrated with business systems, provides centralized, uniform access to diverse digital information -- no matter how it’s captured or where it resides. Think of it like a credit card. If you’ve traveled internationally, you know it’s challenging to manage purchases amid constantly changing currency. A credit card alleviates the aggravation, allowing diverse systems to communicate seamlessly. Instead of converting currencies, just swipe your card. The exchange is automatic; the transfer is understood. You get what you need, within moments, wherever you are.

The role of browser-based ECM
If your workers value their separate information systems – which they probably do – ECM doesn’t demand change. Instead, it enables secure 24/7 access to a centralized repository that connects authorized persons with all of the systems and information they’re allowed to see, and lets them use that information according to their permissions. ECM lets you dictate things like:


· Who can list or view specific document types;

· Who can edit, annotate, sign, or email them; and

· Who may purge or delete files.


Information becomes standardized, accessible via a single repository and a consistent interface. The system knows where all of your content resides, and which information belongs together, just as your credit card recognizes purchases you make and the countries, currencies, and US equivalencies each represents.

1. Unifies data classification practices
Uniform classification requires a strategic file plan alongside carefully conceived taxonomies that meet diverse needs. ECM provides the tools to execute that plan faithfully. Scans, bar codes, and online forms consistently follow your prescribed indexing rules, easing search.

Together, your indexing plan and ECM ensure:


· Metadata criteria are complete upon document capture (file type, lifespan, format, source, etc.);

· Data captured meets criteria for length, format, type, etc. (e.g., ID numbers requiring a pre-set sequence of digits/dashes);

· Data pertinent to search is complete and compliant, ensuring successful retrieval;

· Document types are segmented to ensure searches return relevant information.
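As one concrete illustration, the capture-time checks above -- required metadata and an ID number that must follow a pre-set sequence of digits and dashes -- reduce to simple rules. The field names and the ID pattern here are hypothetical, chosen only to show the shape of such rules:

```python
import re

# Illustrative indexing plan: metadata every captured document must carry,
# plus a made-up ID format (two digits, a dash, six digits).
REQUIRED_METADATA = ["doc_type", "source", "capture_date"]
ID_PATTERN = re.compile(r"^\d{2}-\d{6}$")   # e.g. 12-345678

def index_errors(doc):
    """Return the problems that would block indexing this document."""
    errors = [f"missing {f}" for f in REQUIRED_METADATA if not doc.get(f)]
    if not ID_PATTERN.match(doc.get("doc_id", "")):
        errors.append("doc_id must be two digits, a dash, then six digits")
    return errors

good = {"doc_id": "12-345678", "doc_type": "invoice",
        "source": "scan", "capture_date": "2010-09-30"}
print(index_errors(good))  # []
```

Enforcing rules like these at capture, rather than downstream, is what keeps searches complete and compliant later on.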


2. Encourages information accuracy and consistency
When ECM includes process automation, meaningful data is captured and re-used intelligently. Business process management (BPM) software evaluates your documents and information against your rules, ensuring speed and uniformity in routine decision making and exception handling.

Together, they let you:


· Associate, package, and flow related files for action;

· Pre-fill forms and documents with stored information, eliminating keying errors;

· Extract and push data from one source to another at specific points in recurrent processes.
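Pre-filling, the second item above, amounts to merging a stored record into a blank form. A minimal sketch, with field names that are purely illustrative:

```python
# Hypothetical customer record pulled from a repository.
record = {"name": "A. Customer", "account": "00123", "branch": "Main St."}

# Blank form fields; None means "still to be filled in".
form = {"name": None, "account": None, "branch": None, "signature": None}

def prefill(form, record):
    """Copy stored values into matching blank fields, eliminating re-keying."""
    return {field: (record.get(field) if value is None else value)
            for field, value in form.items()}

filled = prefill(form, record)
print(filled["account"])  # 00123
```

Because the values come straight from the repository instead of a keyboard, the keying errors the passage describes simply never occur.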


3. Enables standardized reporting
Just as your credit card bill summarizes your purchases--regardless of where and how they were made--ECM extracts data from multiple systems with which it is integrated so you can create holistic, complete reports. Instead of separate audits detailing customer transactions from various applications, everything is centralized, providing better insight. Automatic conversion to PDF and other formats ensures universal access while guarding against tampering.

4. Creates uniform data governance frameworks
Good governance--a central thrust of Dodd-Frank legislation--requires an IT infrastructure that supports fairness and uniformity in decision-making and implementation. For informed decision-making, data used to reach decisions must be accurate, timely, and appropriately accessible at the exact moment individuals need it. Information that no longer needs to be kept (and that could put you at risk) can be migrated, purged, deleted, or destroyed according to the law.

Wide-ranging document types, diverse users, ever-changing retention laws, and the challenges of overseeing them make quality governance one of the greatest enterprise challenges. ECM levels the playing field, ensuring organizational practices are upheld. Rather than subjecting your documents, information and policies to the preferences and personalities of departmental managers, they are subject to your rules. No favoritism. No oversights. No mistakes…and a thorough, digital audit trail of transactional activity verifies compliance.

Use what you have—better
Managing your content effectively is like managing your credit card: it requires forethought, planning, and procedural adherence. Don’t be miserly with your data; ECM ensures you use it while it’s timely and relevant.

ECM can’t do your planning, but it ensures policies and rules are honored faithfully without exception, enterprise-wide. I can’t speak for you, but when someone invents a credit card that knows every resource at my disposal and flawlessly honors my intent, I want one!

Saturday, September 4, 2010

Why Can't Tellers be Sellers?

By Vijay Balakrishnan, president of StratEx, LLC (vijay.balakrishnan90@gmail.com)

Stepping into a debate that is as old as retail banking is perhaps unwise. There are passionate adherents ranged on both sides of the question. To some, the issue is not whether tellers can be sellers, but whether they should be.

The question brings the raison d'etre of the retail branch network into sharp relief. Are branches retail storefronts with the primary mission to enhance customer relationships, or are they collection points for myriad transactions processed by centralized back office operations centers? Is the driving imperative one of customer intimacy, or does operational efficiency rule the roost?

A tilt towards operational efficiency has traditionally driven retail banking, with occasional overtures to the selling side of the equation. These overtures, however, tend to be fleeting, and with few exceptions, have not survived beyond some concerted marketing and employee incentive programs.

To understand why the push towards serving and selling the customer has not been sustainable, consider a few points. Making check deposits is by far the main reason customers visit a branch. When they do visit, the teller is the person they most often interact with. Regardless of all the training and incentives that may have been put in place, consider what tellers actually do. They are heads down punching numbers into keyboards (try counting the number of teller keystrokes the next time you're in a branch). They have barely enough time to complete the data entry and squeeze out a quick thank you before the next customer is at their window. Imagine a Neiman Marcus salesperson wordlessly packing what you've picked out and intently ensuring that the bow on the package is just right! Yes, the analogy is not quite right -- but you get the picture.

So despite many a marketing push, it is the fundamental transaction tether that yanks the teller back into the role of a frontline operations clerk -- the first cog in the vast infrastructure that we put in place to process paper checks, featuring planes, trains, automobiles and giant "paper factories".

There is an alternative, courtesy of the legislative cover of Check 21 and advances in imaging and recognition technology. Teller Capture allows the teller to drop the entire deposit into a small-footprint scanner and interact heads up with the customer, while an imaging application reads all the necessary information, ensures the transaction is balanced, and prints out a receipt when done. Teller Capture eliminates teller-induced data entry errors and catches math errors up front. This "ready-to-post" transaction at the very beginning of the deposit stream yields major efficiency savings further down the value chain. It is as close to straight-through processing as one can get in the check world.
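The balancing step is the heart of that "ready-to-post" transaction: the scanned items must sum to the declared deposit total. A simplified sketch, with amounts in cents to avoid floating-point drift; the figures are made up, and a real system would read them via MICR and amount recognition rather than hard-code them:

```python
# Illustrative deposit: three scanned checks and the total declared
# on the deposit slip, all in cents.
scanned_items = [12550, 4300, 899]
declared_total = 17749

def balance(items, declared):
    """Return (balanced, difference) for a scanned deposit."""
    diff = sum(items) - declared
    return diff == 0, diff

ok, diff = balance(scanned_items, declared_total)
print(ok, diff)  # True 0
```

When the difference is non-zero, the application can flag the math error at the window -- exactly the up-front catch described above -- instead of letting it travel downstream.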

"Not so fast," say some. "You want to make my tellers into check operators?" The reality is that the opposite is true. There is now evidence of major savings in teller time per deposit, including data from a Top 5 U.S. bank of having reduced keystrokes from 75 to 5!

"What about the cost of a scanner and software at every station?" challenge others. "It is really difficult to integrate these capture applications with teller systems." The cost per node for both hardware and software is steadily declining, making it well worth the while to examine the return on investment. The hard numbers on transportation savings, back office labor elimination, and funds availability make it interesting- leave alone the soft benefits in customer service and added sales. Capture systems are also increasingly being integrated into teller systems, both by teller vendors that have acquired check-capture technology, and pure play check imaging vendors that have certified their applications with leading teller vendors.

Coming back to the tellers-to-sellers paradigm, what do you do with the saved time? Do you use it to push even more transactions through? Do you have tellers refer customers to other branch personnel based on prompts from an integrated CRM system? Or do you have tellers take on more of a sales and service role themselves? Those are decisions that will be driven by your overarching strategic intent. Do you want tellers to be sellers in the first place? As you ponder that question, you may want to look at teller capture as an opportunity to cut the transaction tether that keeps pulling you back, yo-yo-like, to the paper factory of another era.

What do you think?

(Fr)agile Software Development

By Vijay Balakrishnan, president of StratEx, LLC (vijay.balakrishnan90@gmail.com)

Much of our world is made possible by software. There are myriad software systems that manage and move our money, keep track of our health histories, light our homes and offices, and indeed even enable you to read this post. While the sheer scale of accomplishment from zeros and ones flitting about at the speed of light is astounding, the manner in which some of these systems are developed, tested, and delivered raises a few questions.

Over the falls in a barrel. The early years of evolution in software development owed much to needs of the defense and aerospace industries. These were highly mission-critical systems that had to work correctly almost ten times out of ten. A linear process that involved detailed specifications, technical designs, strict coding discipline, reviews, and rigorous testing ensured the delivery of many high performance systems.

A version of this made its way into the commercial marketplace under the broad "waterfall process" moniker. The series of hand-offs, from product management, to architecture, design, development and testing, with intermediate review cycles, evoked a series of waterfalls, as in a cataract. While the process worked well for the most part, it lacked speed. The many steps limited organizations to one or two releases to the marketplace a year. It was difficult to nimbly respond to competitive and regulatory changes. If changes were not included early enough in the cycle, it was tantamount to missing an exit on a tollway, and waiting for the next one.

Sprints around the racetrack. In the 1970s, the automotive industry introduced the concept of "simultaneous engineering", where design engineers, manufacturing engineers, and quality control worked together in teams. As opposed to the linear, "throw it over the transom" model, this engendered both speed and sharing of ideas. That germ of an idea made its way into software as Agile Development. While there are many agile methodologies, the general concept is that specifiers, programmers, and testers work together in short, iterative "sprints" to produce executable software. Over multiple sprints, complete, ready-to-release applications can be built.

Lost in translation. While agile development has made it possible to release software more frequently, a few challenges have appeared on the way to nirvana. To the agile purists, I will grant that many of these have to do with incorrect interpretation and implementation, and perhaps not because of fundamental drawbacks in the methodologies. The challenges are amplified when you add offshore development where the advantage of co-located teams disappears. They are also most acute when software is developed for General Availability to a large and varied customer base, as opposed to internal use within an enterprise. Here are some of the pitfalls I have observed over the years:

What we have here is a failure to communicate. With apologies to "Cool Hand Luke", one of the main complaints I have seen is, "We don't know what is coming, and when!" We have moved from exhaustive, written requirements to writing nothing down. The refrain is that the sprint teams communicate with each other, and are on top of release content. Some will add that everything can be discerned from documentation within the code. The problem is that there are many stakeholders outside the sprint team, such as sales, marketing, professional services, and support. These people are not adept at reading code, and think in terms of functions and applications, as opposed to individual features. The result often is that market facing groups either oversell or undersell the product (more often the former!).

Who's on first? While sprint teams are cohesive and democratic, the flip side is that there can be no one at the helm. While the methodologies call for a "function customer" who signs off on software content and quality, this role is often missing in action. Either the role is completely absent, or it is relegated to a Product Manager who is more of a Product Marketer than someone who can go head-to-head with a technician. In the absence of this key role, many cooks jump in to influence the software broth in one direction or the other, resulting in content churn. The process is agile, yes, but highly unstable.

Tried and tested. Agile methodologies like test driven development put testing and quality at the center of the process. In practice, however, quality often ends up getting the short end of the stick. The very expectation of agility can compress timelines due to unrealistic promises made to customers. In the rush to "get it out of the door", thorough testing is skipped, and some vendors essentially do their quality assurance on the customer's dime, by continuously band-aiding software at the customer site until it works. In extreme cases, this becomes a license to hack with little regard to version control, belying the very concept of "General Availability". While poor quality is not limited to agile methods, the less rigid process restrictions can exacerbate the tendency in organizations that already have a culture of treating quality lightly.

Customs and traditions. In organizations that cater to customers of varied sizes, the concept of General Availability can be turned on its head. There is often the case of a large customer that wants software customized to meet a unique need. There are very few vendors that have the discipline to examine whether that particular capability warrants inclusion in the software delivered to the general marketplace. The path of least resistance is to include it as a base capability that is "configurable". Over time, the preponderance of configurable customizations makes the software incredibly difficult to implement and support. Again, the lack of a process to adjudicate the "base versus custom" question can result in a multi-headed Hydra, with hidden heads that can appear to bite you when you least expect it.

Distant shores. Every one of the problems discussed explodes in complexity when offshore development is involved. The communication challenge now includes time zones, national cultures, and language. The concept of sprint teams working in iterations is predicated on co-located personnel who can discuss, white-board, and resolve questions face-to-face. Getting this done with people somewhere else on the planet is very difficult, and contributes to hidden costs in offshore development that can obliterate the wage differential in the early stages of the offshore journey. The challenge can be overcome, but it takes special focus and attention to drive out the inefficiencies.

Brave new world. The benefits of agile development have ensured that it is here to stay in most environments. The word to the wise is that getting it to work right involves recognizing the pitfalls, and addressing them with the right stakeholders involved. I would not be surprised if many of you recognized your organizations in some of the challenges I have outlined. It is important to recognize that getting software development to work is not the purview of the programmers alone. Someone said, "War is too important to be left to the generals". If you'll allow the stretch, let me end by saying, "Software is too important to be left to programmers and methodologies alone."

What do you think?

Friday, September 3, 2010

ECM & Shared Services

By David Buttgereit, senior partner, the BPM Group, KeyMark

Succinctly and in the context of Shared Services, Enterprise Content Management (ECM) is the conceptual term for a range of tools, processes, and procedures used to capture, store, deliver, manage, and preserve business process documents.

But what does all this mumbo-jumbo really mean? Let’s break it down and see how it applies to the Shared Services Organization (SSO) model of business service delivery.

Nomenclature, with SSO Flair
• Enterprise – not just departmental in scope; at its core ECM is supportive of the SSO model.

• Content – the paper documents, faxes, e-mail messages, electronic forms, and – increasingly so – instant messages (IMs) that drive and contain supporting information about business processes and transactions.

SSOs by their nature require content but can drown in it too. For example, the content necessary to complete a complex Accounts Payable transaction may include a lengthy master contract, multiple purchase orders, receipt confirmations, invoices, and perhaps records of IM communications among purchasing agents, requesters, and vendors spelling out discount terms.

Importantly, content often spans departments and multiple lines-of-business software applications and needs to be managed and stored in a manner that makes it accessible to multiple systems and SSO staff members concurrently.

• Capture – the collection, electronic transformation (recognition, classification, validation, quality control, etc.), and delivery of content into a format usable by other computer processing systems.

Traditionally, capture has been thought of as the process of scanning paper documents, using Optical Character Recognition (OCR) and similar automated technologies to extract information from the documents, and then sending the resulting data to lines-of-business applications for transaction processing.

Fortunately, capture has now matured to the point that it can also handle additional sources (faxes, e-mails, IMs, etc.) and can be used to sort, classify, and authenticate complex document sets according to pre-defined sets of business rules. In an efficient SSO, these advanced capture capabilities mean that fewer hands need to touch content, greatly minimizing exceptions processing downstream.
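Rule-based sorting and classification of the kind described above can be sketched as keyword matching against the recognized text of a document. The rules below are purely illustrative, and production capture systems rely on trained recognition engines rather than bare keyword lists:

```python
# Hypothetical classification rules: document type -> trigger keywords.
RULES = {
    "invoice":        ["invoice", "amount due", "remit to"],
    "purchase_order": ["purchase order", "po number"],
    "contract":       ["agreement", "whereas", "term of this contract"],
}

def classify(text):
    """Return the document type with the most keyword hits, or 'exception'."""
    text = text.lower()
    scores = {doc_type: sum(kw in text for kw in kws)
              for doc_type, kws in RULES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "exception"

print(classify("INVOICE #443 -- amount due on receipt"))  # invoice
```

Documents that match no rule fall out as exceptions for human review, which is precisely the pile that advanced capture shrinks.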

• Store – once content has been captured, it must be properly indexed and securely stored – typically in an enterprise repository – for later processing.

• Deliver – the process of making content available to multiple lines-of-business applications while at the same time allowing it to be easily located and viewed through a variety of user interfaces. For example, an SSO Customer Service Representative may need to locate and view an outstanding HR document as part of a customer contact (a job promotion status inquiry, for example) at the same time the document is being actively used to drive corresponding payroll and benefits line-of-business transactions.

• Manage – has multiple meanings, from initiation and management of workflow processes that span multiple lines-of-business applications, to enforcing document security according to Health Insurance Portability and Accountability Act (HIPAA) and similar compliance rules, through providing metrics to Business Intelligence (BI) and Business Activity Monitoring (BAM) applications. It is here that most SSO business transactions are completed and the greatest efficiencies can be gained, with all other components of an integrated ECM system playing important supporting roles.

• Preserve – long-term management of content after it has been used for transactional purposes. Typically preservation is based on sets of Document and Records Management (RM) rules and is tightly controlled for both compliance and discovery purposes. Content may be maintained in an enterprise repository for a finite length of time (or in perpetuity, in some cases) or may be migrated to an off-line storage medium or external repository for archival and eventual destruction.

It’s clear from the above that ECM has great implications for SSOs that provide transactional business services across one or more organizations or agencies.

SSO Content Challenges
Without ECM, an SSO will likely:

• Handle paper documents, faxes, e-mails, e-mail attachments, and IMs in an ad hoc manual manner, slowing processing as transactions traverse departmental boundaries.

• Employ scores of data entry clerks to transcribe information from documents into lines-of-business applications – often multiple times and likely inconsistently.

• Manually validate data accuracy and integrity, with inevitable human errors causing significant rework, exceptions, and costs downstream.

• Inconsistently or poorly secure and protect the data and privacy of customers and business partners.

• Create multiple copies of documents as they traverse departmental boundaries as each department is skeptical that the next will adequately preserve documents if they are needed for review or rework (lengthy contracts can be prime offenders because they consume a significant amount of both paper and storage space).

• Incur high costs for paper, transport, duplication, and eventual destruction of documents.

• Complete complex transactions in a serial manner, even though many components could be processed in parallel if the supporting content were simultaneously available to multiple staff members and systems.

• Gather BI metrics in an inconsistent manner where the output from one process or system may not easily or directly map to the input of the next, losing continuity.

• Apply the perhaps flawed BI metrics as the basis for managing productivity, quality, staffing, load balancing, and Service Level Agreements (SLAs).

• Employ multiple manual searches when attempting to retrieve transaction, customer, or business partner content found on different documents and housed in different locations.

• Preserve historical business documents and apply RM rules inconsistently, if only because of the multiple copies stored in physical files in multiple departments.

ECM Solutions to SSO Content Challenges
With a properly implemented ECM, the SSO described above could:

• Efficiently capture and store all input types in a consistent, automated manner while speeding transaction initiation.

• Consistently extract business data from captured documents – once – and then feed multiple lines-of-business applications with precise input.

• Automate validation of data accuracy and integrity, reducing downstream rework costs for correcting input errors.

• Consistently secure and protect the data and privacy of customers and business partners, reducing business risk and easing inevitable audit burdens.

• Use a single, canonical set of documents for all business purposes and systems – simultaneously.

• Employ a single, comprehensive search and retrieval function, eliminating costly and time-consuming multiple searches and ensuring that only canonical versions of documents are returned.

• Eliminate most of the costs for document storage, transport, duplication, and destruction.

• Complete complex transactions in a parallel manner, supporting aggressive SLAs and leading to higher customer satisfaction.

• Leverage a set of cross-referenceable BI metrics to manage productivity, quality, staffing, load balancing, and SLAs.

• Appropriately preserve historical documents by consistently applying RM rules to the single set of canonical electronic documents.

Pulling it All Together
In conclusion, a robust and comprehensive ECM system can augment and link the multiple lines-of-business applications inherent to SSOs, while decreasing costs and increasing consistency and customer satisfaction. Careful implementation of an appropriate ECM system should be considered a best practice for any SSO.