Wednesday, October 27, 2010

7 Leadership Skills CIOs Need

Posted by Mark Brousseau

Technology is the single most powerful enabling force available in business today, but as executives and boards of directors recognize its potential, CIOs must have the right leadership skills in place to deliver on heightened expectations, warn Gartner, Inc. and Korn/Ferry.

There has never been a more energizing time to be a CIO, the analysts say. However, the flip side to this is that today’s most successful CIOs must deliver exceptional results.

In the recently published book “The CIO Edge – Seven Leadership Skills You Need To Drive Results” (Harvard Business Review Press, November 2010, $29.95), Graham Waller, vice president and executive partner with Gartner Executive Programs; George Hallenbeck, director of intellectual property development for Korn/Ferry Leadership and Talent Consulting; and Karen Rubenstrunk, formerly with Korn/Ferry’s CIO practice, examine the key skills CIOs need and how to develop them.

“CIOs understand they need to manage IT processes in order to deliver results and to meet key expectations. They also understand the need to lead people in order to deliver on those goals. However, what many don’t understand is the incredibly important interplay between the two,” says Waller. “Focusing on leadership and people skills - the ‘soft’ things that many CIOs tend to minimize in their quest to keep up with their day-to-day responsibilities of managing IT - is in fact the biggest determinant of their success or failure.”

IT executives who build the best relationships and can earn ‘followership’ – not only with their employees but, more importantly, with business partners both within and outside the organization – tend to make the most effective business technology executives.

“During the course of our research, we observed the CIOs with the best people skills used these soft skills to influence expectations well ahead of when priorities were set or a project began,” Hallenbeck says. “Before a dime was budgeted, or staff time allocated, they were meeting with their colleagues and engaging in candid two-way conversations that defined what success would look like. Then they delivered against the expectations they helped set and as a result, the organization felt the investment of time and money in IT was worth it. Soft skills produced hard results.”

Rubenstrunk says, “Cynics might argue that CIOs who excel at soft skills might deliver soft results. However, a clear pattern from our interviews showed that the best CIOs, the ones who excel at people leadership, also set the most aggressive goals and hold their people accountable to the highest performance standards.”

Following three years of data-driven research, Waller, Hallenbeck and Rubenstrunk distilled their findings down to the behavioral patterns and key skills they believe to be the most critical to success. Specifically, high-performing CIOs distinguish themselves by mastering the following seven skills:

1. Commit to Leadership First and Everything Else Second.
Gartner and Korn/Ferry’s research reveals that the highest performing CIOs are effective because they embrace the idea that everything they need to accomplish will be achieved through people, by people, and with people. They don’t pay lip service to that idea. They live it. They lead.

2. Lead Differently than You Think.
High-performing CIOs are incredibly complex and creative thinkers. Yet when the time comes to lead, they don’t rely on their superior ‘smarts’ and analytical skills to come up with the best possible solution. They act collaboratively.

3. Embrace Your Softer Side.
Effective CIOs manage the paradox of gaining more influence by letting go of control and allowing themselves to be vulnerable. In turn, that vulnerability enables them to create deep, personal connections — connections that provide the ability to inspire people both inside and outside their organization.

4. Forge the Right Relationships to Drive the Right Results.
This skill may not be surprising. High-performing CIOs spend a greater percentage of their time and energy managing relationships that run sideways: with internal peers, external suppliers, and customers. They purposely invest in these horizontal relationships, which form the foundation for driving extraordinary results.

5. Master Communication.
The best CIOs know that their colleagues - especially the people who work for them - are always watching. These executives understand they are always on stage. They take advantage of that situation by constantly reiterating core messages and values. Through their focus on clarity, consistency, authenticity, and passion, they make sure their message is not only understood but also felt. They want to communicate a feeling that compels people to take the right actions.

6. Inspire Others.
In exchange for a regular paycheck, most people will give an adequate performance. But they will only give their best work if they believe they are involved in something greater than themselves. The best CIOs provide a compelling vision that shows people how their enterprise wins in the marketplace and why their contributions are meaningful and valued.

7. Build People, Not Systems.
By developing people all around them, these CIOs increase their capability and capacity to deliver results. They also know that leaving behind the next generation of leaders is the best thing they can do for the organization—it will be their lasting legacy.

The three authors warn CIOs that mastering soft skills can never be a replacement for the key management aspects of the job. It is instead a powerful enabler and an amplifying force that allows individuals to exceed expectations and maximize the value from IT.

“All CIOs must deliver results. What distinguishes the best is how they do it: through people, by people, and with people,” Waller concludes.

Tuesday, October 26, 2010

E-Discovery: Addressing the Risks

By Rich Walsh of Viewpointe

At the heart of many risks facing companies today lurks e-discovery – the locating and accessing of electronically stored information (ESI) for purposes of litigation. ESI encompasses documents, emails, databases, and other digital records that lawyers may potentially use as evidence in legal cases.

Compounding the risk to companies is the volume of data subject to e-discovery. In fact, the Association of Certified E-Discovery Specialists, a group dedicated to dealing with this problem, calls the deluge of electronically stored material used as evidence in civil actions the single biggest storyline in the legal world today.

In a recent cross-industry report commissioned by the Deloitte Forensic Center, “E-Discovery: Mitigating Risk Through Better Communication,” just 43 percent of respondents felt their companies were even somewhat prepared for the e-discovery challenge. The report notes that in the e-discovery process, legal, IT and other departments – groups that don’t normally work together – are often thrown together “in a room” to do a difficult job under considerable pressure. A lack of common language and systems among these groups further muddies the process.

Where is all of this leading? The Deloitte report found that 49 percent of respondents expect their company’s IT department to have to work more on e-discovery efforts in the near future. On top of IT’s existing workload and limited budgets, new e-discovery work will further strain its priorities. In preparation, companies will need to figure out, sooner rather than later, where (and even whether) they have stored and can easily retrieve everything they might need to produce.

I’d love to hear from TAWPI members about how your companies may be preparing for this challenge. Any tips for colleagues? Share with us.

Rich Walsh is president, Document Archive & Repository Services at Viewpointe. He has more than 25 years of operational information technology experience.

Monday, October 25, 2010

Hey, America: TMI!

Posted by Mark Brousseau

A new national survey reveals half of Americans who use social networking sites have seen people divulge too much personal information, yet more than a quarter of Americans (28 percent) who use these sites admit that they rarely think about what could happen if they share too much personal information online.

Additionally, more than four in ten Americans (44 percent) are concerned that the personal information they share online is being used against them, and more than one in five (21 percent) Americans who use social networking sites believe that their personal information has been accessed by people who take advantage of weak privacy settings on social networking sites.

That's according to the 2010 Social Networking Survey.

“The Social Networking Survey reveals a clear disconnect between the privacy concerns of users and their actual behaviors and disclosures on social networking sites,” said Carol Eversen, vice president of Marketing at LexisNexis. “Nearly every week we hear about the negative consequences resulting from inappropriate disclosures and uses of personal information on social networking sites; however, the data suggest that Americans are not taking the necessary steps to protect themselves.”

More than half of Americans who use social networking sites have seen people divulge too much personal information online. In fact, the majority of Americans who use social networking sites admit that they have posted their first and last name (69 percent), photos of themselves (67 percent), or an email address (51 percent) on a social networking site. In addition, survey respondents have also shared the following details on a social networking site:

•Travel plans (16 percent)
•Cell phone numbers (7 percent)
•Home address (4 percent)

Determining how much is too much is still a struggle for many people. Nearly half of Americans (46 percent) agree that sometimes it is hard to figure out what information to share and what to keep private.

As many Americans struggle with what type of personal information to post online and keep private, they also seldom think about the consequences of sharing personal information online. More than a quarter of Americans (28 percent) admit they rarely think about what could happen if they shared too much personal information online.

A quarter of Americans (25 percent) who use social networking sites say that they have seen people “misrepresent” themselves (e.g., posted incorrect information and created fake profiles) and alarmingly, more than one in ten Americans (14 percent) who use social networking sites say that they have received communication from strangers as a result of sharing information on a social networking site.

Other backlash from using social networking sites includes:

•Someone posting unflattering pictures of them (11 percent)
•Having personal relationships with family or friends affected from revealing too much information (7 percent)
•Being scolded or yelled at for information they’ve posted (6 percent)

Surprisingly, 38 percent of Americans agree that people who share too much of their personal information online deserve to have their information used inappropriately.

Three-quarters of Americans (76 percent) worry that the privacy settings on social networking sites are not adequately protecting their personal information. In addition, more than four in ten Americans (43 percent) admit that they typically just click “agree” without reading the entire terms and conditions on social networking sites.

Meanwhile, many believe that their personal information may already be in the wrong hands. More than four in ten Americans (44 percent) are concerned that the personal information they share online is being used against them, and one in five Americans (21 percent) who use social networking sites believe that their personal information has been accessed by people who take advantage of weak privacy settings on social networking sites.

What do you think?

Friday, October 22, 2010

Utilities grow weary of siloed payments solutions

Posted by Mark Brousseau

Rising payment processing costs may be catching up to utilities, if this week’s 13th Annual Utility Payment Conference hosted by Dominion at the Hilton Hotel in Richmond, VA, is any indication.

Historically, utility payments were among the easiest and cheapest to process, thanks in large part to the high number of full-pay transactions that included a remittance document with an OCR scan line. In fact, TAWPI’s 2009 Payments Benchmarking Survey showed that the average cost per paper-based remittance payment remained unchanged at $0.15 per item between 2005 and 2009. But the emergence of new payment channels, such as the Web and credit card, has disproportionately “taken out” so-called “clean” transactions, leaving utilities with more complex paper-based remittances. At the same time, utilities must cost-effectively manage this growing number of payment streams.

Not surprisingly, utilities in attendance at this week’s payments conference were keen on finding solutions that would consolidate both paper-based and electronic payment streams onto a single platform. Mario Villarreal, president and COO of US Dataworks, Inc. – an exhibitor at the event – can’t remember a time when utilities have shown as much interest in enterprise payments solutions.

This follows a trend I observed this summer at the Federation of Tax Administrators conference.

“The utility market is clearly thirsting for enterprise payments solutions that provide them with greater visibility, efficiency and consistency in their revenue management,” Villarreal explained.

For utilities, centralized processing is one of the key advantages of an enterprise payments approach. Villarreal says centralized payments processing is especially appealing to utilities that are expanding their geographic footprint; one utility he spoke with at the conference operates in 33 states. With an enterprise approach, utilities gain better visibility into their payments, regardless of their footprint.

Utilities see similar benefits to centralizing the archival of their payments images and data, he adds.

But vendors may be slow to get the message, Villarreal added. “Most of the exhibitors at the conference are still taking a fragmented approach to automated utility payment processing.” The vast majority of the 38 exhibitors at the Utility Payment Conference strictly offer siloed solutions for payments applications such as remittance, cashiering, remote deposit capture or ACH processing.

In Villarreal’s eyes, that won’t solve the challenge utilities face in their payments operations.

“Until you consolidate systems and apply standard processes and controls across payment channels, you will always be saddled with inefficient and costly payments operations,” Villarreal concluded.

What do you think?

Wednesday, October 20, 2010

6 tips for messages that resonate

Posted by Mark Brousseau

Today we are overwhelmed with messages. Some are just 140 characters long. Others are much longer, but they are constantly bombarding us—trying to lure us to acquire and consume information (then repeat the process over and over). Technology—social media specifically—allows for constant communication, but easy communication doesn't necessarily translate to messages that are received, understood, and capable of driving action.

At a time when people are tweeting, blogging, emailing, and more 24/7, the best way to genuinely connect and create change, says author and CEO Nancy Duarte, is via truly human, in-person presentations. She stresses that everyone in every company should know how to present and communicate that company's messages with clarity and passion.

"Great presentations are like magic," says Duarte, CEO of Duarte Design, author of the award-winning book Slide:ology, and author of the new book Resonate: Present Visual Stories That Transform Audiences.

"It takes a lot of work to breathe life into an idea. Spending energy to understand the audience and carefully crafting a message that resonates with them means committing time and discipline to the process. Think about it this way: You likely spend countless hours collaborating and innovating to put forth really good ideas. You should spend just as much energy ensuring they are delivered in a way that is impactful. The payoff is that learning how to present in a captivating way—be it at a formal event or to a client across the conference room table—can be your competitive edge in a business environment where too many companies are confusing communication with noise."

So how can you make sure you present information in a way that truly resonates?

"If people can easily recall, repeat, and transfer your message, you did a great job conveying it," says Duarte. "To achieve this, you should have a handful of succinct, clear, and repeatable sound bites planted in your presentation that people can effortlessly remember. A thoroughly considered sound bite can create a Something They'll Always Remember (S.T.A.R.) moment—not only for the people present in the audience but also for the ones who will encounter your presentation through broadcast or social media channels."

To help you get started creating presentations that really stick with your audiences, here are a few tips on how you can incorporate repeatable sound bites:

Create crisp messages. Picture each person you speak to as a little radio tower empowered to repeat your key concepts over and over. "Some of the most innocent-looking people have fifty thousand followers in their social networks," says Duarte. "When one sound bite is sent to their followers, it can get re-sent hundreds of thousands of times."

Craft a rally cry. Your rally cry will be a small, repeatable phrase that can become the slogan of the masses trying to promote your idea. President Obama's campaign slogan, "Yes We Can," originated from a speech during the primary elections.

Coordinate key phrases with the same language in your press materials. For presentations where the press is present, be sure to repeat critical messages verbatim from your press materials. "Doing so ensures that the press will pick up the right sound bites," explains Duarte. "The same is true for any camera crews who might be filming your presentation. Make sure you have at least a fifteen- to thirty-second message that is so salient it will be obvious to reporters that it should be featured in the broadcasts."

Use catchy words. Take time to carefully craft a few messages with catchy words. "For example, Neil Armstrong used the six hours and forty minutes between his moon landing and first step to craft his historic statement," says Duarte. "Phrases that have historical significance or become headlines don't just magically appear in the moment. They are mindfully planned."

Make them remember. Once you've crafted the message, there are three ways to ensure the audience remembers it: First, repeat the phrase more than once. Second, punctuate it with a pause that gives the audience time to write down exactly what you said. Finally, project the words on a slide so the audience receives the message visually as well as aurally.

Imitate a famous phrase. "Everyone knows the Golden Rule," says Duarte. "'Do unto others as you would have them do unto you.' Well, an imitation of that famous phrase might be 'Never give a presentation you wouldn't want to sit through yourself.'"

"The future isn't just a place you'll go," says Duarte. "It's a place you will invent. Your ability to shape your future depends on how well you communicate where you want to be when you get there. When ideas are communicated effectively, people follow and change. Words that are carefully framed and spoken are the most powerful means of communication there is."

What do you think?

Investment in knowledge workers critical to economic recovery

Posted by Mark Brousseau

One of the keys to economic recovery lies in aggressive investment in Social Business Systems designed to dramatically improve the productivity of middle-tier knowledge workers.

That’s according to a new report by AIIM and noted author Geoffrey Moore (Dealing with Darwin, Crossing the Chasm, Inside the Tornado, The Gorilla Game and Living on the Fault Line).

These "Systems of Engagement" enhance the ability of knowledge workers to quickly cooperate with each other in order to improve operating flexibility and customer engagement, the report says.

"We have spent the past several decades of IT investment focused on deploying 'systems of record.' These systems accomplished two important things," notes Moore. "First, they centralized, standardized, and automated business transactions on a global basis, thereby better enabling world trade. Second, they gave top management a global view of the state of the business, thereby better enabling global business management. Spending on the Enterprise Content Management technologies that are at the core of Systems of Record will continue -- and will actually expand as these solutions become more available and relevant to small and mid-sized organizations. However, there is also a new and revolutionary wave of spending emerging on Systems of Engagement -- a wave focused directly on knowledge worker effectiveness and productivity. Social Business Systems are at the heart of Systems of Engagement."

According to AIIM Chair Lynn Fraas, Vice President of Crown Partners, "Social Business Systems provide a means for organizations to build on their investment in content management solutions. Increasingly, Systems of Record have become a necessary but not sufficient prerequisite for business success. In the future, organizations will differentiate themselves based on how well they deploy Social Business technologies to improve organizational flexibility and better engage customers. These Social Business technologies are transforming customer engagement through such consumer facing tools as Facebook, LinkedIn, and Twitter. They are simultaneously creating new models of employee and partner collaboration, cooperation and conversation within organizations -- models that will eventually replace e-mail as the primary means of internal collaboration."

According to Moore, "The first wave of spending left knowledge workers mostly on their own. We gave our workers laptops, connectivity, email, and the Office suite, and told them to go be more productive. The world of consumer social technology has given our workforces a taste of what is possible beyond this kind of rudimentary e-mail driven collaboration. Given the pressures that global business models are putting on collaboration and coordination across enterprise boundaries, the demand for increased capabilities is escalating rapidly. The implications of this for IT organizations and CIOs are revolutionary -- organizations need to quickly get in front of this curve or they run the risk of getting run over by it. We are on the cusp of a new wave of investment in Social Business Systems that will focus on providing knowledge workers with the tools to collaborate with a business purpose."

"We are not just talking about collaboration for collaboration's sake," concludes AIIM President John F. Mancini. "Nor are we talking about companies tentatively setting up Facebook fan sites or Twitter accounts to appear to be 'social.' We are talking about the strategic deployment of Social Business Systems that can help organizations improve the flexibility and responsiveness of their core processes and be more responsive to customers. As organizations implement these Social Business Systems, they need to meet three criteria: 1) How to do so quickly; 2) How to do so responsibly; and 3) How to do so in a way that achieves a business purpose."

What do you think?

Book says to get ready for the next boom (really!)

Posted by Mark Brousseau

A bumper sticker popular in West Texas during the oil bust of the early 1980s went something like this, “Please God, just give me one more boom—I promise not to blow it this time.”

Today, millions of people around the world may be having similar thoughts.

In his book to be published in November, Jack W. Plunkett, a widely followed analyst of global trends, writes that massive changes in America and around the world will bring on a sustained period of economic growth. In The Next Boom, he argues that we are on the verge of developments that will boost job creation, investment and international trade over what he calls the “near future,” from 2013-2025.

“The next boom is already rolling down the tracks in the emerging world,” Plunkett says. “America will get on board shortly.”

The book is subtitled, “What you absolutely, positively have to know between now and 2025,” because Plunkett believes that managers, investors, entrepreneurs and leaders need to understand the changes that will soon occur in order to perform effectively. He presents a panorama of developments in areas including energy, healthcare, education, demographics, global trade, technologies and the rapidly-growing global middle class—showing how trends in America and around the world have tremendous synergy that will lead to a surge in business.

Plunkett, who describes himself as a “pragmatic optimist,” explains that “the coming boom will be supported by three building blocks. First, consumers in America are building savings and becoming financially prudent, while population growth is expanding markets for businesses. Next, global trade is about to enter an evolved, vastly higher level while the middle classes in emerging nations are soaring. Third, advanced technologies will boost the global economy in an unprecedented manner that will make the last technology boom seem tame.”

What do you think?

Monday, October 18, 2010

Dodd-Frank to Usher in 'Decade of the Whistleblower'?

Posted by Mark Brousseau

When President Obama signed the Wall Street reform bill into law on July 21, two prominent attorneys say he likely ushered in what might be called "the decade of the whistleblower"—an era marked by a flood of federal investigations sparked by bounty-hunting employees looking to cash in on rewards that, in some cases, could turn them into instant millionaires.

Indeed, the Dodd-Frank bill became law just three months ago, but plaintiff's firms already report an astronomical jump in calls from would-be whistleblowers, note LeClairRyan attorneys James P. Anelli, a veteran labor and employment attorney with decades of experience representing management, and Carlos F. Ortiz, a seasoned white-collar defense attorney who served as a federal prosecutor for more than 15 years. Both attorneys are shareholders in LeClairRyan, based in the firm's Newark, N.J., office.

While the Dodd-Frank Act has been widely discussed, its extremely significant whistleblower provisions have gone nearly unnoticed, the attorneys say. And yet, under those provisions, whistleblowers who provide information that exposes SEC violations will get up to 30 percent of fines exceeding $1 million. "Bear in mind that recent fines involving violations of the Foreign Corrupt Practices Act (FCPA) have reached up to $100 million," Ortiz notes. "The fallout from these whistleblower provisions will be huge. This is an incredible incentive for employees who are looking to get rich to do all they can to gather information on, and report, potential violations by their employers. Why would they go through existing compliance hotlines when they can contact a plaintiff's attorney and pursue such potentially lucrative payouts?"
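
To put the bounty arithmetic in concrete terms, here is a minimal sketch. The 30 percent ceiling and the $1 million threshold come from the provisions as described above; the fine amounts are hypothetical examples, not figures from any actual case.

```python
# Sketch of the maximum whistleblower award under the provision described above.
# Assumes the simplified rule stated in the article: up to 30% of the fine,
# payable only when the fine exceeds $1 million.

def max_bounty(fine: float) -> float:
    """Return the largest possible award for a given fine amount."""
    return 0.30 * fine if fine > 1_000_000 else 0.0

# A $100 million FCPA-scale fine could yield an award of up to $30 million.
print(f"${max_bounty(100_000_000):,.0f}")  # $30,000,000
# A fine below the threshold yields nothing.
print(f"${max_bounty(500_000):,.0f}")      # $0
```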

Generally speaking, the scope of previous SEC whistleblower laws was limited to cases of insider trading. Dodd-Frank, which will be administered by the newly created Bureau of Consumer Financial Protection, applies to all potential SEC and commodities-trading violations. For a variety of reasons, it will affect a broad swath of both private and public entities, Anelli notes. "In the old days, whistleblower laws applied to Wall Street traders using insider knowledge to swap 'hot stock tips' with each other, but the new framework is quite broad," he explains. "It applies to virtually any company that deals with consumer credit, loans or property in any capacity, including mortgage brokers, financial advisors and credit-counseling services."

Ortiz says public companies that do business overseas could be forced to deal with an upsurge in employee-generated complaints under the FCPA (the conduct of foreign intermediaries, for example, is already under close federal scrutiny). But public companies are not the only ones that will be affected by the bounty-hunting provisions, Ortiz warns: their subsidiaries and privately held competitors might also come under closer federal scrutiny.

"Let's assume your company is privately owned and does business in Malaysia," Ortiz says. "If your chief competitor in the market is a publicly-traded American company that, thanks to a whistleblower complaint, becomes the target of a federal investigation, the Department of Justice might launch a broader 'industry probe.' DOJ might say, in effect, 'Now that we know Company X was bribing officials in Malaysia to get work, let's investigate all of its competitors.'"

Moreover, Anelli says the new whistleblower provisions apply to all of the subsidiaries of any public company. "A large public company might have 100 subsidiaries, and as long as the financial information of those subsidiaries is used in its consolidated financial statement, those entities are covered under this law," he says. "The 'Wall Street reforms' actually have a reach that is far beyond the publicly-traded realm."

The potential stakes, the attorneys note, are high: Federal enforcement actions have been increasingly aggressive in recent years, with approximately 150 companies already under investigation for FCPA violations and a growing number of individual executives being singled out for prosecution. "The reforms included a burden-shifting framework that is favorable to employees," Anelli concludes. "Under this framework, employees in many instances will now be able to show that they meet the burden of proof that is required to recover their cut of the eventual fine. Because of the amounts involved, whistleblower cases are going to turn into big business for plaintiff's law firms. As more whistleblowers start making big bounties—and headlines—the number of investigations will only grow. Careful preparation clearly is in order."

What do you think?

Tuesday, October 12, 2010

The Proper A/P Toolkit

By Bruce Bourdon, CPCP
Vice President, Healthcare Channel Sales Manager
U.S. Bank Corporate Payment Systems

Two key challenges face healthcare accounts payable departments today: Shrinking profit margins due to rising costs, and decreased cash flow due to slower collections and reimbursements.

The cash flow pipeline often plugs up when the healthcare provider is unable to extend payment terms with its top suppliers. Operational costs, meanwhile, have been soaring due to the high cost of printing and mailing paper checks, and of re-issuing and re-mailing checks that get lost. Finally, AP staff spend far too much time researching vendor inquiries about the status of payments they are owed.

If any industry could stand to benefit from going paperless, it’s healthcare. Yet, a 2010 U.S. Bank/IAPP survey showed that 61 percent of all healthcare payments today are made by paper check. A similar survey, this one by PayStream Advisors in late 2009, found that 68 percent of all invoices are traded by paper, and only about 25 percent of all purchase orders are sent electronically to suppliers.

That’s about to change. The same U.S. Bank/IAPP survey that showed such a high rate of paper check payments also predicts a two-thirds reduction in check payments and a three-fold increase in the use of purchasing cards over the next three years, based on feedback from respondents.
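
As a back-of-the-envelope check on what that projection implies, the two survey figures above can be combined as follows. The 61 percent baseline and the two-thirds reduction are the cited survey numbers; the simple multiplication is this blog's illustration, not a figure from the survey itself.

```python
# Combine the two survey figures cited above into a projected check share.
check_share_today = 0.61        # 61% of healthcare payments made by paper check
projected_reduction = 2 / 3     # predicted cut in check payments over three years

check_share_in_three_years = check_share_today * (1 - projected_reduction)
print(f"Projected paper-check share: {check_share_in_three_years:.0%}")  # about 20%
```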

Some may wonder what is taking the healthcare industry so long to jump on the technology conversion bandwagon. The answer: it is hampered by many of the same roadblocks other industries have faced. Namely, perceived external barriers, such as suppliers’ limited willingness or capability to handle e-payments, and perceived internal barriers, such as the high cost of conversion to e-payments or worries about the organization’s own capability to manage the transition.

Such concerns are often overblown. The cost of conversion, for example, is dwarfed by the savings realized over time, according to recent studies. To the extent that it’s measured at all, cost per paper invoice can vary from a dollar to over $15, according to the PayStream Advisors survey. Interestingly, about half the companies surveyed have no idea what it’s costing them to process each paper invoice.

Electronic processing makes costs much more transparent and easier to measure, making it easier to spot cost bottlenecks and act on them. Aberdeen Group has shown that electronic invoice processing shaves $6 to $7 off the cost of processing each invoice. How? By accelerating the approval cycle, reducing the number of lost and missing invoices, reducing the number of “exceptions” and, ultimately, reducing FTE requirements or allowing staff to be redirected to more value-added activities.
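To make that arithmetic concrete, here is a rough sketch in Python. The $6.50 savings figure is simply the midpoint of the $6-to-$7 Aberdeen range cited above; the paper cost per invoice and the annual invoice volume are hypothetical assumptions for illustration, not figures from any survey.

```python
# Rough savings sketch based on the per-invoice figures cited above.
PAPER_COST_PER_INVOICE = 12.00   # assumed, within the $1-to-$15 range reported
SAVINGS_PER_INVOICE = 6.50       # midpoint of the $6-$7 Aberdeen range
ANNUAL_INVOICE_VOLUME = 50_000   # hypothetical mid-size provider

annual_savings = SAVINGS_PER_INVOICE * ANNUAL_INVOICE_VOLUME
electronic_cost = (PAPER_COST_PER_INVOICE - SAVINGS_PER_INVOICE) * ANNUAL_INVOICE_VOLUME

print(f"Annual savings from e-invoicing: ${annual_savings:,.0f}")
print(f"Remaining annual processing cost: ${electronic_cost:,.0f}")
```

Even with conservative assumptions, the recurring savings quickly dwarf a one-time conversion cost, which is the article's underlying point.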

Annapolis Consulting puts it this way: Automation increases ease of use, ease of use increases adoption, adoption increases on-contract spend, on-contract spend enhances visibility, and visibility reduces wasteful spend. Just as important, visibility enhances leverage when it comes time to negotiate contracts with suppliers.

Today’s payables toolkit brims with options for the healthcare provider, from Electronic Invoice Presentment and Payment (EIPP) to a wide array of paperless e-payment options including commercial cards, virtual or “ghost” card accounts, wire payments and Automated Clearinghouse (ACH). End-to-end automation is both possible and achievable. It’s easier than ever to establish e-payments as the standard for conducting business with your key suppliers.

Friday, October 8, 2010

Calculating ERP TCO

By Erik Kaas

It's no secret that ERP is a major investment. ERP systems are company-wide and have long-term implications for the financial, human resources and information technology departments and various other aspects of the business as a whole - which is why careful upfront planning is so critical.

Total cost of ownership, or TCO, analysis can help business owners determine how much it will really take to make their ERP implementation projects a success.

Undertaking ERP implementation can be a risky decision, but TCO analysis is designed to help mitigate that risk by preparing a company for all the costs of ERP ownership - not just the obvious ones.

Done right, TCO analysis will help companies separate a good ERP investment from a bad one. However, a high TCO doesn't necessarily signal a poor investment if the corresponding returns are high enough to offset the expenses. The ratio between costs and rewards is more important than the numbers alone.

TCO begins with an estimate of all the direct and indirect costs associated with ERP implementation, including the cost of the software itself, maintenance costs, operational expenses, upgrades and eventual replacement. Because this involves making projections and assumptions about the future, TCO analysis includes several alternative scenarios to achieve the most accurate predictions possible.

One industry that is heavily invested in TCO analysis is automotive and transportation. Owning a vehicle comes with a few obvious costs - the initial purchase price, for example - and a whole lot of hidden costs that accrue over time. First, just driving the vehicle off the dealership lot results in a significant loss of value, which means that even if car owners sold their vehicle mere days after purchasing it, they could not sell it for the price the dealer did. Second, owning a car comes with ongoing responsibilities: owners need to pay for maintenance and check-ups, buy new tires every few years and, of course, pay for gasoline. All of these factors combine to give an estimated TCO for owning a car - and that figure is going to be significantly more than the car's MSRP on the lot. A similar line of thinking can be applied to determine the cost to a business of owning a jet or a yacht, for example.

The same principle goes for ERP software. The cost of owning an ERP system is likely to end up being significantly more than the sticker price on the software, but companies that understand, prepare for and budget for these expenses will find that their ERP systems save them a lot more money than they cost.

The basic tenet of TCO is this: You cannot manage what you do not measure.

There are five major components of TCO analysis - acquisition, implementation, operations, maintenance and replacement. These five components represent the five life-cycle stages of an ERP system, and each one is associated with specific costs. Understanding all of these costs - in other words, planning beyond simply the initial ERP software costs - is one of the best ways that companies can prepare themselves for ERP and better their chances of becoming an ERP success story.

A graph of these expenses often resembles the Nike "swoosh" logo. There is an initial peak in costs when the software is first purchased and implemented, a dip as it begins running smoothly, and then a steady rise as the system becomes older, requires more maintenance and is eventually replaced. This is the natural cost cycle of an ERP system, and budgeting accordingly will help businesses steer clear of any unpleasant cost surprises.

A second critical part of TCO analysis is determining the direct and indirect costs and risks associated with ERP systems, and managing and controlling them accordingly. Direct or budgeted costs include spending on clients, servers, peripherals and networks, along with the capital, fees and labor in each area. Indirect costs are things like downtime and service to end users - costs that can be hidden and difficult to measure.

Once all these costs and risks are known, TCO analysis runs a series of what-if scenarios to determine the implementation strategy that will yield the lowest cost of ownership and the highest potential reward with the fewest risks.
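The five-stage model and what-if comparison described above can be sketched in a few lines of Python. The stage names come from the article; the dollar figures and the two scenario labels are invented examples, not vendor data.

```python
# Sketch of the five-stage TCO model: sum assumed costs for each
# life-cycle stage, then compare what-if scenarios.
from typing import Dict

STAGES = ("acquisition", "implementation", "operations",
          "maintenance", "replacement")

def total_cost_of_ownership(stage_costs: Dict[str, float]) -> float:
    """Sum the costs across the five ERP life-cycle stages."""
    return sum(stage_costs[stage] for stage in STAGES)

# Two hypothetical what-if scenarios (assumed multi-year dollar figures):
on_premise = {"acquisition": 400_000, "implementation": 250_000,
              "operations": 350_000, "maintenance": 300_000,
              "replacement": 150_000}
hosted = {"acquisition": 100_000, "implementation": 150_000,
          "operations": 700_000, "maintenance": 100_000,
          "replacement": 50_000}

for name, scenario in (("on-premise", on_premise), ("hosted", hosted)):
    print(f"{name}: ${total_cost_of_ownership(scenario):,.0f} total TCO")
```

Note how the cheaper-to-acquire scenario shifts cost into operations - exactly the kind of trade-off a what-if comparison is meant to surface before the money is spent.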

Evaluating the TCO is the first step to understanding the potential return on investment, or ROI. Once a company is prepared for all the expenses of ERP - from the software to the maintenance to the upgrades - it can begin reaping the significant financial benefits without worrying about unexpected costs.

Erik Kaas is Director of Product Management for Mid Market ERP products at Sage. He is responsible for managing the product line life cycle from strategic planning to tactical activities. Erik manages a team of product managers responsible for specifying market requirements for current and future products.

Thursday, October 7, 2010

26 Tips to Remember When Classifying Documents

By Jim Thumma
Vice President of Sales and Marketing
Optical Image Technology

Before the inception of electronic document management (EDM), most of us organized documents alphabetically by subject. As EDM continues to redefine how we file and store important information—classifying documents by their type and use or by vital content that can be searched—alpha listings are slowly becoming obsolete. Yet humans often learn best by association. Thus, this alpha listing seems a viable way to provide useful tips for transitioning from a paper-based filing system to electronically indexed documents. The path is fairly straightforward, but unseen obstacles can play havoc with your intent.

With this cursory nod to the past and both eyes fixed on your future, we’re sharing our experience helping customers in the form of 26 ‘directives’. A bit presumptuous, perhaps? Maybe. After all, we can’t make you take these steps, but if you do, you’ll be off to a good start. If you don’t, well… that’s for another article. Take heed!

Analyze your document types. Who will use each? Which content is important to each type of end user in your department? Which content is valuable to users in other areas of the business?

Be a good listener. If you want your business to run smoothly, vital content must be available to the right people when they need it. To build a strong indexing plan, listen first and make decisions later.

Classify your information with process automation in mind. Which routine processes depend on each document type? What content must be available, and at what point in each process is it needed?

Don’t under- or over-index. Indexing too little information makes future search challenging or futile. Classifying extraneous information that no one cares about makes searches slow and cumbersome.

Educate your end users. Show them how EDM will help them to succeed in their jobs. Fear breeds doubt; insecurity promotes lackluster projects. Address users’ fears. Provide sufficient training.

Find an indexing structure that meets your needs. Should searches return direct hits or just narrow your search results? Consider your resources. Simplified indexing schemes help if resources are stretched.

Give workers adequate time for training. Although a configurable and user-friendly solution should demand minimal schooling, everyone’s needs are different. Hire temporary help to get other jobs done.

Hire outside know-how where you need it. Some vendors and consultants conduct document inventories, create indexing schemes, and more. Know your limitations. Plan accordingly.

Identify which documents are non-essential. If a doc type isn’t needed for business, legal, historical, or reference purposes, it’s probably not worth keeping. Streamline the clutter before you start indexing.

Join in the conversation. Employees need to see their managers stand behind and support document management projects they are expected to embrace. Don’t be invisible. Show enthusiasm.

Keep pace with project timelines. Everyday fires of business can draw workers away from a project. As business needs continue to advance and change, your project may risk becoming irrelevant.

Leverage all available resources to support discoverability. Capture and index pertinent emails, faxes, images, and documents into EDM. Indexing documents into one central repository eases search.

Minimize manual data entry. Wherever you can, standardize and automate indexing using data captured in document scans, bar codes, and online forms. This reduces the likelihood of human errors.

Note who needs access to which information and decide how they will be able to use it. Consider who should be allowed to list, amend, annotate, save, or delete documents. Configure security accordingly.

Organize documents by type. Use batch scanning to save time. Before indexing, consider whether each type should be scanned as-is, or whether pages should be combined or split into multiple documents.

Populate from existing sources and third-party apps such as customer/vendor databases and accounting systems. Re-use data during indexing to reduce duplication and conflicting or erroneous information.

Question your document types as well as your mode of filing them. Work with employees to determine more effective ways to file, search, and retrieve. Don’t assume the status quo still makes sense.

Remember that reports require specific content. Legislative requirements, audits, and quality control reports may require data you otherwise wouldn’t consider indexing. Revise your plans accordingly.

Standardize data collection by providing drop-down boxes, tips for data entry (e.g., the correct format for a date or number sequence), etc., to encourage accurate input. Your end users will appreciate it.

Test-drive your file plan parallel to existing systems before going fully digital. Ensure diverse users can find documents and information they need to work efficiently. Adjust your indexing scheme if needed.

Unearth inconsistencies between departments regarding data collection practices. Are middle names or initials used? Are PIN numbers a separate field or concatenated with last names? Standardize. Now.

Verify questionable scans and imported files immediately. Quality systems should identify problematic files; a smart indexing plan is worthless if it returns useless images. Re-scan/re-import, then index.

Weave an imaginary line through your business, showing where each doc type is used for decisions or processing. Understand how content is used. Then re-examine your plan to ensure successful search.

X-ray your documents: study them closely. Now is the time to streamline. Can document types be combined or eliminated to streamline data collection and reduce duplicate or conflicting information?

Yammer no more. With the advanced technology that’s available, there is no reason to lose a document…ever. Test, test, test. If you don’t get expected search results, go back to the drawing board.

Zero in on effective change management from day one. Communicate goals and plans. Collaborate. Ensure all ideas are heard. Train employees well. Mark milestones when they are met. Celebrate!

How Technology Came to Rule the Legal World

By James D. Shook, Esq.
Director of E-Discovery and Compliance
EMC Corporation

Ten years ago, few people, even lawyers, knew much about electronic discovery (e-discovery)—the process of finding, preserving, processing, and producing electronic information that is relevant to a legal dispute. Today, it’s difficult to find anyone who is not at least conversant with the concept, thanks to many high-profile cases and countless articles in both IT and legal journals. And yet even with all of the changes that we have already seen, the next ten years are likely to produce even more.

An extra “e” transforms “discovery”
The U.S. legal system requires that each party in a civil dispute provide the other party with all information, both good and bad, that is relevant to the case. This part of the litigation process is called discovery.

In the “old days”—which in technology terms means before 2000—the discovery process focused primarily on paper documents such as contracts, notes, files, and correspondence, including letters and memoranda. As businesses began using more technology, especially e-mail, the majority of that information shifted from paper to electronic format, and e-discovery was born.

The rules of discovery never specifically included—or excluded—electronic data, creating inconsistencies and confusion. To address this issue, the Federal Rules of Civil Procedure were amended in December 2006 to specifically include electronic data, defined as electronically stored information or ESI. Although the FRCP only governs disputes in federal courts, it strongly influences state courts, and the rules spread quickly.

Almost overnight, IT systems such as e-mail servers became concerns for lawyers, many of whom are notoriously techno-phobic. Organizations that failed to meet their e-discovery obligations faced the risk of embarrassing and costly sanctions from the courts. Simply collecting and preserving everything was cost-prohibitive. A frequently cited study found that it costs almost $20,000 to have lawyers review a single gigabyte of data (which may seem reasonable since 1 GB represents about 75,000 pages). Extrapolating those costs across hundreds of gigabytes, or even a terabyte or more of data, scared most organizations—and they started looking for a better way to manage both the e-discovery process and their electronic information.
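A back-of-the-envelope sketch shows how quickly those per-gigabyte review costs compound. The $20,000-per-gigabyte and 75,000-pages-per-gigabyte figures come from the study cited above; the collection sizes are hypothetical examples.

```python
# Extrapolating the cited review cost (~$20,000/GB, ~75,000 pages/GB)
# across progressively larger hypothetical collections.
REVIEW_COST_PER_GB = 20_000
PAGES_PER_GB = 75_000

for gigabytes in (1, 100, 1024):  # 1 GB, 100 GB, 1 TB
    cost = gigabytes * REVIEW_COST_PER_GB
    pages = gigabytes * PAGES_PER_GB
    print(f"{gigabytes:>5} GB ~ {pages:>13,} pages -> ${cost:>13,} to review")
```

At a terabyte, manual review runs past $20 million - which is why organizations began looking for ways to cull and manage data before it ever reaches legal review.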

E-discovery strategies at work, today
Organizations that lead the way in e-discovery best practices are attacking the problem in two ways. First, by managing data more centrally and efficiently, they can responsibly delete data that has no value, or which they are under no obligation to retain. Not only does this practice improve the e-discovery process, but it also creates significant savings for storage, backup, and personnel costs.

The second part of this strategy is to bring some—or all—of the e-discovery process in-house. To do this, organizations are creating cross-functional teams that include both IT and Legal, and then deploying technologies that enable fast and efficient in-place search and collection of their ESI.

Yet even today there are many organizations that have done little to address these requirements. Because e-discovery is not a voluntary process, many unprepared organizations perform “faux e-discovery”—they attempt to meet their obligations, but in reality miss significant amounts of relevant data. In doing so, they are taking on significant risk, without understanding or acknowledging it. Other organizations that fail to prepare are forced to turn to expert (and expensive) third-party vendors, frequently spending several hundred thousand to well over a million dollars to respond to a single case—without any ongoing benefit.

More data, more technologies, more challenges, more solutions
The continuing explosion in the amount and varying types of data will continue to significantly impact the e-discovery landscape. With studies projecting that 35 zettabytes of data will be created by 2020, even good processes may be totally overwhelmed by the sheer volume.

In addition, technologies that enhance the speed and efficiency of communication and business processes continue to be developed—and they are usually not e-discovery-friendly. Social media technologies such as Facebook and Twitter are further blurring the line between personal and business data, which can be difficult for organizations to locate and preserve. Cloud computing can put a company’s data in the hands of a third party, sometimes in a different country or jurisdiction, which also makes e-discovery more difficult.

But technology is also likely to provide solutions, such as intelligent filtering and review of data. There are tools today that can classify and determine whether documents are relevant to a case based on their similarity to other relevant documents or other criteria. But those technologies are new and complex, and their acceptance in actual court proceedings is not assured.

With all of these issues on the horizon—and certainly more that we cannot yet predict—the next 10 years in e-discovery will be every bit as interesting as the last.

As director of e-discovery and compliance at EMC Corporation, James D. Shook, Esq. works with customers to help them solve challenges related to e-discovery, compliance and privacy. James is a long-time member of The Sedona Conference, a well-known legal think tank, and is an active contributor on several of its committees.

Wednesday, October 6, 2010

Intern Upturn

Posted by Mark Brousseau

At a time when aspiring young professionals are having a hard time breaking into the corporate world, US Dataworks is arming high school and college students with invaluable programming and financial services knowledge, as well as industry contacts, through a hands-on internship program in its product development group. Recent “graduates” of the program have gone on to attend the University of Texas at Austin and Louisiana State University. On top of gaining real-world experience, interns can tell prospective employers that they’ve interned for a company recognized as an innovator in its industry.

US Dataworks is a Houston-based provider of payments processing solutions and services. The company’s Clearingworks platform is the industry’s only deployed solution for the end-to-end processing of any paper-based or electronic payment. Each summer, the company selects students to work on critical projects alongside seasoned programmers in its product development group.

US Dataworks launched its internship program in 2009. That summer, students worked on coding automated test cases for the company’s Clearingworks platform. Written in the Java development framework, the students’ test cases are still used by US Dataworks’ development team to automate quality assurance tasks; the test cases have greatly reduced the burden on the company’s full-time quality assurance staff and have helped to ensure better product releases for US Dataworks customers.

This past summer, the company’s interns used the Java development framework to create an entire set of reports on search results. The reports conformed to coding and design standards, included embedded links, and were designed with the user experience in mind.

“US Dataworks takes great pride in its reputation as an innovator in the payments industry,” said Leilani Doyle, US Dataworks Director of Product Management. “Tapping into bright young minds provides our development team with a fresh perspective on programming solutions, as well as much needed resources as we extend our product lines.”

Not surprisingly, US Dataworks is committed to continuing its internship program.

Friday, October 1, 2010

ICD-10, EHRs Take Center Stage at AHIMA

Posted by Mark Brousseau

AHIMA's 82nd Annual Conference and Exhibit, held this week at the Gaylord Palms Hotel and Convention Center in Orlando, Florida, may not have featured "Earth-shattering new products" or "game-changing players," but it did have something that made exhibitors smile: better booth traffic.

"I can't say that I saw any new products at AHIMA," says exhibitor Greg Lusch, ibml's business development manager for healthcare. But attendance at the event -- which draws coders, transcriptionists and other medical records professionals -- was noticeably higher than in recent years, Lusch adds, resulting in a steady stream of potential buyers visiting the Birmingham, Alabama-based company's booth. He attributes the increased buying interest to the "loosening economy" and strong demand for ICD-10 and electronic health records (EHR) solutions.

"There was a sense among the exhibitors that attendees had a little more money to spend," he says.

If the AHIMA conference is any indication, healthcare providers will spend a lot of that money on ICD-10 initiatives. In 2013, the U.S. healthcare system will transition from ICD-9 to ICD-10 as the HIPAA-mandated code set for medical symptoms and procedures. This code set is used for billing and health insurance reimbursement, as well as statistical analysis and clinical, epidemiological and quality reporting. As a result of this transition, Lusch notes, the number of diagnosis codes will swell from 13,000 to 68,000, while the number of procedure codes will soar from 3,000 to 87,000.

"ICD-10 was by far the hottest topic at AHIMA," Lusch says. "Many attendees were there to better understand how to deal with ICD-10; how to make the transition from ICD-9 to ICD-10, what tools and updates were available to help streamline the process, and, in many cases, to find third-party services to help them figure it all out. Clearly, this was a major area of focus for AHIMA attendees."

The other area of focus for many AHIMA attendees was the conversion to EHRs. Lusch notes that in addition to hospitals and large practices -- which have been showing increasing interest in EHR solutions at conferences throughout the year -- a number of service bureaus were at AHIMA sizing up the potential opportunity, looking for EHR solutions of their own, or offering conversion services. "There is no question that more service bureaus are jumping on the EHR bandwagon, offering to scan medical records on behalf of healthcare providers. They clearly believe there is a lot of scanning business out there."

Interestingly, Lusch noted that many of the large EHR solutions vendors did not exhibit at AHIMA.

Noticeably absent from most of the exhibit hall banter was any talk of health reform. That's not to say that it didn't come up during some sessions. But Lusch thinks AHIMA attendees were "too consumed" with the major tasks of ICD-10 and EHRs to focus on the uncertainties of reform.