Thursday, May 27, 2010

Integration issues stymie RDC growth among government users

By Mark Brousseau

One positive result of the Check 21 legislation has been the adoption of remote deposit capture, which uses imaging technology to truncate checks at the point of presentment, in turn streamlining processing, accelerating funds availability, and significantly reducing transportation costs. While remote deposit capture has been one of the most successful banking products of all time (enjoying faster adoption than even online banking), you would never know it from the results of the 2010 Government Payment and Document Processing Survey conducted by TAWPI and IAPP.

Only 16.2 percent of all survey respondents currently use remote deposit capture. Among state revenue agencies, 17.4 percent of respondents currently use remote deposit capture. The adoption of remote deposit capture was strongest among the county government entities that responded to the survey, with half (50 percent) indicating that they use the technology. None of the (non-revenue) state agencies or city government entities that responded to the question use remote deposit capture.

Dave Bracken, vice president and senior account manager for Cash Management Solutions, Inc., says the low adoption rate of remote deposit capture among government users doesn’t mean that the need isn’t there for out-of-footprint collection solutions; instead it speaks to the complexity of integrating this type of technology without having to replace their entire legacy remittance processing system.

“A standalone remote deposit capture solution doesn’t provide updates to a government user’s internal tax systems,” Bracken explains. “And this ‘associating information’ is as important as the funds themselves. For remote deposit capture to be effective for a government user, it needs to be an extension of their centralized processing system. For most users, this presents an integration issue.”
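
To make the integration issue concrete, here is a minimal, hypothetical sketch (in Python, and not any vendor's actual API) of the record a remote deposit capture front end would have to hand off so the tax system receives the "associating information" along with the funds. The field names and the post_to_tax_system stub are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CapturedPayment:
    """One truncated check plus the remittance data the tax system needs."""
    check_image_id: str   # pointer to the archived front/back images
    micr_line: str        # routing, account and check number read off the item
    amount: float
    deposit_date: date
    taxpayer_id: str      # the "associating information" ...
    tax_period: str       # ... without which the funds cannot be applied
    form_type: str        # e.g., an estimated-payment voucher

def post_to_tax_system(payment: CapturedPayment) -> None:
    """Illustrative stub: a real integration would update the agency's
    legacy remittance/tax system of record, not just print."""
    if not payment.taxpayer_id or not payment.tax_period:
        raise ValueError("deposit captured without its associating information")
    print(f"Apply ${payment.amount:,.2f} to account {payment.taxpayer_id} "
          f"for period {payment.tax_period}")

post_to_tax_system(CapturedPayment(
    check_image_id="img-000123", micr_line="021000021 123456789 1001",
    amount=512.00, deposit_date=date(2010, 4, 15),
    taxpayer_id="TX-77-4821", tax_period="2010-Q1", form_type="estimated"))
```

A standalone RDC product typically stops at the first few deposit fields; the integration work Bracken describes is getting the remaining fields into the agency's legacy system reliably.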

Despite this challenge, Bracken predicts that more government users will adopt remote deposit capture – or “split source” solutions – into their operations as they replace their highly customized, “one-off” applications – which are difficult to upgrade – with more flexible and open platforms.

“The problem isn’t a lack of need. The problem is in effective implementation,” he concludes.

What do you think?

Tuesday, May 25, 2010

Government users have limited capital for payments and document automation

By Mark Brousseau

If state, county and municipal government users are going to make improvements in their payments and document processing operations in 2010, they’ll likely have to do it without the benefit of additional capital, according to the results of a new survey from The Association for Work Process Improvement (TAWPI) and International Accounts Payable Professionals (IAPP).

The survey was conducted in partnership with the Federation of Tax Administrators (FTA) and sponsored by J&B Software, Inc. (a 3i Infotech company), Fairfax Imaging, ibml, WAUSAU Financial Systems, Inc., Cash Management Solutions, Inc., and Eastman Kodak Company.

Complete results are available at: http://www.tawpi.org/research/government-processing-study.aspx.

Nearly half (46.4 percent) of all survey respondents indicated that their capital budgets for payments automation projects are unchanged for 2010. However, 17.9 percent of all respondents stated that their 2010 capital budgets for payments automation projects are slightly lower compared to 2009, and 28.6 percent said their capital budgets are significantly lower. This means that nearly half of respondents (46.5 percent) have smaller capital budgets in 2010 for payments automation projects, slightly more than the share reporting no change. Only 3.6 percent of respondents to the electronic survey from TAWPI and IAPP reported that their 2010 capital budgets are slightly higher, while 3.6 percent said they are significantly higher.

The outlook is only slightly better for state revenue agencies, with 6.3 percent of respondents indicating that their 2010 budgets for payments initiatives are slightly higher, and 6.3 percent reporting that their budgets are significantly higher. About a third (37.5 percent) of state revenue agencies indicated that their 2010 budgets for payments automation projects are unchanged, while 31.3 percent said they are significantly lower and 18.8 percent reported they are slightly lower.

Two-thirds of (non-revenue) state agencies reported that their 2010 capital budgets for payments automation projects are slightly lower, while 33.3 percent said that their budgets are unchanged.

Three-quarters (75 percent) of county government entities that responded to the question indicated that their 2010 capital budgets for payments automation projects are unchanged, while 25 percent said they are significantly lower compared to 2009. Among the city government entities that responded to the question, 60 percent said their 2010 budgets for payments automation projects are unchanged compared to 2009, while 40 percent stated their capital budgets are significantly lower.

It is important to note that none of the (non-revenue) state agencies, county government entities or city government entities that responded to the question said their budgets for payments automation projects are higher in 2010. Clearly, capital continues to be very tight for government operations.

The budget situation is almost identical for document automation projects. Almost half of all survey respondents (44.4 percent) indicated that their 2010 capital budgets for document automation projects are unchanged compared to 2009, while 14.8 percent of all respondents stated that their capital budgets are slightly lower, and 29.6 percent indicated that their budgets are significantly lower. Only 7.4 percent of all respondents indicated that their capital budgets for document automation projects are slightly higher in 2010, while a fortunate 3.7 percent stated that they were significantly higher.

The budget situation is a bit better for state revenue agencies, with 12.5 percent indicating that their 2010 capital budgets for document automation projects are slightly higher compared to 2009, and 6.3 percent reporting that they are significantly higher. Still, 37.5 percent of state revenue agencies said their 2010 capital budgets for document automation projects are unchanged, while 31.3 percent stated that their budgets are significantly lower compared to 2009, and 12.5 percent said they are slightly lower.

Two-thirds of (non-revenue) state agencies report that their 2010 capital budgets for document automation projects are slightly lower compared to 2009, while 33.3 percent said they are unchanged.

One-third (33.3 percent) of county government entities that responded to the question indicated that their 2010 capital budgets for document automation projects are significantly lower compared to 2009, while 66.7 percent stated that they are unchanged. Similarly, 40 percent of city government entities that responded to the question indicated that their 2010 capital budgets for document automation projects are significantly lower, while 60 percent stated that they are unchanged.

The federal government respondent to the survey indicated that its 2010 capital budget for document automation projects is significantly lower compared to 2009. It is worth noting again that none of the responding (non-revenue) state agencies, county government entities, city government entities, or federal government agencies stated that their 2010 budgets for document projects are higher.

Getting a raise at work – not as hard as you think

Posted by Mark Brousseau

With the ever-increasing cost of living, employees at all levels would love to get a raise. But how do you make your pitch to the boss and succeed in the face of today’s economic difficulties?

Diane L. Katz, Ph.D., a Tucson, Arizona-based organization consultant and author of the new book, Win at Work! The Everyone Wins Approach to Conflict Resolution, offers a strategy for resolving workplace conflicts such as negotiating a raise. You need a game plan that allows you to be professional, assertive but not confrontational, and clear about what you want. Katz’s time-tested approach speeds up decision-making, blends intuition and logic, and leaves everyone comfortable with the solution.

To get a raise, she recommends thinking carefully about what you want and how your performance compares to the other employees around you. Next, research the wage rates and ranges of other workers in your profession and with your level of experience and responsibility.

Determine what is negotiable. Identify all the possible forms of compensation that can be given to you. In addition to your hourly rate or salary, consider vacation time, paid health leave, travel and per diem, and the ability to work at home. Can you get paid for key product or service deliverables by the unit or by the job? Bonuses and commissions may also be something you can negotiate.

After doing your homework, ask your boss for a meeting. At the start, present the key data that support your case. Describe what you have accomplished and any promises made to you, then state what you want. Assure the boss that you like the work and the challenge, but expect to be fairly and appropriately compensated.

Be prepared to receive a tough or even a negative response. Accept criticism, but say what you have learned from mistakes and misunderstandings and remain firm.

If the answer is yes, show appreciation, but don’t leave without asking for confirmation, or at least a timeframe for when you will have the details of your new compensation and its effective date.

If the answer is no, assess what you learned. Focus on how your work performance is viewed, and what you need to do to protect your future there or elsewhere.

No matter what the outcome, you win by standing up for yourself, making your feelings known, demonstrating your commitment and ability, and maintaining your self-respect.

Monday, May 24, 2010

Dot connecting in the new data world

Posted by Mark Brousseau

The flood of digital information increases the need for accuracy -- including knowing which data to leave out. Scott Schumacher, Ph.D., a government security and technology expert with Initiate, an IBM Company, explains:

Remember when we used to ride around in our cars and listen to AM radio? Maybe you’re not quite old enough to remember, but there was a time when AM radio was all we had – and that was fine. There also used to be only a handful of television channels, which we had to get up out of our chairs to change. That was fine, too.

I don’t remember longing for a wider variety of music on the radio, or more channels to watch on TV. We had what we had, and it was all fine – it was all “good enough.”

There was also a time when the level of accuracy that our intelligence and law-enforcement systems offered was “fine.” We connected the dots well enough to eliminate the greatest threats.

Not any more.

Today, there is an intense push for accuracy in our data and, particularly, in our ability to accurately “connect the dots.” Why now? What’s changed? What’s pushing the accuracy button more than it’s been pushed before?

Turn off the radio, put down the remote, and I’ll explain.

What is accuracy?
I’m a mathematician. When I think of accuracy I think of numbers and percentages, of false-negatives and false-positives. But for law enforcement or intelligence officials, accuracy means tracking down and mitigating a potential risk before it happens.

Both perspectives are critical in understanding what accuracy is and how to improve your results.

Mathematically, accuracy is a pair of numbers. Accuracy compares the number of times you “miss” (present a false negative) and the number of times you incorrectly “hit” (present a false positive). Accuracy measures how well your process makes a decision – how well it can find a “true-positive” result amid the false negatives and false positives.

When you hear a phrase like, “our system is 95 percent accurate,” it usually refers to the false-negative rate – or the connections it missed. To gauge the true accuracy of that system, you also need to know the false-positive rate. If the system floods you with false positives, and touts a 95 percent accuracy rate (focusing on the things it missed), that’s not going to get you very far. You’re going to be spending all your time chasing false threats.
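
As a back-of-the-envelope illustration of that point (my own sketch, not Initiate's methodology), the following computes the two rates separately; a single headline "accuracy" number hides one of them. The counts are invented for illustration.

```python
def matching_rates(true_pos: int, false_pos: int, false_neg: int, true_neg: int):
    """Return the two numbers that together describe accuracy."""
    false_negative_rate = false_neg / (true_pos + false_neg)  # connections missed
    false_positive_rate = false_pos / (false_pos + true_neg)  # false alarms raised
    return false_negative_rate, false_positive_rate

# A system that "misses only 5 percent" can still bury analysts in false leads:
fnr, fpr = matching_rates(true_pos=95, false_pos=900, false_neg=5, true_neg=9000)
print(f"false-negative rate: {fnr:.1%}")   # 5.0% -- the advertised figure
print(f"false-positive rate: {fpr:.1%}")   # 9.1% -- 900 false threats to chase
```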

From an intelligence perspective, accuracy is just as much about keeping data apart as it is putting it together. If there were a security threat in a particular airport at a particular time, a less-than-accurate system might flag every person who was in the airport at that time as a suspect. A highly accurate system would be able to parse through the vast numbers of individuals in the airport at that time, “connect the dots” between those people and other data points within other records, and present a highly targeted list of suspects.

Accuracy is being able to make the best use of all the information you have – putting data together where necessary, and keeping it apart where necessary, to create a highly targeted list of “true-positive” results.

Why now?

The definition of accuracy hasn’t changed since the time of AM-only radio, but the need has changed – because our circumstances have changed. Two primary factors are driving the push for greater accuracy:

• More information. Today’s law enforcement officials have to deal with millions of terabytes more data than they have ever had to work with in the past. Not only are there more records about more people – a simple function of our digital times – there is significantly more travel (international travel in particular), more people traveling on visas, more types of communication and a wider variety of threats.

• More fragmentation. As the amount of information grows, so do the different locations and different types of information. From local police records to state databases to federal watch lists – and all the different types of entities (people, phone numbers, weapons) that reside in each – intelligence and law enforcement officials face the daunting task of connecting dots between and among all this information. Their job is akin to finding a needle in a haystack.

Adding to the challenge, the risks are greater for missing important connections – or, not connecting the dots between data that already exists. Being marginally accurate is no longer good enough.

Finding the right technology
Finding the right solution is all about understanding what accuracy is and what you should expect from a highly accurate system. The most effective technology will let you look at the right 10,000 things, not just the top 10,000 things. The right technology will help you reduce the noise and improve the signal-to-noise ratio.

The most effective technology also will have a level of intelligence. Technology that produces the most accurate results will be able to account for misspellings and other errors – or purposeful deception, a practice common among persons of interest trying to avoid detection. That technology also will have the ability to recognize and account for cultural anomalies and unique factors in different languages.

The technology you use also has to be adaptable; it has to be able to allow you to introduce new information and new agents, as well as securely exchange information between – and allow appropriate access from – other reliable sources.

Finally, your technology has to be flexible, so you can readjust your priorities according to changing threats, threat levels and resources.

Accuracy involves many factors. And, it’s a moving target. Because of the evolving nature of threats, we must keep working at this – we must keep pushing the envelope.

“Good enough” just doesn’t work any more. As soon as we can do more, we have to do more. And we have to keep pushing for greater accuracy and a greater ability to connect the dots. Our national security is at stake.

Wednesday, May 19, 2010

NACHA stats reveal electronic migration

Posted by Mark Brousseau

If you’re looking for more proof that consumers continue to move away from paper-based check payments and towards more convenient and “greener” electronic transactions, look no further than NACHA’s 2009 statistics for the Automated Clearing House (ACH) Network, says Leilani Doyle (ldoyle@usdataworks.com), product manager at US Dataworks, Inc., a leading provider of enterprise payments solutions.

Doyle notes that while overall ACH payment volume increased by more than 475 million transactions in 2009 – a 2.6 percent increase compared to 2008 – Accounts Receivable Check (ARC) Conversion volume decreased by about 10 percent, according to NACHA’s statistics. The drop in ARC volume was offset by a spike in Internet-initiated (WEB) transactions of 9.7 percent in 2009, Doyle points out. Similarly, US Dataworks, the long-time market share leader in ARC volumes, saw its WEB volumes soar 700 percent in 2009 compared to the previous year, Doyle said.

“This trend is not surprising,” Doyle says. “It reflects the shift in consumer preferences from writing checks to making payments over the Internet.”

Doyle adds that she also wasn’t surprised by the uptick in Back Office Conversion (BOC) volumes indicated by NACHA’s annual statistics. “BOC will continue to grow as businesses that still take checks over the counter look for ways to reduce their costs to process checks,” Doyle explains.

What do you think?

10 tips to better pricing

Posted by Mark Brousseau

Rafi Mohammed, Ph.D, the author of The 1% Windfall: How Successful Companies Use Price to Profit and Grow, says organizations can start generating new profits and growth tomorrow morning. Here's how:

Pricing is one of the most powerful – yet underutilized – strategies available to businesses. A McKinsey & Company study of the Global 1200 found that if companies increased prices by just 1%, and demand remained constant, operating profits would increase by 11% on average. With a 1% increase in price, some companies would see even larger percentage gains in profit: Sears, 155%; McKesson, 100%; Tyson, 81%; Land O’Lakes, 58%; Whirlpool, 35%. Just as important, price is a key attribute that consumers consider before making a purchase.
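
The arithmetic behind the McKinsey figure is easy to reproduce. In the sketch below, the 9 percent operating margin is an illustrative assumption chosen to be consistent with the 11 percent result, not a number taken from the study; with demand constant, a 1% price increase flows straight through to operating profit.

```python
revenue = 100.0          # index average revenue to 100
operating_margin = 0.09  # illustrative assumption; implies profit of 9
costs = revenue * (1 - operating_margin)

old_profit = revenue - costs
new_revenue = revenue * 1.01       # 1% price increase, demand unchanged
new_profit = new_revenue - costs   # costs do not move when only price changes

print(f"operating profit rises {(new_profit / old_profit - 1):.0%}")  # ~11%
```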

The following 10 pricing tips can reap higher profits, generate growth, and better serve customers by providing options.

Stop marking up costs. The most common mistake in pricing involves setting prices by marking up costs (“I need a 30% margin”). While easy to implement, these “cost-plus” prices bear absolutely no relation to the amount that consumers are willing to pay. As a result, profits are left on the table daily.

Set prices that capture value. Manhattan street vendors understand the principle of value-based pricing. The moment it looks like it will rain, they raise their umbrella prices. This hike has nothing to do with costs; instead, it’s all about capturing the increased value that customers place on a safe haven from rain. The right way to set prices involves capturing the value that customers place on a product by “thinking like a customer.” Customers evaluate a product and its next best alternative(s) and then ask themselves, “Are the extra bells and whistles worth the price premium (organic vs. regular), or does the discounted, stripped-down model make sense (private label vs. brand name)?” They choose the product that provides the best deal (price vs. attributes).

Create a value statement. Every company should have a value statement that clearly articulates why customers should purchase their product over competitors’ offerings. Be specific in listing reasons…this is not a time to be modest. This statement will boost the confidence of your frontline so they can look customers squarely in the eye and say, “I know that you have options, but here are the reasons why you should buy our product.”

Reinforce to employees that it is okay to earn high profits. I’ve found that many employees are uncomfortable setting prices above what they consider to be “fair” and are quick to offer unnecessary discounts. It is fair to charge “what the market will bear” prices to compensate for the hard work and financial risk necessary to bring products to market. It is also important to reinforce the truism that most customers are not loyal – if a new product offers a better value (more attributes and/or cheaper price), many will defect.

Realize that a discount today doesn’t guarantee a premium tomorrow. Many people believe that offering a discount as an incentive to trial a product will lead to future full price purchases. In my experience, this rarely works out. Offering periodic discounts serves price sensitive customers (which is a great strategy) but often devalues a product in customers’ minds. This devaluation can impede future full price purchases.

Understand that customers have different pricing needs. In virtually every facet of business (product development, marketing, distribution), companies develop strategies based on the truism that customers differ from each other. However, when it comes to pricing, many companies behave as though their customers are identical by setting just one price for each product. The key to developing a comprehensive pricing strategy involves embracing (and profiting from) the fact that customers’ pricing needs differ in three primary ways: pricing plans, product preferences, and product valuations. Pick-a-plan, versioning, and differential pricing tactics serve these diverse needs.

Provide pick-a-plan options. Customers are often interested in a product but refrain from purchasing simply because the pricing plan does not work for them. While some want to purchase outright, others may prefer a selling strategy such as rent, lease, prepay, or all-you-can-eat. A pick-a-plan strategy activates these dormant customers. New pricing plans attract customers by providing ownership options, mitigating uncertain value, offering price assurance, and overcoming financial constraints.

Offer product versions. One of the easiest ways to enhance profits and better serve customers is to offer good, better, and best versions. These options allow customers to choose how much to pay for a product. Many gourmet restaurants offer early-bird, regular, and chef’s-table options. Price sensitive gourmands come for the early-bird specials while well-heeled diners willingly pay an extra $50 to sit at the chef’s table.

Implement differential pricing. For any product, some customers are willing to pay more than others. Differential pricing involves tactics that identify price sensitive customers and offer them discounts, using hurdles, customer characteristics, selling characteristics, and selling strategy tactics. For example, customers who look out for, cut out, organize, carry, and then redeem coupons are demonstrating (by jumping a hurdle) that low prices are important to them.

Use pricing tactics to complete your customer puzzle. Companies should think of their potential customer base as a giant jigsaw puzzle. Each new pricing tactic adds another customer segment piece to the puzzle. Normal Normans buy at full price (value-based price), Noncommittal Nancys come for leases (pricing plans), High-end Harrys buy the top-of-the-line (versions), and Discount Davids are added by offering 10% off on Tuesday promotions (differential pricing). Starting with a value-based price, then employing pick-a-plan, versioning, and differential pricing tactics adds the pricing-related segments necessary to complete a company’s potential customer puzzle. Offering consumers pricing choices generates growth and increases profits.

Since pricing is an underutilized strategy, it is fertile ground for new profits. The beauty of focusing on pricing is that many concepts are straightforward to implement and can start producing profits almost immediately.

What better pricing windfall can your company start reaping tomorrow morning?

What do you think?

Removing the Model T mentality from SAP hosting

Posted by Mark Brousseau

At one time or another, most people have heard Henry Ford’s famous quote about his revolutionary Model T automobile: “Any customer can have a car painted in any color so long as it is black.” Today, we look upon his inflexible, non-customer service-oriented attitude as quaint, a mindset from a bygone era that would never fly today.

But the reality is that attitude is still very prevalent. Not in our vehicles, thankfully – you can get a car or truck painted in just about any crazy color or combination of colors you want. Instead, it’s the common mindset for IT hosting in the SAP world.

Dan Wilhelms (dwilhelms@sym-corp.com), president and CEO of Symmetry Corporation (www.sym-corp.com), explains:

By now you’ve probably read all the articles and sat through the webinars talking about IT infrastructure as a commodity rather than a strategic advantage. They tell you how, in this day and age, managing your own infrastructure makes about as much sense as generating your own electricity on a day-to-day basis, and that you’d be better off moving to a hosted model. And they tell you how IT costs to manage SAP average three percent to five percent of revenue, whereas an integrated technical managed services solution incorporating hosting reduces this figure to only one percent of revenue. All of which is true.

Unfortunately, they tend to leave out one small detail. The act of moving your infrastructure to a 20th Century-style hosting provider can be very expensive and time-consuming, especially for a mid-market organization, before it ever becomes smooth and cost-efficient.

The reason is that Henry Ford mentality. The typical 20th-Century hosting provider has a giant server farm full of equipment onto which it will move your applications. Essentially, they tell you that you can run your applications on any hardware you want – as long as it’s the hardware they already have. If you’re running on the same hardware – say your current system is IBM and so is the provider’s – that part will probably transition fairly smoothly. But if your applications are set up to run on HP servers and they’re using IBM, it’s going to take a lot of work to make the changeover. And guess who has to make the change?

The other big problem with the 20th Century model is sharing resources. Back in Ford’s day, when running water was still a rarity, families often shared bathwater (or even baths) because filling a bathtub was a time-consuming, labor-intensive task. They didn’t want to waste the effort on providing clean water for each bath.

In the traditional hosting world, the resources you’re sharing are servers. In order to operate as efficiently (and profitably) as they can, hosting providers try to fill every micron of disk space on every server with data. That means they’ll often mix data from two or more organizations to increase utilization.

It makes sense from their standpoint. But it’s not so good from yours. If a problem with some other organization’s application takes down the server you’re sharing, you are just as out of luck as they are – even though your applications are running perfectly fine. In addition, if you’re working with a government agency and have to show compliance with laws requiring separation of data, it’s going to be pretty tough to prove when your supposedly secure data is running alongside that of an organization with different (or no) compliance requirements.

There is a solution, however. Rather than settling for a “Model T” type of hosting environment, look instead for a provider using a 21st Century hosting model.

With a 21st Century hosting provider, you don’t have to make your applications fit their hardware. Instead, they will host your applications on whatever hardware you want – whether that means purchasing all new hardware of your choice as part of an upgrade, or actually packing up and shipping your current hardware to their locations. If you’re buying new hardware, a good hosting provider will even give you a choice of procuring it yourself or taking that burden off your hands – whatever method works best for you.

Moving to a hosted system dedicated specifically to your organization instead of one that is carved out of a general storage area network also solves the concerns regarding data separation. Since your hardware operates as separately as if it were in your own facility, there is no chance someone else’s application problems will affect your business. It also makes proving separation of data a very simple task.

A 21st Century hosting provider will also tend to be more specialized. In the early days, hosting meant setting up equipment and running whatever applications customers sent over. There was little on-staff expertise to draw from if there was a problem with, say, SAP or another complex system. In the new world of hosting, providers specialize in particular technologies and have deep expertise on staff, allowing them to do what you really want them to do – manage and maintain the system completely, including overcoming any issues immediately rather than having to call an outside specialist.

While moving to a 21st Century hosting provider makes sense for virtually any organization, it is particularly well-suited to mid-market organizations that are increasingly finding more time being spent on IT maintenance and less on actually deriving more value out of their applications. It’s a lot like those early Model Ts. Back then, if you were going to own a car, you had to know how to fix it, too.

Today, most car owners don’t know what’s under the hood and don’t want to know. They just want to get in and drive. Rather than adding IT staff (and finding themselves in the IT business instead of whatever business they’re actually in), these mid-market organizations can stay focused on the reasons they installed their applications in the first place.

When it comes to hosting, why settle for a Model T mentality? Using a 21st Century hosting provider will give you complete control over your environment and keep your data separate, all while saving you as much as 30 percent over traditional hosting. Even Henry Ford would approve of that.

What do you think?

Improving content management

Posted by Mark Brousseau

Thinking outside the box can help organizations improve payback on their enterprise content management (ECM) solutions. Jim Bunn of ibml (jbunn@ibml.com) explains:

Great minds think alike. But, when it comes to enterprise content management (ECM) solutions, like minds can be a problem. If every organization approaches its ECM deployment the same way, using the same out-of-the-box technologies and the same old workflows, then creating exceptional results is going to be extremely difficult. The problem is compounded by the fact that many organizations don't have a thorough, in-depth understanding of their own business processes. The more a company thinks that its operations are just like any other, the more difficult it becomes to achieve new efficiencies.

To maximize payback on their ECM investments, organizations need to think outside the box by ignoring the industry groupthink and focusing on their own needs and objectives.

Starting with a clean slate
Long before they agree to their first meeting with an ECM vendor, organizations should get a handle on their operations requirements and business objectives. Additionally, they should understand all of the document types that pass through their operations and their data capture needs. Be warned: this process can be time-consuming. It will also be eye-opening. Upon closer inspection, some of your legacy processes are sure to elicit groans. But, there's nothing worse for an ECM business case than forcing inefficient manual processes into an automated workflow.

Deploying an ECM solution provides an opportunity to re-engineer business processes and eliminate some altogether. For example, take a hard look at every manual step in document preparation or image capture; start by measuring how long it takes for documents to be scanned after they have entered your company. This step will have the added benefit of allowing you to look for ROI/cost savings in your end-to-end document flow. Also, consider soliciting input from the different departments that are associated with a specific process to learn their information needs and any downstream exceptions they are seeing.
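
One concrete way to start that measurement is to log a timestamp when mail is received and another when the batch is scanned, then report the lag. A minimal sketch, with invented timestamps:

```python
from datetime import datetime
from statistics import mean

# (received, scanned) timestamp pairs pulled from mailroom and capture logs
batches = [
    (datetime(2010, 5, 17, 8, 15), datetime(2010, 5, 17, 13, 40)),
    (datetime(2010, 5, 17, 9, 5),  datetime(2010, 5, 18, 10, 20)),
]

lags_hours = [(scanned - received).total_seconds() / 3600
              for received, scanned in batches]
print(f"average receipt-to-scan lag: {mean(lags_hours):.1f} hours")
print(f"worst batch: {max(lags_hours):.1f} hours")
```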

When you are researching your ECM needs, resist the temptation to cede control to your company's IT department. No one knows your business requirements better than you. Work collaboratively with IT to define system workflows to ensure that the deployed solution will meet your needs. For instance, an IT programmer is unlikely to know whether a specific work type should be routed to an individual's work queue (possibly for security/privacy reasons) or to a common operator queue. Some business users accompany their IT staff for technical training classes provided by vendors just so they know what the software is capable of. Then, they can better communicate their needs.

Think big. By definition, ECM solutions help bring down data silos and bridge information gaps. So, when you are developing your ECM initiatives, think beyond one department's needs. The combination of new integration tools and emerging technologies allows organizations to make information available to whoever needs it, quickly and securely, regardless of their location. To this end, it may make sense to leverage imaging, data capture and workflow investments to develop a shared services infrastructure where one department manages document processing for others.

Similarly, map out how documents currently enter your organization -- and who is touching them -- to determine if the information flow can be streamlined. Don't assume that distributed capture is the most efficient means of scanning documents. Many organizations are surprised by the "hidden costs" of distributed capture. Likewise, a completely centralized scanning operation may miss opportunities to expedite information capture and handling. What's important is that organizations look for a flexible scanning infrastructure with centrally managed control and reporting capabilities. In some cases, you may be able to integrate electronic forms into your automated workflow. These solutions can significantly reduce manual processing, forwarding only exceptions to operators for review.

Evaluate, evaluate, evaluate
Once you have automated your document processes, evaluate how you are doing. And then evaluate the process again. And again. It's not uncommon for business requirements (volumes, document design, etc.) to change soon after a new system is implemented. You want to be sure your ECM solution adapts as well. Consider using centralized reporting or analytics tools, real-time operations dashboards or periodic operations audits to ensure your ECM solution is still meeting your needs.

You may also want to adapt your compensation plan to reward employees for productivity and quality in the new automated document environment. Similarly, consider embracing flexible work hours -- to save labor costs and attract Blue Chip talent -- based on the needs of your operations.

With budget-strapped organizations fearful of making a misstep in new system implementations, it's easy to see how they become locked into a myopic way of seeing ECM deployments. But, that makes it hard to spot new efficiencies and pounce on opportunities for business process improvements.

The key to maximizing ECM payback is to think outside the box.

What do you think?

Monday, May 17, 2010

Document capture, OCR and ... everything

Posted by Mark Brousseau

Short on time and money, but need to be more efficient than ever with your document processing? New document capture technologies can help save the day. KeyMark, Inc.'s Brian Becker (Brian.becker@keymarkinc.com) explains:

You may already be familiar with the benefits of using digital images in place of paper files – less costly than physical file storage, improved access to documents, etc. – but are you up to speed on the latest tools available to you for capturing those digital images and extracting data from them?

Even the most basic document capture tools have undergone dramatic improvements in recent years: gains in speed, accuracy, reliability, and productivity. Where implementing document capture once made sense only in a few specialized cases, it is now beneficial to almost any business process that involves paper documents, helping both to reduce costs and to raise the bar on the results you can expect.

And while those improvements are great for simple document capture, those same speed and accuracy increases have also paved the way for some powerful new tools that can transform and automate many painfully manual processes.

Taking It to the Next Level
Going beyond the forms-only processing of the past, software tools available now support automation for the majority of printed documents. These tools can automate functions such as:

- Capture of index values from unstructured documents (such as insurance or mortgage documents, or even correspondence);
- Classification of documents based on content in addition to format or layout;
- Document separation for many document types without the need for separator sheets.

By leveraging these capabilities you can bring automation to entire operational areas. For example, to automate your mailroom, you scan documents as they arrive in the mailroom, classify, separate and index them using automation, and then pass them to a workflow for automatic and immediate routing, even to distant locations.

Fully Automated (or Not)
Using software to automate document classification or indexing (even simple indexing) will reduce operator workload, but it generally will not eliminate it. Since machines and software are best used to handle general cases (the tedious stuff that machines are good at), the best systems use automated tools to handle the bulk of the work, while also making it easy and efficient for people to handle the rest. Your operators review any documents that the software did not confidently read and process and also handle any business-rule violations detected by the software.
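
A deliberately simplified sketch of that division of labor appears below: documents whose automated classification falls below a confidence threshold are routed to an operator queue instead of straight through. The classifier output and threshold are stand-ins, not any particular product's engine.

```python
from typing import NamedTuple

class Classification(NamedTuple):
    doc_id: str
    doc_type: str
    confidence: float  # 0.0 - 1.0, reported by the capture engine

CONFIDENCE_THRESHOLD = 0.85  # in practice, tuned per document type

def route(result: Classification) -> str:
    """Send confident reads to the automated workflow, the rest to people."""
    if result.confidence >= CONFIDENCE_THRESHOLD:
        return "auto-index and release to workflow"
    return "operator review queue"

for r in [Classification("A-001", "invoice", 0.97),
          Classification("A-002", "correspondence", 0.62)]:
    print(r.doc_id, "->", route(r))
```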

Efficiency improvements vary by situation, so there is no general rule of thumb, but we often see gains that pay back the investment within eighteen months, sometimes within twelve months or less. An automated system also tends to improve consistency, simply because it handles repetitive work the same way every time without getting bored or distracted. Configure and tune it well, and it will produce great results.

So with the system automation handling the bulk of the repetitive tasks, your operators can focus more on the challenging exceptions, leaving most of the tedium behind. Your people become more productive, and also do their best work.

So Much Work, So Little Time
The right system applied to the right problem should save you money – not burn through your budget. It might even pay for itself in short order. But a good automation system can also be a godsend in situations where you are faced with reduced staffing and too much work.

Anywhere that paper-based processes are currently used, moving to image-based automation for data entry and/or work flow can sometimes double or even triple efficiency, with a bonus of increasing job satisfaction and quality at the same time.

Come Up to Speed
Document capture, indexing and automated data entry are faster, more accurate, and more capable than ever. Add the ability to automate semi-structured and unstructured documents (made possible by the next-generation tools available now) and you will no doubt want to look not just at how you might improve your existing document capture systems, but also at how to bring automation savings to your most painful paper-based processes.

Do your research, and then find the best solution provider you can. They will help you create the optimal approach to address your specific needs so that you end up with a system that both pays for itself and improves your bottom line.

What do you think?

Sunday, May 16, 2010

Is PDF an open standard?

Is PDF an open standard? Duff Johnson, CEO, Appligent Document Solutions, weighs in:

On May 13, the founders of Adobe Systems stepped up to the microphone to deliver a response to Steve Jobs' open letter about Flash. They say Adobe has acted on open standards while Apple offers mere words.

At the outset, I must acknowledge that I owe my livelihood to the genius of these two gentlemen. The inventors of PostScript and PDF and the creators of Adobe Systems, Warnock and Geschke are gods in my Pantheon. They are the founding fathers of technologies that have been instrumental in making computers relevant to the modern everyday operations of government and business.

That said, claims about PDF being a true open standard need to be placed in context.

Adobe Systems has published the PDF Reference, the rulebook for PDF developers, since 1993. At the very beginning, if you wanted to make, view or manipulate PDFs you bought the book in the store for a few dollars. Pretty soon it was (and still is) available online at no charge.

On July 1, 2008, version 1.7 of the PDF Reference was rewritten as ISO 32000, a document managed by committees under the auspices of the International Organization for Standardization (ISO). ISO 32000 is managed by individual representatives of interested parties in open meetings under parliamentary rules. Anyone can observe and participate. While Adobe Systems is obviously heavily invested in the outcome of the committee's decisions, it has only one vote at the table, the same as any other member.

By now, the rulebook for PDF is relatively mature and precise in its language. It was not always so. Adobe's very openness – their willingness to let third-parties in to make their own PDFs before the PDF Reference was a mature document – was and continues to be a source of pain.

[Image from the original post, captioned: three of five PDF viewers displayed a sample PDF incorrectly.]

When millions of PDF files from hundreds of different applications started flying around, two major problems with the rulebook for PDF emerged.

First, while the Reference set rules, it was not a cookbook; it included no recipes for how to create content on a PDF page.

Second, the Reference was ambiguous in some areas and left other matters under-considered, sometimes unaddressed.

When dealing with real-world documents, Adobe's software had to deal with these vagaries, so more rules were written; specific details of their implementation were crafted to address the issues encountered in the real world.

These new rules, however, were in the software, not the Reference. As the Reference developed, Adobe's implementation and the published rules began to diverge. It became possible to create a “legal” PDF file that otherwise perfectly serviceable software couldn't handle quite right (or handled dead wrong). In fact, because the early versions of the PDF Reference were so vague (relatively speaking), the range of possible oddities that were legal in a PDF was very wide indeed. A lot of sloppy PDF software was (and still is) written for this reason.

I remember discussing this problem with Adobe developers in the late 1990s. First and foremost, we all knew PDF had to be reliable. PDFs had to display the same way on-screen and in-print, no matter the platform. The problem with these “legal” but otherwise oddball PDF files was that if they displayed with problems in Adobe Reader, then Adobe (not the PDF's producer) would get the blame.

A pattern was established in which poorly-structured PDF files were roaming around in the wild, and that problem has worsened over time. As PDF has grown more popular, more and more applications of widely varying quality make bad PDF.

Adobe's solution was to engineer Adobe Reader to handle all the various oddball PDF files out there. It's one of the main reasons why Adobe Reader is a larger application to download and install compared to its rivals. Reader includes lots of code to deal with the thousands of different types of exceptions to “good” PDF that Reader users worldwide can and will encounter on a regular basis.

In their attempt to ensure that even the sloppiest PDF files still worked, Adobe created a situation in which developers could (and did) use Adobe's Reader as the reference implementation for their PDF software.

In 2010, there is still no alternative to Adobe Reader when it comes to validating third-party software.

As the vice-chair of ISO 32000, that bothers me, and if you're relying on the idea that PDF is indeed an International Standard in your organization, it should bother you too.

To make the final move in ensuring PDF is a durable international standard, Adobe should release the test suite of PDF files it uses to test Adobe Reader. This could take several forms, the simplest of which would be a collection of PNG images demonstrating the authoritative rendering of example PDF pages.

This test suite should be referenced in the upcoming ISO 32000-2, the forthcoming update to the International Standard for PDF.
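
To see what such a suite would enable, here is a rough sketch of the kind of check it makes possible: render a page with your own PDF software (poppler's pdftoppm is used here purely as an example renderer) and compare the result with the published reference PNG. The file names are hypothetical, and a real harness would allow small tolerances for anti-aliasing rather than demand a pixel-perfect match.

```python
import glob
import subprocess
from PIL import Image, ImageChops

def render_page(pdf_path: str, page: int, dpi: int = 150) -> Image.Image:
    """Render one page to PNG using poppler's pdftoppm as an example renderer."""
    subprocess.run(["pdftoppm", "-png", "-r", str(dpi), "-f", str(page),
                    "-l", str(page), pdf_path, "candidate"], check=True)
    # pdftoppm names its output candidate-<page>.png (zero-padded for long files)
    return Image.open(sorted(glob.glob("candidate-*.png"))[0])

def matches_reference(pdf_path: str, page: int, reference_png: str) -> bool:
    candidate = render_page(pdf_path, page).convert("RGB")
    reference = Image.open(reference_png).convert("RGB")
    if candidate.size != reference.size:
        return False
    # getbbox() is None only when the difference image is entirely black,
    # i.e. the two renderings are pixel-identical.
    return ImageChops.difference(candidate, reference).getbbox() is None

# Hypothetical file names; a published suite would define its own layout.
print(matches_reference("suite/example-042.pdf", 1, "suite/example-042-p1.png"))
```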

Only when this step is taken will it become possible to validate implementations of the ISO 32000 open standard without the proprietary Adobe Reader, an objective that is fundamental to the project of PDF as an International Standard.

Establishing an open test suite will make PDF truly an open standard in the spirit of Warnock and Geschke's letter. The advantages for consumers will be substantial. Adobe and other software developers can produce conversion software to repair old, non-conforming files.

With no further excuse for sloppy code, non-compliant software will tend to die away, removing a major source of problems.

PDF will become truly reliable and based not only on an international standard, but one that may be readily validated.

Adobe will have begun the process of liberating itself from supporting old (and now invalid) PDFs, and will eventually be free to re-direct engineering resources away from propping up other people's software and into creative development.

I can't imagine a world without PDF; if it didn't exist it would have to be invented. PDF is indeed an open standard, but it's incomplete. It's time to finish the story and end the practice of making Adobe Reader a de facto reference implementation.

What do you think?

Thursday, May 13, 2010

Paper shuffling continues

Posted by Mark Brousseau

“While I’m not surprised that we haven’t seen the ‘Holy Grail’ of putting co-mingled documents into a scanner, and letting the system and software figure it out, I am surprised by the amount of manual processes in place at most companies,” Mark Smith of OPEX said during a panel discussion Monday at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas. The panel brought together document management solutions providers to offer their perspectives on the results of TAWPI’s 2009 Document Management Study.

Smith noted that 67 percent of survey respondents indicated that they are still inserting document separators. “That is a ton of document separators – with expensive paper and ink,” Smith said.

ibml’s Derrick Murphy told attendees that he was surprised by the lower-than-expected adoption rate of automated document classification technology. “This reminds me of the days when people thought ICR [intelligent character recognition] was going to save the day,” Murphy said. “The problem with auto-classification is the high costs associated with misclassifications and errors. The technology simply has to become more intelligent. When that happens, it will open up a tremendous amount of cost savings, as well as opportunities to better leverage intelligent scanners to out-sort documents based on their content.”

Jim Wanner of KeyMark also was struck by “the lack of software utilized for front-end document classification.”

Murphy added that he was surprised that less than two-thirds of survey respondents track their imaging production rates. “If you don’t track your production rates, you can’t accurately track your costs,” Murphy noted. “And, in this economy, I’m shocked that operations wouldn’t want to know their true costs.”

Jim Thumma of Optical Image Technology (OIT) said organizations should strive to track their document management costs end-to-end -- from capture to archive -- to look for opportunities to remove inefficiencies.

Tuesday, May 11, 2010

Upgrade scanning, increase productivity

Posted by Mark Brousseau

“No one wants to hear about adding more employees,” Lisa Coleman, MHA, RHIA, director, Centralized Scanning, Memorial Hermann Healthcare System (MHHS), told attendees during a session today at FUSION 2010 at the Gaylord Texan Resort & Convention Center near Dallas.

That created a challenge when MHHS decided several years ago to centralize its scanning operation – which had been spread across MHHS’ 11 hospitals in the Houston, Texas, area – and add inpatient records to the documents it was scanning – all while adhering to a strict 24-hour turnaround time for scanning patient records. By centralizing its scanning operations, MHHS hoped to achieve:

• Centralized management
• Standardized processes and cross coverage
• Reduced manual intervention
• Decreased operating costs
• Improved customer service
• Increased productivity
• Complete Electronic Medical Record (EMR) processes (add inpatient records)
• Adherence with MHHS’ 24 hour turnaround deadline
• Secure record access (role-based, audit trail)

In 2002, MHHS implemented a Sovera health information management (HIM) system to ensure compliance with HIPAA regulations and establish a centralized scanning infrastructure. At that time, scanning was decentralized across MHHS’ hospital network. In 2006, MHHS made the decision to add inpatient records to its scanning program. Scanning inpatient records would have a positive impact on the hospital’s migration to EMRs, Coleman said. But it also posed a problem in terms of significantly higher document volumes. “To meet our deadlines using our existing scanning infrastructure would have required at least three additional mid-volume scanners and 10 full-time equivalents,” Coleman told attendees.

As a solution, CGI Group presented the ImageTrac scanners from Birmingham, AL-based ibml. CGI Group arranged a site visit to a nearby ibml user – EDCO – so MHHS could see the hardware’s capabilities firsthand and observe the processes the user had put in place to streamline its document scanning workflow. “We were surprised at how different the ibml scanners were from our own; it was more like an assembly line,” Coleman said.

MHHS decided to purchase two ibml ImageTrac scanners. The scanners were integrated with MHHS’ Datacap Taskmaster software to take out some of the manual intervention. “The more manual tasks you have, the more prone you are to have errors,” Coleman emphasized to attendees.

During implementation, CGI Group and ibml provided MHHS employees with a two-week course of hands-on training. Throughout implementation and training, MHHS continued to operate its legacy scanning infrastructure. “We wanted to run the systems in parallel before making the switch,” Coleman said.

With the new workflow, records are picked up and reconciled at the facility where they originated. Once reconciled, records are transported to the hospital’s centralized scanning facility, where employees check them in and perform a quality check to ensure all of the records are accounted for. Batches then are delivered to a scanning station, where an operator images them; as each document is scanned, key data is extracted from a barcode, the document is converted to a digital image, and the image is sent to the hospital’s Datacap Taskmaster software.
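For readers who think in code, here is a simplified, hypothetical sketch of that hand-off. The function and field names are illustrative assumptions and do not represent MHHS’ or Datacap’s actual software.

```python
# Simplified, hypothetical sketch of the centralized capture workflow described
# above; function and field names are illustrative, not MHHS or Datacap code.

from dataclasses import dataclass, field

@dataclass
class Batch:
    batch_id: str
    facility: str              # hospital where the records originated
    expected_pages: int        # count reconciled at the originating facility
    images: list = field(default_factory=list)
    record_ids: list = field(default_factory=list)

def convert_to_image(page):
    # Placeholder for the scanner's imaging step.
    return f"image-of-{page}"

def read_barcode(page):
    # Placeholder for barcode key-data extraction (e.g., a patient/visit ID).
    return f"id-from-{page}"

def send_to_capture_software(batch):
    # Placeholder for the hand-off to the downstream capture/indexing software.
    print(f"Batch {batch.batch_id}: {len(batch.images)} images delivered")

def scan_batch(batch, pages):
    """Check the batch in, image each page, extract key data, and hand off."""
    if len(pages) != batch.expected_pages:
        # Quality check: every reconciled record must be accounted for.
        raise ValueError(f"Batch {batch.batch_id}: page count mismatch")
    for page in pages:
        batch.images.append(convert_to_image(page))
        batch.record_ids.append(read_barcode(page))
    send_to_capture_software(batch)

scan_batch(Batch("B-001", "Sample Facility", 2), ["page1", "page2"])
```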

By centralizing its scanning operations, Coleman said MHHS has achieved several benefits:

• Much faster document throughput with the ImageTrac scanners
• No need to hire additional staff
• Staff shifted from scanning to capture functions
• Records scanned within 8 hours of receipt
• Instant record access for physicians and medical staff

The bottom line: MHHS achieved full payback in about 18 months.

Monday, May 10, 2010

'One Person Can Make a Difference'

“One thing I have learned over the past 27 years is that one person can make a difference,” says John Walsh, the opening keynote speaker at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas. Walsh, who also hosts the long-running FOX weekly television series “America’s Most Wanted,” was speaking specifically about his successful efforts to raise awareness of America’s epidemic of missing and exploited children in the wake of the murder of his own son, Adam, by a serial pedophile in 1981.

Since then, despite the frequent reluctance of state and federal officials, Walsh and his wife have succeeded in honoring the memory of their slain son by spearheading creation of the National Center for Missing and Exploited Children (NCMEC) and the National Child Sex Offender Registry (NCSOR). The former organization facilitates distribution of information about missing children to law enforcement officials and the general public; the latter requires convicted sex offenders to be registered in a national database and their whereabouts made known to the general public. Currently, Walsh is lobbying Congress to enact legislation requiring DNA samples to be collected from those arrested for alleged felonies.

Walsh’s call to action was primarily intended to encourage advocacy for this pending legislation. However, he also encouraged attendees to advocate within their organizations for a proactive embrace of the many changes sweeping the global business community. Many of those changes were detailed at the opening session by IAPP Executive Director and CEO Tom Bohn. For example, members of the so-called Millennial generation entering the workforce today will hold an estimated 10 to 14 jobs by the time they are 38 years of age. This is having a huge impact on training and management practices.

FUSION 2010

Posted by Mark Brousseau

During a panel discussion this morning at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas, Serena Smith of FIS, Les Young of US Bank, Blaine Carnprobst of BNY Mellon, Mike Reynolds of Image-Remit, Phil Ahwesh of PNC Bank, and Jill Humbert of 3i Infotech shared the 12 things that attendees “absolutely must know” about the lockbox processing market:

1. Business-to-business (B2B) payments are still largely paper-based – but electronification efforts continue
2. Clients want tools for B2B decisioning
3. Electronic Bill Presentment and Payment (EBPP), home banking and mobile payments data is being integrated with lockbox information
4. There’s a convergence of lockbox, accounts receivable (AR) and accounts payable (AP)
5. United States Postal Service (USPS) changes – namely, five-day mail delivery – will impact lockbox processors
6. While most biller volume is declining, there is an upswing in healthcare consumer self-pays
7. Federal government initiatives will help drive the electronification of healthcare payments
8. Changes are coming to payments formats
9. There is an increasing emphasis on disclosure controls
10. Risk management control mandates are driving up payments processing costs
11. Labor is becoming more costly and scarce
12. There may be more consolidation among lockbox providers

What do you think?

Good morning Rock Stars!

Ladies and gentlemen, supervisors and CFOs, welcome to FUSION 2010! IAPP/IARP Chairman Eric Jones and Robert Lund, Chairman of the TAWPI Board of Directors, officially announced the merger of the two organizations, which will continue to operate as autonomous organizations with a shared staff and shared board of directors. Keynote speaker John Walsh has just taken the podium . . . but the question of the morning was: Who were those masked, um, thingies, in flaming red and green who opened the show, T-shirt cannons blazing? Your inside source has it on good authority that it was FUSION staffers Ken Brown and Diane Sears, morphed by the power of FUSION. Never fear. They have promised to use their superpowers only for good – and to never wear tights in public again.

Opening Night Reception



FUSION 2010 attendees show off their dance moves last night at the event's opening night reception.

Opening Night Reception


IAPP/IARP CEO Tom Bohn joins the B Street Band at the opening night reception of FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas.

Sunday, May 9, 2010

Good night, FUSION 2010


Ingrid Collins, manager, membership development and chapter relations, IAPP/IARP, is ready to turn in after a long day at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas. The opening session tomorrow morning starts at 7:45!

Opening Night Reception


today Magazine Editor Mark Brousseau greets Tracy Allen of Union Bank tonight at the opening night reception of FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas.

Opening Night Reception


today Magazine Editor Mark Brousseau greets Gary Nedved, CTP, of First Tennessee (pictured, left), and Steve McNair of Transaction Directory (pictured, right) tonight at the opening night reception of FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas.

Day One: Load In

Fusion has been a blast!

Roadies are everywhere. They’ve been setting up the expo hall, rigging the lights and sound, and hanging the video screens. We’re ready to rock ‘n’ roll!

The Dallas skies have been threatening all day, but they wouldn’t dare rain on this party. Inside the Gaylord the party is just getting started. Fans are arriving from far and wide, checking in at the registration desk, mobbing the bookstore, and attending the pre-conference sessions.

Anticipation is building for the opening reception at the Glass Cactus Nightclub, a two-story palace of 39,000 square feet overlooking scenic Lake Grapevine. Bruce Springsteen tribute group The B-Street Band will be on fire for the Fusion crowd.

Looking forward to a rockin’ opening session tomorrow at 7:45. The crowd is buzzing about a special announcement and surprise high-energy entertainment.

Ciao for now. Baby, we were born to run.

FUSION 2010


Juan Paz of Ameriprise arrives this afternoon at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas.

FUSION 2010


Mary Schaeffer, editor, AP Now, and Dennis Lindsay, AP manager, Intermountain Healthcare, attend a writer's workshop this afternoon at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas.

FUSION 2010


Susan Heider, senior audit advisor, APEX Analytix (pictured, left), and Tracy Bryant, AP manager, Mohawk Industries, attend a writer's workshop this afternoon at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas.

FUSION 2010

Posted by Mark Brousseau

At a writer's workshop this afternoon at FUSION 2010, IAPP/IARP Editor in Chief Laureen Crowley Algier shared the 10 myths of business writing:

1) Quoting from Wikipedia and other websites strengthens my business writing.
Not necessarily. Remember, not everything you read on the Internet has been verified for accuracy. Using Wikipedia to research a topic is fine. Quoting from it is dicey. The Internet is like the Wild West: The rules are different there. It’s not as safe as quoting from Encyclopedia Britannica.

2) Repurposing content I wrote for another company can save me time and work.
Yes and no. If you created the content on the other company’s dime, it’s likely that company owns the material, not you. It might be in your best interest to rewrite the material entirely so it doesn’t appear that your new organization is plagiarizing your former company’s work.

3) Articles that appear on the Internet are public property.
Absolutely not. You’ll find warnings on many websites that say the content may not be reproduced without the written permission of the author. Web-based content falls under the same rules as magazine articles, newspaper stories, books, corporate reports, and most other written material. Just to be safe, assume that you need permission to use any content you find on the Internet – unless it contains a note that allows anyone to reprint it as long as the original source is cited.

4) Footnotes that cite my sources protect me legally under copyright law.
Not always. First of all, what are you doing using footnotes? These rarely are necessary. It’s a lot more reader-friendly to attribute information in the body of your copy than make people stop what they’re doing and look up a footnote. But if you must use them, you still must properly identify where you got the information and possibly obtain permission to use it.

5) If I’m using only one paragraph from another source, I don’t have to get permission from the writer.
True – but you do have to cite where it came from. If the paragraph is word for word, it should be in quotation marks or indented on both sides, or somehow set aside as something different from the body copy. And it must be attributed to the source, just as if it were a quote.

6) If I credit the original writer, I can republish as much of an article as I want.
Absolutely wrong. The rule of thumb when you’re quoting from someone else’s work is that you can’t give away “the heart” of it. Just like telling someone how a book or movie ends, you take away the work’s impact if you give away too much. Unless you have the author’s permission to reprint the entire piece, be very careful.

7) My peers will respect me more if my writing sounds academic.
Maybe. But they’ll respect you even more if they can understand what the heck you’re saying. Don’t let your message get lost in the language. Keep it simple.

8) I’m a great speller, so my work doesn’t really need editing.
False. Even the best writers and editors need editing. It’s hard to “hear” the flow of your own work. Someone with a trained ear for writing and an objective outlook can do wonders for making your writing sing.

9) The editor inserted a lot of changes, so my writing must have been horrible.
Not necessarily. Editors look for many things in the copy, and you might not be aware of all of them. Those red or blue lines throughout the article you submit might mean the editor had to trim out words and sentences to make it fit into the news hole. They might mean the editor was “translating” your work into the publication’s style. In most cases, your work will never see the light of day if it’s horrible. Just be happy that you’re being published!

10) The editor didn’t respond right away, so my writing must have been horrible.
Again, this is not necessarily true. Sometimes it’s not all about you! Just like anyone else, editors have a lot on their plates. Even though they gave you a deadline and you met it, your work might not be edited for some time. Most editors give it a first glance initially to be sure it’s in the ballpark of what they’re seeking, and then they put it in the queue of pieces to edit. They take time and truly go through the work later. You might not hear from them until they take that second look. So relax.

FUSION 2010

Posted by Mark Brousseau

At a writer's workshop this afternoon at FUSION 2010, IAPP/IARP Editor in Chief Laureen Crowley Algier shared a five-step editing process for presenting the most polished product:

Step 1: Front-end coaching
Before anything is written, the writer and editor discuss expectations for the work. Who is the audience? What kind of research is required? Who will be quoted? What kinds of “extras” will the work contain – such as boxed facts, pullout quotes, charts, illustrations, photographs, or diagrams? The writer and editor lay out a plan that includes a deadline.

Step 2: Preliminary content edit
The editor looks through everything quickly for content. Is it being presented in a logical order, are there any holes that leave questions, is there anything that warrants further explanation, can the facts be verified, is everything quoted from another source properly attributed? The editor and the writer go over the questions and decide what to change. The file goes to the writer for revisions and then back to the editor.

Step 3: Clean edit
The editor reads the material again all the way through, smoothing out the new information and checking the file for spelling, grammar, punctuation and style. The editor sends the file back to the writer for final approval in two forms: one in edit mode with all the changes marked, and the other “cleaned up” to show the final product. The writer sends the work to anyone else who needs to read it before it’s published, such as co-workers, supervisors, colleagues, or experts.

Step 4: Copy edit
After everyone has reviewed the work, the material is copy edited at least one more time to double-check for typos. This step often is performed by a different editor who can give it a fresh eye. Then the work goes to the graphic designer or layout editor.

Step 5: Proofread
When the work comes back from the graphic designer, an editor proofreads it all the way through to make sure none of the type shifted during layout and that the graphic elements – such as charts, breakout boxes, photographs or illustrations – appear at logical places in the copy. For instance, if a chart about a topic appears on a page preceding the mention of that topic in the text, it could be confusing to readers. After this step is completed, the editor signs off on the product and it’s ready for the public.

FUSION 2010


Attendees begin to arrive this afternoon for FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas.

FUSION 2010


Tina Kidd of IAPP/IARP (pictured, left) and Jane Souza of TAWPI welcomed exhibitors at FUSION 2010 this afternoon at the Gaylord Texan Resort & Convention Center in Grapevine, Texas.

FUSION 2010


today Magazine Editor Mark Brousseau and Debby Kristofco of ibml take a break from setting up for FUSION 2010 at the Gaylord Texan Resort & Convention Center in Texas to visit the nearby Grapevine Mills Mall.

FUSION 2010

Posted by Mark Brousseau

During an interactive networking luncheon today at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas, attendees shared the best operations tips that they have implemented in the past year. Below are some of the top operations tips shared by attendees:

• Develop an AP Roadshow to visit different departments and operations sites to explain what AP does, what information it needs to do its job effectively, and how departments can work with AP.

• Implement a document imaging and retrieval system for finance documents. Having instant access to document images helped one company eliminate one full-time equivalent.

• Integrate TIN Matching with Oracle.

• Scan your invoices!

• Scan AP documents on the front-end, not the back-end, to achieve more workflow efficiencies.

• Combine your travel and entertainment (T&E) and purchasing card into one card to reduce administration and capture more rebates.

• Leverage remote deposit capture to eliminate trips to the bank.

• Automate, automate, automate!

• Do away with paper checks for T&E. Use debit cards for employees without bank accounts.

• If you have international travelers, educate them on VAT reclamation requirements.

• Trust is not a control! A “trusted employee” could be stealing from your company.

• When choosing a software solution, ask how the vendor initiates upgrades, or you might find yourself back at square one. Also understand whether the vendor will convert existing data.

• Eliminate, automate, delegate -- EAD!

• Eliminate as much paper as possible from your workflow.

• Whenever you are implementing new technologies or processes, be sure to get buy-in from line-level staff.

• Strive for open communication with your staff.

• Learn to walk away!

Wednesday, May 5, 2010

Letter from the President

As the economy begins to show hopeful signs of recovery, organizations are shifting their focus from survival to success. Critical to this success will be the ability of organizations to drive innovation to gain a competitive advantage. This battle will be won on the frontlines, by corporate foot soldiers.

Unfortunately, workforce learning and development has long been a victim of corporate budget cuts and scrutiny due in large part to the subjective interpretation of its business impact, notes Aberdeen.

For instance, many companies have cut technology staff levels too deeply, making it challenging for IT departments to keep pace with demands, warns Dave Willmer, executive director of Robert Half Technology. "Although businesses may be able to operate with stretched teams in the short term, being perpetually understaffed isn't sustainable and can detract from the overall productivity."

Willmer's right.

In this increasingly competitive global business environment, where organizations face mounting pressure not only to improve productivity, but also to capitalize on internal know-how and subject matter expertise, development and innovation will take on heightened organizational importance.

"An organization will only go as far as its leadership's ability to lead employees," says Elizebeth Varghese of Aon Consulting. "History has shown a new genre of competitors has risen from each economic crisis, capitalizing on innovative ways to do business. Conversely, those organizations that have focused on 'just getting by' lose market share and may eventually disappear. Those employers turning their attention to building a focused workforce in 2010 ... will see a 'bottom-line' benefit."

"To excel during changing times and the economic recovery, we believe organizations must take an 'offensive' approach, implementing talent strategies dedicated to driving innovation," adds Jeff Schwartz, principal, Human Capital, Deloitte Consulting, LLC. This is where technology comes in.

For example, solutions that integrate enterprise information can provide employees with critical, cross-database information, such as transaction data, in real-time. This information, in turn, can give today’s knowledge workers the ability to make higher-quality, more rapid decisions, because they can base their decision-making on higher-quality, more up-to-date information, notes Aberdeen.

High performing organizations increasingly see these types of knowledge management systems as critical to ensuring employees have the information and tools they need to do their jobs, according to the results of a recent survey by CCH, a Wolters Kluwer business.

"The numbers are staggering," says CCH President Mike Sabbatis. "It's estimated that knowledge workers spend 15 to 35 percent of their time looking for information they need to do their jobs, and 40 percent of the time they never find it." High performing organizations are taking steps to ensure this productivity drain is stopped. Today, 32 percent use knowledge management systems, and the rate of adoption is expected to exceed 50 percent in three years, according to the CCH survey.

Several factors are driving this strong demand for knowledge management solutions, including continued staffing challenges and the demand for increased productivity, Sabbatis explains.

You can add the desire for innovation to that list.

Regardless of the economy, TAWPI will continue to provide its members -- many of whom are on the frontlines of their industry every day -- with the educational resources and tools they need to drive innovation and best practices across their organizations. From in-person events and actionable studies and reports, to industry councils and online resources, we are committed to facilitating the exchange of ideas that make businesses smarter. And we have even more resources planned.

TAWPI is excited about the future that lies ahead. Thanks for joining us on the journey!

Sincerely,

Frank Moran
President
TAWPI

Tuesday, May 4, 2010

Group says legislation threatens electronic commerce

Posted by Mark Brousseau

Reps. Rick Boucher (D-Va.) and Cliff Stearns (R-Fla.) today unveiled draft legislation aimed at improving online privacy that would impose new rules on companies that collect individual data on the Internet. But technology analysts at the Competitive Enterprise Institute warned that the proposed bill would actually harm consumers and hinder the evolution of online commerce.

“Substituting federal regulations for competitive outcomes in the online privacy arena interferes with evolution of the very kind of authentication and anonymity technologies we urgently need as the digital era evolves,” argues Wayne Crews, vice president for Policy.

“Today, businesses increasingly compete in the development of technologies that enhance our privacy and security, even as we share information that helps them sell us the things we want. This seeming tension between the goals of sharing information and keeping it private is not a contradiction -- it’s the natural outgrowth of the fact that privacy is a complex relationship, not a ‘thing’ for governments to specify for anyone beforehand,” Crews states.

“This legislation flips the proper definition of privacy on its head, wrongly presuming that individuals deserve a fundamental right to control information they’ve voluntarily disclosed to others online. But in the digital world, information collection and retention is the norm, not the exception. Privacy rights, where they exist, arise from voluntary privacy policies. The proper role of government is to enforce these policies, not dictate them in advance,” argues Ryan Radia, associate director of Technology Studies.

“If Rep. Boucher wants to strengthen consumer privacy online, he should turn his focus to constraining government data collection, which poses a far greater privacy threat than private sector data collection. A good starting point would be reexamining the Electronic Communications Privacy Act, the outdated 1986 law that governs governmental access to private communications stored online. Strengthening these privacy safeguards, as a broad coalition of companies and activist groups are now urging, will empower firms to offer stronger privacy assurances to concerned users,” Radia states.

What do you think?

FUSION Preview

Posted by Mark Brousseau

Join me Wednesday, May 12 at 3:15 for an interactive panel discussion that will drill down into the findings of a groundbreaking AP Benchmarking Survey from International Accounts Payable Professionals (IAPP), American Productivity and Quality Center (APQC), and PRGX.

Our panelists will offer their insights on the findings, as well as advice on what AP operations should do to become top performers themselves. Topics covered will include automation strategies, AP best practices, and the business models with the biggest payoff. You might be surprised by the findings! There will be plenty of opportunity for attendees to ask questions and share their own strategies for AP effectiveness and efficiency.

Moderator:
Mark Brousseau, Brousseau and Associates

Panelists:
Tom Bohn, CEO, IAPP/IARP
Evert Hulleman, Managing Director, Advisory Services, PRGX USA, Inc.
Neville Sokol, Sr. Advisor Research Services, APQC

FUSION Preview

Posted by Mark Brousseau

They say breakfast is the most important meal of the day.

In this case, it is critical to the future success of your AP operation.

Join IAPP CEO Tom Bohn and PRGX President and CEO Romil Bahl at 7:45 a.m. on Tuesday, May 11, as they unveil the results of a groundbreaking new tool for measuring the efficiency and effectiveness of AP operations.

The AP Productivity Index, developed by IAPP, PRGX and APQC, is the first tool of its kind to gauge the impact of AP metrics such as cost, staff productivity, turnaround time, and error rates.

During this breakfast presentation, Bahl will share the findings of the index, providing qualitative information on the practices, business models and attributes of AP Top Performers. The results may surprise you! Bahl also will arm CFOs and AP leaders with questions they can take back to their operations to identify areas for improvement. You'll also learn how you can participate in the index.

So set your alarm, and set your future success in motion!

FUSION Preview

Posted by Mark Brousseau

Coming to Dallas early for FUSION 2010?

Join me for a special networking lunch on Sunday, May 9, beginning at 11:30 a.m. in Dallas 6 and 7 at the Gaylord Texan Resort & Convention Center. The lunch is open to all FUSION registrants.

During this complimentary lunch, you will have an opportunity to meet other FUSION attendees in a casual and comfortable setting -- and take away some great tips for improving your operations.

Prizes also will be awarded.

Have lunch with us and get your FUSION experience off to a great start!

Monday, May 3, 2010

FUSION Preview

Posted by Mark Brousseau

Next week at FUSION 2010 at the Gaylord Texan Resort & Convention Center in Grapevine, Texas, J&B Software will demonstrate its AP automation solution, I'm told.

J&B’s AP automation solution converts paper invoices and associated documents into electronic images. Optical character recognition (OCR) technology turns the captured invoice fields into usable data, drawing on ERP information, sophisticated algorithms and an extensive knowledge base. Following capture, the invoices and data are routed to the appropriate staff member for review.

The automated distribution of discrepancy items for review is based on a sophisticated business rules engine that is configured according to individual company requirements. This speeds approval workflows while providing proper documentation and audit trails. The system’s dashboard controls also provide management with the tools necessary to adjust priorities, exception handling and bottlenecks as desired. Once invoices have been verified and completed for payment, they are exported to the back-end ERP or accounting system for posting and easy access.
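As an illustration of how a configurable business rules engine might route discrepancy items, here is a generic sketch; the rules, thresholds and queue names are hypothetical assumptions, not J&B’s actual implementation.

```python
# Generic sketch of rule-based routing for invoice discrepancies -- not J&B's
# actual engine. Rules, thresholds and queue names are hypothetical and would
# be configured per company.

ROUTING_RULES = [
    # (condition on the invoice, review queue it routes to)
    (lambda inv: inv["amount"] > 10000,               "senior_approver"),
    (lambda inv: inv.get("po_number") is None,        "po_matching_team"),
    (lambda inv: inv["ocr_confidence"] < 0.85,        "data_entry_review"),
]

def route_invoice(invoice):
    """Return the first matching review queue, or straight-through processing."""
    for condition, queue in ROUTING_RULES:
        if condition(invoice):
            return queue
    return "auto_post"   # no discrepancies: export to the ERP for posting

# Example:
print(route_invoice({"amount": 420.00, "po_number": "PO-1187", "ocr_confidence": 0.97}))
# -> auto_post
```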

J&B says its AR automation solution reduces manual processing, increases transparency and automates workflow so staff can focus on error and exception handling. Many organizations have departments that operate in silos, making it difficult to integrate processes and deliver a complete set of documents related to a specific transaction, the vendor notes. J&B’s AR automation solution addresses these pain points by unifying workflow, freeing staff from manual tasks and allowing them to concentrate on exception, deduction and dispute management to accelerate resolution.

The solution consolidates payments and remittance documentation from a variety of complex paper and electronic input channels into a single portal, J&B says. The captured data is then verified against existing ERP payment data with an advanced cash application rules engine that decreases manual processing and helps avoid errors such as misapplied payments. After data has been verified or exceptions have been resolved, the system transfers the applied cash and associated transaction for posting by single or multiple ERP or back-end systems.
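Below is a simplified sketch of what a cash application rules engine does when it verifies a captured payment against ERP open items; the matching criteria shown are illustrative assumptions, not J&B’s actual logic.

```python
# Illustrative cash-application matching -- not J&B's rules engine. The matching
# criteria here (exact invoice number, then amount within a tolerance) are
# simplified assumptions.

def apply_payment(payment, open_invoices, amount_tolerance=0.01):
    """Try to match a remittance to an ERP open item; return it or flag an exception."""
    for invoice in open_invoices:
        if payment.get("invoice_number") == invoice["invoice_number"]:
            return ("applied", invoice)
    for invoice in open_invoices:
        if abs(payment["amount"] - invoice["balance"]) <= amount_tolerance:
            return ("applied", invoice)
    return ("exception", None)   # route to deduction/dispute management

open_items = [{"invoice_number": "INV-2041", "balance": 1250.00}]
print(apply_payment({"invoice_number": "INV-2041", "amount": 1250.00}, open_items))
# -> ('applied', {'invoice_number': 'INV-2041', 'balance': 1250.0})
```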

In all, more than 140 vendors will be showcasing AP, AR, payments and document automation solutions in the expo hall at FUSION 2010.