Posted by Mark Brousseau
What do Lehman Brothers, AIG, Merrill Lynch, Washington Mutual, Arthur Andersen, Starbucks, and Toyota all have in common? All went gunning for business growth but instead ended up with self-inflicted wounds. Each of these companies pursued the wrong kind of growth for the wrong reasons. If you are considering trying to grow your business to beat the pressures of the down economy, or are caving to the popular "grow or die" influence of Wall Street, Ed Hess asks that you think before you grow.
"Most business executives accept without question the belief that growth is always good, that bigger is always better, and that the healthy vital signs for a public company include growth that is continuous, smooth, and linear," says Hess, a professor at the University of Virginia's Darden Graduate School of Business and author of the new book Smart Growth: Building an Enduring Business by Managing the Risks of Growth. "The problem with those presumptions is that there is no scientific or business basis for them."
Hess, a leading authority on business growth, knows this firsthand: he has conducted extensive research with both public and private companies. The hard data shows that above-average, long-term growth (five years or more) by public companies is the exception, not the rule, occurring in fewer than 10 percent of the companies studied.
"For the vast majority of companies, growth is often pursued in a way that brings with it as many risks of failure as chances of success," notes Hess. "Combine unquestioned strategic presumptions with bad judgment—and sometimes a fair share of greed and arrogance—and the results can be serious or fatal to the viability of a business."
What are some of the self-inflicted wounds premature growth can leave on your company? Hess outlines a few:
Growth can create new business risks. Growth is a business strategy that can require investments in people, equipment, raw materials, space, and supplies. As these cash outlays occur before new revenues kick in, many businesses find themselves exhausting their cash reserves—a risky tightrope to walk.
"Starbucks is a great example of a company that learned this lesson the hard way," says Hess. "Previously the poster child of a successful, well-respected business, a new executive team decided that continuous, quarterly store expansion was necessary to prove to Wall Street how committed the company was to growth. Aggressive plans did indeed increase the number of new stores being opened each month, but many were in unprofitable locations that eventually had to be closed. The result was bad press, a diluted customer value proposition, and, equally troublesome, the sudden need to take on massive and unprecedented short-term debt. A change in senior management and a public mea culpa showed that, in the pursuit of growth, Starbucks had instead weakened itself as a business, at least for a time."
Growth can force you into the big leagues before you are ready. Growth can match companies up against more experienced players before they truly know how to handle the competition.
Growth can strain your operations. Growth can pose huge challenges for your people, processes, controls, and management capacities, resulting in quality problems and the increased potential for damaged customer relationships and diminished brand perceptions.
"Toyota learned this lesson the hard way," says Hess. "The company maintained an unbridled pursuit of growth over the past decade or so even though it was already a market leader in quality and dependability. It wanted more—to be #1 in sales. That shift in mindset set Toyota down a path where controls were stretched beyond capacity. The results: massive recalls, hundreds of lawsuits, and a damaged brand. Even Toyota's current CEO has acknowledged that the company's problems can be traced to growing too quickly."
Hess's solution for overcoming the risks associated with growth is a concept he calls Smart Growth. Smart Growth accounts for the complexity of growth from the perspective of organization, process, change, leadership, cognition, risk management, employee engagement, and human dynamics. It recognizes that authentic growth is a process characterized by complex change, entrepreneurial action, experimental learning, and the management of risk. It is a strategy that requires companies of all sizes to follow what Hess calls the "4Ps of Growth":
Plan for growth before kicking the strategy into gear. Think about how growth will change what you need to do. What new processes, controls, and people will be needed at what cost?
Prioritize what changes or additions to the business have to be made to accommodate the growth. This is a way to make the essential investments first, so as not to deplete cash reserves before new income starts rolling in.
Processes must be put in place to ensure there are adequate financial, operational, personnel, and quality controls for a bigger business. These are like dams on a river: if the water starts flowing faster and with more volume, those dams need to be reengineered to handle it.
Pace growth so as not to overwhelm yourself, your people, and your processes. Growth can be exciting, but it is also almost always stressful. If you underestimate the need for effective change management, and for a phased approach to implementation, you increase chances for failure.
The tools and rigorous governance methods outlined in Smart Growth can help companies along all four parts of the process. For example, Hess's Growth Decision Template can help leaders analyze, illuminate, and devise a plan to manage their growth risks.
Hess also advises all public companies to conduct an annual Growth Risks Audit to review the stresses that growth is placing on the organization, its people, and its processes. Avoiding conflict of interests and striving for objectivity are critical. This annual Audit should be conducted by a senior multi-disciplinary team made up of members who are not rewarded for producing growth results but are rewarded instead for preventing growth risks from creating serious damage to the business.
"CEOs and Boards of Directors face a unique kind of challenge when it comes to planning for smart growth," says Hess. "Sometimes the right decision when it comes to growth is not to pursue it, but it takes a special kind of team to make that decision when shareholders and analysts are clamoring for higher returns each quarter. But smart growth is possible. Successful high-growth companies—such as Best Buy, SYSCO, Walgreens, and Tiffany & Company—have grown through constant improvement in their organizations' DNA, executed by a highly engaged workforce in a positive learning and performance environment.
"What's important to remember is that the goal is not necessarily growth," concludes Hess. "The goal is continuously making your organization better. When you achieve that, growth will happen naturally in due course. That's the way to achieve smart growth."
Wednesday, June 30, 2010
On-premises versus the Cloud
Posted by Mark Brousseau
There is a lot of talk these days about on-premises versus cloud computing. Keyon C. Thomas (keyon@infostreet.com), reseller channel manager at InfoStreet, says the market is shifting:
I talk to VARs/MSPs all day long. One of the things that always shocks me is how many of them don't realize how little money they make selling on-premises technology and how much more they can make selling all-cloud solutions. For some of my VARs, I see an 82% profit increase. If that is not enough to get your attention, I don't know what will. Let me break down why and see if it makes sense to you as well.
So let's first look at a typical on-premises install. You go in and meet with the client to make sure you understand what they need. From there you order from your vendor; in most cases you sell at a set price with your commission built in. You also have to put money into technicians setting everything up (and even if you are the tech, your time is money, because that time could be better spent finding more clients). This is about a five- to six-day deployment, combining time at the client location with preconfiguration at your own. And with most on-premises solutions, there is no recurring revenue unless you get a break-fix support contract. Even then, when something goes amiss you still have to send someone out to the location to fix it, so, again, you are eating into your overall profits.
Now let's look at a cloud install using a combination of SaaS and HaaS. Your client's network should already be in place, just as in the prior example. If you can get the networking contract, you have minimal setup work there. Hardware comes from your HaaS vendor with the deployment specs you provide; this again is a simple install, since all the configuration was done before you received the equipment, decreasing your time at the client location.
The file server, Exchange, SharePoint, and Communicator-like environments can be deployed by the SaaS provider with a couple of clicks, so you don't need a technician on site. There are even SaaS offerings for accounting, the MS Office suite, and industry-specific software. For most clients you're looking at about a one-day deployment. In both cases you are going to have recurring monthly revenue coming in before you add your support contract. When something does go down, if it is hardware you ship it back to the HaaS vendor, and if it is software the SaaS provider takes care of it, so you are not devoting man-hours to it. Quite simply, you are collecting the same if not more money while doing considerably less work per client. This frees you up to get more clients. Where you may only be able to support 7 to 14 on-premises clients, you could support over 100 cloud clients. It just makes sense.
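The economics above can be sketched in a few lines. The margin figures here are hypothetical placeholders (actual numbers vary by vendor, contract, and labor costs); the point is only that lower per-client labor lets the client count, and thus total profit, scale:

```python
# Back-of-envelope comparison of the reseller economics described above.
# All dollar figures are hypothetical; plug in your own margins.

def annual_profit(margin_per_client, clients):
    """Total annual profit given per-client margin and client count."""
    return margin_per_client * clients

# On-premises: higher margin per deal, but deployment and break-fix
# work cap how many clients one team can realistically support.
onprem = annual_profit(margin_per_client=4_000, clients=12)

# Cloud (SaaS + HaaS): smaller recurring margin per client, but far
# less labor per client allows a much larger supportable client base.
cloud = annual_profit(margin_per_client=1_200, clients=100)

print(onprem, cloud)  # 48000 120000
```

Even with a much lower per-client margin in this sketch, the cloud model comes out ahead purely on supportable client count.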
So my question to you is: would you like to explore how to decrease your operational expenses while significantly increasing your bottom line?
Tuesday, June 29, 2010
Taking the Sting Out of ACH Dispute Management
Posted by Mark Brousseau
The financial services landscape is undergoing radical change, with transaction processing rapidly migrating from paper-based to electronic payments. According to the Federal Reserve's 2007 Payments Study, electronic payments now exceed two-thirds of all non-cash payments -- a big change from a decade ago when paper checks were still king. Automated Clearing House (ACH) transactions have been a key to the growth of electronic payments. The number of ACH transactions in 2008 topped 18.2 billion, representing an increase of 1.2 billion over 2007, NACHA reports.
But this ACH growth also has created new back-office challenges, particularly in the area of transaction dispute management. The limitations of traditional in-house ACH systems and the strict time constraints and complex processing requirements imposed by NACHA rules and Regulation E have led to increases in operations expenses and potentially higher charge-offs associated with ACH disputes. And changes in the interpretation of Regulation E -- spelled out by Federal Reserve Bank staff and an OCC Advisory Letter -- may further complicate matters.
The Situation
One of the nation's largest bank ACH processors has taken proactive measures to better manage its ACH disputes. The bank's ACH operations perform a wide range of functions, including daily inbound and outbound transaction management, implementation and maintenance, customer service, compliance, and exception support.
Over the past several years, the bank has achieved significant growth in its ACH transaction volume. Along with this growth in overall ACH transactions, the bank has seen its ACH disputes increase 25 percent during the same period.
To be sure, the overall growth in ACH volumes is a factor in the increasing number of disputes. But customers also are better educated about ACH, and have higher expectations. Regardless of the cause, ACH disputes are costly to manage (with different attributes for each dispute and conflicting Regulation E and NACHA timelines), and present the risk of non-compliance and charge-off losses.
The Solution
Recognizing these challenges, the bank began an evaluation of solutions to better manage its ACH dispute process. At that time, the bank used an ACH processing product that kept transactions online for a short period of time, after which the information was archived to an offline report warehouse. The offline warehouse required the bank to "restore" reports and customer statements, a time-consuming and costly process that prevented the bank from providing quick responses to customer inquiries.
The bank evaluated three options for enhancing its ACH capabilities: further extending its legacy ACH solution's capabilities, developing an in-house solution, or leveraging a hosted solution. Scarce in-house IT resources precluded the bank from extending its legacy solution's warehouse capabilities. Similarly, the bank ruled out developing a custom solution because of competing demands for its limited IT resources and the long time-to-market required to develop an in-house solution.
Ultimately, the bank selected a hosted ACH solution from eGistics based on its compelling business case and its track record in the bank's lockbox operation. Using eGistics, the bank was able to implement an ACH Dispute Management solution faster, more effectively, and more economically than it could using internal resources or its current ACH vendor.
Within a few weeks of selection, eGistics delivered its ACH solution, which supports a range of ACH functions, including dispute research, customer service inquiries (notably, questions about transaction details and debit authorization), and compliance reporting for potential rules violations. eGistics' ACH Dispute Management solution provides a secure, easy-to-use Web interface that enables users to quickly search ACH transactions using a variety of configurable search criteria.
The Benefits
Most important to the bank, the eGistics hosted framework streamlined the research, management, and reporting of ACH transaction disputes. Here's how it works: The bank receives ACH transmission files from the ACH network. A copy of these files is forwarded to the eGistics ACH solution. Operators log in to the eGistics platform and can search for transactions in real time. Transactions can be marked as disputed and then managed through the resolution process. Because there is a single view of the transactions, all operators can see the status of a dispute or inquiry. Finally, each disputed transaction is given a disposition status such as credited, denied, or returned.
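The dispute lifecycle described above can be sketched as a simple state model. The field names and disposition values here mirror the article but are illustrative, not the actual eGistics schema:

```python
# Minimal sketch of the dispute lifecycle described above: mark a
# transaction disputed, track notes, then close it with a disposition.
from dataclasses import dataclass, field
from typing import List, Optional

DISPOSITIONS = {"credited", "denied", "returned"}

@dataclass
class AchTransaction:
    txn_id: str
    amount: float
    disputed: bool = False
    disposition: Optional[str] = None
    notes: List[str] = field(default_factory=list)

    def mark_disputed(self, note: str) -> None:
        """Flag the transaction so every operator sees the open dispute."""
        self.disputed = True
        self.notes.append(note)

    def resolve(self, disposition: str) -> None:
        """Close the dispute with one of the allowed disposition statuses."""
        if disposition not in DISPOSITIONS:
            raise ValueError(f"unknown disposition: {disposition}")
        self.disposition = disposition

txn = AchTransaction("T-1001", 250.00)
txn.mark_disputed("customer reports unauthorized debit")
txn.resolve("credited")
print(txn.disposition)  # credited
```

Because every operator reads the same record, the "single view" of dispute status falls out of the data model rather than requiring report restores from an offline warehouse.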
Streamlined research and management of ACH disputes were part of an overall business case for the bank that included long-term storage, improved customer service, attractive total cost of ownership, minimal internal resources, minimal capital expense, and rapid deployment. And eGistics provided the bank with the peace of mind that its solution complied with industry requirements, was reliable and scalable, ensured the privacy of critical data, and maintained complete access management through transaction tracking, auditing and reporting.
The eGistics hosted solution enhances the bank's dispute management process by providing: real-time distributed data access for any authorized user (even across branches or operations centers); intuitive search capabilities; the ability to annotate disputed transactions with comments; and the ability to export data (such as for batch extracts). The solution also gives the bank expanded search capabilities, including the ability to search on any alphanumeric field (e.g., date, amount, customer) using multiple "operators" (e.g., "contains," "greater than," "less than," "equal to"). The ability to search data on configurable parameters allows the bank to spot trends and react more effectively to unauthorized ACH debits. The filtering capabilities provided by the eGistics research tool also enable the bank to block and restrict access to certain transactions when required. What's more, the bank can store data in the eGistics solution for an unlimited period of time.
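Operator-based search of the kind described above is straightforward to picture. In this sketch the operator names mirror the article, while the transaction fields and data are hypothetical:

```python
# Illustrative operator-based transaction search, as described above.
# The operator names come from the article; the schema is hypothetical.
import operator

OPERATORS = {
    "equal to": operator.eq,
    "greater than": operator.gt,
    "less than": operator.lt,
    "contains": lambda value, needle: needle in str(value),
}

def search(transactions, field, op, target):
    """Return transactions whose `field` matches `target` under `op`."""
    test = OPERATORS[op]
    return [t for t in transactions if test(t[field], target)]

txns = [
    {"date": "2010-06-01", "amount": 125.00, "customer": "Acme Corp"},
    {"date": "2010-06-02", "amount": 980.50, "customer": "Widget LLC"},
]

print(search(txns, "amount", "greater than", 500))   # the Widget LLC debit
print(search(txns, "customer", "contains", "Acme"))  # the Acme Corp debit
```

Combining such filters over any field is what lets an operations team surface patterns, such as a run of unauthorized debits from one originator, without a custom report.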
The functionality delivered by the eGistics solution supports a range of ACH functions at the bank, including: dispute research; customer service inquiries (notably, questions about transaction details and debit authorization); fraud mitigation; and compliance (reporting for potential rules violations).
The Bottom Line
At a time when rising ACH dispute volumes are impacting the back-office operations at banks, one of the largest ACH banks in the United States has achieved significant benefits by moving to a hosted ACH dispute management solution. These benefits include better, faster customer service, more accurate and timelier dispute status and tracking, streamlined ACH operations with lower costs, and reduced losses from charge-offs.
Sunday, June 27, 2010
Migraines costly to productivity
Posted by Mark Brousseau
Employees suffering from chronic migraine (CM) experience increased lost productive time (LPT) in the workplace, according to a new analysis from the American Migraine Prevalence and Prevention Study. LPT is estimated as the average weekly time lost to an employee being absent (absenteeism) plus reduced performance while at work (presenteeism).
Migraine is a neurological syndrome characterized by severe, painful headaches that are often accompanied by nausea, vomiting, and increased sensitivity to light and sound. Headaches may last for hours or even days. The pain is often on one side of the head and pulsating. Headaches may be preceded by aura: sensory warning signs such as flashes of light, blind spots, and tingling in the arms and legs. Migraine sufferers can be divided into those with chronic migraine (CM), headache on average 15 or more days per month, and those with episodic migraine (EM), headache on average fewer than 15 days per month. Of the estimated 30 million Americans who suffer from migraine, approximately one million, mostly women, suffer from CM.
CM sufferers experience greater LPT in the workplace than those with EM. The study showed that among those aged 35-44 years, the LPT of CM sufferers was 215.3 hours higher per year than that of EM sufferers. LPT among CM sufferers increased across age groups, while it remained relatively low and stable among EM sufferers.
Cost estimates likewise increased for CM across age cohorts while remaining relatively constant for EM. On an annual basis, for those aged 35-44 years this translated to LPT costs for CM sufferers being $5,352.36 higher per year than for those with EM. The average cost of LPT per week was based on 2005 census median income estimates.
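The two figures above are consistent with each other: dividing the incremental annual cost by the incremental lost hours implies an hourly rate of about $24.86 (an assumption back-solved here, not a number stated in the study):

```python
# Sanity check on the study's figures: the extra annual cost of CM vs.
# EM should equal the extra lost hours times the implied hourly wage.
extra_lpt_hours = 215.3   # extra hours lost per year, CM vs. EM (ages 35-44)
hourly_wage = 24.86       # implied hourly rate (assumption, back-solved)

extra_cost = round(extra_lpt_hours * hourly_wage, 2)
print(extra_cost)  # 5352.36, matching the figure reported above
```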
According to Dr. Dawn Buse, one of the study's authors and Assistant Professor at Albert Einstein College of Medicine and Director of Behavioral Medicine at the Montefiore Headache Center, "The burden of CM is significant in terms of LPT and related costs. The results from these analyses may even underestimate that burden as these data do not capture those who are unemployed and may have exited the labor force through disability or early retirement, representing a significant loss of trained and skilled people who may exit the labor force early due to the burden of CM."
According to Dr. Richard Lipton, one of the study's authors and Professor of Neurology and Epidemiology and Population Health at Albert Einstein College of Medicine and Director of the Montefiore Headache Center, "The cost of treatment, whether it be to prevent a headache attack or to treat during an attack, may be considerably lower for employers than the costs associated with LPT. Additionally, treatments may ease the suffering of employees, while recovering the labor value of experienced and knowledgeable employees burdened by the symptom and work-related impacts of CM."
Dr. Buse advised, "By understanding the findings of this study, assessing the amount of lost work time their organization is experiencing due to migraine, and taking measures to educate and encourage migraine sufferers to seek treatment, organizations could reduce the amount of LPT and related costs due to migraine, and improve the health and quality of life of their employees."
What do you think?
Monday, June 21, 2010
Tips for moving a data center
Moving a data center? Irwin Teodoro (iteodoro@laurustech.com), director of Systems Integration for Laurus Technologies (www.laurustech.com), says there are four career-limiting mistakes to avoid:
Moving a data center is a major undertaking for most organizations. And a successful move is a nice resume builder for any IT professional. A successful move will showcase skills in large-scale project planning, project management, technology integration and interpersonal communications. The process provides a chance for exposure across the company, as virtually every department is touched by the IT organization (and affected by a data center move) in some way.
However, data center moves can be fraught with career-limiting failures. No IT professional wants to be on the receiving end of memos and discussions describing lost orders, missed deadlines or customer dissatisfaction that occurred because some mission-critical business process was disrupted by a data center move that didn’t run smoothly.
In most relocation projects, there are four critical mistakes to avoid: ignoring the data, combining the move with additional projects, failing to plan appropriately, and not creating an inventory of equipment, applications and processes. Taking time up front to think through each of these will significantly improve your chances for success — and the personal recognition that follows.
Ignoring the data
While IT professionals give plenty of thought to the infrastructure involved in a data center relocation, moving the data itself can be just as taxing. Sometimes even more so. It is easy for IT professionals to lose sight of the data, as many firms have adopted the model that business group leaders own their data: marketing owns the prospect database, operations owns inventory data, and so on. Yet the reality is that the data and its underlying infrastructure must be considered as part of an interconnected, holistic system — not elements that can be taken apart and easily reassembled at will.
The smart IT professional will reach out to business owners before the move to identify information that may be affected and reach agreement on such items as data access, compatibility with new systems, application migration and others. Cleansing data prior to the move may be worthwhile, but definitely not during the move.
Killing two birds with one stone
With all the planning that goes into moving a data center, many IT professionals attempt to combine other projects — often guided by the CFO’s visions of cost savings — into the relocation.
This is probably THE biggest mistake a firm can make. One of our clients recently attempted to combine multiple projects into a garden-variety data center move, and wound up making the project so complex that the timeline slipped past all available slack in the plan. The firm was subject to significant financial penalties for failing to vacate the old facility on time.
Moving a data center is a major project in and of itself. It is not the time to take on virtualization of the computing environment, or incorporation of a new tiered storage philosophy. Move your data center first. If your operations or finance teams insist on trying to combine projects, work with your vendor or reseller on a quote for sequential projects — the fees should not be significantly more. Finally, calculate potential costs associated with the increased complexity of combined projects. Paying an extra month’s rent or failure-to-vacate penalties may wipe out any projected savings from the combined projects.
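That last calculation is worth doing explicitly before agreeing to combine projects. A quick sketch, with every dollar figure invented for illustration:

```python
# Comparing projected savings from bundling projects against the cost of
# the schedule slip that added complexity can trigger. All figures assumed.
combined_savings = 40000   # vendor discount quoted for combining the projects
slip_months = 2            # extra months the combined plan might slip
monthly_rent = 18000       # rent on the old facility per month of overstay
vacate_penalty = 25000     # one-time failure-to-vacate penalty in the lease

slip_cost = slip_months * monthly_rent + vacate_penalty
print(slip_cost, slip_cost > combined_savings)   # 61000 True -- slip wipes out the savings
```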
Incomplete planning
Some IT professionals don’t take the time or effort to prepare a comprehensive plan or complete documentation of their existing data center environments. They either go from memory about which applications run on which servers, or make incorrect assumptions on equipment that may or may not be in use. Relying on memory practically guarantees that a key server or application won’t get moved correctly.
Smart project managers take a comprehensive approach to planning: not only working a baseline “best case” plan to accomplish the project goals, but putting significant upfront time into risk management planning. Most failures are due to a lack of foresight — nobody thought the disaster that just hit your data center move could happen, so nobody planned for it.
Spend the time and meet with business owners across the organization and your IT team. Identify some of the worst-case scenarios that could occur in your data center move. When you think you’ve identified them all, brainstorm some more. Assess the likelihood of each scenario and the potential business impact and make contingency plans. Disasters may not happen, but you’ll be in a far better place if they do.
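One simple way to rank those scenarios is a likelihood-times-impact score. A minimal sketch (the scenarios and scores below are invented examples, not from the article):

```python
# Minimal risk register: score each scenario as likelihood x impact
# (both on a 1-5 scale), then plan contingencies for the top scores first.
risks = [
    {"scenario": "Truck delayed in transit", "likelihood": 3, "impact": 4},
    {"scenario": "New facility power not certified", "likelihood": 2, "impact": 5},
    {"scenario": "Key application fails post-move testing", "likelihood": 3, "impact": 5},
]
for r in risks:
    r["score"] = r["likelihood"] * r["impact"]

for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(r["score"], r["scenario"])
```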
Forgetting to create a complete inventory
It should go without saying that any IT professional worth their credentials will have developed strong project plans, with plenty of slack time. However, don’t forget to develop a comprehensive inventory of every server, application, networking connection, storage array and everything else in your data center before you start to move. Work with business leaders across the enterprise to make sure you include everything. And be certain to have a well-documented list of everything in the current data center before planning for a new one.
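A useful sanity check on such an inventory is that every dependency an asset names is itself inventoried. A minimal sketch, with invented asset names:

```python
# Inventory completeness check: flag any dependency that is referenced
# by an asset but never appears in the inventory itself. Names are illustrative.
inventory = {
    "db-srv-01":  {"type": "server", "owner": "Operations", "depends_on": ["san-01", "net-core"]},
    "san-01":     {"type": "storage array", "owner": "IT", "depends_on": ["net-core"]},
    "net-core":   {"type": "network switch", "owner": "IT", "depends_on": []},
    "app-srv-02": {"type": "server", "owner": "Marketing", "depends_on": ["db-srv-01", "lb-01"]},
}
missing = {dep for item in inventory.values()
               for dep in item["depends_on"] if dep not in inventory}
print(missing)   # {'lb-01'} -- lb-01 is referenced but was never inventoried
```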
Being lax or unprepared in moving data could have devastating results. While company expectations are high for improved performance from a new infrastructure environment, data quality may suffer if it is quickly moved as an afterthought. At best, critical data may be temporarily unavailable. At worst, records could be permanently lost. For companies that increasingly rely on data, the ramifications range from abandoned shopping carts and immediate loss of sales to long-term damage to customer relationships and company reputations.
Develop plans to relocate data with as much care as you put into hardware and building projects. And make sure you work with data owners across the enterprise. By following these recommendations, you’ll wish you could move more often.
What do you think?
Saturday, June 19, 2010
Tips to reduce risk and liability using EMRs
Posted by Mark Brousseau
According to the 2010 Healthcare Information and Management Systems Society (HIMSS) Analytics Report: Security of Patient Data, the share of healthcare organizations that reported a breach in data security increased by six percentage points in 2010, to a total of 19 percent. As more healthcare organizations migrate to electronic medical records (EMRs), it’s important to take the proper steps to reduce risk and prevent medical liability suits.
Cintas offers the following tips for maintaining secure and compliant EMRs:
1. Collaboration. The most successful, secure medical healthcare record programs are the result of a collaborative process. In hospitals, it’s critical to include the chief security officer, chief financial officer, chief medical officer and medical records director to outline and define a comprehensive program that meets the needs of the entire organization and provides maximum security for patient files. Likewise, smaller healthcare organizations must include relevant senior staff members to develop and execute a successful program.
2. Digitize information. Digitizing healthcare records is the first step to ensure compliance with evolving industry regulations. By partnering with a vendor that provides secure document imaging and scanning services, physicians and clinicians will have real-time access to a patient’s entire medical history. Further, healthcare organizations will increase security through unique user identification to prevent unauthorized access and minimize risk of regulatory exposure, fines and penalties.
3. Create a strict security policy with password restrictions. Ensure authorized physicians and staff members have their own passwords and are unable to share. This will ensure an accurate audit trail if an incident is to occur. It’s also important to limit access to records. Create different levels of security based on the job functions of staff members. Only those working directly with the patient should have the ability to modify records.
4. Protect healthcare records throughout their entire lifecycle. Since medical records require long-term retention with a low volume of retrieval, it’s important to utilize a secure document management provider that has the capability to protect patient data from cradle to grave. By selecting a vendor that provides imaging, storage and shredding services, a healthcare organization can ensure both its electronic and physical medical records live in a secure environment and can be properly destroyed if required.
5. Train staff regarding proper documentation and retention practices. Incomplete and improper documentation and retention may lead to damaging financial and compliance issues. In addition, a staff member associated with improper documentation may be held liable in a malpractice case. To protect the organization and its staff against allegations of negligent care and compliance violations, it’s important to provide continuous training to ensure that files are always complete, securely maintained and properly destroyed if required.
6. Have a disaster recovery program in place. Catastrophic events can and will take place. It is critical to ensure a hospital’s digital repository is backed up and can be recreated if necessary.
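The access-level idea in tip 3 can be sketched as a simple role-to-permission mapping. The roles and rules below are illustrative assumptions, not part of the Cintas recommendations:

```python
# Role-based record access: each role gets a set of permissions, and
# "modify" additionally requires working directly with the patient.
ROLE_PERMISSIONS = {
    "attending_physician": {"read", "modify"},
    "nurse":               {"read", "modify"},
    "billing_clerk":       {"read"},
    "administrator":       {"read"},
}

def can_modify(role, treats_patient):
    # Only staff working directly with the patient may modify records.
    return treats_patient and "modify" in ROLE_PERMISSIONS.get(role, set())

print(can_modify("attending_physician", treats_patient=True))   # True
print(can_modify("billing_clerk", treats_patient=True))         # False
```

Because every user logs in under an individual, non-shared password, each modification is attributable to one person, which is what makes the audit trail in tip 3 meaningful.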
“As more healthcare organizations adopt EMR systems, it’s important to identify and work to alleviate potential risks before they occur,” said Tom Griga, Global Healthcare Manager, Cintas Document Management. “Healthcare Risk Management Week is an optimal time to reflect on your organization’s practices to ensure it is using up-to-date efficient and secure processes to protect patients and the organization from falling victim to a data breach.”
Thursday, June 17, 2010
Organizations focus on process improvement
Posted by Mark Brousseau
There is a “huge uptick” in the number of companies evaluating document management solutions – particularly when it comes to enterprise content management (ECM), KeyMark CEO Jim Wanner told attendees at the KeyMark Horizons Conference 2010 at The Hyatt Downtown in Greenville, SC.
“If you look at the data from Wells Fargo, the amount of software sales was negatively affected by the recession in the fourth quarter of 2008 and the first quarter of 2009,” Wanner said. “Then you see a massive swing in software sales, largely driven by companies looking for improved processes.”
Today, with the economy showing signs of improvement, and companies making the first moves toward hiring more staff, buyers are looking hard at their core systems to figure out ways to improve business processes, Wanner said. One primary area of focus: reporting. “Companies don’t want to be caught off guard anymore. They want to know exactly what’s going on at each moment in time. We’ve always had departmental reporting. But companies want executive reporting via dashboards, with SharePoint as a portal that interfaces with a lot of different systems,” Wanner told attendees.
There also is a huge push towards enterprise automation. “We have had more conversations about ECM in the last six months than at any time in our company’s history,” Wanner said. “This is a significant change. It is being driven by new hardware that makes it easier to do distributed scanning, and by software with streamlined interfaces that make it easier to push out applications.”
“People do not view ECM as a departmental solution anymore,” Wanner concluded.
Document classification, which automates document identification, also is gaining traction. “This is the wave of the future,” Wanner said. “It is good for most core applications, such as financial, insurance and government, and is delivering accuracy greater than 80 percent in many cases. When combined with mailroom equipment – document classification makes the digital mailroom a reality.”
Wednesday, June 16, 2010
Why PDF?
Posted by Mark Brousseau
What's the purpose of PDF? Why can't you just send Word or Excel files? And why should you bother converting to PDF? Duff Johnson (duffjohnson@appligent.com), CEO of Appligent Document Solutions, explains:
Very few “love” PDF, but we all need it, because PDF is electronic paper.
For the efficient and reliable delivery of final-form electronic documents, there's nothing else quite like a PDF file.
For business and government organizations, “posting the PDF” is now essentially THE physical act of publication. Pretty much everyone with a computer is assumed to have a PDF reader; it's a standard assumption in millions of interactions between consumers, business and government every day. Hundreds of millions of people “PDF it” when they want to share some content.
Current squabbles between the two companies aside, even Apple's display technology is based on Adobe's PDF.
So how did electronic paper get defined as PDF?
Fundamentally, it's all about portability. Reliable viewing and printing across platforms is one of the great Killer Apps of all time.
There are other technologies that deliver some of PDF's complete package, but PDF is built from the outset to work the same way in all places, period. It turns out that's the most important thing of all. There are a set of very specific reasons why PDF is the world's choice for electronic paper. No other format offers this combination of attributes.
Easy to make and share
Sure, you can send a Word, HTML, PowerPoint or any other file. But other formats, while just as easy to attach to an email, aren't quite as easy to share as PDF.
First and foremost, you can't be sure the recipients have the same version of PowerPoint (or whatever you are sending). You may not want to give them the ability to edit the document, and you don't want the hassle of passwords. Making a PDF is usually just a click or two, and for that amount of effort, it's clearly a smart move.
A typical Acrobat or Reader user doesn't think about their choice to use PDF at a fundamental level. They make, send and use PDF files precisely because, hey – why worry? PDF just works.
WHY YOU MIGHT CARE: Who doesn't like it easy?
Reliable, manageable presentation
There's just no excuse for poor presentation. From elaborate graphic-design to simply making sure the page-breaks happen just the way you've set it up, PDF delivers you from worrying about what it's going to look like or print on the other end.
Other formats might not look quite the same when opened on different machines, or can't be opened on a Mac. There may be font dependencies, or differing page sizes or other application or user settings that affect appearance. There may be undesirable information such as slide-show notes, metadata or track-changes information that's really a part of the file, and you might not want to share it!
Not only does PDF provide a completely faithful, high-fidelity rendering of your source document, but you can mix and match it with other documents from other sources. There's detailed management of all sorts of document functions, navigation features, accessibility and more, and it's all just ready to go, for users on every platform, inside of each and every PDF file.
WHY YOU MIGHT CARE: PDF delivery is entirely manageable and utterly predictable.
Convert from any source, use in every workflow
PDF files may be created from any application that can print, including desktop publishing, office software, design, database report and other applications. PDF files may also be produced from scanners, either with or without searchable text via OCR. You can even take a screen-shot and convert it to PDF and combine it with other PDF pages.
WHY YOU MIGHT CARE: Users can learn to make PDF files from any software in seconds, and every PDF file works with every other PDF file, so they can be shuffled and reorganized like... paper pages.
Smaller file-size, yet fully searchable
When converting to a PDF file, it's usually possible to reduce the file-size substantially below that of the original source files. Even for scanned documents, conversion to PDF generally means smaller files - and more importantly, scanned pages can be made into searchable PDFs.
WHY YOU MIGHT CARE: Although hard drives are getting larger and larger, a 195 KB PDF file is usually preferred over a 2.95 MB Word file, especially if users aren't expected to edit it.
Self-contained
Unlike most authoring formats, a properly-made PDF file includes all content, fonts, images, structure, signatures, encryption, scripts and other resources necessary to the appearance and proper function of the file in an ISO 32000 conforming reader.
PDF just works everywhere; it has no server or style-sheet dependencies, and each page may be extracted into its own self-contained PDF file.
WHY YOU MIGHT CARE: Self-contained files are inherently rugged and adaptable: they can go offline, be emailed, FTPed or accessed in any preferred manner, always with the same result.
Makes content from any source accessible to users with disabilities
One of the great beauties of PDF is the ability to make almost any source content accessible to users with disabilities who must use Assistive Technology (AT) in order to read. From scanned documents to drawings, diagrams and multilingual content, PDF files may be tagged to provide a complete, high-quality reading and navigating experience.
Many applications can't generate accessible content by themselves, but converted to PDF, these documents may be structured and tagged for complete accessibility.
WHY YOU MIGHT CARE: For Federal agencies and contractors, Section 508 requires that electronic documents be accessible. Other jurisdictions are beginning to adopt similar regulations, and many businesses are choosing to post accessible content.
A multiplatform International Standard
PDF is a truly multiplatform technology, and it's here to stay. PDF is equally at home on Windows, Mac OS X, Linux, UNIX, Android and any other operating system.
No-one has ever had to pay Adobe a royalty to make PDF files, and the company has published the PDF Reference since the beginning. Until recently, Adobe kept the copyright and updated the Reference, the “rules of the road” for PDF, as they wished.
In 2008, Adobe ceded control of the PDF specification to ISO, the International Organization for Standardization. Now known as ISO 32000, PDF is an International Standard; it is no longer owned by Adobe Systems but is managed by diverse members of the electronic document industry, with free and open access to all interested parties as observers or full voting members.
WHY YOU MIGHT CARE: While PDF is everywhere, one lingering doubt for some has been the idea that Adobe Systems “owned” PDF and therefore, adopting PDF for critical business functions would create a vulnerability. Turning over PDF to ISO is the categorical solution to this concern – Adobe Systems or no, PDF is here to stay, and no-one owns your PDF files except you.
What do you think?
What's the purpose of PDF? Why can't you just send Word or Excel files? And why should you bother converting to PDF? Duff Johnson (duffjohnson@appligent.com), CEO of Appligent Document Solutions, explains:
Very few “love” PDF, but we all need it, because PDF is electronic paper.
For the efficient and reliable delivery of final-form electronic documents, there's nothing else quite like a PDF file.
For business and government organizations, “posting the PDF” is now essentially THE physical act of publication. Pretty much everyone with a computer is assumed to have a PDF Reader; it's a standard assumption in millions of interactions between consumers, business and government everyday. Hundreds of millions of people “PDF it” when they want to share some content.
Current squabbles between Apple and Adobe aside, even Apple's display technology is based on Adobe's PDF.
So how did electronic paper get defined as PDF?
Fundamentally, it's all about portability. Reliable viewing and printing across platforms is one of the great Killer Apps of all time.
There are other technologies that deliver some of PDF's complete package, but PDF was built from the outset to work the same way in all places, period. It turns out that's the most important thing of all. There is a set of very specific reasons why PDF is the world's choice for electronic paper; no other format offers this combination of attributes.
Easy to make and share
Sure, you can send a Word, HTML, PowerPoint or any other file. But other formats, while just as easy to attach to an email, aren't quite as easy to share as PDF.
First and foremost, you can't be sure the recipients have the same version of PowerPoint (or whatever you're sending). You may not want to give them the ability to edit the document, and you don't want the hassle of passwords. Making a PDF is usually just a click or two, and for that amount of effort, it's clearly a smart move.
A typical Acrobat or Reader user doesn't think about their choice to use PDF at a fundamental level. They make, send and use PDF files precisely because, hey – why worry? PDF just works.
WHY YOU MIGHT CARE: Who doesn't like it easy?
Reliable, manageable presentation
There's just no excuse for poor presentation. From elaborate graphic design to simply making sure the page breaks fall just the way you set them up, PDF frees you from worrying about how your document will look or print on the other end.
Other formats might not look quite the same when opened on different machines, or might not open on a Mac at all. There may be font dependencies, differing page sizes, or other application or user settings that affect appearance. There may also be undesirable information, such as slide-show notes, metadata, or tracked changes, that's really a part of the file and that you might not want to share.
Not only does PDF provide a completely faithful, high-fidelity rendering of your source document, but you can mix and match it with other documents from other sources. There's detailed management of all sorts of document functions, navigation features, accessibility and more, and it's all just ready to go, for users on every platform, inside of each and every PDF file.
WHY YOU MIGHT CARE: PDF delivery is entirely manageable and utterly predictable.
Convert from any source, use in every workflow
PDF files may be created from any application that can print, including desktop publishing, office software, design, database report and other applications. PDF files may also be produced from scanners, either with or without searchable text via OCR. You can even take a screen-shot and convert it to PDF and combine it with other PDF pages.
WHY YOU MIGHT CARE: Users can learn to make PDF files from any software in seconds, and every PDF file works with every other PDF file, so they can be shuffled and reorganized like... paper pages.
Smaller file-size, yet fully searchable
When converting to a PDF file, it's usually possible to reduce the file-size substantially below that of the original source files. Even for scanned documents, conversion to PDF generally means smaller files - and more importantly, scanned pages can be made into searchable PDFs.
WHY YOU MIGHT CARE: Although hard drives keep getting larger, a 195 KB PDF file is usually preferred over a 2.95 MB Word file, especially if users aren't expected to edit it.
Self-contained
Unlike most authoring formats, a properly-made PDF file includes all content, fonts, images, structure, signatures, encryption, scripts and other resources necessary to the appearance and proper function of the file in an ISO 32000 conforming reader.
PDF just works everywhere; it has no server or style-sheet dependencies, and each page may be extracted into its own self-contained PDF file.
WHY YOU MIGHT CARE: Self-contained files are inherently rugged and adaptable, for example, they can go offline, be emailed, FTPed or accessed in any preferred manner, always with the same result.
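That self-contained quality runs all the way down to the file format itself: a valid one-page PDF can be assembled from a handful of objects with no external resources at all. The sketch below (Python, illustrative only; it builds a blank page, not a production-quality file) shows the header, body objects, cross-reference table, and trailer that every PDF carries with it:

```python
def minimal_pdf() -> bytes:
    """Build a one-page blank PDF entirely in memory, with no libraries."""
    objects = [
        b"<< /Type /Catalog /Pages 2 0 R >>",
        b"<< /Type /Pages /Kids [3 0 R] /Count 1 >>",
        b"<< /Type /Page /Parent 2 0 R /MediaBox [0 0 612 792] >>",
    ]
    out = bytearray(b"%PDF-1.4\n")
    offsets = []
    for i, body in enumerate(objects, start=1):
        offsets.append(len(out))                      # byte offset of each object
        out += b"%d 0 obj\n" % i + body + b"\nendobj\n"
    xref_pos = len(out)
    out += b"xref\n0 %d\n" % (len(objects) + 1)
    out += b"0000000000 65535 f \n"                   # the required free entry
    for off in offsets:
        out += b"%010d 00000 n \n" % off              # 20-byte xref entries
    out += (b"trailer\n<< /Size %d /Root 1 0 R >>\nstartxref\n%d\n%%%%EOF\n"
            % (len(objects) + 1, xref_pos))
    return bytes(out)

pdf = minimal_pdf()
print(pdf[:8])  # b'%PDF-1.4'
```

Everything a conforming reader needs, including the byte offsets of the objects, travels inside the file; nothing points at a server, stylesheet, or sibling file.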
Makes content from any source accessible to users with disabilities
One of the great beauties of PDF is the ability to make almost any source content accessible to users with disabilities who must use Assistive Technology (AT) in order to read. From scanned documents to drawings, diagrams and multilingual content, PDF files may be tagged to provide a complete, high-quality reading and navigating experience.
Many applications can't generate accessible content by themselves, but converted to PDF, these documents may be structured and tagged for complete accessibility.
WHY YOU MIGHT CARE: For Federal agencies and contractors, Section 508 requires that electronic documents be accessible. Other jurisdictions are beginning to adopt similar regulations, and many businesses are choosing to post accessible content.
A multiplatform International Standard
PDF is a truly multiplatform technology, and it's here to stay. PDF is equally at home on Windows, Mac OS X, Linux, UNIX, Android and any other operating system.
No one has ever had to pay Adobe a royalty to make PDF files, and the company has published the PDF Reference since the beginning. Until recently, though, Adobe kept the copyright and updated the Reference, the “rules of the road” for PDF, as it wished.
In 2008, Adobe ceded control of the PDF specification to the International Organization for Standardization (ISO). Now known as ISO 32000, PDF is an International Standard; it is no longer owned by Adobe Systems but is managed by diverse members of the electronic document industry, with free and open access to all interested parties as observers or full voting members.
WHY YOU MIGHT CARE: While PDF is everywhere, one lingering doubt for some has been the idea that Adobe Systems “owned” PDF and that adopting PDF for critical business functions would therefore create a vulnerability. Turning PDF over to ISO is the categorical answer to this concern: Adobe Systems or no, PDF is here to stay, and no one owns your PDF files except you.
What do you think?
Monday, June 14, 2010
Time to re-evaluate the rationale for SaaS?
Posted by Mark Brousseau
Software as a service (SaaS) will have a role in the future of IT, but not the dominant future that was first thought, according to Gartner, Inc. Organizations should carefully assess their software needs in light of the promises SaaS has actually delivered on.
“In 2009, within enterprise applications, SaaS represented 3.4 percent of total enterprise spending, slightly up from 2008 at 2.8 percent,” said David Cearley, vice president and fellow at Gartner. Gartner predicts that the global enterprise applications software market will reach $8.8 billion in 2010.
From a market perspective, most of the spending for SaaS is occurring in content, collaboration and communication and the customer relationship management markets. Collectively, they represented 65 percent of the global enterprise applications software market in 2009.
Many of the bad practices that occurred in the on-premises world are now moving their way into SaaS. The biggest example is shelfware. “Shelfware as a service is the concept of paying for a software subscription that is not being accessed by an end user,” said Cearley. “This most commonly occurs in large organizations, but it could happen to any company, especially those that have downsized their workforce, or one that has oversubscribed to trigger a volume discount.”
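The cost of shelfware as a service is easy to quantify. Here is a back-of-the-envelope sketch; the seat counts and per-seat price are hypothetical, chosen only to illustrate the arithmetic:

```python
def shelfware_cost(seats_paid: int, seats_active: int,
                   price_per_seat_month: float, months: int = 12) -> float:
    """Annual spend on subscription seats nobody is actually using."""
    idle = max(seats_paid - seats_active, 0)   # oversubscribed seats only
    return idle * price_per_seat_month * months

# A firm that licensed 500 seats at $65/month to trigger a volume
# discount but, after downsizing, has only 410 active users:
print(shelfware_cost(500, 410, 65.0))  # 90 idle seats -> 70200.0 per year
```

Even a modest gap between paid and active seats compounds into real money over a subscription year, which is why Cearley singles shelfware out.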
SaaS may not have delivered on its early grand promises (Gartner estimates that 90 percent of current SaaS deployments are not pay-per-use), but it has re-energized the software market and added choice. SaaS does not solve all the challenges of software delivery, but it can provide advantages in the right circumstances, as it is quicker to implement and configure for less-complex problems. “SaaS changes the role of IT from implementing its own operations to inspecting a vendor’s operations,” Cearley added.
Gartner said that SaaS will likely penetrate every company at one level or another and recommends that organizations consider four steps when evaluating SaaS:
1. Determine Value
SaaS is not a panacea, and companies need to evaluate and understand the trade-offs that SaaS presents. While it limits infrastructure overheads and management, and lowers short- to medium-term total cost of ownership, third-party application tools are limited and SaaS applications cannot be counted as assets on a balance sheet.
2. Develop Governance
The next step is to develop a SaaS policy and governance document. This document should be a collaborative effort between the business and IT to create an internal and external SaaS governance model.
3. Evaluate Vendors
Organizations need to evaluate SaaS vendors for specific application needs as applicable. A vendor’s commitment to SaaS is not just measured in business performance, but in technical considerations, such as operations management capabilities.
4. Develop an Integration Road Map
This step is a continuous process: develop an integration road map for how SaaS applications will integrate with on-premises applications and with other SaaS solutions already deployed.
What do you think?
Wednesday, June 9, 2010
Paper is still king in government operations
Check out this article by Government Technology magazine on the results of IAPP-TAWPI's 2010 Government Payments and Document Processing Study.
http://www.govtech.com/gt/765018?topic=117673
Monday, June 7, 2010
The threat of cyber security to tax administration
Posted by Mark Brousseau
“The way tax administration occurs today has significantly changed,” Devon Bryan, CISSP, deputy associate CIO, cyber security, Internal Revenue Service, told attendees at the Federation of Tax Administrators (FTA) Annual Meeting at the Grand Hyatt Atlanta this afternoon. “Modern tax administration has grown increasingly dependent on the near limitless interconnectivity of the Internet to provide instant access to information and services. But that comes with a risk.”
Bryan said, “We are now facing a flood of increasingly sophisticated cyber threats. As commercial and government services continue to be made available online, the amount of sensitive and financial data transmitted over the Internet also increases.” He pointed to the 2009 Internet Crime Report, which found that the Internet Crime Complaint Center web site received more than 300,000 complaint submissions last year, a 22.3 percent increase over 2008. The FBI estimates that more money was made in cybercrime last year than in illegal drug trafficking, Bryan added.
And that’s a chilling thought, Bryan said, when you consider that data networks now underlie the U.S. power grid, the country’s military operations, and its telecommunications infrastructure.
Data networks also are becoming critical to U.S. tax administration.
And that’s why tax processors need to be more vigilant about cyber security, Bryan said.
EDI announcements may impact document management
Posted by Mark Brousseau
A number of recent announcements in the world of electronic data interchange (EDI) are likely to have an impact on enterprise document management and payment processing in the near future. These announcements foretell a trend in which businesses are increasingly sending electronic documents and electronic business data to each other through “business networks.” Pete Dinham, Global Solutions Director, BancTec (www.banctec.com), explains:
Consider the following announcements:
... SAP (a large ERP vendor) announced that it will offer its own "business-ready network" for supporting EDI and other B2B e-commerce exchanges, called SAP Information Interchange. The vision of this solution is to connect all SAP users and their trading partners in a large business network to simplify data exchanges.
... GXS (the world's largest EDI provider) announced their intent to acquire Inovis (another of the world's largest - top five EDI providers). Both of these companies have been investing heavily into developing business networks and SaaS (software as a service) solutions to be operated in a cloud computing environment.
... IBM announced their intent to acquire Sterling Commerce (the world's 2nd largest EDI provider). Sterling Commerce has also been heavily investing in the business network concept of supporting EDI/B2B in a SaaS business model using a cloud computing environment.
What is motivating all of these acquisitions and activities in an IT sector that has been around for decades and is considered mature?
GXS, Inovis and Sterling Commerce are all legacy EDI companies that are rapidly innovating and setting up “business networks.” They recognize the power and impact that the social networking revolution is having on businesses and their networks of suppliers and customers. They realize that it will change the way businesses have traditionally operated and exchanged data.
Business networks are EDI/B2B e-commerce hubs that enable companies to exchange electronic business documents and data in a relatively low-cost and simple manner. They are similar in many ways to the popular LinkedIn and Facebook sites: companies join, set up a profile, search for their customers, suppliers and service providers, and easily connect with them to begin exchanging electronic documents and messages. How is this different from the past? Previously, if company A wanted to exchange EDI messages with company B, it had to call company B and negotiate data requirements and data formats, often a complex and time-consuming effort each time it connected with a new partner. With a business network, you simply join and ask permission to connect with others on it. The data formats and data requirements are all handled by the business network (hub) in a cloud computing environment.
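The messages themselves are terse, delimited text. As a rough illustration, here is a toy parser for an invoice fragment loosely modeled on ANSI X12 conventions (segments ending in `~`, elements separated by `*`); the segment contents are simplified and this is nowhere near a conformant EDI parser:

```python
def parse_edi(message: str, seg_sep: str = "~", elem_sep: str = "*"):
    """Split an EDI-style message into segments, each a list of elements."""
    return [seg.split(elem_sep)
            for seg in message.strip().split(seg_sep) if seg]

# A simplified invoice fragment: header, one line item, invoice total.
raw = "BIG*20100607*INV1001~IT1*1*12*EA*9.95~TDS*11940~"
for segment in parse_edi(raw):
    print(segment[0], segment[1:])
```

Negotiating what those segments and elements mean, partner by partner, is exactly the friction the hub model removes: the network translates between each member's formats so no one has to.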
What do business networks have to do with document management and transactional content management? EDI and B2B data exchanges are part of transactional content management's chain of custody. Chain of custody refers to the ability to track and trace each movement an electronic document makes in a business process. Electronic documents, messages and data can originate at a customer or supplier and travel through the business network (EDI/B2B hub) into a company's internal transactional content management system. Invoices, for example, originate at a supplier and must be received, processed, approved and paid. The efficient flow of business data externally between businesses, and then internally through document management or transactional content management solutions, enables near real-time processing and end-to-end visibility of transactional data. In these environments, many areas of latency and cost are removed from the business process.
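In practice, a chain of custody amounts to an append-only audit trail. One common way to make such a trail tamper-evident is to hash-chain the entries, so that altering any past event breaks every hash after it. A minimal sketch follows; the event and field names are hypothetical, not drawn from any particular product:

```python
import hashlib
import json
import time

class CustodyLog:
    """Append-only log where each entry's hash covers the previous entry's
    hash, so any alteration of history breaks the chain."""

    def __init__(self):
        self.entries = []

    def record(self, doc_id: str, event: str, actor: str) -> dict:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"doc_id": doc_id, "event": event, "actor": actor,
                 "ts": time.time(), "prev": prev}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

log = CustodyLog()
log.record("INV-1001", "received-from-network", "edi-hub")
log.record("INV-1001", "approved", "ap-clerk")
print(log.verify())  # True
```

Each invoice's journey from the hub into the internal system leaves a verifiable trail, which is what gives "track and trace each movement" real teeth.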
Business networks offer simplicity. They remove complexity and the need for expensive investments in legacy EDI and B2B systems and dedicated resources. They make it easy for many more companies to participate. They extend enterprise document management, payment processing and transactional content management visibility all the way from one end of the process to the other. The impact of business networks and social networking on large enterprises is just starting to be understood. The implications are enormous and will be interesting to watch.
For related information on transactional content management, go to http://transactionalcontentmanagement.blogspot.com/.
How tax agencies are coping with the economy
Posted by Mark Brousseau
“We are all in the same boat,” Roger Ervin, secretary, Wisconsin Department of Revenue, told attendees at the Federation of Tax Administrators (FTA) Annual Meeting at the Grand Hyatt Atlanta this morning. “As the economy continues to falter, we need to be more diligent in collections. But citizens want more. They want us to be as efficient as possible. And they want us to do it quietly.”
Ken Lay, secretary, North Carolina Department of Revenue, agreed, noting that, “We want to be easier to do business with. We want to be professional. And we want to be firm, but fair.”
Lisa Echeverri, executive director, Florida Department of Revenue, noted that the recession has presented her department with an opportunity to take a fresh look at its operations and processes.
“You should never waste a good crisis,” Lay concurred, adding that his department is becoming much more process and metric focused. “We are redoing all of our processes. So everyone is not only busy, they are also working on creating the new North Carolina Department of Revenue.”
“We’re beginning to change the organization to fit the processes that will fit the technology,” Lay said, noting that his department has already enhanced its data warehouse capabilities. Some of the changes Lay’s department has implemented include monthly transformation meetings for staff, an updated intranet site with information on changes, deployment of a Microsoft SharePoint site where anyone can see the state of the project, and an emphasis on setting deadlines and holding firm on them.
“We’re not doing this for the sake of the Department of Revenue. We’re really doing it for the citizens of North Carolina,” Lay explained. Changes like these may become necessary.
“We are in a period of market conversion in how taxpayers interact with their tax departments,” Ervin added. “Many are choosing electronic products, as their contribution to the future, while many others are continuing with paper. Many taxpayers are using the Internet with regularity, while others still call their tax department until they speak with a person. Clearly, we are in a deep and complex recession and recovery is still months or years away. In this environment, many citizens cannot or will not pay. This puts pressure on collections. Intuitively, this should be a time of investments in new technology and processes. But tight budgets have put a hold on those investments.”
Changes taxpayers can believe in
Posted by Mark Brousseau
“Many taxpayers believe they can’t get a fair shake from their state tax system,” Walter Hellerstein, Shackelford Professor of Taxation, University of Georgia Law School, said this morning at the Federation of Tax Administrators (FTA) Annual Meeting at the Grand Hyatt Atlanta. “They think there are too many thumbs in the mix. This is what’s driving so many of the problems that we are seeing, not to mention, the Tea Party movement.”
Getting to a more “robust” tax system that is closer to “the ideal” will go a long way toward changing taxpayers’ negative perceptions, Hellerstein told attendees. He said that some key changes should include a streamlined sales tax, independent state tax courts, withholding on flow-through entities, federal-state tax coordination, and corporate income tax uniformity. “These changes would at least get us to where we could talk to each other in more civil tones,” Hellerstein said. “The answer is respect; not only between federal and state entities, but also between taxpayers and states.”
Sunday, June 6, 2010
Modernized e-Filing System Takes Off
Posted by Mark Brousseau
More than 6.5 million federal tax returns (business and individual) were submitted through the new Modernized e-File (MeF) system, the Internal Revenue Service (IRS) told attendees of the Federation of Tax Administrators (FTA) Annual Meeting at the Grand Hyatt Atlanta today. That’s on top of nearly 400,000 state tax returns (business and individual) submitted using MeF. On March 15th alone, the IRS received nearly 500,000 MeF submissions.
MeF is a web-based system that allows electronic filing through the Internet of corporate, partnership, exempt organization, excise tax returns, and, for the first time this year, Individual 1040 returns. When fully deployed, MeF will replace the IRS’ legacy e-file system.
1040 MeF is being deployed over three years. The first phase, deployed this year, includes processing of Form 1040 and 22 other forms and schedules. Fourteen states currently are in live production with 1040 MeF. As of April 21, more than 1 million federal and state returns were submitted via 1040 MeF, the IRS says.
Phase 2, targeted for next filing season, will support the same forms and will include current and prior year processing. In addition, the IRS will complete the build-out of additional hardware and improved disaster recovery capabilities. The Phase 3 plan, targeted for January 2012, will bring online the remaining related 1040 forms and schedules and will include amended return processing. During the deployment phases, the IRS’ legacy e-file system will remain fully operational, with current plans to retire it during the fourth quarter of calendar year 2012.
Thursday, June 3, 2010
Priceless gem or fool's gold?
Posted by Mark Brousseau
Priceless gem or fool’s gold? Laurel B. Sanders (lsanders@docfinity.com) of Optical Image Technology (OIT) offers 10 strategies for cost justifying an automated invoice processing solution:
Remember the stuff we called fool’s gold as kids? Our first discovery led plenty of us to think we were striking it rich as we ran home with sample treasure in hand. Reminiscent smiles and chuckles quickly told us we had been fooled by something that held more apparent than intrinsic value. Similarly, some technology improvements are priceless. Others seem like a great idea, but deliver only moderate value.
Automated invoice processing is no fool’s gold. Implemented well, it’s a priceless gem that boosts profitability, service reputations, and employee morale. If you understand the ROI, it’s easy to defend making the investment. Integrating electronic document management (EDM) and business process management (BPM) software with your line-of-business (LOB) applications saves time, money, and aggravation by letting you:
1. Access payment-related documents instantly
• Integrating a digital repository with your LOB apps gives you instant access to images of invoices, purchase requisitions/orders, packing and delivery slips, checks, GL info, and more within your preferred invoicing system. No more cumbersome search. No lost documents. No re-creating missing files.
2. Match documentation automatically
• Rules-driven BPM searches for identical customer data, invoice numbers, product codes, descriptions, payment terms, and more, validating invoicing readiness. Automated matching gives you more time for meaningful work.
3. Identify discrepancies and errors quickly
• BPM easily identifies missing or inconsistent information so you can take appropriate action. Email alerts notify workers of tasks requiring human intervention. Everything else keeps moving.
4. Expedite invoice routing and approval
• ECM and BPM gather, package, and flow documentation to the right people for timely review. Automated routing with links to files requiring approval, and task assignment based on hierarchies and attendance rules, ensure each invoice is handled promptly and appropriately. Turnaround: typically 50-90% faster (with less effort).
5. Eliminate errors by re-using data intelligently
• As new documents are created (purchase orders following approved requisitions; invoices after shipping), meaningful data is extracted and re-used, improving content integrity and transactional accuracy. No more $10 invoices for $100 goods or multiple bills to Mr. Smyth/Smithe/Smith. Say goodbye to costly errors.
6. Collect receivables quickly and cost efficiently
• Automated document review, instant file access, and automated routing/approval for documents that meet billing criteria minimize human involvement while ensuring quick, accurate processing.
7. Take advantage of more early-payment discounts
• Automated invoicing based on business rules and real-time data such as invoicing terms lets you keep pace with 2/10 net 30 and other discounts, saving $$.
8. Eliminate late payment penalties
• Don’t (ever!) miss an important date. Automated invoice processing relies on stored information rather than human accuracy and reliability to ensure timely decisions. Stored data (such as due dates and discount opportunities) keeps work prioritized, removing the potential for errors and missed opportunities.
9. Help staff to be more productive
• Work is most satisfying when employees’ skills and talents are used well. Automation lets workers handle routine tasks quickly, giving them time to focus on problematic cases and accomplish typically 30-60% more each day than they would without it.
10. Create a better work environment
• Creating a balance between work’s challenges and rewards isn’t easy. Automation lets employees apply the skills they have worked to develop, be more productive, and accomplish what needs to be done so they can have a life beyond the workplace.
Still wondering if it’s worth it? Consider your own work environment. Are you extracting full value from your people, systems, and business information? If there’s room for improvement, there’s no better time to start than right now.
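The matching and early-payment-discount logic described in items 2 and 7 above can be sketched in a few lines of code. This is a minimal illustration, not OIT's actual implementation; the record fields (`po_number`, `vendor`, `amount`) and function names are assumptions chosen for clarity.

```python
from datetime import date, timedelta

def match_invoice(invoice, purchase_order):
    """Flag discrepancies between an invoice and its purchase order,
    mimicking the rules-driven matching step (hypothetical field names)."""
    issues = []
    if invoice["po_number"] != purchase_order["po_number"]:
        issues.append("PO number mismatch")
    if invoice["vendor"] != purchase_order["vendor"]:
        issues.append("vendor mismatch")
    if invoice["amount"] != purchase_order["amount"]:
        issues.append("amount mismatch")
    return issues  # an empty list means the invoice is ready for approval

def discount_deadline(invoice_date, discount_days=10, discount_rate=0.02):
    """For '2/10 net 30' terms: pay within 10 days to take a 2% discount."""
    return invoice_date + timedelta(days=discount_days), discount_rate

inv = {"po_number": "PO-1001", "vendor": "Acme", "amount": 100.00}
po  = {"po_number": "PO-1001", "vendor": "Acme", "amount": 100.00}
print(match_invoice(inv, po))  # [] -> no discrepancies, route for approval

deadline, rate = discount_deadline(date(2010, 6, 1))
print(deadline, f"save ${inv['amount'] * rate:.2f}")  # 2010-06-11 save $2.00
```

In a real BPM system these rules would be configured rather than hand-coded, but the principle is the same: invoices that pass every rule flow straight through, and only the exceptions reach a human.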
Perception of data security at odds with reality
Posted by Mark Brousseau
Nearly three-quarters of organizations believe they have adequate policies in place to protect sensitive, personal information, yet more than half have lost sensitive data within the past two years — and nearly 60 percent of those organizations acknowledge data loss as a recurring problem, according to findings of a global study by Accenture.
The study reveals a startling difference between organizations’ intentions regarding data privacy and how they actually protect sensitive personal information, such as name, address, date of birth, race, National ID/social security number and medical history. The study was conducted in conjunction with the Ponemon Institute, a privacy, protection and information security research firm.
“The volume of sensitive personal information being collected and shared by organizations has grown exponentially in recent years, making data protection a critical business issue and not just a technology concern,” said Alastair MacWillson, managing director of Accenture’s Security practice. “Our study underscores the importance of taking a comprehensive approach to data privacy and protection, one that closes the gaps between business strategy, risk management, compliance reporting and IT security.”
Global Business Findings
Fifty-eight (58) percent of business respondents have experienced at least one data security breach over the past two years, yet 73 percent said their organization has adequate policies to protect the personally identifiable information it maintains.
While 70 percent agreed that organizations have an obligation to take reasonable steps to secure consumers’ personal information, there are discrepancies in their commitments for doing so:
• Forty-five (45) percent of respondents were unsure about or actively disagreed with granting customers the right to control the type of information that is collected about them.
• Forty-seven (47) percent were unsure about or disagreed with customers having a right to control how this information is used.
• Nearly half also did not believe it was important or very important to: limit the collection (47 percent) or sharing (46 percent) of sensitive personal customer information; protect consumer privacy rights (47 percent); prevent cross-border transfers of personal information to countries with inadequate privacy laws (47 percent); prevent cyber crimes against consumers (48 percent); or prevent data loss or theft (47 percent).
• The study revealed that the biggest causes of data loss are internal — problems presumably well within an organization’s ability to detect and correct. For instance, business or system failure (57 percent) and employee negligence or errors (48 percent) were cited most often as the source of the breaches; cyber crime was cited as a cause of only 18 percent of security breaches.
While many organizations believe that complying with existing regulations is sufficient, it appears that compliance alone may not be enough to protect sensitive data. For instance, 70 percent of respondents said they regularly monitor privacy and data protection regulatory compliance requirements, yet data breaches have occurred in 58 percent of organizations polled.
Wednesday, June 2, 2010
Mirror, mirror on the web
Posted by Mark Brousseau
Although some may dub it "egosurfing," others might call it a wise career move to conduct a web search to see what information about you is available online. After all, what is visible to you also is visible to potential employers. In a recent survey by Accountemps, 69 percent of workers interviewed said they have entered their name in one or more search engines to see what results were displayed.
"While all professionals should protect their reputation by monitoring their online presence, this is especially critical for job seekers," said Max Messmer, chairman of Accountemps. "Many employers now routinely perform Internet searches to quickly learn about applicants' interests, experience and industry involvement. Job seekers need to pay attention to what they share online -- including contributed content, article comments and photos -- and take steps to ensure the image they project is professional."
Accountemps offers the following five tips for making your online footprint work for you:
1. Know what's out there. Set alerts using Google or other tracking services to receive a notification each time something new is said about you, and delete any content that could be seen as unprofessional or controversial. If you find unflattering material you cannot remove, be prepared to explain if a hiring manager asks about it.
2. Take advantage of privacy settings. If you belong to social networking sites or have a personal blog, adjust your privacy settings so you control who has access.
3. Contribute to the conversation. As appropriate, comment on articles of interest to you and your field, and consider writing columns for industry organizations.
4. Exercise discretion. Be aware that whatever you post may be seen by potential employers, and give careful consideration to how statements you make may be interpreted. While you want to show you have a well-informed opinion, keep your comments constructive, and avoid disparaging others.
5. Keep your profiles current. Make sure your professional profiles on sites such as Google and LinkedIn are up-to-date and highlight your experience.
8 business imperatives for driving competitive advantage
Posted by Mark Brousseau
Businesses face eight strategic imperatives that will play determining factors in their short- and long-term success following the economic crisis, according to PricewaterhouseCoopers LLP (PwC).
PwC’s recommendations include:
1. Sustain cost-reduction measures in order to improve margins with a smaller, more productive workforce and use newly freed resources for investments in the company's future.
2. Align risk to performance and create individual accountability measures by integrating risk at a business unit level and creating more personal accountability and reward structures.
3. Prepare for major regulatory changes through a cross-section of issues—international tax frameworks, infrastructure development, healthcare costs and environmental policies.
4. Enhance trust in your business through voluntary disclosures or independent verifications in areas such as supply chain integrity, measurement of carbon emissions and data integrity, to satisfy discerning stakeholders who now demand greater levels of transparency when making decisions about where to invest.
5. Make technology a strategic asset and competitive differentiator by increasing your investment in technology infrastructure and applications.
6. Leverage innovation to underscore differentiation, especially around hot-button global issues such as sustainability and climate change.
7. Move forward with deferred transactions and deals to help grow, protect value, and capture benefits of scale, productivity and efficiency.
8. Invest in leadership, talent development, and deployment for your employees, starting with the C-Suite.
"Executives must renew their focus on the most important issues to accelerate growth - both top and bottom line," says Bob Moritz, US chairman and senior partner, PricewaterhouseCoopers. "We believe that organizations that better engage key stakeholders while also making strategic IT, innovation, M&A and people investments will unleash potential growth that will create long-term competitive advantages."
These strategies have become particularly important as U.S. CEOs continue to respond to the significant shifts underpinning America's economic recovery. When asked about threats to business growth as their companies emerge from the recession, U.S. CEOs express the greatest concern about the prospect of overregulation, followed by shifts in consumer behaviors.
Interestingly, these concerns weigh more heavily in the US than elsewhere: 71 percent of U.S. CEOs are either somewhat or extremely concerned about overregulation compared to 60 percent of global CEOs; 62 percent of U.S. CEOs worry about changing consumer behaviors compared to 48 percent of global CEOs. Meanwhile, concerns about talent shortages have temporarily receded in the US, with CEOs focusing on organizational redesign and employee engagement and morale programs.
Tuesday, June 1, 2010
Sputtering Check Conversion
By Mark Brousseau
While the latest statistics from Herndon, VA-based NACHA show that overall Automated Clearing House (ACH) volumes increased slightly in 2009 (2.6 percent compared to 2008 activity), Accounts Receivable Check (ARC) Conversion -- once the darling of remittance operations -- decreased by more than 10 percent during the same period. In 2009, there were approximately 2.4 billion ARC transactions, NACHA reports, compared to nearly 2.7 billion ARC transactions the previous year.
The drop in ARC volumes comes as no surprise to Creditron, Inc., Founder and CEO Wally Vogel (wvogel@creditron.com). Vogel expects ARC volumes to continue to fall as consumers move away from writing checks for recurring remittances and towards electronic mechanisms such as Internet-initiated payments (which registered a stout 9.7 percent volume increase in 2009, NACHA reports).
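The NACHA figures quoted above are easy to sanity-check: a drop from roughly 2.7 billion to 2.4 billion ARC transactions works out to about an 11 percent decline, consistent with the "more than 10 percent" figure. A quick sketch (the variable names are mine, and the volumes are the approximate billions cited above):

```python
# Year-over-year change in ARC volume, using the approximate NACHA figures.
arc_2008 = 2.7  # billions of ARC transactions, 2008
arc_2009 = 2.4  # billions of ARC transactions, 2009

change_pct = (arc_2009 - arc_2008) / arc_2008 * 100
print(f"{change_pct:.1f}%")  # -11.1% -> "more than 10 percent" decrease
```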
Vogel says that, "As far as we have seen, ARC is not really going anywhere among mid-volume processors. Our clients are either moving to Check 21 or staying with what they have. Many clients don't perceive a significant cost benefit if they already have an automated encoding solution."
Vogel adds that Creditron is seeing slow but steady growth in Check 21 among its current clients, and that virtually all of its new installations include Check 21. "The biggest reason our clients are reluctant to implement ARC is the notification requirement, which isn't a factor if they go with Check 21," he says.
What are you seeing?