By Mark Brousseau
Although the economic downturn caused CFOs to concentrate primarily on their steward role, the recovery has reemphasized the need to act as a catalyst and strategist, Jeff Bronaugh, senior manager at Deloitte Consulting LLP, told attendees of the Masters Session at Fusion 2011 at the Gaylord Palms Resort and Convention Center in Florida. Bronaugh and his colleagues Bob Comeau, national service line lead and principal at Deloitte Consulting, and Scott Rottman, principal at Deloitte Consulting, led a highly interactive discussion among the attendees.
During the depths of the credit crisis and recession, CFOs were spending roughly 60 percent of their time in the operator and steward roles, Bronaugh said. The increased time spent in the steward role reduced the amount of time CFOs spent in their preferred role as the strategist in the organization.
But in the wake of considerable capital-market and economic turmoil, CFOs are expected to take on broader and deeper strategic roles, Bronaugh said. CFOs are now routinely in charge of an expansive range of regulatory, governance, and strategy functions, especially investor and public relations, strategic planning, corporate development, and mergers and acquisitions.
As the economy begins to stabilize, the focus of North America's top finance executives is shifting back to strategic initiatives, Bronaugh said. He pointed to a Deloitte survey of CFOs conducted in the first quarter, which showed that quality metrics, influencing strategies, and monitoring initiatives were the top three challenges facing their finance organizations. CFOs cited strategic ambiguity, major change initiatives, and regulatory change as their top three job stresses, Bronaugh said.
Against this backdrop, Bronaugh said there are 10 hot topics for CFOs:
1. Improving business decision support
2. Influencing business strategy and operational strategies
3. Major infrastructure and change initiatives
4. Prioritizing capital investments
5. Regulatory changes
6. Finance operating models
7. Cash is king
8. Managing finance department expectations
9. Finance talent management
10. Taxes
What are the hot topics in your finance department?
Thursday, February 17, 2011
Banks should get back to the boring
Posted by Mark Brousseau
As the world closes in on the three-year mark of the beginning of the global financial crisis, one expert believes that it’s not enough to rely on new regulations to prevent future disasters — a fundamental change in mindset is required.
Rex Ghosh, a Harvard-trained PhD economist with more than 20 years in the financial markets, currently at the International Monetary Fund, believes that the very culture of the financial sector needs to shift back to basics as the economy limps out of recession.
“The global financial crisis, marked by the bankruptcy of Lehman Brothers in September 2008, has taken an enormous economic, financial, and social toll,” said Ghosh. “Both in the United States and abroad, regulations, laws, and practices are being changed to help ensure that such crises do not recur. But these regulations — running to the thousands of pages — are enormously complex. It may be years before they are all adopted and absorbed into the daily lives of those in the financial sector. The real prevention rests in the notion that leaders need to work toward changing the very culture of the sector to rely on more fundamental and basic practices based in prudence and responsibility.”
Ghosh would like to see the financial sector learn the following lessons in 2011:
... For the Federal Reserve -- Central banks such as the Fed should look not only at goods price inflation, but also at important asset prices, such as the stock market and housing sectors. The Fed also needs to be more mindful of lending and credit booms, especially in the face of weakening credit standards; that is what paved the road to hell three years ago, and we do not want to go down it again. Traditional monetary policy tools (like the Fed’s interest rate) may need to be bolstered by counter-cyclical capital requirements (requiring banks to hold more capital in “boom” times).
... For Banks -- Boring is good. Banks should get used to being a much smaller proportion of the economy, as they were before the 1990s. Bankers should also be alert to credit and counterparty risks: they need to know who they’re doing business with, know to whom they are lending, and not rely solely on credit ratings.
... For Regulators -- They need to watch the kids and the cookie jar. They should not count on banks to manage their risks prudently. They should think seriously about “tail risks” — just because something has not happened before, such as a nationwide decline in house prices, doesn’t mean it cannot happen in the future.
“These are not incredibly difficult precepts,” Ghosh added. “The short answer is that the Fed needs to broaden its view of what constitutes inflation, banks need to look past the paperwork and avoid risk, and regulators need to realize their jobs don’t end with the passage of new rules. For every regulation created, there are 50 new ways created to get around it. We need to realize that the practices of the past won’t go away until we match the letter of the regulations with the culture of the financial sector.”
What do you think?
Friday, November 12, 2010
The Top 5 Compliance Issues That Smolder Beneath The Surface
By Dan Wilhelms
When firefighters arrive at a burning building, their first priority (of course) is to knock down the visible flames. Yet experienced firefighters know that when those flames are extinguished, the job isn’t done yet. That’s the time they go in and start looking for the hidden flames – the smoldering materials in a ceiling or behind a wall that could suddenly erupt and engulf them when they’re not expecting it. They know those hidden fires can be the most dangerous of all simply because they can’t be seen until it’s too late.
For the past few years, IT and compliance managers have been like those firefighters first arriving on the scene. You’ve been putting out the compliance fires – the big issues that have been burning brightly since SOX legislation was passed in the early part of the millennium. You’ve done a good job too, creating a new compliance structure where roles are defined, segregation of duties (SOD) is the standard and transactions are well-documented.
Yet just like those firefighters, the job isn’t finished yet. There are still all kinds of compliance issues that, while not as visible as the first ones you tackled, can still create a back-draft that will burn your organization if you’re not careful. Following are five of the most pressing (and potentially dangerous).
Excessive access – With the complexity of the security architecture that is part of modern ERP systems, it’s easier than you might think to accidentally give some users access to potentially sensitive transactions that might be far outside their job descriptions. Access is usually assigned by the help desk, and in the heat of battle, with many pressing issues, they may not be as careful about assigning or double-checking authorizations as they should be. When that occurs, it can lead to all types of dangers.
Imagine a parts picker in the warehouse being given access to every SAP transaction in the organization (which has happened, by the way). In that instance, the warehouse worker started running and looking at transactions (including financial transactions) just out of curiosity. But what if he’d had a different agenda? He could have changed the data, either accidentally or maliciously, or executed a fraudulent transaction, creating a serious compliance breach.
Even if he didn’t change anything, there’s still a productivity issue. After all, if he’s busy running a myriad of SAP transactions, he’s not busy picking orders.
Excessive access is not the type of issue that will show up in a SOD report. The best way to address it is by installing governance, risk and compliance (GRC) software that makes managing security and authorization easier. The software should also provide you with tools that help you measure and monitor actual system usage so you can see whether the things users are doing and the places they’re going within the system are appropriate to their job requirements. Having automated systems in place is particularly important in smaller enterprises that usually do not have the resources for a lot of manual inspection.
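To make that usage-monitoring idea concrete, here is a minimal Python sketch that compares the transactions a user has actually executed against a whitelist for that user's job role and flags anything outside it. The role names, SAP transaction codes, and log format are illustrative assumptions, not the output or data model of any particular GRC product.

# Minimal sketch: flag users whose executed transactions fall outside
# the transaction set expected for their job role.
# Role names, transaction codes, and the usage-log format are hypothetical.

ROLE_TRANSACTIONS = {
    "warehouse_picker": {"LT03", "LT12", "MB1A"},   # picking and goods-movement work
    "ap_clerk": {"FB60", "F110", "FK03"},           # invoice entry, payment run, vendor display
}

usage_log = [
    {"user": "jdoe", "role": "warehouse_picker", "tcode": "LT03"},
    {"user": "jdoe", "role": "warehouse_picker", "tcode": "FB60"},  # outside job scope
]

def find_excessive_usage(log, role_map):
    """Return (user, tcode) pairs executed outside the user's role whitelist."""
    findings = []
    for entry in log:
        allowed = role_map.get(entry["role"], set())
        if entry["tcode"] not in allowed:
            findings.append((entry["user"], entry["tcode"]))
    return findings

for user, tcode in find_excessive_usage(usage_log, ROLE_TRANSACTIONS):
    print(f"Review: {user} executed {tcode}, which is outside their assigned role")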
Access to sensitive data – Users don’t necessarily need access to a broad variety of data to pose a risk; they just need access to particular data. For example, who can open and close posting periods? Who can view HR salary and benefits information? Again, this is not something that is likely to show up on a SOD report, yet it’s a very real risk.
We’ve all heard the stories about how a certain soft drink manufacturer’s formula is better-guarded than the launch codes for nuclear weapons. Imagine if the formula was sitting on the ERP system and the wrong person was given access to it – or given access to payroll, HIPAA or other sensitive information.
One key to controlling access to sensitive data, of course, is to exercise more care when assigning authorizations; these are preventive controls. It’s also important to use reverse business engineering tools to see who does have access to sensitive transactions, whether that access is appropriate, and what they did with the information once they had it; these are detective controls. It’s like following the smoke to discover where the hidden fire is.
Poor segregation of duties – Although SOD has already been mentioned, some organizations are not familiar with what it is and its purpose. Let’s look at the nuclear missiles analogy again. In order to launch, there are two keys controlled by two different people. Two keys are used to assure that no one person has control of the missiles in case someone decides to “go rogue.”
It’s the same with financial transactions in an enterprise. You don’t want one person to be able to create a vendor in your SAP system and then initiate payment of that same vendor; you’re just asking people to steal from you.
That’s why it’s important to have value-added tools that analyze user access against the enterprise’s SOD rulebook and flag any conflicting functions. An ongoing analysis will point out any areas of risk so they can be remediated, and keep you informed should the situation change.
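As a rough illustration of that rulebook-style analysis, the Python sketch below checks each user's access against a list of conflicting function pairs and flags anyone who holds both halves of a pair. The rule entries, function names, and user assignments are hypothetical examples, not an actual SOD rulebook.

# Minimal sketch of an SOD conflict check: flag users who hold both
# halves of any conflicting pair in the rulebook.
# Rules, function names, and assignments are hypothetical.

SOD_RULEBOOK = [
    ("create_vendor", "post_vendor_payment"),
    ("maintain_payroll", "approve_payroll_run"),
]

user_access = {
    "asmith": {"create_vendor", "post_vendor_payment", "display_vendor"},
    "bjones": {"create_vendor", "display_vendor"},
}

def find_sod_conflicts(access, rulebook):
    """Return (user, function_a, function_b) for every violated rule."""
    conflicts = []
    for user, functions in access.items():
        for a, b in rulebook:
            if a in functions and b in functions:
                conflicts.append((user, a, b))
    return conflicts

for user, a, b in find_sod_conflicts(user_access, SOD_RULEBOOK):
    print(f"SOD conflict: {user} can both {a} and {b}")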
Of course, in a smaller organization, conflicting duties may not be avoidable. Everyone is expected to wear multiple hats, and sometimes those hats do not allow for proper segregation. In those instances, you need mitigating controls: tools that can monitor actual transactions and report against them so you can see if a compliance violation is occurring. In other words, if someone has to carry both keys, you know when they’ve inserted both into the control panel.
Even with the proper tools, it’s unlikely you’ll ever bring SOD conflicts down to zero. But you can get awfully darned close, and keep an eye on what happens from there.
Introduction of malicious programs into production systems – The modern reality is that ERP systems are rarely steady state. Often enterprises have multiple initiatives going on that introduce new data, configuration and programs into the production systems.
With lean staffing and urgent deadlines, changes often are not properly tested or audited. In other words, proper change management isn’t followed. A developer who has the means, the motive, and the belief that he or she can get away with it can wreak all kinds of havoc by including malicious code along with legitimate code when new applications are moved into production. Malicious code can download sensitive data, create fraudulent transactions, delete data or crash the systems.
It is critical to have a second person reviewing any changes at every step of the way. What that means is the person who requests the change can’t be the person who develops it; the developer can’t be the person who tests it; the person who tests it can’t be the same person who migrates it into production. In other words, transport development and approvals cannot be given by a single person – instead, an independent approver or even a committee must be controlling the entire process.
Change management duties need to be segregated and managed throughout the entire process. Even if not malicious, poorly coded, untested programs can result in a catastrophic outage. Given that in a large enterprise an hour of downtime can cost $1 million, it’s easy to see why proper change management is worth the investment.
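The four-eyes rule described above boils down to a simple invariant on each change record: no single person may fill more than one of the requester, developer, tester, and migrator roles. Here is a minimal sketch, assuming a hypothetical transport-record format rather than the data model of any specific change management system.

# Minimal sketch: verify that no single person fills more than one role
# (requester, developer, tester, migrator) on a given change or transport.
# The record structure and names are hypothetical.

def check_change_segregation(transport):
    """Return (person, roles) for anyone holding more than one role on this transport."""
    seen = {}
    for role in ("requester", "developer", "tester", "migrator"):
        person = transport.get(role)
        seen.setdefault(person, []).append(role)
    return [(person, held) for person, held in seen.items() if len(held) > 1]

transport = {
    "id": "DEVK900123",
    "requester": "cmiller",
    "developer": "dlee",
    "tester": "dlee",        # same person developed and tested: a violation
    "migrator": "ops_team",
}

for person, held in check_change_segregation(transport):
    print(f"Transport {transport['id']}: {person} holds conflicting roles {held}")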
Emergency access – In large ERP environments, there’s always the chance that emergency maintenance of production systems will need to be performed. When it does, and the enterprise is dialing 9-1-1, someone needs to be given emergency “super user” access to everything in the system. Such emergency maintenance is often performed by outside parties (e.g., the software vendor or third-party consultants).
The problem is these emergency all-access passes aren’t always tracked very well. Everyone is so fixated on putting out the fire – for example, unlocking a sales order that has frozen the entire system – that they never think about documenting what transactions were performed or what data was changed. The risk is increased by the widespread use of generic “firefighter” user IDs, whereby the individual performing the actions isn’t definitively known.
You’d like to think that the person you give super user access to can be trusted. But blind trust is what has gotten other enterprises into trouble in the past. The person with full access may make other changes while he/she is in there – either accidentally or on purpose. You need to be able to monitor who has all-access and what they do while they have it.
It is critical to have tools that allow you to track what these super-users do while they’re in the system. Not just for the day-to-day operation of the business, but for the auditors as well. When auditors see someone has been given this additional emergency access, their job is to immediately assume the person did something nefarious. It will be your job to prove they didn’t. You’ll need to show why access was granted, what was done while the person was in there, when/how long the person was in the system, what changes were made and when the person exited.
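One way to picture that audit trail is as a session record opened whenever emergency access is checked out, capturing who received it, why, who approved it, what was done, and when it ended. The Python sketch below is illustrative only; the field names and transaction details are assumptions, not the data model of any real emergency access management tool.

# Minimal sketch: record an emergency ("firefighter") access session so the
# questions auditors ask -- who, why, when, what changed -- can be answered later.
# Field names and values are illustrative.

from datetime import datetime, timezone

def open_firefighter_session(named_user, reason, approver):
    """Start a session record for an emergency super-user checkout."""
    return {
        "named_user": named_user,   # the actual person, not a generic firefighter ID
        "reason": reason,
        "approved_by": approver,
        "opened_at": datetime.now(timezone.utc).isoformat(),
        "actions": [],              # transactions and changes performed during the session
        "closed_at": None,
    }

def log_action(session, tcode, detail):
    session["actions"].append({
        "at": datetime.now(timezone.utc).isoformat(),
        "tcode": tcode,
        "detail": detail,
    })

def close_firefighter_session(session):
    session["closed_at"] = datetime.now(timezone.utc).isoformat()
    return session

# Example: unlock a stuck sales order under emergency access, then close the session.
session = open_firefighter_session("dlee", "Sales order lock blocking order entry", "it_manager")
log_action(session, "VA02", "Unlocked sales order 4711")
close_firefighter_session(session)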
While it’s important to put out the big compliance blazes, keep in mind those are the ones that are also easy to see. Once they’re under control, take a tip from the professional firefighters and be sure to check for the smaller, smoldering flashpoints. It’s your best insurance against getting burned.
Dan Wilhelms is President and CEO of SymSoft Corporation (www.controlpanelGRC.com), the makers of ControlPanelGRC, professional solutions for compliance automation. He can be reached at dwilhelms@sym-corp.com.
Monday, July 12, 2010
Economic risks of data overload
By Ed Pearce (epearce@egisticsinc.com) of eGistics (www.eGisticsinc.com)
When data pours in by the millisecond and the mountain of information builds continuously, professionals inevitably cut corners and go with their 'gut' when making decisions that can impact financial markets, medical treatments or any number of time sensitive matters, according to a new study from Thomson Reuters. The study indicates that when faced with unsorted, unverified "raw" data, 60 percent of decision-makers will make "intuitive" decisions that can lead to poor outcomes.
Many government regulators have flagged increased financial risk-taking, which can be traced in some degree to imperfectly managed data, as a contributor to the recent financial crisis. Moreover, the world is awash with data -- roughly 800 exabytes -- and the velocity of information is increasing, Thomson Reuters says.
The challenge is that the staffing and investment needed to ensure that information and information channels are trusted, reliable and useful is not keeping pace. In fact, it is estimated that the information universe will increase by a factor of 44, the number of managed files by a factor of 67, and storage by a factor of 30, but staffing and investment in careful management by only a factor of 1.4.
"The solution to data overload is to provide decision makers with what Thomson Reuters calls Intelligent Information: better organized and structured information, rapidly conveyed to the users preferred device," says David Craig, executive vice president and chief strategy officer.
Fortunately, as the Thomson Reuters study notes, the same technological revolution that has resulted in the explosion of information also opens the way to new and improved tools for providing intelligent information: better organized and structured information, rapidly conveyed to the user's preferred device.
"We must use the benefits of the information technology revolution to minimize its risks. This is a joint task that the private sector and governments must closely focus on if we are to avoid systemic crises, in the future, whether we speak of finance, healthcare delivery, international security and a myriad of other areas," comments Craig.
How is your organization managing information overload?
Thursday, February 4, 2010
Don't Turn Cybersecurity into a Bureaucracy
Posted by Mark Brousseau
New legislation being discussed in Washington runs the risk of turning cybersecurity into a bureaucracy. Wayne Crews, vice president for policy at the Competitive Enterprise Institute, thinks a better solution is to enhance private sector practices. He explains:
The House of Representatives is considering HR 4061, the Cybersecurity Enhancement Act. A solid Cybersecurity Enhancement Act might read “Title I: Stop losing federal laptops.” That’s too flip, but consider that there are cybersecurity risks to cybersecurity legislation.
Vulnerabilities in the government’s information security policies and the need to “bring government into the 21st century” have long been noted. But given the constant temptation by politicians in both parties to meddle with cybersecurity policy by steering research and development in unnatural directions, any poor decisions made at this juncture could undermine both public and private information security.
Politicians, especially in frontier industries like information technology, often take the easy path of seeking massive sums to establish taxpayer-funded research grants for politically favored cybersecurity initiatives and to set up redundant cybersecurity agencies, programs, and subsidies. This is precisely what the Cybersecurity Enhancement Act would do, potentially steering cybersecurity research away from its natural, safer course.
Vastly expanding federal grants, fleets of scholarships and government-induced Ph.D.s in computer security is not the same as actually bolstering security, nor is there any reason the private sector cannot fund the training of its own such personnel or provide application-specific training as needed. Moreover, many serious security problems are not matters of new training but simply of embracing security “best practices” that already exist.
The Cybersecurity Enhancement Act amounts to pork, and the private sector can and should fund the training of America’s security experts. Online security is an immensely valuable industry today, and there is no shortage of private research incentive and potential profit.
Taxpayer-funded scholarships have already been extended to universities in countless respects, and incentives already abound for students to pursue technology careers. These new programs can easily grow beyond the proposed, already-generous bounds.
It’s beyond doubt that online security problems exist. Yet the tendency of cybersecurity today to be seen as an increasingly government-spearheaded function is worrisome. The taxpayer-funding approach can benefit some sectors and companies at the expense of competition and of computer security itself. Federal spending and intervention may encourage market distortion by skewing private investment decisions, or by promoting one set of technologies or class of providers at the expense of others.
We need better digital equivalents of barbed wire and door locks, which private companies are constantly competing to improve. While government law enforcement agencies have a necessary role to play in investigating and punishing intrusions on private networks and infrastructure, government must coexist with, rather than crowd out, private sector security technologies. Otherwise we become less secure, not more.
A substantial government role invariably grows into an irresistible magnet for lobbyists and the creation of bloated “research centers” and could all too easily become the locus for establishing sub-optimal government authority over our most vulnerable frontier technologies and sciences.
The solution? Enhancing private sector cybersecurity practices.
Both suppliers and customers in the high-tech sector increasingly demand better security from all players. Improving private incentives for information sharing is at least as important as greater government coordination and investment to ensure security and critical infrastructure protection. That job will entail liberalizing critical infrastructure assets—like telecommunications and electricity networks—and relaxing antitrust constraints so firms can coordinate information security strategies and enhance reliability of critical infrastructure through the kind of “partial mergers” that are anathema to today’s antitrust enforcers.
The future will deliver authentication technologies far more capable than those of today. Like everything else in the market, security technologies—from biometric identifiers to firewalls to network monitoring to encrypted databases—benefit from competition. Private cybersecurity initiatives will also gradually move us toward thriving liability and insurance markets, to help address the lack of authentication and inability to exclude bad actors that are at the root of today’s vulnerabilities.
Security is an industry unto itself; let’s not turn it into a bureaucracy.
What do you think?
Friday, September 11, 2009
Improving Access to Government Data on the Web
Posted by Mark Brousseau
Diane Mueller passes along her thoughts on the holes in source data on the Web and how the government can help:
On September 4th, the President took another important step toward a more open and transparent government by announcing a new policy to voluntarily disclose White House visitor access records. Aside from a small group of appointments that cannot be disclosed because of their necessarily confidential nature, the record of every visitor who comes to the White House for an appointment, a tour or to conduct business will be released. As historic as the President’s announcement is, it is also a good illustration of what is missing from the administration’s technology infrastructure plan — a coordinated approach to providing data standards.
On the surface, this new disclosure of visitor data looks perfectly fine. The data, made available in a simple Comma Separated Values (.csv) file, is easily downloaded and opened in a spreadsheet for viewing.
Take a step beyond simple viewing, and try to mash up this content to see where the visitor’s list collides with other interest groups and data sources — you begin to get an idea of the complex nature of data mapping. For example, think of mashing up this visitor information with the U.S. SEC filings that include the names and remuneration of executives of publicly traded companies tagged in XBRL.
Better yet, simply try to blog about someone’s visit to the White House and reference a snippet from the .csv content. Then go to Twitter and post a tweet with a link to your blog so you can have bragging rights about being the first to notice some VIP’s visit. If I then repost the information on my blog and one of my readers wants to get back to the source file to verify the facts, there is no path back to the original source without some form of metadata and a URI associated with the content. Therefore, there is no validation that the information is accurate. When I repost your information on my blog, I am simply trusting your cutting-and-pasting skills and trusting that you accurately interpreted the information. This is a potentially dangerous situation that often leads to a lot of misinformed “noise.”
So far, in the marriage of social networks and open government, there has been a lot of “noise” coming in, but there has been very little done in the way of creating constructive solutions for accurate and trusted citizen participation.
Without the metadata about the newly disclosed visitor content or any other government information, the accuracy with which data is interpreted is jeopardized with each reuse. Without a link back to the source, the authenticity of the content is no longer discoverable. Without this information, it’s all just more “noise” on the web.
Where Does XML Fit in?
XML industry standards bring metadata to the content. Even a simple XML schema and an instance document would go a long way toward ensuring that, regardless of what tool consumed the visitor data (including spreadsheets), the information would always be interpreted in the same manner. Furthermore, the use of an XML industry standard for identity would enable one to leverage existing tools to mash up the content with other data sources. The key benefit of XML is that consuming applications no longer require someone to reinvent clever ways of mapping and representing complex data, so developers can expend their energies on solving higher-level problems that have a greater return.
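As a rough sketch of how even lightweight metadata changes the picture, the Python snippet below wraps a single visitor record in XML that carries a source URI and retrieval date alongside the data, so anyone reposting a snippet can trace it back to its origin. The element names, values, and URL are invented for illustration; the actual release was a plain .csv file with no such provenance and no official schema.

# Minimal sketch: a visitor record that carries its own provenance metadata,
# so downstream reuse can always point back to the original source.
# Element names, values, and the source URL are illustrative, not an official schema.

import xml.etree.ElementTree as ET

record = ET.Element("visitorRecord", {
    "source": "https://www.whitehouse.gov/files/disclosures/visitors.csv",  # hypothetical URI
    "retrieved": "2009-09-04",
})
ET.SubElement(record, "visitorName").text = "DOE, JOHN"
ET.SubElement(record, "visitee").text = "SMITH, JANE"
ET.SubElement(record, "appointmentStart").text = "2009-07-14T15:00:00"

xml_text = ET.tostring(record, encoding="unicode")
print(xml_text)

# Any consumer of the snippet can recover the provenance along with the data.
parsed = ET.fromstring(xml_text)
print("Source:", parsed.get("source"), "retrieved", parsed.get("retrieved"))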
There are plenty of other examples across federal, state and municipal government agencies that build the case for leveraging XML industry standards to aid in creating greater transparency and to create efficiencies for the agencies themselves.
Where Do We Go from Here?
Recovery.gov and multiple other individual government agency projects have taken strides toward granting the public access to government data. However, cross-agency conversations are still taking place to reach agreement on common data models for comparing and mashing up information from multiple data sources accurately.
Efforts such as the NIEM-XBRL harmonization discussions should be applauded, as this combined effort should aid in the accurate mapping of government financial data across agencies. There is still a long way to go before we can start to leverage the really interesting technologies like the Resource Description Framework (RDF) and the Semantic Web.
While everyone wants to jump on the Web 2.0 bandwagon, designing the technology infrastructure to ensure that it is done in an open, transparent and accurate manner requires a lot of cross-agency collaboration. The administration’s goal should be to ensure that the public can collaborate on the analysis and dissemination of public information across the web in a manner that can be trusted, authenticated and redistributed without imposing a cost burden on the consumers or the producers of that information. That is no small task.
This all leaves me wondering if I am guessing correctly about what was being talked about in the White House on 7/14/2009 at 3:00:00PM and about who was in the room. If my assumptions are right — loosely based on about 22,200 Google hits for Stephen J. Hemsley, who was listed as visiting Aneesh Chopra, for whom there are about 1,170,000 Google hits — I’m guessing a lot of these same data topics were addressed with a slight healthcare twist. But then again, I’m doing the interpretations here and making the free associations, so you’ll just have to trust me.
Thursday, July 30, 2009
Regulations, Outsourcing Top Industry Trends
By Mark Brousseau
As TAWPI prepares to raise the curtain on its annual Forum & Expo in Washington, D.C., next week, payments and document management operations executives are grappling with mounting regulations, industry-wide overcapacity, and pressure from senior management to outsource.
"As a result of the industry scandals, bank and broker/dealer failures, and stock market decline, increased financial services regulations are likely," says Edward Kinsella, second vice president, transfer agent, for John Hancock Financial Services (ekinsella@jhancock.com). "Companies will need to find ways to quickly and efficiently adhere to these new requirements," he warns.
Kinsella says the financial services industry also is facing significant overcapacity. "This will likely lead to consolidation, and mergers and acquisitions," Kinsella says. "Companies will be challenged to combine their operations to broaden their product offerings, increase profit margins, reduce expenses, and create new efficiencies and economies of scale," Kinsella adds.
Kinsella also sees a greater push towards outsourcing: "As a result of the economic slowdown, companies are focusing on their core competencies and looking to outsource functions and processes that can be handled by third-parties. Companies must steer clear of functions that distract them from their core competency, or can be handled more cost effectively by others."
Mike Reynolds, executive vice president and director of sales and marketing at Cash Management Solutions, Inc. (mike.reynolds@cashmgmt.com), sees continued interest in outsourcing across all levels of financial institutions. "This is being driven by economics, cost pressures, footprint considerations, and platform replacement decisions," he says, noting that many banks are struggling with whether they should invest in newer lockbox technology. "Innovative banks are exploring combinations of outsourcing and in-house processing."
John Kincade, vice president of business development for J&B Software, Inc. (johnki@jbsoftware.com) also expects increasing customer interest in "hybrid" outsourcing solutions where the customer keeps some of its more strategic payment vehicles in-house, and outsources the labor-intensive functions. "Vendors will have to provide modular solutions that allow this," Kincade says.
Mark Stevens, president and CEO of Moorestown, NJ-based OPEX Corporation (mstevens@opex.com), expects significant consolidation in the retail lockbox market. "There are fewer and fewer companies doing this kind of work," Stevens says. "I believe that we will see three or four companies as the 'last man standing' in this space."
Kinsella says oversight and risk management are critical to the success of outsourcing.
Reynolds notes that for operations that stay in-house, the focus is on improving efficiency and productivity by taking a hard look at existing workflows and technologies, and by analyzing staffing and capacity models.
“Companies are driving the last ounce of expense from the business as they strive to meet Wall Street targets,” agrees Bob Young of Manasquan, NJ (lcpard77@verizon.net). “The latest round of earnings releases the past few weeks prove this point.” Payments processing executives are challenged with finding ways to use their current technology – software and hardware – to make their operations more efficient, to satisfy upper management, Young added. “I have to think that the purchase of new processing systems is a low priority, given the economy.”
Reynolds adds that everyone -- service providers, technology vendors and end-user customers -- seems to be squeezing everyone else on pricing. "I'm seeing renegotiation initiatives on almost every front as organizations try to better align pricing with volume and product deliverables," Reynolds says, adding that he hopes this eases as the economy improves.
As part of the push to reduce costs and gain operations efficiencies, Stevens believes shared services will become a hot topic. "We are seeing several remittance shops with scanners looking to do AP work for their organizations," Stevens said, adding that he expects this trend to continue.
Similarly, Kincade believes the convergence of forms and payments processing will accelerate next year, with customers moving to more sophisticated correspondence management systems. In some applications, payments can accompany correspondence 30 to 50 percent of the time, Kincade notes.
What do you think? Post your comments below.