The Debt Bill’s Impact on Healthcare

Jason Lee, Director, Global Institute for Emerging Healthcare Practices, CSC

Congress and the President delivered legislation in early August that averted a potentially “catastrophic” default on U.S. debt obligations. In a deal that some news analysts have called a “Greek Tragedy,” the debt ceiling was raised in exchange for bipartisan agreement to reduce the federal budget by the same amount over the next decade.  This is certainly a case where the devil is in the details.

The first part of the Budget Control Act of 2011 establishes budget caps in federal discretionary spending over the next decade.  The second part authorizes a so-called “gang of 12” (a joint committee of a dozen Members of Congress, half from each party and half from each body) to propose at least $1.5 trillion in additional cuts over ten years.  It can recommend any kind of deficit-reducing measures, including changes to Medicare and Medicaid, the massive entitlement programs that escaped untouched in the first part of the law.

If joint committee action does not result in legislation enacting at least $1.2 trillion (not a typo) in additional spending cuts by mid-January 2012, then automatic across-the-board spending cuts will take effect beginning January 2013, affecting defense and non-defense budgets about equally.  Cuts to the big health care programs are cushioned by the recently enacted debt bill, which limits Medicare cuts to no more than 2 percent each fiscal year for ten years, and exempts Medicaid and CHIP entirely.

But this is no cause for celebration.  Some Medicare providers operate on less than 2 percent positive margins serving Medicare patients.  Moreover, this potential reduction must be considered in light of two (or even three) other factors that could affect providers in the near term.  One is a significant reduction in physician fees required by sustainable growth rate (SGR) legislation absent a “doc fix” bill.  Another is the implementation of various accountable care payment modifications included in the Affordable Care Act and subsequent administrative rule making (see this new CSC white paper).  A third is the uncertainty linked to growth reduction plans to be proposed by the Medicare Independent Payment Advisory Board (IPAB), slated to go into effect in 2015.  In short, providers and many health plans (not to mention millions of beneficiaries) are concerned about the double, triple or quadruple whammy that could substantially change this highly popular federal program whose costs have run amok.
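
To see why even a capped 2 percent cut stings, a back-of-the-envelope sketch helps (all figures here are hypothetical; the sub-2-percent margin is simply the scenario described above):

```python
# Hypothetical illustration: a 2% across-the-board Medicare payment cut
# can turn a sub-2%-margin provider unprofitable on its Medicare business.
medicare_revenue = 100_000_000   # annual Medicare revenue, USD (hypothetical)
operating_margin = 0.015         # a 1.5% margin, below the 2% cut ceiling
sequestration_cut = 0.02         # maximum annual Medicare cut under the debt bill

costs = medicare_revenue * (1 - operating_margin)               # ~$98.5M in costs
revenue_after_cut = medicare_revenue * (1 - sequestration_cut)  # ~$98.0M in revenue
profit_after_cut = revenue_after_cut - costs                    # negative: a loss
print(f"Profit after cut: ${profit_after_cut:,.0f}")
```

The arithmetic scales to any revenue level: whenever a provider's Medicare margin is below the size of the cut, that line of business flips to a loss.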

What are the implications for healthcare?  At one extreme, providers will go bankrupt and millions of seniors will be unable to get care.  At the other, providers will buckle down and figure out how to operate more efficiently and effectively.  More likely, some providers will operate with reduced margins, others will innovate and offer value propositions, and still others will voluntarily exit the Medicare market hoping to concentrate on more lucrative sources of payment.

Another perspective is that squeezing providers by reducing government health care program spending simply accelerates cost shifting to the private sector – reflected in higher premiums, higher self-funded plan costs, and higher out-of-pocket costs for individuals.

Future funding obligations for Medicare and Medicaid constitute about 23 percent of federal spending.  This amount is five times greater than that for Social Security.  So how can the powerful debt reduction panel—the gang of twelve—not consider mechanisms to reduce the cost of the Medicare and Medicaid programs?  They will.  We now experience a calm before the storm as politicians, powerful interest groups, vaunted thought-leaders, and perhaps the mass public (especially the elderly) prepare for a period of vigorous debate.

Emerging Models for Retained IT

David Moschella, Global Research Director, CSC Leading Edge Forum

In an era of both increased sourcing and new technology proliferation, Enterprise IT organizations are deciding what skills they should and shouldn’t retain in-house. The future mission of Enterprise IT is clearly changing, but the nature of the change is not particularly well understood or articulated. Forbes recently published an article that discussed how disruptive forces, such as cloud computing, are influencing the role of CIOs.

It is evident that Enterprise IT is likely to change over the coming years, but what do these changes mean to the skills, competencies and cultures that firms need to retain and develop?

We see four new models emerging to replace the way IT departments traditionally organized themselves:

1.) The Studio model. The metaphor is that of a Hollywood studio, where a variety of skills – directing, acting, camera, sound, lighting and so on – are brought together in flexible, ever-changing teams to take on sophisticated projects. In this model, IT will be increasingly expected to work seamlessly with scientists, engineers, marketing professionals and others to deliver projects of high strategic value to the business.

2.) The DIY model. This metaphor comes from stores such as Home Depot in the US or B&Q in the UK. These stores provide a huge range of goods (like the IT marketplace itself), but a key part of their value proposition is their knowledgeable staff who provide “free” information and advice. Analogously, many employees and departments will increasingly do their own IT, but will occasionally need expert support.

3.) The Process model. No one understands the details of many key business processes better than Enterprise IT. In most cases, it is the underlying sequence of information processing steps that defines how a business process actually works. Increasingly, IT will take responsibility for, and even ownership of, end-to-end business processes, greatly strengthening the CIO/COO relationship.

4.) The Stewardship model. The word, steward, has many connotations, but we particularly like its root meaning as a “keeper.” In the past, this role has been largely focused on information security and associated risks, but going forward, it will expand to include issues such as information architecture and master data management, especially in firms that seek to be information and data-driven.

The opportunities in information technology have never been greater, but only for those organizations that embrace a much more front-of-the-firm future. Read more about the changing nature of business in our Future of Retained IT report.

IT Industry Leaders Release Recommendations for Government Transition to Cloud

Jim Sheaffer, President, North American Public Sector, CSC

The U.S. government earlier this week gained two valuable aids for its transition to new technologies enabled by cloud computing. TechAmerica, responding to requests and encouragement from the Obama administration, released a set of 14 recommendations to guide government through this transition, along with a “Buyer’s Guide” for federal agencies carrying out the administration’s Cloud First policy.

I am personally proud of these two products because I have been privileged to lead some of the efforts that produced them. As I wrote in a previous post, I have served as one of two vice-chairs of TechAmerica’s Commission on the Leadership Opportunity in U.S. Deployment of the Cloud, otherwise known as CLOUD2. The commission comprises 71 IT leaders, mostly from industry, who are helping the government understand how best to move forward in the inevitable – and ultimately highly beneficial – shift to the cloud.

The Commission has undertaken this work because we believe, in the words of Michael Capellas, chairman and CEO of VCE and the Commission’s co-chair, “Faster adoption of cloud computing will strengthen the United States’ leadership position in the global marketplace and ignite creation of jobs that will be in high demand over the next decade.”

Cloud computing is a highly disruptive technology that enables innovative ways of doing things.  The potential improvements in productivity, entrepreneurial growth and our standard of living are enormous.  The Commission’s 14 recommendations address current barriers to adoption, innovation and growth in the cloud. Some of these barriers have prevented government agencies from moving to the cloud; others have hindered commercial deployment. The recommendations cluster around four themes: (1) Trust, (2) Transnational Data Flows, (3) Transparency and (4) Transformation.

Trust and transparency are familiar issues with respect to the cloud.   Work remains to be done to strengthen security, identity management and information sharing in the cloud.  Also, metrics for cloud offerings must be developed so that customers can understand what they’re buying.

The transnational nature of the cloud leads quickly to matters of national sovereignty and international law. A Brookings report released earlier this week explores the collision of cloud computing and export-control laws, but the difficulties hardly end there. The Commission’s recommendations in this area are complex. They also are urgent. The United States is both a major consumer of the cloud and a leader in cloud markets and innovation. If we are not proactive in addressing issues of transnational data flows, we may impede the global process of cloud adoption.

A major focus in the Transformation area is the federal procurement process, codified in the Federal Acquisition Regulations (FAR). The Commission’s research found that the FAR is flexible enough for acquiring cloud services. What’s needed is a change in mind-set, not only among agencies but also in Congress and the Office of Management and Budget (OMB). We further recommend establishing fiscal incentives to encourage agencies to implement cloud solutions. Improvements in infrastructure are also important, including the move to IPv6 – the cloud depends upon reliable and modern networks.

Finally, transformation cannot happen without education and training. Transitioning to the cloud will require new capabilities for business and agency leaders, acquisition and IT workforces. We recommend that government, industry and academia continue to collaborate to develop and distribute the necessary educational resources.

I feel strongly that the work of CLOUD2 is important. The U.S. can achieve global leadership if we act boldly and create an environment of cooperation among government, academia and the private sector.  By embracing the cloud, government can demonstrate, through example, the unprecedented opportunity this transformational technology offers to improve government performance and reduce IT costs.

Will the FDCCI Create a Bright Future for IT Professionals?

Yogesh Khanna, Vice President, Chief Technology Officer, CSC

The Federal Data Center Consolidation Initiative (FDCCI) went mainstream last week when the New York Times published an article on the subject. Most of it was old news to the federal IT community, but it was probably new information for the rest of the world.

Aimed at readers who do not live and breathe high tech, the article provided a lot of context about data centers and cloud computing. It was accurate, useful context as far as it went. But I found myself wishing for a richer picture concerning FDCCI’s impact on jobs. That may be too much to ask from an article like this one, but the probable future of the IT profession is more complex and interesting than a simple “tens of thousands of jobs will most likely be eliminated.”

There is indeed a general belief among COOs and CFOs that consolidating data center operations and adopting cloud computing will reduce the staff needed to manage IT operations. There are plenty of examples in industry that validate this belief. But the role of IT professionals will also evolve.

Even as infrastructure and workloads migrate into the cloud, IT professionals will still be required to manage and integrate cloud-based application services. But the IT professional of the future will have to be much more business centric, and will need “soft skills” to communicate with the business managers and refine the business models that define cloud computing.  Rather than managing inefficient, underutilized infrastructure and business applications, the future IT professional will collaborate with key stakeholders and strategic vendor partners to co-create business value.

IDC predicts that as a result of cloud computing, there will be fewer large-scale contracts for outsourcing functions as well as less spending by government agencies on custom application work.  Competition by the largest vendors will focus on consulting, implementing and managing private clouds, and small vendors will continue to focus more on “traditional” IT. [IDC, Government Insights #GI226454, January 2011]

This evolution will lead to job growth and present opportunities to a new generation of IT professionals who can provision and manage a set of highly integrated services on a virtualized and highly automated infrastructure that can be configured using easy-to-understand business rules.

So while the adoption of cloud computing may have a detrimental impact on IT jobs in the near term, the future looks bright for IT professionals.

Operating in Cyberspace – DoD’s Policy Architecture Begins to Emerge

Guy L. Copeland, Vice President, Information Infrastructure Advisory Programs, CSC

Moves to put in place another aspect of the national cybersecurity strategy environment occurred recently with the release of the Department of Defense Strategy for Operating in Cyberspace.

The strategy, delivered by Deputy Secretary of Defense William J. Lynn III, marks an important step in supporting our national interests in cyberspace, with the definition of five initiatives:

  • Elevate cyberspace to a domain of DoD
  • Adopt active defenses
  • Increase cooperation between Department of Homeland Security and the private sector
  • Cooperate with NATO and other partners
  • Invest in technology and training to make the Internet more secure

While wide-ranging, the strategy does not fully address, or clearly resolve, all aspects of cyberspace, reflecting the reality that gaps remain in our national policy.  For example, the strategy does not make clear what differentiates the use of cyber as an instrument of national power in a contested environment from armed conflict, or cyber war, reflecting that such considerations are still a “work in progress.”

Though many have called on the government to discuss cyber war explicitly, unanswered questions remain about cyber war and offensive operations. There may, however, be strategic advantage in retaining ambiguity on this topic.

Addressing these questions may need to take place within the larger context of the foreign relations and national security responsibilities of other government departments and agencies. We are just beginning to see the outline of a policy architecture that should lead to a whole-of-government approach to cyber policy and operations.

In addition, the strategy does not touch yet on some of the principal challenges that continue to impede public-private information sharing.

Private sector interests must consider and address the potential liabilities associated with providing, receiving and acting on cybersecurity information. And, in the global economy, effective information sharing among governments, foreign companies and companies operating globally is critical.

Cybersecurity is a team sport.  The essence of successful collaborative partnership is trust, teamwork, and information sharing about threats and best practices to mitigate those threats.

In the end, the DoD strategy provides evidence that the public sector’s policy architecture is beginning to emerge – and that may be its most important feature.  We are beginning to define our national interests in cyberspace, to determine who will develop what aspects of policy to support those interests, what policies are needed, and how resources will be allocated in support of those policies.

While much work remains to be done, we see the DoD Strategy for Operating in Cyberspace as a step in the right direction. We look forward to helping address the many challenges that remain.

Cyber: The Strategic Priority to Industry and Government Executives

Michael W. Laphen, Chairman, President and Chief Executive Officer, CSC

Last week I had the pleasure of hosting some of our most distinguished clients alongside government officials and other private sector executives in a candid discussion about the boundless nature of the potentially life-altering threats that exist within the IT infrastructures, communications networks and computer systems that comprise our world’s cyberspace.

The list of organizations across the private and public sectors that have been breached by a cyber attack continues to grow by the week. While this fact made for a timely backdrop for our conversation, the harsh reality is that if we, as an industry and as a nation, don’t make a concerted effort to raise awareness and embed security into our core corporate and government competencies, the potential is there – as one of our panelists described it – for the most disruptive force to emerge in our world since the discovery of the New World.

That statement is an indication not only of the gravity of the threats knocking on our door, but also of the scale and velocity with which attacks have the capacity to impact our lives on every conceivable level – at work, at home or abroad. Though not nearly as developed as it needs to be, the effort to thwart cyber attacks is well underway. During our discussion, several interesting points were raised that I’d like to share.

Public/Private Partnership – The overarching consensus, and mind you there were organizations at the table operating around the world, was that there needs to be a shared responsibility between the public and private sectors to successfully manage the future cyber threat. Lines of communication have to be maintained in order to keep up with the pace and characteristics of the attacks, make the overall effort bottom-line relevant when answering to shareholders and constituents, and develop/adopt a set of standards that all organizations can model.

Defense and Resilience – Defense of the network perimeter is crucial, but even more important is the resilience of an organization to maintain operations after an attack has occurred. More than likely, those creating the damaging code in the form of phishing attacks, botnets and the like are going to make it into a network if they put their minds to it. Some of these threats reach the sophistication level of commercially developed and sold software used by millions of people. For this reason, resiliency and the ability to curb negative repercussions stemming from the loss of intellectual property will ultimately be the determining factor of success.

The Cyber Domain – In the national security context, cyber is more commonly being defined as a “domain” of equal importance to air, land and sea, and thus requires all of the pertinent doctrine and legal statutes. One of the fundamental differences with cyber, and, frankly, one of the positives, is that, unlike the other domains, we have the capability to control it. The group roundly applauded the Administration’s efforts to develop a set of policy standards around cybersecurity and agreed that, through collaboration with the private sector, the proposal can become a benchmark for securing the new cyber domain. We agreed, however, that the effort must gain additional momentum.

Assurance and Attribution – Last year, for the first time, computer network activity was used to cause physical damage to a nation’s critical infrastructure. There is a growing concern about the lack of trust and assurance of knowing where attacks like Stuxnet originate, and who could be selling that information to the highest bidder. Terrorist networks and enemy states are recruiting “cyber warriors” and creating more sophisticated threats. The private sector would be smart to make a concerted effort to create solutions that allow clients to withstand these attacks and even determine their origin. Without assurance and attribution, lowering our level of risk remains difficult.

The Emergence of Cloud – Cloud computing, and the evolution of the IT services industry to an “as-a-service” business model, presents organizations with the opportunity to pay for alternative networks that offer greater cybersecurity protection. In this scenario, cloud providers would develop separate architectures, provide guarantees of security and accept liabilities, and recoup their investments with higher rates than current public cloud providers.

Whatever sector to which your business belongs, we are all operating within the same threat environment, and we all share a set of extremely dynamic and persistent cybersecurity threats that pose challenges on a daily basis. While news reports of the latest hack may be on the mind of the general public, millions of intrusions go unreported every day. Together, through an open and ongoing dialogue between business and those enforcing and creating the law, we can overcome the hurdles, thwart attacks and operate confidently in cyberspace.

Intelligent Rail Systems… But Where’s My Cargo?

Jim Taylor, Vice President, Travel & Transportation Industry Group, CSC

I think the rail industry’s advancements in the use of technology are great, as they bring our society closer to the realization of a vision for effective transportation systems. But an Intelligent Rail System has the potential to be yet another example of a victory in isolation unless it is part of a more holistic approach to Intelligent Freight or Passenger Transportation systems. Taken in isolation, it is an advancement no doubt, but if Intelligent Rail was developed in concert with container shipping lines, truckers and logistics service providers, it would be a major breakthrough.

Don’t get me wrong, it is great to know where a train is, even better to know that it is operating in a safe manner and fantastic that it has the ability to adjust its speed and route based on real-time traffic and route-setting adjustments based on alerts and notifications. But having the ability to provide real-time updates back to customers on where their cargo is, and the condition of that cargo if temperature-sensitive or if hazardous, is the more important breakthrough that is still pending. Further, to know when it will arrive at the destination rail ramp, how that compares to a customer’s inventory planning schedule, whether the payments to release it have been made and whether the next leg of transportation is standing by waiting for the cargo arrival – this is the paradigm shift that consumers and retailers are demanding, and thus the REAL challenge facing the rail industry.

A few years back, the freight railroads in the U.S. took a stab at improving cargo or shipment velocity by shortening the number of free days a trailer or container was allowed to sit on their rail terminals.  The “free storage” interval went from five days to two.  While this was a painful (and expensive) change to impose, users of the services (cargo owners) adapted and were able to move their cargo faster.  This meant better utilization of land, and significantly better utilization of the rail network and infrastructure.

The movement of freight via trains, much like ships, is a very capital intensive business. The only way to get the kind of returns necessary to justify these capital investments and the renewal of infrastructure is to increase the throughput – the volume of paying cargo or paying passengers – across this system.  Optimizing the cycle times needed to execute processes is crucial, and making the system easy for customers to use is another critical success factor. To achieve both of these goals, the systems have to be intelligent enough to generate accurate and timely data, which then must be converted into usable information.  That information in turn is used to help model and predict areas for improvement in processes, as well as to make it easier for customers to know when and where trains will arrive with a high degree of confidence. Rail transport systems today (as well as air transport) have significant buffer time built into schedules. This is partly due to congestion in the systems, but also due to poor data, and thus poor information. Schedule integrity and on-time performance reliability are not where they need to be.  We can do better, and the investments being made are a good indication that railroads are willing to spend money to improve their operations.

Hopefully, other players in the value chain will benefit from the railroads’ investments and attempts to improve their systems. As a user of rail services for many years, I am glad to see they have stepped up to the plate. I just hope we can create forums and initiatives where these advancements can be discussed and debated in order to create a more integrated approach and commitment to an optimized network of freight transportation for the entire country. Ideally, improving the velocity of shipments will lead to even lower transportation costs, better utilization of assets and, oh yeah, perhaps happier customers along the way!

Too Reliant on ITO? Seriously?

Peter Allen, President, Global Sales & Marketing, CSC

Recent press reports have emphasized the view that some IT Services providers have become “too reliant on IT Outsourcing.”  I find this remarkable on a number of fronts.

Foremost, most legitimate industry analysts have been prolific in describing the magnitude of the gravitational shift away from the “economies of scale” propositions that defined the IT Outsourcing promise of yesterday.  That era began to erode with the rise of the wage-based sourcing strategies that have dominated the industry for the past several years.  Tapping into lower-cost offshore expertise was a convenient and effective means to avoid outsourcing IT operations, and it began to highlight the power of process-driven efficiencies.

Offshoring taught the industry that scale-based propositions didn’t necessarily yield more value than those pivoting around “economies of skill.”

To wit, leading practitioners in the IT Services industry have been transforming their offerings to be decidedly more blended in terms of industrial-grade components that serve as the foundation for business-relevant “as a service” offerings driven by high-order skills in particular business processes.  Very few “desktop support outsourcing” contracts are awarded today; buyers favor “virtual end user computing services” that equip the mobile knowledge worker of the enterprise. Today’s leaders (and we count CSC among them) have been busy re-evaluating their service portfolios, making investments in emerging services, and bringing innovations in business models along with the latest technological solutions.

It’s not IT Outsourcing any longer.  It’s Infrastructure-as-a-Service.  That’s not a matter of nomenclature; it’s a matter of service pragmatics. Corporations and governments are not just buying different types of services; they want to move from being buyers of hardware and software and associated heavy capital expenditure to an “as a service economy” with operational expenditures.

We believe that within five years about 30 percent of the IT services business will be “as a service” rather than traditional outsourcing. Why do we believe this?  We listen to the plans of our Clients.

The advice I offer to prospective buyers of IT Services is to quiz candidate providers about their perspective on the industry and their strategy to succeed.  If the answer sounds like, “grow our base of traditional ITO business,” it’s time to move the conversation to another party.  The “as a Service” train has left the station, and ITO’s legacy is best expressed through the large number of informed and experienced buyers who can make the distinction between yesterday and tomorrow.

The Government Needs to Remove Telemedicine Roadblocks

Fran Turisco, Research Principal in the Global Institute of Emerging Healthcare Practices at CSC

Voxiva and ViTelCare are among the many companies in the Washington area and around the globe that have developed telemedicine solutions through online and mobile technologies. Telemedicine is considered a key technology enabler of U.S. health reform efforts, as numerous pilot efforts have shown it can contain medical costs while improving access and quality.

Instead of taking time off work for a face-to-face visit, patients can use telemedicine to send e-mails regarding minor health issues or medical questions, or use webcams for online visits. For physicians, online communication offers a unique way to control their schedules and optimize productivity.

Still, there are roadblocks to moving this technology forward: often, the regulations governing the processes lag well behind the innovations. Progress is slowly being made, but there are several ways the federal government, as well as the states, can help put telemedicine in the fast lane:

Telemedicine technology allows patients to confer with physicians anywhere around the country. But licensing requirements limit physicians’ jurisdictions.

Overcoming unnecessary licensing barriers is the best way to expedite professional mobility. Cross-state licensing “is seen as one element in the panoply of strategies needed to improve access to quality care services through the deployment of telehealth and other electronic practice services,” according to a recent report issued to Congress by the Health Resources and Services Administration, a U.S. Department of Health and Human Services agency that seeks to improve access to health care services for the uninsured. The agency has awarded grants to the Federation of State Medical Boards to promote physician license portability, encouraging states to set criteria allowing them to approve a valid license of another state.

To date, eight states have adopted the process. More need to do so.

Another problem is reimbursement for services. Not all payers reimburse for telemedicine services, and those that do have restrictions that prohibit telemedicine as a solution for widespread use. For example, waivers are needed to remove restrictions in Medicare Part A and B, to expand the types of locations for telemedicine such as a patient’s home, to include care for patients living in metropolitan areas and for a broader range of services such as physical therapy, occupational therapy and speech therapy.

Physician credentialing is another issue that has hindered telemedicine. The current process requires hospitals receiving the telemedicine services to credential the physicians from the sending hospital — an extremely burdensome task. However, the Centers for Medicare and Medicaid Services recently released a final rule to streamline the credentialing process, which should help.

Telemedicine leads to higher workplace productivity, reduced health care costs and less strain on care providers. Patients and the medical community understand the benefits of telemedicine. While progress is being made, the government needs to ensure that legal and regulatory changes are made so telemedicine can truly be a key component of health reform in the United States.

Originally published in the Washington Post, May 8


Privacy and Security – Does Anyone REALLY Care?

Mark Rasch, Director of Cybersecurity and Privacy Consulting, CSC

There has recently been a spate of data breaches and reported cyber-attacks. Epsilon and SONY’s PlayStation Network have lost data on millions; Amazon’s cloud services were taken down; the RSA SecurID token system was compromised; and the South Korean financial institution Nonghyup was reportedly attacked from North Korea. Meanwhile, Google’s offices in South Korea were reportedly raided by the South Korean police, looking for evidence that Google collected and shared geolocation data without the effective consent of users.

And yet, business goes on.

In an era where the COSTS of security and privacy can be readily measured, but the BENEFITS cannot, how do IT and risk officers encourage responsible data protection?

Consumers appear to care about privacy and security. When polled, they universally state that they want their data protected and their privacy respected. And yet, when there is a data breach, with the exception of a few industrious class-action lawyers, most consumers continue to make purchasing decisions based on other factors. The College Board saw no real drop in the number of people taking the SATs or AP exams as a result of the Epsilon breach. PlayStation sales will likely depend more on the next-generation PlayStation’s 3D graphics than on SONY’s privacy policies. This is not to say that such breaches are without cost. The SONY breaches (there were two) involved potentially 77 million and 25 million data records. Even accounting for redundancy between the two sets, at a shockingly low “repair” cost of $10 per record the expense conservatively reaches half a billion dollars. That’s real money.
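The back-of-the-envelope arithmetic behind that figure can be sketched as follows. The $10-per-record “repair” cost and the 50 percent overlap between the two breached record sets are illustrative assumptions, not reported numbers:

```python
# Rough estimate of breach cleanup cost from the two SONY incidents.
# Record counts are from the post; cost per record and overlap are
# illustrative assumptions, not reported figures.
PSN_RECORDS = 77_000_000   # PlayStation Network breach
SOE_RECORDS = 25_000_000   # second SONY breach
COST_PER_RECORD = 10       # dollars; deliberately conservative

def breach_cost(overlap_fraction: float) -> int:
    """Total cleanup cost, discounting records counted in both breaches."""
    total = PSN_RECORDS + SOE_RECORDS
    unique = total * (1 - overlap_fraction)
    return int(unique * COST_PER_RECORD)

print(f"No overlap assumed:  ${breach_cost(0.0):,}")   # just over $1 billion
print(f"50% overlap assumed: ${breach_cost(0.5):,}")   # roughly half a billion
```

Even under the generous assumption that half the records appear in both breaches, the estimate stays above $500 million, which is why “half a billion dollars” is a conservative floor rather than an upper bound.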

But investors are likely to punish SONY not for the breach itself, but for the cost of the breach.

Consumers seem unceasingly willing to forgive retailers for data breaches, particularly in the long term, if the retailer has a product or service the consumer wants. Indeed, information security, while necessary, is seen as a cost or an impediment to progress. Privacy is as well. A secure company that respects privacy and forgoes collecting some personal information may be at a competitive disadvantage relative to others. Sure, it is in better shape if there are fines or enforcement actions, but those are very rare. So, in this environment, why pay for security and why implement privacy?

A Business Case for Security and Privacy

All too often, privacy and security are seen at best as mandates, and at worst as impediments to business. They are things you “have” to do, like taking medicine or driving the speed limit. Sure, they’re good for you, but in the scheme of things, they’re not going to get you the big promotion or bonus. That’s the wrong way to look at security.

At their core, security and privacy are ENABLING technologies. Cloud, mobility and location services are all transformative technologies. We can access anything and everything from anywhere. Sensitive email finds us wherever we are. We can do online banking, brokerage and communication. Collaboration can be done from workstations, iPads and smartphones. The pace of information sharing and data collection continues to grow exponentially. But NONE of these things can be done without effective security – not perfect security, not perfect privacy, but effective security. Authentication, access control, real-time monitoring, secure architecture, secure coding, forensics and incident response are all necessary conditions for any Internet-based technology. Just as elevator technology was necessary to enable the development of skyscrapers, security technology is essential to enabling cloud, mobility, Bring Your Own Device flexibility and a host of new products and services.

The lesson from these hacks and attacks is that security MUST be embedded in everything we do online. It cannot be “bolted on” afterwards. We must develop a “culture of security” in which respect for privacy and understanding of risk are integrated into product development and design. The PlayStation began its life as a gaming device. It then became a gaming PLATFORM, permitting multiplayer gaming. From there it morphed into a communications platform, allowing users to talk to each other while playing. Permitting in-game purchases once again changed the platform into a full-fledged e-commerce site, raising PCI and other privacy concerns. In the future, the addition of features like voice, facial or gait recognition, augmented reality, location enablement or a host of other capabilities would once again transform the nature of the platform itself. Security and privacy processes that were adequate for a stand-alone gaming device may not be adequate for the new network-enabled technology, and privacy concerns increase with each added feature.

With each new feature, companies must address these concerns head on. Regulators and lawmakers are looking carefully at the lessons companies take from these hacks and attacks, and remain wary of “industry self-regulation.” If industries do not act swiftly to protect the privacy and security of data in a meaningful way, regulators will step in. At that point, investors WILL take notice.
