Securing Your Business in the Cloud: Asset Management

Originally published by Tom’s IT Pro, then a unit of TechMedia Network.

It’s fair to say it’s impossible to draw an accurate picture of an information network anymore.  You can’t put up a picture of a castle with a moat and a drawbridge and talk about security in terms of a fortress — nor, for that matter, as trench warfare or the Battle of Midway or any other situation that can be simulated on a Hollywood soundstage.  Not that this stops folks from trying.

“In all my years in security, I’ve never recognized a timeframe where the [chief information security officers] I talked to had more trepidation,” remarked Jon Oltsik, a 19-year veteran security analyst currently with Enterprise Strategy Group.  He was speaking at a pre-conference session packed with attendees of the 2014 RSA security conference in San Francisco.  “We’re at a point in history where the things that we relied on historically — the controls, the technologies, the processes — really don’t work anymore.

“In World War I, the most high-tech war beforehand was the Civil War,” Oltsik continued.

When the Allies started charging hills like they did in the Civil War, the Germans responded with machine guns and pill boxes.  So there was different technology, and [the Allies] had to evolve their warfare tactics accordingly.  We’re not evolving our warfare tactics.  And too many security professionals really don’t know what to do.

The problem is this:  Information is no longer containable.  Until very recently, the entire problem of information security was framed in the context of the devices that contain it — closing off hard drives, storage networks, and processors; establishing checkpoints; and asking for passwords.  Essentially, all information used to be bits, and bits were in memory or on hard drives or in transit over wires.  But the “containment” of information, such as it is, has evolved from a physical presumption into a service.  The structures through which all data is accessed are now virtual.

Here’s how Eric Chiu, president and co-founder of access control tools provider HyTrust, describes the situation:

If you’re a company of 100,000 employees, and now you’ve enabled that company to use any mobile device, instead of having 100,000 desktops, you now have 200,000 endpoints.  Maybe more, because people not only have their phones but their tablet.  Now you have this proliferation of endpoints, and any one of those could be a point of attack.

And by the way, you’ve got to keep up with all these different versions of endpoints.  iPhone is probably the most standardized, but even with that, you probably have everything from iPhone 3 up to 5 and 5s.  Think about Android.  There are so many different versions of Android — hardware, software — and so many different versions of BlackBerry.  Any one of those could be compromised, and could lead to data being accessed through that mobile device.

The Internet is now quite literally an information network — a superstructure of data constructs that transcend all manner of memory, storage, and routing.  The notion of “information at rest,” as opposed to “information in motion,” has become silly.  Information is always somewhere, and there is always some mechanism that is capable of locating and accessing it.  This is by design.

Data centers are no longer centralized.  Data warehouses are actually globally distributed data constructs designed to be accessible logically through what presents itself as a single source — not even the “endpoint” is an endpoint.  Mobile devices have effectively rendered the strategy of protecting the PC moot.  And as Chiu points out, there’s a new class of virtual systems being deployed to enable mobile devices to access corporate networks behind protected envelopes.  But any amount of endpoint protection on those virtual envelopes will not reduce the likelihood of the servers running those envelopes being compromised at the headend.

“I’ve just hit the mother lode,” says HyTrust’s Chiu, portraying the natural response of a malicious actor who’s bypassed all 100,000 endpoints and penetrated the one virtual system that counts.  “I can access everybody’s data in that one central site.  That’s crazy to think about.”

Theoretically, the list of obstructions rendering it slightly difficult for a fellow to access the credit card numbers of millions of department store customers from any broom closet chosen at random is chillingly finite.

Define “Asset”

The basic principles of information security should have changed already.  It’s not like everyone failed to see the Internet coming — or at the very least, everyone except Bill Gates.  The change is happening.  It’s slower than it needs to be, it’s leaving a lot of security experts bewildered, and it’s altering the structure of every organization’s IT department.

The first key change is the definition of an information asset.  It’s no longer a device.  Rather, it’s the actual information.  In this new model of the information network:

  • An asset may be a document or file or database from which the information is logically accessed.  It may be stored or replicated in any number of places.  It may be written down on a piece of paper.  In the case of research institutions, it may quite literally exist solely in someone’s mind.  But in this model, it’s a single, tangible, identifiable asset because:

  • Someone is responsible for it.  One name being proffered for this role is steward.  This is not necessarily (and probably not) someone on the IT staff, but rather the employee or worker who created it or has been delegated authority over it.

  • The objective of asset management in the context of information security changes from protecting the device(s) to protecting the information.

  • The role of policy in this new asset management scheme shifts to specifying who has permission to access the asset, how, when, and under what conditions.

  • The system of ensuring that policies are effective and enforced for each potential point of access (or trespass) is called, simply enough, controls.

  • The regimens of controls that organizations must uphold in order to meet requirements — most of them legal ones — for their respective industries, are frameworks.  And in this new system of information security, frameworks are providing the guideposts for organizations to enact sensible, manageable policies that rely more upon people than upon software.  Proper information asset management can only be carried out by people, perhaps with the aid of software, but not by means of any process that can be effectively automated.  The surprising side-effect of implementing a governance framework, such as ISO 27001 or ISO 27002, is that it gives every employee incentive to participate in security policy.
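To make the model above concrete, here is a minimal sketch, in Python, of what an information-asset register built on these principles might look like.  Every name in it (the classes, the stewards, the asset) is illustrative, not drawn from ISO 27001 or any other standard:

```python
from dataclasses import dataclass, field

@dataclass
class AccessPolicy:
    """Who may access the asset, and how -- set by the steward, not by IT."""
    readers: set = field(default_factory=set)
    editors: set = field(default_factory=set)

    def permits(self, user: str, action: str) -> bool:
        if action == "read":
            return user in self.readers or user in self.editors
        if action == "edit":
            return user in self.editors
        return False

@dataclass
class InformationAsset:
    name: str            # the asset is the information itself, not a device
    steward: str         # the responsible person -- usually not on IT staff
    policy: AccessPolicy

class AssetRegister:
    """A control: enforces each steward's policy at every point of access."""
    def __init__(self):
        self._assets = {}

    def register(self, asset: InformationAsset):
        self._assets[asset.name] = asset

    def access(self, user: str, name: str, action: str) -> bool:
        asset = self._assets.get(name)
        return asset is not None and asset.policy.permits(user, action)

register = AssetRegister()
payroll = InformationAsset(
    name="payroll-db",
    steward="hr.director",    # the steward, not the CIO
    policy=AccessPolicy(readers={"hr.clerk"}, editors={"hr.director"}),
)
register.register(payroll)
print(register.access("hr.clerk", "payroll-db", "read"))   # True
print(register.access("it.admin", "payroll-db", "read"))   # False
```

Note that in this sketch the register, acting as a control, enforces the steward’s policy at the point of access; IT operates the mechanism but does not own the asset.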

You often read about the phenomenon of “the consumerization of IT” as a disruptive force and a positive development, simultaneously.  From a security standpoint, BYOD has the consequence of rendering any inventory of an organization’s network, almost by definition, incomplete.

In 2012, in a model for security policy implementation that is intentionally being held to public scrutiny as it happens, Indiana University undertook an unprecedented effort to educate its own faculty, along with its students, as to the care and maintenance of its information systems.  It was IU’s “12 Domains in 12 Months” initiative.  Driven by the school’s chief privacy officer and chief security officer, “12-in-12” rallied the entire campus behind a program derived from a code of practice based on international standards.

Since IU’s program was derived from ISO 27001, it placed a high priority on asset management.  That was month 4.

As IU chief security officer Thomas R. Davis explains, there’s no longer a way to maintain a lock on the number of devices accessing the university network that may have access to, or that may end up storing, institutional information.  In addition to students whose devices enter and leave campus at random, separate research projects across a range of departments have their own individual budgets, which are used to purchase, use, and maintain systems and devices of their own.  So the IU network has to be somewhat open by design.  As Davis tells us in an interview:

There really isn’t a silver bullet or a one-stop shop for asset management — one tool that we use for inventory.  Normally it’s part of an audit process, where an auditing organization works with the IT individuals to inventory the assets and identify those that are critical for the function of the organization, and those that may contain, store, process, or transmit substantive information.  Other times, it’s done as the basis of the implementation of a policy.

For example, we have a new policy called IT-28.  It’s more about how we architect IT at the University to be as efficient and as secure as possible.  Part of that whole policy implementation was doing an inventory of the assets that are out in the school’s departments, and that may make more sense to have those assets within the central IT organization — the more robust and secure data centers that we have.

Centralized IT is difficult to accomplish throughout an organization when only certain parts of it are centrally funded.  So IU’s IT-28 policy first determines which resources are central to the campus, and therefore fall under the purview of university IT (UITS).  Teachers, researchers, and faculty may bring in their own devices and even systems, or may provision some resources from cloud providers.  Because those resources do not belong to UITS, UITS cannot assume management responsibility for them.  Yet it’s these resources that are arguably the most likely to be leveraged by malicious actors exploiting the IU network.

In-between these two classes are administrative resources whose funding may be provided in part by sources outside the university.  For that reason, this “middle” class of resources may actually be maintained by someone else’s centralized IT.  And these IT departments may actually impose requirements on the university, to protect their networks from incursion by a malicious actor originating on campus.

So IT-28 divides active resources into categories as part of everyday asset management.  From there, the standard relies upon education — a process where IU arguably has a unique advantage — to compel faculty, partners, and students to manage their own resources, in a manner consistent with UITS best practices.  Continues IU’s Davis:

The one key part of any kind of cloud environment is appropriate security awareness and education, and training of the entire university population:  What are the risks of cloud computing?  What are the things you need to factor in?  Consider what kind of information you’re sharing with this outside provider.

Also, in conjunction with that, the security and privacy professionals always want to collaborate with the IT organization to make sure that IT sees the risks of cloud computing, and can partner with the security group to develop, or purchase, or enter into agreements with third-party cloud providers, that solve the need of the faculty and staff.  [These third parties] provide IT services that scratch the itch, and minimize the need for faculty and staff to be creative in finding off-site cloud storage solutions that work better.  Ensuring that our course management system is robust and evolving, and meeting the demands placed upon them by faculty, is one example.

To that end, prior to IT-28’s implementation, adds Davis, IU established a relationship with cloud storage provider Box.net.  The company worked with IU’s data stewards committee, which manages its governance policies for institutional data, to ensure the committee’s guidelines are in line with what Box.net can deliver.

Responsibility and Liability

Relationships such as this are necessary for every organization to be able to coordinate stable and manageable policies for the ownership of information assets.  Under international standards such as ISO 27001, and more importantly for every organization whose assets are accessible over a network, the job of asset management should essentially be a delegation of responsibility.  Who owns what?  Prior to the cloud era, any asset considered informational was deemed the responsibility of the IT department.  Most likely someone in another department created it, but since IT was charged with its security, IT was given full access to it.

You can see where this leads:  All information had a set of keys, if you will, that was possessed by an administrator.  (For an understanding of the ramifications, see “Snowden, Edward.”)  That seemed to make sense:  All money was the responsibility of the CFO, all personnel the responsibility of the HR director or chief personnel officer, and all information the purview of the CIO — giving the CEO someone to fire or sue when everything under their purview got lost.  If you think I’m being flippant about that last sentence, consider the fact that the categorical name for this designation is liability.

Asset management under ISO 27001 and 27002 is not the act of finding a scapegoat.  It begins with the act of registering the stewards of information assets — who has the unchallenged right to see and use them, and to decide who else has similar permissions.  In another era (where portions of the intelligence community still reside), granting someone admin privileges was the easiest way to check this job off the to-do list.  With modern asset management, ownership is distributed throughout the organization.  And yes, some privileges may be delegated to a group leader or a senior manager or a mid-level executive while being withheld from the C-level.

How do organizations go about designating stewards while, at the same time, relocating some or all of their data centers’ infrastructure to cloud providers — which, quite typically, employ their own personnel who may have limited rights to access customer data?

I asked Roger Hale, who is now a senior director for global security at Symantec.  “It’s the same checklist,” he responded, “but it’s a checklist of who’s actually providing that information.”

When we’re talking about BYOD, [there’s] liability — who has the liability for access to that data?  If you’re an internal IT department, you have a [service-level agreement] with your business units for how you’re protecting that data, and agreements and policies within your company for working with internal audit compliance as well, today.  All you’re doing is formalizing that, and now you’re going out as a contractual service to a third party.  Even a high-tech manufacturer has, within their tech support systems, access to their customers’ confidential information.  They still have to manage that information in a specific manner.

This isn’t new ground.  It’s actually, how do we solve the problem and make it more efficient?  Because we are going to the cloud.  It’s not if, it’s when.

Confidence vs. Confidentiality

Since it may not be feasible to physically regulate every scrap of information in the corporate network, your next critical phase of information asset management is to prioritize those classes of information that require the most protection.  On the surface, this may seem as pointless as Homeland Security designating an appropriate hue for the alertness level.  But in practice, determining the criticality of information boils down to asking a very simple question:  How much damage would be caused to the organization if documents or databases were lost or stolen?
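As a rough illustration, that prioritization step can be reduced to a few lines of Python.  The impact categories, scores, and asset names here are invented for the example, not taken from any framework:

```python
# Rank information assets by the damage their loss or theft would cause.
# Categories and scores are illustrative only.
IMPACT = {"catastrophic": 4, "severe": 3, "moderate": 2, "minor": 1}

assets = [
    ("customer-cards.db", "catastrophic"),   # fines, lawsuits, lost trust
    ("hr-records.db", "severe"),
    ("cafeteria-menu.doc", "minor"),         # not every loss is a breach
]

# Protect the highest-impact assets first, rather than treating
# everything as equally critical (the zero-tolerance trap).
for name, impact in sorted(assets, key=lambda a: IMPACT[a[1]], reverse=True):
    print(f"{IMPACT[impact]}  {name}")
```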

There’s a frame of mind that remains prevalent in many organizations (especially state and municipal governments) that all information must be treated as equally critical, so that any information loss is considered a breach.  It’s a zero-tolerance policy that has the detrimental side effect of eliminating the texture and contours of the network from administrators’ and managers’ minds.  Once every resource is considered equally pertinent, eventually a sense of general irrelevance creeps in — that everything is equally insecure, like a zombie apocalypse film.

Strangely, this creeping feeling is most pervasive in organizations whose resource depth seems most unfathomable.

“When anything connects to a control system that hasn’t been through security analysis or security litigation, really, you’re just playing Russian Roulette,” says Paul Forney, chief security system architect for automation engineering company Invensys (soon to become part of Schneider Electric).  Forney and his colleagues deal with “controls” in two contexts, the second one being the policies and procedures put in place to maintain information security rather than just respond to events.  But “controls” are actually Invensys’ core business, producing control and safety systems for oil drilling stations and refineries, electric and nuclear power stations, and water treatment facilities.  Invensys’ clients are all considered prime national security targets in their respective countries.

So Forney helped develop a cyber security development lifecycle (SDL) that, at the conceptual level, is actually not that different from its physical security controls.  Stage 1 of SDL is network assessment, but it’s only numbered “1” because it’s the first item on the list.  In both a physical and a virtual asset management system, either of which may comprise literally millions of controls, assessment is a continual, ongoing process that must always include people, and that cannot — despite Invensys’ stated specialty — be completely automated.

But Invensys’ day-to-day scenarios are illustrative of the types of situations where BYOD policies truly cannot be tolerated.  The compromise of certain information assets here could lead to global catastrophes.  “It’s just not practical to have anything on the control system,” says Forney, “that has a dynamically assigned IP address, or that’s connecting via wireless where you don’t have gateways protecting the information flowing back and forth.  I’m not saying you can’t have good wireless networks and control procedures.  [But behind those procedures] is a great set of people working on the types of standards that make this secure.”

Few businesses in the world control information assets whose misuse or pilfering could lead to such dire consequences.  But the common-sense approach Invensys takes to securing those assets could apply to literally every business, including the corner donut shop:  Consultation and training with security professionals who are not software- or cloud-based but human beings (which in an ISO 27001 scenario may include auditors and risk assessors) enables businesses to prioritize.  This means identifying information assets whose loss could lead to the most severe consequences, and designing controls to maintain them first.

For a security professional to qualify for working with Forney and his colleagues, she must now become certified as a Global Industrial Cyber Security Professional (GICSP).  Mastery of the techniques necessary to achieve this certification requires a comprehensive understanding of a newly revised list of the most critical industrial security controls.  Numbers 1 and 2 on this list of 20 concern the regular task of inventory.  And a critical step in these tasks, as officially listed by the SANS security institute, reads as follows:

In addition to an inventory of hardware, organizations should develop an inventory of information assets that identifies their critical information and maps critical information to the hardware assets (including servers, workstations, and laptops) on which it is located.  A department and individual responsible for each information asset should be identified, recorded, and tracked.

No, not the IT department; no, not the CIO.  All information belongs to someone within a working department of the organization.  It should be up to that person to identify information assets, prioritize them in terms of criticality, and designate who has access to that information and how.  It is then up to IT to carry out these designations as policies — or, as the standards now refer to them, as controls.
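The SANS inventory step described above amounts to a two-way mapping, from information assets to the hardware they live on and back, with each entry carrying a responsible department and individual.  Here is a minimal sketch in Python, with all names invented for illustration:

```python
# Map critical information assets to the hardware on which they are located,
# recording a responsible department and person for each (per the SANS step).
information_assets = {
    "student-records": {
        "hardware": ["db-server-01", "registrar-laptop-7"],
        "department": "Registrar",
        "responsible": "j.doe",
    },
    "grant-proposals": {
        "hardware": ["research-nas-02"],
        "department": "Research Office",
        "responsible": "a.smith",
    },
}

# Invert the map: for any piece of hardware, which critical information
# does it hold, and who is accountable when it is compromised?
hardware_index = {}
for asset, record in information_assets.items():
    for host in record["hardware"]:
        hardware_index.setdefault(host, []).append(
            (asset, record["department"], record["responsible"])
        )

print(hardware_index["db-server-01"])
# [('student-records', 'Registrar', 'j.doe')]
```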
