By Lawrence C. Miller, Peter H. Gregory

Sensitive assets, including data, must be appropriately protected throughout their lifecycles. As a security professional, that’s your job. Information lifecycle management (ILM) covers data through the following five stages:

  • Creation. Data is created by an end user or application. Data needs to be classified at this time, based on its criticality and sensitivity, and a data owner (usually, but not always, the creator) needs to be assigned. Data may exist in many forms, such as documents, spreadsheets, email and text messages, database records, forms, images, presentations (including videoconferences), and printed documents.
  • Distribution (“data in motion”). Data may be distributed (or retrieved) internally within an organization or transmitted to external recipients. Distribution may be manual (such as via courier) or electronic (typically over a network). Data in transit is vulnerable to compromise, so appropriate safeguards must be implemented based on the classification of the data. For example, encryption may be required to send certain sensitive data over a public network. In such cases, appropriate encryption standards must be established. Data loss prevention (DLP) technologies may also be used to prevent accidental or intentional unauthorized distribution of sensitive data.
  • Use (“data in use”). This stage refers to data that has been accessed by an end user or application and is being actively used (for example, read, analyzed, modified, updated, or duplicated) by that user or application. Data in use must be accessed only on systems that are authorized for the classification level of the data and only by users and applications that have appropriate permissions (clearance) and purpose (need-to-know).
  • Maintenance (“data at rest”). At any time between its creation and disposition when data is not “in motion” or “in use”, it is maintained “at rest”. Maintenance includes the storage of data (on media such as a hard drive, removable USB thumb drive, backup magnetic tape, or paper) and its filing (for example, in a directory and file structure). Data may also be backed up, and the backup media transported to a secure off-site location (referred to as “data in transit”). Classification levels of data should also be routinely reviewed (typically by the data owner) to determine whether a classification level needs to be upgraded (not common) or can be downgraded. Appropriate safeguards must be implemented and regularly audited to ensure
    • Confidentiality (and privacy). For example, using system, directory and file permissions, and encryption.
    • Integrity. For example, using baselines, cryptographic hashes, cyclic redundancy checks (CRCs), and file locking (to prevent or control modification of data by multiple simultaneous users).
    • Availability. For example, using database and file clustering (to eliminate single points of failure), backups and real-time replication (to prevent data loss).
  • Disposition. Finally, when data no longer has any value or is no longer useful to the organization, it needs to be properly destroyed in accordance with corporate retention and destruction policies, as well as any applicable laws and regulations. Certain sensitive data may require a final disposition determination by the data owner, and may require specific destruction procedures (such as witnesses, logging, and a magnetic wipe followed by physical destruction).
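The integrity safeguard described above, using cryptographic hashes, can be sketched in a few lines. This is a hypothetical illustration: it records a SHA-256 digest when data is stored and recomputes it later to detect tampering. The function name and sample data are inventions for the example, not part of any specific product.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of the given data."""
    return hashlib.sha256(data).hexdigest()

# Record a baseline digest when the data is stored at rest...
original = b"quarterly-report-contents"
baseline = sha256_digest(original)

# ...and recompute it later to verify integrity.
assert sha256_digest(b"quarterly-report-contents") == baseline   # unchanged
assert sha256_digest(b"quarterly-report-contents!") != baseline  # tampered
```

Any change to the data, however small, produces a different digest, which is why hashes (unlike simple CRCs) are suitable for detecting deliberate modification.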

Data that has merely been deleted HAS NOT been properly destroyed. It is merely “data at rest” waiting to be overwritten, or inconveniently discovered by an unauthorized and potentially malicious third party!

Data remanence refers to data that still exists on storage media or in memory after the data has been “deleted”.
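One common mitigation for remanence on magnetic media is overwriting a file's contents before deleting it. The following is a minimal sketch of that idea (the function name is hypothetical); note the caveat in the docstring, which is why sensitive data often requires degaussing or physical destruction instead.

```python
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents with random bytes, then unlink it.

    Illustrative only: on journaling filesystems, SSDs with wear
    leveling, and copy-on-write storage, overwriting in place does
    NOT guarantee the original blocks are actually destroyed.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # random data each pass
            f.flush()
            os.fsync(f.fileno())                # force the write to disk
    os.remove(path)
```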

Baselines

Establishing a baseline is a standard business method used to compare an organization to a starting point or minimum standard, or to measure progress within an organization over time. With security controls, these methods provide valuable insight:

  • Comparing to other organizations. Organizations can compare their control sets with other organizations, to see what differences exist in controls.
  • Comparing internal controls over time. An organization can baseline its set of controls, to see what changes occur in its control set over a period of years.
  • Comparing control effectiveness over time. An organization can compare its record of control effectiveness, to see where progress is being made, and where more effort is needed to make progress.

Scoping and tailoring

Because different parts of an organization and its underlying IT systems store and process different sets of data, it doesn’t make sense for an organization to establish a single set of controls and impose them on all systems: like an oversimplified data classification program, a one-size-fits-all control set results in overprotection of some data and underprotection of other data. Instead, organizations often divide themselves into logical zones, and then specify which controls and sets of controls are applied in each zone.

Another approach is to tailor controls and sets of controls to different IT systems and parts of the organization. For instance, a password-strength control can be defined in categories that are applied to systems at varying security levels.
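Such tailoring might look like the following sketch, in which a single password-strength control is split into categories keyed by a system's security level. The level names and policy values here are invented for illustration; a real organization would derive them from its own classification scheme.

```python
# Hypothetical tailoring of one control (password strength) into
# categories applied according to each system's security level.
PASSWORD_POLICIES = {
    "low":    {"min_length": 8,  "require_mfa": False, "max_age_days": 365},
    "medium": {"min_length": 12, "require_mfa": True,  "max_age_days": 180},
    "high":   {"min_length": 16, "require_mfa": True,  "max_age_days": 90},
}

def policy_for(system_security_level: str) -> dict:
    """Look up the tailored control category for a system's security level."""
    return PASSWORD_POLICIES[system_security_level]
```

A high-security system would thus inherit the strictest category (for example, `policy_for("high")` requires 16-character passwords) without forcing that burden on low-risk systems.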

Both approaches for applying a complex control environment into a complex IT environment are valid – they’re really just different ways of achieving the same objective: applying the right level of control to various systems and environments, based on the information they store and process or on other criteria.

Standards selection

Several excellent control frameworks are available for security professionals’ use. It is never necessary to start from scratch. Instead, the best approach is to start with one of several industry-leading control frameworks, and then add or remove individual controls to suit the organization’s needs.

Control framework standards include

  • ISO/IEC 27002, Code of practice for information security management.
  • COBIT, Control Objectives for Information and Related Technology.
  • NIST SP 800-53, Recommended Security Controls for Federal Information Systems and Organizations.

Cryptography

Cryptography plays a critical role in data protection, whether we’re talking about data in motion through a network or data at rest on a server or workstation. Cryptography is all about hiding data in plain sight: in situations where persons may be able to reach sensitive data, cryptography denies them access unless they possess the decryption key and the method for using it.
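The "hiding in plain sight" idea can be demonstrated with a deliberately simple one-time-pad sketch: the ciphertext is meaningless without the key, and the same operation with the key recovers the plaintext. This toy example is for illustration only; real systems use vetted algorithms and protocols (such as AES and TLS), never hand-rolled ciphers.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte (one-time pad)."""
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"Attack at dawn"
key = secrets.token_bytes(len(plaintext))  # key as long as the message

ciphertext = xor_bytes(plaintext, key)     # unreadable without the key
recovered  = xor_bytes(ciphertext, key)    # the same operation reverses it
assert recovered == plaintext
```

Anyone who intercepts only the ciphertext gains nothing; possession of the key is what grants access, which is exactly the property the paragraph above describes.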