Dictionary of Information Security From A-Z [Part 1]
This is a dictionary of information security: a compilation of security terms and definitions that working security professionals and IT students will find helpful.
This first part covers terms beginning with the letters A and B.
American Bar Association (ABA) Digital Signature Guidelines, a framework of legal principles for using digital signatures and digital certificates in electronic commerce
abend
abnormal termination of a software process or a system (ABnormal END); a crash. The word derives from an error message originally displayed on the IBM 360 computer: it does not appear in error messages of current operating systems although the term is still used. Abnormal or unexpected termination may result in possible security vulnerabilities, and so the term may also be used to refer to the option to terminate, in a controlled manner, a processing activity in a computer system because it is impossible or undesirable for the activity to proceed.
Abstract Syntax Notation One (ASN.1)
standard for describing data objects, this notation format is important to security because of its significance in networking discussions. OSI (Open System Interconnection) standards use ASN.1 to specify data formats for protocols. Syntax is needed to define abstract objects, and encoding rules are needed to transform between abstract objects and bit strings. In ASN.1, formal names are written without spaces, and separate words in a name are indicated by capitalizing the first letter of each word except the first word. For example, the name of a CRL is “certificateRevocationList.”
Acceptable Use Policy (AUP)
written policy outlining the usage that may or may not be made of computing or network resources. Previously, this applied primarily to institutions (such as universities) providing access to systems such as the Internet. Although the term is currently not as widely used, these instructions should still be part of a company’s security policy.
the final inspection to determine whether a facility or system meets the specified technical and performance standards. Similar to certification although generally referring to facilities rather than applications. This inspection is held immediately following facility and software testing and is the basis for commissioning or accepting the information system. If the system is accepted, it receives accreditation.
access
ability and means to communicate with or otherwise interact with a system: a specific type of interaction between a subject and an object that results in the flow of information from one to the other. A subject may access a file object to obtain data, or a subject may access a system resource and give it command information to obtain service. There is not full agreement on the definition of access: some would insist that the simple ability to receive information is not access unless the subject can also command the object.
access control
process of limiting access to the resources of a system only to authorized users, programs, processes, or other systems (in a network).
Synonymous with controlled access and limited access. Access control may be an administrative, physical, or technical control, but is most commonly implemented via technical controls limiting access to information or resources on a system. Access control is generally a preventive control.
access control list
list of users, programs, and/or processes and the specifications of access categories to which each is assigned
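As a sketch, the lookup an ACL implies can be modelled as a default-deny table; the `acl` contents and the `is_permitted` helper below are illustrative, not from any particular product.

```python
# Illustrative access control list: maps each object to the subjects
# allowed each access category (all names here are hypothetical).
acl = {
    "payroll.db": {"alice": {"read", "write"}, "bob": {"read"}},
}

def is_permitted(subject, obj, access_type):
    """Grant access only if the ACL explicitly allows it (default deny)."""
    return access_type in acl.get(obj, {}).get(subject, set())

print(is_permitted("bob", "payroll.db", "read"))   # True
print(is_permitted("bob", "payroll.db", "write"))  # False
```

Note the default-deny design: an unknown subject or object simply finds no matching entry and is refused.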
access control mechanism
hardware or software features, operating procedures, management procedures, and various combinations of these designed to detect and prevent unauthorized access and to permit authorized access in an automated system. Access control lists are a technical access control mechanism.
access level
the hierarchical portion of the security level used to identify the sensitivity of data and the clearance or authorization of users. Note: The access level, in conjunction with the non-hierarchical categories, forms the sensitivity label of an object. See category, security level, and sensitivity label.
segment of time, generally expressed on a daily or weekly basis, during which access rights prevail
the nature of an access right to a particular device, program, or file (e.g., read, write, execute, append, modify, delete, or create)
accountability
in the narrow view of technical security, the property that enables activities on a system to be traced to individuals (or entities) who may then be held responsible for their actions. More broadly, when dealing with ethics, the duty or voluntary agreement to be responsible for one’s actions, particularly with regard to visibility, transparency, or explanation. In this latter sense, to be accountable, one may have to freely reduce one’s claim on certain other properties that are ordinarily considered part of security, such as privacy.
accreditation
formal declaration by the command or management authority that the system is approved to operate in a particular security mode using a prescribed set of safeguards. Accreditation is the official management authorization for operation of a system and is based on the certification process and other management considerations. The accreditation statement affixes security responsibility with the management or operating authority, and shows that due care has been taken for security. Essentially, accreditation involves acceptance of the system.
management or command level with authority to accept a particular system
in business continuity planning, the implementation of a procedure, activity, or plan in response to an event or incident
active
attack or exploit involving an attempt to change or influence a system. See also passive, which generally involves listening or spying.
the system automatically blocks or acts against the progress of a detected attack. The response may take one of three forms: amending the environment (such as changing entries in firewall tables), collecting more information, or striking back against the user. (The last option is not recommended.)
ActiveX controls are software modules based on Microsoft’s Component Object Model (COM) architecture and appear to be Microsoft’s preferred form of active content for Web pages. ActiveX controls are, in fact, almost identical in structure to MS Windows programs, and have full system access. The only security provision is a digital signature system called Authenticode, which offers only “run/don’t run” options, and has additional security limitations.
activity monitor
type of antiviral software that checks for signs of suspicious activity, such as attempts to rewrite program files, format disks, etc. The term activity monitor is usually considered to include operation restrictor type software (also known as activity blocker or behaviour blocker), but is sometimes differentiated in that an activity monitor may just alert the operator to the attempt, rather than disabling it.
see operation restrictor
the retrofitting of protection mechanisms, implemented by hardware or software
administrative security
the management constraints and supplemental controls established to provide an acceptable level of protection for data. Synonymous with procedural security. More commonly referred to as administrative controls.
Advanced Encryption Standard (AES)
standard developed by the United States National Institute of Standards and Technology to succeed DES. Intended to specify an unclassified, publicly disclosed, symmetric encryption algorithm, available royalty-free worldwide.
adversary
entity that attacks, or is a threat to, a system
adware
while not necessarily malware, adware is considered to go beyond the reasonable advertising one might expect from freeware or shareware. Typically, a separate program that is installed at the same time as a shareware or similar program, adware will usually continue to generate advertising even when the user is not running the originally desired program. See also cookies, spyware, and web bugs.
circumstance in which higher level information (which may be thought to be subject to a higher level of security clearance) may be inferred from a large number of lower level data items. As a result, a collection of information items may require classification at a higher security level than any of the individual items that comprise it.
occurrence when a subject’s right to individual pieces of information results in knowledge to which the subject does not have a right. This is usually addressed by restricting combinations of accesses in certain ways.
AH
see Authentication Header
airplane rule
complexity increases the possibility of failure; conversely, simplicity increases robustness. The name comes from the observation that, while it appears obvious that having more than one engine in an airplane increases safety, in fact, a twin-engine airplane has twice as many engine problems as a single-engine airplane, and the loss of either engine may lead to instability.
Automated Information System. Term formerly used by the United States government and military for computer or electronic information systems. Sometimes found in older security texts.
alarm
any of a number of devices having three basic components: a sensor (that determines or triggers on some condition), a communications or control system, and an actuator or annunciator (that takes some action or alerts a user or operator). Alarms come in a wide variety of forms and complexities. Antiviral software may have a scanning engine (sensor), user interface (control), and report screen (annunciator). CCTV (Closed Circuit TeleVision) systems may have cameras (sensor), circuit (communications), and monitors (annunciator). Simple fire sprinkler heads have all components in one package: a plug that melts at high temperature (sensor) unblocking the flow (control) of the water behind it (actuator).
alert
notification that an event or incident has occurred
initial phase of a business continuity or disaster recovery plan, during which the first emergency actions, and assessments of damage and implications, take place
algorithm
sequence of steps needed to solve logical or mathematical problems. In security, the term usually refers to cryptographic algorithms used in encryption or decryption of data files and messages and to create digital signatures, but it may also refer to pattern matching in virus or intrusion detection that does not rely on the use of a simple scan string (see signature).
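To illustrate the distinction, a plain scan-string detector just searches data for known byte patterns; the signature bytes and the detection name below are invented for the example. Anything smarter than this (wildcards, statistical rules, behaviour analysis) moves toward algorithmic detection.

```python
# Toy scan-string matcher; the signature and its name are made up.
SIGNATURES = {
    b"\xeb\xfe\x90\x90": "Example.Boot.A",
}

def scan(data):
    """Return the names of all known signatures found in the data."""
    return [name for sig, name in SIGNATURES.items() if sig in data]

print(scan(b"\x00\xeb\xfe\x90\x90\x00"))  # ['Example.Boot.A']
print(scan(b"clean data"))                # []
```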
alias
name that an entity uses in place of its real name, in computing usually for purposes of convenience or brevity, but in security often for the purpose of either anonymity or deception
alternate routing
routing of a call or message over a substitute route when a primary route is unavailable for immediate use. Note that while this function increases availability, it may create problems with integrity or possibly confidentiality.
alternate site
site pre-arranged for use in the event of a business continuity incident. See cold site, warm site, and hot site.
annual loss expectancy (ALE)
the expected yearly dollar value loss from the system or activity by attacks or threats. Generally calculated by taking the value of a single such loss by a given threat or event (single loss expectancy), and multiplying by the expected number of events over time (annual rate of occurrence) (therefore also known as annualized loss expectancy). Also, sometimes the sum of a number of such calculations.
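The calculation itself is simple multiplication; a minimal sketch with invented figures:

```python
def annual_loss_expectancy(sle, aro):
    """ALE = single loss expectancy x annual rate of occurrence."""
    return sle * aro

# e.g., a stolen laptop costs 5000 per incident and is expected 3 times a year
print(annual_loss_expectancy(5000, 3))  # 15000

# the overall figure may be the sum of several threat/event calculations
print(annual_loss_expectancy(5000, 3) + annual_loss_expectancy(20000, 0.1))  # 17000.0
```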
anomaly detection
detecting intrusions by looking for activity that is different from the user’s or system’s normal behaviour. A type of intrusion detection system.
anonymity
the condition of having an identity that is unknown or concealed. To hide an entity’s real name, an alias may be used. In some applications, anonymous entities may be completely untraceable. See also anonymous login.
anonymous login
access control feature (or weakness) in many Internet hosts that enables users to gain access to general-purpose or public services and resources on a host (such as allowing any user to transfer data using ftp) without having a pre-established, user-specific account (i.e., username and secret password). This feature exposes a system to more threats than when all the users are known, pre-registered entities who are individually accountable for their actions.
ANSI bomb
use of certain codes (escape sequences, usually embedded in text files or email messages) that remap keys on the keyboard to commands such as DELETE or FORMAT. ANSI (the American National Standards Institute) is a shorthand form that refers to the ANSI screen formatting rules. Many early MS-DOS programs relied on these rules, and required the use of the ANSI.SYS file, which also allowed keyboard remapping. The use of ANSI.SYS is very rare nowadays.
antiviral
although an adjective, frequently used as a noun as a short form for antivirus software or systems of all types
see scanner, change detection, activity monitor
virus that specifically looks for and removes other viruses. These entities cannot be said to be beneficial or useful examples of viruses, since they have generally created more problems than the viruses they remove. See benign.
applet
small application transported over the networks, especially as an enhancement to a Web page. Applets often arrive from systems that cannot be verified as trusted. Two common applet systems are ActiveX and Java. Java applets are only allowed access to certain functions or information: this restriction is often referred to as the sandbox.
application development control
process or method intended to ensure that an application continues to operate according to its specifications and continues to be available
application level gateway
firewall system in which service is provided by processes that maintain complete TCP connection state and sequencing. Application level firewalls often re-address traffic so that outgoing traffic appears to have originated from the firewall, rather than the internal host. See also proxy server.
archive
(1) site containing a large number of files, possibly acquired over time, and often publicly accessible. See also ftp, particularly anonymous ftp. (2) file that contains a number of related files, usually in a compressed format to reduce file size and transmission (upload or download) time on electronic bulletin boards or download sites on the Internet. Most software distributed as shareware (or similar concepts) is distributed as an archive that contains all related programs, and documentation and possibly data files. Archived files, because of the compression, appear to be encrypted, and therefore, infected files inside archives may not be detected by virus or malware scanning software. See also compressed executable, self-extracting. (3) often synonymous with backup
armoured virus
virus that tries to prevent analysts from examining its code, particularly in terms of resistance to software forensics or forensic programming. The virus may use various methods to make tracing, disassembling, and reverse engineering its code more difficult.
American Standard Code for Information Interchange, a coding system that assigns numerical values to characters such as letters, numbers, punctuation, and other symbols, and used in most American manufactured computers. Often used as a synonym for text. ASCII allows only seven bits per character (for a total of 128 characters).
files consisting of only ASCII characters, and generally only the printable characters. With effort, it is possible to write program files for Intel-based computers consisting only of printable characters. (An example is the EICAR Standard Antivirus Test File.) Windows batch (BAT) files and Visual Basic Script files are also typically plain text, but are interpreted as program files, rather than being executed as object code.
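A rough test for such a file, assuming “plain text” means the printable 7-bit range plus common whitespace, might look like this sketch:

```python
def is_plain_ascii(data):
    """True if every byte is printable 7-bit ASCII or tab/newline/CR."""
    return all(0x20 <= b <= 0x7E or b in (0x09, 0x0A, 0x0D) for b in data)

print(is_plain_ascii(b"@echo off\r\n"))  # True: a batch file is plain text
print(is_plain_ascii(b"MZ\x90\x00"))     # False: a binary executable header
```

As the entry notes, passing such a test does not make a file safe: batch and script files are plain text yet still executable.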
ASN.1
see Abstract Syntax Notation One
assembly language
computer-oriented language whose instructions are symbolic and usually in one-to-one correspondence with direct computer instructions (machine language). Assembly language is generally specific to a given CPU (central processing unit), although a given CPU may have multiple assembly languages written for it. Machine language is specific to a given CPU, and there is only one machine language for a CPU.
asset
entity of value to the business or enterprise, be it a computer processor, disk, network link, program, datum, or user
assurance
measure of confidence that the security features and architecture of a system accurately mediate and enforce the security policy. The policy need not be tied solely to issues of confidentiality, but may address requirements for availability, integrity of data or processing, reliability, safety, or other factors. Assurance is often neglected in planning for security: each security function should have an assurance requirement or metric. Assurance may result from formal methods, or it may be partially determined by audit, penetration testing, simulation, testing, or third-party reviews.
assurance level
specific level on a hierarchical scale representing successively increased confidence that a target of evaluation adequately fulfills the requirements. The Trusted Computer Security Evaluation Criteria (TCSEC) is one example of such a hierarchy, the Common Criteria is another.
process used to determine that the security features of a system are implemented as designed, and are adequate for the proposed environment. This process may include hands-on functional testing, penetration testing, and/or verification.
asymmetric key encryption
asymmetric key encryption, also known as public key encryption, uses two keys—one publicly known, and one privately held. There are key management advantages to using asymmetric encryption, although the work factor, and therefore the strength of the system, is felt to be weaker than for symmetric systems with equivalent key length.
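The two-key idea can be sketched with textbook RSA on toy numbers. This is purely illustrative and never usable in practice; real systems use vetted libraries and keys thousands of bits long.

```python
# Textbook RSA with tiny primes, only to show the asymmetry of the keys.
p, q = 61, 53
n = p * q                # modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                   # public exponent: (e, n) may be published
d = pow(e, -1, phi)      # private exponent, kept secret (Python 3.8+)

def encrypt(m):
    return pow(m, e, n)  # anyone can encrypt with the public key

def decrypt(c):
    return pow(c, d, n)  # only the private-key holder can invert

c = encrypt(65)
print(c, decrypt(c))     # 2790 65
```

The key-management advantage follows directly: the encryption key can be handed out freely, since it does not reveal the decryption key.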
atomic
indivisible, or cannot be split up. For example, an operation may be said to do several things “atomically” if all the component functions are done immediately, and there is no chance of the operation being half-completed or of another being interspersed. Particularly used in terms of database transaction processing, where all the parts of a transaction will be done before any are “committed.”
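A sketch of transactional atomicity using Python’s built-in sqlite3 (the table and figures are invented): the failed transfer rolls back completely, so no half-completed state survives.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
db.executemany("INSERT INTO accounts VALUES (?, ?)", [("a", 100), ("b", 0)])
db.commit()

try:
    with db:  # transaction: commits on success, rolls back on exception
        db.execute("UPDATE accounts SET balance = balance - 150 WHERE name = 'a'")
        if db.execute("SELECT balance FROM accounts WHERE name = 'a'").fetchone()[0] < 0:
            raise ValueError("overdraft")  # abort: the debit above is undone too
        db.execute("UPDATE accounts SET balance = balance + 150 WHERE name = 'b'")
except ValueError:
    pass

print(dict(db.execute("SELECT * FROM accounts")))  # {'a': 100, 'b': 0}
```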
attack
the act of trying to bypass security controls on a system. An attack may be active, resulting in the alteration of data; or passive, resulting in the release of data. Note: The fact that an attack is made does not necessarily mean it will succeed. The degree of success depends on the vulnerability of the system or activity and the effectiveness of existing countermeasures. Attack is often used as a synonym for a specific exploit. See also brute force, denial of service, distributed denial of service, hijacking, social engineering, sniffing, spoofing, trojan horse, and virus.
activities on, or alterations to, a system indicating an attack or attempted attack, and particularly a specific type of attack, often determined by examination of audit or network logs
attribute
in MS-DOS and Windows systems, the characteristics representing file permissions
audit
the collection and analysis of records of activities to assess their compliance with security policy
audit trail
chronological record of system activities that is sufficient to enable the reconstruction, reviewing, and examination of the sequence of environments and activities surrounding or leading to an operation, a procedure, or an event in a transaction from its inception to final results. Sometimes specifically referred to as a security audit trail.
authenticate
(1) to verify the identity of a user, device, or other entity in a computer system, often as a prerequisite to allowing access to resources in a system (2) to verify the integrity of data that have been stored, transmitted, or otherwise exposed to possible unauthorized modification
authentication
(1) the process of verifying identity, origin, or lack of modification of a subject or object. Authentication of a user is generally based on something the user knows, is, or has. (2) the use of some kind of system to ensure that a file or message that purports to come from a given individual or company actually does. Many authentication systems are now looking toward public key encryption, and the calculation of a check based on the contents of the file or message, and a password or key. Related concepts are change detection and integrity.
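One common construction of such a check is an HMAC, which uses a shared secret key rather than a public key pair; the key and message below are illustrative. The check value depends on both the contents and the key, so a forger without the key cannot produce a valid tag.

```python
import hashlib
import hmac

key = b"shared-secret"   # illustrative key
message = b"pay bob 100"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key, message, tag):
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)  # constant-time comparison

print(verify(key, message, tag))         # True: contents and key both match
print(verify(key, b"pay eve 100", tag))  # False: an altered message fails
```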
Authentication Header (AH)
Internet IPsec protocol (RFC 2402) designed to provide connectionless data integrity service and data origin authentication service for IP datagrams, and (optionally) to provide protection against replay attacks. AH may be used alone, in combination with the IPsec Encapsulating Security Payload (ESP) protocol, or in a nested fashion with tunnelling. ESP can provide the same security services as AH, and can also provide data confidentiality service. The main difference between authentication services provided by ESP and AH is the extent of the coverage; ESP does not protect IP header fields unless they are encapsulated by AH.
authentication token
portable device used for authenticating a user. Authentication tokens operate by challenge/response, time-based code sequences, or other techniques. These may include paper-based lists of one-time passwords.
authenticator
means used to confirm the identity or to verify the eligibility of a station, originator, or individual. The standard authenticators are something you have, something you are, or something you know. A common mistake is to use, as an authenticator, an entity or characteristic that is more suitable for use as an identifier. As an example, in the United States a Social Security number (SSN) is unique to an individual, and therefore suitable for identification, but the wide use and availability of this information (despite regulations to the contrary) mean that the SSN is not appropriate as authentication. (The reverse error is also made: face recognition is effective as an authenticator under the right conditions, but has spectacularly failed when used, in large-scale settings, for identification.) Sometimes referred to as authentication information.
authenticity
property of being genuine and capable of being verified and trusted. It is important, in security, not to assume too much about authenticity. For example, authentication of identity does not prove anything about the motives, competency, or activities of the individual so identified. Checksumming of a program verifies that it has not changed, but does not prove that it was not originally intended to be malicious. See also authenticate, authentication, validate, and verify.
Authenticode
Microsoft’s security system for ActiveX controls as active Web content, and other program verification. A digital signature system, Authenticode verifies only that the code has not changed since it was signed, and that the certificate used to sign the code was originally issued by the certificate authority. Authenticode does not provide for any sandbox restrictions, and, at the time of writing, most systems and applications using Authenticode do not have any certificate revocation capabilities.
authorization
the granting of access or other rights to a user, program, or process
abbreviation used to distinguish the antiviral research community (AV) from those who call themselves “virus researchers” but are primarily interested in writing and exchanging viral programs (vx). Also an abbreviation for antivirus software. See also vx.
availability
the state when the system, resources, and data are in the place needed by the user, at the time the user needs them, and in the form needed by the user. Availability is one of the three pillars of security.
B1FF
fictional character, possibly created by Joe Talmadge, originally used or referred to in Usenet news postings, portraying the stereotypical “newbie” (novice or newcomer) on the Net, generally a blackhat wannabe. Postings from B1FF typically use capital letters, deliberately misspell words, and replace letters with non-alphabetic characters (dudes=D00Dz). Because such people tend to claim “l33t” (elite) status, l33t has come to replace B1FF in more recent writings, although there are many variations in spelling for both terms. B1FF is probably the preferred one, since it looks like hexadecimal representation. No relation to the biff mail notification utility.
backdoor
hidden software or hardware mechanism that can be triggered to permit system protection mechanisms to be circumvented. The function will generally provide unusually high, or even full, access to the system either without an account or from a normally restricted account. It is activated in some innocent-appearing manner; for example, a key sequence at a terminal. Invocation of the backdoor can also be done by sending a specific packet to a network port; see RAT. Software developers often introduce backdoors in their code to enable them to reenter the system and perform certain functions; see maintenance hook. The backdoor is sometimes left in a fully developed system either by design or by accident. Synonymous with trap door, which was formerly the preferred usage. Usage back door is also very common.
background task
task executed by the system that generally remains invisible to the user. Most processes in advanced or multi-user systems operate in the background. Some malware is designed to be executed as a background task so the user does not realize unwanted actions are occurring. Many attacks often take advantage of loopholes in utility processes operating in the background.
backup
(n) a duplicate copy of data made for archiving purposes or for protecting against damage or loss (v) the process of creating duplicate data. Some programs back up data files while maintaining both the current version and the preceding version on disk. However, a backup is not considered secure unless it is stored away from the original, and so removable media is preferred.
procedure for maintaining backups of system and user data. See contingency plan, differential backup, full backup, and incremental backup.
bacterium
specialized form of virus that does not attach to a specific file, possibly also related to spread by electronic mail. Usage obscure: term should not be used.
acronym for Broken As Designed, said of a program that is useless because of bad design rather than bugs
bait
usually used in reference to a file, this refers to a virus infection target of initially known characteristics. To trap file infectors that insist on larger files, a string of null characters of arbitrary length is often used.
Floppy disks are, of course, used as bait for boot sector viruses, but the term is not often used in that way. Another name for bait files is goat or sacrificial goat files.
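Creating such a bait file is trivial: a run of null bytes of known size, so that any later change to it signals tampering. The path and size below are arbitrary.

```python
import os
import tempfile

def make_goat(path, size=4096):
    """Write a goat/bait file consisting entirely of null bytes."""
    with open(path, "wb") as f:
        f.write(b"\x00" * size)

path = os.path.join(tempfile.mkdtemp(), "goat.com")
make_goat(path)
print(os.path.getsize(path))  # 4096
```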
banner
the initial message given by a system to prompt a login or identify a connection. Originally called a “welcome message,” since most banners said something like “Welcome to XYZ Corp Computer System,” the term banner is now preferred because some system intruders used the “welcome” implication to avoid prosecution.
baseline
situation of a system either in normal operation, or at a particular point in time. Generally, this is measured by an image or calculation taken on a system at a given moment.
bastion host
system that has been hardened to resist attack and is installed on a network in such a way that it is expected to come under attack. Bastion hosts are often components of firewalls, or may be Web servers or public access systems connected to an untrusted or public network. A honeypot is often a bastion host with additional audit and alerting functions.
battle box
in business continuity planning, a collection of information and other resources necessary for the operation of a plan packed in a single container so as to be readily available and transportable as needed. See also go bag.
see activity monitor
see operation restrictor
Bell-La Padula model
formal state transition model of computer security policy that describes a set of access control rules. In this formal model, the entities in a computer system are divided into abstract sets of subjects and objects. The notion of a secure state is defined, and it is proven that each state transition preserves security by moving from secure state to secure state, thereby inductively proving that the system is secure. A system state is defined to be “secure” if the only permitted access modes of subjects to objects are in accordance with a specific security policy. In order to determine whether a specific access mode is allowed, the clearance of a subject is compared to the classification of the object, and a determination is made as to whether the subject is authorized for the specific access mode.
More specifically, Bell-La Padula is concerned with confidentiality. Subjects in the model are forbidden from obtaining (reading) information from an object of higher classification, and from divulging (writing) information to an object of lower classification. See star property (*-property) and simple security property.
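The two rules can be sketched over a simple linear ordering of levels; the labels are illustrative, and a real implementation would also compare the non-hierarchical categories in each sensitivity label.

```python
# Illustrative hierarchy of sensitivity levels.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def may_read(subject_level, object_level):
    """Simple security property: no read up."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def may_write(subject_level, object_level):
    """Star property (*-property): no write down."""
    return LEVELS[subject_level] <= LEVELS[object_level]

print(may_read("secret", "confidential"))   # True
print(may_read("confidential", "secret"))   # False (no read up)
print(may_write("secret", "confidential"))  # False (no write down)
```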
benign
somewhat careless term often used to describe a virus that appears not to be intentionally malicious in that it does not carry an obviously damaging payload code section. Since viral programs may cause problems simply by the use of system resources or the modification of files, many are of the opinion that a benign or good virus is impossible.
benign environment
non-hostile environment that may be protected from external hostile elements by physical, personnel, and procedural security countermeasures
best practice
the gold standard for security buzzphrases. In fact, there was an extended discussion on the use of the phrase “best practice” on the CISSPforum in July 2005. The implication of best practice is that it is an optimum procedure for most situations, although it may also imply a practice that works in every situation, or a minimum standard. It was, however, noted that “best practice” is never a guarantee or panacea. Other phrases discussed were standard practice (what most people do), essential practice (what should be done as an absolute minimum), and leading practice (what the “best” companies do).
between-the-lines entry
unauthorized access obtained by tapping the temporarily inactive terminal of a legitimate user. See hijacking, and piggyback. The general condition is referred to as TOC/TOU (Time Of Check versus Time Of Use, pronounced like “talk to”), meaning a discrepancy between the time of a check being made, and the time the resource is used.
beyond A1
level of trust defined by the Department of Defense Trusted Computer System Evaluation Criteria (TCSEC) that is beyond the technology available at the time the criteria were developed. It includes all the A1-level features plus additional ones not required at the A1 level.
biometric
pertaining to the measurement of the human body: in security terms, relating to means of authentication based on patterns unique to an individual’s body, such as fingerprints and retinal patterns, or behaviour, such as voiceprints or handwriting
Basic Input/Output System, the “hardwired” firmware programming used to start the boot process in ISA/Wintel computers. The BIOS is located in the ROM area of the system and is usually stored permanently. There are many BIOS versions in ISA/Wintel computers, but they generally assume the operating system will be interrupt driven (as MS-DOS is), and start to set up structures to support that model. Since boot sector infectors run before the operating system starts, and require only the BIOS programming, they are sometimes called BIOS viruses, although the term can create confusion and should be avoided. Some computers now use EEPROM (Electrically Erasable Programmable Read Only Memory), and at least one virus now tries to erase such “flash” BIOS programming. Otherwise, BIOS cannot be infected or corrupted by a virus.
bit error rate
the number of erroneous bits divided by the total number of bits transmitted, received, or processed over some period of time. Also bit error ratio. Abbreviated BER.
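A sketch of the calculation, counting differing bits between what was sent and what was received:

```python
def bit_error_rate(sent, received):
    """Erroneous bits divided by total bits compared (equal-length inputs)."""
    if len(sent) != len(received):
        raise ValueError("streams must be the same length")
    errors = sum(bin(a ^ b).count("1") for a, b in zip(sent, received))
    return errors / (len(sent) * 8)

print(bit_error_rate(b"\xff\x00", b"\xfe\x00"))  # 0.0625: 1 errored bit in 16
```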
blackhat
communities or individuals who attempt to break into computer systems without prior authorization, or explore security primarily from an attack perspective. The term originates from old American western genre movies where the “good guys” always wore white hats and the “bad guys” always wore black. See also whitehat.
block cipher
crypto-algorithm that encrypts data in discrete blocks of a given size, rather than as a continuous stream of bits. Compare with stream cipher.
to start (cold boot) or reset (warm boot) the computer. The term arises from the phrase “bootstrap program,” and the idea of lifting oneself by one’s own bootstraps, or starting with no support.
the program recorded in the first physical or logical sector mounted on the disk drive, and containing software to get the computer to a working state. The term is most commonly used in connection with ISA or Wintel computers, where there are actually two boot records: the master boot record (dealing with disk and hardware structure), and the system boot record (containing pointers to operating system files). See also boot sector.
generically, the first sector, or sectors, on any disk, usually containing programming necessary for the boot process. In ISA or Wintel computers, the term is not well defined, although it is generally accepted to be the system boot record, and thus the first physical sector on floppy diskettes and the first logical sector on hard disks. For precision in dealing with security issues and concerns, it is best to refer specifically to the master boot record or system boot record.
boot sector infector (BSI)
virus that places its starting code in the boot sector, thus being run before any other program, including the operating system. A BSI is able to take control of interrupts and machine functions, may be able to subvert some protection and detection measures, and is also considered a virus of the base computer hardware, rather than of the operating system. In ISA computers, when MS-DOS was the dominant operating system and before widespread use of public networks for data transfer, BSIs were the most successful form of virus, and were considered BIOS viruses. Some BIOS boot sector infectors occupied the master boot record, while others inhabited the system boot record: in most cases, the displaced record was moved to an unused sector of the disk so control could be passed to it once the virus had run, and thus the computer would appear to have a normal boot process. Also known as boot sector virus, BSV.
boot sector virus
see boot sector infector
notification message returned to the sender by a site unable to relay email to the intended recipient. Reasons might include a nonexistent or misspelled username, or a relay site that is down.
testing of computer program results for access to storage outside authorized or proper limits
almost certainly the first virus written in the MS-DOS computing environment that became widespread among normal computer users. An example of a “strict” boot sector infector and the earliest known use of stealth virus programming. Sometimes referred to as “Brain (C)” or “(C) Brain” due to the presence of the string “(C) 1986 Brain” in the body of the virus. (Many books and articles use the copyright symbol instead of the “(C)” string, but the copyright symbol does not appear in the body of the virus.)
the successful and repeatable defeat of security controls with or without an arrest of the system itself, which could result in penetration of the system
to stop a program temporarily, so it may be debugged. The place where it stops is a “breakpoint.”
British Standard 7799 (BS7799)
standard code of practice and guidance on how to secure an information system, and the management framework, objectives, and control requirements for information security management systems, in three, increasingly detailed, parts. Versions were subsequently accepted as ISO Standards 17799, 27001, and 27002.
“Adding manpower to a late software project makes it later,” a result of the fact that the advantage from splitting work among N programmers is expected to be proportional to N, but the overhead costs associated with coordinating their work is proportional to the square of N. The quote is from Fred Brooks, a manager of IBM’s OS/360 project and author of The Mythical Man-Month, an excellent early book on software engineering.
the act of searching through storage to locate or acquire information without necessarily knowing the existence or the format of the information being sought
attack methodology whereby all possible options are used in turn, usually in a programmed sequence attempting to use all possible passwords or decryption keys. See also dictionary attack.
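The idea can be illustrated with a toy sketch (function and parameter names are hypothetical, and a real attack would be far faster and more sophisticated) that walks the entire keyspace of short lowercase passwords in sequence:

```python
import hashlib
import itertools
import string

def brute_force(target_hash: str, alphabet: str = string.ascii_lowercase,
                max_len: int = 4):
    """Try every candidate in turn until one hashes to the target."""
    for length in range(1, max_len + 1):
        for combo in itertools.product(alphabet, repeat=length):
            candidate = "".join(combo)
            if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
                return candidate
    return None  # keyspace exhausted without a match
```

A dictionary attack differs only in the candidate source: instead of all possible strings, it tries a list of likely ones.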
see British Standard 7799
see boot sector infector
see boot sector infector
common program error in which input is not checked for length. Excessive input may overflow the memory allotted and, if not discarded, may create a situation where the program can be forced to execute arbitrary code or switch operation control to an arbitrary location.
unintentional fault, generally in programming code or implementation, that may make a system fail or behave in unexpected ways, and, in any case, causes actions that neither the programmer nor the user planned. Common examples of bugs are buffer overflows, loopholes, or maintenance hooks left in place when a project is complete. An error of design is more correctly referred to as a flaw. Computer mythology credits Grace Murray Hopper with the invention of the term “bug,” but it was known to have been used in engineering circles in the nineteenth century, and Hopper herself referred to the “[f]irst actual case of bug being found” in a machine. The “moth in the Mark II,” and its subsequent use as an excuse to Howard Aiken when he asked why the machine was not “makin’ numbers,” may have been the origin of the use of “debugging” as a verb. The bug can be seen online courtesy of the Smithsonian Institution at http://americanhistory.si.edu/csr/comphist/objects/bug.htm.
often the final compilation of a software system, but also the particular program that results. Thus, a build is similar to a version of software, except that build is generally more precise. Security and vulnerability assessments of software will be based on a given build, and applying an assessment to other versions could result in incorrect conclusions. See also system model.
business continuity plan (BCP)
plan and preparations directed toward the immediate recovery of systems critical to the function of the business, or to the ability of the business to operate in the temporary absence of important systems. Activities related to the preparation and maintenance of such a plan are usually referred to as business continuity management. See also contingency plan and disaster recovery plan.
business impact analysis (BIA)
assessment of the effects and potential loss resulting from an incident that might affect business continuity. Similar, but not identical, to risk analysis.
procedure for identifying a remote terminal. In a call back, the host system disconnects the caller and then dials the authorized telephone number of the remote terminal to reestablish the connection. Of limited use for remote access, and recently subject to failure because of call forwarding technologies. Synonymous with dial back or ring back.
structured and generally cascading process for contacting personnel in an emergency, particularly in business continuity planning. See also fan out.
protected identifier that identifies the object and specifies the access rights to be allowed to the accessor (or subject) who possesses the capability. In a capability-based system, access to protected objects such as files is granted if the would-be subject possesses a capability for the object. This can also be used for authentication (by seeing the system as an object). Due to the protection of the capability identifier, the ability to assume false authorization is reduced. Often implemented as capability tables.
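A minimal sketch of a capability table (names hypothetical): each subject holds a mapping from objects to permitted rights, and access is granted only if the corresponding capability is present.

```python
class CapabilityTable:
    """Toy capability table: subject -> object -> set of rights."""

    def __init__(self):
        self._caps = {}

    def grant(self, subject, obj, right):
        # Give the subject a capability for the object with this right.
        self._caps.setdefault(subject, {}).setdefault(obj, set()).add(right)

    def check(self, subject, obj, right):
        # Access succeeds only if the subject possesses the capability.
        return right in self._caps.get(subject, {}).get(obj, set())
```

In a real system the table itself must be protected, since forging an entry would amount to forging authorization.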
restrictive label that has been applied to classified or unclassified data as a means of increasing the protection of the data and further restricting access to the data
type of overwriting virus that overwrites either slack space within or behind the target program file, or sections of null data within the file, such that it can infect the host file without increasing the length of the file while also preserving the host’s functionality. Usage rare.
the Computer Emergency Response Team established at the Software Engineering Institute (SEI) of Carnegie-Mellon University after the 1988 Internet worm attack. Recently, the preferred reference has been CERT/CC (Computer Emergency Response Team Coordination Center). CMU has apparently obtained exclusive use of the name CERT, and recommends that other emergency teams style themselves as CIRTs (Computer Incident Response Teams).
digitally signed statement that contains information about an entity and the entity’s public key
certificate revocation list (CRL)
document maintained and published by a certification authority listing those certificates previously issued by the CA that are no longer valid
the comprehensive evaluation of the technical and nontechnical security features of a system and other safeguards, made in support of the accreditation process, that establishes the extent to which a particular design and implementation meets a specified set of security requirements. Note that certification has no relation to an asymmetric key encryption certificate, or the related authorities and lists.
certification authority (CA)
central authority for key management in an overall system for the use of asymmetric encryption known as a public key infrastructure, or PKI. Certification authority, for some reason, is generally capitalized, and is usually referred to by the acronym CA. CA may refer to an individual office or server, but a single CA is usually part of a hierarchy, and certification authority may refer to the entire hierarchy as well.
message directing the recipient to send out multiple copies so its circulation increases in a geometric progression as long as the instructions are carried out. The chain letter is usually described as having a tripartite structure: a hook to catch interest, a threat to persuade the recipient to comply, and a request to copy and spread the message. The content of the chain letter message often mentions spreading luck, friendship, some type of urban legend, or a warning of some kind. A chain letter is a type of spam, and a virus alert hoax is a type of chain letter.
Challenge Handshake Authentication Protocol (CHAP)
peer entity authentication method for PPP (Point to Point Protocol), using a randomly generated challenge and requiring a matching response that depends on a cryptographic hash of the challenge and a secret key
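The response computation can be sketched as follows (after RFC 1994, which specifies MD5 over the one-octet identifier, the shared secret, and the challenge; treat this as an illustration, not an interoperable implementation):

```python
import hashlib
import os

def chap_response(identifier: int, secret: bytes, challenge: bytes) -> bytes:
    """MD5 over identifier || secret || challenge (per RFC 1994)."""
    return hashlib.md5(bytes([identifier]) + secret + challenge).digest()

# Authenticator side: issue a random challenge, then compare the peer's
# response with a locally computed one. The secret never crosses the link.
challenge = os.urandom(16)
expected = chap_response(1, b"shared-secret", challenge)
```

Because each challenge is fresh, a captured response cannot simply be replayed later.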
security procedure in which one communicator requests authentication of another communicator, and the latter replies with a response based on data provided by the first. The concepts of challenge/response, initialization vector, nonce, salt, and seed are closely related. Challenge/response is generally used in regard to password and authentication schemes, initialization vector to block ciphers, nonce to short, automated network messages, salt to password storage, and seed to pseudorandom number generation.
management tool to provide control and traceability for all changes made to the system. Often, reference is made to change management.
antiviral software that looks for changes in the computer system. A virus must change something, and it is assumed that program files, disk system areas, and certain areas of memory should not change. This software is often referred to as integrity checking software, but it does not necessarily protect the integrity of data, nor does it always assess the reasons for a possibly valid change. Change detection using strong encryption is sometimes also known as authentication software.
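The core of such software can be sketched as a baseline of cryptographic hashes compared against the current state (function names are hypothetical):

```python
import hashlib

def snapshot(paths):
    """Record a SHA-256 hash of each file's contents."""
    result = {}
    for p in paths:
        with open(p, "rb") as f:
            result[p] = hashlib.sha256(f.read()).hexdigest()
    return result

def changed(baseline, paths):
    """Return the files whose current hash differs from the baseline."""
    current = snapshot(paths)
    return [p for p in paths if current[p] != baseline.get(p)]
```

As the entry notes, a differing hash says only that the file changed, not whether the change was valid.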
formal process for requesting, specifying, approving, developing (or acquiring), testing, and accepting changes to a software system. Generally, change management involves new programming, or a new purchase of software. Depending on the criticality of the system, similarly formal processes may be applied to configuration changes, or the application of patches (which may involve protection against security vulnerabilities). (Note that managers in non-technical environments may define change management as any process to reduce disruption from any change to operations.)
see Challenge Handshake Authentication Protocol
known situation or state of the system or process, from which processing may continue or be restarted in the case of a failure or other problem
calculation based on the content of data, which, if performed at one time and then compared against the same calculation at a later time, can be used to determine if the content of the data has changed. In its strictest form, a checksum is a calculation based on adding or summing up all the bytes or 1-bits in a file or message. Parity bits in asynchronous transmission are a type of checksum. The term is sometimes carelessly used to refer to all methods of change detection or authentication that rely on some level of calculation based on file contents, such as cyclic redundancy checking (CRC).
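Both forms mentioned above, the strict byte sum and the parity bit, can be sketched in a few lines:

```python
def byte_sum_checksum(data: bytes) -> int:
    """Checksum in its strictest sense: the sum of all bytes, modulo 256."""
    return sum(data) % 256

def even_parity_bit(byte: int) -> int:
    """Parity bit chosen so the total count of 1-bits becomes even."""
    return bin(byte & 0xFF).count("1") % 2
```

Flipping any single bit changes the parity, but two compensating errors go undetected, which is the weakness that stronger methods such as cyclic redundancy checks address.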
cryptanalysis technique in which the analyst tries to determine the key from knowledge of plaintext that corresponds to ciphertext selected or dictated by the analyst
cryptanalysis technique in which the analyst tries to determine the key from knowledge of ciphertext that corresponds to plaintext selected or dictated by the analyst
specific example of a viral type of email message, the earliest known script email virus, using the REXX scripting language. This message was released in December 1987. The user was asked to type “CHRISTMA” to generate an electronic Christmas card, but was not told that the program also made, and mailed, copies of itself during the display. (Within the virus research community, the form “CHRISTMA EXEC” is used almost universally. The more correct form is “CHRISTMA exec,” since REXX scripts were referred to as “execs” to distinguish them from the earlier EXEC language in IBM mainframes.)
see Computer Incident Advisory Capability
cryptographic algorithm for encryption and decryption
cipher block chaining (CBC)
block cipher mode that enhances electronic codebook mode by chaining together blocks of ciphertext it produces. This mode operates by combining (exclusive OR-ing) the algorithm’s ciphertext output block with the next plaintext block to form the next input block for the algorithm.
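The chaining step can be sketched as follows; the XOR-with-key “cipher” below is merely a stand-in for a real block algorithm and is not in any way secure:

```python
BLOCK = 8

def toy_block_encrypt(block: bytes, key: bytes) -> bytes:
    # Stand-in for a real block cipher: XOR with the key (NOT secure).
    return bytes(b ^ k for b, k in zip(block, key))

def cbc_encrypt(plaintext: bytes, key: bytes, iv: bytes) -> bytes:
    """XOR each plaintext block with the previous ciphertext block
    before encryption, starting from the initialization vector."""
    assert len(plaintext) % BLOCK == 0
    prev, out = iv, b""
    for i in range(0, len(plaintext), BLOCK):
        mixed = bytes(p ^ c for p, c in zip(plaintext[i:i + BLOCK], prev))
        prev = toy_block_encrypt(mixed, key)   # ciphertext feeds the next block
        out += prev
    return out
```

Unlike electronic codebook mode, identical plaintext blocks produce different ciphertext blocks.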
cipher feedback (CFB)
block cipher mode that enhances electronic codebook mode by chaining together the blocks of ciphertext it produces and operating on plaintext segments of variable length less than or equal to the block length. This mode operates by using the previously generated ciphertext segment as the algorithm’s input (i.e., by “feeding back” the ciphertext) to generate an output block, and then combining (exclusive OR-ing) that output block with the next plaintext segment (block length or less) to form the next ciphertext segment.
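The segment-wise feedback described above can be sketched as follows, again using an insecure XOR stand-in for the block algorithm; because both directions use the same keystream, one function serves for encryption and decryption:

```python
BLOCK = 8

def toy_block_encrypt(block: bytes, key: bytes) -> bytes:
    # Stand-in for a real block cipher: XOR with the key (NOT secure).
    return bytes(b ^ k for b, k in zip(block, key))

def cfb_process(data: bytes, key: bytes, iv: bytes, seg: int = 4,
                decrypt: bool = False) -> bytes:
    """Encrypt or decrypt in segments of `seg` bytes (seg <= BLOCK)."""
    register, out = iv, b""
    for i in range(0, len(data), seg):
        keystream = toy_block_encrypt(register, key)
        chunk = data[i:i + seg]
        result = bytes(c ^ s for c, s in zip(chunk, keystream))
        out += result
        # The ciphertext segment is fed back into the shift register.
        feedback = chunk if decrypt else result
        register = (register + feedback)[-BLOCK:]
    return out
```

Because only the ciphertext is fed back, a transmission error affects the current segment and the register contents, then the mode resynchronizes.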
apparently random string of data, conveying little or no information to an unauthorized entity, but from which an original message or plaintext can be extracted by means of an appropriate key and algorithm
cryptanalysis technique in which the analyst tries to determine the key solely from knowledge of intercepted ciphertext (although the analyst may also know other clues, such as the cryptographic algorithm, the language in which the plaintext was written, the subject matter of the plaintext, and some probable plaintext words)
(1) a grouping of classified information to which a hierarchical, restrictive security label is applied to increase protection of the data (2) the level of protection required to be applied to that information. See also security level.
refers to information that is formally required by a security policy to be given data confidentiality service and to be marked with a security label to indicate its protected status. The term is mainly used in government, especially in the military, and particularly in the United States Department of Defense. See also unclassified.
unencrypted data, also known as plaintext
system entity that requests and uses a service or resource provided by another system entity (the server)
model of network operation where services and resources are requested by the client and fulfilled by the server. The significance to security is that security policy should be (but is not always) enforced by the server. In peer-to-peer models of networking, a more complex security model must generally be implemented.
closed security environment
environment in which both of the following conditions hold true: (1) application developers (including maintainers) have sufficient clearances and authorizations to provide an acceptable presumption that they have not introduced malicious logic. (2) configuration control provides sufficient assurance that applications and the equipment are protected against the introduction of malicious logic prior to and during the operation of system applications.
virus that makes a change to disk or directory structure data such that when a valid program is invoked, the virus is run first. Because the data to be changed is very small, it can be altered very rapidly, affecting large numbers of files in a short space of time, and therefore these viruses were sometimes called fast infectors. Also known as FAT virus (after the MS-DOS File Allocation Table directory structure), sector virus, and system virus.
stands for complementary metal oxide semiconductor. This technology is used in a form of memory that can be held in the computer, while the main power is off, with low-power battery backup. CMOS memory is used in MS-DOS/BIOS/ISA computers to hold small tables of information regarding the basic hardware of the system. Since the memory is maintained while the power is off, there is a myth that viruses can hide in the CMOS. (CMOS memory is too small, and the contents are never executed as a program.) Also, when the battery power fails, the computer is temporarily unusable. This is often attributed, falsely, to viral activity.
Control Objectives for Information and related Technology, the set of IT control objectives published by the Information Systems Audit and Control Association (ISACA), the body that certifies IT security auditors
(1) in computer terminology, refers to either human (source) or machine (object) readable programming or fragments thereof. Since viruses, before they attach to a host program, are not complete programs, they are often referred to as code to distinguish them from programs that are complete in themselves. (2) a system of symbols used to represent information, which might originally have had some other meaning. This is often seen as synonymous with cipher or encryption, but codes usually have fixed meaning relations, rather than algorithmic transformations of data. In some cases, specialized types of codes may be applied to or embedded with data, supplying redundant information for purposes such as error detection and correction. See forward error correction and Hamming code.
the first variant of a family that possibly included the almost equally well-known Nimda. Code Red infected Internet servers running the Microsoft IIS (Internet Information Server) software, and used a known bug in that program to infect new machines. Probably due to the popularity of the IIS server on low maintenance sites, Code Red infected approximately 350,000 machines within 9 to 13 hours. Despite this success, Code Red was never as dangerous as it was made out to be, and was definitely a media virus.
in business continuity planning or disaster recovery planning, an alternate site with necessary electrical and service connections, but no running system, maintained by an organization to facilitate prompt resumption of service after a disaster. Some organizations now refer to “gradual recovery.” See also warm site, hot site.
central point for coordination of activities for emergency operations or business continuity
programs that are sold either directly from the manufacturer or through normal retail channels, as opposed to shareware. Users are often told to “buy only commercial” as a defence against virus infections or other types of malware. In fact, there is very little risk of obtaining viruses from shareware, and there are many known instances of viral programs infecting commercial software. In terms of other forms of malware, it is often proposed that the number of serious bugs in any new commercial software may rival the number of trojan programs released in any given period of time. See also freeware, public domain, open source, and shareware.
an attempt to harmonize the various national security standards and security philosophies. See Common Criteria for Information Technology Security.
Common Criteria for Information Technology Security
the Common Criteria is a standard for evaluating information technology products and systems, such as operating systems, computer networks, distributed systems, and applications. It states requirements for security functions and for assurance measures. Canada, France, Germany, the Netherlands, the United Kingdom, and the United States (NIST and NSA) began developing this standard in 1993, based on the European ITSEC, the Canadian Trusted Computer Product Evaluation Criteria (CTCPEC), and the United States “Federal Criteria for Information Technology Security” (FC) and its precursor, the TCSEC. Version 2.1 of the Criteria is equivalent to ISO’s International Standard 15408 (IS 15408).
communications security (COMSEC)
measures taken to deny unauthorized persons information derived from telecommunications of the United States Government concerning national security, and to ensure the authenticity of such telecommunications. Communications security includes cryptosecurity, transmission security, emission security, and physical security of communications security material and information.
type of virus that does not actually attach to another program, but interposes itself into the chain of command, so that the virus is executed before the infected program. Most often, this is done by using a similar name and the rules of program precedence to associate itself with a regular program. Also referred to as a spawning virus.
class of information that has need-to-know access controls beyond those normally provided for access to multilevel security. Related to multilateral security and the lattice model.
compartmented security mode
see modes of operation
program file that has been compressed to save disk space, and automatically returns to executable form when invoked. Because compression appears to be a form of encryption, programs that are infected with a virus before being compressed, or those that contain other forms of malware, may hide the infection from scanning software. See also archive, and self-extracting.
(v) to perform an action not in accordance with the security policy, or to cause a system to do so (n) a violation of the security policy of a system such that unauthorized disclosure of sensitive information may have occurred
unintentional data-related or intelligence-bearing signals that, if intercepted and analyzed, disclose the information transmitted, received, handled, or otherwise processed by any information processing equipment. See TEMPEST.
the misuse, alteration, disruption, or destruction of data processing resources. The key aspect is that it is intentional and improper.
the use of a crypto-algorithm in a computer, microprocessor, or microcomputer to perform encryption or decryption to protect information or to authenticate users, sources, or information. U.S. government or military term.
originally the full means of obtaining legal evidence from computers and computer use, computer forensics is now limited to recovery of data from computers and computer media. Computer forensics has therefore become only one part of digital forensics.
computer-related crimes involving deliberate misrepresentation, alteration, or disclosure of data to obtain something of value (usually for monetary gain). A computer system must have been involved in the perpetration or cover-up of the act or series of acts. A computer system might have been involved through improper manipulation of input data; output or results; applications programs; data files; computer operations; communications; or computer hardware, systems software, or firmware.
Computer Incident Advisory Capability (CIAC)
computer emergency response team within the U.S. Department of Energy, this group is widely known for a series of highly regarded messages and postings about security vulnerabilities
computer security audit
independent evaluation of the controls employed to ensure appropriate protection of an organization’s information assets. A formal security audit has goals and procedures somewhat different from the normal and ongoing audit process.
computer security subsystem
device designed to provide limited computer security features in a larger system environment
Computer Security Technical Vulnerability Reporting Program (CSTVRP)
program that focuses on technical vulnerabilities in commercially available hardware, firmware, and software products acquired by the U.S. Department of Defense. CSTVRP provides for the reporting, cataloguing, and discrete dissemination of technical vulnerability and corrective measure information to Department of Defense components on a need-to-know basis.
computer viral program
Rob Slade’s own invention. In an attempt to avoid the fights over what constitutes a “true” virus, he uses the term “viral” to refer to self-reproducing programs regardless of other distinctions. So far, he’s gotten away with it.
method of achieving confidentiality in which sensitive information is hidden by embedding it in irrelevant data. See also steganography.
probably the first Microsoft Word macro virus, and certainly the first macro virus to be successful in the wild
the concept of holding sensitive data in confidence, limited to an appropriate set of individuals or organizations. Confidentiality is considered one of the three pillars of security.
the process of controlling modifications to the system’s hardware, firmware, software, and documentation that provides sufficient assurance that the system is protected against the introduction of improper modifications prior to, during, and after system implementation. Compare configuration management.
the management of security related features and assurances through control of changes made to a system’s configuration. Historically, this involved any changes to a system (see configuration control), but more recently it has been seen to involve configuration that is short of changes to the software (see change management).
the prevention of the leaking of sensitive data from a program
synonymous with covert channel. Archaic usage.
see star property (*-property)
(1) the intermixing of data at different sensitivity and need-to-know levels. The lower level data is said to be contaminated by the higher level data; thus, the contaminating (higher level) data may not receive the required level of protection. (2) similarly for data of varying integrity or corruption
plan for emergency response, backup operations, and post-disaster recovery that will ensure the availability of critical resources and facilitate the continuity of operations in an emergency situation. May be synonymous with disaster plan and emergency plan, but most commonly held to be specifically related to information systems. See also disaster recovery plan and business continuity plan.
the space, expressed in feet of radius, surrounding equipment processing sensitive information, that is under sufficient (primarily) physical and (possibly) technical control to preclude an unauthorized entry or compromise
see access control
the condition that exists when access control is applied to all users and components of a system
an important, but strangely ill-defined, area of security, very similar to safeguards and countermeasures, used to prevent failures of integrity, availability, and confidentiality. Controls are grouped and discussed in a number of not quite orthogonal ways. One way of dividing controls (sometimes referring to categories) examines administrative (policies, procedures, etc.), physical (locks, guards, etc.), and technical (encryption, network auditing, etc.) controls. Another way of classifying (sometimes referring to types) surveys preventative/preventive (deterring and blocking an event), detective (determining and investigating an event), corrective (restoring and recovering from an event), deterrent (increasing perceived risk to an attacker), recovery (restoring lost resources), and compensation (provision of redundancy or other means to counteract loss of resources) controls. Access control is considered a special case, but may also be considered preventative/preventive and technical controls. However, access controls could also be considered administrative and deterrent controls. As you can see, these divisions are not always clear.
(1) a small piece of data originally intended to maintain state between Web browser accesses to a site. (HTTP [HyperText Transfer Protocol] 1.0 did not provide for persistent connections.) Because the data is stored on the user’s computer, and because it is possible to store the data in such a way as to allow it to be world readable, careless setting of cookies, or the ubiquitous presence of an entity on many Web sites, may create a situation in which a user’s privacy is at risk. (2) the term has been used to indicate some form of authentication information or ticket, and is specifically used for a piece of data in the ISAKMP security association negotiation, but these usages are relatively rare
the rights notice (General Public License) carried by GNU EMACS and other Free Software Foundation software, granting modification, reuse, and reproduction rights to all users. The term is used in somewhat ironic contradistinction to copyright, where the creator is granted control, and the right to restrict usage of, the creation. See also General Public Virus.
the right of the author of a written or artistic work to control the use and distribution of the work. One of the basic intellectual property rights, the others of which are patent, trademark, and trade secret.
computer game in which two or more programs attempt to destroy each other inside a real or simulated computer. Originally played with real programs in the earliest timesharing computers and inspired by the operations of rogue programs in early multi-tasking machines. Because one of the successful strategies was to create a program that submitted copies of itself to various places in the address space, this game is often seen as a precursor of viruses and worms, and is often discussed in connection with the battle between malicious software and protective software developers. Core Wars (capitalized) is now a standardized game using a simulated machine language called Redcode. The earliest formal example of core wars was DARWIN, conceived and developed in 1961 by V. A. Vyssotsky, M. Douglas McIlroy, and Bob Morris at Bell Telephone Laboratories in Murray Hill, New Jersey. It ran on an IBM 7090 mainframe under the Bell Labs operating system.
the assessment of the costs of providing data protection for a system versus the cost of losing or compromising the data. Sometimes also known as cost-risk analysis.
any action, device, procedure, technique, or other measure that reduces the vulnerability of or threat to a system. See also safeguard.
communications channel that allows two cooperating processes to transfer information in a manner that violates the system’s security policy. More specifically, a means of information leaking from a system via a channel not normally considered a communications medium. Synonymous with confinement channel.
covert storage channel
covert channel that involves the direct or indirect writing of a storage location by one process, and the direct or indirect reading of the storage location by another process. Covert storage channels typically involve a resource (such as sectors on a disk) that is shared by two subjects at different security levels.
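The storage-channel idea can be sketched as a toy, deterministic simulation. The shared-sector model, thresholds, and function names below are invented for illustration; they are not from the source:

```python
# Simulated covert storage channel: a high-level process signals bits by
# filling or freeing a shared pool of disk sectors; a low-level process
# reads only the free-sector count, a value the security policy never
# intended to serve as a data path.

TOTAL_SECTORS = 100

def high_write_bit(bit):
    """High-level sender: bit 1 allocates most sectors, bit 0 releases them.
    Returns the number of free sectors the system then reports."""
    return TOTAL_SECTORS - 90 if bit else TOTAL_SECTORS

def low_read_bit(free_sectors, threshold=50):
    """Low-level receiver: infers the bit from the observable free count."""
    return 1 if free_sectors < threshold else 0

secret = [0, 1, 1, 0, 1]
leaked = [low_read_bit(high_write_bit(b)) for b in secret]
assert leaked == secret  # the bits cross the security boundary intact
```

Real storage channels are noisier than this; in practice the receiver must also contend with other processes using the same resource.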
covert timing channel
covert channel in which one process signals information to another by modulating its own use of system resources (for example, CPU time) in such a way that this manipulation affects the real response time observed by the second process.
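This modulation scheme can be sketched as a small, deterministic simulation. The latency figures and decoding threshold below are illustrative assumptions, not measurements:

```python
# Simulated covert timing channel: the sender leaks one bit per interval by
# either loading a shared resource (bit 1) or leaving it idle (bit 0); the
# receiver recovers each bit from how long its own operation appears to take.

BASE_LATENCY = 1.0    # receiver's response time when the resource is idle
LOADED_LATENCY = 3.0  # response time while the sender is hogging the resource

def sender_latency_trace(bits):
    """Model the response times the receiver would observe for each bit."""
    return [LOADED_LATENCY if b else BASE_LATENCY for b in bits]

def receiver_decode(latencies, threshold=2.0):
    """Threshold the observed response times back into bits."""
    return [1 if t > threshold else 0 for t in latencies]

secret = [1, 0, 1, 1, 0, 0, 1]
observed = sender_latency_trace(secret)
assert receiver_decode(observed) == secret
```

An actual exploit would busy-loop or sleep in real time and measure with a clock; the model above only captures the signalling logic.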
originally, “crabs” was a prank program on Macintosh and Atari computers that erased the screen display by having graphical crabs “eat” it. An obscure usage refers to malicious software that erases screen displays. (There are very few examples of this type of activity.)
(1) someone who tries to break the security of, and gain access to, someone else’s system without being invited to do so. This is, of course, an attempt to avoid the controversial usage of the term hacker. A more specific usage is that referring to software piracy aficionados (warez d00dz) who actually perform the cracking of copy protection codes, rather than simply distributing the pirated packages. See also adversary, intruder, and blackhat. (2) person who drives a pickup truck and objects to definition (1).
see cyclic redundancy check
(1) a condition of a service or other system resource such that denial of access to, or lack of availability of, that resource would jeopardize a system user’s ability to perform a primary function or would result in other serious consequences (2) each extension of an X.509 certificate (or CRL) is marked as being either critical or non-critical. If an extension is critical and a certificate user (or CRL user) does not recognize the extension type or does not implement its semantics, the user is required to treat the certificate (or CRL) as invalid.
see certificate revocation list
crossover error rate (CER)
if the false acceptance rate and false rejection rate are graphed as the sensitivity of a security system is varied, false acceptance will start at a high value and fall, whereas false rejection will start with a low value and climb. The point at which the graph of the FAR crosses that of the FRR is the crossover error rate, and is generally considered a reasonable overall measure of the accuracy of a system. (It is easy to demonstrate situations in which the CER is not the best measure or setting for a system.)
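One way to estimate the CER from measured data is to sweep the sensitivity setting and locate the point where the two curves meet. The FAR/FRR values below are made up for illustration:

```python
# Estimate the crossover error rate (CER) from FAR/FRR measurements taken
# at a sweep of sensitivity settings. FAR starts high and falls; FRR starts
# low and climbs; the CER lies where the two curves cross.

far = [0.90, 0.60, 0.35, 0.15, 0.05, 0.01]  # false acceptance rate per setting
frr = [0.01, 0.05, 0.12, 0.20, 0.45, 0.80]  # false rejection rate per setting

def crossover_error_rate(far, frr):
    # Pick the setting where the curves are closest; a finer sweep or
    # linear interpolation would locate the crossing more precisely.
    i = min(range(len(far)), key=lambda k: abs(far[k] - frr[k]))
    return (far[i] + frr[i]) / 2

print(round(crossover_error_rate(far, frr), 3))  # prints 0.175
```

As the entry notes, a system tuned to its CER is not always tuned well: an installation may deliberately prefer a lower FAR (more rejections) or a lower FRR (more acceptances) depending on which error is costlier.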
the science that deals with analysis of a cryptographic system to gain knowledge needed to break or circumvent the protection the system is designed to provide. In some cases, this would be conversion of ciphertext to plaintext, but in other cases, it might involve forging of digital signatures or certificates. The basic cryptanalytic attacks on encryption systems are ciphertext-only, known-plaintext, chosen-plaintext, and chosen-ciphertext; and these generalize to the other kinds of cryptography. See also cryptology.
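A known-plaintext attack can be illustrated against a toy repeating-key XOR cipher. The cipher, key, and messages below are invented for this sketch, and the attacker is assumed to already know the key length:

```python
# Known-plaintext attack on a toy repeating-key XOR cipher: given one
# plaintext/ciphertext pair, each key byte falls out directly, because
# key_byte = plaintext_byte XOR ciphertext_byte.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Encrypt or decrypt (XOR is its own inverse) with a repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"K3y"  # secret, unknown to the attacker
known_pt = b"ATTACK AT DAWN"
known_ct = xor_cipher(known_pt, key)

# The attacker XORs the known pair to recover the repeating key...
recovered = bytes(p ^ c for p, c in zip(known_pt, known_ct))[:3]
assert recovered == key

# ...and can now decrypt any other traffic under the same key.
intercepted = xor_cipher(b"RETREAT AT DUSK", key)
print(xor_cipher(intercepted, recovered))  # prints b'RETREAT AT DUSK'
```

Real ciphers are designed so that known plaintext yields no such shortcut; this toy cipher fails precisely because its keystream repeats and combines linearly with the data.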
formerly widely used as an abbreviation for cryptography, cryptographic, cryptology, or even encryption, this term probably should not be used because of the potential for misunderstanding.
well-defined procedure or sequence of rules or steps used to produce a key stream or ciphertext from plaintext, and vice versa. Older usage is crypto-algorithm.
one-way function applied to a file to produce a unique “fingerprint” of the file for later reference. Often part of the process of creating a digital signature.
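As a minimal sketch of such a fingerprint, using SHA-256 from Python's standard hashlib (the "file" contents here are illustrative):

```python
import hashlib

# A cryptographic digest as a file "fingerprint": store the digest once,
# recompute it later, and any change to the contents shows up as a mismatch.

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"config: allow_remote_admin = false\n"
stored = fingerprint(original)  # saved at signing / baseline time

tampered = b"config: allow_remote_admin = true\n"
print(stored == fingerprint(original))  # prints True  (unchanged)
print(stored == fingerprint(tampered))  # prints False (modification detected)
```

In a digital signature, it is this fixed-size digest, rather than the whole file, that is signed with the private key.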
the principles, means, and methods for rendering information unintelligible, and for restoring encrypted information to intelligible form. Literally, hidden writing.
slightly more general field than cryptography, cryptology encompasses both code making (cryptography) and code breaking (cryptanalysis).
the time span during which a particular key is authorized to be used in a cryptographic system, an aspect of key management. Also known as key lifetime and validity period.
the security or protection resulting from the proper use of technically sound cryptosystems.
complete and functional system for cryptography, including a sound crypto-algorithm, provisions for the required functions of the system, and proper key choice and management
becoming the preferred form of key
cyclic redundancy check (CRC)
version of change detection that performs a calculation treating the data in a file or message as one long binary number (formally, as a polynomial over a finite field). This can detect multiple or subtle changes, such as reordered bytes, that ordinary checksum calculations miss. Also used extensively in data communications for ensuring the integrity of file transfers.
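The difference from an ordinary checksum can be demonstrated with Python's standard zlib.crc32. Swapping two bytes leaves a simple additive checksum unchanged but alters the CRC (the message contents are illustrative):

```python
import zlib

# Contrast a naive additive checksum with CRC-32: a CRC is sensitive to the
# position of each byte, so reordering errors that preserve the byte sum
# still change the CRC value.

def simple_sum(data: bytes) -> int:
    """Naive checksum: sum of byte values, truncated to 16 bits."""
    return sum(data) % 65536

msg = b"transfer 100 to account 42"
swapped = b"transfer 100 to account 24"  # same bytes, two of them reordered

assert simple_sum(msg) == simple_sum(swapped)  # sum cannot tell them apart
assert zlib.crc32(msg) != zlib.crc32(swapped)  # CRC-32 detects the swap
```

Note that a CRC guards against accidental corruption only; because it is easy to forge a message with a chosen CRC, integrity against deliberate tampering requires a cryptographic hash or MAC instead.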
community of users and developers dedicated to creating systems for anonymous communications and network access. Since the cypherpunk community is generally opposed to any invasion of privacy or any form of surveillance, the law enforcement community generally perceives them in a negative light. Unfortunately, there does seem to be a relation between certain segments of the cypherpunk community and some groups engaged in software piracy and other forms of intellectual property theft.