Need to know

The term "need to know", when used by governments and other organizations (particularly those related to military or intelligence), describes the restriction of data which is considered very confidential and sensitive. Under need-to-know restrictions, even if one has all the necessary official approvals (such as a security clearance) to access certain information, one would not be given access to such information, or read into a clandestine operation, unless one has a specific need to know; that is, access to the information must be necessary for one to conduct one's official duties. This term also includes anyone that the people with the knowledge deemed necessary to share it with.

As with most security mechanisms, the aim is to make unauthorized access difficult without inconveniencing legitimate access. Need-to-know also aims to discourage "browsing" of sensitive material by limiting access to the smallest possible number of people.

Examples

The Battle of Normandy in 1944 is an example of a need-to-know restriction. Though thousands of military personnel were involved in planning the invasion, only a small number of them knew the entire scope of the operation; the rest were informed only of the data needed to complete their small part of the plan. The same is true of the Trinity test, the first detonation of a nuclear weapon, in 1945.

Problems and criticism

Like other security measures, need to know can be misused by people who wish to deny others access to information they hold in an attempt to increase their personal power, prevent unwelcome review of their work, or avoid embarrassment resulting from their actions or views.

Need to know can also be invoked to hide illegal activities. Depending on one's perspective, this may be considered either a necessary use or a detrimental abuse of such a policy.

Need to know can be detrimental to workers' efficiency. Even when applied in good faith, those holding information might not be fully aware of who actually needs it, so information that people require to perform their duties is inevitably withheld from some of them. The speed of computations with IBM mechanical calculators at Los Alamos increased dramatically after the calculators' operators were told what the numbers meant:[1]

What they had to do was work on IBM machines – punching holes, numbers that they didn't understand. Nobody told them what it was. The thing was going very slowly. I said that the first thing there has to be is that these technical guys know what we're doing. Oppenheimer went and talked to the security and got special permission so I could give a nice lecture about what we were doing, and they were all excited: "We're fighting a war! We see what it is!" They knew what the numbers meant. If the pressure came out higher, that meant there was more energy released, and so on and so on. They knew what they were doing. Complete transformation! They began to invent ways of doing it better. They improved the scheme. They worked at night. They didn't need supervising in the night; they didn't need anything. They understood everything; they invented several of the programs that we used.

In computer technology

The discretionary access control mechanisms of some operating systems can be used to enforce need to know.[2] In this case, the owner of a file determines whether another person should have access. Because need to know can be a subjective assessment, it is often applied concurrently with mandatory access control schemes, in which the lack of an official approval (such as a clearance) may absolutely prohibit a person from accessing the information. Mandatory access control schemes can also audit accesses, in order to determine whether need to know has been violated.
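A minimal sketch of how these two mechanisms can combine, assuming a toy in-memory model rather than any real operating-system API (the class names, clearance levels, and audit log are illustrative assumptions):

```python
# Toy model: discretionary control (owner-maintained ACL) layered under
# mandatory control (clearance vs. classification), with an audit trail.
# All names and levels here are illustrative, not a real OS interface.

CLEARANCE_LEVELS = {"unclassified": 0, "secret": 1, "top secret": 2}

class Document:
    def __init__(self, owner: str, classification: str):
        self.owner = owner
        self.classification = classification
        self.acl = {owner}  # discretionary: the owner decides who is listed

    def grant(self, granter: str, user: str) -> None:
        if granter != self.owner:  # only the owner may extend the ACL
            raise PermissionError("only the owner may grant access")
        self.acl.add(user)

def can_read(user: str, clearance: str, doc: Document, audit: list) -> bool:
    # Mandatory check: insufficient clearance absolutely prohibits access.
    if CLEARANCE_LEVELS[clearance] < CLEARANCE_LEVELS[doc.classification]:
        audit.append((user, "denied: insufficient clearance"))
        return False
    # Need-to-know check: clearance alone is not enough; the user must
    # also have been granted access by the owner.
    if user not in doc.acl:
        audit.append((user, "denied: no need to know"))
        return False
    audit.append((user, "granted"))
    return True

audit_log = []
plan = Document(owner="alice", classification="secret")
plan.grant("alice", "bob")
print(can_read("bob", "secret", plan, audit_log))        # True: cleared and on the ACL
print(can_read("carol", "top secret", plan, audit_log))  # False: cleared, but no need to know
```

The audit log in the sketch mirrors the auditing role described above: every denial records whether clearance or need to know was the limiting factor.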

The term is also used in graphical user interface design for computers that control complex equipment such as aircraft. In this usage, when many different pieces of data dynamically compete for finite user interface space, safety-related messages are given priority.
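As a rough illustration of that prioritization, a display manager might rank pending messages and show only the highest-priority ones when space runs out. This sketch assumes a simple numeric priority scheme and invented message text, not any real avionics interface:

```python
import heapq

# Lower number = higher priority; safety messages always outrank others.
SAFETY, WARNING, ADVISORY = 0, 1, 2

def select_messages(pending, slots):
    """Return the highest-priority messages that fit in the available slots."""
    # Tuples sort by their first element, so safety messages win the
    # competition for limited display space.
    return heapq.nsmallest(slots, pending)

pending = [
    (ADVISORY, "Fuel-saving mode active"),
    (SAFETY,   "TERRAIN - PULL UP"),
    (WARNING,  "Cabin pressure low"),
]
for _, text in select_messages(pending, slots=2):
    print(text)  # the advisory is dropped; the safety message is shown first
```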

References

  1. ^ Feynman, Richard (1997). Surely You're Joking, Mr. Feynman!. W. W. Norton & Company. p. 128. ISBN 978-0-393-31604-9.
  2. ^ "Department of Defense Trusted Computer System Evaluation Criteria". 2006-05-27. Archived from the original on 2006-05-27. Retrieved 2020-12-05.