The term observer effect (or the related observer bias) has several context-specific meanings, some of which are related.
Use in science
In science, the term observer effect refers to changes that the act of observation makes on the phenomenon being observed. For example, to "see" an electron, a photon must first interact with it, and this interaction will change the electron's path. Even less direct means of measurement can, in theory, affect the electron: merely placing it where observation is possible, without any observation actually taking place, will still (theoretically) alter its position.
In physics, a more mundane observer effect can be the result of instruments that by necessity alter the state of what they measure in some manner. For instance, in electronics, ammeters and voltmeters need to be connected to the circuit, and so by their very presence affect the current or the voltage they are measuring. Likewise, a standard mercury-in-glass thermometer must absorb some thermal energy to record a temperature, and therefore changes the temperature of the body which it is measuring.
A common lay misuse of the term refers to quantum mechanics, where, if the outcome of an event has not been observed, it exists in a state of "superposition", which is akin to being in all possible states at once. In the famous thought experiment known as Schrödinger's cat, the cat is supposedly neither alive nor dead until observed; until that time, it is both alive and dead (technically, in a superposition of the two states, each with some probability). However, most quantum physicists, in resolving Schrödinger's seeming paradox, now hold that the acts of "observation" and "measurement" must themselves be defined in quantum terms before the question makes sense. From this point of view, there is no "observer effect", only one vastly entangled quantum system. A significant minority still find that the equations point to an observer; John Archibald Wheeler, who worked on this subject as deeply as any physicist, devised a graphic in which the universe is represented by a "U" with an eye on one end, turned around and viewing itself, to illustrate his understanding.
The Heisenberg uncertainty principle is also frequently conflated with the "observer effect". The uncertainty principle actually describes a limit on how precisely the position and momentum of a particle can be measured at the same time: increasing the precision of one measurement forces a loss of precision in the other. Thus, the uncertainty principle deals with measurement, not observation. The idea that the uncertainty principle is caused by disturbance (and hence by observation) is considered invalid by some, although it was common in the early years of quantum mechanics and is often repeated in popular treatments.
There is a related issue in quantum mechanics concerning whether systems have pre-existing properties (that is, properties that exist prior to measurement) corresponding to all measurements that could possibly be made on them. The assumption that they do is often referred to as "realism" in the literature, although it has been argued that the word is being used in a more restricted sense than philosophical realism. A recent experiment in quantum physics has been cited as showing that we have to "say goodbye" to realism, although the author of the paper states only that "we would [..] have to give up certain intuitive features of realism". These experiments demonstrate a puzzling relationship between the act of measurement and the system being measured, but it is unclear whether they require a conscious observer.
Use in information technology
In information technology, the observer effect is the impact that observing a process's output can have on the process itself while it is running. For example, if a process records its progress in a log file, the extra I/O for logging can slow the process down. Furthermore, the act of viewing the file while the process is running could cause an I/O error in the process, which could, in turn, cause it to stop.
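The logging case can be made concrete with a minimal sketch (the function `work` and the file name `observer_demo.log` are hypothetical, chosen only for illustration): the same computation is run twice, once silently and once while writing a progress log, and the logged run pays for every write.

```python
import os
import tempfile
import time

def work(n, log_path=None):
    """Sum the integers below n, optionally logging progress to a file."""
    log = open(log_path, "w") if log_path else None
    total = 0
    for i in range(n):
        total += i
        if log:
            log.write(f"step {i}: total={total}\n")  # the act of observation
    if log:
        log.close()
    return total

n = 100_000
t0 = time.perf_counter()
unobserved = work(n)
t_plain = time.perf_counter() - t0

log_path = os.path.join(tempfile.gettempdir(), "observer_demo.log")
t0 = time.perf_counter()
observed = work(n, log_path)
t_logged = time.perf_counter() - t0
os.remove(log_path)

# The computed result is identical either way; only the timing differs.
print(f"plain: {t_plain:.4f}s  logged: {t_logged:.4f}s")
```

The result of the computation is unchanged by observation; what changes is the process's behaviour in time, which is exactly the effect described above.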
Another example would be observing the performance of a CPU by running both the observed and observing programs on the same CPU, which will lead to inaccurate results because the observer program itself affects the CPU performance (modern, heavily cached and pipelined CPUs are particularly affected by this kind of observation).
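One way to see this in miniature, without real profiling hardware, is Python's `sys.settrace` hook: installing a trace function is a form of observation that runs on the same interpreter as the observed code and visibly slows it. This is only a sketch (the names `busy`, `timed`, and `tracer` are made up for the example), not a real CPU-performance measurement.

```python
import sys
import time

def busy(n):
    """A CPU-bound loop whose performance we want to measure."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(fn, *args):
    t0 = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - t0

events = 0
def tracer(frame, event, arg):
    # The "observer": counts every call/line event the interpreter reports.
    global events
    events += 1
    return tracer

n = 100_000
result_plain, t_plain = timed(busy, n)

sys.settrace(tracer)    # start observing, on the same interpreter
result_traced, t_traced = timed(busy, n)
sys.settrace(None)      # stop observing

# Same answer, very different running time: the observer shares the CPU.
print(f"untraced: {t_plain:.4f}s  traced: {t_traced:.4f}s  events: {events}")
```

The traced run computes the identical result but runs far slower, because the observing code competes for the very resource being measured.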
Observing (or rather, debugging) a running program by modifying its source code (such as adding extra output or generating log files) or by running it in a debugger may sometimes cause certain bugs to diminish or change their behavior, creating extra difficulty for the person trying to isolate the bug (see Heisenbug).
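A deterministic toy heisenbug can be sketched with weak references (the names `Resource` and `make_handle` are invented for the example, and the behaviour shown relies on CPython's reference-counting garbage collection): a bug caused by an object being dropped too early disappears as soon as a debugging aid holds an extra reference to it.

```python
import gc
import weakref

class Resource:
    """Stands in for some object a program creates and (buggily) drops."""
    pass

def make_handle():
    r = Resource()
    return weakref.ref(r)   # bug: no strong reference survives the return

_debug_refs = []            # a debugging aid: objects kept for later logging

def make_handle_debugged():
    r = Resource()
    _debug_refs.append(r)   # the "observation" that keeps r alive
    return weakref.ref(r)

handle = make_handle()
gc.collect()
# The bug: by the time we use the handle, the resource is already gone
# (in CPython it is freed as soon as its reference count hits zero).
print("without debugging, resource alive?", handle() is not None)

handle2 = make_handle_debugged()
gc.collect()
# Under observation the extra reference keeps the object alive,
# so the original bug can no longer be reproduced.
print("with debugging, resource alive?", handle2() is not None)
```

The debugging aid changes the program's behaviour precisely where the bug lives, which is what makes such bugs hard to isolate.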
Use in the social sciences
In the social sciences and general usage, the effect refers to how people change their behavior when aware of being watched (see Hawthorne effect and Observer's Paradox). For instance, in the armed forces, an announced inspection is used to see how well soldiers can do when they put their minds to it, while a surprise inspection is used to see how well prepared they generally are.
In parapsychology, the observer effect refers to an experimental subject's expectations shaping the experiment's results. The phrase was coined by two friends performing an experiment in which volunteers pressed a button whenever they felt they were being watched by the experimenters.
The related social-science term observer bias is error introduced into measurement when observers overemphasize behavior they expect to find and fail to notice behavior they do not expect. This is why medical trials are normally double-blind rather than single-blind. Observer bias can also be introduced because researchers see a behavior and interpret it according to what it means to them, whereas it may mean something else to the person showing the behavior. See subject-expectancy effect and observer-expectancy effect.