Perception

From HandWiki
The Necker cube and Rubin vase can be perceived in more than one way.
Humans can make a very good guess about the underlying 3D shape category, identity, and geometry of an object given only its silhouette. Computer vision researchers have built computational models of perception that exhibit similar behavior and are capable of generating and reconstructing 3D shapes from single- or multi-view depth maps or silhouettes.[1]

Perception (from Latin perceptio 'gathering, receiving') is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information or environment.[2] All perception involves signals that go through the nervous system, which in turn result from physical or chemical stimulation of the sensory system.[3] Vision involves light striking the retina of the eye; smell is mediated by odor molecules; and hearing involves pressure waves.

Perception is not only the passive receipt of these signals, but is also shaped by the recipient's learning, memory, expectation, and attention.[4][5] Sensory input is a process that transforms this low-level information into higher-level information (e.g., extracting shapes for object recognition).[5] The process that follows connects a person's concepts and expectations (or knowledge) with restorative and selective mechanisms (such as attention) that influence perception.

Perception depends on complex functions of the nervous system, but subjectively seems mostly effortless because this processing happens outside conscious awareness.[3] Since the rise of experimental psychology in the 19th century, psychology's understanding of perception has progressed by combining a variety of techniques.[4] Psychophysics quantitatively describes the relationships between the physical qualities of the sensory input and perception.[6] Sensory neuroscience studies the neural mechanisms underlying perception. Perceptual systems can also be studied computationally, in terms of the information they process. Perceptual issues in philosophy include the extent to which sensory qualities such as sound, smell or color exist in objective reality rather than in the mind of the perceiver.[4]

Although people traditionally viewed the senses as passive receptors, the study of illusions and ambiguous images has demonstrated that the brain's perceptual systems actively and pre-consciously attempt to make sense of their input.[4] There is still active debate about the extent to which perception is an active process of hypothesis testing, analogous to science, or whether realistic sensory information is rich enough to make this process unnecessary.[4]

The perceptual systems of the brain enable individuals to see the world around them as stable, even though the sensory information is typically incomplete and rapidly varying. Human and other animal brains are structured in a modular way, with different areas processing different kinds of sensory information. Some of these modules take the form of sensory maps, mapping some aspect of the world across part of the brain's surface. These different modules are interconnected and influence each other. For instance, taste is strongly influenced by smell.[7]

Process and terminology

The process of perception begins with an object in the real world, known as the distal stimulus or distal object.[3] By means of light, sound, or another physical process, the object stimulates the body's sensory organs. These sensory organs transform the input energy into neural activity—a process called transduction.[3][8] This raw pattern of neural activity is called the proximal stimulus.[3] These neural signals are then transmitted to the brain and processed.[3] The resulting mental re-creation of the distal stimulus is the percept.

To explain the process of perception, an example could be an ordinary shoe. The shoe itself is the distal stimulus. When light from the shoe enters a person's eye and stimulates the retina, that stimulation is the proximal stimulus.[9] The image of the shoe reconstructed by the brain of the person is the percept. Another example could be a ringing telephone. The ringing of the phone is the distal stimulus. The sound stimulating a person's auditory receptors is the proximal stimulus. The brain's interpretation of this as the "ringing of a telephone" is the percept.

The different kinds of sensation (such as warmth, sound, and taste) are called sensory modalities or stimulus modalities.[8][10]

Bruner's model of the perceptual process

Psychologist Jerome Bruner developed a model of perception, in which people put "together the information contained in" a target and a situation to form "perceptions of ourselves and others based on social categories."[11][12] This model is composed of three stages:

  1. When people encounter an unfamiliar target, they are very open to the informational cues contained in the target and the situation surrounding it.
  2. The first stage does not give people enough information on which to base perceptions of the target, so they will actively seek out cues to resolve this ambiguity. Gradually, people collect some familiar cues that enable them to make a rough categorization of the target.
  3. The cue search becomes less open and more selective. People search for cues that confirm their categorization of the target, and actively ignore or distort cues that violate their initial perceptions. Their perception becomes more selective until they finally paint a consistent picture of the target.

Saks and John's three components to perception

According to Alan Saks and Gary Johns, there are three components to perception:[13]

  1. The Perceiver: a person whose awareness is focused on the stimulus, and who thus begins to perceive it. Many factors may influence the perceiver's perceptions; the three major ones are (1) motivational state, (2) emotional state, and (3) experience. All of these factors, especially the first two, greatly contribute to how the person perceives a situation. Oftentimes, the perceiver may employ what is called a "perceptual defense", whereby the person sees only what they want to see.
  2. The Target: the object of perception; something or someone who is being perceived. The amount of information gathered by the sensory organs of the perceiver affects the interpretation and understanding about the target.
  3. The Situation: the environmental factors, timing, and degree of stimulation that affect the process of perception. These factors may leave a stimulus as merely a stimulus, rather than a percept subject to the brain's interpretation.

Multistable perception

Stimuli are not necessarily translated into a percept, and rarely does a single stimulus translate into a percept. An ambiguous stimulus may sometimes be transduced into multiple percepts, experienced randomly, one at a time, in a process termed multistable perception. The same stimuli, or the absence of them, may result in different percepts depending on a subject's culture and previous experiences.

Ambiguous figures demonstrate that a single stimulus can result in more than one percept. For example, the Rubin vase can be interpreted either as a vase or as two faces. The percept can bind sensations from multiple senses into a whole. A picture of a talking person on a television screen, for example, is bound to the sound of speech from speakers to form a percept of a talking person.

Types of perception

Cerebrum lobes

Vision

Main page: Philosophy:Visual perception

In many ways, vision is the primary human sense. Light is taken in through each eye and focused in a way which sorts it on the retina according to direction of origin. A dense surface of photosensitive cells, including rods, cones, and intrinsically photosensitive retinal ganglion cells, captures information about the intensity, color, and position of incoming light. Some processing of texture and movement occurs within the neurons of the retina before the information is sent to the brain. In total, about 15 differing types of information are then forwarded to the brain proper via the optic nerve.[14]

The timing of perception of a visual event, at points along the visual circuit, has been measured. A sudden alteration of light at a spot in the environment first alters photoreceptor cells in the retina, which send a signal to the retinal bipolar cell layer, which, in turn, can activate a retinal ganglion cell. A retinal ganglion cell is a bridging neuron that connects visual retinal input to the visual processing centers within the central nervous system.[15] Light-altered neuron activation occurs within about 5–20 milliseconds in a rabbit retinal ganglion cell,[16] although in a mouse retinal ganglion cell the initial spike takes between 40 and 240 milliseconds.[17] The initial activation can be detected by an action potential spike, a sudden jump in the neuron's membrane voltage.

A perceptual visual event measured in humans was the presentation to individuals of an anomalous word. If these individuals are shown a sentence, presented as a sequence of single words on a computer screen, with a puzzling word out of place in the sequence, the perception of the puzzling word can register on an electroencephalogram (EEG). In an experiment, human readers wore an elastic cap with 64 embedded electrodes distributed over their scalp surface.[18] Within 230 milliseconds of encountering the anomalous word, the human readers generated an event-related electrical potential alteration of their EEG at the left occipital-temporal channel, over the left occipital lobe and temporal lobe.

Sound

Anatomy of the human ear. (The length of the auditory canal is exaggerated in this image).
  Brown is outer ear.
  Red is middle ear.
  Purple is inner ear.

Hearing (or audition) is the ability to perceive sound by detecting vibrations (i.e., sonic detection). Frequencies capable of being heard by humans are called audio or audible frequencies, the range of which is typically considered to be between 20 Hz and 20,000 Hz.[19] Frequencies higher than audio are referred to as ultrasonic, while frequencies below audio are referred to as infrasonic.
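
The band classification above can be sketched as a small helper function. The 20 Hz and 20,000 Hz thresholds come from the text; the function name is illustrative, and actual hearing ranges vary between individuals and with age:

```python
def classify_frequency(hz: float) -> str:
    """Classify a sound frequency relative to the typical human
    audible range (roughly 20 Hz to 20,000 Hz)."""
    if hz < 20:
        return "infrasonic"   # below the audible range
    if hz > 20_000:
        return "ultrasonic"   # above the audible range
    return "audible"

print(classify_frequency(10))      # infrasonic
print(classify_frequency(440))     # audible (concert pitch A4)
print(classify_frequency(40_000))  # ultrasonic
```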

The auditory system includes the outer ears, which collect and filter sound waves; the middle ear, which transforms the sound pressure (impedance matching); and the inner ear, which produces neural signals in response to the sound. Via the ascending auditory pathway, these signals are led to the primary auditory cortex within the temporal lobe of the human brain, from where the auditory information passes to further cortical areas for additional processing.

Sound does not usually come from a single source: in real situations, sounds from multiple sources and directions are superimposed as they arrive at the ears. Hearing involves the computationally complex task of separating out sources of interest, identifying them and often estimating their distance and direction.[20]

Touch

Gibson defined the haptic system as "the sensibility of the individual to the world adjacent to his body by use of his body."[21] Gibson and others emphasized the close link between body movement and haptic perception, where the latter is active exploration.

The concept of haptic perception is related to the concept of extended physiological proprioception according to which, when using a tool such as a stick, perceptual experience is transparently transferred to the end of the tool.

Taste

Traditionally, there have been four primary tastes: sweetness, bitterness, sourness, and saltiness. The recognition and awareness of umami, which is considered the fifth primary taste, is a relatively recent development in Western cuisine.[22][23] Other tastes can be mimicked by combining these basic tastes,[24][25] all of which contribute only partially to the sensation and flavor of food in the mouth. Other factors include smell, which is detected by the olfactory epithelium of the nose;[7] texture, which is detected through a variety of mechanoreceptors, muscle nerves, etc.;[25][26] and temperature, which is detected by thermoreceptors.[25] All basic tastes are classified as either appetitive or aversive, depending upon whether the things they sense are harmful or beneficial.[27]

Smell

Smell is the process of absorbing molecules through the olfactory organs of the nose. These molecules diffuse through a thick layer of mucus; come into contact with one of thousands of cilia projecting from sensory neurons; and are then absorbed into a receptor (one of 347 or so).[28] It is this process that underlies the human understanding of smell from a physical standpoint.

Smell is also a very interactive sense, as scientists have begun to observe that olfaction comes into contact with the other senses in unexpected ways.[29] It is also the most primal of the senses, as it is known to be the first indicator of safety or danger, and is therefore the sense that drives the most basic of human survival skills. As such, it can be a catalyst for human behavior on a subconscious and instinctive level.[30]

Social

Main page: Social perception

Social perception is the part of perception that allows people to understand the individuals and groups of their social world. Thus, it is an element of social cognition.[31]
Though the phrase "I owe you" can be heard as three distinct words, a spectrogram reveals no clear boundaries.

Speech

Main page: Philosophy:Speech perception

Speech perception is the process by which spoken language is heard, interpreted and understood. Research in this field seeks to understand how human listeners recognize the sound of speech (or phonetics) and use such information to understand spoken language.

Listeners manage to perceive words across a wide range of conditions, as the sound of a word can vary widely according to words that surround it and the tempo of the speech, as well as the physical characteristics, accent, tone, and mood of the speaker. Reverberation, signifying the persistence of sound after the sound is produced, can also have a considerable impact on perception. Experiments have shown that people automatically compensate for this effect when hearing speech.[20][32]

The process of perceiving speech begins at the level of the sound within the auditory signal and the process of audition. The initial auditory signal is compared with visual information—primarily lip movement—to extract acoustic cues and phonetic information. It is possible other sensory modalities are integrated at this stage as well.[33] This speech information can then be used for higher-level language processes, such as word recognition.

Speech perception is not necessarily uni-directional. Higher-level language processes connected with morphology, syntax, and/or semantics may also interact with basic speech perception processes to aid in the recognition of speech sounds.[34] It may be that it is not necessary (and maybe not even possible) for a listener to recognize phonemes before recognizing higher units, such as words. In an experiment, Richard M. Warren replaced one phoneme of a word with a cough-like sound. His subjects restored the missing speech sound perceptually without any difficulty. Moreover, they were not able to accurately identify which phoneme had been disturbed.[35]

Faces

Main page: Face perception

Facial perception refers to cognitive processes specialized in handling human faces (including perceiving the identity of an individual) and facial expressions (such as emotional cues).

Social touch

Affective touch is a type of sensory information that elicits an emotional reaction and is usually social in nature. Such information is actually coded differently than other sensory information. Though the intensity of affective touch is still encoded in the primary somatosensory cortex, the feeling of pleasantness associated with affective touch is activated more in the anterior cingulate cortex. Blood-oxygen-level-dependent (BOLD) contrast imaging during functional magnetic resonance imaging (fMRI) shows that signals in the anterior cingulate cortex, as well as the prefrontal cortex, are highly correlated with pleasantness scores of affective touch. Inhibitory transcranial magnetic stimulation (TMS) of the primary somatosensory cortex inhibits the perception of affective touch intensity, but not affective touch pleasantness. Therefore, the S1 is not directly involved in processing socially affective touch pleasantness, but still plays a role in discriminating touch location and intensity.[36]

Multi-modal perception

Multi-modal perception refers to concurrent stimulation in more than one sensory modality and the effect such has on the perception of events and objects in the world.[37]

Time (chronoception)

Main page: Philosophy:Time perception

Chronoception refers to how the passage of time is perceived and experienced. Although the sense of time is not associated with a specific sensory system, the work of psychologists and neuroscientists indicates that human brains do have a system governing the perception of time,[38][39] composed of a highly distributed system involving the cerebral cortex, cerebellum, and basal ganglia. One particular component of the brain, the suprachiasmatic nucleus, is responsible for the circadian rhythm (commonly known as one's "internal clock"), while other cell clusters appear to be capable of shorter-range timekeeping, known as an ultradian rhythm.

One or more dopaminergic pathways in the central nervous system appear to have a strong modulatory influence on mental chronometry, particularly interval timing.[40]

Agency

Main page: Sense of agency

Sense of agency refers to the subjective feeling of having chosen a particular action. Some conditions, such as schizophrenia, can cause a loss of this sense, which may lead a person into delusions, such as feeling like a machine or like an outside source is controlling them. An opposite extreme can also occur, where people experience everything in their environment as though they had decided that it would happen.[41]

Even in non-pathological cases, there is a measurable difference between the making of a decision and the feeling of agency. Through methods such as the Libet experiment, a gap of half a second or more can be detected from the time when there are detectable neurological signs of a decision having been made to the time when the subject actually becomes conscious of the decision.

There are also experiments in which an illusion of agency is induced in psychologically normal subjects. In 1999, psychologists Wegner and Wheatley gave subjects instructions to move a mouse around a scene and point to an image about once every thirty seconds. However, a second person—acting as a test subject but actually a confederate—had their hand on the mouse at the same time, and controlled some of the movement. Experimenters were able to arrange for subjects to perceive certain "forced stops" as if they were their own choice.[42][43]

Familiarity

Recognition memory is sometimes divided into two functions by neuroscientists: familiarity and recollection.[44] A strong sense of familiarity can occur without any recollection, for example in cases of déjà vu.

The temporal lobe (specifically the perirhinal cortex) responds differently to stimuli that feel novel compared to stimuli that feel familiar. Firing rates in the perirhinal cortex are connected with the sense of familiarity in humans and other mammals. In tests, stimulating this area at 10–15 Hz caused animals to treat even novel images as familiar, and stimulation at 30–40 Hz caused novel images to be partially treated as familiar.[45] In particular, stimulation at 30–40 Hz led to animals looking at a familiar image for longer periods, as they would for an unfamiliar one, though it did not lead to the same exploration behavior normally associated with novelty.

Recent studies on lesions in the area concluded that rats with a damaged perirhinal cortex were still more interested in exploring when novel objects were present, but seemed unable to tell novel objects from familiar ones—they examined both equally. Thus, other brain regions are involved with noticing unfamiliarity, while the perirhinal cortex is needed to associate the feeling with a specific source.[46]

Sexual stimulation

Sexual stimulation is any stimulus (including bodily contact) that leads to, enhances, and maintains sexual arousal, possibly even leading to orgasm. Distinct from the general sense of touch, sexual stimulation is strongly tied to hormonal activity and chemical triggers in the body. Although sexual arousal may arise without physical stimulation, achieving orgasm usually requires physical sexual stimulation (stimulation of the Krause-Finger corpuscles[47] found in erogenous zones of the body).

Other senses

Main page: Sense

Other senses enable perception of body balance; acceleration, including gravity; position of body parts; temperature; and pain. They can also enable perception of internal senses, such as suffocation, gag reflex, abdominal distension, fullness of rectum and urinary bladder, and sensations felt in the throat and lungs.

Reality

In the case of visual perception, some people can see the percept shift in their mind's eye.[48] Others, who are not picture thinkers, may not necessarily perceive the 'shape-shifting' as their world changes. That ambiguous images admit multiple interpretations at the perceptual level has been demonstrated experimentally.

The confusing ambiguity of perception is exploited in human technologies such as camouflage, and also in biological mimicry. For example, the wings of European peacock butterflies bear eyespots that birds respond to as though they were the eyes of a dangerous predator.

There is also evidence that the brain in some ways operates on a slight "delay" in order to allow nerve impulses from distant parts of the body to be integrated into simultaneous signals.[49]

Perception is one of the oldest fields in psychology. The oldest quantitative laws in psychology are Weber's law, which states that the smallest noticeable difference in stimulus intensity is proportional to the intensity of the reference; and Fechner's law, which quantifies the relationship between the intensity of the physical stimulus and its perceptual counterpart (e.g., testing how much darker a computer screen can get before the viewer actually notices). The study of perception gave rise to the Gestalt School of Psychology, with an emphasis on a holistic approach.
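
The two laws can be written compactly. Weber's law states ΔI = k·I, where ΔI is the just-noticeable difference, I the reference intensity, and k the Weber fraction; Fechner's law, derived from it, gives a logarithmic relation S = c·ln(I/I₀) between sensation magnitude S and stimulus intensity I above the absolute threshold I₀. A minimal numeric sketch, with illustrative (not measured) constants:

```python
import math

def jnd(intensity: float, weber_fraction: float = 0.1) -> float:
    """Weber's law: the just-noticeable difference is proportional
    to the reference intensity (the 0.1 fraction is illustrative)."""
    return weber_fraction * intensity

def sensation(intensity: float, threshold: float = 1.0, c: float = 1.0) -> float:
    """Fechner's law: perceived magnitude grows logarithmically
    with stimulus intensity above the absolute threshold."""
    return c * math.log(intensity / threshold)

# Doubling the reference intensity doubles the just-noticeable
# difference, so equal stimulus ratios (20:10 and 200:100) produce
# equal increments in sensation:
print(jnd(100.0), jnd(200.0))
print(round(sensation(20) - sensation(10), 6),
      round(sensation(200) - sensation(100), 6))  # 0.693147 0.693147
```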

Physiology

The receptive field is the specific part of the world to which a receptor organ and receptor cells respond. For instance, the part of the world an eye can see is its receptive field; the light that each rod or cone can see is its receptive field.[50] Receptive fields have so far been identified for the visual system, auditory system, and somatosensory system. Research attention is currently focused not only on external perception processes, but also on "interoception", the process of receiving, accessing, and appraising internal bodily signals. Maintaining desired physiological states is critical for an organism's well-being and survival. Interoception is an iterative process, requiring the interplay between perception of body states and awareness of these states to generate proper self-regulation. Afferent sensory signals continuously interact with higher-order cognitive representations of goals, history, and environment, shaping emotional experience and motivating regulatory behavior.[51]

Features

Constancy

Grouping (Gestalt)

Main page: Philosophy:Principles of grouping

The principles of grouping (or Gestalt laws of grouping) are a set of principles in psychology, first proposed by Gestalt psychologists, to explain how humans naturally perceive stimuli as organized patterns and objects. Gestalt psychologists argued that these principles exist because the mind has an innate disposition to perceive patterns in the stimulus based on certain rules. These principles are organized into six categories:

  1. Proximity: the principle of proximity states that, all else being equal, perception tends to group stimuli that are close together as part of the same object, and stimuli that are far apart as two separate objects.
  2. Similarity: the principle of similarity states that, all else being equal, perception tends to see stimuli that physically resemble each other as part of the same object, and stimuli that differ as parts of separate objects. This allows people to distinguish between adjacent and overlapping objects based on their visual texture and resemblance.
  3. Closure: the principle of closure refers to the mind's tendency to see complete figures or forms even if a picture is incomplete, partially hidden by other objects, or if part of the information needed to make a complete picture in our minds is missing. For example, if part of a shape's border is missing people still tend to see the shape as completely enclosed by the border and ignore the gaps.
  4. Good Continuation: the principle of good continuation makes sense of stimuli that overlap: when there is an intersection between two or more objects, people tend to perceive each as a single uninterrupted object.
  5. Common Fate: the principle of common fate groups stimuli together on the basis of their movement. When visual elements are seen moving in the same direction at the same rate, perception associates the movement as part of the same stimulus. This allows people to make out moving objects even when other details, such as color or outline, are obscured.
  6. Good Form: the principle of good form refers to the tendency to group together forms of similar shape, pattern, color, etc.[52][53][54][55]

Later research has identified additional grouping principles.[56]

Contrast effects

Main page: Contrast effect

A common finding across many different kinds of perception is that the perceived qualities of an object can be affected by the qualities of context. If one object is extreme on some dimension, then neighboring objects are perceived as further away from that extreme.

"Simultaneous contrast effect" is the term used when stimuli are presented at the same time, whereas successive contrast applies when stimuli are presented one after another.[57]

The contrast effect was noted by the 17th-century philosopher John Locke, who observed that lukewarm water can feel hot or cold depending on whether the hand touching it was previously in hot or cold water.[58] In the early 20th century, Wilhelm Wundt identified contrast as a fundamental principle of perception, and since then the effect has been confirmed in many different areas.[58] These effects shape not only visual qualities like color and brightness, but other kinds of perception, including how heavy an object feels.[59] One experiment found that thinking of the name "Hitler" led to subjects rating a person as more hostile.[60] Whether a piece of music is perceived as good or bad can depend on whether the music heard before it was pleasant or unpleasant.[61] For the effect to work, the objects being compared need to be similar to each other: a television reporter can seem smaller when interviewing a tall basketball player, but not when standing next to a tall building.[59] In the brain, brightness contrast exerts effects on both neuronal firing rates and neuronal synchrony.[62]

Theories

Perception as direct perception (Gibson)

Cognitive theories of perception assume there is a poverty of stimulus. This is the claim that sensations, by themselves, are unable to provide a unique description of the world.[63] Sensations require 'enriching', which is the role of the mental model.

The perceptual ecology approach was introduced by James J. Gibson, who rejected the assumption of a poverty of stimulus and the idea that perception is based upon sensations. Instead, Gibson investigated what information is actually presented to the perceptual systems. His theory "assumes the existence of stable, unbounded, and permanent stimulus-information in the ambient optic array. And it supposes that the visual system can explore and detect this information. The theory is information-based, not sensation-based."[64] He and the psychologists who work within this paradigm detailed how the world could be specified to a mobile, exploring organism via the lawful projection of information about the world into energy arrays.[65] "Specification" would be a 1:1 mapping of some aspect of the world into a perceptual array. Given such a mapping, no enrichment is required and perception is direct.[66]

Perception-in-action

An ecological understanding of perception known as perception-in-action derived from Gibson's early work. It argues that perception is a requisite property of animate action: without perception, action would be unguided, and without action, perception would serve no purpose. Animate actions require both perception and motion, which can be described as "two sides of the same coin, the coin is action." Gibson works from the assumption that singular entities, which he calls invariants, already exist in the real world, and that all the perception process does is home in upon them.

The constructivist view, held by such philosophers as Ernst von Glasersfeld, regards the continual adjustment of perception and action to the external input as precisely what constitutes the "entity," which is therefore far from being invariant.[67] Glasersfeld considers an invariant as a target to be homed in upon, and a pragmatic necessity to allow an initial measure of understanding to be established prior to the updating that a statement aims to achieve. The invariant does not, and need not, represent an actuality. Glasersfeld describes it as extremely unlikely that what is desired or feared by an organism will never suffer change as time goes on. This social constructionist theory thus allows for a needful evolutionary adjustment.[68]

A mathematical account of perception-in-action, the General Tau Theory, has been devised and investigated in many forms of controlled movement and described in many different species of organism. According to this theory, "tau information", or time-to-goal information, is the fundamental percept in perception.
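
In General Tau Theory, tau is commonly defined as the current gap to a goal divided by the rate at which that gap is closing, i.e. the time-to-goal if the closure rate were held constant. A minimal sketch of that definition (variable and function names are illustrative):

```python
def tau(gap: float, closure_rate: float) -> float:
    """Time-to-goal: the remaining gap divided by the rate at which
    the gap is closing (assumed positive; a constant rate gives the
    exact time to contact)."""
    if closure_rate <= 0:
        raise ValueError("gap must be closing for tau to be defined")
    return gap / closure_rate

# An object 12 m away, approached at 4 m/s: tau = 3 s to contact.
print(tau(12.0, 4.0))  # 3.0
```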

Evolutionary psychology

Many philosophers, such as Jerry Fodor, write that the purpose of perception is knowledge. However, evolutionary psychologists hold that the primary purpose of perception is to guide action.[69] They give the example of depth perception, which seems to have evolved not to aid in knowing the distances to other objects but rather to aid movement.[69] Evolutionary psychologists argue that animals ranging from fiddler crabs to humans use eyesight for collision avoidance, suggesting that vision is basically for directing action, not providing knowledge.[69] Neuropsychologists have shown that perceptual systems evolved to match the specifics of animals' activities. This explains why, for example, bats and worms perceive different ranges of auditory and visual frequencies than humans do.

Building and maintaining sense organs is metabolically expensive. More than half the brain is devoted to processing sensory information, and the brain itself consumes roughly one-fourth of one's metabolic resources. Thus, such organs evolve only when they provide exceptional benefits to an organism's fitness.[69]

Scientists who study perception and sensation have long understood the human senses as adaptations.[69] Depth perception consists of processing over half a dozen visual cues, each of which is based on a regularity of the physical world.[69] Vision evolved to respond to the narrow range of electromagnetic energy that is plentiful and that does not pass through objects.[69] Sound waves provide useful information about the sources of and distances to objects, with larger animals making and hearing lower-frequency sounds and smaller animals making and hearing higher-frequency sounds.[69] Taste and smell respond to chemicals in the environment that were significant for fitness in the environment of evolutionary adaptedness.[69] The sense of touch is actually many senses, including pressure, heat, cold, tickle, and pain.[69] Pain, while unpleasant, is adaptive.[69] An important adaptation for senses is range shifting, by which the organism becomes temporarily more or less sensitive to sensation.[69] For example, one's eyes automatically adjust to dim or bright ambient light.[69] Sensory abilities of different organisms often co-evolve, as is the case with the hearing of echolocating bats and that of the moths that have evolved to respond to the sounds that the bats make.[69]

Evolutionary psychologists claim that perception demonstrates the principle of modularity, with specialized mechanisms handling particular perception tasks.[69] For example, people with damage to a particular part of the brain are not able to recognize faces (prosopagnosia).[69] Evolutionary psychology suggests that this indicates a so-called face-reading module.[69]

Closed-loop perception

The theory of closed-loop perception proposes a dynamic motor-sensory closed-loop process in which information flows through the environment and the brain in continuous loops.[70][71][72][73] Closed-loop perception appears consistent with anatomy and with the fact that perception is typically an incremental process. Repeated encounters with an object, whether conscious or not, enable an animal to refine its impressions of that object. This can be achieved more easily with a circular closed-loop system than with a linear open-loop one. Closed-loop perception can explain many of the phenomena that open-loop perception struggles to account for. This is largely because closed-loop perception considers motion to be an integral part of perception, and not an interfering component that must be corrected for. Furthermore, an environment perceived via sensor motion, and not despite sensor motion, need not be further stabilized by internal processes.[73]
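The contrast between circular and linear processing can be caricatured in a short sketch (an illustration only, not a model from the cited work; the function names and the deterministic noise term are assumptions made for the example): each motor-sensory cycle moves the sensor and folds the new reading into the percept, so repeated loops refine an estimate in a way that a single open-loop reading cannot.

```python
# Toy sketch of closed-loop perception (illustrative assumptions throughout):
# each motor-sensory cycle moves the sensor, takes a reading from the new
# position, and folds it into the running percept. Repeated loops average
# out position-dependent sensor error.

def sense(true_value, position):
    # Deterministic position-dependent "noise" standing in for sensor error.
    return true_value + 0.3 * (((position * 7) % 11) - 5) / 5.0

def closed_loop_percept(true_value, cycles):
    total = 0.0
    for position in range(cycles):            # motor action: move the sensor
        total += sense(true_value, position)  # sensory sample from there
    return total / cycles                     # incrementally refined percept

one_shot = closed_loop_percept(5.0, cycles=1)   # single open-loop reading
refined  = closed_loop_percept(5.0, cycles=22)  # many motor-sensory loops
```

In the sketch, the single reading is biased by the sensor error at its one position, while the looped estimate converges on the true value, mirroring the incremental refinement described above.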

Feature integration theory

Main page: Feature integration theory

Anne Treisman's feature integration theory (FIT) attempts to explain how characteristics of a stimulus such as physical location in space, motion, color, and shape are merged to form one percept despite each of these characteristics activating separate areas of the cortex. FIT explains this through a two-part system of perception involving the preattentive and focused attention stages.[74][75][76][77][78]

The preattentive stage of perception is largely unconscious, and analyzes an object by breaking it down into its basic features, such as the specific color, geometric shape, motion, depth, individual lines, and many others.[74] Studies have shown that, when small groups of objects with different features (e.g., red triangle, blue circle) are briefly flashed in front of human participants, many individuals later report seeing shapes made up of the combined features of two different stimuli, a phenomenon referred to as illusory conjunctions.[74][77]

The unconnected features described in the preattentive stage are combined into the objects one normally sees during the focused attention stage.[74] The focused attention stage is based heavily around the idea of attention in perception and 'binds' the features together onto specific objects at specific spatial locations (see the binding problem).[74][78]
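The two stages can be caricatured in a toy sketch (purely illustrative; the data structures and function names are assumptions, not part of FIT's formal statement): preattentive processing registers free-floating features, which can recombine into illusory conjunctions, while focused attention binds the features found at one attended location.

```python
import random

# Toy sketch of feature integration theory (illustrative names throughout).
stimuli = [
    {"location": 0, "color": "red",  "shape": "triangle"},
    {"location": 1, "color": "blue", "shape": "circle"},
]

def preattentive(stimuli):
    # Features are registered as free-floating lists, detached from objects.
    return {
        "colors": [s["color"] for s in stimuli],
        "shapes": [s["shape"] for s in stimuli],
    }

def illusory_conjunction(features, rng):
    # Without attention, free-floating features may recombine incorrectly,
    # e.g. yielding a "blue triangle" that was never presented.
    return (rng.choice(features["colors"]), rng.choice(features["shapes"]))

def focused_attention(stimuli, location):
    # Attention binds the features present at one attended spatial location.
    s = next(s for s in stimuli if s["location"] == location)
    return (s["color"], s["shape"])
```

The sketch shows only the division of labor: correct binding requires the attended location, while feature lists alone permit miscombination.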

Shared Intentionality theory

Main page: Social:Shared intentionality

A fundamentally different approach to understanding the perception of objects relies on the essential role of Shared intentionality.[79] Michael Tomasello hypothesized that social bonds between children and caregivers gradually increase through the essential motive force of shared intentionality, beginning from birth.[80] The notion of shared intentionality, introduced by Tomasello, was developed by later researchers, who have explained this collaborative interaction from different perspectives, e.g., psychophysiology[81][82][83] and neurobiology.[84] The Shared intentionality approach places the emergence of perception at an earlier stage of an organism's development than other theories, even before the emergence of Intentionality. Because many theories build their account of perception on its main features of organizing, identifying, and interpreting sensory information to represent a holistic picture of the environment, Intentionality is a central issue in the development of perception. Currently, only one hypothesis attempts to explain Shared intentionality in its full complexity, from interpersonal dynamics down to interaction at the neuronal level. The hypothesis of neurobiological processes occurring during Shared intentionality[85] holds that, at the beginning of cognition, very young organisms cannot distinguish relevant sensory stimuli independently. Because the environment is a cacophony of stimuli (electromagnetic waves, chemical interactions, and pressure fluctuations), their sensation is too noise-limited to solve the cue problem: a relevant stimulus cannot overcome the magnitude of the noise as it passes through the senses. Intentionality is therefore a difficult problem for them, since it requires a representation of the environment already categorized into objects (see also binding problem), while the perception of objects is equally problematic, since it cannot appear without Intentionality.
From the perspective of this hypothesis, Shared intentionality consists of collaborative interactions in which participants share the essential sensory stimulus of the actual cognitive problem. This social bond enables ecological training of the young, immature organism, starting at the reflex stage of development, in the organization, identification, and interpretation of sensory information as perception develops.[86] On this account, perception emerges due to Shared intentionality at the embryonic stage of development, i.e., even before birth.[87]

Other theories of perception

Effects on perception

Effect of experience

Past actions and events that transpire immediately before an encounter or any form of stimulation strongly influence how sensory stimuli are processed and perceived. On a basic level, the information our senses receive is often ambiguous and incomplete. However, it is grouped together so that we can understand the physical world around us. It is these various forms of stimulation, combined with our previous knowledge and experience, that allow us to create our overall perception. For example, when engaging in conversation with someone, we attempt to understand their message and words not only by paying attention to what we hear but also by drawing on the shapes we have previously seen their mouth make. As another example, if a familiar topic comes up in a later conversation, we use our previous knowledge to guess the direction the conversation is heading.[88]

Effect of motivation and expectation

Perceptual sets can be created by motivation, and can thus result in people interpreting ambiguous figures so that they see what they want to see.[89] For instance, how someone perceives what unfolds during a sports game can be biased if they strongly support one of the teams.[90] In one experiment, students were allocated to pleasant or unpleasant tasks by a computer. They were told that either a number or a letter would flash on the screen to say whether they were going to taste an orange juice drink or an unpleasant-tasting health drink. In fact, an ambiguous figure was flashed on screen, which could be read either as the letter B or as the number 13. When letters were associated with the pleasant task, subjects were more likely to perceive the letter B, and when letters were associated with the unpleasant task they tended to perceive the number 13.[91]

Perceptual set has been demonstrated in many social contexts. When someone has a reputation for being funny, an audience is more likely to find them amusing.[92] Individuals' perceptual sets reflect their own personality traits. For example, people with an aggressive personality are quicker to correctly identify aggressive words or situations.[92]

One classic psychological experiment showed slower reaction times and less accurate answers when a deck of playing cards reversed the color of the suit symbol for some cards (e.g. red spades and black hearts).[93]

Philosopher Andy Clark explains that perception, although it occurs quickly, is not simply a bottom-up process (where minute details are put together to form larger wholes). Instead, our brains use what he calls predictive coding. It starts with very broad constraints and expectations for the state of the world, and as expectations are met, it makes more detailed predictions (errors lead to new predictions, or learning processes). Clark says this research has various implications; not only can there be no completely "unbiased, unfiltered" perception, but this means that there is a great deal of feedback between perception and expectation (perceptual experiences often shape our beliefs, but those perceptions were based on existing beliefs).[94] Indeed, predictive coding provides an account where this type of feedback assists in stabilizing our inference-making process about the physical world, such as with perceptual constancy examples.

Embodied cognition challenges the idea of perception as internal representations resulting from a passive reception of (incomplete) sensory inputs coming from the outside world. According to O'Regan (1992), the major issue with this perspective is that it leaves the subjective character of perception unexplained.[95] Thus, perception is understood as an active process conducted by perceiving and engaged agents (perceivers). Furthermore, perception is influenced by agents' motives and expectations, their bodily states, and the interaction between the agent's body and the environment around it.[96]

Philosophy

Main page: Philosophy:Philosophy of perception

Perception is an important part of the theories of many philosophers; it has been famously addressed by René Descartes, George Berkeley, and Immanuel Kant, to name a few. In his Meditations, Descartes begins by doubting all of his perceptions, proving his existence with the famous phrase "I think, therefore I am", and then works to the conclusion that perceptions are God-given.[97] George Berkeley took the stance that all things we see have a reality to them and that our perceptions are sufficient to know and understand them, because our perceptions are capable of responding to a true reality.[98] Kant meets the rationalists and the empiricists almost halfway: his theory posits the noumenon, the actual object that cannot be understood in itself, and the phenomenon, the human understanding that arises when the mind interprets that noumenon.[99]

See also

References

Citations

  1. "Soltani, A. A., Huang, H., Wu, J., Kulkarni, T. D., & Tenenbaum, J. B. Synthesizing 3D Shapes via Modeling Multi-View Depth Maps and Silhouettes With Deep Generative Networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 1511-1519).". 28 May 2019. https://github.com/Amir-Arsalan/Synthesize3DviaDepthOrSil. 
  2. Schacter, Daniel (2011). Psychology. Worth Publishers. ISBN 9781429237192. https://archive.org/details/psychology0000scha. 
  3. 3.0 3.1 3.2 3.3 3.4 3.5 Goldstein (2009) pp. 5–7
  4. 4.0 4.1 4.2 4.3 4.4 Gregory, Richard. "Perception" in Gregory, Zangwill (1987) pp. 598–601.
  5. 5.0 5.1 Bernstein, Douglas A. (5 March 2010). Essentials of Psychology. Cengage Learning. pp. 123–124. ISBN 978-0-495-90693-3. https://books.google.com/books?id=rd77N0KsLVkC&pg=PA123. Retrieved 25 March 2011. 
  6. Gustav Theodor Fechner. Elemente der Psychophysik. Leipzig 1860.
  7. 7.0 7.1 DeVere, Ronald; Calvert, Marjorie (31 August 2010). Navigating Smell and Taste Disorders. Demos Medical Publishing. pp. 33–37. ISBN 978-1-932603-96-5. https://books.google.com/books?id=m6WOtX2QAtwC&pg=PA39. Retrieved 26 March 2011. 
  8. 8.0 8.1 Pomerantz, James R. (2003): "Perception: Overview". In: Lynn Nadel (Ed.), Encyclopedia of Cognitive Science, Vol. 3, London: Nature Publishing Group, pp. 527–537.
  9. "Sensation and Perception". http://www.learner.org/discoveringpsychology/07/e07glossary.html. 
  10. Willis, William D.; Coggeshall, Richard E. (31 January 2004). Sensory Mechanisms of the Spinal Cord: Primary afferent neurons and the spinal dorsal horn. Springer. p. 1. ISBN 978-0-306-48033-1. https://books.google.com/books?id=uqnKCewO2voC&pg=PA1. Retrieved 25 March 2011. 
  11. "Perception, Attribution, and, Judgment of Others". http://catalogue.pearsoned.ca/assets/hip/ca/hip_ca_pearsonhighered/samplechapter/013613436X.pdf. 
  12. Alan S. & Gary J. (2011). Perception, Attribution, and Judgment of Others. Organizational Behaviour: Understanding and Managing Life at Work, Vol. 7.
  13. Sincero, Sarah Mae. 2013. "Perception." Explorable. Retrieved 8 March 2020 (https://explorable.com/perception).
  14. Gollisch, Tim; Meister, Markus (28 January 2010). "Eye Smarter than Scientists Believed: Neural Computations in Circuits of the Retina". Neuron 65 (2): 150–164. doi:10.1016/j.neuron.2009.12.009. PMID 20152123. 
  15. "Retinal Ganglion Cells-Diversity of Cell Types and Clinical Relevance". Front Neurol 12: 661938. 2021. doi:10.3389/fneur.2021.661938. PMID 34093409. 
  16. "The structure and precision of retinal spike trains". Proc Natl Acad Sci U S A 94 (10): 5411–6. May 1997. doi:10.1073/pnas.94.10.5411. PMID 9144251. Bibcode1997PNAS...94.5411B. 
  17. "Response Latency Tuning by Retinal Circuits Modulates Signal Efficiency". Sci Rep 9 (1): 15110. October 2019. doi:10.1038/s41598-019-51756-y. PMID 31641196. Bibcode2019NatSR...915110T. 
  18. "Neural mechanisms of rapid sensitivity to syntactic anomaly". Front Psychol 4: 45. 2013. doi:10.3389/fpsyg.2013.00045. PMID 23515395. 
  19. D'Ambrose, Christoper; Choudhary, Rizwan (2003). "Frequency range of human hearing". in Elert, Glenn. https://hypertextbook.com/facts/2003/ChrisDAmbrose.shtml. Retrieved 2022-01-22. 
  20. 20.0 20.1 Cite error: Invalid <ref> tag; no text was provided for refs named eop_constancy
  21. Gibson, J.J. (1966). The senses considered as perceptual systems.. Boston: Houghton Mifflin. ISBN 978-0-313-23961-8. https://archive.org/details/sensesconsidered00jame. 
  22. "Umami Dearest: The mysterious fifth taste has officially infiltrated the food scene". trendcentral.com. 23 February 2010. http://www.trendcentral.com/life/umami-dearest/. 
  23. "#8 Food Trend for 2010: I Want My Umami". foodchannel.com. 6 December 2009. http://www.foodchannel.com/articles/article/8-food-trend-for-2010-i-want-my-umami/. 
  24. Cite error: Invalid <ref> tag; no text was provided for refs named DeVereCalvert2010_39
  25. 25.0 25.1 25.2 Siegel, George J.; Albers, R. Wayne (2006). Basic neurochemistry: molecular, cellular, and medical aspects. Academic Press. p. 825. ISBN 978-0-12-088397-4. https://books.google.com/books?id=Af0IyHtGCMUC&pg=PA825. Retrieved 26 March 2011. 
  26. Food texture: measurement and perception (page 3–4/311) Andrew J. Rosenthal. Springer, 1999.
  27. Why do two great tastes sometimes not taste great together? scientificamerican.com. Dr. Tim Jacob, Cardiff University. 22 May 2009.
  28. Brookes, Jennifer (13 August 2010). "Science is perception: what can our sense of smell tell us about ourselves and the world around us?". Philosophical Transactions. Series A, Mathematical, Physical, and Engineering Sciences 368 (1924): 3491–3502. doi:10.1098/rsta.2010.0117. PMID 20603363. Bibcode2010RSPTA.368.3491B. 
  29. Weir, Kirsten (February 2011). "Scents and sensibility". https://www.apa.org/monitor/2011/02/scents.aspx. 
  30. Bergland, Christopher (29 June 2015). "Psychology Today". How Does Scent Drive Human Behavior?. https://www.psychologytoday.com/us/blog/the-athletes-way/201506/how-does-scent-drive-human-behavior. 
  31. E. R. Smith, D. M. Mackie (2000). Social Psychology. Psychology Press, 2nd ed., p. 20
  32. Watkins, Anthony J.; Raimond, Andrew; Makin, Simon J. (23 March 2010). "Room reflection and constancy in speech-like sounds: Within-band effects". in Lopez-Poveda, Enrique A.. The Neurophysiological Bases of Auditory Perception. Springer. p. 440. ISBN 978-1-4419-5685-9. Bibcode2010nbap.book.....L. https://books.google.com/books?id=ACkNL-G7gUUC&pg=PA440. Retrieved 26 March 2011. 
  33. Rosenblum, Lawrence D. (15 April 2008). "Primacy of Multimodal Speech Perception". in Pisoni, David; Remez, Robert. The Handbook of Speech Perception. John Wiley & Sons. p. 51. ISBN 9780470756775. https://books.google.com/books?id=EwY15naRiFgC&q=%22Primacy+of+Multimodal+Speech+Perception%22&pg=PA51. 
  34. Davis, Matthew H.; Johnsrude, Ingrid S. (July 2007). "Hearing speech sounds: Top-down influences on the interface between audition and speech perception". Hearing Research 229 (1–2): 132–147. doi:10.1016/j.heares.2007.01.014. PMID 17317056. 
  35. Warren, R. M. (1970). "Restoration of missing speech sounds". Science 167 (3917): 392–393. doi:10.1126/science.167.3917.392. PMID 5409744. Bibcode1970Sci...167..392W. 
  36. Case, LK; Laubacher, CM; Olausson, H; Wang, B; Spagnolo, PA; Bushnell, MC (2016). "Encoding of Touch Intensity But Not Pleasantness in Human Primary Somatosensory Cortex". J Neurosci 36 (21): 5850–60. doi:10.1523/JNEUROSCI.1130-15.2016. PMID 27225773. 
  37. "Multi-Modal Perception". p. Introduction to Psychology. https://courses.lumenlearning.com/waymaker-psychology/chapter/multi-modal-perception/. 
  38. "The evolution of brain activation during temporal processing". Nature Neuroscience 4 (3): 317–23. March 2001. doi:10.1038/85191. PMID 11224550. https://escholarship.org/uc/item/80c4d02m. 
  39. "Brain Areas Critical To Human Time Sense Identified". UniSci – Daily University Science News. 2001-02-27. http://www.unisci.com/stories/20011/0227013.htm. 
  40. "Executive dysfunction in Parkinson's disease and timing deficits". Frontiers in Integrative Neuroscience 7: 75. October 2013. doi:10.3389/fnint.2013.00075. PMID 24198770. "Manipulations of dopaminergic signaling profoundly influence interval timing, leading to the hypothesis that dopamine influences internal pacemaker, or "clock", activity. For instance, amphetamine, which increases concentrations of dopamine at the synaptic cleft advances the start of responding during interval timing, whereas antagonists of D2 type dopamine receptors typically slow timing;... Depletion of dopamine in healthy volunteers impairs timing, while amphetamine releases synaptic dopamine and speeds up timing.". 
  41. Metzinger, Thomas (2009). The Ego Tunnel. Basic Books. pp. 117–118. ISBN 978-0-465-04567-9. 
  42. "Apparent mental causation. Sources of the experience of will". The American Psychologist 54 (7): 480–92. July 1999. doi:10.1037/0003-066x.54.7.480. PMID 10424155. 
  43. Metzinger, Thomas (2003). Being No One. p. 508. 
  44. Mandler (1980). "Recognizing: the judgement of prior occurrence". Psychological Review 87 (3): 252–271. doi:10.1037/0033-295X.87.3.252. http://www.escholarship.org/uc/item/58b2c2fc. 
  45. "Bidirectional Modulation of Recognition Memory". The Journal of Neuroscience 35 (39): 13323–35. September 2015. doi:10.1523/JNEUROSCI.2278-15.2015. PMID 26424881. 
  46. "Detecting and discriminating novel objects: The impact of perirhinal cortex disconnection on hippocampal activity patterns". Hippocampus 26 (11): 1393–1413. November 2016. doi:10.1002/hipo.22615. PMID 27398938. 
  47. "Sensory Corpuscles". Abdominal Key. 2017-03-29. https://abdominalkey.com/sensory-corpuscles/. 
  48. Wettlaufer, Alexandra K. (2003). In the mind's eye : the visual impulse in Diderot, Baudelaire and Ruskin, pg. 257. Amsterdam: Rodopi. ISBN 978-90-420-1035-2. 
  49. The Secret Advantage Of Being Short by Robert Krulwich. All Things Considered, NPR. 18 May 2009.
  50. Kolb & Whishaw: Fundamentals of Human Neuropsychology (2003)
  51. Farb N.; Daubenmier J.; Price C. J.; Gard T.; Kerr C.; Dunn B. D.; Mehling W. E. (2015). "Interoception, contemplative practice, and health". Frontiers in Psychology 6: 763. doi:10.3389/fpsyg.2015.00763. PMID 26106345. 
  52. Gray, Peter O. (2006): Psychology, 5th ed., New York: Worth, p. 281. ISBN:978-0-7167-0617-5
  53. Wolfe, Jeremy M.; Kluender, Keith R.; Levi, Dennis M.; Bartoshuk, Linda M.; Herz, Rachel S.; Klatzky, Roberta L.; Lederman, Susan J. (2008). "Gestalt Grouping Principles". Sensation and Perception (2nd ed.). Sinauer Associates. pp. 78, 80. ISBN 978-0-87893-938-1. http://www.sinauer.com./wolfe/chap4/gestaltF.htm. 
  54. Goldstein (2009). pp. 105–107
  55. Banerjee, J. C. (1994). "Gestalt Theory of Perception". Encyclopaedic Dictionary of Psychological Terms. M.D. Publications Pvt. Ltd. pp. 107–108. ISBN 978-81-85880-28-0. 
  56. Weiten, Wayne (1998). Psychology: themes and variations (4th ed.). Brooks/Cole Pub. Co.. p. 144. ISBN 978-0-534-34014-8. 
  57. Corsini, Raymond J. (2002). The dictionary of psychology. Psychology Press. p. 219. ISBN 978-1-58391-328-4. https://books.google.com/books?id=0uxnglHzYaoC&pg=PA219. Retrieved 24 March 2011. 
  58. 58.0 58.1 Kushner, Laura H. (2008). Contrast in judgments of mental health. p. 1. ISBN 978-0-549-91314-6. https://books.google.com/books?id=TYn5VHp9jioC&pg=PA1. Retrieved 24 March 2011. 
  59. 59.0 59.1 Plous, Scott (1993). The psychology of judgment and decision making. McGraw-Hill. pp. 38–41. ISBN 978-0-07-050477-6. https://books.google.com/books?id=xvWOQgAACAAJ. Retrieved 24 March 2011. 
  60. Moskowitz, Gordon B. (2005). Social cognition: understanding self and others. Guilford Press. p. 421. ISBN 978-1-59385-085-2. https://books.google.com/books?id=_-NLW8Ynvp8C&pg=PA421. Retrieved 24 March 2011. 
  61. Popper, Arthur N. (30 November 2010). Music Perception. Springer. p. 150. ISBN 978-1-4419-6113-6. https://books.google.com/books?id=ZYXd3CF1_vkC&pg=PA150. Retrieved 24 March 2011. 
  62. Biederlack, J.; Castelo-Branco, M.; Neuenschwander, S.; Wheeler, D.W.; Singer, W.; Nikolić, D. (2006). "Brightness induction: Rate enhancement and neuronal synchronization as complementary codes". Neuron 52 (6): 1073–1083. doi:10.1016/j.neuron.2006.11.012. PMID 17178409. 
  63. Stone, James V. (2012): "Vision and Brain: How we perceive the world", Cambridge, MIT Press, pp. 155-178.
  64. Gibson, James J. (2002): "A Theory of Direct Visual Perception". In: Alva Noë/Evan Thompson (Eds.), Vision and Mind. Selected Readings in the Philosophy of Perception, Cambridge, MIT Press, pp. 77–89.
  65. Sokolowski, Robert (2008). Phenomenology of the Human Person. New York: Cambridge University Press. pp. 199–200. ISBN 978-0521717663. https://books.google.com/books?id=NIEJt5afhwgC. 
  66. Richards, Robert J. (December 1976). "James Gibson's Passive Theory of Perception: A Rejection of the Doctrine of Specific Nerve Energies". Philosophy and Phenomenological Research 37 (2): 218–233. doi:10.2307/2107193. http://philosophy.uchicago.edu/faculty/files/richards/James%20Gibson's%20Passive%20Theory%20of%20Perception.pdf. 
  67. Consciousness in Action, S. L. Hurley, illustrated, Harvard University Press, 2002, 0674007964, pp. 430–432.
  68. Glasersfeld, Ernst von (1995), Radical Constructivism: A Way of Knowing and Learning, London: RoutledgeFalmer; Poerksen, Bernhard (ed.) (2004), The Certainty of Uncertainty: Dialogues Introducing Constructivism, Exeter: Imprint Academic; Wright. Edmond (2005). Narrative, Perception, Language, and Faith, Basingstoke: Palgrave Macmillan.
  69. 69.00 69.01 69.02 69.03 69.04 69.05 69.06 69.07 69.08 69.09 69.10 69.11 69.12 69.13 69.14 69.15 69.16 Gaulin, Steven J. C. and Donald H. McBurney. Evolutionary Psychology. Prentice Hall. 2003. ISBN:978-0-13-111529-3, Chapter 4, pp. 81–101.
  70. Dewey J (1896). "The reflex arc concept in psychology". Psychological Review 3 (4): 359–370. doi:10.1037/h0070405. https://pdfs.semanticscholar.org/a7ab/dafa9cca3547d8f441ee9dc3b5ad19ee7f59.pdf. 
  71. Friston, K. (2010). "The free-energy principle: a unified brain theory?" Nature Reviews Neuroscience 11: 127–138.
  72. Tishby, N. and D. Polani, Information theory of decisions and actions, in Perception-Action Cycle. 2011, Springer. p. 601-636.
  73. 73.0 73.1 Ahissar E., Assa E. (2016). "Perception as a closed-loop convergence process". eLife 5: e12830. doi:10.7554/eLife.12830. PMID 27159238.  This article incorporates text from this source, which is available under the CC BY 4.0 license.
  74. 74.0 74.1 74.2 74.3 74.4 Goldstein, E. Bruce (2015). Cognitive Psychology: Connecting Mind, Research, and Everyday Experience, 4th Edition. Stamford, CT: Cengage Learning. pp. 109–112. ISBN 978-1-285-76388-0. 
  75. Treisman, Anne; Gelade, Garry (1980). "A Feature-Integration Theory of Attention". Cognitive Psychology 12 (1): 97–136. doi:10.1016/0010-0285(80)90005-5. PMID 7351125. http://homepage.psy.utexas.edu/homepage/class/Psy355/Gilden/treisman.pdf. 
  76. Goldstein, E. Bruce (2010). Sensation and Perception (8th ed.). Belmont, CA: Cengage Learning. pp. 144–146. ISBN 978-0-495-60149-4. 
  77. 77.0 77.1 Treisman, Anne; Schmidt, Hilary (1982). "Illusory Conjunctions in the Perception of Objects". Cognitive Psychology 14 (1): 107–141. doi:10.1016/0010-0285(82)90006-8. PMID 7053925. https://www.sciencedirect.com/science/article/abs/pii/0010028582900068. 
  78. 78.0 78.1 Treisman, Anne (1977). "Focused Attention in the Perception and Retrieval of Multidimensional Stimuli". Perception & Psychophysics 22 (1): 1–11. 
  79. Tomasello, M. (1999). The Cultural Origins of Human Cognition. Cambridge, Massachusetts: Harvard University Press. 1999.
  80. Tomasello, M. (2019). Becoming Human: A Theory of Ontogeny. Cambridge, Massachusetts: Harvard University Press.
  81. Val Danilov, I. & Mihailova, S. (2023). "Empirical Evidence of Shared Intentionality: Towards Bioengineering Systems Development." OBM Neurobiology 2023; 7(2): 167; doi:10.21926/obm.neurobiol.2302167. https://www.lidsen.com/journals/neurobiology/neurobiology-07-02-167
  82. McClung, J. S., Placì, S., Bangerter, A., Clément, F., & Bshary, R. (2017). "The language of cooperation: shared intentionality drives variation in helping as a function of group membership." Proceedings of the Royal Society B: Biological Sciences, 284(1863), 20171682. http://dx.doi.org/10.1098/rspb.2017.1682.
  83. Shteynberg, G., & Galinsky, A. D. (2011). "Implicit coordination: Sharing goals with similar others intensifies goal pursuit." Journal of Experimental Social Psychology, 47(6), 1291-1294., https://doi.org/10.1016/j.jesp. 2011.04.012.
  84. Fishburn, F. A., Murty, V. P., Hlutkowsky, C. O., MacGillivray, C. E., Bemis, L. M., Murphy, M. E., ... & Perlman, S. B. (2018). "Putting our heads together: interpersonal neural synchronization as a biological mechanism for shared intentionality." Social cognitive and affective neuroscience, 13(8), 841-849.
  85. Val Danilov, Igor (2023-02-17). "Theoretical Grounds of Shared Intentionality for Neuroscience in Developing Bioengineering Systems". OBM Neurobiology 7 (1): 156. doi:10.21926/obm.neurobiol.2301156. https://www.lidsen.com/journals/neurobiology/neurobiology-07-01-156. 
  86. Val Danilov, Igor (2023). "Shared Intentionality Modulation at the Cell Level: Low-Frequency Oscillations for Temporal Coordination in Bioengineering Systems" (in en). OBM Neurobiology 7 (4): 1–17. doi:10.21926/obm.neurobiol.2304185. https://www.lidsen.com/journals/neurobiology/neurobiology-07-04-185. 
  87. Val Danilov, Igor (2023). "Low-Frequency Oscillations for Nonlocal Neuronal Coupling in Shared Intentionality Before and After Birth: Toward the Origin of Perception" (in en). OBM Neurobiology 7 (4): 1–17. doi:10.21926/obm.neurobiol.2304192. https://www.lidsen.com/journals/neurobiology/neurobiology-07-04-192. 
  88. Snyder, Joel (31 October 2015). "How previous experience shapes perception in different sensory modalities". Frontiers in Human Neuroscience 9: 594. doi:10.3389/fnhum.2015.00594. PMID 26582982. 
  89. Cite error: Invalid <ref> tag; no text was provided for refs named CoonMitterer2008
  90. Block, J. R.; Yuker, Harold E. (1 October 2002). Can You Believe Your Eyes?: Over 250 Illusions and Other Visual Oddities. Robson. pp. 173–174. ISBN 978-1-86105-586-6. https://books.google.com/books?id=uNMFiMQu8BMC&pg=PA173. Retrieved 24 March 2011. 
  91. Cite error: Invalid <ref> tag; no text was provided for refs named Weiten2008
  92. 92.0 92.1 Cite error: Invalid <ref> tag; no text was provided for refs named HardyHeyes1999
  93. "On the Perception of Incongruity: A Paradigm" by Jerome S. Bruner and Leo Postman. Journal of Personality, 18, pp. 206-223. 1949. Yorku.ca
  94. "Predictive Coding". http://www.edge.org/q2011/q11_6.html. 
  95. O'Regan, J. Kevin (1992). "Solving the "real" mysteries of visual perception: The world as an outside memory." (in en). Canadian Journal of Psychology 46 (3): 461–488. doi:10.1037/h0084327. ISSN 0008-4255. PMID 1486554. http://doi.apa.org/getdoi.cfm?doi=10.1037/h0084327. 
  96. O'Regan, J. Kevin; Noë, Alva (2001). "A sensorimotor account of vision and visual consciousness" (in en). Behavioral and Brain Sciences 24 (5): 939–973. doi:10.1017/S0140525X01000115. ISSN 0140-525X. PMID 12239892. https://www.cambridge.org/core/product/identifier/S0140525X01000115/type/journal_article. 
  97. Hatfield, Gary (2023), Zalta, Edward N.; Nodelman, Uri, eds., René Descartes (Winter 2023 ed.), Metaphysics Research Lab, Stanford University, https://plato.stanford.edu/archives/win2023/entries/descartes/, retrieved 2023-11-11 
  98. Downing, Lisa (2021), Zalta, Edward N., ed., George Berkeley (Fall 2021 ed.), Metaphysics Research Lab, Stanford University, https://plato.stanford.edu/archives/fall2021/entries/berkeley/, retrieved 2023-11-11 
  99. Rohlf, Michael (2023), Zalta, Edward N.; Nodelman, Uri, eds., Immanuel Kant (Fall 2023 ed.), Metaphysics Research Lab, Stanford University, https://plato.stanford.edu/archives/fall2023/entries/kant/, retrieved 2023-11-11 

Sources

Bibliography

  • Arnheim, R. (1969). Visual Thinking. Berkeley: University of California Press. ISBN:978-0-520-24226-5.
  • Flanagan, J. R., & Lederman, S. J. (2001). "'Neurobiology: Feeling bumps and holes. News and Views", Nature, 412(6845):389–91. (PDF)
  • Gibson, J. J. (1966). The Senses Considered as Perceptual Systems, Houghton Mifflin.
  • Gibson, J. J. (1987). The Ecological Approach to Visual Perception. Lawrence Erlbaum Associates. ISBN:0-89859-959-8
  • Robles-De-La-Torre, G. (2006). "The Importance of the Sense of Touch in Virtual and Real Environments". IEEE MultiMedia,13(3), Special issue on Haptic User Interfaces for Multimedia Systems, pp. 24–30. (PDF)

External links