Auditory attention — focusing the searchlight on sound

Jonathan B Fritz, Mounya Elhilali, Stephen V David and Shihab A Shamma

Some fifty years after the first physiological studies of auditory attention, the field is now ripening, with exciting recent insights into the psychophysics, psychology, and neural basis of auditory attention. Current research seeks to unravel the complex interactions of pre-attentive and attentive processing of the acoustic scene, the role of auditory attention in mediating receptive-field plasticity in both auditory spatial and auditory feature processing, the contrasts and parallels between auditory and visual attention pathways and mechanisms, the interplay of bottom-up and top-down attentional mechanisms, the influential role of attention, goals, and expectations in shaping auditory processing, and the orchestration of diverse attentional effects at multiple levels from the cochlea to the cortex.

Addresses: Centre for Auditory and Acoustic Research, Institute for Systems Research, University of Maryland, College Park, MD 20742, USA. Corresponding author: Fritz, Jonathan B (ripple@isr.umd.edu)

Current Opinion in Neurobiology 2007, 17:437–455. This review comes from a themed issue on Sensory systems, edited by Peter Mombaerts and Tony Zador. Available online 21st August 2007. 0959-4388/$ – see front matter. Published by Elsevier Ltd. DOI 10.1016/j.conb.2007.07.011

Introduction and overview

Auditory attention allows us to rapidly and precisely direct our acoustic searchlight toward sounds of interest in our acoustic environment. Attention can be top-down (voluntary or task-dependent) or bottom-up (sound-based salience). At the interface of perception and action, top-down attention leads to enhanced information processing, behavioral sensitivity, and shortened response latencies. Top-down attention is a selection process that focuses cortical processing resources on the most relevant sensory information in order to maintain goal-directed behavior in the presence of multiple, competing distractions, and it comprises several distinct behavioral and neural processes operating at multiple levels [1,2,3,4,5]. Bottom-up ‘pop-out’ attention also plays an important role in ‘reading’ the acoustic scene and selectively gating incoming salient signals [6].

This review seeks to provide a roadmap to current insights and outstanding questions in the neurobiology of auditory attention. Although of great interest, we will say little about the psychophysics of selective auditory attention in extracting salient ‘signals’ from a complex and noisy acoustic ‘background’, as this topic has recently been reviewed elsewhere [7]. Since tremendous insights have been gathered on the mechanisms and effects of attention in other modalities, particularly visual attention [8,9], we shall discuss some of the emerging parallels (and differences) between visual and auditory attention. We shall focus on recent advances that reveal different ways in which the neural representation of sound is influenced by task-specific demands, expectations, and the focus of attention [10,11,12,13,14,15]. Since the pioneering work of Hubel, Galambos and colleagues [16], it has been known that the responses of neurons in auditory cortex can be strongly modulated by attention. Other neurophysiological studies confirmed these effects of auditory attention, and subsequent human studies showed that event-related potential (ERP) responses, including early responses (the P20–P50, a mere 20–50 ms after stimulus onset) and the N1 waveform (100 ms latency), could be influenced by attention [17,18]. Since these early single-unit and ERP studies, there has been a gathering interest in the neurobiology of auditory attention, using a variety of approaches and techniques, including psychoacoustic, behavioral, neurophysiological (single-unit, multi-unit, local field potential (LFP) and whole-brain EEG studies), MEG (magnetoencephalography), and functional neuroimaging (PET and fMRI), all of which have added considerably to our insights into the nature of auditory attention.

One important methodological caveat and conceptual caution is that, while many human and animal studies infer the presence of auditory attention (or its absence) from a combination of task design, subject behavioral performance, and the ensuing neural effects, attention itself can be flickering and elusive. It is notoriously difficult to precisely measure its highly variable selectivity, intensity, and duration. The lack of a commonly accepted, quantifiable measure of attention bedevils cross-study comparisons. Although the magnitude of attentional modulation of neuronal activity may scale with increasing task difficulty [19,20], this correlation can be confounded with the effects of task design and subject behavioral strategy [20]. In studies of human auditory attention, typical controls are to direct the subject to view a silent video or read a book and to ignore auditory input, but it is quite possible that the subject could sneak an occasional ‘listen’ or auditory peek. These difficulties present real challenges for experimentalists studying auditory attention in both humans and animals.

Glossary
A1: Primary auditory cortex.
ACC: Anterior cingulate cortex — medial prefrontal structure likely to be important in control of attention.
ASA: Auditory scene analysis — decomposition of a complex mixture of incoming sounds into individual sound sources and sound streams.
ERP: Event-related brain potentials (averaged EEG segments time-locked to stimulus onset).
FBD: Foreground–background decomposition — separation of a foreground sound stream of interest from the background acoustic scene.
MMN: Mismatch negativity — a negative waveform in the deviant ERP response that occurs about 150–200 ms after stimulus onset, evoked by an ‘oddball’ stimulus in a sound sequence in which rare sounds (‘deviants’) ‘pop out’ in contrast to the repeated ‘standard’ sound. MMN is based on largely pre-attentive mechanisms, but can be influenced by attention.
N1 (also known as the N100): First negative wave in the ERP, occurring 100 ms after sound onset, followed by the N2 wave, occurring 200 ms after sound onset (the later N2b is associated with auditory attention).
P2 and P3 (also known as the P200 and P300): ERP positive waves following the N1, occurring 180 ms and 300 ms after sound onset (like the N2b, the later P3b is also associated with auditory attention).
PFC: Prefrontal cortex — important component of the frontal–parietal attentional network, likely to play a major role in setting goals and expectations, allocating and directing attentional resources, and monitoring ongoing events in a short-term memory buffer. PFC is also a source of top-down projections that can dynamically shape sensory cortex in accord with changing task demands.
SSA: Stimulus-specific adaptation — in probabilistic settings in which one stimulus is common and another is rare, responses to common sounds adapt more strongly than responses to rare sounds. SSA, measured at a cellular level in auditory cortex, precedes and may induce the neural activity giving rise to MMN.
SNR: Signal-to-noise ratio — in a physical acoustics sense, the ratio of the target signal amplitude to the background noise (clutter) amplitude.
STRF: Spectrotemporal receptive field — a characterization of both the spectral and temporal tuning properties of an auditory neuron, usually measured with reverse correlation or related regression techniques.

Relationship between pre-attentive and attentive processes in auditory scene analysis

In order to focus auditory attention on specific acoustic objects of interest in the real world, we typically make use of a combination of auditory spatial cues and auditory feature cues to solve the pattern recognition problem of foreground–background decomposition (FBD). This is illustrated by one of the best known examples of auditory attention, the ‘cocktail party effect’, in which we can attend and selectively eavesdrop on different speakers in a crowded room brimming with multiple conversations. Cherry [21] speculated on possible cues to its solution, including location, lip-reading, mean pitch differences, different speaking speeds, male/female speaking voices, and distinctive accents. Beyond the cocktail party example, sound sources may differ in a variety of acoustic dimensions (such as location or trajectory, instantaneous fundamental frequency, harmonicity, intensity, duration, rhythm, or the patterns of energy envelope modulation in different frequency bands) that facilitate grouping. Whatever the combination of cues, or the exact mechanisms involved in deciphering them [22], we accomplish this remarkable feat of selective attention to a single stream on a daily basis in varied acoustic environments with multiple sound sources. To do this, listeners must develop great proficiency at auditory scene analysis (ASA), the process of segregating and grouping sounds from the mixture of sources that typifies our acoustic environment to form representations of relevant auditory streams or objects [23–25]. This process of selectively directing attention to a single auditory stream in a complex, multisource auditory scene, with different auditory elements vying and jostling for attention, may actually shape our perceptual organization of the elements in the scene [26]. Another familiar variation of the cocktail party effect occurs in the musical version, when a listener focuses on the ‘voice’ of a single musical instrument playing in an ensemble [27]. Many animals are also extraordinarily adept at ASA, and there are numerous ethological parallels to the cocktail party, such as emperor penguins identifying the display call of their mate or offspring in the midst of a raucous cacophony of colony babble [28], or vampire bats identifying characteristic individual human snoring and breathing sounds [29] in polyphonic jungle soundscapes. Some have argued that an auditory stream can be formed completely without attention [30], but once formed it can become an object of attention. Others have presented experimental evidence in favor of a role for attention in the formation of auditory streams [31].
The controversy about the contribution of attention to ASA and stream formation has led to considerable experimentation on this fundamental question, which continues to be the focus of intensive research. Overall, the extraction of signal from noise and the separation of foreground from background is likely to be a multi-stage process that draws on bottomup gestalt grouping primitives, on auditory memory (our prior knowledge or expectations of the auditory ‘players’ in the acoustic scene), on attention, as well as other forms of top-down control (Elhilali et al., unpublished data) [32,33]. A crucial and ubiquitous survival skill in the toolkit of animal hearing is the ability to detect the presence of novel or ‘deviant’ sounds amidst the familiar hum of background environmental noise. There is evidence that the brain has evolved a fairly sophisticated novelty detection system that includes an automatic, pre-attentive component that assists in parsing the acoustic scene into streams and analyzes stability and novelty, even for taskirrelevant streams [34–37]. In this system, repetitive stimuli are generally ignored and deviant or ‘oddball’ www.sciencedirect.com Auditory attention — focusing the searchlight on sound Fritz et al. 439 stimuli ‘pop-out’. Oddball detection has been particularly well studied in the auditory system, but is likely to share a common neural network with deviance detection in other modalities [38,39]. The acoustic change-detection system comprises an interconnected set of elements, including ‘adaptive’ neurons, with generalized stimulus adaptation responses, and also more specialized acoustic ‘novelty’ detection neurons that encode stimulus deviation from the pattern of preceding stimuli. ‘Novelty’ signals have been studied primarily at a cortical level, but may occur as early as inferior colliculus [40], suggesting the possibility that subcortical pathways for change detection may play a role in directing attention to novel sounds. However, an alternative explanation [41] is that the ‘novelty’ responses in the inferior colliculus arise from top-down cortical projections. Two EEG signatures for change detection have been described in the human brain: first, the attenuation of an early (100 ms) N1 response with repeated stimulation and secondly, the evocation of a later (100–200 ms) mismatch negativity (MMN) response when a novel stimulus occurs after a repetitive sequence of acoustic ‘standard’ stimuli [42]. Although the N1 wave may represent change-detection processes distinct from MMN [43], the main focus of change-detection studies has been on MMN. As mentioned, MMN is evoked in response to any infrequent, discriminable acoustic change in the stimulus stream and can be elicited by deviations in stimulus frequency, intensity, duration or spatial location, or by irregularities in spectrotemporal sequences (over periods as long as 20 s), or in other patterns of complex sounds including speech [44,45] and music [46,47]. MMN has been shown to be sensitive to changes in global acoustic context. Since MMN to elementary acoustic events can be evoked in sleep or under anesthesia, or when attention is diverted to other modalities, these novelty responses are believed to be largely pre-attentive. 
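In practice, the MMN described above is usually quantified as a difference wave: the averaged ERP to the repeated ‘standard’ is subtracted from the averaged ERP to the rare ‘deviant’, and the resulting negativity is measured in roughly the 100–200 ms post-onset window. Below is a minimal sketch of that computation (not taken from the review; the array shapes, sampling rate, and measurement window are illustrative assumptions):

```python
import numpy as np

def mmn_difference_wave(standard_epochs, deviant_epochs, fs=500, window_ms=(100, 200)):
    """Compute a mismatch-negativity difference wave from epoched EEG.

    standard_epochs, deviant_epochs : arrays (n_trials, n_samples), epochs time-locked
        to stimulus onset at sample 0, in microvolts.
    fs        : sampling rate in Hz.
    window_ms : latency window in which MMN amplitude is measured.
    """
    erp_standard = standard_epochs.mean(axis=0)    # average ERP to the repeated standard
    erp_deviant = deviant_epochs.mean(axis=0)      # average ERP to the rare deviant
    difference = erp_deviant - erp_standard        # MMN appears as a negativity here

    start = int(window_ms[0] * fs / 1000)
    stop = int(window_ms[1] * fs / 1000)
    mmn_amplitude = difference[start:stop].mean()  # mean amplitude in the MMN window (microvolts)
    mmn_peak_latency_ms = (start + np.argmin(difference[start:stop])) * 1000 / fs
    return difference, mmn_amplitude, mmn_peak_latency_ms
```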
This deviancy detection system continuously monitors the auditory environment, tracks changes, and dynamically updates its representation of the acoustic scene [44] and is likely to be composed of parallel sensory (refractory-response-based) and cognitive (memory-comparison-based) components [48]. The source of MMN may shift depending upon the auditory areas analyzing the deviant acoustic change. The underlying basis of MMN is thought to be that incoming sounds are compared with the current neural representation of regularities in the acoustic scene and ‘oddball’ sounds that do not match the representation elicit MMN. Once the novel sounds are identified by the automatic detection system, they activate an attentional ‘interrupt’ involving frontal activation. These ‘flagged’ novel sounds can then be analyzed further to see whether they may merit attention and behavioral response (the pre-attentive salient filters may also automatically enhance responses to stimuli that have instinctive or learned biological importance). Although MMN can be elicited independent of attentional state, under certain www.sciencedirect.com conditions it can also be modulated by attention [18,30,49– 53] and hence may be thought of as ‘semi-automatic’. Topdown control may trump involuntary attention switching to task-irrelevant distractor sounds through attentional modulation by the prefrontal cortex of the deviance detection system in the auditory cortex [54]. In target detection, the ‘pre-attentive’ MMN can also occur in conjunction with other later ERP components associated with focused attention (the N2b (200–300 ms after stimulus onset) and the P3b (300–350 ms after stimulus onset), which may be generated by activity in the anterior cingulate and prefrontal cortices [55]. The presence or absence of these late ERP components can be used to ascertain whether or not subjects actually attended to sounds detected by their preattentive deviance detection system. There is also a ‘cognitive’ component of the MMN as shown by its sensitivity to linguistic change. Recent studies have reported a leftlateralized ‘phonological MMN’ for native phonetic features. MMN has been reported to reflect categorical perception of consonants and vowels, and MMN response characteristics are influenced by the lexical status and grammaticality of a word string, and even the semantic meaning of words used as deviant stimuli [45]. A preattentive ‘cognitive’ ERP, similar to MMN, can be elicited by violations of musical harmony or syntax [56]. Thus, MMN can be evoked by changes in a series of highly complex acoustic stimuli, and by cognitive rules, as well as by low-level acoustic changes in tone-pip frequency or intensity. Considerable effort has been directed to discovering the neural basis of this fast, pre-attentive, ‘bottom-up’ novelty detection system in the human brain, and there is debate as to whether the dominant locus of MMN cortical generation in auditory cortex might vary as a function of changing acoustic features. A recent study [57], combining EEG and fMRI, suggests that at least three cortical regions are involved in MMN, including primary auditory cortex, cortical areas in the planum temporale and neighboring posterior superior temporal gyrus, and ventrolateral prefrontal cortex. 
The authors conjecture that these regions may comprise a functional hierarchical network, with corresponding initial detection of acoustic change in A1 (or below), further feature analysis of the identified change in higher auditory areas, and attentional gating in prefrontal cortex if the acoustic change is deemed sufficiently novel or salient [57]. Although the MMN has been attributed to a comparison of incoming sounds with a sensory memory ‘trace’ of previous repetitive acoustic features, questions remain about the relative contributions of the temporal and frontal lobe generators [58], and a recent combined MEG and fMRI neuroimaging study [59] suggests instead that the MMN is generated as a result of differential stimulusspecific adaptation of two distinct auditory cortical N1 sources. Recent promising animal studies have emphasized the importance of stimulus-specific adaptation (SSA) in A1 as a possible neuronal mechanism underlying this Current Opinion in Neurobiology 2007, 17:437–455 440 Sensory systems initial stage of ‘oddball’ detection [60,61]. In these studies, SSA was not observed at the thalamic level [60]. As few neurophysiological studies of SSA have been performed in higher auditory areas [41,62], it remains to be seen whether single-unit studies of SSA can help pinpoint the source of MMN generation. Although detailed mechanisms are still unknown, MMN generation, but not early stimulus onset responses, is suppressed by blockade of NMDA receptors [63]. Although promising, further studies of the molecular and cellular basis of MMN must await further development of a good animal model system, ideally one in which simultaneous event-related potentials (with results comparable to human MMN) and single-unit recordings can be conducted simultaneously [41]. The common behavioral design of many human MMN studies consists of subjects listening passively to a stream of auditory stimuli in the oddball paradigm — with no measure of the behavioral effects of the deviant acoustic stimulus in the stream. Without such a measure, it is not possible to distinguish between automatic neural responses arising from acoustic variability and responses related to ‘attentional capture’. In fact, the network activated during involuntary auditory stimulus-driven ‘attentional capture’ [64] is neuroanatomically similar to the dorsal fronto-temporal spatial attention network [65,66] activated during voluntary focus of auditory attention and may be complementary to the ventral ‘MMN’ network composed of bilateral superior temporal gyri and inferior frontal gyri, which automatically responds to acoustic variability independent of task salience or auditory attention. The automatic component of the change-detection system may rely upon a concatenated set of basic habituation mechanisms and what Bregman referred to as a ‘bottomup’ or ‘primitive’ grouping [23]. Automatic pre-attentive ASA is certainly not the only route to acoustic scene segregation — for example, attention may play an important role by limiting the processing of unattended input in favor of attended streams of input [37]. In addition, Bregman suggested a set of top-down grouping processes that he termed ‘schema-driven’ mechanisms on the basis of acquired expectations from prior experience or knowledge. 
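Returning briefly to the stimulus-specific adaptation results discussed above: in animal oddball experiments, SSA is commonly quantified by comparing each tone's response when it is the rare deviant with its response when it is the common standard. The sketch below follows the usual normalized-difference form of such an index; the function signature and the two-tone design are illustrative assumptions, not details of any particular study cited here:

```python
def ssa_index(resp_f1_standard, resp_f1_deviant, resp_f2_standard, resp_f2_deviant):
    """Normalized stimulus-specific adaptation (SSA) index for a two-tone oddball design.

    Each argument is the mean spike count evoked by tone f1 or f2 when that tone
    served as the frequent 'standard' or the rare 'deviant' in the sequence.
    Returns a value between -1 and 1; positive values mean stronger responses to
    deviants than to standards, i.e. adaptation specific to the common stimulus.
    """
    deviant_sum = resp_f1_deviant + resp_f2_deviant
    standard_sum = resp_f1_standard + resp_f2_standard
    total = deviant_sum + standard_sum
    if total == 0:
        return 0.0  # no spikes at all; the index is undefined, return a neutral value
    return (deviant_sum - standard_sum) / total
```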
Recent results also suggest the presence of at least two cortical mechanisms of streaming — an automatic ‘preattentive’ segregation of sounds and a streaming mechanism that builds up over a period of up to several seconds that can either be pre-attentive or modulated by attention [24,30,31,34,67–69]. Additional areas may be recruited, such as the intra-parietal sulcus [70], which was differentially activated depending upon whether subjects heard either one or two streams (this same area is activated in visual scene segregation and may be involved in supramodal scene analysis). The process of auditory scene analysis sets the stage for further attentional selection and seamlessly interacts with the auditory attention system Current Opinion in Neurobiology 2007, 17:437–455 [36,48,71]. The existence of this capacity for automatic pre-attentive scene analysis can free up attentional resources to ‘fine-tune’ segmentation of a complex acoustic scene [31], or focus on individual streams and extract meaning from the attended stream [30]. Thus, even a simplified explanation of the cocktail party effect must include an understanding of the interplay between ASA, and our abilities to selectively direct spatial attention to specific sound sources within the acoustic scene and to direct featural attention by focusing on distinctive acoustic vocal features (such as fundamental frequency, timbre, accent, intonation) in order to identify individual speaker voices, all interwoven with top-down disambiguation processes, that assist in the retrieval of lexical information in noisy speech conditions [72] and help parse phonetic input in accord with the semantic line of the conversation. Auditory spatial attention Depending upon whether an auditory task requires attending to a spatial location, or to an auditory feature or object, there may be differential activation of the auditory ‘what’ and ‘where’ pathways [73–75]. Attentional mechanisms can modulate neural activity encoding the spatial location and/or the acoustic attributes of the selected targets and the early sensory representation of attended stimuli [2]. For simplicity, we shall distinguish between auditory spatial and non-spatial featural attention in the present and subsequent sections, although as our discussion of the cocktail party problem illustrates, we are usually confronted in the real world with acoustic challenges that require a combination of both. Spatial attention is supramodal — in the sense that crossmodal (visual or tactile) spatial cues can enhance the auditory ERP for acoustic stimuli presented in the same location [76]. A series of recent neuroimaging papers has emphasized the presence of a shared frontoparietal neural network for both visual and auditory spatial attention [65,66,77]. Impairment of this network, which includes medial frontal cortex and frontal eye fields, cingulate and posterior parietal cortex and anterior insula, can lead to combined visual and auditory neglect [78–80]. In the overlapping auditory and visual spatial attention network, the PFC plays important roles in tracking task goals and biasing sensory cortices toward task-relevant stimuli, the ACC is critical for executive attentional control, the FEF contributes to attentional orienting, and posterior parietal cortex in the human superior parietal lobule (and homologous monkey LIP) also shows enhanced responses to salient stimuli [81,82]. 
Top-down modulation of spatial attention in the visual system appears to work at multiple cortical and thalamic processing levels to achieve different functional goals [4]. The latency of attentional effects can vary with the task: a recent study [83] of top-down feedback underlying visual spatial attention demonstrated relatively long-latency (150–250 ms) attentional effects in V1, occurring well after the initial stimulus-driven response (60–90 ms). Another recent study has shown that microstimulation of the FEF (frontal eye fields) at levels too small to elicit eye movements leads to increased behavioral sensitivity to visual motion or stimulus change at the corresponding location, as well as attention-like enhancement of V4 responses [84]. Although these results suggest that a tight coupling can exist between planned eye movements and predictive attentional gain increases in the primate visual system, other recent work has shown that the oculomotor system can be decoupled from the attention system. There is evidence that the FEF is a component of a spatial attention system independent of eye movements [82,85,86] and plays a role in covert spatial attention (without eye movements). Nevertheless, given the presence of increased FEF activity preceding aurally guided saccades [87], one might predict selective gain increases in auditory cortical activity following FEF microstimulation. In fact, evidence for top-down modulation of spatial attention in the auditory system has recently been obtained in studies of the barn owl [88]. The barn owl is a superb animal model for auditory attention, since it has developed neural mechanisms for exquisitely focused spatial auditory attention that optimize sensory processing by directing an ‘auditory searchlight’ toward its prey. Microstimulation of the forebrain gaze control field in the barn owl (analogous to the frontal eye fields in primates) sharpened the spatial selectivity and enhanced the responsiveness of matched space-specific neurons in the topographic map of auditory space in the deep layers of the midbrain tectum. By contrast, responses of neurons whose preferred spatial receptive fields lay outside the location represented by the stimulated arcopallial gaze field (AGF) site were suppressed. Moreover, since the AGF controls both visual and auditory gaze, this suggests multimodal integration and shared mechanisms for visual and auditory attention (Winkowsky and Knudsen, unpublished data) [89]. In a natural context, such top-down attentional signals in the owl could spotlight a spatial location and, by sharpening auditory as well as visual tuning, enhance the precision of spatial localization for sounds and visual stimuli emanating from that point in space. As mentioned, these results are especially intriguing because the pathways for top-down modulation of auditory and visual attention in the owl so closely parallel those described for descending modulation of visual attention in the primate, from the frontal eye fields to the superior colliculus [84]. These results also suggest that the strategy the brain uses to direct the spatial attentional spotlight is common to both sensory modalities, and that the pattern of top-down modulation may be highly conserved across species [5].
www.sciencedirect.com Auditory feature and object attention — extracting signals from background A variety of complex, shifting acoustic soundscapes present enormous challenges for acoustic scene analysis and for attentional focus on auditory features or objects such as environmental soundscapes (such as a morning chorus of birds), polyphonic music and speech. Top-down attention can selectively focus on a limited range of an acoustic feature dimensions [7], or can even focus on the expected (or recalled) features of an auditory target [15]. Although bottom-up salience certainly plays a vital role, voluntary auditory attention is the key to highlighting foreground over background and switching attentional focus to different features, objects, or streams of interest within the acoustic scene [10,11,13,27,34,90,91]. In an ongoing set of animal studies on auditory attention, selective spectral attention in single tone and multi-tone detection and two-tone discrimination tasks has been shown to rapidly reshape neuronal receptive fields in primary auditory cortex to enhance responsiveness at the target frequency and suppress responsiveness at adjacent spectral frequencies [10,11,12,92,93]. Similar spectral receptive-field effects were observed when the task was to detect a target tone signal in the midst of a noisy background [94]. By contrast, a distinctive set of temporal changes in cortical spectrotemporal receptive fields was found [95] when the animals engaged in temporal tasks (such as silent gap detection, duration discrimination, or click rate discrimination). These results suggest that top-down signals can adjust attentional filters precisely and rapidly to dynamically reshape receptive fields in the primary auditory cortex in accord with salient target features and task demands [12]. Recordings from the orbital prefrontal cortex of the behaving ferret [96] have shown rapid onset (75– 150 ms latency) phasic and sustained target responses, independent of the acoustic characteristics or identity of the target stimulus. These prefrontal target responses were often behaviorally gated, developed rapidly and may contribute to target recognition during task performance. Future experiments will investigate whether such prefrontal activity plays a role in top-down influence over primary sensory filter properties. Most studies of attentional effects in visual cortex have emphasized modulatory changes in background firing rate and contrast gain control, and hence an additive or multiplicative enhancement of feature tuning curves [97]. However, there is recent evidence in V4 for modulatory effects of attention, leading to shifts in neuronal tuning, and hence in the neural representation of stimuli [98,99]. These results in visual cortex, and parallel findings in the auditory cortex, are consistent with a matched filter model, in which neurons shift their tuning properties toward attended features in order to increase processing efficiency (David et al., unpublished data) [10,11,12]. Current Opinion in Neurobiology 2007, 17:437–455 442 Sensory systems Human behavioral studies have shown a dissociation between a pre-attentive, low-level sound segregation process and an attention-dependent process that can be called into play in forming perceptual objects and streams. The perception of streaming can take up to several seconds to build up, and this attention-dependent streaming mechanism can be reset by an attentional shift [31]. 
There is a similar build-up at the neural level, measured by modulations in ERP in auditory temporal cortex [69] and in a recent MEG study [34]. Considerable work on the neural basis of attention to speech has recently been reviewed [100], focusing on neuroimaging studies of the attentional selection of foreground speech, differing by location or speaker identity from concurrent background speech. Selective attention to the human voice (compared with a silent reading condition) enhanced brain activity bilaterally in the superior temporal sulcus, higher auditory cortex, inferior frontal cortex, though not in the prefrontal cortex [101]. This was not the case for a selective-attention working memory task where subjects were asked to attend to either voice identity or location [102]. In this study, attention to voice location evoked greater activation in the dorsal prefrontal cortex, whereas relatively greater activation for voice identity was observed in the ventral prefrontal cortex. As might be expected, task conditions could also change the pattern of brain activation for attention to speech — for example, the left hemisphere temporal areas were dominant during speech comprehension tasks, whereas right hemisphere temporal areas were activated preferentially during attention to prosody [100]. If subjects were presented with the same set of speech stimuli but were asked to attend to specific linguistic stimulus categories in different task conditions, striking differences were observed in the pattern of activation with temporal lobe auditory areas [103]. Moreover, our greater familiarity with speech than with other acoustic stimuli may cause differential effects in otherwise similar task conditions. For example, although the PFC is activated in listening tasks that require selective attention to location, pitch cues, or even attentional listening to dichotic CV syllables [104], PFC was not activated during selective attention to the human voice [101]. ERP recordings showed a very different pattern of activity when listeners either were asked to identify concurrent vowels, or were asked whether one or two auditory objects were present using mistuned harmonic information [25]. Given the paramount importance of speech to our daily lives, research on both pre-attentive and attentional processing of speech is valuable — to study speech comprehension and auditory selective attention to speech as an information channel in the presence of background noise (which may help to develop improved automatic speech recognition systems) and also to understand the process of attention to the vocal features that reveal speaker vocal identity and vocal prosodic qualities that color speech (such as mood, emotional inflection, and nuance). Current Opinion in Neurobiology 2007, 17:437–455 Research has also begun to explore the neural basis of auditory attention to music, which presents a complex challenge similar to that presented by selective auditory attention to speech [56]. As an example of future directions, a recent study [105] has shown that attention can enhance ERPs elicited by deviations in harmonic context. Auditory attention in time Precisely focused temporal expectancies, such as musical expectancies, are likely to be very important in auditory processing since many auditory patterns unfold in time. A recent ERP study has shown that auditory attention can be temporally directed to focus on events that are projected to occur at a particular future point in time [106]. 
In another study, subjects engaged in an auditory task could avoid involuntary attentional capture by distracting acoustic stimuli, by foreknowledge of when the taskirrelevant acoustic changes would occur [107]. Performance improved for subjects who were cued when to listen for an acoustic target [108]. A combination of temporal and spatial cues was particularly effective [106,108]. Temporal foreknowledge of acoustic events can lead not only to enhancement of cortical responses but also to their suppression, particularly when the sounds are self-triggered [109]. Recent research has also begun to explore other aspects of timing in auditory attention, such as the role of attention in auditory temporal discrimination [110] and in event segmentation in music [111]. General principles are likely to apply to both auditory and visual attention selectively directed to different points in the dimension of time [8]. There is evidence that the neural basis for temporal expectancies and temporal discrimination is supramodal [112–114] and involves a network including prefrontal cortex, basal ganglia, and cerebellum. Top-down modulation of neural responsiveness during temporal attention, can be precise and influence timing in primary and higher order visual cortex [115,116]. There is abundant experimental evidence for unimodal and crossmodal temporal attentional effects for auditory as well as for somatosensory and visual tasks. Attentive imagery in silence and hallucinations In auditory induction (such as the familiar psychoacoustic phenomena of FM completion or phonemic restoration) the auditory system fills in occluded information, as when missing foreground sound segments are perceptually restored in the presence of background sound [23,117,118]. Although it might seem at first glance like a reasonable candidate for top-down effects, there is considerable evidence for pre-attentive mechanisms in auditory induction [118], though this may be influenced by attention in phonemic restoration [119]. However, there can be remarkably strong effects of top-down attention on auditory processing in active listening, as observed in studies that have shown that human auditory cortex is activated in silence, in the complete absence of any www.sciencedirect.com Auditory attention — focusing the searchlight on sound Fritz et al. 443 real-world acoustic stimulation, when there is simply an inner expectation of sound [66,120,121], during the short, quiet interlude of musical transitions [111], or when subjects are imagining auditory stimuli [122], or when they are prompted by silent visual stimuli that are usually accompanied by sound [123,124]. In a recent neuroimaging study of auditory attention [125] subjects were asked to indicate when they heard a noise burst following a silent period of variable duration. In addition to activation of frontal cortical areas implicated in attentional control, imaging also revealed increased activity in auditory cortex contralateral to where the sound was expected to occur while the subjects listened in silence (as well as enhanced responses to the stimulus itself). In another recent neuroimaging study [126] subjects also listened to musical passages in which silent gaps had been introduced. If the subjects were familiar with the music, they heard ‘the music in their mind’ throughout the silence, although this was not the case with unfamiliar music. This is an extraordinary display of the importance of attentive expectation in shaping cortical responses. 
Activation of auditory cortex was greater during the silent gaps inserted in familiar songs than during silence in unknown songs, showing the continuous interaction of attention and memory during active, expectant listening. Auditory cortical activation has also been reported during silent lipreading [123,127,128] or during observation of silent piano playing [124]. There are parallel findings in the visual system, indicating top-down attentional increases in cortical activity in V1 [129] and enhanced reward-timing expectancies [116], even in the absence of a visual image. These compelling results suggest a general set of attentional mechanisms for top-down priming of sensory cortices. Modulatory effects of attentive expectation may be very similar to the top-down effects of perceptual decisions and can influence the earliest stages of cortical auditory processing. This is shown by an ERP study of a difficult acoustic target detection task [130] in which activation was highest for subjective perceptual decisions leading to target-present behavioral responses (even in cases of false alarms, in which the subjects incorrectly believed that the target was present despite the physical absence of the target stimulus). In order to differentiate between imaginary and real-world sound, the brain may rely on a validation system, dependent upon reciprocal interactions in a neural network including auditory cortex, frontal cortex, and anterior cingulate cortex (ACC). A recent imaging study [131] of speech-sensitive auditory cortex during silence found spontaneous, intermittent episodes of increased neural activity. In light of previous observations that the ACC is activated during auditory hallucinations, it may be that in the ‘default mode’ of the brain, endogenous activity within auditory regions is modulated by the ACC. Such spontaneous activity of the auditory cortex during silence may offer a neural substrate for the development of auditory hallucinations in patients with acquired deafness or schizophrenia [9,131–133,134].

Effects of auditory attention on receptive-field plasticity

The adaptive functions of the cerebral cortex rely upon flexibility and plasticity of information-processing networks. Since the topic of plasticity of auditory cortical processing has been recently reviewed [135], we will focus in this section on the evidence that attention may play a decisive role in triggering auditory plasticity, particularly in the adult brain [12,136]. Parallel studies have shown that attention can initiate plasticity in other sensory cortices, as well as in motor cortex [137]. Unlike juvenile animals, in which mere exposure during the sensitive period can induce experience-dependent plasticity, adult animals must generally attend to an acoustic cue for plasticity to be induced for the attended auditory feature [12,135,136]. A recent animal study [138] showed that two very different forms of auditory cortical global plasticity arose during perceptual learning over a period of weeks when rats were trained on, and presumably attended to, different features (frequency or intensity) of the same acoustic stimulus set. In another study of global plasticity in A1, rats were trained on an operant auditory conditioning task. In a cleverly designed parametric set of task conditions, rats were variably motivated to respond to the conditioned stimulus (CS), and a range of performance levels was obtained corresponding to the behavioral importance of the CS [139].
Subsequent A1 mapping showed a CS-specific expanded representation whose area was directly correlated with performance level, showing that behaviorally important, and presumably attended, sounds gain relative representational area. A complementary set of experimental studies, designed to explore local neuronal plasticity over a much shorter timescale (a period of minutes) for animals trained on multiple tasks, indicates that shifting the selective attentional focus to different acoustic dimensions or features may be instrumental in dynamically shifting acoustic spectral filter properties of A1 neurons and swiftly changing from one cortical state to another [10,11]. These results suggest that rapid auditory task-related plasticity is an ongoing process that occurs as the animal switches between different tasks and changes its focus to new, salient acoustic cues and goals. These changes are persistent and widespread — as many as two-third of cortical neurons in A1 showed such frequency-selective enhancement during, as well as following, tone detection or tone discrimination task performance [10,140]. This suggests that adaptive changes in receptive fields and frequency response profiles of A1 neurons that shift cortical states or filter properties depending upon the behavioral demands of the ongoing task demands can be attentionally gated. In this view, attention is the key trigger that initiates a Current Opinion in Neurobiology 2007, 17:437–455 444 Sensory systems cascade of events leading to the dynamic receptive field changes to enhance figure/ground separation, by using a contrast matched filter to filter out the background, while simultaneously enhancing the salient acoustic target in the foreground [12]. A recent set of studies examined attentional modulation for auditory features in a frequency-independent task [141,142,143] and demonstrated a long-term increase in the proportion of neurons preferring downward contours in A1 of monkeys trained on a frequency-independent tone contour task (in which reward was associated with downward contours). Preliminary evidence from our current studies of A1 activity during a similar frequency-independent contour task indicates that changes in preferred directional contour also occur dynamically, at short-term as well as longterm time frames [144]. Three relevant sets of animal studies also emphasize the importance of auditory spatial attention in relation to plasticity. The optic tectum (OT) of the barn owl contains matched topographic maps of auditory and visual space. Barn owls raised during a crucial developmental period with horizontally displacing prisms rapidly acquire a new auditory space map in the OT that restores alignment with the prismatically displaced visual map. Although juvenile owls readily acquire these new aligned maps of auditory space as a result of experience, this plasticity is severely reduced in adults. Similar age dependencies have been shown for plasticity of the auditory space map in the superior colliculus in ferrets [145]. In previous studies in owls, the plasticity of the space map was tested in owls that were fed dead mice. However, when adult owls were given the opportunity to hunt live prey for a short period each day, drawing on their extraordinary nocturnal hunting skills, and powers of attentive listening, their auditory maps showed greatly increased adaptive plasticity. This increased adaptive map plasticity correlated with behavioral improvements in the owls’ hunting prowess [146]. 
There are multiple factors in the hunting condition that may have contributed to the increase in map plasticity, including increased arousal and, what is likely to be the key factor, greater auditory and visual attention to a highly salient bimodal source (movements of the live mouse prey provided the owl with correlated auditory and visual information), which could enhance crossmodal integration and thus help calibrate the auditory and visual space maps. Additional recent experimental evidence also shows that training can induce enhanced plasticity of auditory localization in the adult mammalian brain [147]. Adult ferrets rapidly relearned to localize sounds following reversible occlusion of one ear, but only if they performed an auditory localization task that used these cues, not if they performed a comparable visual localization task. In both examples, auditory attention appears to be essential to elicit adult plasticity. The third study is a preliminary investigation of behaviorally driven plasticity in the spatial sensitivity of neurons in the dorsal zone of the cat primary auditory cortex [148]. The spatial tuning curves of dorsal zone neurons were sharpened when the animals performed a sound localization task, compared with the same neurons when the animal was either engaged in a simple sound detection task or listened passively to the stimuli. Thus, selective attention to a spatial task can lead to rapid spatial receptive-field plasticity. In addition to the evidence from animal studies, attention-driven plasticity has also been shown to occur in the human auditory cortex for spectral, temporal, complex spectrotemporal, and spatial processing [32,110,149–152]. For example, in a study of the neural basis of rapid perceptual learning, listeners were trained to segregate concurrent double-vowel stimuli [33]. In parallel with improved performance, there were rapid changes in ERP amplitude within the first hour of training, consistent with top-down modulation. By contrast, no changes in ERP amplitude were observed in the absence of attention to the double-vowel stimuli (this was shown in a separate group of participants, given the same acoustic exposure, but instructed to ignore the double-vowel stimuli and attend visually to a muted movie of their choice). The presence of dynamic plasticity in the cortical representation of acoustic space has been suggested by studies of the ventriloquism aftereffect [153] and supported by subsequent research showing that spatial auditory attention can also drive auditory spatial plasticity [152]. Attention to the target frequency in a discrimination task rapidly changed the tonotopic map in primary auditory cortex, expanding the map distance between the two frequencies of the discriminated tone pair [152]. In all of the studies described in this section, attention appears to have fast as well as slow, lingering plastic effects, raising twin questions: specifically, what determines the duration of the persistent plasticity triggered by attention, and more generally, what is the nature of the neural trace that attention leaves in its wake?

Intermodal and crossmodal interactions between auditory and visual attention

There are many similarities between attention in the auditory and visual modalities, where a two-component framework for attentional selection (top-down and bottom-up) has also emerged from psychophysical, behavioral, and neurobiological studies.
Two sets of mechanisms are thought to operate in parallel in both modalities: using either bottom-up, automatic, imagebased saliency cues or top-down, attentional, task-dependent cues. Another fundamental similarity is that attention can modulate both spatial and non-spatial feature processing in both modalities. Moreover, in addition to these similarities, there is now mounting neuroimaging evidence for visual modulation of activity in many auditory cortical fields [154] and a growing realization that all cortex is multisensory [155] that was presaged by earlier www.sciencedirect.com Auditory attention — focusing the searchlight on sound Fritz et al. 445 work, such as a pioneering neurophysiological study of auditory and visual responses in auditory cortex of monkeys performing a modal selective-attention task [156]. If the brain were to use common but limited attentional resources, then intermodal attention (attending to only one relevant sensory modality) might necessitate suppressing responses to the irrelevant sensory input. Several studies have examined how responses in auditory cortex to an acoustic stimulus are affected by other (attended or unattended) ongoing sensory events. In keeping with a limited resources model, a common finding is that when attention is drawn away from an auditory event (by the presence of a visual stimulus and particularly, by attending to a visual task (compared with a no-competitivestimulus baseline)) then auditory cortex generally shows decreased activity in acoustic stimuli [157–160,161], but not always [162,163]. Conversely, many of these studies also find that attention to auditory stimuli enhances activity in auditory cortex. These basic results were confirmed and extended in studies [3,164] that examined unimodal and bimodal task conditions. In the unimodal auditory task, there are generally greater responses, particularly in secondary auditory cortical areas when the subjects were actively, rather than passively, listening to the acoustic stimuli. In the bimodal case, enhanced responses in auditory cortex were seen during the auditory attention task and suppressed responses observed during the visual attention task. Analysis of the functional connectivity between auditory and visual cortical regions in visual and auditory tasks indicated a reciprocal inverse relationship — increases in auditory activation were directly correlated with decreases in visual activation (and vice versa). The ability to divide (and switch) attention between unrelated visual and auditory stimuli was decreased following transcranial magnetic stimulation that disrupted function of the dorsolateral PFC [165], thus emphasizing the importance of the PFC in allocating limited attention and working memory resources and in balancing simultaneous multiple attentional demands. A related neuroimaging study also used a bimodal behavioral paradigm [166] to create a conflict between an auditory or visual target, and a crossmodal distractor. As the distracting stimulus in the task-irrelevant sensory channel was increased, there was a compensatory increase in selective attention to the target in the relevant channel and a corresponding increase in activation in the relevant sensory cortex. Moreover, the larger this increase, the less behavioral interference was observed. The results of these studies suggest a form of top-down sensitivity control that regulates the flow of attended information by modulating the relative strengths of different sensory information channels. 
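The ‘reciprocal inverse relationship’ reported in the functional-connectivity analysis above corresponds, in its simplest form, to a negative correlation between the activation time courses of auditory and visual regions of interest. A minimal sketch of such a check is given below, assuming the two ROI time series have already been extracted; the variable names and the plain Pearson correlation are illustrative simplifications, as the published analyses typically use more elaborate connectivity models:

```python
import numpy as np

def roi_coupling(auditory_ts, visual_ts):
    """Pearson correlation between two region-of-interest time series.

    auditory_ts, visual_ts : 1-D arrays of equal length, e.g. fMRI signal sampled
    once per volume. A strongly negative value is consistent with the reciprocal
    push-pull pattern described in the text.
    """
    a = (auditory_ts - auditory_ts.mean()) / auditory_ts.std()
    v = (visual_ts - visual_ts.mean()) / visual_ts.std()
    return float(np.mean(a * v))
```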
There are multiple possible levels for intermodal effects on auditory processing and attention. It is truly remarkable that many of these attentional effects can be observed not only at the cortical level but also at the auditory periphery. A recent study [167] confirmed earlier work suggesting that visual attention can modulate peripheral auditory responsiveness in the cochlea and in the cochlear nucleus. In cats, acoustic attention can enhance auditory responses in the dorsal cochlear nucleus, whereas visual attention to a mouse or olfactory attention to fish odors reduces auditory responses in the dorsal cochlear nucleus. Likewise, visual attention [168] and a visual discrimination task reduce auditory nerve responses to clicks [169,170]. In humans, evoked otoacoustic emissions can be modulated by auditory attention in a frequency-specific manner [171]. The massive auditory corticofugal system is ideally suited for attentional modulation, and hence many of these early peripheral effects are likely to reflect top-down influences. For example, in mustached bats, cochlear hair cells can be modulated by activity in the auditory cortex [172]. Another intriguing set of results also bears on the potentially distracting effects of visual stimuli on auditory attention. In trace auditory fear conditioning in mice, there is a time gap between the end of the conditioned stimulus (such as a conditioning tone) and the start of the unconditioned stimulus (such as a tailshock). Recent studies have shown that trace auditory fear conditioning requires attention in mice [173] and also in humans [174]. Supporting this attentional requirement, trace auditory fear conditioning is associated with increased activity in the anterior cingulate cortex (ACC) and is impaired by lesions of the ACC that may disrupt attention to the tone–shock contingency [173]. The key point is that trace auditory fear conditioning can be impaired by distracting visual stimuli, adding yet another twist to the story of manifold visual influences on auditory attention, auditory behavior, and auditory-driven brain activity. Although our previous discussion has emphasized competition between sensory channels in the limited-resources model, in relatively simple low-level task contexts (such as pitch discrimination or contrast discrimination) there may be no conflict over limited attentional resources, since there are apparently sufficient separate attentional resources for both vision and audition [175]. There are clearly other cases where auditory and visual inputs both contribute to information processing. Such cooperative interactions lead to early multimodal integration [176] or to multisensory enhanced activation in primary and secondary auditory cortex, as in lipreading [123,127,128], attention to complex audiovisual combination stimuli [177], source localization with bimodal cues [108,178], ventriloquism [179], or visual cueing in auditory scene analysis [180]. The neural representation of human walking in the temporal biological motion area is another example of higher-level audiovisual integration, in which both visual and auditory inputs (the sound of footsteps) activate the same area [181].
An early study Current Opinion in Neurobiology 2007, 17:437–455 446 Sensory systems [156] showed that neuronal responses in auditory cortex (during a selective-attention task in which different auditory and visual cues were associated with a two-choice lever push) were stronger when visual and auditory cues were in agreement and were reduced when the bimodal cues were in conflict [156]. A recent physiological study [143] demonstrated enhanced spike activity in monkey auditory cortical neurons to task-related visual inputs (that signaled task onset), but only in an auditory behavior task-context, suggesting attentional gating of relevant visual input to auditory cortex. Such attentional or behavioral gating of task-relevant visual input was also observed in the inferior colliculus of monkeys trained to saccade to an acoustic target [182]. Neural networks of auditory attention Auditory attention can be selectively directed to a rich variety of acoustic features including spatial location, auditory pitch, frequency or intensity, tone duration, timbre, FM direction or slope, speech versus nonspeech streams, and characteristics of individual voices. Given the multiplicity of acoustic dimensions to which we can attend and the richly interconnected auditory processing networks, there are likely to be multiple neural loci for auditory attention. In fact, the locations of the multiple loci of attentional influence on auditory information processing are flexible and are likely to be dependent upon the specific demands of the behavioral task being performed. This has also been suggested to be the case in the visual system [9,183]. Neuroimaging studies examining the common neural circuitry underlying the control of both visual and auditory attention have revealed a largely overlapping frontoparietal network [66]. Depending upon task, there may be a segregation of attentional effects along the what/where pathways, as suggested by a recent MEG/fMRI paper [2] that provides further evidence for the presence of dual-selective-attention effects on sound localization and identification. Most functional imaging, EEG, MEG (but not physiological) studies find overall enhancement of auditory cortex activity by selective attention to sound [17,157,184–186]. However, one source of controversy has arisen over whether attentional effects are found predominantly in primary or secondary auditory cortex, or can be equal in magnitude throughout auditory cortex depending upon attentional demands. In the visual system, there is some evidence consistent with increased attentional effects at higher cortical processing areas compared with earlier cortical areas, but high levels of thalamic modulation [4] are inconsistent with an ‘attentional progressive hierarchy’, a concept that has recently been critiqued [9]. In any case, physiological studies of cortical plasticity induced by auditory attention have shown clear modulation of neuronal responses in primary auditory cortex [10,11]. Although some human imaging studies have also shown attentional modulatory effects in A1 [157,184], as well as in other Current Opinion in Neurobiology 2007, 17:437–455 primary and secondary auditory cortical regions, another study [160] has reported greater effects of auditory attention in higher auditory association areas, at least in an intermodal, dual task paradigm (comparing responses when one sensory modality is attended and the other is ignored). 
Since attentional effects are highly task-dependent, it may be premature to accept the attentional progressive hierarchy model in auditory cortex. As recent studies have shown, task-specificity of processing and attentional demands can differentially activate selective areas of prefrontal and auditory cortex during the performance of different auditory tasks. A preliminary study in the ferret [187] showed differential patterns of brain activation in prefrontal cortex and in primary and secondary auditory cortices, using expression of the immediate early gene c-Fos, while the animals were engaged in one of two listening tasks (sound localization or detection of tones embedded in noise). The results of this animal study parallel the findings of two recent human neuroimaging studies that also mapped differential activation in ‘what’ and ‘where’ tasks [2,102]. The task-dependent shift in the distribution of attention leads to dynamic re-allocation of cortical resources according to task demands and underlines the flexibility of auditory processing.

In addition to auditory cortical areas, there are cortical association areas whose activity is influenced by auditory attention. Association areas in the supramodal frontoparietal attentional network [188], such as the left precentral gyrus and the right posterior parietal cortex, are also activated during auditory attention [65,77,161]. A study of the neural dynamics of event segmentation in musical symphonies revealed a right-lateralized network, with peak cortical activation during the silent period between musical movements [111]. There were successive waves of activity in two distinct functional networks: first in a ventral frontotemporal network involved in the automatic detection of salient acoustic events, swiftly followed by activation of a dorsal frontoparietal network, which may direct attention to the acoustic event boundary and update the perceptual scene. This study illustrates the broad range of brain regions activated during auditory attention. Even a partial list of additional areas includes limbic cortex, anterior cingulate cortex, basal ganglia, thalamus (medial geniculate nucleus, pulvinar nucleus), superior colliculus, inferior colliculus, cerebellum, dorsal cochlear nucleus, and cochlea. Recruitment of additional brain areas may depend upon task conditions: for example, orbitofrontal cortex and hippocampal paralimbic belt areas are activated during auditory target detection tasks in which the stimulus decision is based upon ambiguous sensory information [189]. On the basis of computational modeling [190], neuroimaging [191], physiological [192], and neuroanatomical evidence [193,194], the reticular nucleus of the thalamus may also be an important site of attentional modulation, but it has not yet been studied physiologically during auditory attention.

Thus, attentional effects in the auditory system can occur selectively and at multiple stages throughout auditory processing [195], and may arise as early as the cochlear nucleus or even earlier, in the sensory transduction phase in the cochlea, as demonstrated by studies of crossmodal selective attention [167,168]. These peripheral attentional effects may be partly driven by local ‘bottom-up’ processes such as habituation, but may also be influenced by top-down effects mediated by the descending olivocochlear bundle projections.
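As a minimal illustration of how such a purely local habituation process could, on its own, produce novelty-like responses of the kind probed in the stimulus-specific adaptation and mismatch-negativity work cited here, the following sketch (in Python; the two-channel structure and the depletion and recovery constants are arbitrary assumptions, not values taken from any of the cited studies) simulates a depleting response resource in each of two tone channels. The rarely presented tone evokes markedly larger responses than the common one, with no top-down attention involved.

```python
# Toy illustration (not a model from the reviewed studies): stimulus-specific
# adaptation arising from simple per-channel habituation. The rarely presented
# tone evokes larger responses, mimicking a bottom-up novelty signal without
# any attentional selection. All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_trials = 500
p_deviant = 0.1        # the deviant tone occurs on 10% of trials (assumed)
tau_recover = 8.0      # trials over which adaptation recovers (assumed)
depletion = 0.4        # fractional resource depletion per presentation (assumed)

# one adaptation "resource" per frequency channel, fully available at the start
resource = np.array([1.0, 1.0])     # index 0 = standard tone, 1 = deviant tone
responses = {0: [], 1: []}

for _ in range(n_trials):
    tone = 1 if rng.random() < p_deviant else 0
    # the response is proportional to the resource available in the stimulated channel
    responses[tone].append(resource[tone])
    # the stimulated channel habituates ...
    resource[tone] *= (1.0 - depletion)
    # ... and both channels recover slowly toward their resting value of 1
    resource += (1.0 - resource) / tau_recover

print(f"mean response to standard (common) tone: {np.mean(responses[0]):.2f}")
print(f"mean response to deviant (rare) tone:    {np.mean(responses[1]):.2f}")
# The rare tone drives a markedly larger mean response than the common tone,
# a stimulus-specific adaptation effect produced by habituation alone.
```

In this scheme the ‘novelty’ response requires no attentional selection at all; distinguishing such local adaptation from genuinely attentive or memory-based change detection is precisely the empirical question raised by the SSA and MMN studies cited above.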
Given the range of neural loci at which auditory attention is likely to modulate processing, there may well be a variety of mechanisms in play, giving rise to the question of how these multiple mechanisms are coordinated, orchestrated, and enacted in concert. It is, of course, possible that these multiple levels of attentional modulation operate relatively independently. In fact, recent studies underline the point that attentional effects on auditory processing are likely to occur in a distributed and widespread pattern throughout the auditory cortex. Research on a ‘deaf-hearing’ neurological patient with extensive bilateral destruction of auditory cortices (including the primary auditory fields) demonstrated that the patient was still able to marshal sufficient auditory attention to perceive sound onsets and offsets. Conscious attentive perception of sound occurrence in this patient may have arisen from top-down projections from prefrontal cortex to the remaining non-primary auditory cortex or multimodal association cortex. Other insights into attention have arisen from neurological studies of two forms of auditory neglect: one an attentional deficit associated with basal ganglia lesions, and the other an auditory spatial deficit associated with parietoprefrontal lesions [78,79].

At a global level, selective attention may channel information into specialized cortical modules localized in one hemisphere and hence lead to lateralized patterns of activation. There is considerable evidence for hemispheric specialization of the attentional system: for example, a study by Zatorre et al. [158] suggests that auditory attention to either spatial location or tonal frequency activates a common network of right-hemisphere cortical regions (although one may argue that lateralized functional specialization arises first and that the hemispheric differences in attentional modulation are a consequence). Additional evidence for lateralization was provided by a recent ERP study [32], which observed plastic changes in event-related potentials during rapid perceptual learning; these changes occurred in right auditory cortex and right anterior STG/inferior prefrontal cortex and were dependent upon auditory attention to the phonetic discrimination task. Clearly, attentional effects may be highly dependent on the task paradigm. For example, differentially lateralized patterns of hemispheric activation were demonstrated in subjects attending to one of two different features (duration or contour) of the same acoustic stimulus set [13].

A neuroimaging study [27] explored the neural basis of foreground–background decomposition by comparing brain activation patterns when listeners performed a match-to-sample task for harmonic target tones (drawn from a stimulus set of tones with distinctive timbres from 15 different musical instruments) presented against a continuous FM (frequency-modulated) background with the activity arising from FM background-alone stimulation. The authors reported increased foreground-signal-related activity in posterior regions of left auditory cortex, which was insensitive to the masking influence of the background. Admittedly, one potential complication of foreground–background decomposition neuroimaging studies, even when using low-noise fMRI, is that the subjects are already engaged in distinguishing task-related foreground signals from background magnet noises (created by the switching of magnetic gradients during imaging).
However, notwithstanding these technical challenges, a subsequent neuroimaging study [91] of intentional stream segregation using timbre cues (comparing activation patterns for an alternating dual stream of ABAB sequences from two different musical instruments, trumpet and organ, with those for a single stream, AAAA or BBBB, from either instrument alone) also found enhanced left-hemisphere activation in posterior areas of the auditory cortex, similar to the activation pattern described in earlier studies of foreground/background decomposition [27] and of selective tracking of individual melodic streams in polyphonic music [196]. This pattern of activation may result from the involvement of working memory as well as selective auditory attention in performing these tasks.

A recent MEG study explored the neural basis of the nonspatial aspects of attention in the cocktail party effect using a clever reversal of foreground and background attentional foci [34]. Subjects engaged in a set of complementary tasks that exchanged foreground and background while using the same acoustic stimulus set: they either listened for a frequency change in a rhythmically repeating, constant-frequency target stream amidst a dense background texture of irregular, random-frequency notes (the ‘target’ task), or listened for duration changes in the dense texture of changing notes while ignoring the rhythmic, constant-frequency stream (the ‘masker’ task). The subjects’ behavior was correlated with their MEG neural responses, indicating that auditory attention strongly modulated the relative neural representation of the target. Furthermore, the time course of the neural build-up of the target representation correlated with the subjects’ gradual perceptual learning and improvement in target detection. These data suggest that one important mechanism for top-down attentional control is the enhancement of coherent or synchronous neural activity in sensory cortex, a finding supported by recent results in the somatosensory [197] and visual [198–200] systems. In the auditory system, the early ‘transient’ gamma-band response in primary and secondary auditory cortex has also been shown to be related to top-down selective attention to auditory stimuli [72], and may be mediated by the dorsal anterior cingulate cortex [201,202].

Attention can influence neural activity not only through synchrony but also through an array of other mechanisms. Previous neurophysiological studies of visual attention have suggested that a possible biasing mechanism for top-down selective attention is an increase in baseline spiking activity in the relevant sensory cortex [203]. Although such attentional increases have been observed in visual cortex, they have not yet been demonstrated in auditory cortex. Preliminary data from A1 recordings in the behaving rat [204] indicate that there were no consistent changes in spontaneous activity during a two-tone frequency discrimination task. Surprisingly, evoked multi-unit and LFP responses were larger in the non-attending condition than in the attending condition. However, opposite results were obtained in recordings from the medial geniculate (auditory) thalamus, where an increase in spontaneous activity was observed during auditory task performance. At present, there is no compelling single-unit neurophysiological evidence for an attention-related increase in baseline firing rate in auditory cortex.
On the contrary, all recent single-unit physiological studies indicate either a decrease in baseline firing in the attentive state or a lack of consistent gain changes during auditory attention [10,11,156,205], with one earlier exception [206]. One puzzle to be resolved by future research is how to reconcile these single-cell findings in auditory cortex, which indicate an absence of gain changes during attention, with the results of many physiological studies in the visual system that show the opposite. Also, how can the single-unit data from auditory cortex be reconciled with neuroimaging data that suggest enhanced activity in auditory cortex during attention? Other mechanisms of attention observed in physiological studies of visual attention are multiplicative modulation of neuronal responses and an attentional increase in effective stimulus contrast (or contrast gain). Although both are perfectly plausible mechanisms for auditory attention, such systematic multiplicative changes in response gain, or changes in stimulus contrast gain, occurring during attention without any change in receptive-field tuning have not yet been observed in the auditory system. These differences could be a matter of task design or data analysis, or may simply reflect the paucity of physiological studies of auditory attention that have been conducted at the single-unit level. However, as mentioned earlier, there is compelling evidence for an alternative mechanism in primary auditory cortex, in which attention plays a role in adaptively reshaping receptive fields depending upon salient task cues and behavioral context [10,11,12]. Convergent evidence for matched-filter changes in receptive-field tuning has come from studies in the visual system [98,99]. Task-dependent sharpening of auditory spatial receptive fields has also been observed in auditory cortex [148], and analogous effects have been described in visual cortex [206]. Such adaptive mechanisms enable neurons to rapidly multiplex in a task-dependent (or state-dependent) manner, as has been shown in the visual system [9]. Thus selective attention could be based on short-term, feature-specific plasticity of auditory cortical neurons, enhancing their selectivity for task-relevant information rather than amplifying overall responses. It is an open challenge to determine the roles that these various mechanisms play in auditory attention (a toy simulation contrasting pure gain modulation with selective receptive-field reshaping is sketched after the summary below).

Summary

Auditory attention involves a distributed network of auditory cortical and subcortical structures that are activated selectively, in a task-specific manner, during auditory processing and that also integrate with a generalized multisensory attentional network including parietal, frontal, and anterior cingulate cortical regions [74,207–210]. Recent research has revealed a richly interconnected network for auditory attention that assists in the computation of early auditory features and acoustic scene analysis, the identification and recognition of salient acoustic objects, the enhancement of signal processing for the attended features or objects, the priming of persistent plastic changes that may enhance future processing, and the planning of actions in response to incoming auditory information. Auditory attention is dynamic and flexible, modulates many levels of auditory processing from association cortex to cochlea, and may rely upon adaptive mechanisms that rapidly reshape receptive fields in accord with current task demands and behavioral context.
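To make the mechanistic alternatives discussed above more concrete, the following toy sketch (Python; the Gaussian tuning curves, frequencies, bandwidths, and gain factors are illustrative assumptions rather than fits to any reported data) contrasts a pure multiplicative gain change with a localized, target-centered facilitation of a model A1 frequency tuning curve.

```python
# Toy contrast between two candidate attentional mechanisms applied to a model A1
# frequency tuning curve (illustrative only; not a model taken from the cited studies):
#   (1) a pure multiplicative gain change, which scales all responses equally, and
#   (2) receptive-field reshaping: narrow facilitation at the task-relevant target
#       frequency plus a mild overall suppression.
# All frequencies, bandwidths, and gain factors are arbitrary assumed values.
import numpy as np

freqs = np.linspace(0.5, 16.0, 200)   # probe tone frequencies in kHz (assumed)
best_freq = 4.0                       # neuron's pre-behavioral best frequency (assumed)
bandwidth_oct = 0.35                  # tuning bandwidth in octaves (assumed)
target_freq = 6.0                     # attended target frequency in the task (assumed)

def gaussian_tuning(f, center, bw_oct):
    """Gaussian tuning curve on a log-frequency (octave) axis."""
    return np.exp(-0.5 * (np.log2(f / center) / bw_oct) ** 2)

baseline = gaussian_tuning(freqs, best_freq, bandwidth_oct)

# (1) multiplicative gain: a uniform 30% boost, tuning shape unchanged
gain_modulated = 1.3 * baseline

# (2) reshaping: 10% overall suppression plus narrow facilitation at the target
facilitation = 0.5 * gaussian_tuning(freqs, target_freq, 0.15)
reshaped = 0.9 * baseline + facilitation

def target_selectivity(curve):
    """Response at the target frequency relative to the response at the original best frequency."""
    at_target = curve[np.argmin(np.abs(freqs - target_freq))]
    at_best = curve[np.argmin(np.abs(freqs - best_freq))]
    return at_target / at_best

for name, curve in [("baseline", baseline),
                    ("gain modulated", gain_modulated),
                    ("reshaped", reshaped)]:
    print(f"{name:>15s}: target/best-frequency response ratio = {target_selectivity(curve):.2f}")

# A pure gain change leaves the ratio identical to baseline (relative selectivity is
# unchanged), whereas the localized facilitation raises the neuron's relative
# sensitivity to the attended target: the kind of frequency-specific receptive-field
# change reported in A1 during behavior.
```

On this view, a selectivity change of this kind could increase the discriminability of the attended target without any net increase in overall firing, which may help reconcile the modest single-unit gain effects described above with the robust attentional effects seen behaviorally and in neuroimaging.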
Many outstanding questions remain to be answered by future research. We still do not know the synaptic mechanisms and cellular architecture [211] underlying auditory attention, nor the manner in which attentional effects at multiple levels of the distributed attentional system are orchestrated and directed. How much of the acoustic novelty system can be explained by simple habituation mechanisms? How are learned ‘bottom-up’ salience filters formed for highly meaningful and over-learned stimuli, such as one’s own name? Do attentional effects in the auditory system increase with task difficulty, as they do in the visual system? [19,20]. What are the differences and similarities between visual and auditory attention? What are the pathways for crossmodal and intermodal attention? What is the interaction between the neural systems for arousal, vigilance, and attention? [212]. What is the relationship between attention and its close companions: expectation, reward, short-term memory, and plasticity? [5,136,213]. How does top-down auditory attention modulate acoustic scene analysis, and how does it interact with the ‘pre-attentive’ acoustic novelty detection system and with bottom-up ‘pop-out’ auditory attention? This fascinating array of questions will keep neuroscientists interested in auditory attention quite busy for years to come.

Acknowledgement

We gratefully acknowledge funding from NIH R01 DC005779.

References and recommended reading

Some articles have been marked as worthy of special interest. All papers in this subjective category are recent (published within the last five years) and relevant (results making an important contribution to the field of auditory attention).

• of special interest

1. Fan J, Posner M: Human attentional networks. Psychiatr Prax 2004, 31(Suppl 2):S210-S214.

2. • Ahveninen J, Jaaskelainen IP, Raij T, Bonmassar G, Devore S, Hamalainen M, Levanen S, Lin FH, Sams M, Shinn-Cunningham BG et al.: Task-modulated ‘what’ and ‘where’ pathways in human auditory cortex. Proc Natl Acad Sci USA 2006, 103:14608-14613.
Human neuroimaging study of selective attention to auditory object identification and localization using dual approaches (fMRI and MEG). Subjects were asked to attend either to the phonetic content of sound pairs or to their location. The key finding was a double dissociation in response adaptation to sound pairs with phonetic versus spatial sound changes, dependent upon attentive focus, and the demonstration of parallel ‘what’ and ‘where’ pathways for sound processing. Further, with the time resolution of MEG, it was possible to show that activation of these parallel pathways occurred as early as 70–150 ms from stimulus onset and that the ‘where’ pathway was activated 30 ms earlier than the ‘what’ pathway. These results suggest that auditory selective attention effects are feature-specific and that they may arise from enhanced tuning of the receptive fields of task-relevant neuronal populations.

3. • Johnson JA, Zatorre RJ: Neural substrates for dividing and focusing attention between simultaneous auditory and visual events. Neuroimage 2006, 31:1673-1681.
In order to map the neural basis of bimodal divided attention–attending to one sense while ignoring another — an fMRI study was conducted in which subjects simultaneously heard novel melodies and viewed geometric shapes, and in different conditions, were instructed to attend to only one sense (bimodal selective attention) or to both senses at the same time (bimodal divided attention). Bimodal selective attention lead to increased activity in relevant sensory cortices and simultaneous decrease in irrelevant sensory cortex. Thus top-down attentional effects modulate the interaction of sensory cortical areas by enhancing processing in one modality at the expense of the other. Subjects with the best performance during the selective-attention task showed the greatest enhancement of activity in relevant sensory cortices. However, unlike the selective-attention conditions, divided attention recruited heteromodal areas in dorsolateral prefrontal cortex (DLPFC). These results suggest that selective attention was achieved by a different neural set of neural processes. Selective attention acted by modulation of sensory cortices, whereas bimodal attention recruited DLPFC. 9. Gilbert CD, Sigman M: Brain states: top-down influences in sensory processing (review). Neuron 2007, 54:677-696. 10. Fritz J, Shamma S, Elhilali M, Klein D: Rapid task-related  plasticity of spectrotemporal receptive fields in primary auditory cortex. Nat Neurosci 2003, 6:1216-1223. This study developed an innovative approach that made it possible to simultaneously measure spectral-temporal receptive fields (STRFs) with task performance, providing multiple snapshots of the dynamically changing STRF during ongoing behavior. Ferrets were trained on a generalized tone detection task, in which they learned to detect tones of any frequency, in the context of background noisy stimuli. Attending to a specific target frequency during the detection task consistently induced localized facilitative changes in STRF shape, which were swift in onset (<2 min) and could persist for hours, and provide a form of sensory memory. The authors propose that such modulatory changes could enhance overall cortical responsiveness to the target tone and increase the likelihood of ‘capturing’ the attended target during the task. 11. Fritz JB, Elhilali M, Shamma SA: Differential dynamic plasticity of  A1 receptive fields during multiple spectral tasks. J Neurosci 2005, 25:7623-7635. In this behavioral physiology study, ferrets were initially trained on generalized frequency-independent tasks — a single-tone detection task, and a two-tone discrimination task. While recording from the same neurons, spectral–temporal receptive fields (STRFs) were measured in A1 under resting (non-task conditions) and also while ferrets successively performed frequency discrimination or tone detection tasks. Both tasks enhanced STRFs at the target frequency. In the discrimination task, however, STRF suppression was observed at the reference frequency. STRF changes were rapid and frequency-selective for both task conditions. In successive tasks, neurons responded differentially to identical tones, depending upon whether the tone was a reference or target. These task-dependent differences in receptive-field plasticity reflect differences in the meaning attributed to identical stimuli according to the context in which they were presented. 12. 
Fritz JB, Elhilali M, David SV, Shamma SA: Does attention play a role in dynamic receptive field adaptation to changing acoustic salience in A1? (Review). Hear Res 2007, 229:186-203. 13. Brechmann A, Scheich H: Hemispheric shifts of sound  representation in auditory cortex with conceptual listening. Cereb Cortex 2005, 15:578-587. A neuroimaging approach was used to test the prediction that differential attentional focus on the same stimulus set would differentially activate auditory cortex. Subjects were presented with the same set of frequencymodulated tone sweeps, which the subjects were asked to categorize, either in pitch direction (rising or falling), or in duration (short or long). When the task involved attention to pitch direction, there was greater activation in right posterior auditory cortex than passive stimulus exposure. By contrast, there was greater left posterior auditory cortex activation when the task involved attention to sweep duration. These results provide strong evidence that top-down influences can differentially shape responses in the two hemispheres, leading to lateralized patterns of activation, dependent upon task constraints and attentional focus. 14. Scheich H, Brechmann A, Brosch M, Budinger E, Ohl FW: The cognitive auditory cortex: task-specificity of stimulus representations. Hear Res 2007, 229:213-224. 15. Zatorre RJ: There’s more to auditory cortex than meets the ear (review). Hear Res 2007, 229:24-30. 16. Hubel DH, Henson CO, Rupert A, Galambos R: Attention units in the auditory cortex. Science 1959, 129:1279-1280. 17. Hillyard SA, Hink RF, Schwent VL, Picton TW: Electrical signs of selective attention in the human brain. Science 1973, 182:177180. 4. Kastner S, Pinsk MA: Visual attention as a multilevel selection process (review). Cogn Affect Behav Neurosci 2004, 4:483-500. 5. Knudsen EI: Fundamental components of attention. Annu Rev Neurosci 2007, 30:57-78. 18. Woldorff MG, Hillyard SA: Modulation of early auditory processing during selective listening to rapidly presented tones. Electroencephalogr Clin Neurophysiol 1991, 79:170-191. 6. Kayser C, Petkov CI, Lippert M, Logothetis NK: Mechanisms for allocating auditory attention: an auditory saliency map. Curr Biol 2005, 15:1943-1947. 19. Spitzer H, Desimone R, Moran J: Increased attention enhances both behavioral and neuronal performance. Science 1988, 240:338-340. 7. Hafter ER, Sarampalis A, Loui P: Auditory attention and filters (review). In Auditory Perception of Sound Sources, vol 29. Edited by Yost W. Springer-Verlag; 2007. 20. Boudreau CE, Williford TH, Maunsell JH: Effects of task difficulty and target likelihood in area V4 of macaque monkeys. J Neurophysiol 2006, 96:2377-2387. 8. Nobre AC, Correa A, Coull JT: Temporal effects of attention (review). Curr Opin Neurobiol 2007, (this issue). 21. Cherry EC: Some experiments on the recognition of speech, with one and two ears. J Acoust Soc Am 1953, 25:975-979. www.sciencedirect.com Current Opinion in Neurobiology 2007, 17:437–455 450 Sensory systems 22. Asari H, Pearlmutter BA, Zador AM: Sparse representations for the cocktail party problem. J Neurosci 2006, 26:7477-7490. 44. Sussman E, Winkler I: Dynamic sensory updating in the auditory system. Brain Res Cogn Brain Res 2001, 12:431-439. 23. Bregman AS: Auditory Scene Analysis: The Perceptual Organization of Sounds London: MIT Press; 1990. 45. Pulvermuller F, Shtyrov Y: Language outside the focus of attention: the mismatch negativity as a tool for studying higher cognitive processes (review). 
Prog Neurobiol 2006, 79:49-71. 24. Carlyon RP: How the brain separates sounds. Trends Cogn Sci 2004, 8:465-471. 25. Alain C: Breaking the wave: effects of attention and learning on concurrent sound perception (review). Hear Res 2007, 229:225-236. 26. Shinn-Cunningham BG, Lee AK, Oxenham AJ: A sound element gets lost in perceptual competition. Proc Natl Acad Sci USA 2007, 104:12223-12227. 27. Scheich H, Baumgart F, Gaschler-Markefski B, Tegeler C, Tempelmann C, Heinze HJ, Schindler F, Stiller D: Functional magnetic resonance imaging of a human auditory cortex area involved in foreground–background decomposition. Eur J Neurosci 1998, 10:803-809. 28. Aubin T: Penguins and their noisy world. An Acad Bras Cienc 2004, 76:279-283. 29. Groger U, Wiegrebe L: Classification of human breathing sounds by the common vampire bat, Desmodus rotundus. BMC Biol 2006, 4:18. 30. Sussman ES, Horvath J, Winkler I, Orr M: The role of attention in the formation of auditory streams. Percept Psychophys 2007, 69:136-152. 31. Cusack R, Deeks J, Aikman G, Carlyon RP: Effects of location, frequency region, and time course of selective attention on auditory scene analysis. J Exp Psychol Hum Percept Perform 2004, 30:643-656. 32. Alain C, Snyder JS, He Y, Reinke KS: Changes in auditory cortex parallel rapid perceptual learning. Cereb Cortex 2007, 17:1074-1084. 33. Xiang J, Elhilali M, Shamma SA, Simon JZ: The interaction between attention and auditory pop-out. Association for Research in Otolaryngology Midwinter Meeting Abstracts. 2007. 34. Winkler I, Teder-Salejarvi WA, Horvath J, Naatanen R, Sussman E: Human auditory cortex tracks task-irrelevant sound sources. Neuroreport 2003, 14:2053-2056. 35. Winkler I, Czigler I, Sussman E, Horvath J, Balazs L: Preattentive binding of auditory and visual stimulus features. J Cogn Neurosci 2005, 17:320-339. 36. Sussman ES: Integration and segregation in auditory scene analysis. J Acoust Soc Am 2005, 117:1285-1298. 37. Sussman ES, Bregman AS, Wang WJ, Khan FJ: Attentional modulation of electrophysiological activity in auditory cortex for unattended sounds within multistream auditory environments. Cogn Affect Behav Neurosci 2005, 5:93-110. 38. Downar J, Crawley AP, Mikulis DJ, Davis KD: A multimodal cortical network for the detection of changes in the sensory environment. Nat Neurosci 2000, 3:277-283. 39. Huang MX, Lee RR, Miller GA, Thoma RJ, Hanlon FM, Paulson KM, Martin K, Harrington DL, Weisend MP, Edgar JC et al.: A parietal– frontal network studied by somatosensory oddball MEG responses, and its cross-modal consistency. Neuroimage 2005, 28:99-114. 40. Perez-Gonzalez D, Malmierca MS, Covey E: Novelty detector neurons in the mammalian auditory midbrain. Eur J Neurosci 2005, 22:2879-2885. 41. Nelken I, Ulanovsky N: Mismatch negativity and stimulusspecific adaptation in animals. J Psychophysiol 2007, in press. 42. Naatanen R, Gaillard AW, Mantysalo S: Early selective-attention effect on evoked potential reinterpreted. Acta Psychol (Amst) 1978, 42:313-329. 43. Rinne T, Sarkka A, Degerman A, Schroger E, Alho K: Two separate mechanisms underlie auditory change detection and involuntary control of attention. Brain Res 2006, 1077:135-143. Current Opinion in Neurobiology 2007, 17:437–455 46. Tervaniemi M, Rytkonen M, Schroger E, Ilmoniemi RJ, Naatanen R: Superior formation of cortical memory traces for melodic patterns in musicians. Learn Mem 2001, 8:295-300. 47. 
Brattico E, Tervaniemi M, Naatanen R, Peretz I: Musical scale properties are automatically processed in the human auditory cortex. Brain Res 2006, 1117:162-174. 48. Opitz B, Schroger E, von Cramon DY: Sensory and cognitive mechanisms for preattentive change detection in auditory cortex. Eur J Neurosci 2005, 21:531-535. 49. Woldorff MG, Hillyard SA, Gallen CC, Hampson SR, Bloom FE: Magnetoencephalographic recordings demonstrate attentional modulation of mismatch-related neural activity in human auditory cortex. Psychophysiology 1998, 35:283-292. 50. Alain C, Woods DL: Attention modulates auditory pattern memory as indexed by event-related brain potentials. Psychophysiology 1997, 34:534-546. 51. Arnott SR, Alain C: Stepping out of the spotlight: MMN attenuation as a function of distance from the attended location. Neuroreport 2002, 13:2209-2212. 52. Sussman E, Ritter W, Vaughan HG Jr: An investigation of the auditory streaming effect using event-related brain potentials. Psychophysiology 1999, 36:22-34. 53. Sussman E, Winkler I, Schroger E: Top-down control over involuntary attention switching in the auditory modality. Psychon Bull Rev 2003, 10:630-637. 54. Doeller CF, Opitz B, Mecklinger A, Krick C, Reith W, Schroger E: Prefrontal cortex involvement in preattentive auditory deviance detection: neuroimaging and electrophysiological evidence. Neuroimage 2003, 20:1270-1282. 55. Crottaz-Herbette S, Menon V: Where and when the anterior cingulate cortex modulates the attentional response: combined fMRI and ERP evidence. J Cogn Neurosci 2006, 18:766-780. 56. Koelsch S: Neural substrates of processing syntax and semantics in music. Curr Opin Neurobiol 2005, 15:207-212. 57. Schonwiesner M, Novitski N, Pakarinen S, Carlson S,  Tervaniemi M, Naatanen R: Heschl’s gyrus, posterior superior temporal gyrus, and mid-ventrolateral prefrontal cortex have different roles in the detection of acoustic changes. J Neurophysiol 2007, 97:2075-2082. The neural basis of MMN, an automatic, pre-attentive system for changedetection in the acoustic environment has been the object of considerable interest. The results of this neuroimaging study, utilizing a combination of event-related fMRI and EEG, suggest that automatic change processing consists of three stages: (1) initial detection in A1; (2) further analysis in secondary and tertiary auditory cortical areas (posterior superior temporal gyrus and planum temporale), and (3) judgement of suffifcient novelty for allocation of attentional resources in the midventrolateral PFC. 58. Shalgi S, Deouell LY: Direct evidence for differential roles of temporal and frontal components of auditory change detection. Neuropsychologia 2007, 45:1878-1888. 59. Jaaskelainen IP, Ahveninen J, Bonmassar G, Dale AM, Ilmoniemi RJ, Levanen S, Lin FH, May P, Melcher J, Stufflebeam S et al.: Human posterior auditory cortex gates novel sounds to consciousness. Proc Natl Acad Sci USA 2004, 101:6809-6814. 60. Ulanovsky N, Las L, Nelken I: Processing of low-probability  sounds by cortical neurons. Nat Neurosci 2003, 6:391-398. The ability to detect rare auditory events was studied at the single cell level in anesthetized cat A1. Cortical neurons responded more strongly to a rarely presented sound than to the same sound when it was common, as a result of stimulus-specific adaptation (SSA). By contrast, medial geniculate auditory thalamic neurons were insensitive to the probability of frequency deviants and did not show SSA. 
Many similarities between SSA and MMN suggest that SSA may be a single-neuron correlate of the EEG www.sciencedirect.com Auditory attention — focusing the searchlight on sound Fritz et al. 451 mismatch negativity (MMN) that is evoked by deviant (rare) sounds. These data suggest that A1 neurons play a role in novelty detection and also in sensory memory. 82. Gottlieb J: From thought to action: the parietal cortex as a bridge between perception, action, and cognition (review). Neuron 2007, 53:9-16. 61. Ulanovsky N, Las L, Farkas D, Nelken I: Multiple time scales of adaptation in auditory cortex neurons. J Neurosci 2004, 24:10440-10453. 83. Noesselt T, Hillyard SA, Woldorff MG, Schoenfeld A, Hagner T, Jancke L, Tempelmann C, Hinrichs H, Heinze HJ: Delayed striate cortical activation during spatial attention. Neuron 2002, 35:575-587. 62. Pincze Z, Lakatos P, Rajkai C, Ulbert I, Karmos G: Separation of mismatch negativity and the N1 wave in the auditory cortex of the cat: a topographic study. Clin Neurophysiol 2001, 112:778-784. 63. Javitt DC, Steinschneider M, Schroeder CE, Arezzo JC: Role of cortical N-methyl-D-aspartate receptors in auditory sensory memory and mismatch negativity generation: implications for schizophrenia. Proc Natl Acad Sci USA 1996, 93:11962-11967. 64. Watkins S, Dalton P, Lavie N, Rees G: Brain mechanisms mediating auditory attentional capture in humans. Cereb Cortex 2007, 17:1694-1700. 65. Shomstein S, Yantis S: Parietal cortex mediates voluntary control of spatial and nonspatial auditory attention. J Neurosci 2006, 26:435-439. 66. Wu CT, Weissman DH, Roberts KC, Woldorff MG: The neural circuitry underlying the executive control of auditory spatial attention. Brain Res 2007, 1134:187-198. 67. Bregman AS: Auditory streaming is cumulative. J Exp Psychol Hum Percept Perform 1978, 4:380-387. 68. Molholm S, Martinez A, Ritter W, Javitt DC, Foxe JJ: The neural circuitry of pre-attentive auditory change-detection: an fMRI study of pitch and duration mismatch negativity generators. Cereb Cortex 2005, 15:545-551. 69. Snyder JS, Alain C, Picton TW: Effects of attention on neuroelectric correlates of auditory stream segregation. J Cogn Neurosci 2006, 18:1-13. 70. Cusack R: The intraparietal sulcus and perceptual organization. J Cogn Neurosci 2005, 17:641-651. 71. Naatanen R, Tervaniemi M, Sussman E, Paavilainen P, Winkler I: ‘Primitive intelligence’ in the auditory cortex. Trends Neurosci 2001, 24:283-288. 72. Hannemann R, Obleser J, Eulitz C: Top-down knowledge supports the retrieval of lexical information from degraded speech. Brain Res 2007, 1153:134-143. 73. Arnott SR, Binns MA, Grady CL, Alain C: Assessing the auditory dual-pathway model in humans. Neuroimage 2004, 22:401-408. 74. Bidet-Caulet A, Bertrand O: Dynamics of a temporo-frontoparietal network during sustained spatial or spectral auditory processing. J Cogn Neurosci 2005, 17:1691-1703. 75. De Santis L, Clarke S, Murray MM: Automatic and intrinsic auditory ‘what’ and ‘where’ processing in humans revealed by electrical neuroimaging. Cereb Cortex 2007, 17:9-17. 76. Hotting K, Rosler F, Roder B: Crossmodal and intermodal attention modulate event-related brain potentials to tactile and auditory stimuli. Exp Brain Res 2003, 148:26-37. 77. Mayer AR, Harrington D, Adair JC, Lee R: The neural networks underlying endogenous auditory covert orienting and reorienting. Neuroimage 2006, 30:938-949. 78. Bellmann A, Meuli R, Clarke S: Two types of auditory neglect. Brain 2001, 124:676-687. 79. 
Clarke S, Thiran AB: Auditory neglect: what and where in auditory space. Cortex 2004, 40:291-300. 80. Spierer L, Meuli R, Clarke S: Extinction of auditory stimuli in hemineglect: space versus ear. Neuropsychologia 2007, 45:540-551. 81. Goldberg ME, Bisley JW, Powell KD, Gottlieb J: Saccades, salience and attention: the role of the lateral intraparietal area in visual behavior (review). Prog Brain Res 2006, 155:157-175. www.sciencedirect.com 84. Moore T, Armstrong KM: Selective gating of visual signals by microstimulation of frontal cortex. Nature 2003, 421:370-373. 85. Wardak C, Ibos G, Duhamel JR, Olivier E: Contribution of the monkey frontal eye field to covert visual attention. J Neurosci 2006, 26:4228-4235. 86. Garg A, Schwartz D, Stevens AA: Orienting auditory spatial attention engages frontal eye fields and medial occipital cortex in congenitally blind humans. Neuropsychologia 2007, 45:2307-2321. 87. Russo GS, Bruce CJ: Frontal eye field activity preceding aurally guided saccades. J Neurophysiol 1994, 71:1250-1253. 88. Winkowski DE, Knudsen EI: Top-down gain control of the  auditory space map by gaze control circuitry in the barn owl. Nature 2006, 439:336-339. Top-down control of the auditory space map was demonstrated by using low-level electrical stimulation of the gaze control circuitry in the forebrain of barn owls (below the threshold to elicit changes in gaze) to show that such stimulation regulates the gain of midbrain auditory responses in the tectum in an attention-like manner. When the forebrain circuit was stimulated, midbrain responses to auditory stimuli at the location encoded by the forebrain site were enhanced and spatial selectivity was sharpened. By contrast, the same stimulation suppressed responses to auditory stimuli represented at other locations in the midbrain map. Such space-specific top-down regulation of auditory responses by gaze control circuitry in the barn owl parallels similar results obtained in the primate visual system following stimulation of the frontal gaze control system. 89. Winkowski DE, Knudsen EI: Modulation of visual responses in optic tectum by microstimulation of forebrain gaze fields in barn owls. Society for Neuroscience Meeting Abstracts. 2005. 90. Alain C, Arnott SR: Selectively attending to auditory objects. Front Biosci 2000, 5:D202-D212. 91. Deike S, Gaschler-Markefski B, Brechmann A, Scheich H: Auditory stream segregation relying on timbre involves left auditory cortex. Neuroreport 2004, 15:1511-1514. 92. Fritz J, Elhilali M, Shamma S: Active listening: task-dependent plasticity of spectrotemporal receptive fields in primary auditory cortex (review). Hear Res 2005, 206:159-176. 93. Fritz JB, Elhilali M, Shamma SA: Adaptive changes in cortical receptive fields induced by attention to complex sounds. J. Neurophysiology, 2007 (in press). 94. Atiani S, Elhilali M, Shamma SA, Fritz JB: Rapid receptive-field plasticity in A1 during signal detection in noise. Society for Neuroscience Meeting Abstracts. 2006. 95. Fritz JB, Elhilali M, Harper N, Haisfield C, Yin PB, Shamma SA: Do temporal cues play a role in dynamic receptive-field plasticity in A1? Association for Research in Otolaryngology Midwinter Meeting Abstracts. 2005. 96. Fritz JB, David SV, Donaldson K, Shamma SA: Behavioral gating of acoustic information in ferret frontal cortex during auditory target recognition. Society For Neuroscience Meeting Abstracts. 2007. 97. Reynolds JH, Chelazzi L: Attentional modulation of visual processing. Annu Rev Neurosci 2004, 27:611-647. 98. 
David SV, Mazer JA, Gallant JL: Pattern specific attentional modulation of V4 spatiotemporal receptive fields during freeviewing visual search. Society for Neuroscience Meeting Abstracts. 2002. 99. Gustavsen KA, Gallant JL: Feature attention changes shape tuning in area V4. Society for Neuroscience Meeting Abstracts. 2006. 100. Alho K, Vorobyev VA: Brain activity during selective listening to natural speech (review). Front Biosci 2007, 12:3167-3176. Current Opinion in Neurobiology 2007, 17:437–455 452 Sensory systems 101. Alho K, Vorobyev VA, Medvedev SV, Pakhomov SV, Starchenko MG, Tervaniemi M, Naatanen R: Selective attention to human voice enhances brain activity bilaterally in the superior temporal sulcus. Brain Res 2006, 1075:142-150. 102. Rama P, Poremba A, Sala JB, Yee L, Malloy M, Mishkin M, Courtney SM: Dissociable functional cortical topographies for working memory maintenance of voice identity and location. Cereb Cortex 2004, 14:768-780. 103. Hugdahl K, Thomsen T, Ersland L, Rimol LM, Niemi J: The effects of attention on speech perception: an fMRI study. Brain Lang 2003, 85:37-48. 104. Thomsen T, Rimol LM, Ersland L, Hugdahl K: Dichotic listening reveals functional specificity in prefrontal cortex: an fMRI study. Neuroimage 2004, 21:211-218. 105. Loui P, Grent-’t-Jong T, Torpey D, Woldorff M: Effects of attention on the neural processing of harmonic syntax in Western music. Brain Res Cogn Brain Res 2005, 25:678-687. 106. Lange K, Roder B: Orienting attention to points in time improves stimulus processing both within and across modalities. J Cogn Neurosci 2006, 18:715-729. 107. Sussman E, Sheridan K, Kreuzer J, Winkler I: Representation of the standard: stimulus context effects on the process generating the mismatch negativity component of event-related brain potentials. Psychophysiology 2003, 40:465-471. 108. Best V, Ozmeral EJ, Shinn-Cunningham BG: Visually-guided attention enhances target identification in a complex auditory scene. J Assoc Res Otolaryngol 2007, 8:294-304. 109. Martikainen MH, Kaneko K, Hari R: Suppressed responses to self-triggered sounds in the human auditory cortex. Cereb Cortex 2005, 15:299-302. 110. van Wassenhove V, Nagarajan SS: Auditory cortical plasticity in learning to discriminate modulation rate. J Neurosci 2007, 27:2663-2672. 111. Sridharan D, Levitin DJ, Chafe CH, Berger J, Menon V: Neural dynamics of event segmentation in music: converging evidence for dissociable ventral and dorsal networks. Neuron 2007, 55:521-532. 112. Nagarajan SS, Blake DT, Wright BA, Byl N, Merzenich MM: Practice-related improvements in somatosensory interval discrimination are temporally specific but generalize across skin location, hemisphere, and modality. J Neurosci 1998, 18:1559-1570. 113. Ivry RB, Spencer RM: The neural representation of time. Curr Opin Neurobiol 2004, 14:225-232. 114. Pastor MA, Macaluso E, Day BL, Frackowiak RS: The neural basis of temporal auditory discrimination. Neuroimage 2006, 30:512-520. 115. Ghose GM, Maunsell JH: Attentional modulation in visual cortex depends on task timing. Nature 2002, 419:616-620. 116. Shuler MG, Bear MF: Reward timing in the primary visual cortex. Science 2006, 311:1606-1609. 117. Petkov CI, O’Connor KN, Sutter ML: Illusory sound perception in macaque monkeys. J Neurosci 2003, 23:9155-9161. 118. Petkov CI, O’Connor KN, Sutter ML: Encoding of illusory continuity in primary auditory cortex. Neuron 2007, 54:153-165. 119. 
Sivonen P, Maess B, Friederici AD: Semantic retrieval of spoken words with an obliterated initial phoneme in a sentence context. Neurosci Lett 2006, 408:220-225. 120. Raij T, McEvoy L, Makela JP, Hari R: Human auditory cortex is activated by omissions of auditory stimuli. Brain Res 1997, 745:134-143. 121. Hughes HC, Darcey TM, Barkan HI, Williamson PD, Roberts DW, Aslin CH: Responses of human auditory association cortex to the omission of an expected acoustic event. Neuroimage 2001, 13:1073-1089. Current Opinion in Neurobiology 2007, 17:437–455 122. Halpern AR, Zatorre RJ: When that tune runs through your head: a PET investigation of auditory imagery for familiar melodies. Cereb Cortex 1999, 9:697-704. 123. Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SC, McGuire PK, Woodruff PW, Iversen SD, David AS: Activation of auditory cortex during silent lipreading. Science 1997, 276:593-596. 124. Haslinger B, Erhard P, Altenmuller E, Schroeder U, Boecker H, Ceballos-Baumann AO: Transmodal sensorimotor networks during action observation in professional pianists. J Cogn Neurosci 2005, 17:282-293. 125. Voisin J, Bidet-Caulet A, Bertrand O, Fonlupt P: Listening in  silence activates auditory areas: a functional magnetic resonance imaging study. J Neurosci 2006, 26:273-278. In this neuroimaging study of auditory attention, subjects were instructed to direct their attention to an anticipated future sound in the context of complete silence. Subjects had to detect a sound after silent periods of variable duration. The location of the anticipated sound could be cued. The key result was that the auditory cortex showed two types of activation contralateral to the expected sound: first, an increased baseline shift during the silent, expectant period, and second, an enhanced response to a sound delivered after cueing. These results indicate that top-down attention selectively activated primary and secondary auditory areas in readiness for processing an expected stimulus. Frontal cortical activation was also observed (in inferior frontal and dorsolateral prefrontal cortex) regardless of which side the anticipated sounds was searched for. 126. Kraemer DJ, Macrae CN, Green AE, Kelley WM: Musical imagery: sound of silence activates auditory cortex. Nature 2005, 434:158. 127. Pekkola J, Ojanen V, Autti T, Jaaskelainen IP, Mottonen R,  Tarkiainen A, Sams M: Primary auditory cortex activation by visual speech: an fMRI study at 3 T. Neuroreport 2005, 16:125-128. There has been considerable debate as to whether visual speech perception (watching lip movements and other articulatory gestures) can activate human primary auditory cortex. A neuroimaging study that revisited this controversial issue was conducted with a novel control. After mapping individual subjects’ Heschl’s gyri, signal changes in this region were measured during visual speech perception and also during observation of moving circles. Activation was stronger during visual speech perception, particularly in the left hemisphere, than during scrutiny of the moving circles (which had no auditory associations). These results suggest that multimodal inputs are recruited during speech and confirm visual activation of A1 during speech perception. 128. Pekkola J, Ojanen V, Autti T, Jaaskelainen IP, Mottonen R, Sams M: Attention to visual speech gestures enhances hemodynamic activity in the left planum temporale. Hum Brain Mapp 2006, 27:471-477. 129. 
Kastner S, Pinsk MA, De Weerd P, Desimone R, Ungerleider LG: Increased activity in human visual cortex during directed attention in the absence of visual stimulation. Neuron 1999, 22:751-761. 130. Pollmann S, Maertens M: Perception modulates auditory cortex activation. Neuroreport 2006, 17:1779-1782. 131. Hunter MD, Eickhoff SB, Miller TW, Farrow TF, Wilkinson ID, Woodruff PW: Neural activity in speech-sensitive auditory cortex during silence. Proc Natl Acad Sci USA 2006, 103:189-194. 132. Griffiths TD: Musical hallucinosis in acquired deafness. Phenomenology and brain substrate. Brain 2000, 123:20652076. 133. Ford JM, Mathalon DH: Corollary discharge dysfunction in schizophrenia: can it explain auditory hallucinations? (Review). Int J Psychophysiol 2005, 58:179-189. 134. Shergill SS, Brammer MJ, Amaro E, Williams SC, Murray RM, McGuire PK: Temporal course of auditory hallucinations. Br J Psychiatry 2004, 185:516-517. 135. Dahmen JC, King A: Learning to hear: plasticity of auditory cortical processing. Curr Opin Neurobiol 2007, (this issue). 136. Keuroghlian AS, Knudsen EI: Adaptive auditory plasticity in developing and adult animals. Prog Neurobiol 2007, 82:109-121. www.sciencedirect.com Auditory attention — focusing the searchlight on sound Fritz et al. 453 137. Stefan K, Wycislo M, Classen J: Modulation of associative human motor cortical plasticity by attention. J Neurophysiol 2004, 92:66-72. 138. Polley DB, Steinberg EE, Merzenich MM: Perceptual learning  directs auditory cortical map reorganization through topdown influences. J Neurosci 2006, 26:4970-4982. Adult rats were presented with a set of tones of variable frequency and intensity, and different groups were trained to attend to independent parameters, either frequency or intensity, within this stimulus set. After several weeks of training, rats in the frequency-recognition group exhibited an increased number of neurons in A1 that were selective for frequencies close to that of the target tone. By contrast, animals in the intensity-recognition group were characterized by a higher number of neurons with non-monotonic response-level functions that peaked at values close to the target intensity. Similar, but less pronounced changes were also observed in a secondary auditory field. The degree of map plasticity within the task-relevant stimulus dimension was correlated with the degree of perceptual learning for both tasks. By contrast, there were no changes in the task-irrelevant dimension in either field. These results suggest that long-term, top-down attentional effects can selectively shape adaptive global changes in the neural representation in A1 for single features that underlie learning-induced improvements in perceptual abilities. 139. Rutkowski RG, Weinberger NM: Encoding of learned  importance of sound by magnitude of representational area in primary auditory cortex. Proc Natl Acad Sci USA 2005, 102:13664-13669. To test the hypothesis that learning-induced representational expansion in A1 directly encodes the behavioral importance of a sound, rats trained on an operant auditory conditioning test were variably motivated to the conditioned stimulus (CS) frequency via different levels of thirst. Mean behavioral performance correlated with water deprivation level, and thus gave a range of behavioral importance for the CS in the different deprivation groups. Tonotopic mapping of A1 showed expanded relative representation in the CS high-frequency region. The magnitude of this expansion was correlated with behavioral performance. 
An intriguing and unexpected increased representation of low frequencies occurred in the trained rats, at frequencies corresponding to the acoustic spectrum of the reward delivery pump. Although not strictly task-relevant, these pump sounds were secondary reinforcers, and hence were behaviorally relevant to the rats. Presumably the rats attended to both the low (pump noise) and high CS frequencies during task performance. These results provide support for the conclusion that attended, behaviorally important sounds gain representational area in A1. 140. Elhilali M, Fritz JB, Chi T-S, Shamma SA: Auditory cortical receptive fields: stable entities with plastic abilities. J. Neurosci. 2007 (in press). 141. Brosch M, Selezneva E, Bucks C, Scheich H: Macaque monkeys discriminate pitch relationships. Cognition 2004, 91:259-272. 142. Brosch M, Selezneva E, Scheich H: Nonauditory events of a  behavioral procedure activate auditory cortex of highly trained monkeys. J Neurosci 2005, 25:6797-6806. Two monkeys were extensively trained on a cognitive auditory categorization task to distinguish between rising and falling contours in a sequence of tones. Each trial began with a light flash cue, signaling to the monkey that it could grasp a bar, which initiated a tone sequence. The monkeys received a reward if they correctly released the bar for a falling frequency contour. The key findings were: (1) that many acoustically responsive neurons in A1 and posterior belt areas exhibited multimodal responses. These responses in auditory cortex (light-cue-related firing, bar-grasp and bar-release-related firing) reflected visual and somatosensory-motor cues and influences. (2) Auditory cortical neurons responded to each of the behaviorally relevant events in the auditory task, independent of modality. (3) This multimodal representation and firing pattern disappeared when the monkeys shifted to a visual detection task, suggesting that this multimodal activation was task dependent and gated by performance of the auditory task. 146. Bergan JF, Ro P, Ro D, Knudsen EI: Hunting increases adaptive  auditory map plasticity in adult barn owls. J Neurosci 2005, 25:9816-9820. Prism rearing has proven to be a powerful approach for exploring the adaptive plasticity of the barn owl’s auditory localization system. Adjustments in the auditory space map in the optic tectum to match the optically displaced visual map take place during development, whereas plasticity in adults was thought to be much more limited. This study shows, however, that much greater map plasticity is possible in adult birds if they are allowed to hunt their prey (thus providing coordinated visual and auditory cues in the context of high arousal and focused attention) rather than being provided with dead mice. Moreover, the magnitude of the shifts in auditory spatial tuning correlated with improvements in the accuracy with which the owls strike their prey. 147. Kacelnik O, Nodal FR, Parsons CH, King AJ: Training-induced  plasticity of auditory localization in adult mammals. PLoS Biol 2006, 4:e71. Localization of sounds in the horizontal plane is impaired if binaural cues are distorted by occluding one ear. Adult ferrets generally do not recover auditory localization ability following occlusion. However, stimulus-specific training in the adult ferret allows substantial adaptation to take place, which can occur in the absence of vision or error feedback. 
Such training requires that the animal actively attend to auditory information, seeking reliable cues that can assist it in obtaining reward. The basis for this plasticity appears to involve a re-weighting of different localization cues such that the ferrets learn to ignore the altered cues and to rely instead on other auditory cues, especially the spectral information provided by the unoccluded ear, that are less affected by the earplug. 148. Lee CC, Macpherson EA, Harrington IA, Middlebrooks JC: Task dependence of spatial selectivity in the dorsal zone of cat auditory cortex. Association for Research in Otolaryngology Midwinter Meeting Abstract. 2007. 149. Menning H, Roberts LE, Pantev C: Plastic changes in the auditory cortex induced by intensive frequency discrimination training. Neuroreport 2000, 11:817-822. 150. Jancke L, Gaab N, Wustenberg T, Scheich H, Heinze HJ: Shortterm functional plasticity in the human auditory cortex: an fMRI study. Brain Res Cogn Brain Res 2001, 12:479-485. 151. Ozaki I, Jin CY, Suzuki Y, Baba M, Matsunaga M, Hashimoto I: Rapid change of tonotopic maps in the human auditory cortex during pitch discrimination. Clin Neurophysiol 2004, 115:1592-1604. 152. Spierer L, Tardif E, Sperdin H, Murray MM, Clarke S: Learninginduced plasticity in auditory spatial representations revealed by electrical neuroimaging. J Neurosci 2007, 27:5474-5483. 153. Recanzone GH: Rapidly induced auditory plasticity: the ventriloquism aftereffect. Proc Natl Acad Sci USA 1998, 95:869-875. 154. Kayser C, Petkov CI, Augath M, Logothetis NK: Functional imaging reveals visual modulation of specific fields in auditory cortex. J Neurosci 2007, 27:1824-1835. 155. Ghazanfar AA, Schroeder CE: Is neocortex essentially multisensory? Trends Cogn Sci 2006, 10:278-285. 156. Hocherman S, Benson DA, Goldstein MH Jr, Heffner HE, Hienz RD: Evoked unit activity in auditory cortex of monkeys performing a selective attention task. Brain Res 1976, 117:51-68. 157. Woodruff PW, Benson RR, Bandettini PA, Kwong KK, Howard RJ, Talavage T, Belliveau J, Rosen BR: Modulation of auditory and visual cortex by selective attention is modality-dependent. Neuroreport 1996, 7:1909-1913. 143. Selezneva E, Scheich H, Brosch M: Dual time scales for categorical decision making in auditory cortex. Curr Biol 2006, 16:2428-2433. 158. Zatorre RJ, Mondor TA, Evans AC: Auditory attention to space and frequency activates similar cerebral systems. Neuroimage 1999, 10:544-554. 144. Yin PB, Fritz JB, Shamma SA: Can ferrets perceive relative pitch? Association for Research in Otolaryngology Midwinter Meeting Abstracts. 2007. 159. Laurienti PJ, Burdette JH, Wallace MT, Yen YF, Field AS, Stein BE: Deactivation of sensory-specific cortex by cross-modal stimuli. J Cogn Neurosci 2002, 14:420-429. 145. King AJ, Carlile S: Changes induced in the representation of auditory space in the superior colliculus by rearing ferrets with binocular eyelid suture. Exp Brain Res 1993, 94:444-455. 160. Petkov CI, Kang X, Alho K, Bertrand O, Yund EW, Woods DL: Attentional modulation of human auditory cortex. Nat Neurosci 2004, 7:658-663. www.sciencedirect.com Current Opinion in Neurobiology 2007, 17:437–455 454 Sensory systems 161. Shomstein S, Yantis S: Control of attention shifts between  vision and audition in human cortex. J Neurosci 2004, 24:1070210706. Using neuroimaging techniques, human brain activity was studied during attentional shifts between vision and audition. 
Attention shifts from vision to audition caused increased activity in the attended sensory cortex and decreased activity in the unattended sensory cortex. Moreover, posterior parietal and superior prefrontal cortices showed short-lived increases in activity that were time-locked to the onset of voluntary attentional shifts between vision and audition. These results suggest that the frontal– parietal network implicated in visual attention also plays a role in auditory attention and crossmodal shifts of attention. 162. Shulman GL, Corbetta M, Buckner RL, Raichle ME, Fiez JA, Miezin FM, Petersen SE: Top-down modulation of early sensory cortex. Cereb Cortex 1997, 7:193-206. 163. Downar J, Crawley AP, Mikulis DJ, Davis KD: The effect of task relevance on the cortical response to changes in visual and auditory stimuli: an event-related fMRI study. Neuroimage 2001, 14:1256-1267. 164. Johnson JA, Zatorre RJ: Attention to simultaneous unrelated auditory and visual events: behavioral and neural correlates. Cereb Cortex 2005, 15:1609-1620. 165. Johnson JA, Strafella AP, Zatorre RJ: The role of the dorsolateral prefrontal cortex in bimodal divided attention: two transcranial magnetic stimulation studies. J Cogn Neurosci 2007, 19:907-920. 166. Weissman DH, Warner LM, Woldorff MG: The neural mechanisms for minimizing cross-modal distraction. J Neurosci 2004, 24:10941-10949. 167. Delano PH, Elgueda D, Hamame CM, Robles L: Selective  attention to visual stimuli reduces cochlear sensitivity in chinchillas. J Neurosci 2007, 27:4146-4153. Chinchillas were trained on a visual discrimination or in an auditory frequency discrimination task. By using a chronically implanted roundwindow electrode, it was possible to obtain two measures of cochlear sensitivity — sound-evoked auditory nerve compound action potentials and cochlear microphonics — during selective attention to either visual or auditory stimuli. A decrease in cochlear sensitivity was observed during visual attention, but not during auditory attention. The magnitude of this decrease correlated with the parametrically varied attentional demands of the visual task. These results demonstrate that afferent auditory activity can be modulated by selective attention as early as sensory transduction, perhaps through the influence of the olivocochlear efferent pathway. 168. Hernandez-Peon R, Scherrer H, Jouvet M: Modification of electric activity in cochlear nucleus during attention in unanesthetized cats. Science 1956, 123:331-332. 169. Oatman LC: Role of visual attention on auditory evoked potentials in unanesthetized cats. Exp Neurol 1971, 32:341-356. 170. Oatman LC, Anderson BW: Effects of visual attention on tone burst evoked auditory potentials. Exp Neurol 1977, 57:200-211. 171. Maison S, Micheyl C, Collet L: Influence of focused auditory attention on cochlear activity in humans. Psychophysiology 2001, 38:35-40. 172. Xiao Z, Suga N: Modulation of cochlear hair cells by the auditory cortex in the mustached bat. Nat Neurosci 2002, 5:57-63. 177. Degerman A, Rinne T, Pekkola J, Autti T, Jaaskelainen IP, Sams M, Alho K: Human brain activity associated with audiovisual perception and attention. Neuroimage 2007, 34:1683-1691. 178. Kidd G Jr, Arbogast TL, Mason CR, Gallun FJ: The advantage of knowing where to listen. J Acoust Soc Am 2005, 118:3804-3815. 179. Busse L, Roberts KC, Crist RE, Weissman DH, Woldorff MG: The spread of attention across modalities and space in a multisensory object. Proc Natl Acad Sci USA 2005, 102:1875118756. 180. 
180. Rahne T, Bockmann M, von Specht H, Sussman ES: Visual cues can modulate integration and segregation of objects in auditory scene analysis. Brain Res 2007, 1144:127-135.
181. Bidet-Caulet A, Voisin J, Bertrand O, Fonlupt P: Listening to a walking human activates the temporal biological motion area. Neuroimage 2005, 28:132-139.
182. Metzger RR, Greene NT, Porter KK, Groh JM: Effects of reward and behavioral context on neural activity in the primate inferior colliculus. J Neurosci 2006, 26:7468-7476.
183. Vogel EK, Woodman GF, Luck SJ: Pushing around the locus of selection: evidence for the flexible-selection hypothesis. J Cogn Neurosci 2005, 17:1907-1922.
184. Woldorff MG, Gallen CC, Hampson SA, Hillyard SA, Pantev C, Sobel D, Bloom FE: Modulation of early sensory processing in human auditory cortex during auditory selective attention. Proc Natl Acad Sci USA 1993, 90:8722-8726.
185. Grady CL, Van Meter JW, Maisog JM, Pietrini P, Krasuski J, Rauschecker JP: Attention-related modulation of activity in primary and secondary auditory cortex. Neuroreport 1997, 8:2511-2516.
186. Jancke L, Mirzazade S, Shah NJ: Attention modulates activity in the primary and the secondary auditory cortex: a functional magnetic resonance imaging study in human subjects. Neurosci Lett 1999, 266:125-128.
187. Bajo V, Nodal F, Overath T, King A: Effect of auditory task on the expression of the immediate-early gene c-Fos in the ferret cortex. Association for Research in Otolaryngology Midwinter Meeting Abstract. 2007.
188. Posner MI, Petersen SE: The attention system of the human brain. Annu Rev Neurosci 1990, 13:25-42.
189. Pollmann S, Lepsien J, Hugdahl K, von Cramon DY: Auditory target detection in dichotic listening involves the orbitofrontal and hippocampal paralimbic belts. Cereb Cortex 2004, 14:903-913.
190. Crick F: Function of the thalamic reticular complex: the searchlight hypothesis. Proc Natl Acad Sci USA 1984, 81:4586-4590.
191. Frith CD, Friston KJ: The role of the thalamus in "top down" modulation of attention to sound. Neuroimage 1996, 4:210-215.
192. McAlonan K, Cavanaugh J, Wurtz RH: Attentional modulation of thalamic reticular neurons. J Neurosci 2006, 26:4444-4450.
193. Sakoda T, Kimura A, Donishi T, Kitano H, Tamai Y: Presence and connections of auditory neurons in the rostrodorsal and rostrolateral parts of the thalamic reticular nucleus. Acta Otolaryngol Suppl 2004, 553:36-42.
194. Zikopoulos B, Barbas H: Prefrontal projections to the thalamic reticular nucleus form a unique circuit for attentional mechanisms. J Neurosci 2006, 26:7348-7361.
195. Giard MH, Fort A, Mouchetant-Rostaing Y, Pernier J: Neurophysiological mechanisms of auditory selective attention in humans. Front Biosci 2000, 5:D84-D94.
196. Janata P, Tillmann B, Bharucha JJ: Listening to polyphonic music recruits domain-general attention and working memory circuits. Cogn Affect Behav Neurosci 2002, 2:121-140.
197. Steinmetz PN, Roy A, Fitzgerald PJ, Hsiao SS, Johnson KO, Niebur E: Attention modulates synchronized neuronal firing in primate somatosensory cortex. Nature 2000, 404:187-190.
198. Buschman TJ, Miller EK: Top-down versus bottom-up control of attention in the prefrontal and posterior parietal cortices. Science 2007, 315:1860-1862.
199. Saalmann YB, Pigarev IN, Vidyasagar TR: Neural mechanisms of visual attention: how top-down feedback highlights relevant locations. Science 2007, 316:1612-1615.
200. Womelsdorf T, Schoffelen JM, Oostenveld R, Singer W, Desimone R, Engel AK, Fries P: Modulation of neuronal interactions through neuronal synchronization. Science 2007, 316:1609-1612.
201. Debener S, Herrmann CS, Kranczioch C, Gembris D, Engel AK: Top-down attentional processing enhances auditory evoked gamma band activity. Neuroreport 2003, 14:683-686.
202. Mulert C, Leicht G, Pogarell O, Mergl R, Karch S, Juckel G, Moller HJ, Hegerl U: Auditory cortex and anterior cingulate cortex sources of the early evoked gamma-band response: relationship to task difficulty and mental effort. Neuropsychologia 2007, 45:2294-2306.
203. Driver J, Frith C: Shifting baselines in attention research. Nat Rev Neurosci 2000, 1:147-148.
204. Otazu GH, Zador AM: Attentional state modulation of neural responses in rat auditory cortex. Society for Neuroscience Meeting Abstract. 2006.
205. Miller JM, Sutton D, Pfingst B, Ryan A, Beaton R, Gourevitch G: Single cell activity in the auditory cortex of Rhesus monkeys: behavioral dependency. Science 1972, 177:449-451.
206. Connor CE, Gallant JL, Preddie DC, Van Essen DC: Responses in area V4 depend on the spatial relationship between stimulus and attention. J Neurophysiol 1996, 75:1306-1308.
207. Foxe JJ, Simpson GV, Ahlfors SP, Saron CD: Biasing the brain's attentional set. I. Cue driven deployments of intersensory selective attention. Exp Brain Res 2005, 166:370-392.
208. Raz A, Buhle J: Typologies of attentional networks. Nat Rev Neurosci 2006, 7:367-379.
209. Peers PV, Ludwig CJ, Rorden C, Cusack R, Bonfiglioli C, Bundesen C, Driver J, Antoun N, Duncan J: Attentional functions of parietal and frontal cortex. Cereb Cortex 2005, 15:1469-1484.
210. Serences JT, Yantis S: Spatially selective representations of voluntary and stimulus-driven attentional priority in human occipital, parietal, and frontal cortex. Cereb Cortex 2007, 17:284-293.
211. Mitchell J, Sundberg KS, Reynolds JH: Differential attention-dependent response modulation across cell classes in macaque visual area V4. Neuron 2007, 55:131-141.
212. Edeline JM: The thalamo-cortical auditory receptive fields: regulation by the states of vigilance, learning and the neuromodulatory systems (review). Exp Brain Res 2003, 153:554-572.
213. Maunsell JH: Neuronal representations of cognitive state: reward or attention? Trends Cogn Sci 2004, 8:261-265.