34 results for Deleuze, Gilles, 1925-1995 -- Criticism and interpretation
in Aston University Research Archive
Abstract:
Self-criticism is strongly correlated with a range of psychopathologies, such as depression, eating disorders and anxiety. In contrast, self-reassurance is inversely associated with such psychopathologies. Despite the importance of self-judgements and evaluations, little is known about the neurophysiology of these internal processes. The current study therefore used a novel fMRI task to investigate the neuronal correlates of self-criticism and self-reassurance. Participants were presented with statements describing two types of scenario, with the instruction to imagine being either self-critical or self-reassuring in that situation. One scenario type focused on a personal setback, mistake or failure, which would elicit negative emotions, whilst the second described a matched neutral event. Self-criticism was associated with activity in lateral prefrontal cortex (PFC) regions and the dorsal anterior cingulate (dAC), linking self-critical thinking to error processing and resolution, and to behavioural inhibition. Self-reassurance was associated with left temporal pole and insula activation, suggesting that efforts to be self-reassuring engage regions similar to those involved in expressing compassion and empathy towards others. Additionally, we found a dorsal/ventral PFC divide between an individual's tendency to be self-critical or self-reassuring. In multiple regression analyses, dorsolateral PFC activity was positively correlated with high levels of self-criticism (assessed via a self-report measure), suggesting greater error processing and behavioural inhibition in such individuals. Ventrolateral PFC activity was positively correlated with high self-reassurance. Our findings may have implications for the neural basis of a range of mood disorders that are characterised by a preoccupation with personal mistakes and failures, and a self-critical response to such events.
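The multiple regression analysis mentioned above can be sketched in a few lines. The snippet below regresses a hypothetical per-participant dorsolateral PFC estimate on a self-report self-criticism score; all data and variable names are invented for illustration, and this is not the study's actual pipeline:

```python
# Hypothetical sketch: regress per-participant dorsolateral PFC activity
# (e.g. a contrast estimate from the self-critical condition) on a
# self-report self-criticism score, with a nuisance covariate.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_participants = 20
self_criticism_score = rng.normal(50, 10, n_participants)   # questionnaire score
covariate_age = rng.normal(30, 8, n_participants)           # nuisance covariate
dlpfc_activity = 0.08 * self_criticism_score + rng.normal(0, 0.5, n_participants)

X = sm.add_constant(np.column_stack([self_criticism_score, covariate_age]))
model = sm.OLS(dlpfc_activity, X).fit()
print(model.summary())  # coefficient on the score recovers the simulated positive relation
```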
Abstract:
The concept of 'masculinity' has in recent years received increased attention within consumer research discourse, suggesting the potential of a 'crisis of masculinity', symptomatic of a growing feminisation, or 'queering', of visual imagery and consumption (e.g. Patterson & Elliott, 2002). Although this corpus of research has served to enrich the broader gender identity debate, it is arguably still relatively underdeveloped and therefore warrants further insight and elaboration. The aim of this paper is therefore to explore how masculinity is represented and interpreted by men, using the Dolce & Gabbana 2005 men's print advertising campaign. The rationale for using this particular campaign is that it is one of the most homoerotic, provocative and well-publicised campaigns to cross over from the 'gay' media to more mainstream UK men's magazines. Masculinity, and what it means to be 'masculine', manifests itself within particular ideological, moral, cultural and hegemonic discourses. Masculinity is not a homogeneous term which can simply be reduced, and ascribed, to those born 'male' rather than 'female'.
Abstract:
Neuroimaging (NI) technologies are having increasing impact in the study of complex cognitive and social processes. In this emerging field of social cognitive neuroscience, a central goal should be to increase the understanding of the interaction between the neurobiology of the individual and the environment in which humans develop and function. The study of sex/gender is often a focus for NI research, and may be motivated by a desire to better understand general developmental principles, mental health problems that show female-male disparities, and gendered differences in society. In order to ensure the maximum possible contribution of NI research to these goals, we draw attention to four key principles—overlap, mosaicism, contingency and entanglement—that have emerged from sex/gender research and that should inform NI research design, analysis and interpretation. We discuss the implications of these principles in the form of constructive guidelines and suggestions for researchers, editors, reviewers and science communicators.
Abstract:
The representation of serial position in sequences is an important topic in a variety of cognitive areas including the domains of language, memory, and motor control. In the neuropsychological literature, serial position data have often been normalized across different lengths, and an improved procedure for this has recently been reported by Machtynger and Shallice (2009). Effects of length and a U-shaped normalized serial position curve have been criteria for identifying working memory deficits. We present simulations and analyses to illustrate some of the issues that arise when relating serial position data to specific theories. We show that critical distinctions are often difficult to make based on normalized data. We suggest that curves for different lengths are best presented in their raw form and that binomial regression can be used to answer specific questions about the effects of length, position, and linear or nonlinear shape that are critical to making theoretical distinctions. © 2010 Psychology Press.
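To illustrate the binomial-regression approach the authors suggest, the sketch below fits a model with length, position and quadratic-position terms to invented recall counts; the data and the choice of terms are assumptions, not the authors' analysis:

```python
# Sketch: model the number of correct recalls at each serial position,
# for each list length, in raw (un-normalized) form. Counts are invented.
import numpy as np
import statsmodels.api as sm

length   = np.array([4, 4, 4, 4, 6, 6, 6, 6, 6, 6])
position = np.array([1, 2, 3, 4, 1, 2, 3, 4, 5, 6])
correct  = np.array([18, 15, 14, 17, 16, 12, 9, 8, 10, 14])
total    = np.full_like(correct, 20)

# linear and quadratic position terms; the position**2 coefficient tests
# for the U-shaped (nonlinear) serial position profile
X = sm.add_constant(np.column_stack([length, position, position**2]))
glm = sm.GLM(np.column_stack([correct, total - correct]), X,
             family=sm.families.Binomial()).fit()
print(glm.summary())
```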
Abstract:
This thesis presents a thorough and principled investigation into the application of artificial neural networks to the biological monitoring of freshwater. It contains original ideas on the classification and interpretation of benthic macroinvertebrates, and aims to demonstrate their superiority over the biotic systems currently used in the UK to report river water quality. The conceptual basis of a new biological classification system is described, and a full review and analysis of a number of river data sets is presented. The biological classification is compared to the common biotic systems using data from the Upper Trent catchment. These data contained 292 expertly classified invertebrate samples identified to mixed taxonomic levels. The neural network experimental work concentrates on the classification of the invertebrate samples into biological class, where only a subset of the sample is used to form the classification. Further experimentation is conducted into the identification of novel input samples, the classification of samples from different biotopes, and the use of prior information in the neural network models. The biological classification is shown to provide an intuitive interpretation of a graphical representation, generated without reference to the class labels, of the Upper Trent data. The selection of key indicator taxa is considered using three different approaches: one novel, one from information theory and one from classical statistics. Good indicators of quality class based on these analyses are found to be in good agreement with those chosen by a domain expert. The change in information associated with different levels of identification and enumeration of taxa is quantified. The feasibility of using neural network classifiers and predictors to develop numeric criteria for the biological assessment of sediment contamination in the Great Lakes is also investigated.
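A minimal sketch of the kind of neural network classification described, using synthetic taxon abundances and class labels; the thesis's actual data, architectures and training procedures are not reproduced here:

```python
# Illustrative only: classify invertebrate community samples into
# biological quality classes with a small feed-forward network.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_taxa, n_classes = 292, 30, 5
X = rng.poisson(3.0, size=(n_samples, n_taxa)).astype(float)  # taxon abundances
y = rng.integers(0, n_classes, size=n_samples)                # expert class labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000,
                    random_state=0).fit(X_train, y_train)
# near chance here, since the synthetic labels are random
print("held-out accuracy:", clf.score(X_test, y_test))
```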
Abstract:
In his important book on evolutionary theory, Darwin's Dangerous Idea, Daniel Dennett warns that Darwin's idea seeps through every area of human discourse like a "universal acid" (Dennett, 1995). Art and the aesthetic response cannot escape its influence. So my approach in this chapter is essentially naturalistic. Friedrich Nietzsche writes of observing the human comedy from afar, "like a cold angel...without anger, but without warmth" (Nietzsche, 1872, p. 164). Whether Nietzsche, of all people, could have done this is a matter of debate. But we know what he means. It describes a stance outside the human world, as if looking down on human folly from Mount Olympus. From this stance, humans, their art and their neurology are all part of the natural world, all part of the evolutionary process, the struggle for existence. The anthropologist David Dutton, in his contribution to the Routledge Companion to Aesthetics, says that all humans have an aesthetic sense (Dutton, 2001). It is a human universal. Biologists argue that such universals have an evolutionary basis. Furthermore, many have argued that not only humans but also animals, at least the higher mammals and birds, have an appreciation of the beautiful and the ugly (Eibl-Eibesfeldt, 1988). Charles Darwin indeed writes, "Birds appear to be the most aesthetic of all animals, excepting, of course, man, and they have nearly the same sense of the beautiful that we have" (1871, The Descent of Man and Selection in Relation to Sex, London: John Murray, vol. 2, xiii, 39). This again suggests that aesthetics has an evolutionary origin. In parenthesis, I should perhaps say that I am well aware of the criticism leveled at evolutionary psychology; I am well aware that it has been attacked as just so many "just-so" stories. This is neither the time nor the place to mount a defense, but simply to say that I believe a defense is eminently feasible. © 2006 Elsevier Inc. All rights reserved.
Abstract:
The present study examines the effect of the goodness of view on the minimal exposure time required to recognize depth-rotated objects. In a previous study, Verfaillie and Boutsen (1995) derived scales of goodness of view, using a new corpus of images of depth-rotated objects. In the present experiment, a subset of this corpus (five views of 56 objects) is used to determine the recognition exposure time for each view, by increasing exposure time across successive presentations until the object is recognized. The results indicate that, for two thirds of the objects, good views are recognized more frequently and have lower recognition exposure times than bad views.
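The ascending-exposure procedure can be sketched as a simple loop; `present_and_ask` below is a hypothetical stand-in for the actual experimental trial routine, and the duration values are illustrative:

```python
# Hedged sketch of the ascending-exposure procedure: exposure duration is
# increased across presentations until the observer names the object, and
# the first duration yielding recognition is recorded.
def recognition_exposure_time(present_and_ask, start_ms=14, step_ms=14, max_ms=500):
    """Return the shortest exposure (ms) at which the object is recognized."""
    exposure = start_ms
    while exposure <= max_ms:
        if present_and_ask(exposure):   # True if the object is correctly named
            return exposure
        exposure += step_ms
    return None  # never recognized within the allowed range

# toy usage: an "observer" who recognizes the view at 70 ms or longer
print(recognition_exposure_time(lambda ms: ms >= 70))  # -> 70
```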
Abstract:
In this paper we consider four alternative approaches to complexity control in feed-forward networks based respectively on architecture selection, regularization, early stopping, and training with noise. We show that there are close similarities between these approaches and we argue that, for most practical applications, the technique of regularization should be the method of choice.
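A small illustration of the recommended technique: a weight-decay (L2) penalty, here via scikit-learn's `alpha` parameter, controls the effective complexity of a feed-forward network. The architecture and penalty values below are arbitrary choices, not ones from the paper:

```python
# Regularization as complexity control: sweep the L2 penalty strength
# for a small feed-forward network on a noisy regression task.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + rng.normal(0, 0.1, 200)  # noisy target

# `alpha` is the L2 regularization coefficient; larger values constrain
# the network more strongly, trading training fit for smoothness
for alpha in (1e-5, 1e-2, 1.0):
    net = MLPRegressor(hidden_layer_sizes=(50,), alpha=alpha,
                       max_iter=5000, random_state=0).fit(X, y)
    print(f"alpha={alpha:g}  training R^2={net.score(X, y):.3f}")
```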
Abstract:
The research agenda for the field of international human resource management (IHRM) is clear. For a better understanding and to benefit substantially, management scholars must study IHRM in context (Jackson, S.E. and Schuler, R.S. 1995. Understanding human resource management in the context of organizations and their environment. Annual Review of Psychology, 46: 237–264; Geringer, J.M., Frayne, C.A. and Milliman, J.F. 2002. In search of 'best practices' in international human resource management: research design and methodology. Human Resource Management, forthcoming). IHRM should be studied within the context of changing economic and business conditions. The dynamics of both the local/regional and international/global business context in which the firm operates should be given serious consideration. Further, it could be beneficial to study IHRM within the context of the industry and the firm's strategy and its other functional areas and operations. In taking these perspectives, one needs to use multiple levels of analysis when studying IHRM: the external social, political, cultural and economic environment; the industry, the firm, the sub-unit, the group, and the individual. Research in contextual isolation is misleading: it fails to advance understanding in any significant way (Adler, N.J. and Ghadar, E. 1990. Strategic human resource management: a global perspective. Human Resource Management in International Comparison. Berlin: de Gruyter; Locke, R. and Thelen, K. 1995. Apples and oranges revisited: contextualized comparisons and the study of comparative labor politics. Politics & Society, 23, 337–367). In this paper, we attempt to review the existing state of academic work in IHRM and illustrate how it incorporates context, and how it might be expanded to do so.
Abstract:
Recently, Drăgulescu and Yakovenko proposed an analytical formula for computing the probability density function of stock log returns, based on the Heston model, which they tested empirically. Their research design inadvertently biased the fit of the data in favour of the Heston model, thus overstating their empirical results. Furthermore, Drăgulescu and Yakovenko did not perform any goodness-of-fit statistical tests. This study employs a research design that facilitates statistical tests of the goodness-of-fit of the Heston model to empirical returns. Robustness checks are also performed. In brief, the Heston model outperformed the Gaussian model only at high frequencies, and even then did not provide a statistically acceptable fit to the data. The Gaussian model performed (marginally) better at medium and low frequencies, at which the extra parameters of the Heston model have adverse impacts on the test statistics. © 2005 Taylor & Francis Group Ltd.
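As a sketch of the kind of goodness-of-fit testing the study calls for, the snippet below applies a Kolmogorov-Smirnov test to the Gaussian benchmark only; the Heston density is omitted for brevity, the returns are simulated, and with real data estimating the parameters from the same sample biases a plain KS test:

```python
# Goodness-of-fit sketch: do simulated heavy-tailed "log returns" pass a
# KS test against a fitted Gaussian? (They should not.)
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
log_returns = stats.t.rvs(df=4, size=5000, random_state=rng) * 0.01  # fat tails

mu, sigma = log_returns.mean(), log_returns.std(ddof=1)
ks_stat, p_value = stats.kstest(log_returns, "norm", args=(mu, sigma))
print(f"KS statistic = {ks_stat:.4f}, p = {p_value:.2e}")
# a tiny p-value rejects the Gaussian fit, as expected for heavy tails
```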
Abstract:
Objective: To assess and explain deviations from recommended practice in National Institute for Clinical Excellence (NICE) guidelines in relation to fetal heart monitoring. Design: Qualitative study. Setting: Large teaching hospital in the UK. Sample: Sixty-six hours of observation of 25 labours and interviews with 20 midwives of varying grades. Methods: Structured observations of labour and semistructured interviews with midwives. Interviews were undertaken using a prompt guide, audiotaped, and transcribed verbatim. Analysis was based on the constant comparative method, assisted by QSR N5 software. Main outcome measures: Deviations from recommended practice in relation to fetal monitoring and insights into why these occur. Results: All babies involved in the study were safely delivered, but 243 deviations from recommended practice in relation to NICE guidelines on fetal monitoring were identified, the majority (80%) occurring in relation to documentation. Other deviations from recommended practice included indications for use of electronic fetal heart monitoring and conduct of fetal heart monitoring. There was evidence of difficulties with the availability and maintenance of equipment, and of some deficits in staff knowledge and skill. Midwives reported differing orientations towards fetal monitoring, which were likely to have an impact on practice. The initiation, management and interpretation of fetal heart monitoring are complex and distributed across time, space, and professional boundaries, and practices in relation to fetal heart monitoring need to be understood within an organisational and social context. Conclusion: Some deviations from best practice guidelines may be rectified through straightforward interventions, including improved systems for managing equipment and training. Other deviations from recommended practice need to be understood as the outcomes of complex processes that are likely to defy easy resolution. © RCOG 2006.
Abstract:
An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50-100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases. For instance, approximately 3 times as much data is necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
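A back-of-envelope version of the sample-size question (not the paper's exact analytical expressions): the Fisher z-transform gives the number of independent samples needed to detect r = 0.05, and an assumed effective rate of independent samples per second stands in for the effect of autocorrelation:

```python
# Rough estimate of data needed to detect a weak correlation at alpha = 0.05
# (two-sided), via the Fisher z-transform: N ~ (z_crit / atanh(r))^2 + 3.
import numpy as np
from scipy import stats

r_true = 0.05
z_crit = stats.norm.ppf(0.975)               # two-sided 5% threshold
n_needed = (z_crit / np.arctanh(r_true))**2 + 3
print(f"independent samples needed: {n_needed:.0f}")

eff_rate_hz = 20.0                            # assumed independent samples/s
# roughly consistent with the 50-100 s figure quoted above
print(f"approx. recording time: {n_needed / eff_rate_hz:.0f} s")
```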
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage, arising from the absence of charge exchange complications, is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToFFASS), has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowing samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and providing the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
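Time-of-flight analysis of scattered projectiles rests on standard elastic binary-collision kinematics. The sketch below computes the energy retained by a scattered helium projectile and its flight time over a nominal drift length; the masses, beam energy, angle and length are assumed values, not the instrument's parameters:

```python
# Standard elastic binary-collision kinematics for projectile scattering,
# plus a non-relativistic time-of-flight over a drift length.
import numpy as np

AMU = 1.660539e-27      # kg per atomic mass unit
EV  = 1.602177e-19      # J per electronvolt

def scattered_energy(e0_ev, m1_amu, m2_amu, theta_deg):
    """Lab-frame energy of the scattered projectile after an elastic collision."""
    mu = m2_amu / m1_amu
    th = np.radians(theta_deg)
    k = ((np.cos(th) + np.sqrt(mu**2 - np.sin(th)**2)) / (1 + mu))**2
    return e0_ev * k

def flight_time(e_ev, m_amu, length_m):
    v = np.sqrt(2 * e_ev * EV / (m_amu * AMU))
    return length_m / v

e1 = scattered_energy(3000, 4.0, 63.5, 45.0)   # 3 keV He off Cu at 45 degrees
print(f"scattered energy: {e1:.0f} eV")
print(f"time of flight over 1 m: {flight_time(e1, 4.0, 1.0) * 1e6:.2f} us")
```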
Abstract:
Serial and parallel interconnection of photonic devices is integral to the construction of any all-optical data processing system. This thesis presents results from a series of experiments centring on the use of the nonlinear-optical loop mirror (NOLM) switch in architectures for the manipulation and generation of ultrashort pulses. Detailed analysis of soliton switching in a single NOLM and in a cascade of two NOLMs is performed, centring on the primary limitations to device operation, the effect of cascading on amplitude response, and the impact of switching on the characteristics of incident pulses. By using relatively long input pulses, device failure due to stimulated Raman generation is postponed, allowing multiple-peaked switching to be demonstrated for the first time. It is found that while cascading leads to a sharpening of the overall switching characteristic, pulse spectral and temporal integrity is not significantly degraded, and emerging pulses retain their essential soliton character. In addition, by including an asymmetrically placed in-fibre Bragg reflector as a wavelength-selective loss element in the basic NOLM configuration, both soliton self-switching and dual-wavelength control-pulse switching are spectrally quantised. Results are presented from a novel dual-wavelength laser configuration generating pulse trains with an ultra-low rms inter-pulse-stream timing jitter of 630 fs, enabling application in ultrafast switching environments at data rates as high as 130 Gbit/s. In addition, the fibre NOLM is included in architectures for all-optical memory, demonstrating storage and logical inversion of a 0.5 kByte random data sequence, and in ultrafast phase-locking of a gain-switched distributed feedback laser at 1.062 GHz, the fourteenth harmonic of the system baseband frequency. The stringent requirements for environmental robustness of these architectures highlight the primary weaknesses of the NOLM in its fibre form, and recommendations to overcome its inherent drawbacks are presented.
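The single-NOLM switching characteristic follows the textbook power-transfer formula for a loop mirror with an asymmetric coupler; the sketch below evaluates it for illustrative fibre parameters, which are assumptions rather than values from the thesis:

```python
# Textbook NOLM power-transfer characteristic: transmission versus input
# power for a loop with coupler splitting ratio alpha, fibre nonlinearity
# gamma (1/W/m) and loop length L:
#   T = 1 - 2*alpha*(1-alpha) * (1 + cos((1 - 2*alpha) * gamma * L * P))
import numpy as np

def nolm_transmission(power_w, alpha=0.4, gamma=2e-3, loop_len_m=500.0):
    """Transmitted power fraction of a nonlinear-optical loop mirror."""
    dphi = (1 - 2 * alpha) * gamma * loop_len_m * power_w
    return 1 - 2 * alpha * (1 - alpha) * (1 + np.cos(dphi))

# low power reflects (T small); near the pi phase difference it switches
for p in (0.0, 15.7, 31.4):   # watts of peak power
    print(f"P = {p:5.1f} W -> T = {nolm_transmission(p):.3f}")
```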