23 results for discoveries
in Aston University Research Archive
Abstract:
The application of cognitive neuroscientific techniques to understanding social behaviour has resulted in many discoveries. Yet advocates of the ‘social cognitive neuroscience’ approach maintain that it suffers from a number of limitations. The most notable of these is its distance from any form of real-world applicability. One solution to this limitation is ‘Organisational Cognitive Neuroscience’ – the study of the cognitive neuroscience of human behaviour in, and in response to, organisations. Given that all of us will spend most of our lives in some sort of work-related organisation, organisational cognitive neuroscience allows us to examine the cognitive underpinnings of social behaviour that occurs in what may be our most natural ecology. Here we provide a brief overview of this approach, a definition, and some possible questions that the new approach would be best suited to address.
Abstract:
On July 17, 1990, President George Bush issued “Proclamation #6158”, which boldly declared that the following ten years would be called the “Decade of the Brain” (Bush, 1990). Accordingly, the research mandates of all US federal biomedical institutions were redirected towards the study of the brain in general and cognitive neuroscience specifically. In 2008, one of the greatest legacies of this “Decade of the Brain” is the impressive array of techniques that can be used to study cortical activity. We now stand at a juncture where cognitive function can be mapped in the time, space and frequency domains, as and when such activity occurs. These advanced techniques have led to discoveries in many fields of research and clinical science, including psychology and psychiatry. Unfortunately, neuroscientific techniques have yet to be enthusiastically adopted by the social sciences. Market researchers, as specialized social scientists, have an unparalleled opportunity to adopt cognitive neuroscientific techniques, significantly redefine the field and possibly even cause substantial dislocations in business models. Following from this is a significant opportunity for more commercially oriented researchers to employ such techniques in their own offerings. This report examines the feasibility of these techniques.
Abstract:
Dispersion-managed solitons have been found to have some remarkable properties which indicate an outstanding opportunity for exploitation in transmission systems. This paper will review and interpret these discoveries and discuss the potential for WDM of these solitons, both for long-distance systems and for the upgrade of the installed fibre base.
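For orientation, dispersion-managed propagation is commonly modelled (in normalized units; this is the standard textbook form rather than any notation taken from the paper itself) by a nonlinear Schrödinger equation whose group-velocity dispersion varies periodically along the fibre:

i \frac{\partial u}{\partial z} + \frac{D(z)}{2}\,\frac{\partial^2 u}{\partial t^2} + |u|^2 u = 0, \qquad D(z + L_{\mathrm{map}}) = D(z),

where u(z, t) is the pulse envelope, z the propagation distance, t the retarded time, and D(z) alternates between spans of anomalous and normal dispersion over one map period L_map. The dispersion-managed soliton is the periodically breathing pulse that reproduces itself after each map period at small path-averaged dispersion.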
Abstract:
In this contribution I look at three episodes in the history of neurophysiology that bring out the complex relationship between seeing and believing. I start with Vesalius in the mid-sixteenth century, who writes that he can in no way see any cavity in nerves, even in the optic nerves. He thus questions the age-old theory (dating back to the Alexandrians in the third century BC) but, because of the overarching psychophysiology of his time, does not press his case. This conflict between observation and theory persisted for a quarter of a millennium until finally resolved at the beginning of the nineteenth century by the discoveries of Galvani and Volta. The second case is provided by the early history of retinal synaptology. Schultze in 1866 had represented rod spherules and bipolar dendrites in the outer plexiform layer as being separated by a (synaptic) gap, yet in his written account, because of his theoretical commitments, held them to be continuous. Cajal later, in 1892, criticized Schultze for this pusillanimity, but his own figure in La Cellule is by no means clear. It was only with the advent of electron microscopy in the mid-twentieth century that the true complexity of the junction was revealed and it was shown that both investigators were partially right. My final example comes from the Hodgkin-Huxley biophysics of the 1950s. Their theory of the action potential depended on the existence of unseen ion pores with quite complex biophysical characteristics. These were not seen until the Nobel-Prize-winning X-ray diffraction analyses of the early twenty-first century. Seeing, even at several removes, then confirmed Hodgkin and Huxley’s belief. The relation between seeing and believing is by no means straightforward.
Abstract:
This thesis describes the development of a complete data visualisation system for large tabular databases, such as those commonly found in a business environment. A state-of-the-art 'cyberspace cell' data visualisation technique was investigated and a powerful visualisation system using it was implemented. Although allowing databases to be explored and conclusions drawn, it had several drawbacks, the majority of which were due to the three-dimensional nature of the visualisation. A novel two-dimensional generic visualisation system, known as MADEN, was then developed and implemented, based upon a 2-D matrix of 'density plots'. MADEN allows an entire high-dimensional database to be visualised in one window, while permitting close analysis in 'enlargement' windows. Selections of records can be made and examined, and dependencies between fields can be investigated in detail. MADEN was used as a tool for investigating and assessing many data processing algorithms, firstly data-reducing (clustering) methods, then dimensionality-reducing techniques. These included a new 'directed' form of principal components analysis, several novel applications of artificial neural networks, and discriminant analysis techniques which illustrated how groups within a database can be separated. To illustrate the power of the system, MADEN was used to explore customer databases from two financial institutions, resulting in a number of discoveries which would be of interest to a marketing manager. Finally, the database of results from the 1992 UK Research Assessment Exercise was analysed. Using MADEN allowed both universities and disciplines to be graphically compared, and supplied some startling revelations, including empirical evidence of the 'Oxbridge factor'.
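As a purely illustrative sketch of the general idea behind a matrix of 'density plots' (this is not a reconstruction of MADEN itself, whose implementation is not described in the abstract, and the field names below are invented), a short Python fragment can render every pair of fields in a tabular dataset as a 2-D histogram, with single-field histograms on the diagonal:

# Minimal sketch of a "matrix of density plots" for a tabular dataset.
# Illustrative only; not the MADEN system described above.
import numpy as np
import matplotlib.pyplot as plt

def density_plot_matrix(data, field_names, bins=40):
    """Render every pair of fields as a 2-D histogram (density plot)."""
    n = data.shape[1]
    fig, axes = plt.subplots(n, n, figsize=(2.5 * n, 2.5 * n))
    for i in range(n):
        for j in range(n):
            ax = axes[i, j]
            if i == j:
                ax.hist(data[:, i], bins=bins)                # diagonal: single-field distribution
            else:
                ax.hist2d(data[:, j], data[:, i], bins=bins)  # off-diagonal: joint density
            if i == n - 1:
                ax.set_xlabel(field_names[j])
            if j == 0:
                ax.set_ylabel(field_names[i])
    fig.tight_layout()
    return fig

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for a customer database: income, age, balance.
    demo = rng.multivariate_normal(
        mean=[30000, 45, 2000],
        cov=[[4e7, 1e4, 3e5], [1e4, 100, 50], [3e5, 50, 1e6]],
        size=5000,
    )
    density_plot_matrix(demo, ["income", "age", "balance"])
    plt.show()

Selections and 'enlargement' views of the kind described above would then amount to filtering the rows and re-plotting a single cell at a finer bin resolution.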
Abstract:
Cognitive linguistics scholars argue that metaphor is fundamentally a conceptual process of mapping one domain of experience onto another domain. The study of metaphor in the context of Translation Studies has not, unfortunately, kept pace with the discoveries about the nature and role of metaphor in the cognitive sciences. This study aims primarily to fill part of this knowledge gap. Specifically, the thesis is an attempt to explore some implications of the conceptual theory of metaphor for translation. Because the study of metaphor in translation is also based on views about the nature of translation, the thesis first presents a general overview of the discipline of Translation Studies, describing the major models of translation. The study (in Chapter Two) then discusses the major traditional theories of metaphor (comparison, substitution and interaction theories) and shows how the ideas of those theories were adopted in specific translation studies of metaphor. After that, the study presents a detailed account of the conceptual theory of metaphor and some hypothetical implications for the study of metaphor in translation from the perspective of cognitive linguistics. The data and methodology are presented in Chapter Four. A novel classification of conceptual metaphor is presented which distinguishes between different source domains of conceptual metaphors: physical, human-life and intertextual. It is suggested that each source domain places different demands on translators. The major sources of the data for this study are (1) the translations done by the Foreign Broadcast Information Service (FBIS), a translation service of the Central Intelligence Agency (CIA) in the United States of America, of a number of speeches by the Iraqi president Saddam Hussein during the Gulf Crisis (1990-1991), and (2) official (governmental) Omani translations of National Day speeches of Sultan Qaboos bin Said of Oman.
Abstract:
Following Andersen's (1986, 1991) study of untutored anglophone learners of Spanish, aspectual features have been at the centre of hypotheses on the development of past verbal morphology in language acquisition. The Primacy of Aspect Hypothesis (PAH) claims that the association of any verb category (Aktionsart) with any aspect (perfective or imperfective) constitutes the endpoint of acquisition. However, its predictions rely on the observation of a limited number of untutored learners at the early stages of their acquisition, and have yet to be confirmed in other settings. The aim of the present thesis is to evaluate the explanatory power of the PAH in respect of the acquisition of French past tenses, an aspect of the language which constitutes a serious stumbling block for foreign learners, even those at the highest levels of proficiency (Coppieters 1987). The present research applies the PAH to the production of 61 anglophone 'advanced learners' (as defined in Bartning 1997) in a tutored environment. In so doing, it tests concurrent explanations, including the influence of the input, the influence of chunking, and the hypothesis of cyclic development. Finally, it discusses the cotextual and contextual factors that still provoke what Andersen (1991) terms "non-native glitches" at the final stage, as predicted by the PAH. The first part of the thesis provides the theoretical background to the corpus analysis. It opens with a diachronic presentation of the French past tense system, focusing on present areas of competition and developments that emphasize the complexity of the system to be acquired. The concepts of time, grammatical aspect and lexical aspect (Aktionsart) are introduced and discussed in the second chapter, and a distinctive formal representation of the French past tenses is offered in the third chapter. The second part of the thesis is devoted to a corpus analysis. The data gathering procedures and the choice of tasks (oral and written film narratives based on Modern Times, cloze tests and acceptability judgement tests) are described and justified in the research methodology chapter. The research design was shaped by previous studies and consequently allows comparison with these. The second chapter is devoted to the analysis of the narratives and the third to the grammatical tasks. This section closes with a summary of discoveries and a comparison with previous results. The conclusion addresses the initial research questions in the light of both theory and practice. It shows that the PAH fails to account for the complex phenomenon of past tense development in the acquisitional settings under study, as it adopts a local (the verb phrase) and linear (steady progression towards native usage) approach. It is thus suggested that past tense acquisition rather follows a pendular development as learners reformulate their learning hypotheses and become increasingly able to shift from local to global cues and so to integrate the influence of cotext and context in their tense choice.
Abstract:
The application of cognitive neuroscientific techniques to understanding social behaviour has resulted in many discoveries. Yet advocates of the ‘social cognitive neuroscience’ approach maintain that it suffers from a number of limitations. The most notable of these is its distance from any form of real-world applicability. One solution to this limitation is ‘Organisational Cognitive Neuroscience’ – the study of the cognitive neuroscience of human behaviour in, and in response to, organisations, which are arguably our most natural contemporary ecology. Here we provide a brief overview of this approach, a definition, and some examples of questions that the approach would be best suited to address. Furthermore, we consider neuromarketing as a subfield of organisational cognitive neuroscience, arguing that such a relationship clarifies the role of scholarly marketing research in the area, and provides a welcome emphasis on theoretical rigour.
Abstract:
The field of free radical biology and medicine continues to move at a tremendous pace, with a constant flow of ground-breaking discoveries. The following collection of papers in this issue of Biochemical Society Transactions highlights several key areas of topical interest, including the crucial role of validated measurements of radicals and reactive oxygen species in underpinning nearly all research in the field, the important advances being made as a result of the overlap of free radical research with the reinvigorated field of lipidomics (driven in part by innovations in MS-based analysis), the acceleration of new insights into the role of oxidative protein modifications (particularly to cysteine residues) in modulating cell signalling, and the effects of free radicals on the functions of mitochondria, extracellular matrix and the immune system. In the present article, we provide a brief overview of these research areas, but, throughout this discussion, it must be remembered that it is the availability of reliable analytical methodologies that will be a key factor in facilitating continuing developments in this exciting research area.
Abstract:
During the last decade, biomedicine has witnessed a tremendous development. Large amounts of experimental and computational biomedical data have been generated along with new discoveries, which are accompanied by an exponential increase in the number of biomedical publications describing these discoveries. In the meantime, there has been great interest within scientific communities in text mining tools to find the knowledge, such as protein-protein interactions, that is most relevant and useful for specific analysis tasks. This paper provides an outline of the various information extraction methods in the biomedical domain, especially for the discovery of protein-protein interactions. It surveys the methodologies involved in analyzing and processing plain text, categorizes current work in biomedical information extraction, and provides examples of these methods. Challenges in the field are also presented and possible solutions are discussed.
Abstract:
To date, more than 16 million citations of published articles in the biomedical domain are available in the MEDLINE database. These articles describe the new discoveries which accompany a tremendous development in biomedicine during the last decade. It is crucial for biomedical researchers to retrieve and mine specific knowledge from the huge quantity of published articles with high efficiency. Researchers have been engaged in the development of text mining tools to find the knowledge, such as protein-protein interactions, that is most relevant and useful for specific analysis tasks. This chapter provides a road map to the various information extraction methods in the biomedical domain, such as protein name recognition and the discovery of protein-protein interactions. Disciplines involved in analyzing and processing unstructured text are summarized. Current work in biomedical information extraction is categorized. Challenges in the field are also presented and possible solutions are discussed.
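To make the task concrete, a deliberately naive co-occurrence baseline for protein-protein interaction extraction is sketched below in Python; the protein dictionary and interaction verbs are invented for illustration, and the approaches surveyed in the chapter (named-entity recognition, parsing, machine learning) are considerably more sophisticated:

# Naive co-occurrence baseline for protein-protein interaction (PPI) extraction.
# Illustrative only: the protein lexicon and verb list below are hypothetical,
# and real systems use trained named-entity recognizers and syntactic analysis.
import itertools
import re

PROTEIN_LEXICON = {"p53", "MDM2", "BRCA1", "RAD51"}
INTERACTION_VERBS = {"binds", "phosphorylates", "inhibits", "activates", "interacts"}

def split_sentences(text):
    """Very rough sentence splitter on sentence-final punctuation."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def extract_ppi(text):
    """Return (protein, protein, sentence) triples for sentences mentioning
    at least two known proteins and at least one interaction verb."""
    results = []
    for sentence in split_sentences(text):
        tokens = set(re.findall(r"[A-Za-z0-9]+", sentence))
        proteins = sorted(tokens & PROTEIN_LEXICON)
        if len(proteins) >= 2 and tokens & INTERACTION_VERBS:
            for a, b in itertools.combinations(proteins, 2):
                results.append((a, b, sentence))
    return results

if __name__ == "__main__":
    abstract = ("MDM2 binds p53 and inhibits its transcriptional activity. "
                "BRCA1 localizes to nuclear foci together with RAD51.")
    for a, b, s in extract_ppi(abstract):
        print(f"{a} / {b}: {s}")

Even this toy pipeline separates the two subtasks discussed in the chapter: recognizing protein names and deciding whether a sentence actually asserts an interaction between them.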
Abstract:
Genomics, proteomics and metabolomics are three areas that are routinely applied throughout the drug-development process as well as after a product enters the market. This review discusses all three 'omics, reporting on the key applications, techniques, recent advances and expectations of each. Genomics, mainly through the use of novel and next-generation sequencing techniques, has advanced areas of drug discovery and development through the comparative assessment of normal and diseased-state tissues, transcription and/or expression profiling, side-effect profiling, pharmacogenomics and the identification of biomarkers. Proteomics, through techniques including isotope-coded affinity tags, stable isotopic labeling by amino acids in cell culture, isobaric tags for relative and absolute quantification, multidimensional protein identification technology, activity-based probes, protein/peptide arrays, phage displays and two-hybrid systems, is utilized in multiple areas throughout the drug-development pipeline, including target and lead identification, compound optimization, the clinical trials process and after-market analysis. Metabolomics, although the most recent and least developed of the three 'omics considered in this review, provides a significant contribution to drug development through systems biology approaches. Already implemented to some degree in the drug-discovery industry and used in applications spanning target identification through to toxicological analysis, metabolic network understanding is essential in generating future discoveries.
Abstract:
What should no well-stocked rock and pop record collection be without? Uwe Schütte answers this pressing question. He has since revised his successful list of recommendations and expanded the volume with five classics as well as new discoveries, each in its own article: from Abba via Burial, Death Cab For Cutie and Tocotronic through to Wilco. Indexes of the records and groups make it easy to find one's way around this volume, which is indispensable for any music fan.
Abstract:
Here we demonstrate the first application of time-resolved synchrotron X-ray absorption spectroscopy to simultaneously follow dynamic nanoparticle surface restructuring and the evolution of surface and gas-phase products during an organic reaction. Surface palladium oxide, and not the metal, is identified as the catalytic species responsible for the selective oxidation (selox) of crotyl alcohol to crotonaldehyde. Elevated reaction temperatures facilitate reversible nanoparticle redox processes, and concomitant catalytic selectivity loss, in response to reaction conditions. These discoveries highlight the importance of stabilizing surface palladium oxide and minimizing catalyst reducibility in order to achieve high selox yields, and have important implications for the design of future liquid- and vapor-phase selox catalysts and for the thermochemical behavior of Pd nanostructures in general.