75 results for Collection and processing of information
in CentAUR: Central Archive University of Reading - UK
Abstract:
Good information and career guidance about what post-compulsory educational routes are available, and where these routes lead, is important in ensuring that young people make choices that are most appropriate to their needs and aspirations. Yet the Association of School and College Leaders (2011) expressed fears that future provision will be inadequate. This paper reports the findings from an online survey of 300 secondary school teachers, and follow-up telephone interviews with 18 of them in the South East of England, which explored teachers' experiences of delivering post-compulsory educational and career guidance and their knowledge and confidence in doing so. Results suggest that teachers lack confidence in delivering information, advice and guidance outside their own area of specialism and experience. In particular, teachers knew little about alternative local provision of post-16 education and lacked knowledge of non-traditional, vocational routes. This paper therefore raises important policy considerations with respect to supporting teachers' knowledge, ability and confidence in delivering information about future pathways and career guidance.
Abstract:
In vitro fermentation techniques (IVFT) have been widely used to evaluate the nutritive value of feeds for ruminants and, in the last decade, to assess the effect of different nutritional strategies on methane (CH4) production. However, many technical factors may influence the results obtained. The present review has been prepared by the 'Global Network' FACCE-JPI international research consortium to provide a critical evaluation of the main factors that need to be considered when designing, conducting and interpreting IVFT experiments that investigate nutritional strategies to mitigate CH4 emission from ruminants. Given the increasing and wide-scale use of IVFT, there is a need to critically review reports in the literature and establish what criteria are essential to the establishment and implementation of in vitro techniques. Key aspects considered include: i) donor animal species and number of animals used, ii) diet fed to donor animals, iii) collection and processing of rumen fluid as inoculum, iv) choice of substrate and incubation buffer, v) incubation procedures and CH4 measurements, vi) headspace gas composition and vii) comparability of in vitro and in vivo measurements. Based on an evaluation of experimental evidence, a set of technical recommendations is presented to harmonize IVFT for feed evaluation, assessment of rumen function and CH4 production.
Abstract:
Semiotics is the study of signs. The application of semiotics to information systems design is based on the notion that information systems are organizations within which agents deploy signs in the form of actions according to a set of norms. An analysis of the relationships among the agents, their actions and the norms gives a better specification of the system. Distributed multimedia systems (DMMS) can be viewed as systems consisting of many dynamic, self-controlled normative agents engaged in complex interaction and processing of multimedia information. This paper reports work on applying the semiotic approach to the design and modelling of DMMS, with emphasis on using semantic analysis under the semiotic framework. A semantic model of DMMS describing its various components and their ontological dependencies is presented; this model then serves as a design model and is implemented in a semantic database. The benefits of using the semantic database are discussed with reference to various design scenarios.
Processing reflexives in a second language: the timing of structural and discourse-level information
Abstract:
We report the results from two eye-movement monitoring experiments examining the processing of reflexive pronouns by proficient German-speaking learners of second language (L2) English. Our results show that the nonnative speakers initially tried to link English argument reflexives to a discourse-prominent but structurally inaccessible antecedent, thereby violating binding condition A. Our native speaker controls, in contrast, showed evidence of applying condition A immediately during processing. Together, our findings show that L2 learners’ initial focusing on a structurally inaccessible antecedent cannot be due to first language influence and is also independent of whether the inaccessible antecedent c-commands the reflexive. This suggests that unlike native speakers, nonnative speakers of English initially attempt to interpret reflexives through discourse-based coreference assignment rather than syntactic binding.
Abstract:
The neuropeptide substance P and its receptor NK1 have been implicated in emotion, anxiety and stress in preclinical studies. However, the role of NK1 receptors in human brain function is less clear, and there have been inconsistent reports of the value of NK1 receptor antagonists in the treatment of clinical depression. The present study therefore aimed to investigate the effects of NK1 antagonism on the neural processing of emotional information in healthy volunteers. Twenty-four participants were randomized to receive a single dose of aprepitant (125 mg) or placebo. Approximately 4 h later, neural responses during facial expression processing and an emotional counting Stroop word task were assessed using fMRI. Mood and subjective experience were also measured using self-report scales. As expected, a single dose of aprepitant did not affect mood or subjective state in the healthy volunteers. However, NK1 antagonism increased responses specifically during the presentation of happy facial expressions in both the rostral anterior cingulate and the right amygdala. In the emotional counting Stroop task the aprepitant group had increased activation in both the medial orbitofrontal cortex and the precuneus cortex to positive vs. neutral words. These results suggest consistent effects of NK1 antagonism on neural responses to positive affective information in two different paradigms. Such findings confirm animal studies which support a role for NK1 receptors in emotion. Such an approach may be useful in understanding the effects of novel drug treatments prior to full-scale clinical trials.
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data-intensive and computationally intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy.
The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals. To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost-effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders.
A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and to roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation plus the skills developed will launch significant advances in research, in business, in professional practice and in government with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
If soy isoflavones are to be effective in preventing or treating a range of diseases, they must be bioavailable, and thus the factors which may alter their bioavailability need to be elucidated. However, to date there is little information on whether the pharmacokinetic profile following ingestion of a defined dose is influenced by the food matrix in which the isoflavone is given or by the processing method used. Three different foods (cookies, chocolate bars and juice) were prepared, and their isoflavone contents were determined. We compared the urinary and serum concentrations of daidzein, genistein and equol following the consumption of the three foods, each of which contained 50 mg of isoflavones. After the technological processing of the different test foods, differences in aglycone levels were observed. The plasma levels of the isoflavone precursor daidzein were not altered by food matrix. Urinary daidzein recovery was similar for all three foods, with a total urinary output of 33-34% of the ingested dose. Peak genistein concentrations were attained in serum earlier following consumption of a liquid matrix rather than a solid matrix, although there was a lower total urinary recovery of genistein following ingestion of juice than of the two other foods.
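The urinary-recovery figure quoted above is simply the cumulative amount excreted in urine expressed as a percentage of the 50 mg dose. The sketch below is a hypothetical worked example of that calculation; the 16.8 mg urinary output is an assumed value chosen to fall in the reported 33-34% range, not a figure from the study:

```python
# Hypothetical worked example of the urinary-recovery calculation:
# percentage of an ingested isoflavone dose recovered in urine.
ingested_dose_mg = 50.0    # total isoflavones per test food (from the study)
urinary_output_mg = 16.8   # illustrative cumulative urinary daidzein (assumed)

recovery_pct = 100.0 * urinary_output_mg / ingested_dose_mg
print(f"{recovery_pct:.1f}% of ingested dose recovered")  # 33.6%
```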
Abstract:
Background: The computational grammatical complexity (CGC) hypothesis claims that children with G(rammatical)-specific language impairment (SLI) have a domain-specific deficit in the computational system affecting syntactic dependencies involving 'movement'. One type of such syntactic dependency is the filler-gap dependency. In contrast, the Generalized Slowing Hypothesis claims that SLI children have a domain-general deficit affecting processing speed and capacity. Aims: To test these contrasting accounts of SLI, we investigate the processing of syntactic (filler-gap) dependencies in wh-questions. Methods & Procedures: Fourteen G-SLI children aged 10;2-17;2, 14 age-matched and 17 vocabulary-matched controls were studied using the cross-modal picture-priming paradigm. Outcomes & Results: G-SLI children's processing speed was significantly slower than that of the age controls, but not of the younger vocabulary controls. The G-SLI children and vocabulary controls did not differ on memory span. However, the typically developing and G-SLI children showed qualitatively different processing patterns. The age and vocabulary controls showed priming at the gap, indicating that they process wh-questions through syntactic filler-gap dependencies. In contrast, G-SLI children showed priming only at the verb. Conclusions: The findings indicate that G-SLI children fail to establish reliably a syntactic filler-gap dependency and instead interpret wh-questions via lexical thematic information. These data challenge the Generalized Slowing Hypothesis account, but support the CGC hypothesis, according to which G-SLI children have a particular deficit in the computational system affecting syntactic dependencies involving 'movement'. As effective remediation often depends on aetiological insight, the discovery of the nature of the syntactic deficit, alongside a possible compensatory use of semantics to facilitate sentence processing, can be used to direct therapy.
However, the therapeutic strategy to be used, and whether similar strengths and weaknesses within the language system are found in other SLI subgroups, are empirical issues that warrant further research.
Abstract:
Individuals with social phobia display social information-processing biases, yet the aetiological significance of these biases is unclear. The responses of infants of mothers with social phobia (index infants) and of control infants to faces versus non-faces, variations in the intensity of emotional expressions, and gaze direction were assessed at 10 days, 10 and 16 weeks, and 10 months. Infant temperament and maternal behaviours were also assessed. Both groups showed a preference for faces over non-faces at 10 days and 10 weeks, and for full faces over profiles at 16 weeks; they also looked more at high- vs. low-intensity angry faces at 10 weeks, and at fearful faces at 10 months; however, index infants' initial orientation and overall looking to high-intensity fear faces were relatively less than controls' at 10 weeks. This was not explained by infant temperament or maternal behaviours. The findings suggest that offspring of mothers with social phobia show processing biases towards emotional expressions in infancy.
Abstract:
This paper addresses the nature and cause of Specific Language Impairment (SLI) by reviewing recent research in sentence processing of children with SLI compared to typically developing (TD) children and research in infant speech perception. These studies have revealed that children with SLI are sensitive to syntactic, semantic, and real-world information, but do not show sensitivity to grammatical morphemes with low phonetic saliency, and they show longer reaction times than age-matched controls. TD children from the age of 4 show trace reactivation, but some children with SLI fail to show this effect, which resembles the pattern of adults and TD children with low working memory. Finally, findings from the German Language Development (GLAD) Project have revealed that a group of children at risk for SLI had a history of an auditory delay and impaired processing of prosodic information in the first months of their life, which is not detectable later in life. Although this is a single project that needs to be replicated with a larger group of children, it provides preliminary support for accounts of SLI which make an explicit link between an early deficit in the processing of phonology and later language deficits, and the Computational Complexity Hypothesis that argues that the language deficit in children with SLI lies in difficulties integrating different types of information at the interfaces.
Abstract:
Background: Expression microarrays are increasingly used to obtain large-scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and to analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long-oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples. Results: We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intra-array variability is small (only around 2% of the mean log signal), while inter-array variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (6% of mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators. Conclusions: This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform.
The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and inter-array variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
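The multidimensional scaling step described above can be illustrated with a minimal sketch. The study itself implemented its pipeline in R; the snippet below is an assumed Python analogue of classical MDS applied to a toy samples-by-genes matrix, with group sizes and expression values invented purely for demonstration:

```python
import numpy as np

def classical_mds(X, n_components=2):
    """Classical multidimensional scaling of a samples-by-genes matrix.

    X: (samples x genes) array of normalised log2 expression values.
    Returns a (samples x n_components) low-dimensional embedding.
    """
    # Pairwise squared Euclidean distances between samples
    sq_norms = (X ** 2).sum(axis=1)
    D2 = sq_norms[:, None] + sq_norms[None, :] - 2 * X @ X.T
    # Double-centre the squared-distance matrix
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ D2 @ J
    # Symmetric eigendecomposition; keep the largest components
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Toy data: two sample groups with distinct expression profiles
rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 0.5, size=(5, 100))
group_b = rng.normal(2.0, 0.5, size=(5, 100))
coords = classical_mds(np.vstack([group_a, group_b]))
```

With data like this, the two groups separate cleanly along the first MDS axis, which is the kind of underlying structure the abstract describes recovering from the processed array data.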
Abstract:
This study evaluates computer-generated written explanations about drug prescriptions that are based on an analysis of both patient and doctor informational needs. Three experiments examine the effects of varying the type of information given about the possible side effects of the medication, and the order of information within the explanation. Experiment 1 investigated the effects of these two factors on people's ratings of how good they consider the explanations to be and of their perceived likelihood of taking the medication, as well as on their memory for the information in the explanation. Experiment 2 further examined the effects of varying information about side effects by separating out the contribution of number and severity of side effects. It was found that participants in this study did not “like” explanations that described severe side effects, and also judged that they would be less likely to take the medication if given such explanations. Experiment 3 therefore investigated whether information about severe side effects could be presented in such a way as to increase judgements of how good explanations are thought to be, as well as the perceived likelihood of adherence. The results showed some benefits of providing additional explanatory information.
Abstract:
It has been previously demonstrated that extensive activation in the dorsolateral temporal lobes is associated with masking a speech target with a speech masker, consistent with the hypothesis that competition for central auditory processes is an important factor in informational masking. Here, masking from speech and from two additional maskers derived from the original speech was investigated. One of these is spectrally rotated speech, which is unintelligible and has a similar (inverted) spectrotemporal profile to speech. The authors also controlled for the possibility of "glimpsing" of the target signal during modulated masking sounds by using speech-modulated noise as a masker in a baseline condition. Functional imaging results reveal that masking speech with speech leads to bilateral superior temporal gyrus (STG) activation relative to a speech-in-noise baseline, while masking speech with spectrally rotated speech leads solely to right STG activation relative to the baseline. This result is discussed in terms of hemispheric asymmetries for speech perception, and interpreted as showing that masking effects can arise through two parallel neural systems, in the left and right temporal lobes. This has implications for the competition for resources caused by speech and rotated-speech maskers, and may illuminate some of the mechanisms involved in informational masking.