66 results for Cognitive Linguistics. Situation Models. Mental Simulation. Frames and Schemes
in CentAUR: Central Archive University of Reading - UK
Abstract:
Acute doses of Ginkgo biloba have been shown to improve attention and memory in young, healthy participants, but there has been a lack of investigation into possible effects on executive function. In addition, only one study has investigated the effects of chronic treatment in young volunteers. This study was conducted to compare the effects of ginkgo after acute and chronic treatment on tests of attention, memory and executive function in healthy university students. Using a placebo-controlled double-blind design, in experiment 1, 52 students were randomly allocated to receive a single dose of ginkgo (120 mg, n=26) or placebo (n=26), and were tested 4 h later. In experiment 2, 40 students were randomly allocated to receive ginkgo (120 mg/day; n=20) or placebo (n=20) for a 6-week period and were tested at baseline and after 6 weeks of treatment. In both experiments, participants underwent tests of sustained attention, episodic and working memory, mental flexibility and planning, and completed mood rating scales. The acute dose of ginkgo significantly improved performance on the sustained-attention task and pattern-recognition memory task; however, there were no effects on working memory, planning, mental flexibility or mood. After 6 weeks of treatment, there were no significant effects of ginkgo on mood or any of the cognitive tests. In line with the literature, after acute administration ginkgo improved performance in tests of attention and memory. However, there were no effects after 6 weeks, suggesting that tolerance develops to the effects in young, healthy participants.
Abstract:
The capability of a feature model of immediate memory (Nairne, 1990; Neath, 2000) to predict and account for a relationship between absolute and proportion scoring of immediate serial recall when memory load is varied (the list-length effect, LLE) is examined. The model correctly predicts the novel finding of an LLE in immediate serial order memory similar to that observed with free recall and previously assumed to be attributable to the long-term memory component of that procedure (Glanzer, 1972). The usefulness of formal models as predictive tools and the continuity between short-term serial order and longer term item memory are considered.
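The scoring distinction at the centre of this abstract can be made concrete with a small sketch. This is illustrative only, not the feature model itself: under strict serial scoring an item counts as correct only if recalled in its presented position, and absolute and proportional scores can then dissociate as list length grows (the list-length effect).

```python
# Illustrative sketch, not the feature model of Nairne (1990) / Neath (2000):
# strict serial scoring of immediate serial recall, returning both the
# absolute and the proportional score for a trial.
def score_serial_recall(presented, recalled):
    """Return (absolute, proportional) strict serial-recall scores."""
    absolute = sum(p == r for p, r in zip(presented, recalled))
    return absolute, absolute / len(presented)

# Hypothetical trials: the longer list yields more items correct in
# absolute terms (4 vs. 3) but a smaller proportion of the list (4/6 vs. 3/3).
short_abs, short_prop = score_serial_recall(
    ["cat", "dog", "sun"], ["cat", "dog", "sun"])
long_abs, long_prop = score_serial_recall(
    ["cat", "dog", "sun", "pen", "map", "cup"],
    ["cat", "dog", "sun", "map", "pen", "cup"])
```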
Abstract:
Sea-ice concentrations in the Laptev Sea simulated by the coupled North Atlantic-Arctic Ocean-Sea-Ice Model and Finite Element Sea-Ice Ocean Model are evaluated using sea-ice concentrations from Advanced Microwave Scanning Radiometer-Earth Observing System satellite data and a polynya classification method for winter 2007/08. While developed to simulate large-scale sea-ice conditions, both models are analysed here in terms of polynya simulation. The main modification of both models in this study is the implementation of a landfast-ice mask. Simulated sea-ice fields from different model runs are compared with emphasis placed on the impact of this prescribed landfast-ice mask. We demonstrate that sea-ice models are not able to simulate flaw polynyas realistically when used without fast-ice description. Our investigations indicate that without landfast ice and with coarse horizontal resolution the models overestimate the fraction of open water in the polynya. This is not because a realistic polynya appears but due to a larger-scale reduction of ice concentrations and smoothed ice-concentration fields. After implementation of a landfast-ice mask, the polynya location is realistically simulated but the total open-water area is still overestimated in most cases. The study shows that the fast-ice parameterization is essential for model improvements. However, further improvements are necessary in order to progress from the simulation of large-scale features in the Arctic towards a more detailed simulation of smaller-scale features (here polynyas) in an Arctic shelf sea.
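The core idea of a prescribed landfast-ice mask can be sketched in a few lines. This is a hypothetical, minimal illustration of the mechanism, not the implementation in either model: within masked grid cells the ice drift is held at zero, so divergent motion, and hence a flaw polynya, can only occur seaward of the fast-ice edge.

```python
import numpy as np

# Minimal sketch of a prescribed landfast-ice mask (illustrative, not the
# NAOSIM/FESOM implementation): ice velocities are zeroed inside the
# landfast zone so that openings form only at the fast-ice edge.
def apply_landfast_mask(u_ice, v_ice, fast_ice_mask):
    """Zero the ice drift wherever fast_ice_mask is True."""
    u = np.where(fast_ice_mask, 0.0, u_ice)
    v = np.where(fast_ice_mask, 0.0, v_ice)
    return u, v

u = np.full((4, 4), 0.2)          # uniform offshore drift (m/s), hypothetical
v = np.zeros((4, 4))
mask = np.zeros((4, 4), dtype=bool)
mask[:, :2] = True                # the two nearshore columns are landfast
u_new, v_new = apply_landfast_mask(u, v, mask)
```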
Abstract:
The aim of this study was to assess and improve the accuracy of biotransfer models for the organic pollutants (PCBs, PCDD/Fs, PBDEs, PFCAs, and pesticides) into cow’s milk and beef used in human exposure assessment. Metabolic rate in cattle is known to be a key parameter for this biotransfer; however, few experimental data and no simulation methods are currently available. In this research, metabolic rate was estimated using existing QSAR biodegradation models of microorganisms (BioWIN) and fish (EPI-HL and IFS-HL). This simulated metabolic rate was then incorporated into the mechanistic cattle biotransfer models (RAIDAR, ACC-HUMAN, OMEGA, and CKow). The goodness-of-fit tests showed that the RAIDAR, ACC-HUMAN and OMEGA model performances were significantly improved using either of the QSARs when comparing the new model outputs to observed data. The CKow model is the only one that separates the processes in the gut and liver. This model showed the lowest residual error of all the models tested when the BioWIN model was used to represent the ruminant metabolic process in the gut and the two fish QSARs were used to represent the metabolic process in the liver. Our testing included EUSES and CalTOX, which are KOW-regression models that are widely used in regulatory assessment. New regressions based on the simulated rate of the two metabolic processes are also proposed as an alternative to KOW-regression models for a screening risk assessment. The modified CKow model is more physiologically realistic, but has equivalent usability to existing KOW-regression models for estimating cattle biotransfer of organic pollutants.
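The way separate gut and liver metabolism estimates combine can be illustrated with first-order kinetics. This is a hedged sketch under simple assumptions, not the published CKow model: a QSAR-style half-life for each compartment is converted to a rate constant, and parallel first-order elimination pathways add.

```python
import math

# Illustrative first-order kinetics only (not the CKow model itself):
# half-lives such as a BioWIN-style estimate for the gut microbiota and a
# fish-QSAR estimate for the liver are converted to rate constants, and
# rates for parallel elimination pathways simply add.
def rate_from_half_life(t_half_days):
    """First-order rate constant (1/day) from a half-life in days."""
    return math.log(2) / t_half_days

def total_loss_rate(t_half_gut_days, t_half_liver_days, k_other=0.0):
    """Combined first-order loss rate for parallel metabolic pathways."""
    return (rate_from_half_life(t_half_gut_days)
            + rate_from_half_life(t_half_liver_days)
            + k_other)
```

For example, hypothetical half-lives of 7 days (gut) and 14 days (liver) give a combined loss rate of ln(2)/7 + ln(2)/14 per day; a faster combined rate lowers the predicted biotransfer factor into milk or beef.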
Abstract:
Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5m or less are possible, with a height accuracy of 0.15m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art will be the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data to a ground DTM and separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc. Most vegetation removal software ignores short vegetation less than say 1m high. We have attempted to extend vegetation height measurement to short vegetation using local height texture. Typically most of a floodplain may be covered in such vegetation. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying. This obviates the need to calibrate a global floodplain friction coefficient. It is not yet clear whether the method is useful, but it merits further testing. The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data.
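The spatially varying friction idea above can be sketched as a simple height-to-roughness lookup. The height classes and Manning's n values below are illustrative assumptions, not calibrated values from the text.

```python
import numpy as np

# Hedged sketch of spatially varying floodplain friction: a LiDAR-derived
# vegetation-height grid (m) is mapped to a Manning's n grid. Thresholds
# and n values are hypothetical, for illustration only.
def friction_from_vegetation(veg_height):
    """Assign a Manning's n to each cell from its local vegetation height."""
    n = np.full(veg_height.shape, 0.03)   # bare ground / roads
    n[veg_height >= 0.1] = 0.05           # short vegetation (height texture)
    n[veg_height >= 1.0] = 0.10           # crops and shrubs
    n[veg_height >= 5.0] = 0.15           # hedges and trees
    return n
```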
We are attempting to use digital map data (Mastermap structured topography data) to help to distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem of how best to merge historic river cross-section data with a LiDAR DTM will also be considered. LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that e.g. hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful to allow a high-resolution FE model to act as a benchmark for a more practical lower resolution model. A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size, e.g. for a 5m-wide embankment within a raster grid model with 15m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment. But how could a 5m-wide ditch be represented? Again, this redundancy has been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
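The sub-grid embankment/ditch problem above can be sketched with block aggregation of a fine DTM onto a coarse model grid. This is an illustrative toy, assuming an integer resolution ratio: taking the per-cell maximum preserves a narrow embankment as a flow barrier, but the same rule cannot represent a narrow ditch, for which the minimum would be needed instead.

```python
import numpy as np

# Toy illustration of the sub-grid problem: aggregate a fine LiDAR DTM
# onto a coarse raster model grid by an integer factor, using either the
# block maximum (keeps embankments) or block minimum (keeps ditches).
def aggregate(dtm, factor, how="max"):
    """Block-aggregate a 2-D array whose sides divide evenly by factor."""
    ny, nx = dtm.shape
    blocks = dtm.reshape(ny // factor, factor, nx // factor, factor)
    reducer = np.max if how == "max" else np.min
    return reducer(blocks, axis=(1, 3))

dtm = np.zeros((6, 6))
dtm[:, 2] = 2.0                          # a one-cell-wide, 2 m embankment
coarse_max = aggregate(dtm, 3, "max")    # embankment survives as a barrier
coarse_min = aggregate(dtm, 3, "min")    # embankment vanishes entirely
```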
Abstract:
There is an increasing interest in modelling electromagnetic methods of NDT, particularly eddy currents. A collaboration within the International Institute of Welding led to a survey intended to explain to non-mathematicians the present scope of modelling. The present review commences with this survey and then points out some of the developments and some of the outstanding problems in transferring modelling into industry.
Abstract:
Demonstration models of the costs of BVD and Johne's disease in dairy and beef cattle and the costs and benefits of control have been developed. An example applied to BVD in dairy cattle is presented. Downloadable versions of the models, together with supporting material on how to use them, are available to veterinarians from a dedicated website.
Abstract:
Accelerated failure time models with a shared random component are described, and are used to evaluate the effect of explanatory factors and different transplant centres on survival times following kidney transplantation. Different combinations of the distribution of the random effects and baseline hazard function are considered and the fit of such models to the transplant data is critically assessed. A mixture model that combines short- and long-term components of a hazard function is then developed, which provides a more flexible model for the hazard function. The model can incorporate different explanatory variables and random effects in each component. The model is straightforward to fit using standard statistical software, and is shown to be a good fit to the transplant data. Copyright © 2004 John Wiley & Sons, Ltd.
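The shared-random-component idea can be made concrete with a small simulation. This is a hypothetical sketch, not the models fitted in the paper: a log-normal accelerated failure time model in which all patients at a transplant centre share one normally distributed random effect, with illustrative parameter values.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sketch of an AFT model with a shared centre effect:
#   log T = b0 + b1*x + u[centre] + sigma*eps,
# where u[centre] is a normal random effect common to all patients at one
# transplant centre. All parameter values here are illustrative.
def simulate_aft(n_centres=10, per_centre=50, b0=2.0, b1=-0.5,
                 sigma=0.5, tau=0.3):
    u = rng.normal(0.0, tau, n_centres)            # shared centre effects
    centre = np.repeat(np.arange(n_centres), per_centre)
    x = rng.binomial(1, 0.5, centre.size)          # binary covariate
    eps = rng.normal(0.0, 1.0, centre.size)
    log_t = b0 + b1 * x + u[centre] + sigma * eps  # accelerated failure times
    return np.exp(log_t), x, centre

times, x, centre = simulate_aft()
```

A negative b1 accelerates failure for x = 1 (shorter survival), while the centre effect u shifts all survival times at a centre by a common multiplicative factor, which is what the shared random component captures.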
Abstract:
The past decade has seen considerable growth in the evidence base of cognitive behavioural therapy for psychosis. Consistent reports of moderate effect sizes have led to such interventions being recommended as part of routine clinical practice. Most of this evidence is based on a generic form of CBT for psychosis applied to a heterogeneous group. An increase in the effectiveness of cognitive behavioural interventions may require new protocols. Such therapeutic developments should be based on the theoretical understanding of the psychological processes associated with specific forms of psychotic presentation. The current evidence base of CBT for psychosis is reviewed, and barriers that have held back the development of this research are discussed.
Abstract:
This paper reviews recent theoretical, conceptual and practice developments in cognitive-behaviour therapy (CBT) for anxiety disorders. The empirical status of CBT for anxiety disorders is reviewed and recent advances in the field are outlined. Challenges for the future development of CBT for the anxiety disorders are examined in relation to the efficacy, effectiveness and cost-effectiveness of the approach. It is concluded that the major challenge currently facing CBT for anxiety disorders in the UK is how to meet the increased demand for provision whilst maintaining high levels of efficacy and effectiveness. It is suggested that the creation of an evidence base for the dissemination of CBT needs to become a priority for empirical investigation in order to expand the provision of CBT for anxiety disorders effectively.
Gabor wavelets and Gaussian models to separate ground and non-ground for airborne scanned LIDAR data
Abstract:
We analyze the publicly released outputs of the simulations performed by climate models (CMs) in preindustrial (PI) and Special Report on Emissions Scenarios A1B (SRESA1B) conditions. In the PI simulations, most CMs feature biases of the order of 1 W m⁻² for the net global and the net atmospheric, oceanic, and land energy balances. This does not result from transient effects but depends on the imperfect closure of the energy cycle in the fluid components and on inconsistencies over land. Thus, the planetary emission temperature is underestimated, which may explain the CMs' cold bias. In the PI scenario, CMs agree on the meridional atmospheric enthalpy transport's peak location (around 40°N/S), while discrepancies of ∼20% exist on the intensity. Disagreements on the oceanic transport peaks' location and intensity amount to ∼10° and ∼50%, respectively. In the SRESA1B runs, the atmospheric transport's peak shifts poleward, and its intensity increases up to ∼10% in both hemispheres. In most CMs, the Northern Hemispheric oceanic transport decreases, and the peaks shift equatorward in both hemispheres. The Bjerknes compensation mechanism is active both on climatological and interannual time scales. The total meridional transport peaks around 35° in both hemispheres and scenarios, whereas disagreements on the intensity reach ∼20%. With increased CO₂ concentration, the total transport increases up to ∼10%, thus contributing to polar amplification of global warming. Advances are needed for achieving a self-consistent representation of climate as a nonequilibrium thermodynamical system. This is crucial for improving the CMs' skill in representing past and future climate changes.
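The transports discussed above are commonly diagnosed from the zonal-mean energy budget. The following is an illustrative implementation of that standard diagnostic, not the authors' code: the implied northward enthalpy transport is obtained by integrating the zonal-mean net downward energy flux F(φ) from the southern boundary, T(φ) = 2πa² ∫ F(φ′) cos φ′ dφ′.

```python
import numpy as np

# Standard implied-transport diagnostic (illustrative implementation):
# integrate the zonal-mean net downward energy flux from the south pole
# to obtain the northward meridional enthalpy transport in watts.
def implied_transport(lat_deg, net_flux):
    """lat_deg: latitudes in degrees (south to north); net_flux: W m^-2."""
    a = 6.371e6                                    # Earth radius (m)
    phi = np.deg2rad(np.asarray(lat_deg, dtype=float))
    integrand = 2.0 * np.pi * a**2 * net_flux * np.cos(phi)
    mid = 0.5 * (integrand[1:] + integrand[:-1])   # trapezoidal rule
    return np.concatenate(([0.0], np.cumsum(mid * np.diff(phi))))
```

For a globally balanced flux the transport returns to zero at the opposite pole; a residual bias of order 1 W m⁻², as reported for the PI runs, instead leaves a spurious nonzero transport at the pole.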