13 results for Additional somatosensory information

in CentAUR: Central Archive, University of Reading - UK


Relevance:

90.00%

Publisher:

Abstract:

This study evaluates computer-generated written explanations about drug prescriptions that are based on an analysis of both patient and doctor informational needs. Three experiments examine the effects of varying the type of information given about the possible side effects of the medication, and the order of information within the explanation. Experiment 1 investigated the effects of these two factors on people's ratings of how good they consider the explanations to be and of their perceived likelihood of taking the medication, as well as on their memory for the information in the explanation. Experiment 2 further examined the effects of varying information about side effects by separating out the contribution of number and severity of side effects. It was found that participants in this study did not “like” explanations that described severe side effects, and also judged that they would be less likely to take the medication if given such explanations. Experiment 3 therefore investigated whether information about severe side effects could be presented in such a way as to increase judgements of how good explanations are thought to be, as well as the perceived likelihood of adherence. The results showed some benefits of providing additional explanatory information.

Relevance:

80.00%

Publisher:

Abstract:

This study uses a Granger causality time series modeling approach to quantitatively diagnose the feedback of daily sea surface temperatures (SSTs) on daily values of the North Atlantic Oscillation (NAO) as simulated by a realistic coupled general circulation model (GCM). Bivariate vector autoregressive time series models are carefully fitted to daily wintertime SST and NAO time series produced by a 50-yr simulation of the Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3). The approach demonstrates that there is a small yet statistically significant feedback of SSTs on the NAO. The SST tripole index is found to provide predictive information for the NAO beyond that available from past values of the NAO alone: the SST tripole is Granger causal for the NAO. Careful examination of local SSTs reveals that much of this effect is due to SSTs in the region of the Gulf Stream, especially south of Cape Hatteras. The effect of SSTs on the NAO is responsible for the slower-than-exponential decay in lag-autocorrelations of the NAO notable at lags longer than 10 days. The persistence induced in the daily NAO by SSTs causes long-term means of the NAO to have more variance than would be expected from averaging NAO noise alone, with no feedback of the ocean on the atmosphere. There are greater long-term trends in the NAO than can be expected from aggregating just short-term atmospheric noise, and the NAO is potentially predictable provided that future SSTs are known. For example, there is about 10%-30% more variance in seasonal wintertime means of the NAO, and almost 70% more variance in annual means of the NAO, due to SST effects than one would expect if the NAO were a purely atmospheric process.
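The bivariate Granger test described above can be sketched in a few lines. This is not the authors' code: the two series below are synthetic stand-ins for the HadCM3 NAO and SST tripole indices, and the coefficients, sample size and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily series (hypothetical): "sst" feeds back weakly onto "nao",
# standing in for the SST tripole and NAO indices discussed in the abstract.
n = 2000
sst = np.zeros(n)
nao = np.zeros(n)
for t in range(1, n):
    sst[t] = 0.9 * sst[t - 1] + rng.normal()
    nao[t] = 0.3 * nao[t - 1] + 0.1 * sst[t - 1] + rng.normal()

def ols_rss(X, y):
    """Residual sum of squares from a least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

y = nao[1:]
ones = np.ones(n - 1)
# Restricted model: NAO predicted from its own past only.
rss_r = ols_rss(np.column_stack([ones, nao[:-1]]), y)
# Unrestricted model: past SST added as an extra predictor.
rss_u = ols_rss(np.column_stack([ones, nao[:-1], sst[:-1]]), y)

# Granger F statistic for one restriction: does past SST reduce the RSS
# by more than chance? Large values reject "no feedback".
f_stat = (rss_r - rss_u) / (rss_u / (n - 1 - 3))
print(round(f_stat, 1))
```

If past SST carries no information about the NAO beyond the NAO's own history, `f_stat` stays near 1; here the built-in 0.1 feedback coefficient makes it far exceed the 5% critical value of about 3.84.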

Relevance:

80.00%

Publisher:

Abstract:

We review the procedures and challenges that must be considered when using geoid data derived from the Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission in order to constrain the circulation and water mass representation in an ocean general circulation model. This covers the combination of the geoid information with time-mean sea level information derived from satellite altimeter data to construct a mean dynamic topography (MDT), and considers how this complements the time-varying sea level anomaly, also available from the satellite altimeter. We particularly consider the compatibility of these different fields in their spatial scale content, their temporal representation, and their error covariances. These considerations are very important when the resulting data are to be used to estimate ocean circulation and its corresponding errors. We describe the further steps needed for assimilating the resulting dynamic topography information into an ocean circulation model using three different operational forecasting and data assimilation systems. We look at methods used for assimilating altimeter anomaly data in the absence of a suitable geoid, and then discuss different approaches that have been tried for assimilating the additional geoid information. We review the problems that have been encountered and the lessons learned in order to help future users. Finally, we present some results from the use of GRACE geoid information in the operational oceanography community and discuss the future potential gains that may be obtained from a new GOCE geoid.
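The central combination step, forming the MDT as the difference between the altimetric mean sea surface and the geoid, can be illustrated with hypothetical numbers; real use requires first filtering both fields to a common spatial scale, which is omitted here.

```python
import numpy as np

# Hypothetical 1-D section of altimetric mean sea surface (MSS) heights and
# GOCE-style geoid heights, both in metres above the reference ellipsoid.
mss = np.array([43.10, 43.25, 43.40, 43.30])
geoid = np.array([42.50, 42.55, 42.85, 42.95])

# Mean dynamic topography: the part of mean sea level maintained by the
# ocean circulation, i.e. what remains after removing the geoid.
mdt = mss - geoid
print(mdt)
```

Gradients of this MDT field are what constrain the mean geostrophic surface circulation in the assimilation systems discussed above.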

Relevance:

80.00%

Publisher:

Abstract:

This paper examines two hydrochemical time-series derived from stream samples taken in the Upper Hafren catchment, Plynlimon, Wales. One time-series comprises data collected at 7-hour intervals over 22 months (Neal et al., submitted, this issue), while the other is based on weekly sampling over 20 years. A subset of determinands: aluminium, calcium, chloride, conductivity, dissolved organic carbon, iron, nitrate, pH, silicon and sulphate are examined within a framework of non-stationary time-series analysis to identify determinand trends, seasonality and short-term dynamics. The results demonstrate that both long-term and high-frequency monitoring provide valuable and unique insights into the hydrochemistry of a catchment. The long-term data allowed analysis of long-term trends, demonstrating continued increases in DOC concentrations accompanied by declining SO4 concentrations within the stream, and provided new insights into the changing amplitude and phase of the seasonality of determinands such as DOC and Al. Additionally, these data proved invaluable for placing the short-term variability demonstrated within the high-frequency data in context. The 7-hour data highlighted complex diurnal cycles for NO3, Ca and Fe, with cycles displaying changes in phase and amplitude on a seasonal basis. The high-frequency data also demonstrated the need to consider the impact that the time of sample collection can have on the summary statistics of the data, and that sampling during the hours of darkness provides additional hydrochemical information for determinands which exhibit pronounced diurnal variability. Moving forward, this research demonstrates the need for both long-term and high-frequency monitoring to facilitate a full and accurate understanding of catchment hydrochemical dynamics.
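One simple way to separate a long-term trend from an annual cycle in such a record is harmonic regression. The sketch below uses a synthetic weekly series standing in for DOC (all values hypothetical); it is not the catchment data, nor the non-stationary method the paper applies, which also lets amplitude and phase evolve over time.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 20-year weekly "DOC" record (hypothetical units): a slow rising
# trend plus an annual cycle plus noise.
weeks = np.arange(20 * 52)
doc = (3.0 + 0.002 * weeks
       + 0.5 * np.sin(2 * np.pi * weeks / 52)
       + rng.normal(scale=0.2, size=weeks.size))

# Regress onto an intercept, a linear trend, and one annual harmonic pair.
X = np.column_stack([
    np.ones(weeks.size),
    weeks.astype(float),
    np.sin(2 * np.pi * weeks / 52),
    np.cos(2 * np.pi * weeks / 52),
])
beta, *_ = np.linalg.lstsq(X, doc, rcond=None)

print(round(beta[1], 4))   # recovered trend slope, close to the true 0.002
print(round(beta[2], 2))   # recovered seasonal (sine) amplitude, close to 0.5
```

Re-fitting the harmonic terms over a moving window is one basic way to track the changing seasonal amplitude and phase that the paper reports for DOC and Al.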

Relevance:

80.00%

Publisher:

Abstract:

Successful quantitative precipitation forecasts under convectively unstable conditions depend on the ability of the model to capture the location, timing and intensity of convection. Ensemble forecasts of two mesoscale convective outbreaks over the UK are examined with a view to understanding the nature and extent of their predictability. In addition to a control forecast, twelve ensemble members are run for each case with the same boundary conditions but with perturbations added to the boundary layer. The intention is to introduce perturbations of appropriate magnitude and scale so that the large-scale behaviour of the simulations is not changed. In one case, convection was in statistical equilibrium with the large-scale flow. This placed a constraint on the total precipitation, but the location and intensity of individual storms varied. In contrast, the other case was characterised by a large-scale capping inversion. As a result, the location of individual storms was fixed, but their intensities and the total precipitation varied strongly. The ensemble shows case-to-case variability in the nature of predictability of convection in a mesoscale model, and provides additional useful information for quantitative precipitation forecasting.

Relevance:

30.00%

Publisher:

Abstract:

This study suggests a statistical strategy for explaining how food purchasing intentions are influenced by different levels of risk perception and trust in food safety information. The modelling process is based on Ajzen's Theory of Planned Behaviour and includes trust and risk perception as additional explanatory factors. Interaction and endogeneity across these determinants are explored through a system of simultaneous equations, while the SPARTA equation is estimated through an ordered probit model. Furthermore, parameters are allowed to vary as a function of socio-demographic variables. The application explores chicken purchasing intentions both in a standard situation and conditional on a hypothetical salmonella scare. Data were collected through a nationally representative UK-wide survey of 533 respondents in face-to-face, in-home interviews. Empirical findings show that interactions exist among the determinants of planned behaviour, and socio-demographic variables improve the model's performance. Attitudes emerge as the key determinant of intention to purchase chicken, while trust in food safety information provided by media reduces the likelihood of purchase.
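An ordered probit of the kind used for the SPARTA equation can be sketched as follows. The data, variable names and coefficients are invented for illustration; the simultaneous-equation system and varying parameters of the paper are omitted.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)

# Synthetic data (hypothetical): latent purchase intention driven by "attitude"
# and "trust in media", thresholded into 3 ordered response categories.
n = 1500
attitude = rng.normal(size=n)
trust_media = rng.normal(size=n)
latent = 1.0 * attitude - 0.5 * trust_media + rng.normal(size=n)
y = np.digitize(latent, [-0.5, 0.7])          # categories 0, 1, 2

def neg_loglik(params):
    """Ordered probit negative log-likelihood with ordered cutpoints."""
    b1, b2, c0, dc = params
    cuts = np.array([c0, c0 + np.exp(dc)])    # exp() keeps cutpoints ordered
    xb = b1 * attitude + b2 * trust_media
    edges = np.concatenate([[-np.inf], cuts, [np.inf]])
    p = norm.cdf(edges[y + 1] - xb) - norm.cdf(edges[y] - xb)
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

res = minimize(neg_loglik, x0=[0.0, 0.0, 0.0, 0.0], method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
b_attitude, b_trust = res.x[0], res.x[1]
print(round(b_attitude, 2), round(b_trust, 2))
```

The recovered signs mirror the paper's qualitative finding: a positive attitude coefficient and a negative media-trust coefficient on purchase intention.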

Relevance:

30.00%

Publisher:

Abstract:

We present three components of a virtual research environment developed for the ongoing Roman excavation at Silchester. These components — Recycle Bridge, XDB cross-database search, and Arch3D — provide additional services around the existing core of the system, which runs on the Integrated Archaeological Database (IADB). They provide, respectively, embedding of legacy applications into portals, cross-database searching, and 3D visualisation of stratigraphic information.

Relevance:

30.00%

Publisher:

Abstract:

Although the somatosensory homunculus is a classically used description of the way somatosensory inputs are processed in the brain, the actual contributions of primary (SI) and secondary (SII) somatosensory cortices to the spatial coding of touch remain poorly understood. We studied adaptation of the fMRI BOLD response in the somatosensory cortex by delivering pairs of vibrotactile stimuli to the finger tips of the index and middle fingers. The first stimulus (adaptor) was delivered either to the index or to the middle finger of the right or left hand, whereas the second stimulus (test) was always administered to the left index finger. The overall BOLD response evoked by the stimulation was primarily contralateral in SI and was more bilateral in SII. However, our fMRI adaptation approach also revealed that both somatosensory cortices were sensitive to ipsilateral as well as to contralateral inputs. SI and SII adapted more after subsequent stimulation of homologous as compared with nonhomologous fingers, showing a distinction between different fingers. Most importantly, for both somatosensory cortices, this finger-specific adaptation occurred irrespective of whether the tactile stimulus was delivered to the same or to different hands. This result implies integration of contralateral and ipsilateral somatosensory inputs in SI as well as in SII. Our findings suggest that SI is more than a simple relay for sensory information and that both SI and SII contribute to the spatial coding of touch by discriminating between body parts (fingers) and by integrating the somatosensory input from the two sides of the body (hands).

Relevance:

30.00%

Publisher:

Abstract:

In a world where data is captured on a large scale the major challenge for data mining algorithms is to be able to scale up to large datasets. There are two main approaches to inducing classification rules, one is the divide and conquer approach, also known as the top down induction of decision trees; the other approach is called the separate and conquer approach. A considerable amount of work has been done on scaling up the divide and conquer approach. However, very little work has been conducted on scaling up the separate and conquer approach. In this work we describe a parallel framework that allows the parallelisation of a certain family of separate and conquer algorithms, the Prism family. Parallelisation helps the Prism family of algorithms to harvest additional computer resources in a network of computers in order to make the induction of classification rules scale better on large datasets. Our framework also incorporates a pre-pruning facility for parallel Prism algorithms.
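The serial separate-and-conquer loop underlying the Prism family can be sketched as below: grow one rule term by term until it covers only the target class, remove the covered instances, and repeat. The toy dataset and attribute names are invented, the data is assumed separable, and the pre-pruning and parallelisation described above are omitted.

```python
# Toy categorical dataset (hypothetical): each row is (attributes, class label).
data = [
    ({"outlook": "sunny", "windy": "no"}, "play"),
    ({"outlook": "sunny", "windy": "yes"}, "stay"),
    ({"outlook": "rain",  "windy": "no"}, "play"),
    ({"outlook": "rain",  "windy": "yes"}, "stay"),
    ({"outlook": "sunny", "windy": "no"}, "play"),
]

def induce_rules(data, target):
    """Prism-style separate-and-conquer: grow a rule, remove what it covers, repeat."""
    rules, remaining = [], list(data)
    while any(c == target for _, c in remaining):
        rule, covered = {}, remaining
        # Specialise: greedily add the attribute-value term that maximises the
        # proportion of target-class instances among those it covers.
        while any(c != target for _, c in covered):
            best = max(
                ((a, v) for x, _ in covered for a, v in x.items() if a not in rule),
                key=lambda av: sum(1 for x, c in covered
                                   if x.get(av[0]) == av[1] and c == target)
                               / sum(1 for x, _ in covered if x.get(av[0]) == av[1]),
            )
            rule[best[0]] = best[1]
            covered = [(x, c) for x, c in covered if x.get(best[0]) == best[1]]
        rules.append(dict(rule))
        remaining = [(x, c) for x, c in remaining
                     if not all(x.get(a) == v for a, v in rule.items())]
    return rules

print(induce_rules(data, "play"))  # → [{'windy': 'no'}]
```

The framework described in the abstract parallelises exactly this kind of loop by distributing the candidate-term evaluation across a network of workstations.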

Relevance:

30.00%

Publisher:

Abstract:

Detection of a tactile stimulus on one finger is impaired when a concurrent stimulus (masker) is presented on an additional finger of the same or the opposite hand. This phenomenon is known to be finger-specific at the within-hand level. However, whether this specificity is also maintained at the between-hand level is not known. In four experiments, we addressed this issue by combining a Bayesian adaptive staircase procedure (QUEST) with a two-interval forced choice (2IFC) design in order to establish threshold for detecting 200 ms, 100 Hz sinusoidal vibrations applied to the index or little fingertip of either hand (targets). We systematically varied the masker finger (index, middle, ring, or little finger of either hand), while controlling the spatial location of the target and masker stimuli. Detection thresholds varied consistently as a function of the masker finger when the latter was on the same hand (Experiments 1 and 2), but not when on different hands (Experiments 3 and 4). Within the hand, detection thresholds increased for masker fingers closest to the target finger (i.e., middle > ring when the target was index). Between the hands, detection thresholds were higher only when the masker was present on any finger as compared to when the target was presented in isolation. The within hand effect of masker finger is consistent with the segregation of different fingers at the early stages of somatosensory processing, from the periphery to the primary somatosensory cortex (SI). We propose that detection is finger-specific and reflects the organisation of somatosensory receptive fields in SI within, but not between the hands.
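The QUEST-style adaptive procedure can be sketched as a grid-based Bayesian update: each trial is placed at the current posterior mean of the threshold, and the posterior is updated from the observer's response. The simulated observer, psychometric slope and trial count below are all hypothetical, not the experimental parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

true_threshold = 0.4   # simulated observer's detection threshold (arbitrary units)

def p_correct(intensity, threshold, slope=10.0, guess=0.5):
    """2IFC psychometric function: performance rises from the 50% guessing floor."""
    return guess + (1 - guess) / (1 + np.exp(-slope * (intensity - threshold)))

# Uniform prior over a grid of candidate thresholds.
grid = np.linspace(0.0, 1.0, 201)
posterior = np.full(grid.size, 1.0 / grid.size)

for _ in range(400):
    test_intensity = float(np.sum(grid * posterior))    # next trial at posterior mean
    correct = rng.random() < p_correct(test_intensity, true_threshold)
    likelihood = p_correct(test_intensity, grid)        # likelihood on the whole grid
    posterior *= likelihood if correct else (1.0 - likelihood)
    posterior /= posterior.sum()

estimate = float(np.sum(grid * posterior))
print(round(estimate, 2))   # converges near the simulated threshold of 0.4
```

Running one such staircase per masker condition yields the per-finger detection thresholds that the experiments compare.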

Relevance:

30.00%

Publisher:

Abstract:

We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual information (LOOMI) between the classifier's predicted class labels and the true class labels. We derive the formula of LOOMI within the OFS framework so that the LOOMI can be evaluated efficiently for model term selection. Furthermore, a Bayesian procedure of hyperparameter fitting is integrated into each stage of the OFS to infer the l2-norm based local regularisation parameter from the data. Since each forward stage is effectively the fitting of a one-variable model, this task is very fast. The classifier construction procedure terminates automatically, without the need for an additional stopping criterion, and yields very sparse RBF classifiers with excellent classification generalisation performance, which is particularly useful for noisy data sets with highly overlapping class distributions. A number of benchmark examples are employed to demonstrate the effectiveness of our proposed approach.
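A minimal sketch of forward selection of RBF terms under a leave-one-out criterion is given below. It scores candidates by the PRESS (leave-one-out squared error) statistic rather than the paper's LOOMI, uses a naive hat-matrix computation instead of the efficient orthogonal update, and fixes the term count rather than terminating automatically; the data, kernel width and counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy two-class data (hypothetical): labels in {-1, +1} from two noisy clusters.
n = 120
X = np.concatenate([rng.normal(-1, 0.7, (n // 2, 2)),
                    rng.normal(+1, 0.7, (n // 2, 2))])
y = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])

# Candidate model terms: one Gaussian basis function centred on each data point.
width = 1.0
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
candidates = np.exp(-d2 / (2 * width ** 2))     # column j = basis centred on X[j]

def press(P, y):
    """Leave-one-out (PRESS) squared error for a linear-in-parameters model."""
    H = P @ np.linalg.pinv(P)                   # hat matrix
    e = y - H @ y
    return float(np.sum((e / (1 - np.clip(np.diag(H), 0, 0.99))) ** 2))

selected = []
for _ in range(4):                              # grow a very sparse model, term by term
    order = [j for j in range(n) if j not in selected]
    scores = [press(candidates[:, selected + [j]], y) for j in order]
    selected.append(order[int(np.argmin(scores))])

P = candidates[:, selected]
w = np.linalg.pinv(P) @ y
accuracy = float(np.mean(np.sign(P @ w) == y))
print(len(selected), round(accuracy, 2))
```

Even this crude leave-one-out criterion yields a very sparse classifier on separable data; the paper's contribution is making the analogous mutual-information score cheap to evaluate at every forward stage.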

Relevance:

30.00%

Publisher:

Abstract:

Accurate and reliable rain rate estimates are important for various hydrometeorological applications. Consequently, rain sensors of different types have been deployed in many regions. In this work, measurements from different instruments, namely, rain gauge, weather radar, and microwave link, are combined for the first time to estimate with greater accuracy the spatial distribution and intensity of rainfall. The objective is to retrieve the rain rate that is consistent with all these measurements while incorporating the uncertainty associated with the different sources of information. Assuming the problem is not strongly nonlinear, a variational approach is implemented and the Gauss–Newton method is used to minimize the cost function containing proper error estimates from all sensors. Furthermore, the method can be flexibly adapted to additional data sources. The proposed approach is tested using data from 14 rain gauges and 14 operational microwave links located in the Zürich area (Switzerland) to correct the prior rain rate provided by the operational radar rain product from the Swiss meteorological service (MeteoSwiss). A cross-validation approach demonstrates the improvement of rain rate estimates when assimilating rain gauge and microwave link information.
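Because the observation operators here are linear, the variational cost function is quadratic and a single Gauss–Newton step solves it exactly. The sketch below combines a radar-style prior with two point gauges and one path-averaging link on a hypothetical 1-D rain field; all values, error variances and geometry are invented, not the Zürich data.

```python
import numpy as np

# Hypothetical 1-D rain-rate profile along 5 pixels (mm/h).
x_true = np.array([1.0, 3.0, 5.0, 2.0, 0.5])
xb = x_true + np.array([0.8, -0.5, 1.0, -0.7, 0.4])   # biased radar prior

# Observation operators: two point gauges and one path-averaging microwave link.
H = np.array([
    [1.0, 0.0, 0.0, 0.0, 0.0],        # gauge at pixel 0
    [0.0, 0.0, 0.0, 1.0, 0.0],        # gauge at pixel 3
    [0.2, 0.2, 0.2, 0.2, 0.2],        # link: mean rain rate along its path
])
y = H @ x_true                         # noise-free observations, for clarity

B_inv = np.eye(5) / 1.0 ** 2           # inverse prior (radar) error covariance
R_inv = np.eye(3) / 0.1 ** 2           # inverse observation error covariance

# Cost J(x) = (x-xb)' B^-1 (x-xb) + (y-Hx)' R^-1 (y-Hx) is quadratic,
# so one Gauss-Newton step gives the minimiser directly:
A = B_inv + H.T @ R_inv @ H
b = B_inv @ xb + H.T @ R_inv @ y
x_a = np.linalg.solve(A, b)

print(round(np.abs(xb - x_true).mean(), 3),     # prior mean absolute error
      round(np.abs(x_a - x_true).mean(), 3))    # analysis mean absolute error
```

Adding a sensor is just another row in `H` with its error variance in `R_inv`, which is the flexibility to additional data sources that the abstract highlights; a nonlinear operator would simply require iterating the same step.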