39 results for Residual autocorrelation and autocovariance matrices
in CentAUR: Central Archive University of Reading - UK
Abstract:
The positive, psychotic symptoms of schizophrenia can be treated by antipsychotic drugs and it has been assumed that these are antagonists at the D-2 and D-3 dopamine receptors in the brain. Recently, the D-2/D-3 partial agonist aripiprazole has been introduced as an antipsychotic drug. It has also been realized that, using in vitro assays, the other antipsychotic drugs are in fact inverse agonists at D-2/D-3 dopamine receptors. This raises questions about how these disparate drugs can achieve a similar clinical outcome. In this review, I shall consider the efficacies of these drugs in signalling assays and how these efficacies might affect treatment outcomes. It seems that the treatment outcome might depend on the overall level of cell stimulation, which is in turn dependent on the level of residual dopamine and the efficacy of the drug in signalling assays.
Abstract:
Deep Brain Stimulation (DBS) is a treatment routinely used to alleviate the symptoms of Parkinson's disease (PD). In this type of treatment, electrical pulses are applied through electrodes implanted into the basal ganglia of the patient. As the symptoms are not permanent in most patients, it is desirable to develop an on-demand stimulator that applies pulses only when onset of the symptoms is detected. This study evaluates a feature set created for the detection of tremor, a cardinal symptom of PD. The feature set was based on standard signal features and researched properties of the electrical signals recorded from the subthalamic nucleus (STN) within the basal ganglia, and together covered temporal, spectral, statistical, autocorrelation and fractal properties. The features most characteristic of tremor were selected using statistical testing and backward-elimination algorithms, then used for classification on unseen patient signals. The spectral features were among the most effective at detecting tremor; notably, the spectral bands 3.5-5.5 Hz and 0-1 Hz proved highly significant. Classification achieved 94% sensitivity for tremor detection, with a specificity of one.
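The abstract does not give the exact feature pipeline; a minimal sketch of one of the named feature types — spectral band power in the 3.5-5.5 Hz tremor band, estimated with Welch's method — is shown below. SciPy is assumed to be available, and the sampling rate, signal and band edges are illustrative, not the study's.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, f_lo, f_hi):
    """One spectral feature: power of x integrated over [f_lo, f_hi] Hz."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 1024))
    mask = (f >= f_lo) & (f <= f_hi)
    return np.sum(pxx[mask]) * (f[1] - f[0])

# Synthetic example: a 4.5 Hz "tremor" oscillation buried in noise,
# versus a quiet noise-only segment (illustrative data)
fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
tremor = np.sin(2 * np.pi * 4.5 * t) + 0.1 * rng.standard_normal(t.size)
quiet = 0.1 * rng.standard_normal(t.size)

# The 3.5-5.5 Hz band-power feature separates the two segments clearly
print(band_power(tremor, fs, 3.5, 5.5) > 10 * band_power(quiet, fs, 3.5, 5.5))
```

A classifier would use many such features jointly; this shows only why the tremor band carries discriminative power.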
Abstract:
Numerical methods are described for determining robust, or well-conditioned, solutions to the problem of pole assignment by state feedback. The solutions obtained are such that the sensitivity of the assigned poles to perturbations in the system and gain matrices is minimized. It is shown that for these solutions, upper bounds on the norm of the feedback matrix and on the transient response are also minimized and a lower bound on the stability margin is maximized. A measure is derived which indicates the optimal conditioning that may be expected for a particular system with a given set of closed-loop poles, and hence the suitability of the given poles for assignment.
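An algorithm of this kind is available in SciPy as `place_poles`, whose `"KNV0"` method is named for a robust pole-assignment scheme of this family. A minimal sketch, assuming SciPy is available; the system matrices and desired poles are illustrative.

```python
import numpy as np
from scipy.signal import place_poles

# Illustrative 3-state, 2-input system: xdot = A x + B u
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0]])
desired = np.array([-1.0, -2.0, -3.0])

# KNV0 searches for a well-conditioned closed-loop eigenvector matrix,
# reducing the sensitivity of the assigned poles to perturbations
res = place_poles(A, B, desired, method="KNV0")
K = res.gain_matrix

# Closed-loop poles of A - B K should sit at the requested locations
print(np.sort(np.linalg.eigvals(A - B @ K).real))
```

With multiple inputs the feedback gain is not unique, which is exactly the freedom the robust method exploits to minimise conditioning.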
Abstract:
The construction industry is widely recognised as inherently exposed to risk and uncertainty, necessitating effective project risk management to achieve the project objectives of time, cost and quality. A popular tool employed to aid in the management of risk is the risk register, which documents the project risks and is often used by the Project Manager (PM) to manage the associated risks on a project. This research aims to ascertain how widely risk registers are used by Project Managers as part of their risk management practices. To achieve this aim, ten PMs were interviewed about their use of the risk register as a risk management tool. The results from these interviews indicated the prevalent use of this document and recognised its effectiveness in the management of project risks. The findings identified the front-end and feasibility phases of a project as crucial stages for using risk registers, noting the register as a vital ingredient in risk response planning and decision making. The interviews also examined the composition of the risk register, providing insight into how PMs produce and develop this tool. In conclusion, this research signifies the extensive use of the risk register by PMs. A majority of PMs were of the view that risk registers constitute an essential component of their project risk management practices. This suggests a need for further research on the extent to which risk registers actually help PMs to control the risks in a construction project, particularly residual risks, and how this can be improved to minimize deviations from expected outcomes.
Abstract:
The United Kingdom’s pharmacy regulator contemplated using continuing professional development (CPD) in pharmacy revalidation in 2009, simultaneously asking pharmacy professionals to demonstrate the value of their CPD by showing its relevance and impact. The idea of linking new CPD requirements with revalidation was yet to be explored. Our aim was to develop and validate a framework to guide pharmacy professionals to select CPD activities that are relevant to their work and to produce a score sheet that would make it possible to quantify the impact and relevance of CPD. METHODS: We adapted an existing risk matrix, producing a CPD framework consisting of relevance and impact matrices. Concepts underpinning the framework were refined through feedback from five pharmacist teacher-practitioners. We then asked seven pharmacists to rate the relevance of the framework’s individual elements on a 4-point scale to determine content validity. We explored views about the framework through focus groups with six and interviews with 17 participants who had used it formally in a study. RESULTS: The framework’s content validity index was 0.91. Feedback about the framework related to three themes of penetrability of the framework, usefulness to completion of CPD, and advancement of CPD records for the purpose of revalidation. DISCUSSION: The framework can help professionals better select CPD activities prospectively, and makes assessment of CPD more objective by allowing quantification, which could be helpful for revalidation. We believe the framework could potentially help other health professionals with better management of their CPD irrespective of their field of practice.
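A content validity index of the kind reported here is commonly computed as the proportion of raters scoring an item 3 or 4 on a 4-point scale, averaged across items. A minimal sketch of that standard calculation; the ratings are made up for illustration and are not the study's data.

```python
def item_cvi(ratings):
    """Item-level content validity index: fraction of raters giving 3 or 4."""
    return sum(r >= 3 for r in ratings) / len(ratings)

def scale_cvi(items):
    """Scale-level CVI: mean of the item-level indices."""
    return sum(item_cvi(r) for r in items) / len(items)

# Seven raters (as in the study) scoring three hypothetical framework
# elements on the 4-point relevance scale
ratings_by_item = [
    [4, 4, 3, 4, 3, 4, 4],
    [3, 4, 4, 2, 4, 3, 4],
    [4, 3, 4, 4, 4, 4, 3],
]
print(round(scale_cvi(ratings_by_item), 2))
```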
Abstract:
Remote sensing observations often have correlated errors, but the correlations are typically ignored in data assimilation for numerical weather prediction. The assumption of zero correlations is often used with data thinning methods, resulting in a loss of information. As operational centres move towards higher-resolution forecasting, there is a requirement to retain data providing detail on appropriate scales. Thus an alternative approach to dealing with observation error correlations is needed. In this article, we consider several approaches to approximating observation error correlation matrices: diagonal approximations, eigendecomposition approximations and Markov matrices. These approximations are applied in incremental variational assimilation experiments with a 1-D shallow water model using synthetic observations. Our experiments quantify analysis accuracy in comparison with a reference or ‘truth’ trajectory, as well as with analyses using the ‘true’ observation error covariance matrix. We show that it is often better to include an approximate correlation structure in the observation error covariance matrix than to incorrectly assume error independence. Furthermore, by choosing a suitable matrix approximation, it is feasible and computationally cheap to include error correlation structure in a variational data assimilation algorithm.
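The three named approximations can be sketched for a simple exponentially-decaying (Markov) correlation structure between observations along a line. The matrix size, length-scale and number of retained eigenpairs below are illustrative choices, not values from the article.

```python
import numpy as np

n = 50          # number of observation locations (illustrative)
L = 3.0         # correlation length-scale in grid units (illustrative)
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# "True" Markov correlation matrix: C_ij = exp(-|i - j| / L)
C = np.exp(-dist / L)

# Diagonal approximation: discard all off-diagonal correlations
C_diag = np.diag(np.diag(C))

# Eigendecomposition approximation: keep the k largest eigenvalues and
# replace the rest by their mean (eigh returns them in ascending order),
# which preserves the trace of the matrix
k = 10
w, V = np.linalg.eigh(C)
w_approx = w.copy()
w_approx[:-k] = w[:-k].mean()
C_eig = (V * w_approx) @ V.T

print(np.allclose(np.trace(C_eig), np.trace(C)))
```

The Markov form is attractive in practice because its inverse is tridiagonal, so applying it inside a variational cost function is cheap.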
Abstract:
The classic vertical advection-diffusion (VAD) balance is a central concept in studying the ocean heat budget, in particular in simple climate models (SCMs). Here we present a new framework for calibrating the parameters of the VAD equation to the vertical ocean heat balance of two fully coupled climate models, in a way that is traceable to the models' circulation as well as to vertical mixing and diffusion processes. Based on temperature diagnostics, we derive an effective vertical velocity w∗ and turbulent diffusivity k∗ for each individual physical process. In steady state, we find that the residual vertical velocity and diffusivity change sign at mid-depth, highlighting the different regional contributions of isopycnal and diapycnal diffusion in balancing the models' residual advection and vertical mixing. We quantify the impacts of the time evolution of the effective quantities under a transient 1%CO2 simulation and make the link to the parameters of currently employed SCMs.
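For reference, the steady-state VAD balance being calibrated is usually written, for temperature T(z), with the effective quantities defined in the abstract:

```latex
% Steady-state vertical advection-diffusion balance for temperature T(z),
% with effective vertical velocity w^{*} and turbulent diffusivity k^{*}:
w^{*}\,\frac{\partial T}{\partial z} \;=\; k^{*}\,\frac{\partial^{2} T}{\partial z^{2}}
```

This is the classical one-dimensional form; the paper's contribution is deriving a w∗ and k∗ per physical process rather than a single pair for the whole column.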
Abstract:
Hourly sea level records from 1954 to 2012 at 20 tide gauges at and adjacent to the Chinese coasts are used to analyze extremes in sea level and in tidal residual. Tides and tropical cyclones determine the spatial distribution of sea level maxima, while tidal residual maxima are predominantly determined by tropical cyclones. The 50 year return level is found to be sensitive to the number of extreme events used in the estimation: because few tropical cyclone events occur each year, other local storm events are drawn into the sample, significantly affecting the estimates. A significant increase in sea level extremes is found, with trends in the range 2.0-14.1 mm yr−1. The trends are primarily driven by changes in median sea level but are also linked with increases in tidal amplitudes at three stations. Tropical cyclones cause significant interannual variations in the extremes. The interannual variability in the sea level extremes is also influenced by changes in median sea level in the north and by the 18.6 year nodal cycle in the South China Sea. Neither the PDO nor ENSO is found to be an indicator of changes in the size of extremes, but ENSO appears to regulate the number of tropical cyclones that reach the Chinese coasts. Global mean atmospheric temperature appears to be a good descriptor of the interannual variability of tidal residual extremes induced by tropical cyclones, but the trend in global temperature is inconsistent with the lack of trend in the residuals.
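Return levels of this kind are typically obtained by fitting an extreme-value distribution to the largest events and reading off the quantile exceeded on average once per return period. A minimal sketch using a GEV fit to synthetic annual maxima; SciPy is assumed, and the data are illustrative, not the tide-gauge records.

```python
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(1)
# Synthetic annual maxima of sea level in metres, one per year of a
# 59-year record (illustrative only)
annual_max = gumbel_r.rvs(loc=2.0, scale=0.15, size=59, random_state=rng)

# Fit a generalised extreme value distribution to the annual maxima
c, loc, scale = genextreme.fit(annual_max)

# 50-year return level: the level exceeded with probability 1/50 per year
rl50 = genextreme.ppf(1 - 1 / 50, c, loc=loc, scale=scale)
print(rl50 > np.median(annual_max))
```

The abstract's sensitivity finding corresponds to varying how many events per year enter such a fit (e.g. annual maxima versus r-largest events).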
Abstract:
Proanthocyanidins (PA) in Senna alata leaves were investigated by thiolysis with benzyl mercaptan, LC–MS and NMR. They consisted of almost pure propelargonidins (<6% procyanidins), had B-type linkages, and had a mean degree of polymerisation of three. Epiafzelechin was the major flavan-3-ol subunit (>94%) and epicatechin a minor constituent (6.4%) in the residual PA, detected mainly as an extension unit.
Abstract:
Bioaccessibility studies have been widely used as a research tool to determine the potential human exposure to ingested contaminants. More recently they have been practically applied to soil-borne toxic elements. This paper reviews the application of bioaccessibility tests across a range of organic pollutants and contaminated matrices. Important factors are reported to be: the physiological relevance of the test, the components in the gut media, the size fraction chosen for the test, and whether it contains a sorptive sink. The bioaccessibility is also a function of the composition of the matrix (e.g. organic carbon content of soils) and the physico-chemical characteristics of the pollutant under test. Despite the widespread use of these tests, there are a large number of formats in use and very few validation studies with animal models. We propose a unified format for a bioaccessibility test for organic pollutants. The robustness of this test should first be confirmed through inter-laboratory comparison, then tested in vivo.
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper.
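The accumulation step described above — building a first-pass variogram by summing variance components upward from the shortest lag — can be sketched as follows. The stage distances and variance components are made up for illustration; in practice they come from the hierarchical ANOVA or REML fit.

```python
# Hypothetical nested design: separating distance per stage (metres) and
# the estimated variance component for each stage, finest stage first
stage_lag = [1, 10, 100, 1000]
variance_component = [0.8, 1.5, 2.1, 0.6]

# Semivariance at each stage's lag = components accumulated from the
# shortest lag up to and including that stage, giving a rough variogram
semivariance = []
total = 0.0
for comp in variance_component:
    total += comp
    semivariance.append(total)

for lag, gamma in zip(stage_lag, semivariance):
    print(lag, gamma)
```

The resulting four points rise monotonically towards the total variance, which is exactly the coarse variogram shape the abstract describes obtaining "for modest effort".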
Abstract:
The Olsen method is an indicator of plant-available phosphorus (P). The effect of time and temperature on residual phosphate in soils was measured using the Olsen method in a pot experiment. Four soils were investigated: two from Pakistan and one each from England (calcareous) and Colombia (acidic). Two levels of residual phosphate were developed in each soil after addition of phosphate by incubation at either 10 °C or 45 °C. The amount of phosphate added was based on the P maximum of each soil, calculated using the Langmuir equation. Ryegrass was used as the test crop. The pooled data for the four soils incubated at 10 °C showed good correlation between Olsen P and dry matter yield or P uptake (r² = 0.85 and 0.77, respectively), whereas at 45 °C each soil had its own relationship and the pooled data showed no correlation of Olsen P with dry matter yield or P uptake. When the data at both temperatures were pooled, Olsen P was a good indicator of yield and uptake for the English soil. For the Pakistani soils, Olsen P after the 45 °C treatment was an underestimate relative to the 10 °C data, and for the Colombian soil it was an overestimate. The reasons for these differences need to be explored further before high-temperature incubation can be used to simulate long-term changes in the field.
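The Langmuir calculation mentioned above — estimating a soil's sorption maximum, here used to set the phosphate additions — is commonly done via the linearised form C/Q = C/Qmax + 1/(b·Qmax), fitted as a straight line. A minimal sketch on synthetic sorption data (not the paper's measurements):

```python
import numpy as np

# Synthetic sorption data generated from a known Langmuir isotherm:
# Q = (Qmax * b * C) / (1 + b * C)
Qmax_true, b_true = 500.0, 0.05          # mg/kg and L/mg (illustrative)
C = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0])  # solution conc.
Q = (Qmax_true * b_true * C) / (1 + b_true * C)       # sorbed amount

# Linearised Langmuir: C/Q = C/Qmax + 1/(b*Qmax) -> fit a straight line,
# then recover the parameters from slope and intercept
slope, intercept = np.polyfit(C, C / Q, 1)
Qmax_est = 1 / slope
b_est = slope / intercept

print(round(Qmax_est, 1), round(b_est, 3))
```

With noise-free data the fit recovers the generating parameters; with real sorption data, nonlinear least squares on the original form is often preferred because the linearisation distorts the error structure.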
Abstract:
It has been generally accepted that the method of moments (MoM) variogram, which has been widely applied in soil science, requires about 100 sites at an appropriate interval apart to describe the variation adequately. This sample size is often larger than can be afforded for soil surveys of agricultural fields or contaminated sites. Furthermore, it might be a much larger sample size than is needed where the scale of variation is large. A possible alternative in such situations is the residual maximum likelihood (REML) variogram, because fewer data appear to be required. The REML method is parametric and is considered reliable where there is trend in the data, because it is based on generalized increments that filter trend out and only the covariance parameters are estimated. Previous research has suggested that fewer data are needed to compute a reliable variogram using a maximum likelihood approach such as REML; however, the results can vary according to the nature of the spatial variation. Several issues remain to be examined: how many fewer data can be used, how the sampling sites should be distributed over the site of interest, and how different degrees of spatial variation affect the data requirements. The soil of four field sites of different size, physiography, parent material and soil type was sampled intensively, and MoM and REML variograms were calculated for clay content. The data were then sub-sampled to give different sample sizes and distributions of sites, and the variograms were computed again. The model parameters for the sets of variograms for each site were used for cross-validation. Predictions based on REML variograms were generally more accurate than those from MoM variograms with fewer than 100 sampling sites. A sample size of around 50 sites at an appropriate distance apart, possibly determined from variograms of ancillary data, appears adequate to compute REML variograms for kriging soil properties for precision agriculture and contaminated sites.
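The MoM estimator referred to throughout is the classical Matheron formula, γ̂(h) = 1/(2N(h)) Σ [z(xᵢ) − z(xᵢ+h)]². A minimal 1-D sketch on a synthetic, spatially correlated transect; the spacing, lags and smoothing window are illustrative.

```python
import numpy as np

def mom_variogram(z, lags):
    """Method-of-moments (Matheron) semivariance on a regular 1-D transect."""
    gamma = []
    for h in lags:
        diffs = z[h:] - z[:-h]          # all pairs separated by lag h
        gamma.append(0.5 * np.mean(diffs ** 2))
    return np.array(gamma)

rng = np.random.default_rng(2)
# Synthetic spatially correlated transect: moving-average-smoothed noise
noise = rng.standard_normal(600)
z = np.convolve(noise, np.ones(20) / 20, mode="valid")

lags = range(1, 15)
gamma = mom_variogram(z, lags)
# Semivariance should rise with lag for a spatially correlated process
print(gamma[0] < gamma[-1])
```

A REML variogram would instead fit covariance parameters by maximising a likelihood over trend-filtered increments, which is why it can get by with fewer sites but needs a parametric model up front.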
Abstract:
An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.