19 results for "multiple scales method"
Abstract:
This thesis includes analysis of disordered spin ensembles corresponding to Exact Cover, a multi-access channel problem, and composite models combining sparse and dense interactions. The satisfiability problem in Exact Cover is addressed using a statistical analysis of a simple branch-and-bound algorithm. The algorithm can be formulated in the large-system limit as a branching process, for which critical properties can be analysed. Far from the critical point a set of differential equations may be used to model the process, and these are solved by numerical integration and exact bounding methods. The multi-access channel problem is formulated as an equilibrium statistical physics problem for the case of bit transmission on a channel with power control and synchronisation. A sparse code division multiple access method is considered; its optimal detection properties are examined in the typical case by use of the replica method and compared to the detection performance achieved by iterative decoding methods. These codes are found to exhibit phenomena closely resembling those of the well-understood dense codes. The composite model is introduced as an abstraction of canonical sparse and dense disordered spin models. The model includes couplings due to both dense and sparse topologies simultaneously. The new type of code is shown to outperform sparse and dense codes in some regimes, both in optimal performance and in the performance achieved by iterative detection methods in finite systems.
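The branching-process picture of the search tree can be illustrated with a toy Monte Carlo estimate of the survival probability. This sketch assumes a Galton-Watson process with a Poisson offspring law, which is a generic stand-in, not the thesis's actual algorithm:

```python
import numpy as np

def survival_probability(mean_offspring, generations=60, trials=4000, seed=0):
    """Monte Carlo estimate of the survival probability of a Galton-Watson
    branching process with a Poisson offspring distribution."""
    rng = np.random.default_rng(seed)
    alive = np.ones(trials, dtype=np.int64)  # one root node per trial
    for _ in range(generations):
        active = alive > 0
        if not active.any():
            break
        # the total offspring of k parents is Poisson(k * mean_offspring)
        alive[active] = rng.poisson(mean_offspring * alive[active])
        # cap the population: once large, extinction is effectively impossible
        np.minimum(alive, 100_000, out=alive)
    return float((alive > 0).mean())
```

Below the critical point (mean offspring < 1) the process dies out almost surely; above it, a positive fraction of runs survives, which is the qualitative transition the thesis analyses.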
Abstract:
Quantum dots (Qdots) are fluorescent nanoparticles that have great potential as detection agents in biological applications. Their optical properties, including photostability and narrow, symmetrical emission bands with large Stokes shifts, and the potential for multiplexing of many different colours, give them significant advantages over traditionally used fluorescent dyes. Here, we report the straightforward generation of stable, covalent quantum dot-protein A/G bioconjugates that can bind to almost any IgG antibody, and therefore can be used in many applications. An additional advantage is that the requirement for a secondary antibody is removed, simplifying experimental design. To demonstrate their use, we show their application in multiplexed western blotting. The sensitivity of Qdot conjugates is found to be superior to fluorescent dyes, and comparable to, or potentially better than, enhanced chemiluminescence. We show a true biological validation using a four-colour multiplexed western blot against a complex cell lysate background, and have significantly reduced the previously reported non-specific binding of Qdots to cellular proteins.
Abstract:
A new 3D implementation of a hybrid model based on the analogy with two-phase hydrodynamics has been developed for the simulation of liquids at the microscale. The idea of the method is to smoothly combine the atomistic description in the molecular dynamics zone with the Landau-Lifshitz fluctuating hydrodynamics representation in the rest of the system, within the framework of macroscopic conservation laws, through the use of a single "zoom-in" user-defined function s, which has the meaning of a partial concentration in the two-phase analogy model. In comparison with our previous work, the implementation has been extended to full 3D simulations for a range of atomistic models in GROMACS, from argon to water, in equilibrium conditions with a constant or a spatially variable function s. Preliminary results of simulating the diffusion of a small peptide in water are also reported.
Abstract:
OBJECTIVE: To determine the distribution of the pathological changes in the neocortex in multiple-system atrophy (MSA). METHOD: The vertical distribution of the abnormal neurons (neurons with enlarged or atrophic perikarya), surviving neurons, glial cytoplasmic inclusions (GCI) and neuronal cytoplasmic inclusions (NI) were studied in alpha-synuclein-stained material of frontal and temporal cortex in ten cases of MSA. RESULTS: Abnormal neurons exhibited two common patterns of distribution, viz., density was either maximal in the upper cortex or a bimodal distribution was present with a density peak in the upper and lower cortex. The NI were either located in the lower cortex or were more uniformly distributed down the cortical profile. The distribution of the GCI varied considerably between gyri and cases. The density of the glial cell nuclei was maximal in the lower cortex in the majority of gyri. In a number of gyri, there was a positive correlation between the vertical densities of the abnormal neurons, the total number of surviving neurons, and the glial cell nuclei. The vertical densities of the GCI were not correlated with those of the surviving neurons or glial cells but the GCI and NI were positively correlated in a small number of gyri. CONCLUSION: The data suggest that there is significant degeneration of the frontal and temporal lobes in MSA, the lower laminae being affected more significantly than the upper laminae. Cortical degeneration in MSA is likely to be secondary to pathological changes occurring within subcortical areas.
Abstract:
Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, together with knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R squared is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study [5]. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
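The two cautions above, a low R squared and too few subjects per variable, can be checked mechanically. The helper below is a hypothetical illustration using ordinary least squares in NumPy, not part of the Statnote:

```python
import numpy as np

def regression_diagnostics(X, y, min_ratio=10):
    """Fit ordinary least squares and report R^2, plus whether the sample
    meets the rule of thumb of >= min_ratio subjects per predictor."""
    n, p = X.shape
    Xd = np.column_stack([np.ones(n), X])          # add an intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # least-squares coefficients
    resid = y - Xd @ beta
    r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
    return {"r_squared": float(r2),
            "enough_subjects": n >= min_ratio * p}
```

By the article's criteria, a result with `r_squared` below 0.5 or `enough_subjects` equal to False would warrant scepticism.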
Abstract:
An investigator may also wish to select a small subset of the X variables which give the best prediction of the Y variable. In this case, the question is how many variables the regression equation should include. One method would be to calculate the regression of Y on every subset of the X variables and choose the subset that gives the smallest mean square deviation from the regression. Most investigators, however, prefer to use a 'stepwise multiple regression' procedure. There are two forms of this analysis, called the 'step-up' (or 'forward') method and the 'step-down' (or 'backward') method. This Statnote illustrates the use of stepwise multiple regression with reference to the scenario introduced in Statnote 24, viz., the influence of climatic variables on the growth of the crustose lichen Rhizocarpon geographicum (L.) DC.
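A minimal sketch of the 'step-up' idea: greedily add whichever X variable most reduces the residual sum of squares. Real stepwise procedures use F-to-enter/F-to-remove criteria rather than a fixed number of variables k, so this is an illustration of the principle only:

```python
import numpy as np

def rss(X, y):
    """Residual sum of squares of an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    r = y - Xd @ beta
    return float(r @ r)

def forward_stepwise(X, y, k):
    """Step-up selection: repeatedly add the predictor whose inclusion
    gives the smallest residual sum of squares, until k are chosen."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best = min(remaining, key=lambda j: rss(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

The step-down method is the mirror image: start from the full model and repeatedly drop the variable whose removal increases the residual sum of squares least.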
Abstract:
Purpose: To analyse the relationship between measured intraocular pressure (IOP) and central corneal thickness (CCT), corneal hysteresis (CH) and corneal resistance factor (CRF) in ocular hypertension (OHT), primary open-angle glaucoma (POAG) and normal tension glaucoma (NTG) eyes using multiple tonometry devices. Methods: Right eyes of patients diagnosed with OHT (n=47), NTG (n=17) and POAG (n=50) were assessed. IOP was measured in random order with four devices: Goldmann applanation tonometry (GAT); Pascal(R) dynamic contour tonometer (DCT); Reichert(R) ocular response analyser (ORA); and Tono-Pen(R) XL. CCT was then measured using a hand-held ultrasonic pachymeter. CH and CRF were derived from the air pressure to corneal reflectance relationship of the ORA data. Results: Compared with GAT, the Tono-Pen and the ORA Goldmann-equivalent (IOPg) and corneal-compensated (IOPcc) readings were higher (F=19.351, p<0.001), particularly in NTG (F=12.604, p<0.001). DCT was closest to Goldmann IOP and had the lowest variance. CCT differed significantly (F=8.305, p<0.001) between the three conditions, as did CH (F=6.854, p=0.002) and CRF (F=19.653, p<0.001). IOPcc measures were not affected by CCT. The DCT was generally not affected by corneal biomechanical factors. Conclusion: This study suggests that, as the true pressure of the eye cannot be determined non-invasively, measurements from any tonometer should be interpreted with care, particularly when alterations in the corneal tissue are suspected.
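The group comparisons reported as F statistics are one-way ANOVAs across the three diagnostic groups. The sketch below uses entirely hypothetical CCT values (the means are invented; only the group sizes match the abstract) to show the shape of such a test with SciPy:

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical CCT readings (micrometres); means are illustrative assumptions.
rng = np.random.default_rng(0)
cct_oht = rng.normal(580, 30, 47)   # ocular hypertension, n = 47
cct_ntg = rng.normal(525, 30, 17)   # normal tension glaucoma, n = 17
cct_poag = rng.normal(545, 30, 50)  # primary open-angle glaucoma, n = 50

# One-way ANOVA: does mean CCT differ between the three conditions?
F, p = f_oneway(cct_oht, cct_ntg, cct_poag)
```

A large F with small p, as in the abstract's CCT result, indicates that at least one group mean differs; it does not say which, so pairwise post hoc tests would follow in practice.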
Abstract:
An inverse problem is considered where the structure of multiple sound-soft planar obstacles is to be determined given the direction of the incoming acoustic field and knowledge of the corresponding total field on a curve located outside the obstacles. A local uniqueness result is given for this inverse problem suggesting that the reconstruction can be achieved by a single incident wave. A numerical procedure based on the concept of the topological derivative of an associated cost functional is used to produce images of the obstacles. No a priori assumption about the number of obstacles present is needed. Numerical results are included showing that accurate reconstructions can be obtained and that the proposed method is capable of finding both the shapes and the number of obstacles with one or a few incident waves.
Abstract:
Clinical Decision Support Systems (CDSSs) need to disseminate expertise in formats that suit different end users and with functionality tuned to the context of assessment. This paper reports research into a method for designing and implementing knowledge structures that facilitate the required flexibility. A psychological model of expertise is represented using a series of formally specified and linked XML trees that capture increasing elements of the model, starting with hierarchical structuring, incorporating reasoning with uncertainty, and ending with delivering the final CDSS. The method was applied to the Galatean Risk and Safety Tool, GRiST, which is a web-based clinical decision support system (www.egrist.org) for assessing mental-health risks. Results of its clinical implementation demonstrate that the method can produce a system that is able to deliver expertise targeted and formatted for specific patient groups, different clinical disciplines, and alternative assessment settings. The approach may be useful for developing other real-world systems using human expertise and is currently being applied to a logistics domain. © 2013 Polish Information Processing Society.
Abstract:
An increasing number of publications on the dried blood spot (DBS) sampling approach for the quantification of drugs and metabolites have been spurred on by the inherent advantages of this sampling technique. In the present research, a selective and sensitive high-performance liquid chromatography method for the concurrent determination of multiple antiepileptic drugs (AEDs) [levetiracetam (LVT), lamotrigine (LTG), phenobarbital (PHB), carbamazepine (CBZ) and its active metabolite carbamazepine-10,11-epoxide (CBZE)] in a single DBS has been developed and validated. Whole blood was spotted onto Guthrie cards and dried. Using a standard punch (6 mm diameter), a circular disc was punched from the card, extracted with methanol:acetonitrile (3:1, v/v) containing hexobarbital (internal standard), and sonicated prior to evaporation. The extract was then dissolved in water and vortex mixed before undergoing solid phase extraction using HLB cartridges. Chromatographic separation of the AEDs was achieved using a Waters XBridge™ C18 column with a gradient system. The developed method was linear over the concentration ranges studied, with r ≥ 0.995 for all compounds. The lower limits of quantification (LLOQs) were 2, 1, 2, 0.5 and 1 μg/mL for LVT, LTG, PHB, CBZE and CBZ, respectively. Accuracy (%RE) and precision (%CV) values for within and between day were <20% at the LLOQs and <15% at all other concentrations tested. This method was successfully applied to the analysis of the AEDs in DBS samples taken from children with epilepsy for the assessment of their adherence to prescribed treatments.
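The stated acceptance criteria (accuracy %RE and precision %CV below 20% at the LLOQ, below 15% at other concentrations) can be expressed as a small helper. The function and the replicate values used below are illustrative, not the authors' validation code or data:

```python
import statistics

def accuracy_precision(measured, nominal, at_lloq=False):
    """Percent relative error (accuracy) and percent CV (precision) for
    replicate QC measurements, checked against the abstract's acceptance
    limits: <20% at the LLOQ, <15% at all other concentrations."""
    mean = statistics.mean(measured)
    re_pct = 100 * (mean - nominal) / nominal          # accuracy, %RE
    cv_pct = 100 * statistics.stdev(measured) / mean   # precision, %CV
    limit = 20 if at_lloq else 15
    passes = abs(re_pct) < limit and cv_pct < limit
    return re_pct, cv_pct, passes
```

For example, four replicates of a 10 μg/mL quality-control sample would pass if their mean is within 15% of nominal and their spread is correspondingly tight.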
Abstract:
Objectives: To conduct an independent evaluation of the first phase of the Health Foundation's Safer Patients Initiative (SPI), and to identify the net additional effect of SPI and any differences in changes in participating and non-participating NHS hospitals. Design: Mixed method evaluation involving five substudies, before and after design. Setting: NHS hospitals in United Kingdom. Participants: Four hospitals (one in each country in the UK) participating in the first phase of the SPI (SPI1); 18 control hospitals. Intervention: The SPI1 was a compound (multicomponent) organisational intervention delivered over 18 months that focused on improving the reliability of specific frontline care processes in designated clinical specialties and promoting organisational and cultural change. Results: Senior staff members were knowledgeable and enthusiastic about SPI1. There was a small (0.08 points on a 5 point scale) but significant (P<0.01) effect in favour of the SPI1 hospitals in one of 11 dimensions of the staff questionnaire (organisational climate). Qualitative evidence showed only modest penetration of SPI1 at medical ward level. Although SPI1 was designed to engage staff from the bottom up, it did not usually feel like this to those working on the wards, and questions about legitimacy of some aspects of SPI1 were raised. Of the five components to identify patients at risk of deterioration - monitoring of vital signs (14 items); routine tests (three items); evidence based standards specific to certain diseases (three items); prescribing errors (multiple items from the British National Formulary); and medical history taking (11 items) - there was little net difference between control and SPI1 hospitals, except in relation to quality of monitoring of acute medical patients, which improved on average over time across all hospitals. 
Recording of respiratory rate increased to a greater degree in SPI1 than in control hospitals; in the second six hours after admission recording increased from 40% (93) to 69% (165) in control hospitals and from 37% (141) to 78% (296) in SPI1 hospitals (odds ratio for "difference in difference" 2.1, 99% confidence interval 1.0 to 4.3; P=0.008). Use of a formal scoring system for patients with pneumonia also increased over time (from 2% (102) to 23% (111) in control hospitals and from 2% (170) to 9% (189) in SPI1 hospitals), which favoured controls and was not significant (0.3, 0.02 to 3.4; P=0.173). There were no improvements in the proportion of prescription errors and no effects that could be attributed to SPI1 in non-targeted generic areas (such as enhanced safety culture). On some measures, the lack of effect could be because compliance was already high at baseline (such as use of steroids in over 85% of cases where indicated), but even when there was more room for improvement (such as in quality of medical history taking), there was no significant additional net effect of SPI1. There were no changes over time or between control and SPI1 hospitals in errors or rates of adverse events in patients in medical wards. Mortality increased from 11% (27) to 16% (39) among controls and decreased from 17% (63) to 13% (49) among SPI1 hospitals, but the risk adjusted difference was not significant (0.5, 0.2 to 1.4; P=0.085). Poor care was a contributing factor in four of the 178 deaths identified by review of case notes. The survey of patients showed no significant differences apart from an increase in perception of cleanliness in favour of SPI1 hospitals. Conclusions: The introduction of SPI1 was associated with improvements in one of the types of clinical process studied (monitoring of vital signs) and one measure of staff perceptions of organisational climate.
There was no additional effect of SPI1 on other targeted issues nor on other measures of generic organisational strengthening.
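The "difference in difference" odds ratio for respiratory-rate recording can be sketched from the quoted percentages. Note that the paper's 2.1 is risk-adjusted, so this unadjusted calculation gives a nearby but different figure:

```python
def odds(p):
    """Convert a proportion to odds."""
    return p / (1 - p)

def diff_in_diff_or(p_ctrl_before, p_ctrl_after, p_int_before, p_int_after):
    """Unadjusted 'difference in difference' odds ratio: the ratio of the
    before-to-after odds ratio in intervention hospitals to that in
    control hospitals. The published 2.1 was risk-adjusted, so this raw
    figure is only an approximation of the reported effect."""
    or_control = odds(p_ctrl_after) / odds(p_ctrl_before)
    or_intervention = odds(p_int_after) / odds(p_int_before)
    return or_intervention / or_control

# Respiratory-rate recording, second six hours after admission:
# controls 40% -> 69%, SPI1 hospitals 37% -> 78%.
did = diff_in_diff_or(0.40, 0.69, 0.37, 0.78)
```

The unadjusted value comes out around 1.8, in the same direction as (and close to) the adjusted 2.1 reported in the results.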
Abstract:
A new LIBS quantitative analysis method based on adaptive analytical-line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines to be used as input variables of the regression model are determined adaptively according to the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval from the predictive probability distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and better modeling robustness compared with methods based on partial least squares regression, artificial neural networks and a standard support vector machine.
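scikit-learn ships no RVM, but `ARDRegression` is a closely related sparse Bayesian linear model whose `predict(..., return_std=True)` yields the same kind of probabilistic confidence interval the abstract describes. The data below are synthetic stand-ins for line intensities and certified concentrations, not the paper's samples:

```python
import numpy as np
from sklearn.linear_model import ARDRegression

# Synthetic stand-in: 40 "samples" with 5 candidate analytical-line
# intensities; only two lines actually carry concentration information.
rng = np.random.default_rng(0)
intensities = rng.uniform(0.1, 1.0, size=(40, 5))
true_weights = np.array([2.0, 0.0, 1.5, 0.0, 0.0])
concentration = intensities @ true_weights + rng.normal(scale=0.05, size=40)

# Sparse Bayesian regression: irrelevant lines are pruned automatically,
# analogous to the relevance-determination behaviour of an RVM.
model = ARDRegression().fit(intensities, concentration)

# Predictions come with a standard deviation, from which a confidence
# interval on each predicted concentration can be formed.
pred, std = model.predict(intensities[:5], return_std=True)
lower, upper = pred - 1.96 * std, pred + 1.96 * std
```

The interval width is what the abstract calls the measure of uncertainty in the spectra: noisier or less informative inputs widen `std`, and hence the reported confidence interval.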
Abstract:
This paper presents an effective decision making system for leak detection based on multiple generalized linear models and clustering techniques. The training data for the proposed decision system were obtained by setting up a fully operational experimental pipeline distribution system. The system is equipped with data logging of three variables, namely inlet pressure, outlet pressure, and outlet flow. The experimental setup is designed such that multi-operational conditions of the distribution system, including multiple pressure and flow regimes, can be obtained. We then statistically tested and showed that the pressure and flow variables can be used as signatures of a leak under the designed multi-operational conditions. It is then shown that detection of leakages based on training and testing the proposed multi-model decision system with prior data clustering, under multi-operational conditions, produces better recognition rates than training based on the single-model approach. This decision system is then equipped with the estimation of confidence limits, and a method is proposed for using these confidence limits to obtain more robust leakage recognition results.
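The cluster-then-fit idea can be sketched as KMeans over operating conditions plus one logistic-regression GLM per cluster. The feature layout (inlet pressure, outlet pressure, outlet flow) follows the abstract, but the class and everything else is an illustrative assumption, not the paper's implementation:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

class MultiModelLeakDetector:
    """Sketch of the multi-model scheme: cluster operating conditions
    first, then train one GLM (logistic regression) per cluster.
    Feature columns are assumed to be
    [inlet pressure, outlet pressure, outlet flow]."""

    def __init__(self, n_clusters=3, seed=0):
        self.km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
        self.models = {}

    def fit(self, X, leak):
        labels = self.km.fit_predict(X)           # group by operating regime
        for c in np.unique(labels):
            mask = labels == c
            self.models[c] = LogisticRegression().fit(X[mask], leak[mask])
        return self

    def predict(self, X):
        labels = self.km.predict(X)               # route each sample to its regime
        out = np.empty(len(X), dtype=int)
        for c, model in self.models.items():
            mask = labels == c
            if mask.any():
                out[mask] = model.predict(X[mask])
        return out
```

The point of the per-cluster models is that a single GLM must average over regimes, whereas each cluster-specific model only has to separate leak from no-leak within one operating condition.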
Abstract:
Data fluctuation in multiple measurements of Laser Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on the Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance analysis accuracy is to improve the quality and consistency of the emission signal, for example by averaging the spectral signals or applying spectrum standardization over a number of laser shots. The proposed method focuses instead on enhancing the robustness of the quantitative analysis regression model. The proposed RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation according to the statistical distribution of the measured spectral data. Through the improved segmented weighting function, the information on spectral data within the normal distribution is retained in the regression model, while the information on outliers is restrained or removed. Copper elemental concentration analysis experiments on 16 certified standard brass samples were carried out. The average value of the relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness compared with quantitative analysis methods based on Partial Least Squares (PLS) regression, the standard Support Vector Machine (SVM) and WLS-SVM. It was also demonstrated that the improved weighting function performed better overall in model robustness and convergence speed, compared with four known weighting functions.
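A compact NumPy sketch of the WLS-SVM idea with a segmented, Hampel-type weighting function follows. It illustrates the general reweighting scheme (fit, standardize residuals, down-weight outliers, refit), not the paper's improved weighting function or its parameters:

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def hampel_weights(residuals, c1=2.5, c2=3.0):
    """Segmented weighting: full weight for small standardized residuals,
    linearly decaying weight in the middle band, near-zero for outliers.
    A generic stand-in for the paper's improved segmented function."""
    s = 1.483 * np.median(np.abs(residuals - np.median(residuals)))  # robust scale
    r = np.abs(residuals) / max(s, 1e-12)
    return np.where(r <= c1, 1.0,
                    np.where(r <= c2, (c2 - r) / (c2 - c1), 1e-4))

def wls_svm_fit(X, y, gamma=10.0, sigma=1.0, reweight_steps=2):
    """LS-SVM regression (Suykens-style linear system), followed by
    iterative reweighting of the residuals."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    w = np.ones(n)
    for _ in range(reweight_steps + 1):
        # [0  1^T ] [b    ]   [0]
        # [1  K+D ] [alpha] = [y],  D = diag(1 / (gamma * w_i))
        A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                      [np.ones((n, 1)), K + np.diag(1 / (gamma * w))]])
        sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
        b, alpha = sol[0], sol[1:]
        w = hampel_weights(y - (K @ alpha + b))
    return alpha, b

def wls_svm_predict(X_train, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

After reweighting, gross outliers contribute almost nothing to the solved system, which is the robustness property the abstract attributes to the segmented weighting function.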
Abstract:
Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods, and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2. We then assigned an alphanumeric code based on layers 1 and 2 to every finding in layer 3, linking the aims, methods, and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding. We also showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings of the multiple methods from ENIGMA. We subsequently successfully tested it by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
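The layered alphanumeric coding can be represented directly as data: numbered effects, lettered methods, and findings carrying one or more effect-method codes. The effects, methods, and findings below are hypothetical placeholders, not content from ENIGMA or MINuET:

```python
# Layer 1: numbered effects; layer 2: lettered methods (all hypothetical).
effects = {1: "Reduce endoscopy waiting times", 2: "Improve patient experience"}
methods = {"A": "Routine-data analysis", "B": "Staff interviews"}

# Layer 3: findings, each tagged with alphanumeric codes "<effect><method>".
# A finding with several codes is an 'analogous' finding reported by more
# than one effect/method combination.
findings = [
    {"text": "Waiting times fell in modernised units", "codes": ["1A"]},
    {"text": "Patients valued nurse-led endoscopy", "codes": ["2B", "1B"]},
]

def findings_for(effect=None, method=None):
    """Return finding texts whose codes match the given effect number
    and/or method letter, mimicking a lookup across the MATRICS grid."""
    hits = []
    for f in findings:
        for code in f["codes"]:
            if (effect is None or code[:-1] == str(effect)) and \
               (method is None or code[-1] == method):
                hits.append(f["text"])
                break
    return hits
```

Contradictory findings would simply be adjacent entries in `findings` sharing a code, mirroring the adjacent-row convention described in the abstract.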