195 results for Set-valued map
Abstract:
To overcome the weak evidence base resulting from often poor and insufficient clinical research in older people, a minimum data set to achieve harmonisation is highly advisable. This would lead to a uniform nomenclature and to the standardisation of assessment tools. Our primary objective was to develop a Geriatric Minimum Data Set (GMDS) for clinical research.
Abstract:
This chapter offers an analysis of a map of the Irish border. The Map of Watchful Architecture charts the history of defensive structures in the border region.
Abstract:
In this article, we focus on the analysis of competitive gene set methods for detecting the statistical significance of pathways from gene expression data. Our main result demonstrates that some of the most frequently used gene set methods, GSEA, GSEArot and GAGE, are severely influenced by the filtering of the data, to the point that such an analysis is no longer reconcilable with the principles of statistical inference, rendering the obtained results in the worst case meaningless. A possible consequence is that these methods can increase their power through the addition of unrelated data and noise. Our results are obtained within a bootstrapping framework that allows a rigorous assessment of the robustness of results and enables power estimates. They indicate that when using competitive gene set methods, it is imperative to apply a stringent gene filtering criterion. However, even when genes are filtered appropriately, for gene expression data from chips that do not provide genome-scale coverage of the expression values of all mRNAs, this is not enough for GSEA, GSEArot and GAGE to ensure the statistical soundness of the applied procedure. For this reason, we strongly advise against using GSEA, GSEArot and GAGE for such data sets in biomedical and clinical studies.
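The dependence on filtering described above can be illustrated with a minimal competitive gene set test (a simplified sketch with synthetic scores, not the GSEA, GSEArot or GAGE implementations): the set statistic is compared against random sets drawn from the background genes, so removing background genes changes the null distribution, and hence the p-value, even when the pathway genes themselves are untouched.

```python
import numpy as np

rng = np.random.default_rng(0)

def competitive_pvalue(gene_scores, set_idx, n_resamples=2000, rng=rng):
    """P(random gene set of the same size >= observed mean score)."""
    observed = gene_scores[set_idx].mean()
    k = len(set_idx)
    null = np.array([
        gene_scores[rng.choice(len(gene_scores), size=k, replace=False)].mean()
        for _ in range(n_resamples)
    ])
    return (np.sum(null >= observed) + 1) / (n_resamples + 1)

# Synthetic per-gene differential-expression scores: the first 20 genes
# form the "pathway" with a mild shift; the remaining 980 are background.
scores = np.concatenate([rng.normal(0.5, 1, 20), rng.normal(0, 1, 980)])
p_full = competitive_pvalue(scores, np.arange(20))

# The same test after discarding half of the background: the pathway is
# unchanged, but the null distribution (and so the p-value) is not.
filtered = np.concatenate([scores[:20], scores[20:510]])
p_filtered = competitive_pvalue(filtered, np.arange(20))
```

A bootstrapping assessment of the kind the abstract describes would repeat such a test over resampled data sets to gauge the stability of the resulting p-values.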
Abstract:
Improvement in the quality of end-of-life (EOL) care is a priority health care issue since serious deficiencies in quality of care have been reported across care settings. Increasing pressure is now focused on Canadian health care organizations to be accountable for the quality of palliative and EOL care delivered. Numerous domains of quality EOL care upon which to create accountability frameworks are now published, with some derived from the patient/family perspective. There is a need to reach common ground on the domains of quality EOL care valued by patients and families in order to develop consistent performance measures and set priorities for health care improvement. This paper describes a meta-synthesis study to develop a common conceptual framework of quality EOL care integrating attributes of quality valued by patients and their families. © 2005 Centre for Bioethics, IRCM.
Abstract:
When examining complex problems, such as the folding of proteins, coarse grained descriptions of the system drive our investigation and help us to rationalize the results. Oftentimes collective variables (CVs), derived through some chemical intuition about the process of interest, serve this purpose. Because finding these CVs is the most difficult part of any investigation, we recently developed a dimensionality reduction algorithm, sketch-map, that can be used to build a low-dimensional map of a phase space of high-dimensionality. In this paper we discuss how these machine-generated CVs can be used to accelerate the exploration of phase space and to reconstruct free-energy landscapes. To do so, we develop a formalism in which high-dimensional configurations are no longer represented by low-dimensional position vectors. Instead, for each configuration we calculate a probability distribution, which has a domain that encompasses the entirety of the low-dimensional space. To construct a biasing potential, we exploit an analogy with metadynamics and use the trajectory to adaptively construct a repulsive, history-dependent bias from the distributions that correspond to the previously visited configurations. This potential forces the system to explore more of phase space by making it desirable to adopt configurations whose distributions do not overlap with the bias. We apply this algorithm to a small model protein and succeed in reproducing the free-energy surface that we obtain from a parallel tempering calculation.
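The history-dependent bias described above can be sketched in one dimension (a toy illustration under simplifying assumptions, not the authors' sketch-map code): each visited configuration is represented by a Gaussian probability distribution over the low-dimensional space, and the repulsive bias felt by a new configuration is the overlap of its distribution with the accumulated history, in analogy with metadynamics.

```python
import numpy as np

grid = np.linspace(-5, 5, 201)          # low-dimensional CV space (1-D here)
sigma, height = 0.3, 1.0                # distribution width, hill height

def distribution(x0):
    """Probability distribution representing a configuration at x0."""
    g = np.exp(-(grid - x0) ** 2 / (2 * sigma ** 2))
    return g / g.sum()

history = np.zeros_like(grid)           # accumulated repulsive bias

def bias(x0):
    """Overlap of this configuration's distribution with the history."""
    return float(np.dot(distribution(x0), history))

def visit(x0):
    """Deposit a repulsive hill at a visited configuration."""
    global history
    history = history + height * distribution(x0)

visit(0.0)                               # the trajectory sits near x = 0
b_here, b_away = bias(0.0), bias(2.0)    # revisiting x = 0 is now penalised
```

Because the bias is built from overlapping distributions rather than point positions, configurations whose distributions do not overlap the history feel no penalty, which is what drives the exploration of new regions.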
Abstract:
This paper presents an Invariant Information Local Sub-map Filter (IILSF) as a technique for consistent Simultaneous Localisation and Mapping (SLAM) in a large environment. It harnesses the benefits of the sub-map technique to improve the consistency and efficiency of Extended Kalman Filter (EKF) based SLAM. The IILSF makes use of invariant information obtained from estimated locations of features in independent sub-maps, instead of incorporating every observation directly into the global map; the global map is then updated at regular intervals. Applying this technique to the EKF based SLAM algorithm (a) reduces the computational complexity of maintaining the global map estimates and (b) simplifies the transformation complexities and data association ambiguities usually experienced in fusing sub-maps together. Simulation results show that the method was able to accurately fuse local map observations to generate an efficient and consistent global map, in addition to significantly reducing computational cost and data association ambiguities.
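The sub-map idea can be sketched schematically (hypothetical data structures for illustration only; the IILSF itself maintains full EKF covariances, which are omitted here): observations update a small local map cheaply, and only at intervals are the local feature estimates transformed into the global frame and fused, so the expensive global update is infrequent.

```python
import numpy as np

class SubMapSLAM:
    """Toy sub-map bookkeeping: cheap local updates, periodic global fusion."""

    def __init__(self, fuse_every=10):
        self.global_map = {}             # feature id -> global-frame estimate
        self.local_map = {}              # feature id -> (local estimate, count)
        self.local_origin = np.zeros(2)  # pose of local frame in global frame
        self.fuse_every = fuse_every
        self.n_obs = 0

    def observe(self, feat_id, local_xy):
        # Cheap update: average repeated observations within the sub-map.
        prev, n = self.local_map.get(feat_id, (np.zeros(2), 0))
        est = (prev * n + np.asarray(local_xy)) / (n + 1)
        self.local_map[feat_id] = (est, n + 1)
        self.n_obs += 1
        if self.n_obs % self.fuse_every == 0:
            self.fuse()

    def fuse(self):
        # Infrequent global update: transform sub-map features into the
        # global frame and merge them (here a simple average stands in for
        # the filter's statistically weighted fusion).
        for fid, (xy, _) in self.local_map.items():
            g = self.local_origin + xy
            if fid in self.global_map:
                self.global_map[fid] = 0.5 * (self.global_map[fid] + g)
            else:
                self.global_map[fid] = g
        self.local_map = {}
```

The design point the abstract makes is visible in the structure: per-observation cost depends only on the (small) local map, while the global map is touched once per fusion interval.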
Abstract:
This study aims to evaluate the use of Varian radiotherapy dynamic treatment log (DynaLog) files to verify IMRT plan delivery as part of a routine quality assurance procedure. Delivery accuracy in terms of machine performance was quantified by multileaf collimator (MLC) position errors and fluence delivery accuracy for patients receiving intensity modulated radiation therapy (IMRT) treatment. The relationship between machine performance and plan complexity, quantified by the modulation complexity score (MCS) was also investigated. Actual MLC positions and delivered fraction of monitor units (MU), recorded every 50 ms during IMRT delivery, were extracted from the DynaLog files. The planned MLC positions and fractional MU were taken from the record and verify system MLC control file. Planned and delivered beam data were compared to determine leaf position errors with and without the overshoot effect. Analysis was also performed on planned and actual fluence maps reconstructed from the MLC control file and delivered treatment log files respectively. This analysis was performed for all treatment fractions for 5 prostate, 5 prostate and pelvic node (PPN) and 5 head and neck (H&N) IMRT plans, totalling 82 IMRT fields in ∼5500 DynaLog files. The root mean square (RMS) leaf position errors without the overshoot effect were 0.09, 0.26, 0.19 mm for the prostate, PPN and H&N plans respectively, which increased to 0.30, 0.39 and 0.30 mm when the overshoot effect was considered. Average errors were not affected by the overshoot effect and were 0.05, 0.13 and 0.17 mm for prostate, PPN and H&N plans respectively. The percentage of pixels passing fluence map gamma analysis at 3%/3 mm was 99.94 ± 0.25%, which reduced to 91.62 ± 11.39% at 1%/1 mm criterion. Leaf position errors, but not gamma passing rate, were directly related to plan complexity as determined by the MCS. 
Site-specific confidence intervals for average leaf position errors were set at -0.03 to 0.12 mm for prostate plans and -0.02 to 0.28 mm for the more complex PPN and H&N plans. For all treatment sites, the confidence interval for RMS errors with the overshoot effect was set at 0 to 0.50 mm, and a confidence interval of 68.83% was set for the percentage of pixels passing gamma analysis at 1%/1 mm. This work demonstrates the successful implementation of treatment log files to validate IMRT deliveries and shows how dynamic log files can diagnose delivery errors not detectable with phantom-based QC. Machine performance was found to be directly related to plan complexity, but this is not the dominant determinant of delivery accuracy.
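The per-leaf error metrics reported above can be sketched as follows (synthetic positions; parsing of the actual Varian DynaLog file format is not reproduced here): given planned and delivered MLC leaf positions sampled every 50 ms, the average and RMS errors are computed over all leaves and time samples.

```python
import numpy as np

def leaf_position_errors(planned, actual):
    """Return (average error, RMS error) over all leaves and samples, in mm."""
    err = np.asarray(actual) - np.asarray(planned)
    return float(err.mean()), float(np.sqrt(np.mean(err ** 2)))

# Synthetic example: 60 leaf pairs sampled at 200 time points, with a
# small systematic lag (as with the overshoot effect) plus random noise.
rng = np.random.default_rng(1)
planned = rng.uniform(-50, 50, size=(200, 60))   # planned positions (mm)
actual = planned + 0.1 + rng.normal(0, 0.2, size=planned.shape)

avg_err, rms_err = leaf_position_errors(planned, actual)
```

As in the study, the RMS error is more sensitive than the average error to a systematic offset combined with noise, which is why the overshoot effect raised the RMS values while leaving the averages largely unchanged.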
Abstract:
Advances in computational and information technologies have facilitated the acquisition of geospatial information for regional and national soil and geology databases. These have been compiled for a range of purposes, from geological and soil baseline mapping to economic prospecting and land resource assessment, but have become increasingly used for forensic purposes. On the question of the provenance of a questioned sample, the geologist or soil scientist will invariably draw on prior expert knowledge and available digital map and database sources in a 'pseudo-Bayesian' approach. The context of this paper is the debate on whether existing (digital) geology and soil databases are indeed useful and suitable for forensic inference. Published and new case studies are used to explore issues of completeness, consistency, compatibility and applicability in relation to the use of digital geology and soil databases in environmental and criminal forensics. One key theme that emerges is that, although databases can be neither exhaustive nor precise enough to portray spatial variability at the scene-of-crime scale, when coupled with expert knowledge they play an invaluable role in providing background or reference material in a criminal investigation. Moreover, databases can offer an independent control set of samples.