175 results for Computer algorithms
at Université de Lausanne, Switzerland
Abstract:
To make a comprehensive evaluation of organ-specific out-of-field doses using Monte Carlo (MC) simulations for different breast cancer irradiation techniques and to compare the results with a commercial treatment planning system (TPS). Three breast radiotherapy techniques using 6 MV tangential photon beams were compared: (a) 2DRT (open rectangular fields), (b) 3DCRT (conformal wedged fields), and (c) hybrid IMRT (open conformal + modulated fields). Over 35 organs were contoured in a whole-body CT scan, and organ-specific dose distributions were determined with MC and with the TPS. Large differences in out-of-field doses were observed between MC and TPS calculations, even for organs close to the target volume such as the heart, the lungs, and the contralateral breast (up to 70% difference). MC simulations showed that a large fraction of the out-of-field dose comes from out-of-field head-scatter fluence (>40%), which is not adequately modeled by the TPS. Based on the MC simulations, the 3DCRT technique using external wedges yielded significantly higher doses (up to a factor of 4-5 in the pelvis) than the 2DRT and hybrid IMRT techniques, which yielded similar out-of-field doses. Contrary to popular belief, the IMRT technique investigated here does not increase the out-of-field dose compared with conventional techniques and may offer the optimal plan. The 3DCRT technique with external wedges yields the largest out-of-field doses. For accurate out-of-field dose assessment, a commercial TPS should not be used, even for organs near the target volume (contralateral breast, lungs, heart).
Abstract:
In the first part of this research, a three-stage programme was outlined to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are: (a) develop a standard methodology for analysing ink samples by high-performance thin-layer chromatography (HPTLC) in a reproducible way, even when ink samples are analysed at different times, in different locations, and by different examiners; (b) compare ink samples automatically and objectively; and (c) define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied under various chemical and forensic conditions using standard performance tests commonly employed in biometrics studies. The results show that different algorithms are best suited for different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
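A minimal sketch of what one objective comparison step could look like, assuming each HPTLC run is reduced to a vector of band intensities; the metric (Pearson correlation of normalised profiles) and all values are illustrative and are not taken from the report:

import numpy as np

def compare_ink_profiles(profile_a, profile_b):
    # Hypothetical similarity metric: Pearson correlation between two
    # normalised HPTLC intensity profiles (1.0 = identical shape).
    a = np.asarray(profile_a, dtype=float)
    b = np.asarray(profile_b, dtype=float)
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Toy example: two scans of the same ink versus a different ink.
print(compare_ink_profiles([0.1, 0.8, 0.3, 0.05], [0.12, 0.75, 0.33, 0.06]))  # close to 1
print(compare_ink_profiles([0.1, 0.8, 0.3, 0.05], [0.60, 0.10, 0.10, 0.70]))  # much lower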
Abstract:
PURPOSE: To determine the lower limit of dose reduction with hybrid and fully iterative reconstruction algorithms in the detection of endoleaks and in-stent thrombus of the thoracic aorta with computed tomographic (CT) angiography, by applying protocols with different tube energies and automated tube current modulation. MATERIALS AND METHODS: The calcification insert of an anthropomorphic cardiac phantom was replaced with an aortic aneurysm model containing a stent, simulated endoleaks, and an intraluminal thrombus. CT was performed at tube energies of 120, 100, and 80 kVp with incrementally increasing noise indexes (NIs) of 16, 25, 34, 43, 52, 61, and 70 and a 2.5-mm section thickness. The NI directly controls radiation exposure: a higher NI allows for greater image noise and decreases radiation. Images were reconstructed with filtered back projection (FBP) and with hybrid and fully iterative algorithms. Five radiologists independently analyzed lesion conspicuity to assess sensitivity and specificity. Mean attenuation (in Hounsfield units) and standard deviation were measured in the aorta to calculate the signal-to-noise ratio (SNR). Attenuation and SNR of the different protocols and algorithms were analyzed with analysis of variance or the Welch test, depending on the data distribution. RESULTS: Both sensitivity and specificity were 100% for simulated lesions on images with 2.5-mm section thickness at an NI of 25 (3.45 mGy), 34 (1.83 mGy), or 43 (1.16 mGy) at 120 kVp; an NI of 34 (1.98 mGy), 43 (1.23 mGy), or 61 (0.61 mGy) at 100 kVp; and an NI of 43 (1.46 mGy) or 70 (0.54 mGy) at 80 kVp. SNR values showed similar results. With the fully iterative algorithm, mean attenuation of the aorta decreased significantly in the reduced-dose protocols compared with the control protocols at 100 kVp (311 HU at NI 16 vs 290 HU at NI 70, P ≤ .0011) and at 80 kVp (400 HU at NI 16 vs 369 HU at NI 70, P ≤ .0007). CONCLUSION: Endoleaks and in-stent thrombus of the thoracic aorta were detectable down to 1.46 mGy (80 kVp) with FBP, 1.23 mGy (100 kVp) with the hybrid algorithm, and 0.54 mGy (80 kVp) with the fully iterative algorithm.
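The SNR used above is simply the mean attenuation of an aortic region of interest divided by its standard deviation; a minimal sketch, with purely illustrative ROI values rather than the study's measurements:

import numpy as np

def aortic_snr(roi_hu):
    # SNR as defined in the abstract: mean attenuation (HU) of an aortic ROI
    # divided by the standard deviation of the same ROI (image noise).
    roi = np.asarray(roi_hu, dtype=float)
    return roi.mean() / roi.std()

# Illustrative ROI samples: a low-noise protocol versus a dose-reduced one.
rng = np.random.default_rng(0)
print(aortic_snr(rng.normal(311, 20, 500)))   # higher SNR
print(aortic_snr(rng.normal(290, 45, 500)))   # lower SNR at reduced dose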
Abstract:
Accurate prediction of transcription factor binding sites is needed to unravel the function and regulation of genes discovered in genome sequencing projects. To evaluate current computer prediction tools, we have begun a systematic study of the sequence-specific DNA-binding of a transcription factor belonging to the CTF/NFI family. Using a systematic collection of rationally designed oligonucleotides combined with an in vitro DNA binding assay, we found that the sequence specificity of this protein cannot be represented by a simple consensus sequence or weight matrix. For instance, CTF/NFI uses a flexible DNA binding mode that allows for variations of the binding site length. From the experimental data, we derived a novel prediction method using a generalised profile as a binding site predictor. Experimental evaluation of the generalised profile indicated that it accurately predicts the binding affinity of the transcription factor to natural or synthetic DNA sequences. Furthermore, the in vitro measured binding affinities of a subset of oligonucleotides were found to correlate with their transcriptional activities in transfected cells. The combined computational-experimental approach exemplified in this work thus resulted in an accurate prediction method for CTF/NFI binding sites potentially functioning as regulatory regions in vivo.
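For illustration only, a sketch of fixed-length weight-matrix scoring of candidate sites; the generalised profile derived in the study additionally accommodates variable binding-site lengths, and the counts below are hypothetical:

import numpy as np

ALPHABET = "ACGT"
counts = np.array([          # hypothetical per-position base counts from known sites
    [8, 1, 1, 0],            # position 1: mostly A
    [0, 0, 9, 1],            # position 2: mostly G
    [1, 8, 0, 1],            # position 3: mostly C
], dtype=float)
# Log-odds weight matrix against a uniform background, with a 0.5 pseudocount.
pwm = np.log((counts + 0.5) / (counts + 0.5).sum(axis=1, keepdims=True) / 0.25)

def score_sites(seq):
    # Slide the matrix along seq and return one log-odds score per window.
    w = len(pwm)
    idx = [ALPHABET.index(c) for c in seq]
    return [sum(pwm[j, idx[i + j]] for j in range(w)) for i in range(len(seq) - w + 1)]

print(score_sites("TTAGCAA"))  # the AGC window scores highest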
Abstract:
Regulatory gene networks contain generic modules, like those involving feedback loops, which are essential for the regulation of many biological functions (Guido et al. in Nature 439:856-860, 2006). We consider a class of self-regulated genes which are the building blocks of many regulatory gene networks, and study the steady-state distribution of the associated Gillespie algorithm by providing efficient numerical algorithms. We also study a regulatory gene network of interest in gene therapy, using mean-field models with time delays. Convergence of the related time-nonhomogeneous Markov chain is established for a class of linear catalytic networks with feedback loops.
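A minimal sketch of the Gillespie algorithm for a single self-repressed gene, estimating the steady-state copy-number distribution from the time spent in each state; the propensity functions and rate constants are illustrative, not those of the paper:

import random

def gillespie_self_regulated(t_end=5000.0, k0=10.0, K=20.0, gamma=1.0, seed=0):
    # Self-repressed gene: production propensity k0*K/(K+n) (negative feedback),
    # degradation propensity gamma*n; all constants are illustrative only.
    rng = random.Random(seed)
    t, n, hist = 0.0, 0, {}
    while t < t_end:
        prod, deg = k0 * K / (K + n), gamma * n
        total = prod + deg
        dt = rng.expovariate(total)          # waiting time to the next reaction
        hist[n] = hist.get(n, 0.0) + dt      # time spent at copy number n
        t += dt
        n += 1 if rng.random() < prod / total else -1
    z = sum(hist.values())
    return {k: v / z for k, v in sorted(hist.items())}   # empirical steady-state distribution

dist = gillespie_self_regulated()
print(max(dist, key=dist.get))   # most probable copy number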
Abstract:
BACKGROUND: Surveillance of multiple congenital anomalies is considered to be more sensitive for the detection of new teratogens than surveillance of all or isolated congenital anomalies. The current literature proposes manual review of all cases for classification into isolated or multiple congenital anomalies. METHODS: Multiple anomalies were defined as two or more major congenital anomalies, excluding sequences and syndromes. A computer algorithm for classification of major congenital anomaly cases in the EUROCAT database according to International Classification of Diseases, 10th revision (ICD-10) codes was programmed, further developed, and implemented for one year's data (2004) from 25 registries. Cases classified as potentially having multiple congenital anomalies were then manually reviewed by three geneticists to reach a final agreed classification as "multiple congenital anomaly" cases. RESULTS: A total of 17,733 cases with major congenital anomalies were reported, giving an overall prevalence of major congenital anomalies of 2.17%. The computer algorithm classified 10.5% of all cases as "potentially multiple congenital anomalies". After manual review of these cases, 7% were agreed to have true multiple congenital anomalies. Furthermore, the algorithm classified 15% of all cases as having chromosomal anomalies, 2% as monogenic syndromes, and 76% as isolated congenital anomalies. The proportion of multiple anomalies varies by congenital anomaly subgroup, reaching 35% among cases with bilateral renal agenesis. CONCLUSIONS: The implementation of the EUROCAT computer algorithm is a feasible, efficient, and transparent way to improve the classification of congenital anomalies for surveillance and research.
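As an illustration of the flagging step only: a hedged sketch in which a case is marked as potentially multiple when its ICD-10 codes span two or more organ systems, unless a chromosomal code is present. The code ranges and the organ-system mapping are hypothetical and far coarser than the actual EUROCAT rules:

def organ_system(icd10_code):
    # Hypothetical, highly simplified mapping of ICD-10 prefixes to organ systems.
    systems = {"Q20": "heart", "Q21": "heart", "Q22": "heart",
               "Q60": "renal", "Q61": "renal",
               "Q35": "orofacial", "Q36": "orofacial", "Q37": "orofacial"}
    return systems.get(icd10_code[:3], "other")

def classify(codes):
    # Flag chromosomal cases first, then count distinct organ systems.
    if any("Q90" <= c[:3] <= "Q99" for c in codes):
        return "chromosomal"
    return "potentially multiple" if len({organ_system(c) for c in codes}) >= 2 else "isolated"

print(classify(["Q21.0", "Q60.1"]))   # potentially multiple -> manual review by geneticists
print(classify(["Q21.0", "Q22.1"]))   # isolated (single organ system)
print(classify(["Q91.3", "Q21.0"]))   # chromosomal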
Abstract:
Among the largest resources for biological sequence data are the expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors. Therefore, when analyzing EST sequences computationally, such errors must be taken into account. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting such regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains the performance in detecting coding sequences.
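A toy two-state HMM decoded with the Viterbi algorithm gives the flavour of such coding-region labelling; the model described in the abstract is far richer (explicit start/stop-site states and error states for insertions and deletions), and the probabilities below are purely illustrative:

import math

states = ("N", "C")                                   # noncoding vs coding
trans = {"N": {"N": 0.95, "C": 0.05}, "C": {"N": 0.05, "C": 0.95}}
emit = {"N": {b: 0.25 for b in "ACGT"},               # uniform background
        "C": {"A": 0.10, "C": 0.35, "G": 0.45, "T": 0.10}}   # illustrative GC bias

def viterbi(seq):
    # Standard Viterbi decoding in log space.
    v = [{s: math.log(0.5) + math.log(emit[s][seq[0]]) for s in states}]
    back = []
    for b in seq[1:]:
        back.append({})
        col = {}
        for s in states:
            prev = max(states, key=lambda p: v[-1][p] + math.log(trans[p][s]))
            back[-1][s] = prev
            col[s] = v[-1][prev] + math.log(trans[prev][s]) + math.log(emit[s][b])
        v.append(col)
    path = [max(states, key=lambda s: v[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return "".join(reversed(path))

print(viterbi("ATATATGCGGCGGCCGCGGCATATAT"))   # the GC-rich stretch is labelled "C"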
Abstract:
The aim of this retrospective study was to compare the clinical and radiographic results after TKA (PFC, DePuy) performed either with computer-assisted navigation (CAS, Brainlab, Johnson & Johnson) or by conventional means. Materials and methods: Between May and December 2006 we reviewed 36 conventional TKAs performed between 2002 and 2003 (group A) and 37 navigated TKAs performed between 2005 and 2006 (group B) by the same experienced surgeon. The mean age in group A was 74 years (range 62-90) and 73 years (range 58-85) in group B, with a similar age distribution. The preoperative mechanical axes in group A ranged from -13° varus to +13° valgus (mean absolute deviation 6.83°, SD 3.86), and in group B from -13° to +16° (mean absolute deviation 5.35°, SD 4.29). Patients with a previous tibial osteotomy or revision arthroplasty were excluded from the study. Examination was performed by an experienced orthopedic resident independent of the surgeon. All patients had pre- and postoperative long standing radiographs. The IKSS and the WOMAC were used to determine the clinical outcome. Patients' degree of satisfaction was assessed on a visual analogue scale (VAS). Results: 32 of the 37 navigated TKAs (86.5%) showed a postoperative mechanical axis within 3 degrees of valgus or varus deviation, compared with only 24 (66%) of the 36 standard TKAs. This difference was significant (p = 0.045). The mean absolute deviation from the neutral axis was 3.00° (range -5° to +9°, SD 1.75) in group A compared with 1.54° (range -5° to +4°, SD 1.41) in group B, a highly significant difference (p < 0.001). Furthermore, both groups showed a significant postoperative improvement of their mean IKSS values (group A: 89 preoperatively to 169 postoperatively; group B: 88 to 176) without a significant difference between the two groups. Neither the WOMAC nor the patients' degree of satisfaction, as assessed by VAS, showed significant differences. Operating time was significantly longer in group B (mean 119.9 min) than in group A (mean 99.6 min; p < 0.001). Conclusion: Our study showed a consistent, significant improvement of postoperative frontal alignment in TKA performed with computer-assisted navigation (CAS) compared with standard methods, even in the hands of a surgeon well experienced in standard TKA implantation. However, the follow-up time of this study was not long enough to judge differences in clinical outcome. Thus, the relevance of computer navigation for the clinical outcome and survival of TKA remains to be proved in long-term studies to justify the longer operating time.
Abstract:
BACKGROUND: This study describes the prevalence, associated anomalies, and demographic characteristics of cases of multiple congenital anomalies (MCA) in 19 population-based European registries (EUROCAT) covering 959,446 births in 2004 and 2010. METHODS: EUROCAT implemented a computer algorithm for classification of congenital anomaly cases, followed by manual review of potential MCA cases by geneticists. MCA cases are defined as cases with two or more major anomalies of different organ systems, excluding sequences, chromosomal anomalies, and monogenic syndromes. RESULTS: The combination of an epidemiological and a clinical approach to case classification has improved the quality and accuracy of the MCA data. The total prevalence of MCA cases was 15.8 per 10,000 births. Fetal deaths and terminations of pregnancy were significantly more frequent in MCA cases than in isolated cases (p < 0.001), and MCA cases were more frequently diagnosed prenatally (p < 0.001). Live born infants with MCA were more often born preterm (p < 0.01) and with a birth weight < 2500 grams (p < 0.01). Respiratory and ear, face, and neck anomalies were the most likely to occur with other anomalies (34% and 32%), and congenital heart defects and limb anomalies were the least likely to occur with other anomalies (13%) (p < 0.01). However, due to their high prevalence, congenital heart defects were present in half of all MCA cases. Among males with MCA, the frequency of genital anomalies was significantly greater than among females with MCA (p < 0.001). CONCLUSION: Although rare, MCA cases are an important public health issue because of their severity. The EUROCAT database of MCA cases will allow future investigation of the epidemiology of these conditions and related clinical and diagnostic problems.
Abstract:
The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming to provide the best possible generalization and predictive ability instead of concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide efficient means to model local anomalies that may typically arise in the early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns. This is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs-137 activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
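One plausible way to realise the multi-scale idea, not necessarily the authors' exact formulation, is to combine an RBF kernel with a short length scale and one with a large length scale inside a standard SVR; the data, kernel widths and mixing weight below are illustrative:

import numpy as np
from sklearn.svm import SVR

def rbf(a, b, scale):
    # Gaussian kernel matrix between two sets of 2-D locations.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * scale ** 2))

def two_scale_kernel(a, b, short=0.05, large=0.5, mix=0.5):
    # Mixture of a short-scale and a large-scale kernel (weights illustrative).
    return mix * rbf(a, b, short) + (1 - mix) * rbf(a, b, large)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))                                      # measurement locations
y = np.sin(4 * X[:, 0]) + 0.5 * np.exp(-((X - 0.5) ** 2).sum(1) / 0.005)  # smooth trend + local anomaly
model = SVR(kernel="precomputed", C=10.0, epsilon=0.01).fit(two_scale_kernel(X, X), y)
X_new = rng.uniform(0, 1, size=(5, 2))
print(model.predict(two_scale_kernel(X_new, X)))                          # predictions at new locations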
Abstract:
With the aid of the cobalt labelling technique, frog spinal cord motor neuron dendrites of the subpial dendritic plexus have been identified in serial electron micrographs. Computer reconstructions of various lengths (2.5-9.8 micron) of dendritic segments showed the contours of these dendrites to be highly irregular, and to present many thorn-like projections 0.4-1.8 micron long. Number, size and distribution of synaptic contacts were also determined. Almost half of the synapses occurred at the origins of the thorns and these synapses had the largest contact areas. Only 8 out of 54 synapses analysed were found on thorns and these were the smallest. For the total length of reconstructed dendrites there was, on average, one synapse per 1.2 micron, while 4.4% of the total dendritic surface was covered with synaptic contacts. The functional significance of these distal dendrites and their capacity to influence the soma membrane potential is discussed.
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
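A minimal sketch of the a posteriori (Bayesian MAP) step that such programs automate, assuming a one-compartment model at steady state, a log-normal prior on clearance and a single measured concentration; all numbers are illustrative and not taken from any of the reviewed tools:

import math

dose, tau = 500.0, 12.0              # maintenance dose (mg) and dosing interval (h)
cl_pop, omega = 5.0, 0.3             # population clearance (L/h) and its log-scale SD
sigma = 0.15                         # proportional residual error
c_obs = 11.0                         # measured average steady-state concentration (mg/L)

def neg_log_posterior(cl):
    # Prior (log-normal on clearance) + likelihood of the observed concentration,
    # with Css,avg = dose / (CL * tau) for a one-compartment model at steady state.
    c_pred = dose / (cl * tau)
    prior = (math.log(cl / cl_pop) / omega) ** 2
    lik = ((c_obs - c_pred) / (sigma * c_pred)) ** 2
    return prior + lik

cl_map = min((cl_pop * math.exp(g / 100.0) for g in range(-100, 101)), key=neg_log_posterior)
target = 15.0                        # desired Css,avg (mg/L)
print(f"MAP clearance {cl_map:.2f} L/h -> suggested dose {target * cl_map * tau:.0f} mg every {tau:.0f} h")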
Abstract:
In recent years, multi-atlas fusion methods have gained significant attention in medical image segmentation. In this paper, we propose a general Markov Random Field (MRF) based framework that can perform edge-preserving smoothing of the labels during label fusion itself. More specifically, we formulate the label fusion problem, with MRF-based neighborhood priors, as an energy minimization problem containing a unary data term and a pairwise smoothness term. We show how existing fusion methods such as majority voting, global weighted voting, and local weighted voting can be reframed to profit from the proposed framework, generating segmentations that are both more accurate and more contiguous by removing holes and islands. The proposed framework is evaluated for segmenting lymph nodes in 3D head and neck CT images. A comparison of various fusion algorithms is also presented.
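A toy sketch of the idea: majority voting supplies the unary data term, and a few ICM sweeps add the pairwise smoothness penalty so that holes and islands in the fused labels are removed. This is only an illustration on a 2-D binary toy problem, not the authors' framework or evaluation:

import numpy as np

def fuse_labels(atlas_labels, beta=0.8, sweeps=5):
    # Unary term from the fraction of atlases voting for label 1 (majority voting),
    # pairwise Potts term weighted by beta, optimised with a few ICM sweeps.
    votes = atlas_labels.mean(axis=0)
    unary1, unary0 = -np.log(votes + 1e-6), -np.log(1 - votes + 1e-6)
    labels = (votes > 0.5).astype(int)               # majority-voting initialisation
    rows, cols = labels.shape
    for _ in range(sweeps):
        for i in range(rows):
            for j in range(cols):
                nbrs = [labels[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= a < rows and 0 <= b < cols]
                e1 = unary1[i, j] + beta * sum(n != 1 for n in nbrs)
                e0 = unary0[i, j] + beta * sum(n != 0 for n in nbrs)
                labels[i, j] = int(e1 < e0)
    return labels

rng = np.random.default_rng(1)
truth = np.zeros((20, 20), int)
truth[5:15, 5:15] = 1                                # toy target structure
atlases = np.array([(truth + (rng.random(truth.shape) < 0.2)) % 2 for _ in range(7)])
print(abs(fuse_labels(atlases) - truth).sum(), "pixels differ from the toy ground truth")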