98 results for Statistical analysis methods
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist quantitative analysis and interpretation. We present a method that automatically detects epileptiform events and discriminates them from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, the specificity fell to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
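The sensitivity and specificity figures quoted above can be reproduced from event-level detections with a few lines of code; this is a generic sketch (the function name and labels are illustrative, not from the paper):

```python
def sensitivity_specificity(y_true, y_pred):
    """Compute sensitivity and specificity from binary labels.

    y_true: 1 = epileptiform event, 0 = non-event (e.g. eye blink)
    y_pred: detector output on the same items
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sens = tp / (tp + fn) if tp + fn else 0.0  # true-positive rate
    spec = tn / (tn + fp) if tn + fp else 0.0  # true-negative rate
    return sens, spec
```
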
Abstract:
γ-Hydroxybutyric acid (GHB) is an endogenous short-chain fatty acid popular as a recreational drug due to its sedative and euphoric effects, but it is also often implicated in drug-facilitated sexual assaults owing to its disinhibiting and amnesic properties. Whilst discrimination between endogenous and exogenous GHB, as required in intoxication cases, may be achieved by determining the carbon isotope content, such information has not yet been exploited to answer the source-inference questions of interest to forensic investigation and intelligence. However, potential isotopic fractionation effects occurring throughout the metabolism of GHB may be a major concern in this regard. Thus, urine specimens from six healthy male volunteers who ingested prescription GHB sodium salt, marketed as Xyrem®, were analysed by gas chromatography/combustion/isotope ratio mass spectrometry to assess this question. A very narrow range of δ13C values, from −24.81‰ to −25.06‰, was observed, whilst the mean δ13C value of Xyrem® was −24.99‰. Since the urine samples and the prescription drug could not be distinguished by statistical analysis, carbon isotopic effects, and any subsequent influence on δ13C values through GHB metabolism as a whole, could be ruled out. Thus, a link may be established between GHB as a raw material and GHB found in a biological fluid, bringing relevant information to source-inference evaluation. This study therefore supports a diversified scope of exploitation for stable isotopes characterized in biological matrices, from investigations of intoxication cases to drug intelligence programmes.
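For reference, δ13C values like those above are defined relative to the isotope ratio of a standard; a minimal sketch, assuming the commonly quoted VPDB 13C/12C ratio (the exact constant is an assumption here and should be checked against authoritative documentation):

```python
R_VPDB = 0.011180  # assumed 13C/12C ratio of the VPDB standard (illustrative)

def delta13C(r_sample, r_standard=R_VPDB):
    """Carbon isotope composition in per mil (‰) relative to the standard:
    δ13C = (R_sample / R_standard − 1) × 1000.
    """
    return (r_sample / r_standard - 1.0) * 1000.0
```

A sample depleted by 2.5% relative to the standard would thus report δ13C = −25‰, in the range observed in the study.
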
Abstract:
The question of where retroviral DNA becomes integrated in chromosomes is important for (i) understanding the mechanisms of viral growth, (ii) devising new anti-retroviral therapies, (iii) understanding how genomes evolve, and (iv) developing safer methods for gene therapy. With the completion of genome sequences for many organisms, it has become possible to study integration targeting by cloning and sequencing large numbers of host-virus DNA junctions, then mapping the host DNA segments back onto the genomic sequence. This allows statistical analysis of the distribution of integration sites relative to the myriad types of genomic features that are also being mapped onto the sequence scaffold. Here we present methods for recovering and analyzing integration site sequences.
Abstract:
This paper describes methods to analyze the brain's electric fields recorded with multichannel electroencephalography (EEG) and demonstrates their implementation in the software CARTOOL. It focuses on the analysis of the spatial properties of these fields and on quantitative assessment of changes of field topographies across time, experimental conditions, or populations. Topographic analyses are advantageous because they are reference independent and thus yield statistically unambiguous results. Neurophysiologically, differences in topography directly indicate changes in the configuration of the active neuronal sources in the brain. We describe global measures of field strength and field similarity, temporal segmentation based on topographic variations, topographic analysis in the frequency domain, topographic statistical analysis, and source imaging based on distributed inverse solutions. All analysis methods are implemented in a freely available academic software package called CARTOOL. Besides providing these analysis tools, CARTOOL is particularly designed to visualize the data and the analysis results using 3-dimensional display routines that allow rapid manipulation and animation of 3D images. CARTOOL is therefore a helpful tool for researchers as well as clinicians to interpret multichannel EEG and evoked potentials in a global, comprehensive, and unambiguous way.
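One standard global, reference-independent measure of field strength of the kind mentioned above is Global Field Power: the standard deviation of the voltages across all electrodes at a given time point. A minimal sketch (not CARTOOL's implementation):

```python
import math

def global_field_power(voltages):
    """GFP: standard deviation across electrodes at one time point.

    Reference-independent: adding a constant to every channel
    (i.e. changing the reference) leaves the result unchanged.
    """
    n = len(voltages)
    mean = sum(voltages) / n
    return math.sqrt(sum((v - mean) ** 2 for v in voltages) / n)
```
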
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but geology did not prove to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering, and moving-window methods. The use of the Quantité Morisita Index (QMI) was proposed as a procedure to evaluate data clustering as a function of the radon level. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography, and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.
A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted to the modeling of high-threshold categorization and to automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions. In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while the other methods can provide complementary information to support efficient decision making on indoor radon.
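As an illustration of the simplest of the exploratory interpolators mentioned above, a K-nearest-neighbour estimate with inverse-distance weighting might be sketched as follows (a generic sketch, not the thesis code; the function name and weighting scheme are illustrative):

```python
import math

def knn_interpolate(x, y, samples, k=3):
    """Inverse-distance-weighted K-nearest-neighbour estimate.

    samples: list of (x, y, value) measurement points.
    Returns the weighted average of the k nearest sample values.
    """
    dists = sorted(
        (math.hypot(x - sx, y - sy), v) for sx, sy, v in samples
    )[:k]
    if dists and dists[0][0] == 0.0:
        return dists[0][1]  # exact hit on a sample point
    weights = [(1.0 / d, v) for d, v in dists]
    wsum = sum(w for w, _ in weights)
    return sum(w * v for w, v in weights) / wsum
```
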
Abstract:
Purpose: More than five hundred million direct dental restorations are placed each year worldwide. In about 55% of the cases, resin composites or compomers are used, and in 45% amalgam. The longevity of posterior resin restorations is well documented. However, data on resin composites that are placed without enamel/dentin conditioning and on resin composites placed with self-etching adhesive systems are missing. Material and Methods: The database SCOPUS was searched for clinical trials on posterior resin composites without restricting the search to the year of publication. The inclusion criteria were: (1) prospective clinical trial with at least 2 years of observation; (2) minimum number of restorations at last recall = 20; (3) report on dropout rate; (4) report of operative technique and materials used; (5) utilization of Ryge or modified Ryge evaluation criteria. For amalgam, only those studies were included that directly compared composite resin restorations with amalgam. For the statistical analysis, a linear mixed model was used with random effects to account for the heterogeneity between the studies. P-values under 0.05 were considered significant. Results: Of the 373 clinical trials, 59 studies met the inclusion criteria. In 70% of the studies, Class II and Class I restorations had been placed. The overall success rate of composite resin restorations was about 90% after 10 years, which was not different from that of amalgam. Restorations with compomers had a significantly lower longevity. The main reasons for replacement were bulk fractures and caries adjacent to restorations. Both of these incidents were infrequent in most studies and accounted for only about 6% of all replaced restorations after 10 years. Restorations with macrofilled composites and compomer suffered significantly more loss of anatomical form than restorations with other types of material.
Restorations that were placed without enamel acid etching and a dentin bonding agent showed significantly more marginal staining and detectable margins than restorations placed using the enamel-etch or etch-and-rinse technique; restorations with self-etching systems fell between the other groups. Restorations with compomer suffered significantly more chippings (repairable fractures) than restorations with other materials, which did not differ statistically among each other. Restorations that were placed with a rubber-dam showed significantly fewer material fractures that needed replacement, and this also had a significant effect on the overall longevity. Conclusion: Restorations with hybrid and microfilled composites that were placed with the enamel-etching technique and rubber-dam showed the best overall performance; the longevity of these restorations was similar to that of amalgam restorations. Compomer restorations, restorations placed with macrofilled composites, and resin restorations with no-etching or self-etching adhesives demonstrated significant shortcomings and shorter longevity.
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. RESULTS: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods, based on sigmoidal or exponential curve fitting, were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and to a lesser extent on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimation of relative expression ratios of two-fold or higher, and our analysis provides an estimate of the number of biological samples that have to be analyzed to achieve a given precision.
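Efficiency-aware relative quantification of the kind discussed here is often expressed as a ratio of amplicon-specific efficiencies raised to their Ct differences (the Pfaffl approach); a minimal sketch, not the authors' specific strategies:

```python
def expression_ratio(e_target, dct_target, e_ref, dct_ref):
    """Pfaffl-style relative expression ratio with amplicon-specific
    efficiencies.

    e_*   : amplification efficiency per cycle (2.0 = perfect doubling)
    dct_* : Ct(control) - Ct(treated) for the target and reference amplicons
    """
    return (e_target ** dct_target) / (e_ref ** dct_ref)
```

With perfect efficiency and a reference gene whose Ct does not change, a one-cycle shift in the target corresponds to a two-fold expression ratio, the threshold the abstract cites as reliably detectable.
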
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of actions, of the consequences of inappropriate decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk. Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems. Population growth increases the exposure, while changes in the level of development affect the vulnerability. Given that each of its components may change, the risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary. The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness amongst policymakers and to prioritize disaster risk reduction projects. The method is based on geographic information systems, remote sensing, databases and statistical analysis.
It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters), and several thousand hours of processing were necessary. A comprehensive risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors. This was performed for several hazards (e.g. floods, tropical cyclones, earthquakes and landslides). Two different multiple-risk indexes were generated to compare countries. The results include an evaluation of the roles of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that the vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as the intensity increases. Locally, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation exacerbates landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution. These results were presented to different audiences, including in front of 160 governments. The results and data generated are made available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas. Risk characterization at the global level and the identification of the role of ecosystems in disaster risk are developing rapidly. This research revealed many challenges; some were resolved, while others remain as limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
Abstract:
OBJECTIVES: This is the first meta-analysis on the efficacy of composite resin restorations in anterior teeth. The objective of the present meta-analysis was to verify whether specific material classes, tooth conditioning methods and operational procedures influence the result for Class III and Class IV restorations. MATERIAL AND METHODS: The databases SCOPUS and PubMed were searched for clinical trials on anterior resin composites without restricting the search to the year of publication. The inclusion criteria were: (1) prospective clinical trial with at least 2 years of observation; (2) minimal number of restorations at last recall = 20; (3) report on drop-out rate; (4) report of operative technique and materials used in the trial; and (5) utilization of Ryge or modified Ryge evaluation criteria. For the statistical analysis, a linear mixed model was used with random effects to account for the heterogeneity between the studies. p-Values smaller than 0.05 were considered to be significant. RESULTS: Of the 84 clinical trials, 21 studies met the inclusion criteria: 14 of them for Class III restorations, 6 for Class IV restorations and 1 for closure of diastemata; the latter was included in the Class IV group. Twelve of the 21 studies started before 1991 and 18 before 2001. The estimated median overall success rate (without replacement) after 10 years was 95% for Class III composite resin restorations and 90% for Class IV restorations. The main reason for the replacement of Class IV restorations was bulk fracture, which occurred significantly more frequently with microfilled composites than with hybrid and macrofilled composites. Caries adjacent to restorations was infrequent in most studies and accounted for only about 2.5% of all replaced restorations after 10 years, irrespective of the cavity class. Class III restorations with glass ionomer derivatives suffered significantly more loss of anatomical form than did fillings with other types of material.
When the enamel was acid-etched and no bonding agent was applied, significantly more restorations showed marginal staining and detectable margins compared to enamel etching with enamel bonding or the total-etch technique; fillings with self-etching systems fell between these two groups for both outcome variables. Bevelling of the enamel was associated with significantly reduced deterioration of the anatomical form compared to no bevelling, but not with less marginal staining or fewer detectable margins. The type of isolation (absolute/relative) had a statistically significant influence on marginal caries, which, however, might be a random finding.
Abstract:
BACKGROUND: A previous individual patient data meta-analysis by the Meta-Analysis of Chemotherapy in Nasopharynx Carcinoma (MAC-NPC) collaborative group to assess the addition of chemotherapy to radiotherapy showed that it improves overall survival in nasopharyngeal carcinoma. This benefit was restricted to patients receiving concomitant chemotherapy and radiotherapy. The aim of this study was to update the meta-analysis, include recent trials, and to analyse separately the benefit of concomitant plus adjuvant chemotherapy. METHODS: We searched PubMed, Web of Science, Cochrane Controlled Trials meta-register, ClinicalTrials.gov, and meeting proceedings to identify published or unpublished randomised trials assessing radiotherapy with or without chemotherapy in patients with non-metastatic nasopharyngeal carcinoma and obtained updated data for previously analysed studies. The primary endpoint of interest was overall survival. All trial results were combined and analysed using a fixed-effects model. The statistical analysis plan was pre-specified in a protocol. All data were analysed on an intention-to-treat basis. FINDINGS: We analysed data from 19 trials and 4806 patients. Median follow-up was 7·7 years (IQR 6·2-11·9). We found that the addition of chemotherapy to radiotherapy significantly improved overall survival (hazard ratio [HR] 0·79, 95% CI 0·73-0·86, p<0·0001; absolute benefit at 5 years 6·3%, 95% CI 3·5-9·1). The interaction between treatment effect (benefit of chemotherapy) on overall survival and the timing of chemotherapy was significant (p=0·01) in favour of concomitant plus adjuvant chemotherapy (HR 0·65, 0·56-0·76) and concomitant without adjuvant chemotherapy (0·80, 0·70-0·93) but not adjuvant chemotherapy alone (0·87, 0·68-1·12) or induction chemotherapy alone (0·96, 0·80-1·16). 
The benefit of the addition of chemotherapy was consistent for all endpoints analysed (all p<0·0001): progression-free survival (HR 0·75, 95% CI 0·69-0·81), locoregional control (0·73, 0·64-0·83), distant control (0·67, 0·59-0·75), and cancer mortality (0·76, 0·69-0·84). INTERPRETATION: Our results confirm that the addition of concomitant chemotherapy to radiotherapy significantly improves survival in patients with locoregionally advanced nasopharyngeal carcinoma. To our knowledge, this is the first analysis that examines the effect of concomitant chemotherapy with and without adjuvant chemotherapy as distinct groups. Further studies on the specific benefits of adjuvant chemotherapy after concomitant chemoradiotherapy are needed. FUNDING: French Ministry of Health (Programme d'actions intégrées de recherche VADS), Ligue Nationale Contre le Cancer, and Sanofi-Aventis.
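A fixed-effects model of the kind described combines per-trial hazard ratios by inverse-variance weighting on the log scale; a generic sketch (not the MAC-NPC analysis code; the input format is illustrative):

```python
import math

def pooled_hazard_ratio(hrs_and_ses):
    """Fixed-effects (inverse-variance) pooling of hazard ratios.

    hrs_and_ses: list of (HR, standard error of log HR) per trial.
    Each trial is weighted by 1/se^2 on the log-HR scale.
    """
    num = den = 0.0
    for hr, se in hrs_and_ses:
        w = 1.0 / se ** 2
        num += w * math.log(hr)
        den += w
    return math.exp(num / den)
```

Pooling identical trials returns the common HR, and trials with smaller standard errors dominate the estimate, which is the defining property of the fixed-effects approach.
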
Abstract:
PURPOSE: To meta-analyze the literature on the clinical performance of Class V restorations to assess the factors that influence retention, marginal integrity, and marginal discoloration of cervical lesions restored with composite resins, glass-ionomer-cement-based materials [glass-ionomer cement (GIC) and resin-modified glass ionomers (RMGICs)], and polyacid-modified resin composites (PMRC). MATERIALS AND METHODS: The English literature was searched (MEDLINE and SCOPUS) for prospective clinical trials on cervical restorations with an observation period of at least 18 months. The studies had to report on retention, marginal discoloration, marginal integrity, and marginal caries and include a description of the operative technique (beveling of enamel, roughening of dentin, type of isolation). Eighty-one studies involving 185 experiments with 47 adhesives matched the inclusion criteria. The statistical analysis was carried out using the following linear mixed model: log(−log(Y/100)) = β + α·log(T) + error, with β = log(λ), where β is a summary measure of the non-linear deterioration occurring in each experiment, including a random study effect. RESULTS: On average, 12.3% of the cervical restorations were lost, 27.9% exhibited marginal discoloration, and 34.6% exhibited deterioration of marginal integrity after 5 years. The clinical index calculation gave 17.4% failures after 5 years and 32.3% after 8 years. A higher variability was found for retention loss and marginal discoloration. Hardly any secondary caries lesions were detected, even in the experiments with a follow-up time longer than 8 years. Restorations placed using rubber-dam in teeth whose dentin was roughened showed a statistically significantly higher retention rate than those placed in teeth with unprepared dentin or without rubber-dam (p < 0.05). However, enamel beveling had no influence on any of the examined variables.
Significant differences were found between pairs of adhesive systems and also between pairs of classes of adhesive systems. One-step self-etching had a significantly worse clinical index than two-step self-etching and three-step etch-and-rinse (p = 0.026 and p = 0.002, respectively). CONCLUSION: The clinical performance is significantly influenced by the type of adhesive system and/or the adhesive class to which the system belongs. Whether the dentin/enamel is roughened or not and whether rubber-dam isolation is used or not also significantly influenced the clinical performance. Composite resin restorations placed with two-step self-etching and three-step etch-and-rinse adhesive systems should be preferred over one-step self-etching adhesive systems, GIC-based materials, and PMRCs.
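The mixed model quoted in the abstract, log(−log(Y/100)) = β + α·log(T), implies a Weibull-like deterioration curve Y(T) = 100·exp(−λ·T^α) with λ = exp(β). A small sketch for evaluating such a curve (the parameter values used in testing are hypothetical, not fitted values from the review):

```python
import math

def surviving_fraction(t_years, beta, alpha):
    """Percentage of restorations surviving at time t under the model
    log(-log(Y/100)) = beta + alpha*log(T), i.e.
    Y(T) = 100 * exp(-lambda * T**alpha) with lambda = exp(beta).
    """
    lam = math.exp(beta)
    return 100.0 * math.exp(-lam * t_years ** alpha)
```
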
Abstract:
Objectives: We present the retrospective analysis of a single-institution experience with radiosurgery (RS) for brain metastases (BM) with Gamma Knife (GK) and Linac. Methods: From July 2010 to July 2012, 28 patients (with 83 lesions) had RS with GK and 35 patients (with 47 lesions) with Linac. The primary outcome was local progression-free survival (LPFS). The secondary outcome was overall survival (OS). Apart from a standard statistical analysis, we included a Cox regression model with shared frailty to model the within-patient correlation (a preliminary evaluation showed a significant frailty effect, meaning that the within-patient correlation could not be ignored). Results: The mean follow-up period was 11.7 months (median 7.9, range 1.7-22.7) for GK and 18.1 months (median 17, range 7.5-28.7) for Linac. The median number of lesions per patient was 2.5 (1-9) for GK compared with 1 (1-3) for Linac. There were more radioresistant lesions (melanoma) and more lesions located in functional areas in the GK group. The median dose was 24 Gy (GK) compared with 20 Gy (Linac). The actuarial LPFS rate for GK at 3, 6, 9, 12, and 17 months was 96.96, 96.96, 96.96, 88.1, and 81.5%, and remained stable till 32 months; for Linac at 3, 6, 12, 17, 24, and 33 months, it was 91.5, 91.5, 91.5, 79.9, 55.5, and 17.1%, respectively (p = 0.03, chi-square test). After the Cox regression analysis with shared frailty, the difference between groups was not statistically significant. The median overall survival was 9.7 months for the GK group and 23.6 months for the Linac group. Uni- and multivariate analyses showed that a lower GPA score and a non-controlled systemic status were associated with lower OS. A Cox regression analysis adjusting for these two parameters showed comparable OS rates. Conclusions: In this comparative report between GK and Linac, the preliminary analysis showed that more difficult cases are treated with GK: patients harboring more lesions, radioresistant tumors, and lesions in highly functional locations.
The groups therefore look very heterogeneous at baseline. After applying a Cox frailty model, the LPFS rates appeared very similar (p > 0.05). The OS was also similar after adjusting for systemic status and GPA score (p > 0.05). The technical reasons for choosing GK instead of Linac were anatomical locations in or near highly functional areas, histology, technical limitations of Linac movements (especially for lower posterior fossa locations), or the closeness of multiple lesions to highly functional areas, which precluded optimal dosimetry with Linac.
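Actuarial rates such as the LPFS figures above are typically obtained with a Kaplan-Meier product-limit estimator; a minimal sketch (not the study's analysis code, which additionally used a shared-frailty Cox model):

```python
def kaplan_meier(events):
    """Product-limit (Kaplan-Meier) survival estimate.

    events: list of (time, observed) where observed is True for an
    observed event (e.g. local progression) and False for censoring.
    Returns [(time, survival probability)] at each observed-event time.
    """
    at_risk = len(events)
    surv = 1.0
    curve = []
    for t, observed in sorted(events):
        if observed:
            surv *= (at_risk - 1) / at_risk  # multiply by (n_i - d_i)/n_i
            curve.append((t, surv))
        at_risk -= 1  # both events and censorings leave the risk set
    return curve
```

Censored patients reduce the risk set without stepping the curve down, which is why the estimator differs from a naive fraction of failures.
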
Resumo:
BACKGROUND: Cone-beam computed tomography (CBCT) image-guided radiotherapy (IGRT) systems are widely used tools to verify and correct the target position before each fraction, making it possible to maximize treatment accuracy and precision. In this study, we evaluate automatic three-dimensional intensity-based rigid registration (RR) methods for prostate setup correction using CBCT scans and study the impact of rectal distension on registration quality. METHODS: We retrospectively analyzed 115 CBCT scans of 10 prostate patients. CT-to-CBCT registration was performed using (a) global RR, (b) bony RR, or (c) bony RR refined by a local prostate RR using the CT clinical target volume (CTV) expanded with margins varying from 1 to 20 mm. After propagation of the manual CT contours, automatic CBCT contours were generated. For evaluation, a radiation oncologist manually delineated the CTV on the CBCT scans. The propagated and manual CBCT contours were compared using the Dice similarity coefficient and a measure based on the bidirectional local distance (BLD). We also conducted a blind visual assessment of the quality of the propagated segmentations. Moreover, we automatically quantified rectal distension between the CT and CBCT scans without using the manual CBCT contours, and we investigated its correlation with the registration failures. To improve the registration quality, the air in the rectum was replaced with soft tissue using a filter. The results with and without filtering were compared. RESULTS: The statistical analysis of the Dice coefficients and the BLD values showed highly significant differences (p < 10^-6) for the 5-mm and 8-mm local RRs vs the global, bony, and 1-mm local RRs. The 8-mm local RR provided the best compromise between accuracy and robustness (median Dice of 0.814 and 97% success rate with filtering of the air in the rectum). We observed that all failures were due to high rectal distension.
Moreover, the visual assessment confirmed the superiority of the 8-mm local RR over the bony RR. CONCLUSION: The most successful CT-to-CBCT RR method proved to be the 8-mm local RR. We have shown the correlation between its registration failures and rectal distension. Furthermore, we have provided a simple (easily applicable in routine) and automatic method to quantify rectal distension and to predict registration failure using only the manual CT contours.
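The Dice similarity used above to score contour overlap is straightforward to compute once the segmentations are represented as voxel sets. A minimal sketch with toy cuboid "contours" (not patient data):

```python
# Dice similarity coefficient between two segmentations represented as
# sets of voxel coordinates. The two toy blocks below stand in for a
# propagated and a manually delineated CTV; they are not patient data.

def dice(a, b):
    """2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap, 0.0 none."""
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

auto = {(x, y, z) for x in range(10) for y in range(10) for z in range(5)}
manual = {(x, y, z) for x in range(2, 12) for y in range(10) for z in range(5)}
print(f"Dice = {dice(auto, manual):.3f}")  # 400 shared voxels of 500 each
```

Here the two 500-voxel blocks share 400 voxels, giving a Dice of 0.800, close to the 0.814 median reported for the 8-mm local RR.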
Resumo:
This work focuses on the development of a methodology for using the chemical characteristics of tire traces to help answer the following question: "Is the offending tire at the origin of the trace found at the crime scene?". The methodology extends from sampling the trace on the road to the statistical analysis of its chemical characteristics. Knowledge about the composition and manufacture of tire treads, together with a review of the instrumental techniques used for the analysis of polymeric materials, led to the selection of pyrolysis coupled to gas chromatography with mass spectrometric detection (Py-GC/MS) as the analytical technique for this research. An analytical method was developed and optimized to obtain the lowest variability between replicates of the same sample. The within-tread variability was evaluated across the width and circumference of the tread using several samples taken from twelve tires of different brands and/or models. The variability within each tread (within-variability) and between the treads (between-variability) could thus be quantified. Different statistical methods showed that the within-variability is lower than the between-variability, which made it possible to differentiate these tires. Ten tire traces were produced in braking tests with tires of different brands and/or models. These traces were adequately sampled using sheets of gelatine. Particles of each trace were analysed using the same methodology as for the tires at their origin. The general chemical profile of a trace or of a tire was characterized by eighty-six compounds. Based on a statistical comparison of the chemical profiles obtained, it was shown that a tire trace is not differentiable from the tire at its origin but is generally differentiable from tires that are not at its origin. Thereafter, a sample of sixty tires was analysed to assess the discrimination potential of the developed methodology.
The statistical results showed that most tires of different brands and models are differentiable; the developed methodology therefore shows good discrimination potential. However, tires of the same brand and model with identical characteristics, such as country of manufacture, size, and DOT number, are not differentiable. A model based on a likelihood ratio approach was chosen to evaluate the results of the comparisons between the chemical profiles of the traces and tires. The methodology was finally blind-tested using three simulated scenarios. Each scenario involved a trace from an unknown tire as well as two tires possibly at its origin. The correct results obtained for the three scenarios validated the developed methodology. The different steps of this work provided the information required to test and validate the underlying assumption that it is possible to help determine whether an offending tire is or is not at the origin of a trace by means of a statistical comparison of their chemical profiles. This aid was formalized as a measure of the probative value of the evidence, the evidence being the chemical profile of the tire trace.
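The likelihood-ratio approach adopted above can be sketched for a single chemical feature under a simple univariate normal model. All means and standard deviations below are invented for illustration: the numerator assumes the suspect tire is the source (within-tire variability), the denominator that some other tire is (between-tire variability):

```python
# Hedged sketch of a univariate likelihood-ratio evaluation for source
# inference: one illustrative chemical feature, normal models for both
# hypotheses. All numbers are invented, not measured tire data.

import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(trace, tire_mean, within_sd, pop_mean, between_sd):
    """LR > 1 supports same-source, LR < 1 supports different-source."""
    return normal_pdf(trace, tire_mean, within_sd) / normal_pdf(trace, pop_mean, between_sd)

lr = likelihood_ratio(trace=10.2, tire_mean=10.0, within_sd=0.3,
                      pop_mean=12.0, between_sd=2.0)
print(f"LR = {lr:.1f}")
```

Because the within-variability is smaller than the between-variability, a trace value close to the suspect tire's mean yields an LR well above 1; the full methodology does this jointly over the eighty-six compounds rather than one feature.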
Resumo:
OBJECTIVE: To study emotional behaviors in an acute stroke population. BACKGROUND: Alterations in emotional behavior after stroke have been recently recognized, but little attention has been paid to these changes in the very acute phase of stroke. METHODS: Adult patients presenting with acute stroke were prospectively recruited and studied. We validated the Emotional Behavior Index (EBI), a 38-item scale designed to evaluate behavioral aspects of sadness, aggressiveness, disinhibition, adaptation, passivity, indifference, and denial. Clinical, historical, and imaging (computed tomography/magnetic resonance imaging) data were obtained on each subject through our Stroke Registry. Statistical analysis was performed with both univariate and multivariate tests. RESULTS: Of the 254 patients, 40% showed sadness, 49% passivity, 17% aggressiveness, 53% indifference, 76% disinhibition, 18% lack of adaptation, and 44% denial reactions. Several significant correlations were identified. Sadness was correlated with a personal history of alcohol abuse (P < 0.037), female gender (P < 0.028), and hemorrhagic nature of the stroke (P < 0.063). Aggressiveness was correlated with a personal history of depression (P < 0.046) and hemorrhage (P < 0.06). Denial was correlated with male gender (P < 0.035) and hemorrhagic lesions (P < 0.05). Emotional behavior did not correlate with either neurologic impairment or lesion localization, but there was an association between hemorrhage and aggressive behavior (P < 0.001), lack of adaptation (P < 0.015), indifference (P < 0.018), and denial (P < 0.045). CONCLUSIONS: Systematic observations of acute emotional behaviors after stroke suggest that emotional alterations are independent of mood and physical status and should be considered a separate consequence of stroke.
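Univariate associations of the kind reported above (e.g. hemorrhage vs aggressive behavior) are typically tested with a Pearson chi-square on a 2x2 contingency table. A minimal sketch with invented counts; for df = 1 the p-value can be obtained from the complementary error function without external libraries:

```python
# Pearson chi-square test of association on a 2x2 table, as used in
# univariate analyses like those above. The counts are invented, not
# the study's data. For df = 1, P(chi2 > x) = erfc(sqrt(x / 2)).

import math

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic and p-value (df = 1, no continuity correction)
    for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(chi2 / 2))  # survival function of chi2 with 1 df
    return chi2, p

# Rows: hemorrhagic / ischemic; columns: behavior present / absent (invented)
chi2, p = chi_square_2x2(20, 10, 15, 45)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```

The identity used for the p-value holds because a chi-square variable with one degree of freedom is the square of a standard normal, so its tail probability is that of |Z| exceeding sqrt(x).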