160 results for Classification Protocols
Abstract:
Sleep-wake disturbances are frequently observed in stroke patients and are associated with poorer functional outcome. To date, the effects of sleep on stroke evolution remain unknown. The purpose of the present study was to evaluate the effects of three sleep deprivation (SD) protocols on brain damage after focal cerebral ischemia in a rat model. Permanent occlusion of distal branches of the middle cerebral artery was induced in adult rats. The animals were then subjected to 6 h SD, 12 h SD, or sleep disturbances (SDis), in which 3 x 12 h of sleep deprivation were performed by gentle handling. Infarct size and brain swelling were assessed by Cresyl violet staining, and the number of damaged cells was measured by terminal deoxynucleotidyl transferase-mediated dUTP nick end labeling (TUNEL) staining. Behavioral tests, namely the tape removal and cylinder tests, were performed to assess sensorimotor function. In the 6 h SD protocol, no significant difference (P > 0.05) was found in infarct size (42.5 ± 30.4 mm³ in sleep-deprived animals vs. 44.5 ± 20.5 mm³ in controls, mean ± s.d.), brain swelling (10.2 ± 3.8% in sleep-deprived animals vs. 11.3 ± 2.0% in controls), or the number of TUNEL-positive cells (21.7 ± 2.0/mm² in sleep-deprived animals vs. 23.0 ± 1.1/mm² in controls). In contrast, 12 h of sleep deprivation increased infarct size by 40% (82.8 ± 10.9 mm³ in the SD group vs. 59.2 ± 13.9 mm³ in the control group, P = 0.008) and the number of TUNEL-positive cells by 137% (46.8 ± 15/mm² in the SD group vs. 19.7 ± 7.7/mm² in the control group, P = 0.003); there was no significant difference (P > 0.05) in brain swelling (12.9 ± 6.3% in sleep-deprived animals vs. 11.6 ± 6.0% in controls). The SDis protocol also increased infarct size, by 76% (3 x 12 h SD 58.8 ± 20.4 mm³ vs. no SD 33.8 ± 6.3 mm³, P = 0.017), and the number of TUNEL-positive cells, by 219% (32.9 ± 13.2/mm² vs. 10.3 ± 2.5/mm², P = 0.008); brain swelling did not differ between the two groups (24.5 ± 8.4% in the SD group vs. 16.7 ± 8.9% in the control group, P > 0.05). Neither behavioral test yielded conclusive results. In summary, we demonstrate that sleep deprivation aggravates brain damage in a rat model of stroke. Further experiments are needed to unveil the mechanisms underlying these effects.
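The percentage increases quoted above follow directly from the reported group means; the short Python check below recomputes them from those means (small deviations from the published figures are expected, since those were presumably derived from unrounded data).

```python
# Quick check of the reported percentage increases, computed from the
# group means quoted in the abstract (small deviations are expected
# because the published percentages were derived from unrounded data).
groups = {
    "12h SD infarct size (mm^3)": (82.8, 59.2),   # reported +40 %
    "12h SD TUNEL+ cells (/mm^2)": (46.8, 19.7),  # reported +137 %
    "SDis infarct size (mm^3)": (58.8, 33.8),     # reported +76 %
    "SDis TUNEL+ cells (/mm^2)": (32.9, 10.3),    # reported +219 %
}

for label, (sd_mean, control_mean) in groups.items():
    increase = 100.0 * (sd_mean - control_mean) / control_mean
    print(f"{label}: +{increase:.0f} % vs. control")
```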
Abstract:
Résumé (translated from the French): Classical cryptography is based on mathematical concepts whose security depends on the computational difficulty of inverting functions. This type of encryption is at the mercy of the growing computing power of computers and of the discovery of algorithms that can compute the inverses of certain mathematical functions in a "reasonable" time. Using a scheme whose security is scientifically proven is therefore indispensable, especially for critical exchanges (banking systems, governments, etc.). Quantum cryptography answers this need: its security is based on the laws of quantum physics, which guarantee unconditionally secure operation. However, applying and integrating quantum cryptography remains a concern for the developers of such solutions. This thesis justifies the need for quantum cryptography and shows that the cost incurred by deploying it is warranted. It proposes a simple and practicable mechanism for integrating quantum cryptography into widely used communication protocols such as PPP, IPSec, and 802.11i. Application scenarios illustrate the feasibility of these solutions. A methodology for evaluating quantum cryptography solutions according to the Common Criteria is also proposed in this document.
Abstract: Classical cryptography is based on mathematical functions. The robustness of a cryptosystem essentially depends on the difficulty of computing the inverse of its one-way function. There is no mathematical proof establishing that it is impossible to find the inverse of a given one-way function. It is therefore mandatory to use a cryptosystem whose security is scientifically proven (especially for banking, governments, etc.). The security of quantum cryptography, on the other hand, can be formally demonstrated: it is based on the laws of physics, which assure unconditional security. How can quantum cryptography be used and integrated into existing solutions? This thesis proposes a method to integrate quantum cryptography into existing communication protocols such as PPP, IPSec, and 802.11i. It sketches out possible scenarios in order to prove their feasibility and to estimate their cost. Directives and checkpoints are given to help certify quantum cryptography solutions according to the Common Criteria.
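As a purely illustrative aside (this is not the integration mechanism described in the thesis), the sketch below shows one common way a quantum-distributed secret can be plugged into an existing stack such as IPSec: the QKD output takes the place of the Diffie-Hellman shared secret from which session keys are derived. Only the Python standard library is used; the HKDF helper, the info label, and the key split are assumptions made for the example.

```python
# Illustrative sketch (not from the thesis): use the QKD output as the
# shared secret from which session keys are derived, in place of (or
# mixed with) a Diffie-Hellman result. Names and labels are hypothetical.
import hashlib
import hmac
import os


def hkdf(shared_secret: bytes, salt: bytes, info: bytes, length: int) -> bytes:
    """Minimal HKDF (RFC 5869) with HMAC-SHA256: extract, then expand."""
    prk = hmac.new(salt, shared_secret, hashlib.sha256).digest()
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]


# Pretend this came from a QKD link after sifting and privacy amplification.
qkd_shared_secret = os.urandom(32)

# Derive the keying material an IPSec security association would need
# (encryption key + integrity key), exactly as one would from a DH secret.
salt = os.urandom(16)
keying_material = hkdf(qkd_shared_secret, salt, b"ipsec-sa-keys", 64)
encryption_key, integrity_key = keying_material[:32], keying_material[32:]
print(encryption_key.hex(), integrity_key.hex(), sep="\n")
```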
Dissemination of the Swiss Model for Outcome Classification in Health Promotion and Prevention (SMOC).
Abstract:
Résumé (translated from the French): Following recent technological advances, digital image archives have grown qualitatively and quantitatively to an unprecedented degree. Despite the enormous possibilities they offer, these advances raise new questions about how to process the masses of acquired data. This question is at the heart of this thesis: problems of processing digital information at very high spatial and/or spectral resolution are addressed with statistical learning approaches, namely kernel methods. The thesis studies image classification problems, i.e., the categorization of pixels into a reduced number of classes reflecting the spectral and contextual properties of the objects they represent. The emphasis is on the efficiency of the algorithms as well as on their simplicity, so as to increase their potential for adoption by users. A further challenge of the thesis is to stay close to the concrete problems of satellite image users without losing sight of the interest of the proposed methods for the machine learning community from which they originate; in this sense, the work is deliberately transdisciplinary and maintains a strong link between the two fields in all of the proposed developments. Four models are proposed. The first addresses the problem of high dimensionality and data redundancy with a model that optimizes classification performance while adapting to the particularities of the image; this is made possible by a ranking of the variables (the spectral bands) that is optimized jointly with the base model, so that only the variables relevant for the problem are used by the classifier. The scarcity of labeled information, and the uncertainty about its relevance to the problem, motivate the next two models, based respectively on active learning and on semi-supervised methods: the former improves the quality of a training set through direct interaction between the user and the machine, while the latter uses unlabeled pixels to improve the description of the available data and the robustness of the model. Finally, the last model considers the more theoretical question of structure among the outputs: the integration of this source of information, never before considered in remote sensing, opens new research challenges.
Advanced kernel methods for remote sensing image classification. Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009.
Abstract: The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, as enormous archives of satellite images are available to users. However, even if these advances open more and more possibilities in the use of digital imagery, they also raise several problems of storage and processing. The latter is considered in this thesis: the processing of very high spatial and spectral resolution images is treated with approaches based on data-driven algorithms relying on kernel methods. In particular, the problem of image classification, i.e., the categorization of the image's pixels into a reduced number of classes reflecting spectral and contextual properties, is studied through the different models presented. The emphasis is placed on algorithmic efficiency and on the simplicity of the proposed approaches, to avoid overly complex models that users would not adopt. The major challenge of the thesis is to remain close to concrete remote sensing problems without losing the methodological interest from the machine learning viewpoint: in this sense, the work aims at building a bridge between the machine learning and remote sensing communities, and all the proposed models have been developed keeping in mind the need for such a synergy. Four models are proposed. First, an adaptive model learning the relevant image features is proposed to solve the problem of high dimensionality and collinearity of the image features; this model automatically provides an accurate classifier together with a ranking of the relevance of the individual features. The scarcity and unreliability of labeled information are the common root of the second and third models: when confronted with such problems, the user can either construct the labeled set iteratively by direct interaction with the machine or use the unlabeled data to increase the robustness and quality of the data description. Both solutions have been explored, resulting in two methodological contributions based respectively on active learning and semi-supervised learning. Finally, the more theoretical issue of structured outputs is considered in the last model, which, by integrating output similarity into the model, opens new challenges and opportunities for remote sensing image processing.
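As a rough illustration of the active learning strategy mentioned above (not the thesis implementation), the sketch below runs margin-based uncertainty sampling with a scikit-learn SVM on synthetic data: at each iteration the sample closest to the decision boundary is queried and added to the training set.

```python
# Minimal sketch of margin-based active learning with an SVM, assuming
# scikit-learn and synthetic data; an illustration of the general
# strategy described above, not the thesis implementation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
rng = np.random.default_rng(0)

labeled = list(rng.choice(len(X), size=20, replace=False))   # small initial training set
unlabeled = [i for i in range(len(X)) if i not in labeled]

for iteration in range(10):
    clf = SVC(kernel="rbf", gamma="scale").fit(X[labeled], y[labeled])
    # Query the unlabeled sample closest to the decision boundary
    # (smallest absolute decision-function value = most uncertain).
    scores = np.abs(clf.decision_function(X[unlabeled]))
    query = unlabeled[int(np.argmin(scores))]
    labeled.append(query)          # the "user" provides the label y[query]
    unlabeled.remove(query)

print("final training set size:", len(labeled))
print("accuracy on remaining samples:", clf.score(X[unlabeled], y[unlabeled]))
```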
Abstract:
BACKGROUND: To compare the prognostic relevance of the Masaoka and Müller-Hermelink classifications. METHODS: We treated 71 patients with thymic tumors at our institution between 1980 and 1997. Complete follow-up was achieved in 69 patients (97%), with a mean follow-up time of 8.3 years (range, 9 months to 17 years). RESULTS: Masaoka stage I was found in 31 patients (44.9%), stage II in 17 (24.6%), stage III in 19 (27.6%), and stage IV in 2 (2.9%). The 10-year overall survival rate was 83.5% for stage I, 100% for stage IIa, 58% for stage IIb, 44% for stage III, and 0% for stage IV; the disease-free survival rates were 100%, 70%, 40%, 38%, and 0%, respectively. Histologic classification according to Müller-Hermelink found medullary tumors in 7 patients (10.1%), mixed in 18 (26.1%), organoid in 14 (20.3%), cortical in 11 (15.9%), well-differentiated thymic carcinoma in 14 (20.3%), and endocrine carcinoma in 5 (7.3%), with 10-year overall survival rates of 100%, 75%, 92%, 87.5%, 30%, and 0%, respectively, and 10-year disease-free survival rates of 100%, 100%, 77%, 75%, 37%, and 0%, respectively. Medullary, mixed, and well-differentiated organoid tumors were correlated with stages I and II, and well-differentiated thymic carcinoma and endocrine carcinoma with stages III and IV (p < 0.001). Multivariate analysis showed age, gender, myasthenia gravis, and postoperative adjuvant therapy not to be significant predictors of overall and disease-free survival after complete resection, whereas the Müller-Hermelink and Masaoka classifications were independent significant predictors of overall (p < 0.05) and disease-free survival (p < 0.004; p < 0.0001). CONCLUSIONS: Considering both staging and histology in thymic tumors has the potential to improve recurrence prediction and patient selection for combined treatment modalities.
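Stage-specific 10-year survival rates such as those reported above are typically estimated with the Kaplan-Meier method; the sketch below illustrates the computation with the lifelines package on entirely synthetic follow-up data (the cohort sizes are borrowed from the abstract, everything else is invented).

```python
# Hedged illustration of estimating stage-specific 10-year overall
# survival with the Kaplan-Meier method; assumes the lifelines package
# and uses synthetic data, not the study's patient records.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)

# Synthetic follow-up times (years) and event indicators (1 = death)
# for two hypothetical Masaoka stages.
cohorts = {
    "stage I": (rng.exponential(scale=40, size=31), rng.integers(0, 2, size=31)),
    "stage III": (rng.exponential(scale=10, size=19), rng.integers(0, 2, size=19)),
}

for stage, (durations, events) in cohorts.items():
    kmf = KaplanMeierFitter()
    kmf.fit(durations, event_observed=events, label=stage)
    survival_10y = float(kmf.predict(10.0))
    print(f"{stage}: estimated 10-year overall survival = {survival_10y:.0%}")
```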
Abstract:
When dealing with multi-angular image sequences, problems of reflectance changes due either to illumination and acquisition geometry or to interactions with the atmosphere naturally arise. These phenomena interplay with the scene and lead to a modification of the measured radiance: for example, depending on the acquisition angle, tall objects may be seen from the top or from the side, and different light scattering may affect the surfaces. This results in shifts in the acquired radiance that make the problem of multi-angular classification harder and can lead to catastrophic results, since surfaces with the same reflectance return significantly different signals. In this paper, rather than performing atmospheric or bidirectional reflectance distribution function (BRDF) correction, a non-linear manifold learning approach is used to align the data structures. The method maximizes the similarity between the different acquisitions by deforming their manifolds, thus enhancing the transferability of classification models among the images of the sequence.
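As a much-simplified illustration of the alignment idea (the paper uses a non-linear manifold learning method, which this is not), the sketch below embeds two synthetic angular acquisitions with PCA and rotates one embedding onto the other with an orthogonal Procrustes step, using scikit-learn and SciPy.

```python
# Much-simplified sketch of the alignment idea: embed two angular
# acquisitions separately and rotate one embedding onto the other so
# that a classifier trained on one transfers to the other. A plain
# PCA + orthogonal Procrustes step stands in for the paper's non-linear
# manifold alignment; all data are synthetic.
import numpy as np
from scipy.linalg import orthogonal_procrustes
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic "radiance" for the same 500 surface samples seen at two
# angles: the second acquisition is a distorted, shifted version of the first.
acq_nadir = rng.normal(size=(500, 8))
distortion = rng.normal(scale=0.1, size=(8, 8)) + np.eye(8)
acq_oblique = acq_nadir @ distortion + 0.5

# Embed each acquisition in its own low-dimensional space.
emb_nadir = PCA(n_components=3).fit_transform(acq_nadir)
emb_oblique = PCA(n_components=3).fit_transform(acq_oblique)

# Align the oblique embedding onto the nadir one (in practice this uses
# samples seen in both acquisitions; here all 500 play that role).
R, _ = orthogonal_procrustes(emb_oblique, emb_nadir)
emb_oblique_aligned = emb_oblique @ R

residual = np.linalg.norm(emb_oblique_aligned - emb_nadir) / np.linalg.norm(emb_nadir)
print(f"relative alignment residual: {residual:.2f}")
```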
Abstract:
For several years, the lack of consensus on the definition, nomenclature, natural history, and biology of serrated polyps (SPs) of the colon has created considerable confusion among pathologists. According to the latest WHO classification, the family of SPs comprises hyperplastic polyps (HPs), sessile serrated adenomas/polyps (SSA/Ps), and traditional serrated adenomas (TSAs). The term SSA/P with dysplasia has replaced the category of mixed hyperplastic/adenomatous polyps (MPs). The present study aimed to evaluate the reproducibility of the diagnosis of SPs based on currently available diagnostic criteria and interactive consensus development. In an initial round, H&E slides of 70 cases of SPs were circulated among participating pathologists across Europe. This round was followed by a consensus discussion on diagnostic criteria. A second round was performed on the same 70 cases using the revised criteria and definitions of the recent WHO classification. Data were evaluated for inter-observer agreement using kappa statistics. In the initial round, a fair overall kappa value of 0.318 was reached for the 70 cases, whereas in the second round the overall kappa value improved to moderate agreement (kappa = 0.557; p < 0.001). The kappa values for each diagnostic category also improved significantly in the final round, reaching 0.977 for HP, 0.912 for SSA/P, and 0.845 for TSA (p < 0.001). The diagnostic reproducibility of SPs improves when strictly defined, standardized diagnostic criteria adopted by consensus are applied.
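Agreement figures like those above are computed with kappa statistics; the minimal sketch below shows the two-rater Cohen's kappa with scikit-learn on invented diagnoses (the study itself involved many pathologists, for which multi-rater extensions of kappa are used).

```python
# Minimal illustration of Cohen's kappa for inter-observer agreement,
# using scikit-learn and made-up diagnoses of 10 serrated polyps by two
# hypothetical observers (the study used multi-rater statistics).
from sklearn.metrics import cohen_kappa_score

observer_1 = ["HP", "HP", "SSA/P", "TSA", "HP", "SSA/P", "SSA/P", "TSA", "HP", "HP"]
observer_2 = ["HP", "SSA/P", "SSA/P", "TSA", "HP", "SSA/P", "HP", "TSA", "HP", "HP"]

kappa = cohen_kappa_score(observer_1, observer_2)
print(f"Cohen's kappa = {kappa:.3f}")  # about 0.68 with these made-up ratings
```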
Abstract:
The paper presents a novel method for monitoring network optimisation based on a recent machine learning technique, the support vector machine. The method is problem-oriented in the sense that it directly answers the question of whether an advised spatial location is important for the classification model, and it can be used to increase the accuracy of classification models by taking a small number of additional measurements. Traditionally, network optimisation is performed by analysing the kriging variances. A comparison of the method with the traditional approach is presented in a real case study with climate data.
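A hedged sketch of the general idea (not the paper's code): train an SVM on the existing monitoring network and flag candidate locations that fall inside the margin as the most informative places for additional measurements. The coordinates, labels, and threshold below are invented for illustration.

```python
# Illustrative sketch: candidate locations inside the SVM margin are
# where the classification model is least certain, i.e. where a new
# measurement helps most. Synthetic coordinates and labels; scikit-learn.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Existing monitoring stations: 2-D coordinates and a binary class
# (e.g., exceedance / non-exceedance of a climate threshold).
stations = rng.uniform(0, 100, size=(80, 2))
labels = (stations[:, 0] + stations[:, 1] + rng.normal(scale=10, size=80) > 100).astype(int)

clf = SVC(kernel="rbf", gamma="scale", C=10.0).fit(stations, labels)

# Candidate locations where a few extra measurements could be taken.
candidates = rng.uniform(0, 100, size=(200, 2))
margin_distance = np.abs(clf.decision_function(candidates))

# Flag the candidates lying inside the margin (|f(x)| < 1).
informative = candidates[margin_distance < 1.0]
print(f"{len(informative)} of {len(candidates)} candidate locations flagged as informative")
```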
Abstract:
An exhaustive classification of the matrix effects occurring when sample preparation is performed prior to liquid chromatography coupled to mass spectrometry (LC-MS) analysis was proposed. A total of eight different situations were identified, allowing the matrix effect typology to be recognized via the calculation of four recovery values. A set of 198 compounds was used to evaluate matrix effects after solid-phase extraction (SPE) from plasma or urine samples prior to LC-ESI-MS analysis. Matrix effects were identified for all compounds and classified using an organization chart. Only 17% of the tested compounds did not present significant matrix effects.
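The paper's own scheme of four recovery values and eight classes is not reproduced here; as a hedged illustration, the sketch below computes generic Matuszewski-style recovery ratios from peak areas and flags signal suppression or enhancement, with invented numbers and thresholds.

```python
# Illustrative recovery-ratio calculations of the kind used to classify
# matrix effects. The paper defines its own four recoveries and eight
# classes; the quantities below follow the generic Matuszewski-type
# scheme, and all numbers and thresholds are invented.
def matrix_effect_summary(area_neat: float, area_post_spike: float, area_pre_spike: float) -> dict:
    """Peak areas of the analyte in neat standard, in blank extract spiked
    after SPE, and in the sample spiked before SPE."""
    matrix_effect = 100.0 * area_post_spike / area_neat        # <100 %: suppression, >100 %: enhancement
    extraction_recovery = 100.0 * area_pre_spike / area_post_spike
    process_efficiency = 100.0 * area_pre_spike / area_neat
    return {
        "matrix effect (%)": matrix_effect,
        "SPE recovery (%)": extraction_recovery,
        "process efficiency (%)": process_efficiency,
        "significant suppression": matrix_effect < 85.0,       # illustrative threshold
        "significant enhancement": matrix_effect > 115.0,
    }


print(matrix_effect_summary(area_neat=1.00e6, area_post_spike=7.2e5, area_pre_spike=6.5e5))
```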
Abstract:
BACKGROUND: Many clinical studies are ultimately not fully published in peer-reviewed journals. Underreporting of clinical research is wasteful and can result in biased estimates of treatment effect or harm, leading to recommendations that are inappropriate or even dangerous. METHODS: We assembled a cohort of clinical studies approved between 2000 and 2002 by the Research Ethics Committee of the University of Freiburg, Germany. Published full articles were searched for in electronic databases, and investigators were contacted. Data on study characteristics were extracted from protocols and corresponding publications. We characterized the cohort, quantified its publication outcome, and compared protocols and publications for selected aspects. RESULTS: Of 917 approved studies, 807 were started and 110 were not started, either locally or at all. Of the started studies, 576 (71%) were completed according to protocol, 128 (16%) were discontinued, and 42 (5%) are still ongoing; for 61 (8%) there was no information about their course. We identified 782 full publications corresponding to 419 of the 807 initiated studies; the publication proportion was 52% (95% CI 48%-55%). Study design was not significantly associated with subsequent publication. Multicentre status, international collaboration, large sample size, and commercial or non-commercial funding were positively associated with subsequent publication. Commercial funding was mentioned in 203 (48%) protocols and in 205 (49%) publications; in most published studies (339; 81%) this information was consistent between protocol and publication. Most studies were published in English (367; 88%), some in German (25; 6%), and some in both languages (27; 6%). The local investigators were listed as (co-)authors in the publications corresponding to 259 (62%) studies. CONCLUSION: Half of the clinical research conducted at a large German university medical centre remains unpublished; future research is built on an incomplete database. Research resources are likely wasted, as neither health care professionals nor patients nor policy makers can use the results when making decisions.
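The publication proportion and its confidence interval quoted above follow from the reported counts with a standard normal approximation; the snippet below recomputes them using only the Python standard library.

```python
# Recompute the publication proportion and its 95% confidence interval
# from the counts in the abstract (normal approximation).
import math

published, initiated = 419, 807
p = published / initiated
se = math.sqrt(p * (1 - p) / initiated)
lower, upper = p - 1.96 * se, p + 1.96 * se
print(f"publication proportion = {p:.0%} (95% CI {lower:.0%}-{upper:.0%})")
# -> publication proportion = 52% (95% CI 48%-55%), matching the abstract
```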
Abstract:
Lung cancer is characterized by the highest incidence of solid tumor-related brain metastases, which have been reported with growing incidence over the last decade. Prognostic assessment may help to identify subgroups of patients who could benefit from more aggressive therapy of metastatic disease, in particular when the central nervous system is involved. The recent sub-classification of non-small cell lung cancer (NSCLC) into molecularly defined "oncogene-addicted" tumors, the emergence of effective targeted treatments in molecularly defined patient subsets, the overall improvement in advanced NSCLC survival, and the availability of refined new radiotherapy techniques are all likely to affect the outcomes of patients with brain dissemination. The present review focuses on key evidence and research strategies for the systemic treatment of patients with central nervous system involvement in non-small cell lung cancer.