885 results for fuzzy based evaluation method


Relevance: 40.00%

Abstract:

In this paper we propose a highly accurate approximation procedure for ruin probabilities in the classical collective risk model, which is based on a quadrature/rational approximation procedure proposed in [2]. For a certain class of claim size distributions (which contains the completely monotone distributions) we give a theoretical justification for the method. We also show that under weaker assumptions on the claim size distribution, the method may still perform reasonably well in some cases. This in particular provides an efficient alternative to a related method proposed in [3]. A number of numerical illustrations of the performance of this procedure are provided for both completely monotone and other types of random variables.
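The abstract does not reproduce the quadrature/rational procedure itself, but the classical model it targets has a closed-form ruin probability when claim sizes are exponential (a completely monotone case), which is the natural sanity check for any numerical approximation. The sketch below computes that textbook formula, not the authors' method; the parameter names `lam` (claim arrival rate), `mu` (mean claim size) and `c` (premium rate) are our own:

```python
import math

def ruin_probability_exponential(u, lam, mu, c):
    """Exact ruin probability psi(u) in the Cramér-Lundberg model with
    Exp(1/mu) claim sizes: psi(u) = (lam*mu/c) * exp(-(1/mu - lam/c) * u).
    Assumes the net profit condition c > lam*mu; otherwise ruin is certain."""
    if c <= lam * mu:
        return 1.0  # no positive safety loading: ruin with probability 1
    return (lam * mu / c) * math.exp(-(1.0 / mu - lam / c) * u)
```

For example, with lam = mu = 1 and c = 2, psi(0) is exactly lam*mu/c = 0.5 and decays exponentially in the initial capital u.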

Relevance: 40.00%

Abstract:

The aim of this paper is to evaluate the risks associated with the use of fake fingerprints on a livescan device equipped with a liveness detection method. The method is based on the optical properties of the skin: the sensor uses several polarizations and illuminations to capture information from the different layers of the human skin. The experiments also allow determining under which conditions the system is deceived and whether the nature of the fake, the mould used for its production, or the individuals involved in the attack have an influence. These experiments showed that current multispectral sensors can be deceived by fake fingerprints created with or without the cooperation of the subject. Fakes created from direct casts perform better than those created from indirect casts. The results showed that the success of the attack is influenced by two main factors. The first is the quality of the fakes and, by extension, the quality of the original fingerprint. The second is the combination of the general patterns involved in the attacks, since an appropriate combination can strongly increase the rate of successful attacks.

Relevance: 40.00%

Abstract:

OBJECTIVE: Evaluation of the quantitative antibiogram as an epidemiological tool for the prospective typing of methicillin-resistant Staphylococcus aureus (MRSA), and comparison with ribotyping. METHODS: The method is based on the multivariate analysis of inhibition zone diameters of antibiotics in disk diffusion tests. Five antibiotics were used (erythromycin, clindamycin, cotrimoxazole, gentamicin, and ciprofloxacin). Ribotyping was performed using seven restriction enzymes (EcoRV, HindIII, KpnI, PstI, EcoRI, SfuI, and BamHI). SETTING: 1,000-bed tertiary university medical center. RESULTS: During a 1-year period, 31 patients were found to be infected or colonized with MRSA. Cluster analysis of antibiogram data showed nine distinct antibiotypes. Four antibiotypes were isolated from multiple patients (2, 4, 7, and 13, respectively). Five additional antibiotypes were isolated from the remaining five patients. When analyzed with respect to the epidemiological data, the method was found to be equivalent to ribotyping. Among 206 staff members who were screened, six were carriers of MRSA. Both typing methods identified concordant MRSA types in staff members and in the patients under their care. CONCLUSIONS: The quantitative antibiogram was found to be equivalent to ribotyping as an epidemiological tool for typing of MRSA in our setting. Thus, this simple, rapid, and readily available method appears to be suitable for the prospective surveillance and control of MRSA for hospitals that do not have molecular typing facilities and in which MRSA isolates are not uniformly resistant or susceptible to the antibiotics tested.
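The abstract does not specify the multivariate procedure behind the cluster analysis, but the core idea (grouping isolates whose five inhibition-zone diameter profiles are mutually close) can be illustrated as single-linkage grouping under a distance threshold. The isolate profiles and the 5 mm threshold below are hypothetical, not the study data:

```python
import math
from itertools import combinations

# Hypothetical inhibition-zone diameters (mm) per isolate, in the order:
# erythromycin, clindamycin, cotrimoxazole, gentamicin, ciprofloxacin.
isolates = {
    "A": [10, 12, 25, 18, 9],
    "B": [11, 13, 24, 18, 10],
    "C": [28, 27, 25, 19, 26],
}

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def antibiotypes(data, threshold=5.0):
    """Group isolates whose diameter profiles lie within `threshold` mm
    of each other (single linkage via union-find); each group is one
    antibiotype."""
    parent = {k: k for k in data}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for a, b in combinations(data, 2):
        if euclidean(data[a], data[b]) <= threshold:
            parent[find(a)] = find(b)
    groups = {}
    for k in data:
        groups.setdefault(find(k), []).append(k)
    return sorted(groups.values())
```

With these numbers, isolates A and B fall into one antibiotype and C forms its own, mirroring how distinct resistance profiles separate in the study.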

Relevance: 40.00%

Abstract:

PURPOSE: To objectively characterize different heart tissues from the functional and viability images provided by composite-strain-encoding (C-SENC) MRI. MATERIALS AND METHODS: C-SENC is a new MRI technique for simultaneously acquiring cardiac functional and viability images. In this work, an unsupervised multi-stage fuzzy clustering method is proposed to identify different heart tissues in the C-SENC images. The method is based on the sequential application of the fuzzy c-means (FCM) and iterative self-organizing data (ISODATA) clustering algorithms. The proposed method is tested on simulated heart images and on images from nine patients with and without myocardial infarction (MI). The resulting clustered images are compared with MRI delayed-enhancement (DE) viability images for determining MI. Also, Bland-Altman analysis is conducted between the two methods. RESULTS: Normal myocardium, infarcted myocardium, and blood are correctly identified using the proposed method. The clustered images correctly identified 90 ± 4% of the pixels defined as infarct in the DE images. In addition, 89 ± 5% of the pixels defined as infarct in the clustered images were also defined as infarct in the DE images. The Bland-Altman results show no bias between the two methods in identifying MI. CONCLUSION: The proposed technique allows for objectively identifying different heart tissues, which is potentially important for clinical decision-making in patients with MI.
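The abstract names fuzzy c-means (FCM) as the first stage of the method. A minimal 1-D FCM sketch on scalar pixel intensities is shown below; it is not the authors' C-SENC pipeline (which also applies ISODATA to multi-channel images), and the deterministic evenly spread centre initialization is our own assumption for reproducibility:

```python
def fuzzy_c_means(xs, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means sketch: returns cluster centres and a
    membership matrix u[k][i] (degree of point i in cluster k)."""
    lo, hi = min(xs), max(xs)
    centres = [lo + (hi - lo) * k / (c - 1) for k in range(c)]  # spread init
    u = [[0.0] * len(xs) for _ in range(c)]
    for _ in range(iters):
        # memberships: inverse distance ratios raised to 2/(m-1)
        for i, x in enumerate(xs):
            d = [abs(x - ck) or 1e-12 for ck in centres]  # avoid zero div
            for k in range(c):
                u[k][i] = 1.0 / sum((d[k] / dj) ** (2.0 / (m - 1.0)) for dj in d)
        # centres: membership-weighted means
        for k in range(c):
            w = [u[k][i] ** m for i in range(len(xs))]
            centres[k] = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
    return centres, u
```

On two well-separated intensity groups the centres converge to the group means, which is the behaviour the clustering stage relies on to separate blood, normal myocardium, and infarct.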

Relevance: 40.00%

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on measurements of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.

Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.

Results: 12 software tools were identified, tested, and ranked, yielding a comprehensive review of the available tools' characteristics. The number of drugs handled varies widely, and 8 programs offer the user the ability to add their own drug models. 10 programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top 2 software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly.

Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage, and report generation.
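The weighted scoring grid described in the Method section can be illustrated generically. The criterion names come from the abstract, but the weights and scores below are hypothetical, not the study's actual grid:

```python
# Hypothetical grid: criterion -> (weight, score out of 10).
example_grid = {
    "pharmacokinetic relevance": (3, 8),
    "user-friendliness": (2, 6),
    "computing aspects": (1, 7),
    "interfacing": (1, 5),
    "storage": (1, 6),
}

def weighted_score(grid):
    """Weighted mean of criterion scores: higher-weight criteria (here,
    pharmacokinetic relevance) dominate the overall ranking."""
    total_weight = sum(w for w, _ in grid.values())
    return sum(w * s for w, s in grid.values()) / total_weight
```

Computing such a score per software tool and sorting produces a ranked list of the kind reported in the Results.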

Relevance: 40.00%

Abstract:

False identity documents constitute a potentially powerful source of forensic intelligence because they are essential elements of transnational crime and provide cover for organized crime. In previous work, a systematic profiling method using false documents' visual features was built within a forensic intelligence model. In the current study, the comparison process and metrics lying at the heart of this profiling method are described and evaluated. This evaluation takes advantage of 347 false identity documents of four different types seized in two countries, whose sources were known to be common or different (following police investigations and the dismantling of counterfeit factories). Intra-source and inter-source variations were evaluated through the computation of more than 7500 similarity scores. The profiling method could thus be validated and its performance assessed using two complementary approaches to measuring type I and type II error rates: a binary classification and the computation of likelihood ratios. Very low error rates were measured across the four document types, demonstrating the validity and robustness of the method in linking documents to a common source and in differentiating them. These results pave the way for an operational implementation of a systematic profiling process integrated in a developed forensic intelligence model.
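For the binary-classification approach to type I and type II error rates, a minimal sketch is to threshold the similarity scores: same-source (intra-source) pairs falling below the threshold are missed links, and different-source (inter-source) pairs at or above it are false links. This is a generic illustration with invented scores, not the seized-document data:

```python
def error_rates(intra_scores, inter_scores, threshold):
    """Type I rate: fraction of same-source pairs wrongly separated
    (score below threshold). Type II rate: fraction of different-source
    pairs wrongly linked (score at or above threshold)."""
    type1 = sum(s < threshold for s in intra_scores) / len(intra_scores)
    type2 = sum(s >= threshold for s in inter_scores) / len(inter_scores)
    return type1, type2
```

Sweeping the threshold over the score range and recording both rates at each value yields the trade-off curve from which an operating point is chosen; the likelihood-ratio approach mentioned in the abstract evaluates the same score distributions probabilistically instead of with a hard cut-off.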

Relevance: 40.00%

Abstract:

With the use of supplementary cementing materials (SCMs) in concrete mixtures, salt scaling tests such as ASTM C672 have been found to be overly aggressive and do not correlate well with field scaling performance. This is thought to be because, at high replacement levels, SCM mixtures can take longer to set and to develop their properties, and neither of these factors is taken into account in the standard laboratory finishing and curing procedures. As a result, these variables were studied, along with a modified scaling test based on the Quebec BNQ scaling test, which had shown promise in other research. The experimental research focused on the evaluation of three scaling resistance tests: the ASTM C672 test with normal curing as well as with an accelerated curing regime used by VDOT for ASTM C1202 rapid chloride permeability tests and now included as an option in ASTM C1202, and several variations on the proposed draft ASTM WK9367 deicer scaling resistance test, based on the Quebec Ministry of Transportation BNQ test method, evaluated for concretes containing varying amounts of slag cement. A total of 16 concrete mixtures were studied, using both high-alkali and low-alkali cement, and Grade 100 and Grade 120 slag at 0, 20, 35 and 50 percent slag replacement by mass of total cementing materials. Vinsol resin was used as the primary air entrainer, and Micro Air® was used in two replicate mixes for comparison. Based on the results of this study, a draft alternative test method to ASTM C672 is proposed.

Relevance: 40.00%

Abstract:

Voxel-based morphometry from conventional T1-weighted images has proved effective for quantifying Alzheimer's disease (AD) related brain atrophy and for enabling fairly accurate automated classification of AD patients, patients with mild cognitive impairment (MCI), and elderly controls. Little is known, however, about the classification power of volume-based morphometry, where the features of interest consist of a few brain structure volumes (e.g. hippocampi, lobes, ventricles), as opposed to hundreds of thousands of voxel-wise gray matter concentrations. In this work, we experimentally evaluate two distinct volume-based morphometry algorithms (FreeSurfer and an in-house algorithm called MorphoBox) for automatic disease classification on a standardized data set from the Alzheimer's Disease Neuroimaging Initiative. Results indicate that both algorithms achieve classification accuracy comparable to the conventional whole-brain voxel-based morphometry pipeline using SPM for AD vs elderly controls and MCI vs controls, and higher accuracy for classification of AD vs MCI and early vs late AD converters, thereby demonstrating the potential of volume-based morphometry to assist the diagnosis of mild cognitive impairment and Alzheimer's disease.
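Neither volume-based pipeline is reproduced in the abstract; as a generic illustration of classifying subjects from a handful of structure volumes, a nearest-centroid rule over feature vectors can be sketched. The labels and volume values below are hypothetical, not ADNI data, and real pipelines use more capable classifiers:

```python
def nearest_centroid(train, features):
    """Nearest-centroid classifier over volume feature vectors.
    train maps a label to a list of feature vectors (e.g. hippocampal
    volume, ventricular volume); returns the label whose per-class mean
    vector is closest to `features` in squared Euclidean distance."""
    def centroid(vectors):
        return [sum(col) / len(vectors) for col in zip(*vectors)]
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    centroids = {label: centroid(vs) for label, vs in train.items()}
    return min(centroids, key=lambda label: dist2(centroids[label], features))

# Hypothetical training data: [hippocampal volume (mL), ventricle volume (mL)].
example_train = {
    "AD": [[2.1, 40.0], [2.3, 38.0]],
    "CTL": [[3.5, 25.0], [3.7, 24.0]],
}
```

The appeal of volume-based features, as the abstract argues, is exactly this low dimensionality: a few interpretable numbers per subject rather than hundreds of thousands of voxel values.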

Relevance: 40.00%

Abstract:

Background: Limited information exists regarding the association between serum uric acid (SUA) and psychiatric disorders. We explored the relationship between SUA and subtypes of major depressive disorder (MDD) and specific anxiety disorders. Additionally, we examined the association of the SLC2A9 rs6855911 variant with anxiety disorders. Methods: We conducted a cross-sectional analysis of 3,716 individuals aged 35-66 years previously selected for the population-based CoLaus survey who agreed to undergo further psychiatric evaluation. SUA was measured using the uricase-PAP method. The French translation of the semi-structured Diagnostic Interview for Genetic Studies was used to establish lifetime and current diagnoses of depression and anxiety disorders according to the DSM-IV criteria. Results: Men reported significantly higher levels of SUA than women (357±74 μmol/L vs. 263±64 μmol/L). The prevalence of lifetime and current MDD was 44% and 18%, respectively, while the corresponding estimates for any anxiety disorder were 18% and 10%. A quadratic hockey-stick shaped curve explained the relationship between SUA and social phobia better than a linear trend. However, with regard to the other specific anxiety disorders and the other subtypes of MDD, there was no consistent pattern of association. Further analyses using the SLC2A9 rs6855911 variant, known to be strongly associated with SUA, supported the quadratic relationship observed between the SUA phenotype and social phobia. Conclusions: A quadratic relationship between SUA and social phobia was observed, consistent with a protective effect of moderately elevated SUA on social phobia which disappears at higher concentrations. Further studies are needed to confirm our observations.
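Comparing a quadratic (hockey-stick) trend against a linear one, as in the abstract, amounts to comparing least-squares fits of degree 2 and degree 1. The self-contained sketch below solves the normal equations with Gaussian elimination; it is a generic illustration on synthetic numbers, not the CoLaus analysis (which used regression with covariate adjustment):

```python
def polyfit(xs, ys, deg):
    """Least-squares polynomial fit via the normal equations; returns
    coefficients coef where coef[k] multiplies x**k."""
    n = deg + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                      # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n                          # back substitution
    for i in range(n - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

def sse(xs, ys, coef):
    """Sum of squared residuals of the fitted polynomial."""
    return sum((y - sum(c * x ** k for k, c in enumerate(coef))) ** 2
               for x, y in zip(xs, ys))
```

If the quadratic fit reduces the residual sum of squares substantially relative to the linear fit (formally assessed with an F-test or information criterion), the curvilinear relationship is preferred, which is the pattern reported for SUA and social phobia.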

Relevance: 40.00%

Abstract:

This paper presents a new non-parametric atlas registration framework, derived from the optical flow model and active contour theory, applied to automatic subthalamic nucleus (STN) targeting in deep brain stimulation (DBS) surgery. In a previous work, we demonstrated that the STN position can be predicted from the position of surrounding visible structures, namely the lateral and third ventricles. An STN targeting process can thus be obtained by registering these structures of interest between a brain atlas and the patient image. Here we aim to improve on the results of state-of-the-art targeting methods and at the same time to reduce the computational time. Our simultaneous segmentation and registration model shows mean STN localization errors statistically similar to those of the best-performing registration algorithms tested so far and to the targeting expert's variability. Moreover, the computational time of our registration method is much lower, which is a worthwhile improvement from a clinical point of view.

Relevance: 40.00%

Abstract:

BACKGROUND: The evaluation of syncope often remains unstructured. The aim of the study was to assess the effectiveness of a standardized protocol designed to improve the diagnosis of syncope. METHODS: Consecutive patients with syncope presenting to the emergency departments of two primary and tertiary care hospitals over a period of 18 months underwent a two-phase evaluation including: 1) noninvasive assessment (phase I); and 2) specialized tests (phase II), if syncope remained unexplained after phase I. During phase II, the evaluation strategy was alternately left to the physicians in charge of the patients (control) or guided by a standardized protocol relying on cardiac status and frequency of events (intervention). The primary outcomes were the diagnostic yield of each phase and the impact of the intervention (phase II) measured by multivariable analysis. RESULTS: Among 1725 patients with syncope, 1579 (92%) entered phase I, which established a diagnosis in 1061 (67%) of them, mainly reflex causes and orthostatic hypotension. Five hundred eighteen patients (33%) were considered as having unexplained syncope, and 363 (70%) of them entered phase II. A cause of syncope was found in 67 (38%) of 174 patients during intervention periods, compared to 18 (9%) of 189 during control periods (p<0.001). Compared to control periods, the intervention permitted diagnosing more cardiac syncope (8% vs 3%, p=0.04) and reflex syncope (25% vs 6%, p<0.001), and increased the odds of identifying a cause of syncope by a factor of 4.5 (95% CI: 2.6-8.7, p<0.001). Overall, adding the diagnostic yields obtained during phase I and phase II (intervention periods) established the cause of syncope in 76% of patients. CONCLUSION: Application of a standardized diagnostic protocol in patients with syncope improved the likelihood of identifying a cause for this symptom. Future trials should assess the efficacy of diagnosis-specific therapy.
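From the phase II counts reported above (67 of 174 during intervention vs 18 of 189 during control), a crude odds ratio can be computed as a sanity check; note that it is unadjusted and so differs from the factor of 4.5 obtained by multivariable analysis:

```python
def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table: (a/b) / (c/d), where a and b are
    events and non-events in one group, and c and d in the other."""
    return (a / b) / (c / d)

# 67 diagnoses vs 107 without, during intervention periods;
# 18 diagnoses vs 171 without, during control periods.
crude_or = odds_ratio(67, 174 - 67, 18, 189 - 18)
```

The crude estimate comes out near 5.9; the gap to the reported adjusted value of 4.5 is what covariate adjustment in the multivariable model accounts for.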

Relevance: 40.00%

Abstract:

In this paper, we present the segmentation of the head and neck lymph node regions using a new active contour-based atlas registration model. We propose to segment the lymph node regions without directly including them in the atlas registration process; instead, they are segmented using the dense deformation field computed from the registration of the atlas structures with distinct boundaries. This approach results in robust and accurate segmentation of the lymph node regions even in the presence of significant anatomical variations between the atlas image and the patient's image to be segmented. We also present a quantitative evaluation of lymph node region segmentation using various statistical as well as geometrical metrics: sensitivity, specificity, Dice similarity coefficient, and Hausdorff distance. A comparison of the proposed method with two other state-of-the-art methods is presented. The robustness of the proposed method to the atlas selection, in segmenting the lymph node regions, is also evaluated.
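The Dice similarity coefficient and Hausdorff distance used in the evaluation have compact definitions, sketched below over sets of voxel coordinates. The brute-force Hausdorff computation is fine for illustration, though real segmentation pipelines typically use distance transforms for speed:

```python
def dice(a, b):
    """Dice similarity coefficient between two voxel label sets:
    2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    return 2 * len(a & b) / (len(a) + len(b))

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets: the largest
    distance from any point in one set to its nearest point in the other."""
    def d(p, q):
        return sum((x - y) ** 2 for x, y in zip(p, q)) ** 0.5
    def directed(xs, ys):
        return max(min(d(p, q) for q in ys) for p in xs)
    return max(directed(a, b), directed(b, a))
```

Dice summarizes volume overlap, while Hausdorff is sensitive to the single worst boundary error, which is why the two are reported together when validating segmentations.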

Relevance: 40.00%

Abstract:

Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Rio Parana, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models. A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales. (C) 2012 Elsevier B.V. All rights reserved.
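The fixed-lid planar water surface approximation mentioned above can be illustrated on a toy cross-section: depths are the lid elevation minus the bed elevation, and a reduced-complexity scheme can then split the total discharge across cells in proportion to a conveyance term. The depth**(5/3) (Manning-type) weighting below is our own assumption for illustration, not the paper's exact RC formulation:

```python
def unit_discharges(bed_elevations, lid_elevation, total_discharge,
                    exponent=5.0 / 3.0):
    """Reduced-complexity sketch: depths come from a fixed-lid planar
    water surface (clipped at zero for dry cells), and the total
    discharge is apportioned in proportion to depth**exponent."""
    depths = [max(lid_elevation - z, 0.0) for z in bed_elevations]
    conveyance = [h ** exponent for h in depths]
    total = sum(conveyance)
    return [total_discharge * c / total for c in conveyance]
```

Deeper cells receive proportionally more of the flow, reproducing the dominant control of channel-scale topography on the depth-averaged flow pattern that the study highlights.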

Relevance: 40.00%

Abstract:

The main objective of this work was to compare two methods of estimating the deposition of pesticide applied by aerial spraying. One hundred and fifty pieces of water-sensitive paper were distributed over an area 50 m long by 75 m wide to sample the droplets sprayed by an aircraft calibrated to apply a spray volume of 32 L/ha. The samples were analysed by a visual microscopic method using an NG 2 Porton graticule and by an image-analysis computer program. The results obtained by the visual microscopic method were: volume median diameter, 398±62 μm; number median diameter, 159±22 μm; droplet density, 22.5±7.0 droplets/cm²; and estimated deposited volume, 22.2±9.4 L/ha. The respective values obtained with the computer program were: 402±58 μm, 161±32 μm, 21.9±7.5 droplets/cm², and 21.9±9.2 L/ha. Graphs of the spatial distribution of droplet density and deposited spray volume over the area were produced by the computer program.
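The droplet-spectrum statistics reported above can be computed from a list of measured diameters. In this simplified sketch, the number median diameter is the plain count median and the volume median diameter is taken at the droplet where the cumulative d³ volume first reaches half the total, without the class interpolation a full analysis program would apply:

```python
def droplet_stats(diameters_um, sampled_area_cm2):
    """Return (NMD, VMD, density) from droplet diameters in micrometres.
    NMD: diameter below which half the droplet count lies; VMD: diameter
    below which half the spray volume (proportional to d**3) lies;
    density: droplets per cm^2 of sampled paper."""
    ds = sorted(diameters_um)
    volumes = [d ** 3 for d in ds]           # volume proportional to d^3
    half_volume = sum(volumes) / 2
    cumulative, vmd = 0.0, ds[-1]
    for d, v in zip(ds, volumes):
        cumulative += v
        if cumulative >= half_volume:
            vmd = d
            break
    nmd = ds[len(ds) // 2]
    return nmd, vmd, len(ds) / sampled_area_cm2
```

Because volume scales with the cube of diameter, a few large droplets dominate the VMD while leaving the NMD almost unchanged, which is why the two medians reported in the study differ so widely (≈400 μm vs ≈160 μm).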