246 results for LIKELIHOOD METHODS


Relevance:

20.00%

Publisher:

Abstract:

This paper presents and discusses the use of Bayesian procedures - introduced through the use of Bayesian networks in Part I of this series of papers - for 'learning' probabilities from data. The discussion will relate to a set of real data on characteristics of black toners commonly used in printing and copying devices. Particular attention is drawn to the incorporation of the proposed procedures as an integral part in probabilistic inference schemes (notably in the form of Bayesian networks) that are intended to address uncertainties related to particular propositions of interest (e.g., whether or not a sample originates from a particular source). The conceptual tenets of the proposed methodologies are presented along with aspects of their practical implementation using currently available Bayesian network software.
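As a minimal illustration of "learning" probabilities from data in the Bayesian sense, the posterior mean of a categorical probability under a symmetric Dirichlet prior can be computed directly. This is a sketch only: the toner-characteristic labels and counts below are hypothetical, not taken from the paper's dataset.

```python
from collections import Counter

def dirichlet_posterior_means(counts, categories, prior=1.0):
    """Posterior mean of each category probability under a symmetric
    Dirichlet(prior) prior with a multinomial likelihood."""
    n = sum(counts.get(c, 0) for c in categories)
    k = len(categories)
    return {c: (counts.get(c, 0) + prior) / (n + k * prior)
            for c in categories}

# Hypothetical observations of a toner characteristic (illustrative labels).
obs = Counter(["styrene", "styrene", "epoxy", "styrene", "acrylic"])
probs = dirichlet_posterior_means(obs, ["styrene", "epoxy", "acrylic"])
```

Probabilities learned this way can then populate the conditional probability tables of the nodes in a Bayesian network.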

Relevance:

20.00%

Publisher:

Abstract:

Summary: Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as that of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WMLs based on each of them is studied. It turns out that in a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Moreover, the performance of the WML is rather stable under a change of the MDE, even when the MDEs themselves behave very differently. Two applications of the WML to real data are considered.
In both of them, the necessity for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. The procedure is particularly natural in the discrete-distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
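The two-phase idea can be sketched for a Poisson model. This is a simplified illustration, not the paper's estimator: the sample median stands in for the minimum disparity estimator, and hard 0/1 rejection weights based on Pearson residuals stand in for the adaptive weighting.

```python
import math
import statistics

def weighted_poisson_mle(data, cutoff=2.5):
    """Two-phase estimate of a Poisson mean: a crude robust initial fit
    (sample median as a stand-in for an MDE), then rejection weights on
    the Pearson residuals, then a weighted MLE sum(w*x)/sum(w)."""
    lam0 = max(statistics.median(data), 0.5)   # robust initial fit
    weights = [1.0 if abs(x - lam0) / math.sqrt(lam0) <= cutoff else 0.0
               for x in data]
    return sum(w * x for w, x in zip(weights, data)) / sum(weights)

clean = [2, 3, 1, 2, 4, 2, 3, 2]
contaminated = clean + [25]                    # one gross outlier
est = weighted_poisson_mle(contaminated)       # outlier gets weight 0
```

Here the plain MLE (the sample mean) is pulled up by the single outlier, while the weighted MLE recovers the estimate computed on the clean observations.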

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND/AIMS: Treatment of chronic HCV infection has become a priority in HIV+ patients, given the faster progression to end-stage liver disease. The primary endpoint of this study was to evaluate and compare the antiviral efficacy of Peginterferon alpha 2a plus ribavirin in HIV-HCV co-infected and HCV mono-infected patients, and to examine whether 6 months of therapy would have the same efficacy in HIV patients with the favourable genotypes 2 and 3 as in mono-infected patients, in order to minimise HCV-therapy-related toxicities. Secondary endpoints were to evaluate predictors of sustained virological response (SVR) and the frequency of side-effects. METHODS: Patients with genotypes 1 and 4 were treated for 48 weeks with Pegasys 180 microg/week plus Copegus 1000-1200 mg/day according to body weight; patients with genotypes 2 and 3 for 24 weeks with Pegasys 180 microg/week plus Copegus 800 mg/day. RESULTS: 132 patients were enrolled in the study: 85 HCV mono-infected (38 with genotypes 1 and 4; 47 with genotypes 2 and 3) and 47 HIV-HCV co-infected (23 with genotypes 1 and 4; 24 with genotypes 2 and 3). In an intention-to-treat analysis, SVR for genotypes 1 and 4 was observed in 58% of HCV mono-infected and in 13% of HIV-HCV co-infected patients (P = 0.001). For genotypes 2 and 3, SVR was observed in 70% of HCV mono-infected and in 67% of HIV-HCV co-infected patients (P = 0.973). Undetectable HCV-RNA at week 4 had a positive predictive value for SVR in mono-infected patients of 0.78 (95% CI: 0.54-0.93) for genotypes 1 and 4 and of 0.81 (95% CI: 0.64-0.92) for genotypes 2 and 3. For co-infected patients with genotypes 2 and 3, the positive predictive value of undetectable HCV-RNA at week 4 was 0.76 (95% CI: 0.50-0.93). The study was not completed by 22 patients (36%) with genotypes 1 and 4 and by 12 patients (17%) with genotypes 2 and 3. CONCLUSION: Genotypes 2 or 3 predict the likelihood of SVR in HCV mono-infected and in HIV-HCV co-infected patients.
A 6-month treatment with Peginterferon alpha 2a plus ribavirin has the same efficacy in HIV-HCV co-infected patients with genotypes 2 and 3 as in mono-infected patients. HCV-RNA negativity at 4 weeks has a positive predictive value for SVR. Aggressive treatment of adverse effects to avoid dose reduction, consent withdrawal or drop-out is crucial to increase the rate of SVR, especially when the duration of treatment is 48 weeks. Sixty-one percent of HIV-HCV co-infected patients with genotypes 1 and 4 did not complete the study, compared with 4% of those with genotypes 2 and 3.
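Positive predictive values with confidence intervals, like those reported above, can be reproduced from raw counts along the following lines. This is a sketch: the counts are hypothetical, and the Wilson score interval is an assumption, since the study does not state which interval method was used.

```python
import math

def ppv_wilson(tp, fp, z=1.96):
    """Positive predictive value TP/(TP+FP) with a Wilson score
    interval for the proportion (interval choice is an assumption)."""
    n = tp + fp
    p = tp / n
    centre = (p + z * z / (2 * n)) / (1 + z * z / n)
    half = (z / (1 + z * z / n)) * math.sqrt(p * (1 - p) / n
                                             + z * z / (4 * n * n))
    return p, centre - half, centre + half

# Hypothetical counts: 14 true positives, 4 false positives.
ppv, lo, hi = ppv_wilson(tp=14, fp=4)
```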

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows a detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Independently of the origin of the images and of the type of surface change, correct processing of such data implies the adoption of flexible, robust and possibly nonlinear methods, to correctly account for the complex statistical relationships characterizing the pixels of the images. This Thesis deals with the development and the application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored and fundamental solutions for change detection problems are provided. In the first part, change detection with user supervision has been considered. In a first application, a nonlinear classifier has been applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel has been injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what he believes has changed or not. In the second part, a completely automatic and unsupervised method for precise binary detection of changes has been proposed. The technique allows a very accurate mapping without any user intervention, which is particularly useful when readiness and reaction times of the system are a crucial constraint. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches are proposed to transform the pair of bi-temporal images and reduce those differences between them that are unrelated to changes in land cover.
The methods align the distributions of the images, so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or the spectral information content. This opens the door to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
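The simplest form of such an alignment matches the first two moments of the two acquisitions. The sketch below is a linear simplification of the distribution-alignment methods discussed, applied to a toy band of pixel values; it is not the thesis's actual algorithm.

```python
import statistics

def align_band(source, reference):
    """Match the mean and standard deviation of one image band to a
    reference band, so that pixel-wise comparison is less biased by
    acquisition differences (a linear moment-matching simplification)."""
    mu_s, sd_s = statistics.mean(source), statistics.pstdev(source)
    mu_r, sd_r = statistics.mean(reference), statistics.pstdev(reference)
    return [(x - mu_s) / sd_s * sd_r + mu_r for x in source]

# Toy example: the source band is a scaled/shifted version of the reference.
aligned = align_band([10, 20, 30], [100, 200, 300])
```

After alignment the two bands share mean and spread, so a simple pixel-wise difference highlights genuine land-cover change rather than radiometric drift.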

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Studies about beverage preferences in a country in which wine drinking is relatively widespread (like Switzerland) are scarce. Therefore, the main aims of the present study were to examine the associations between beverage preferences and drinking patterns, alcohol-related consequences and the use of other substances among Swiss young men. METHODS: The analytical sample consisted of 5399 Swiss men who participated in the Cohort Study on Substance Use Risk Factors (C-SURF) and had been drinking alcohol over the preceding 12 months. Logistic regression analyses were conducted to study the associations between preference for a particular beverage and (i) drinking patterns, (ii) negative alcohol-related consequences and (iii) the (at-risk) use of cigarettes, cannabis and other illicit drugs. RESULTS: Preference for beer was associated with risky drinking patterns and, as was a preference for strong alcohol, with the use of illicit substances (cannabis and other illicit drugs). In contrast, a preference for wine was associated with low-risk alcohol consumption and a reduced likelihood of experiencing at least four negative alcohol-related consequences or of daily cigarette smoking. Furthermore, the likelihood of negative outcomes (alcohol-related consequences; use of other substances) increased among people with risky drinking behaviours, independent of beverage preference. CONCLUSIONS: In our survey, beer preference was associated with risky drinking patterns and illicit drug use. Alcohol policies to prevent the consumption of large quantities of alcohol, especially of cheaper beverages such as beer, should be considered to reduce total alcohol consumption and the negative consequences associated with these beverage types.
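The associations reported above are adjusted logistic-regression estimates; the underlying quantity is an odds ratio, which for a single 2x2 table reduces to the cross-product shown below. The counts are hypothetical, not the study's data, and the crude univariate odds ratio is a simplification of the adjusted models the study fitted.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio (a*d)/(b*c) for a 2x2 table
    [[exposed cases a, exposed non-cases b],
     [unexposed cases c, unexposed non-cases d]],
    with a 95% Wald interval computed on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se)

# Hypothetical counts: risky drinking among beer- vs wine-preferring men.
or_, lo, hi = odds_ratio(40, 60, 20, 80)
```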

Relevance:

20.00%

Publisher:

Abstract:

A short overview is given of the most important analytical body composition methods. The principles of these methods, along with their advantages and limitations, are discussed, also in relation to other fields of research such as energy metabolism. Attention is given to some new developments in body composition research, such as chemical multiple-compartment models, computerized tomography and nuclear magnetic resonance imaging (tissue level), and multifrequency bioelectrical impedance. Possible future directions of body composition research in the light of these new developments are discussed.

Relevance:

20.00%

Publisher:

Abstract:

Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, the fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e. to detect ischemia. Perfusion Cardiac Magnetic Resonance (CMR) has turned out to be a robust non-invasive technique to assess myocardial ischemia. The objective is to compare the cost-effectiveness ratio, defined as the costs per patient correctly diagnosed, of two algorithms used to diagnose hemodynamically significant CAD in relation to the pretest likelihood of CAD: 1) a CMR to assess ischemia before referring positive patients to CXA (CMR + CXA); 2) a CXA in all patients combined with an FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health care system perspective in the Swiss, German, United Kingdom (UK) and United States (US) contexts, included the public prices of the different tests considered as outpatient procedures, the costs of complications, and the costs induced by diagnostic errors (false negatives). The effectiveness criterion was the ability to accurately identify a patient with significant CAD. Test performances used in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratio of both algorithms for hypothetical patient cohorts with different pretest likelihoods of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US, with costs of CHF 5'794, Euros 1'472, £ 2'685 and $ 2'126 per patient correctly diagnosed.
Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system/professionals/patients/society: These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness of diagnosing CAD depends on the prevalence of the disease.
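The hyperbolic decrease of the cost-effectiveness ratio with pretest likelihood can be reproduced with a stripped-down version of such a model. All parameter values below are illustrative assumptions, and the effectiveness criterion here counts only correctly identified diseased patients; the paper's model additionally prices complications and the consequences of false negatives.

```python
def cost_per_correct(prev, sens, spec, first_cost, followup_cost):
    """Cost per patient correctly identified as diseased for a two-step
    strategy: everyone receives the first test (first_cost) and only
    test-positives receive the confirmatory test (followup_cost)."""
    p_pos = prev * sens + (1 - prev) * (1 - spec)   # P(first test positive)
    expected_cost = first_cost + p_pos * followup_cost
    return expected_cost / (prev * sens)            # cost / P(true positive)

# Illustrative parameters only (not the paper's Swiss/German/UK/US inputs).
low_prev = cost_per_correct(0.2, 0.89, 0.80, 1100, 1500)
high_prev = cost_per_correct(0.7, 0.89, 0.80, 1100, 1500)
```

The ratio falls as the pretest likelihood rises, mirroring the hyperbolic behaviour described in the results.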

Relevance:

20.00%

Publisher:

Abstract:

RATIONALE AND OBJECTIVES: To systematically review and meta-analyze published data about the diagnostic accuracy of fluorine-18-fluorodeoxyglucose ((18)F-FDG) positron emission tomography (PET) and PET/computed tomography (CT) in the differential diagnosis between malignant and benign pleural lesions. METHODS AND MATERIALS: A comprehensive literature search of studies published through June 2013 regarding the diagnostic performance of (18)F-FDG-PET and PET/CT in the differential diagnosis of pleural lesions was carried out. All retrieved studies were reviewed and qualitatively analyzed. Pooled sensitivity, specificity, positive and negative likelihood ratios (LR+ and LR-) and the diagnostic odds ratio (DOR) of (18)F-FDG-PET or PET/CT in the differential diagnosis of pleural lesions were calculated on a per-patient basis. The area under the summary receiver operating characteristic curve (AUC) was calculated to measure the accuracy of these methods. Subanalyses considering the device used (PET or PET/CT) were performed. RESULTS: Sixteen studies comprising 745 patients were included in the systematic review. The meta-analysis of 11 selected studies provided the following results: sensitivity 95% (95% confidence interval [95%CI]: 92-97%), specificity 82% (95%CI: 76-88%), LR+ 5.3 (95%CI: 2.4-11.8), LR- 0.09 (95%CI: 0.05-0.14), DOR 74 (95%CI: 34-161). The AUC was 0.95. No significant improvement in diagnostic accuracy was found when considering PET/CT studies only. CONCLUSIONS: (18)F-FDG-PET and PET/CT were shown to be accurate diagnostic imaging methods in the differential diagnosis between malignant and benign pleural lesions; nevertheless, possible sources of false-negative and false-positive results should be kept in mind.
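The likelihood ratios and the DOR relate to sensitivity and specificity as sketched below. Note that because a meta-analysis pools each measure separately across studies, plugging the pooled 95%/82% values into these identities does not exactly reproduce the pooled LR and DOR reported above; the sketch only shows the definitional relationship.

```python
def diagnostic_summary(sens, spec):
    """LR+, LR- and the diagnostic odds ratio implied by a given
    sensitivity and specificity (DOR = LR+ / LR-)."""
    lr_pos = sens / (1 - spec)          # how much a positive raises odds
    lr_neg = (1 - sens) / spec          # how much a negative lowers odds
    return lr_pos, lr_neg, lr_pos / lr_neg

lr_pos, lr_neg, dor = diagnostic_summary(0.95, 0.82)
```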

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND AND STUDY AIMS: The current gold standard in Barrett's esophagus monitoring consists of four-quadrant biopsies every 1-2 cm in accordance with the Seattle protocol. Adding brush cytology processed by digital image cytometry (DICM) may further increase the detection of patients with Barrett's esophagus who are at risk of neoplasia. The aim of the present study was to assess the additional diagnostic value and accuracy of DICM when added to the standard histological analysis in a cross-sectional multicenter study of patients with Barrett's esophagus in Switzerland. METHODS: One hundred sixty-four patients with Barrett's esophagus underwent 239 endoscopies with biopsy and brush cytology. DICM was carried out on 239 cytology specimens. Measures of the test accuracy of DICM (relative risk, sensitivity, specificity, likelihood ratios) were obtained by dichotomizing the histopathology results (high-grade dysplasia or adenocarcinoma vs. all others) and the DICM results (aneuploidy/intermediate pattern vs. diploidy). RESULTS: DICM revealed diploidy in 83% of 239 endoscopies, an intermediate pattern in 8.8%, and aneuploidy in 8.4%. An intermediate DICM result carried a relative risk (RR) of 12, and aneuploidy an RR of 27, for high-grade dysplasia/adenocarcinoma. Adding DICM to the standard biopsy protocol, a pathological cytometry result (aneuploid or intermediate) was found in 25 of 239 endoscopies (11%; 18 patients) with low-risk histology (no high-grade dysplasia or adenocarcinoma). During follow-up of 14 of these 18 patients, histological deterioration was seen in 3 (21%). CONCLUSION: DICM from brush cytology may add important information to a standard biopsy protocol by identifying a subgroup of Barrett's esophagus patients with high-risk cellular abnormalities.
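From a dichotomized 2x2 table, accuracy measures of the kind reported above follow directly. The counts below are hypothetical, chosen for illustration rather than reconstructed from the study's data.

```python
def accuracy_measures(tp, fp, fn, tn):
    """Sensitivity, specificity and the relative risk of disease in
    test-positives versus test-negatives, from a dichotomized 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    risk_pos = tp / (tp + fp)   # risk of HGD/cancer given pathological DICM
    risk_neg = fn / (fn + tn)   # risk given a diploid DICM result
    return sens, spec, risk_pos / risk_neg

# Hypothetical counts for illustration only.
sens, spec, rr = accuracy_measures(tp=8, fp=17, fn=2, tn=212)
```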

Relevance:

20.00%

Publisher:

Abstract:

1. Entomopathogenic nematodes can function as an indirect defence for plants that are attacked by root herbivores. By releasing volatile organic compounds (VOCs), plants signal the presence of host insects and thereby attract nematodes.
2. Nonetheless, how roots deploy indirect defences, how indirect defences relate to direct defences, and the ecological consequences of root defence allocation for herbivores and plant biomass are essentially unknown.
3. We investigated a natural below-ground tritrophic system, involving common milkweed, a specialist root-boring beetle and entomopathogenic nematodes, and asked whether there is a negative genetic correlation between direct defences (root cardenolides) and indirect defences (emission of volatiles in the roots and nematode attraction), and between constitutive and inducible defences.
4. Volatiles of roots were analysed using two distinct sampling methods. First, we collected emissions from living Asclepias syriaca roots by dynamic headspace sampling. This method showed that attacked A. syriaca plants emit five times higher levels of volatiles than control plants. Secondly, we used a solid-phase micro-extraction (SPME) method to sample the full pool of volatiles in roots for genetic correlations of volatile biosynthesis.
5. Field experiments showed that entomopathogenic nematodes prevent the loss of biomass to root herbivory. Additionally, suppression of root herbivores was mediated directly by cardenolides and indirectly by the attraction of nematodes. Genetic families of plants with high cardenolides benefited less from nematodes than low-cardenolide families, suggesting that direct and indirect defences may be redundant. Although constitutive and induced root defences traded off within each strategy (for both direct and indirect defence, cardenolides and VOCs, respectively), we found no trade-off between the two strategies.
6. Synthesis. Constitutive expression and inducibility of defences may trade off because of resource limitation or because they are redundant. Direct and indirect defences do not trade off, likely because they may not share a limiting resource and because independently they may promote defence across the patchiness of herbivore attack and nematode presence in the field. Indeed, some redundancy in strategies may be necessary to increase effective defence, but for each strategy, an economy of deployment reduces overall costs.

Relevance:

20.00%

Publisher:

Abstract:

This review paper reports the consensus of a technical workshop hosted by the European network, NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs), and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs than with conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, then natural or synthetic dispersing agents are possible, and dispersion controls are essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken in which body concentrations in organisms are related to effects, and toxicity thresholds derived. For microbial assays, the cell wall is a formidable barrier to MNMs, and end points that rely on the test substance penetrating the cell may be insensitive. Instead, assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing, should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new tests are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench.

Relevance:

20.00%

Publisher:

Abstract:

Analytical results harmonisation is investigated in this study as an alternative to the restrictive approach of analytical methods harmonisation, which is currently recommended to enable the exchange of information and thus support the fight against illicit drug trafficking. Indeed, the main goal of this study is to demonstrate that a common database can be fed by a range of different analytical methods, whatever the differences in their analytical parameters. For this purpose, a methodology was developed to estimate, and even optimise, the similarity of results coming from different analytical methods. In particular, the possibility of introducing chemical profiles obtained with Fast GC-FID into a GC-MS database is studied in this paper. With this methodology, the similarity of results coming from different analytical methods can be objectively assessed, and the practical utility of database sharing between these methods can be evaluated, depending on the profiling purpose (evidential vs. operational tool). This methodology can be regarded as a relevant approach for feeding a database from different analytical methods, and it puts in doubt the necessity of analysing all illicit drug seizures in a single laboratory or of implementing analytical methods harmonisation in each participating laboratory.
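A common way to quantify the similarity of two chemical profiles (e.g., a Fast GC-FID profile and a GC-MS profile of the same seizure) is the Pearson correlation of the normalized peak-area vectors. This is a minimal sketch with hypothetical peak areas; the study's own similarity measure and optimisation procedure may differ.

```python
import math

def pearson_similarity(profile_a, profile_b):
    """Pearson correlation between two chemical profiles represented
    as peak-area vectors of equal length."""
    n = len(profile_a)
    ma = sum(profile_a) / n
    mb = sum(profile_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(profile_a, profile_b))
    norm_a = math.sqrt(sum((a - ma) ** 2 for a in profile_a))
    norm_b = math.sqrt(sum((b - mb) ** 2 for b in profile_b))
    return cov / (norm_a * norm_b)

# Hypothetical normalized peak areas from two analytical methods.
sim = pearson_similarity([0.40, 0.25, 0.20, 0.15],
                         [0.41, 0.24, 0.21, 0.14])
```

Profiles of the same material measured by well-harmonised methods should score close to 1, so a threshold on this score can decide whether results from a new method are fit to feed the shared database.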