961 results for "multiclass classification problems"
Abstract:
This paper presents 3-D brain tissue classification schemes using three recent, promising energy minimization methods for Markov random fields: graph cuts, loopy belief propagation and tree-reweighted message passing. The classification is performed using the well-known finite Gaussian mixture Markov random field model. Results from the above methods are compared with the widely used iterated conditional modes (ICM) algorithm. The evaluation is performed on a dataset containing simulated T1-weighted MR brain volumes with varying noise and intensity non-uniformities. The comparisons are performed in terms of energies as well as against ground truth segmentations, using various quantitative metrics.
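The ICM baseline mentioned above is simple enough to sketch. The following is a minimal, hypothetical 1-D illustration (not the paper's 3-D implementation): each site's label is greedily updated to minimize a local energy combining a quadratic data term with a Potts smoothness prior.

```python
import random

def icm_labels(obs, num_labels, beta=1.0, iters=10):
    """Iterated conditional modes (ICM) on a 1-D chain MRF.

    Energy: sum_i (obs[i] - label[i])**2          (data term)
          + beta * sum_i [label[i] != label[i+1]] (Potts smoothness).
    Greedily sets each site to the label minimizing its local energy.
    """
    # Initialize each site to the nearest label (data term only).
    labels = [min(range(num_labels), key=lambda k: (o - k) ** 2) for o in obs]
    for _ in range(iters):
        changed = False
        for i in range(len(obs)):
            def local_energy(k):
                e = (obs[i] - k) ** 2
                if i > 0:
                    e += beta * (k != labels[i - 1])
                if i + 1 < len(obs):
                    e += beta * (k != labels[i + 1])
                return e
            best = min(range(num_labels), key=local_energy)
            if best != labels[i]:
                labels[i] = best
                changed = True
        if not changed:  # converged to a local minimum
            break
    return labels

# Noisy observations of a piecewise-constant signal (true labels 0 and 2).
random.seed(0)
truth = [0] * 10 + [2] * 10
obs = [t + random.gauss(0, 0.3) for t in truth]
print(icm_labels(obs, num_labels=3, beta=2.0))
```

ICM is fast but only reaches a local minimum of the energy, which is exactly why the abstract compares it against graph cuts, loopy belief propagation and tree-reweighted message passing.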
Abstract:
Objectives: The aim of this study was to evaluate the efficacy of brief motivational intervention (BMI) in reducing alcohol use and related problems among binge drinkers randomly selected from a census of 20-year-old French-speaking Swiss men, and to test the hypothesis that BMI helps maintain low-risk drinking among non-bingers. Methods: Randomized controlled trial comparing the impact of BMI on weekly alcohol use, frequency of binge drinking and occurrence of alcohol-related problems. Setting: Army recruitment center. Participants: A random sample of 622 men were asked to participate; 178 either refused, missed their appointment, or had to follow military assessment procedures instead, resulting in 418 men randomized into BMI or control conditions, 88.7% of whom completed the 6-month follow-up assessment. Intervention: A single face-to-face BMI session exploring alcohol use and related problems in order to stimulate a behaviour-change perspective in a non-judgmental, empathic manner based on the principles of motivational interviewing (MI). Main outcome measures: Weekly alcohol use, binge drinking frequency and the occurrence of 12 alcohol-related consequences. Results: Among binge drinkers, we observed a 20% change in drinking induced by BMI, with a reduction in weekly drinking of 1.5 drinks in the BMI group, compared to an increase of 0.8 drinks per week in the control group (incidence rate ratio 0.8, 95% confidence interval 0.66 to 0.98, p = 0.03). BMI did not influence the frequency of binge drinking or the occurrence of the 12 possible alcohol-related consequences. However, BMI induced a reduction in the alcohol use of participants who, after drinking over the past 12 months, had experienced alcohol-related consequences, i.e., hangover (-20%), missed a class (-53%), got behind at school (-54%), argued with friends (-38%), engaged in unplanned sex (-45%) or did not use protection when having sex (-64%).
BMI did not reduce weekly drinking in those who experienced the six other problems screened. Among non-bingers, BMI did not help maintain low-risk drinking. Conclusions: At army conscription, BMI reduced alcohol use in binge drinkers, particularly in those who had recently experienced alcohol-related adverse consequences. No preventive effect of BMI was observed among non-bingers. BMI is an interesting preventive option for young binge drinkers, particularly in countries with mandatory army recruitment.
Abstract:
The p-median problem is a classical location model par excellence. In this paper we first examine the early origins of the problem, formulated independently by Louis Hakimi and Charles ReVelle, two of the fathers of the burgeoning multidisciplinary field of research known today as Facility Location Theory and Modelling. We then examine some of the traditional heuristic and exact methods developed to solve the problem. In the third section we analyze the impact of the model on the field. We end the paper by proposing new lines of research related to this classical problem.
Abstract:
We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers preemptively select customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. In order to obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
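The classical $c \mu$ rule mentioned for the no-feedback case is easy to illustrate: each class gets the index c (holding cost) times mu (service rate), and servers give preemptive priority to classes in decreasing index order. A minimal sketch with made-up class parameters:

```python
def cmu_priority(classes):
    """Sort job classes by the c-mu index (holding cost times service
    rate); the head of the list receives preemptive priority."""
    return sorted(classes, key=lambda cls: cls["c"] * cls["mu"], reverse=True)

# Hypothetical three-class example (c = holding cost, mu = service rate).
classes = [
    {"name": "A", "c": 1.0, "mu": 2.0},   # index 2.0
    {"name": "B", "c": 3.0, "mu": 1.0},   # index 3.0
    {"name": "C", "c": 0.5, "mu": 10.0},  # index 5.0
]
print([cls["name"] for cls in cmu_priority(classes)])  # ['C', 'B', 'A']
```

Note that the index favors class C despite its low holding cost, because its jobs clear quickly; with Bernoulli feedback the Klimov indices replace this simple product, but the priority mechanism is the same.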
Abstract:
Introduction: Quantitative measures of the degree of lumbar spinal stenosis (LSS), such as the antero-posterior diameter of the canal or the dural sac cross-sectional area, vary widely and do not correlate with clinical symptoms or the results of surgical decompression. In an effort to improve the quantification of stenosis, we have developed a grading system based on the morphology of the dural sac and its contents as seen on T2 axial images. The grading comprises seven categories ranging from normal to the most severe stenosis and takes into account the rootlet/CSF content ratio. Material and methods: Fifty T2 axial MRI images taken at disc level from twenty-seven symptomatic lumbar spinal stenosis patients who underwent decompressive surgery were classified into seven categories by five observers and reclassified 2 weeks later by the same investigators. Intra- and inter-observer reliability of the classification were assessed using Cohen's and Fleiss' kappa statistics, respectively. Results: Generally, the morphology grading system itself was well adopted by the observers. Its success in application is strongly influenced by the identification of the dural sac. The average intra-observer Cohen's kappa was 0.53 ± 0.2. The inter-observer Fleiss' kappa was 0.38 ± 0.02 in the first rating and 0.3 ± 0.03 in the second rating repeated after two weeks. Discussion: In this attempt, the training of the observers was limited to an introduction to the general idea of the morphology grading system and one example MRI image per category. Identifying the dimensions of the dural sac may be difficult in the absence of a complete T1/T2 MRI image series, as was the case here. The similarity of CSF to possibly present fat on T2 images was the main reason for mismatches in the assignment of cases to categories. The Fleiss agreement factors of the five observers are fair, and the proposed morphology grading system is promising.
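Cohen's kappa, used above for intra-observer reliability, corrects raw agreement between two raters for the agreement expected by chance. A minimal sketch with hypothetical gradings (the study's patient data are not reproduced here):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_observed - p_expected) / (1 - p_expected), where
    p_expected comes from each rater's marginal category frequencies.
    """
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical gradings (categories A-G) of ten MRI slices by two readers.
r1 = ["A", "B", "B", "C", "D", "D", "E", "F", "G", "C"]
r2 = ["A", "B", "C", "C", "D", "E", "E", "F", "G", "B"]
print(round(cohens_kappa(r1, r2), 3))  # 0.647, in the "substantial" range
```

Fleiss' kappa generalizes the same idea to more than two raters, which is why the study uses it for the five-observer comparison.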
Abstract:
There is a large and growing literature that studies the effects of weak enforcement institutions on economic performance. This literature has focused almost exclusively on primary markets, in which assets are issued and traded to improve the allocation of investment and consumption. The general conclusion is that weak enforcement institutions impair the workings of these markets, giving rise to various inefficiencies. But weak enforcement institutions also create incentives to develop secondary markets, in which the assets issued in primary markets are retraded. This paper shows that trading in secondary markets counteracts the effects of weak enforcement institutions and, in the absence of further frictions, restores efficiency.
Abstract:
We present a polyhedral framework for establishing general structural properties of optimal solutions to stochastic scheduling problems in which multiple job classes vie for service resources: the existence of an optimal priority policy in a given family, characterized by a greedoid (whose feasible class subsets may receive higher priority), where optimal priorities are determined by class-ranking indices, under restricted linear performance objectives (partial indexability). This framework extends that of Bertsimas and Niño-Mora (1996), which explained the optimality of priority-index policies under all linear objectives (general indexability). We show that, if performance measures satisfy partial conservation laws (with respect to the greedoid), which extend previous generalized conservation laws, then the problem admits a strong LP relaxation over a so-called extended greedoid polytope, which has strong structural and algorithmic properties. We present an adaptive-greedy algorithm (which extends Klimov's), taking as input the linear objective coefficients, that (1) determines whether the optimal LP solution is achievable by a policy in the given family; and (2) if so, computes a set of class-ranking indices that characterize optimal priority policies in the family. In the special case of project scheduling, we show that, under additional conditions, the optimal indices can be computed separately for each project (index decomposition). We further apply the framework to the important restless bandit model (two-action Markov decision chains), obtaining new index policies that extend Whittle's (1988), and simple sufficient conditions for their validity. These results highlight the power of polyhedral methods (the so-called achievable region approach) in dynamic and stochastic optimization.
Abstract:
The paper develops a method to solve higher-dimensional stochastic control problems in continuous time. A finite-difference-type approximation scheme is used on a coarse grid of low-discrepancy points, while the value function at intermediate points is obtained by regression. The stability properties of the method are discussed, and applications are given to test problems of up to 10 dimensions. Accurate solutions to these problems can be obtained on a personal computer.
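The grid-plus-regression idea can be sketched in one dimension: evaluate the value function only at low-discrepancy (van der Corput) grid points, then recover intermediate values from a least-squares fit. This is a toy illustration with an assumed quadratic value function, not the paper's finite-difference scheme:

```python
def van_der_corput(n, base=2):
    """First n points of the base-b van der Corput low-discrepancy sequence."""
    pts = []
    for i in range(1, n + 1):
        x, f, k = 0.0, 1.0 / base, i
        while k:  # reflect the base-b digits of i about the radix point
            x += (k % base) * f
            k //= base
            f /= base
        pts.append(x)
    return pts

def polyfit2(xs, ys):
    """Least-squares quadratic fit a + b*x + c*x**2 via the normal equations."""
    s = [sum(x ** p for x in xs) for p in range(5)]
    t = [sum(y * x ** p for x, y in zip(xs, ys)) for p in range(3)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    rhs = t[:]
    for col in range(3):  # Gaussian elimination with partial pivoting
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            m = A[r][col] / A[col][col]
            for cc in range(col, 3):
                A[r][cc] -= m * A[col][cc]
            rhs[r] -= m * rhs[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):  # back-substitution
        coef[r] = (rhs[r] - sum(A[r][cc] * coef[cc]
                                for cc in range(r + 1, 3))) / A[r][r]
    return coef

# "Expensive" value function, evaluated only on a coarse low-discrepancy grid.
grid = van_der_corput(16)
values = [v * v for v in grid]
a, b, c = polyfit2(grid, values)
estimate = a + b * 0.3 + c * 0.3 ** 2  # regression value at an off-grid point
print(round(estimate, 6))  # close to 0.09, since the data are exactly quadratic
```

In higher dimensions the same pattern applies with a multivariate low-discrepancy sequence and a richer regression basis; the point of the low-discrepancy grid is that it covers the state space evenly with far fewer nodes than a tensor grid.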
Abstract:
Pegylated liposomal doxorubicin is an encapsulated form of doxorubicin with an improved pharmacokinetic profile and the ability to accumulate selectively in tumor tissue. As a result, the tolerated dose of the drug can be increased, with a reduced incidence of neutropenia and cardiotoxicity in comparison to doxorubicin treatment. However, a common dose- and schedule-limiting adverse effect of the treatment is palmoplantar erythrodysesthesia syndrome. In this retrospective study we included six patients hospitalised in the University Hospital of Zurich during the last 2 years in connection with side effects caused by pegylated liposomal doxorubicin. These patients received this chemotherapeutic agent for the treatment of various malignancies, such as breast cancer, ovarian cancer, mycosis fungoides and cutaneous B-cell lymphoma. Three of the six patients in this study developed classical palmoplantar erythrodysesthesia, one developed palmoplantar erythrodysesthesia associated with extensive bullous disease, one developed an eruption of lymphocyte recovery syndrome and one developed intertrigo-like dermatitis with stomatitis. Pegylated liposomal doxorubicin induces various skin reactions, including palmoplantar erythrodysesthesia syndrome. However, the exact clinical presentation might depend on pre-existing skin diseases.
Abstract:
In this paper we propose a Pyramidal Classification Algorithm which, together with an appropriate aggregation index, produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, the Minimum and UPGMA. The results shown in this paper may help in choosing among the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided a priori information concerning this structure is available.
Abstract:
Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity-penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite-sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near-optimal rates of convergence, when each model class has infinite VC or pseudo dimension. For regression estimation with squared loss, we modify our estimate to achieve a faster rate of convergence.
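The complexity-penalized selection step can be caricatured as follows. This toy sketch substitutes a simple per-class penalty (proportional to the number of histogram bins) for the paper's empirically determined cover-based complexity; the data, candidate classes, and penalty weight are all made up for illustration:

```python
import random

def fit_histogram(xs, ys, bins):
    """Histogram regression on [0, 1): predict the mean response per bin."""
    sums, counts = [0.0] * bins, [0] * bins
    for x, y in zip(xs, ys):
        b = min(int(x * bins), bins - 1)
        sums[b] += y
        counts[b] += 1
    means = [s / c if c else 0.0 for s, c in zip(sums, counts)]
    return lambda x: means[min(int(x * bins), bins - 1)]

def select_model(train, holdout, candidate_bins, penalty_weight=0.05):
    """Pick the class minimizing held-out empirical risk plus a
    complexity penalty (here simply proportional to the bin count)."""
    def mse(f, data):
        return sum((f(x) - y) ** 2 for x, y in data) / len(data)
    best_score, best_k = None, None
    for k in candidate_bins:
        f = fit_histogram([x for x, _ in train], [y for _, y in train], k)
        score = mse(f, holdout) + penalty_weight * k / len(holdout)
        if best_score is None or score < best_score:
            best_score, best_k = score, k
    return best_k

# Synthetic data: a step function at x = 0.5 plus Gaussian noise.
random.seed(1)
xs = [random.random() for _ in range(400)]
data = [(x, (1.0 if x > 0.5 else 0.0) + random.gauss(0, 0.1)) for x in xs]
train, holdout = data[:200], data[200:]
print(select_model(train, holdout, candidate_bins=[1, 2, 4, 8, 16, 32]))
```

The penalty keeps the procedure from drifting to the richest class: the step target is fit equally well by any even bin count, so the penalized score favors a small one, mirroring the approximation/estimation tradeoff the abstract describes.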
Abstract:
Different types of cell death are often defined by morphological criteria, without a clear reference to precise biochemical mechanisms. The Nomenclature Committee on Cell Death (NCCD) proposes unified criteria for the definition of cell death and of its different morphologies, while formulating several caveats against the misuse of words and concepts that slow down progress in the area of cell death research. Authors, reviewers and editors of scientific periodicals are invited to abandon expressions like 'percentage apoptosis' and to replace them with more accurate descriptions of the biochemical and cellular parameters that are actually measured. Moreover, at the present stage, it should be accepted that caspase-independent mechanisms can cooperate with (or substitute for) caspases in the execution of lethal signaling pathways and that 'autophagic cell death' is a type of cell death occurring together with (but not necessarily by) autophagic vacuolization. This study details the 2009 recommendations of the NCCD on the use of cell death-related terminology including 'entosis', 'mitotic catastrophe', 'necrosis', 'necroptosis' and 'pyroptosis'.