971 results for Meta heuristic algorithm
Abstract:
Several studies have reported high performance of simple decision heuristics in multi-attribute decision making. In this paper, we focus on situations where attributes are binary and analyze the performance of Deterministic-Elimination-By-Aspects (DEBA) and similar decision heuristics. We consider non-increasing weights and two probabilistic models for the attribute values: one where attribute values are independent Bernoulli random variables; the other where they are binary random variables with inter-attribute positive correlations. Using these models, we show that the good performance of DEBA is explained by the presence of cumulative as opposed to simple dominance. We therefore introduce the concepts of cumulative dominance compliance and full cumulative dominance compliance and show that DEBA satisfies those properties. We derive a lower bound on the probability with which cumulative dominance compliant heuristics choose a best alternative and show that, even with many attributes, this probability is not small. We also derive an upper bound for the expected loss of fully cumulative dominance compliant heuristics and show that this loss is moderate even when the number of attributes is large. Both bounds are independent of the values of the weights.
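With binary attributes, the DEBA procedure reduces to a short elimination loop. The following is a minimal sketch, assuming the attribute columns are already ordered by non-increasing weight and that any tie remaining after the last attribute is broken by lowest index (an assumption, not stated above):

```python
import numpy as np

def deba(alternatives):
    """Deterministic Elimination By Aspects (DEBA) for binary attributes.

    alternatives: 2D array, rows = alternatives, columns = binary attributes,
                  columns assumed ordered by non-increasing importance (weight).
    Returns the index of the chosen alternative (ties broken by lowest index).
    """
    alts = np.asarray(alternatives)
    remaining = list(range(alts.shape[0]))
    for j in range(alts.shape[1]):
        # Keep the alternatives that possess attribute j (value 1).
        survivors = [i for i in remaining if alts[i, j] == 1]
        # If no remaining alternative has the attribute, it cannot discriminate.
        if survivors:
            remaining = survivors
        if len(remaining) == 1:
            break
    return remaining[0]

# Example: three alternatives, attributes ordered by decreasing weight.
choice = deba([[1, 0, 1],
               [1, 1, 0],
               [0, 1, 1]])
print(choice)  # -> 1: alternatives 0 and 1 survive attribute 0, only 1 has attribute 1
```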
Abstract:
In this paper we propose a Pyramidal Classification Algorithm which, together with an appropriate aggregation index, produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, Minimum and UPGMA. The results shown in this paper may help to choose between the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided we have a priori information concerning this structure.
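As an aside, the three aggregation indices named above (Maximum, Minimum and UPGMA) correspond to complete, single and average linkage in ordinary agglomerative clustering. A minimal sketch of a Monte Carlo-style comparison of the three indices using SciPy is given below; note that it does not implement the pyramidal (overlapping-cluster) variant itself:

```python
# Illustrative only: compares the three aggregation indices on toy data in the
# standard hierarchical setting, not in the pyramidal one.
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
points = rng.normal(size=(20, 2))        # toy data for a Monte Carlo-style test
distances = pdist(points)                # condensed pairwise distance matrix

dendrograms = {
    "Maximum (complete linkage)": linkage(distances, method="complete"),
    "Minimum (single linkage)":   linkage(distances, method="single"),
    "UPGMA (average linkage)":    linkage(distances, method="average"),
}
for name, Z in dendrograms.items():
    print(name, "-> first merge heights:", np.round(Z[:3, 2], 3))
```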
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments in the theory of prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
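To illustrate the general idea, a toy sketch follows: a randomized forecaster that combines empirical Markov experts of several orders with exponential weights. This only illustrates the expert-combination principle from prediction of individual sequences, not the authors' specific construction; the expert orders and the learning rate eta are arbitrary choices.

```python
import numpy as np

def randomized_predict(sequence, orders=(0, 1, 2), eta=1.0, seed=0):
    """Toy illustration: combine Markov experts of several orders with
    exponential weights and predict each bit by randomizing according to
    the weighted average forecast. Returns the average number of mistakes."""
    rng = np.random.default_rng(seed)
    weights = np.ones(len(orders))
    mistakes = 0
    for t, y in enumerate(sequence):
        # Each expert forecasts P(next bit = 1) from empirical frequencies
        # conditional on the last k bits seen so far.
        probs = []
        for k in orders:
            context = tuple(sequence[t - k:t]) if t >= k else None
            ones = total = 0
            for s in range(k, t):
                if context is None or tuple(sequence[s - k:s]) == context:
                    total += 1
                    ones += sequence[s]
            probs.append(ones / total if total > 0 else 0.5)
        probs = np.array(probs)
        p = np.dot(weights, probs) / weights.sum()   # weighted average forecast
        prediction = int(rng.random() < p)           # randomized 0/1 prediction
        mistakes += int(prediction != y)
        # Exponential weight update based on each expert's absolute loss.
        weights *= np.exp(-eta * np.abs(probs - y))
    return mistakes / max(len(sequence), 1)

seq = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
print(randomized_predict(seq))   # error rate should drop on this periodic input
```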
Abstract:
We develop a mathematical programming approach for the classical PSPACE-hard restless bandit problem in stochastic optimization. We introduce a hierarchy of n (where n is the number of bandits) increasingly stronger linear programming relaxations, the last of which is exact and corresponds to the (exponential-size) formulation of the problem as a Markov decision chain, while the other relaxations provide bounds and are efficiently computed. We also propose a priority-index heuristic scheduling policy from the solution to the first-order relaxation, where the indices are defined in terms of optimal dual variables. In this way we propose a policy and a suboptimality guarantee. We report results of computational experiments that suggest that the proposed heuristic policy is nearly optimal. Moreover, the second-order relaxation is found to provide strong bounds on the optimal value.
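At decision time, a priority-index heuristic reduces to activating the bandits with the largest current index values. A minimal skeleton of that rule is sketched below; the indices themselves, which the abstract says are defined from optimal dual variables of the first-order relaxation, are taken here as given inputs.

```python
import numpy as np

def priority_index_policy(indices, states, m):
    """Skeleton of a priority-index scheduling rule: given an index value for
    each bandit's current state, activate the m bandits with the largest
    indices. Computing the indices (from the LP dual variables) is not shown."""
    current = np.array([indices[i][s] for i, s in enumerate(states)])
    # Positions of the m bandits with the highest current index values.
    return list(np.argsort(current)[::-1][:m])

# Hypothetical example: 4 bandits, 2 states each, activate m = 2 per period.
indices = [[0.3, 0.9], [0.5, 0.1], [0.8, 0.2], [0.4, 0.6]]
states = [1, 0, 0, 1]
print(priority_index_policy(indices, states, m=2))   # -> [0, 2]
```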
Abstract:
To have added value over BMD, a clinical risk factor (CRF) of osteoporotic fracture must be predictive of fracture, independent of BMD, reversible and quantifiable. Many major recognized CRFs exist, and many of them are indirect factors of bone quality. TBS predicts fracture independently of BMD, as demonstrated in previous studies. The aim of this study is to verify whether TBS can be considered a major CRF of osteoporotic fracture. Existing validated datasets of Caucasian women were analyzed. These datasets stem from different studies performed by the authors of this report or provided to our group; however, their levels of evidence vary, so the different datasets were weighted according to their design. This meta-like analysis involves more than 32,000 women (≥50 years) with 2,000 osteoporotic fractures from two prospective studies (OFELY and MANITOBA) and 7 cross-sectional studies. The weighted relative risk (RR) for TBS was expressed per decrease of one standard deviation as well as per tertile difference (TBS = 1.300 and 1.200) and compared with those obtained for the major CRFs included in FRAX®. The overall TBS RR (adjusted for age) was 1.79 [95% CI 1.37-2.37]. For all women combined, the RR for fracture for the lowest compared with the middle TBS tertile was 1.55 [1.46-1.68], and for the lowest compared with the highest TBS tertile it was 2.8 [2.70-3.00]. TBS is comparable to most of the major CRFs (Fig 1) and thus could be used as one of them. Further studies have to be conducted to confirm these first findings.
Abstract:
This paper compares two well-known scan matching algorithms, MbICP and pIC, and as a result of the study proposes the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV). The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead-reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions are: 1) using an EKF to estimate the local path traveled by the robot while grabbing the scan, as well as its uncertainty, and 2) a method to group all the data gathered along the robot's path into a single scan with a convenient uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment with satisfactory results.
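For context, the classical point-to-point ICP loop underlying such scan matchers is sketched below. This is not the pIC or MSISpIC algorithm (which additionally model the uncertainty of the points and of the robot displacement); it only shows the iterate-match-align structure they build on.

```python
import numpy as np

def icp_2d(source, target, iterations=20):
    """Bare-bones point-to-point ICP in 2D: at each iteration, match every
    source point to its nearest target point, then compute the closed-form
    rigid transform (SVD/Kabsch) that best aligns the matched pairs."""
    src = np.asarray(source, float).copy()
    tgt = np.asarray(target, float)
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iterations):
        # Nearest-neighbour correspondences (brute force for clarity).
        d = np.linalg.norm(src[:, None, :] - tgt[None, :, :], axis=2)
        matched = tgt[d.argmin(axis=1)]
        # Closed-form rigid registration of the matched pairs.
        mu_s, mu_t = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Toy check: recover a small rotation plus a translation.
pts = np.random.default_rng(1).uniform(0, 10, size=(50, 2))
angle = 0.1
R_true = np.array([[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]])
R_est, t_est = icp_2d(pts, pts @ R_true.T + np.array([0.5, -0.3]))
print(np.round(R_est, 3), np.round(t_est, 3))
```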
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that the solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
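For reference, classical first-order syntactic unification, which nominal unification extends with binders, α-equivalence and freshness constraints, can be sketched as follows. This is a simple Robinson-style version with occurs check, not the quadratic nominal algorithm described above; the term encoding (lowercase strings as variables, tuples as function applications) is an arbitrary choice.

```python
def is_var(t):
    return isinstance(t, str) and t.islower()

def walk(t, subst):
    # Follow variable bindings until reaching a non-variable or an unbound variable.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    t = walk(t, subst)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, a, subst) for a in t[1:])
    return False

def unify(t1, t2, subst=None):
    """Return a substitution unifying t1 and t2, or None if none exists.
    Terms: lowercase strings are variables, tuples ('f', arg1, ...) are
    function applications, other strings are constants."""
    subst = dict(subst or {})
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return None if occurs(t1, t2, subst) else {**subst, t1: t2}
    if is_var(t2):
        return None if occurs(t2, t1, subst) else {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and \
            t1[0] == t2[0] and len(t1) == len(t2):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(('f', 'x', ('g', 'y')), ('f', 'A', ('g', 'B'))))
# -> {'x': 'A', 'y': 'B'}
```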
Abstract:
OBJECTIVES: To investigate whether associations of smoking with depression and anxiety are likely to be causal, using a Mendelian randomisation approach. DESIGN: Mendelian randomisation meta-analyses using a genetic variant (rs16969968/rs1051730) as a proxy for smoking heaviness, and observational meta-analyses of the associations of smoking status and smoking heaviness with depression, anxiety and psychological distress. PARTICIPANTS: Current, former and never smokers of European ancestry aged ≥16 years from 25 studies in the Consortium for Causal Analysis Research in Tobacco and Alcohol (CARTA). PRIMARY OUTCOME MEASURES: Binary definitions of depression, anxiety and psychological distress assessed by clinical interview, symptom scales or self-reported recall of clinician diagnosis. RESULTS: The analytic sample included up to 58 176 never smokers, 37 428 former smokers and 32 028 current smokers (total N=127 632). In observational analyses, current smokers had 1.85 times greater odds of depression (95% CI 1.65 to 2.07), 1.71 times greater odds of anxiety (95% CI 1.54 to 1.90) and 1.69 times greater odds of psychological distress (95% CI 1.56 to 1.83) than never smokers. Former smokers also had greater odds of depression, anxiety and psychological distress than never smokers. There was evidence for positive associations of smoking heaviness with depression, anxiety and psychological distress (ORs per cigarette per day: 1.03 (95% CI 1.02 to 1.04), 1.03 (95% CI 1.02 to 1.04) and 1.02 (95% CI 1.02 to 1.03) respectively). In Mendelian randomisation analyses, there was no strong evidence that the minor allele of rs16969968/rs1051730 was associated with depression (OR=1.00, 95% CI 0.95 to 1.05), anxiety (OR=1.02, 95% CI 0.97 to 1.07) or psychological distress (OR=1.02, 95% CI 0.98 to 1.06) in current smokers. Results were similar for former smokers. CONCLUSIONS: Findings from Mendelian randomisation analyses do not support a causal role of smoking heaviness in the development of depression and anxiety.
Abstract:
Cannabis use is highly prevalent among people with schizophrenia, and coupled with impaired cognition, is thought to heighten the risk of illness onset. However, while heavy cannabis use has been associated with cognitive deficits in long-term users, studies among patients with schizophrenia have been contradictory. This article consists of 2 studies. In Study I, a meta-analysis of 10 studies comprising 572 patients with established schizophrenia (with and without comorbid cannabis use) was conducted. Patients with a history of cannabis use were found to have superior neuropsychological functioning. This finding was largely driven by studies that included patients with a lifetime history of cannabis use rather than current or recent use. In Study II, we examined the neuropsychological performance of 85 patients with first-episode psychosis (FEP) and 43 healthy nonusing controls. Relative to controls, FEP patients with a history of cannabis use (FEP + CANN; n = 59) displayed only selective neuropsychological impairments while those without a history (FEP - CANN; n = 26) displayed generalized deficits. When directly compared, FEP + CANN patients performed better on tests of visual memory, working memory, and executive functioning. Patients with early onset cannabis use had less neuropsychological impairment than patients with later onset use. Together, these findings suggest that patients with schizophrenia or FEP with a history of cannabis use have superior neuropsychological functioning compared with nonusing patients. This association between better cognitive performance and cannabis use in schizophrenia may be driven by a subgroup of "neurocognitively less impaired" patients, who only developed psychosis after a relatively early initiation into cannabis use.
Abstract:
Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥110/min, systolic blood pressure <100 mm Hg, oxygen saturation <90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients between the validation and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and 0% died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm for PE that accurately identifies patients at low risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
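The low-risk rule described above is simple enough to state directly in code. A sketch follows, using hypothetical variable names for the 10 prognostic criteria listed in the abstract; it is illustrative only, not a clinical tool.

```python
# A patient is classified as low-risk only if none of the 10 prognostic
# variables from the abstract is present. Variable names are hypothetical.
PROGNOSTIC_VARIABLES = [
    "age_ge_70",             # age >= 70 years
    "cancer",
    "heart_failure",
    "chronic_lung_disease",
    "chronic_renal_disease",
    "cerebrovascular_disease",
    "pulse_ge_110",          # pulse >= 110/min
    "sbp_lt_100",            # systolic blood pressure < 100 mm Hg
    "o2_sat_lt_90",          # oxygen saturation < 90%
    "altered_mental_status",
]

def is_low_risk(patient: dict) -> bool:
    """Return True if the patient has none of the prognostic variables."""
    return not any(patient.get(v, False) for v in PROGNOSTIC_VARIABLES)

print(is_low_risk({"cancer": False, "pulse_ge_110": False}))   # True
print(is_low_risk({"heart_failure": True}))                    # False
```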
Abstract:
To perform a meta-analysis of FDG-PET performance in the diagnosis of large-vessel vasculitis (Giant Cell Arteritis (GCA), associated or not with Polymyalgia Rheumatica (PMR), and Takayasu). Materials and methods: MEDLINE, the Cochrane Library and Embase were searched for relevant original articles describing FDG-PET for vasculitis assessment, using MeSH terms ("Giant Cell Arteritis or Vasculitis" AND "PET"). Criteria for inclusion were: (1) FDG-PET for diagnosis of vasculitis; (2) American College of Rheumatology criteria as reference standard; (3) a control group. After data extraction, analyses were performed using a random-effects model. Results: Of 184 citations (database search and reference screening), 70 articles were reviewed, of which 12 eligible studies were extracted (sensitivity range from 32% to 97%). 7 studies fulfilled all inclusion criteria; owing to an overlapping population, 1 study was excluded. Statistical heterogeneity justified the random-effects model. Pooled analysis of 6 studies (116 vasculitis, 224 controls) showed 81% sensitivity (95% CI: 70-89%), 89% specificity (95% CI: 77-95%), 85% PPV (95% CI: 63-95%), 90% NPV (95% CI: 79-95%), a positive LR of 7.1 (95% CI: 3.4-14.9), a negative LR of 0.2 (95% CI: 0.14-0.35) and a DOR of 90.1 (95% CI: 18.6-437). Conclusion: FDG-PET has good diagnostic performance in the detection of large-vessel vasculitis. Its promising role could be extended to the follow-up of patients under treatment, but further studies are needed to confirm this possibility.
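The reported measures (sensitivity, specificity, PPV, NPV, likelihood ratios, DOR) are standard functions of a 2×2 table; a sketch of those formulas follows. Note that the pooled values above come from a random-effects meta-analysis across studies and therefore need not all derive from a single table, and the counts used here are hypothetical, chosen only so the totals match the 116 vasculitis and 224 controls mentioned.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures from a single 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    dor = lr_pos / lr_neg          # diagnostic odds ratio = (tp*tn)/(fp*fn)
    return dict(sensitivity=sens, specificity=spec, ppv=ppv, npv=npv,
                lr_pos=lr_pos, lr_neg=lr_neg, dor=dor)

# Hypothetical counts (not the actual study data): 116 cases, 224 controls.
print(diagnostic_metrics(tp=94, fp=25, fn=22, tn=199))
```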