951 results for Algorithmic Probability


Relevance:

20.00%

Publisher:

Abstract:

In this paper a custom classification algorithm based on linear discriminant analysis and probability-based weights is implemented and applied to hippocampus measurements from structural magnetic resonance images of healthy subjects and Alzheimer's Disease sufferers, with the aim of diagnosing them as accurately as possible. The classifier works by labelling each measurement of a hippocampal volume as healthy-control-sized or Alzheimer's-Disease-sized; these new features are then weighted and used to classify the subject as a healthy control or as suffering from Alzheimer's Disease. The preliminary results reach an accuracy of 85.8%, similar to that of state-of-the-art methods such as a Naive Bayes classifier and a Support Vector Machine. An advantage of the method proposed in this paper over the aforementioned state-of-the-art classifiers is the descriptive ability of the classifications it produces. The descriptive model can be of great help to a doctor in the diagnosis of Alzheimer's Disease, or even further the understanding of how Alzheimer's Disease affects the hippocampus.
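The weight-and-vote scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the volume thresholds, and the use of each feature's standalone accuracy as its weight are all assumptions.

```python
import numpy as np

def train_weights(X, y, thresholds):
    """Derive per-feature weights from each measurement's standalone accuracy.

    X: (n_subjects, n_measurements) hippocampal volumes;
    y: 1 = Alzheimer's Disease, 0 = healthy control.
    """
    votes = (X < thresholds).astype(int)      # smaller volume -> "AD-sized"
    acc = (votes == y[:, None]).mean(axis=0)  # per-feature standalone accuracy
    return acc / acc.sum()                    # normalised weights

def predict(X, thresholds, weights):
    """Classify each subject from the weighted fraction of AD-sized votes."""
    votes = (X < thresholds).astype(int)
    score = votes @ weights
    return (score > 0.5).astype(int)
```

For instance, with two volume measurements per subject and a threshold per measurement, `train_weights` learns how reliable each measurement is on its own, and `predict` combines the per-measurement votes into one diagnosis.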


Iso-score curve graphs (iSCG) and mathematical relationships between Scoring Parameters (SP) and Forecasting Parameters (FP) can be used in the Economic Scoring Formulas (ESF) employed in tendering to distribute the score among bidders in the economic part of a proposal. Each contracting authority must set an ESF when publishing tender specifications, and the strategy of each bidder will differ depending on the ESF selected and on its weight in the overall proposal scoring. The various mathematical relationships and density distributions that describe the main SPs and FPs, and the representation of tendering data by means of iSCGs, enable the generation of two new types of graph that can be very useful for bidders who want to be more competitive: scoring and position probability graphs.
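As a concrete illustration, one common family of ESF gives full economic points to the lowest bid and linearly fewer points as bids rise. This particular formula is only an example of the kind of rule the abstract refers to; each contracting authority defines its own ESF.

```python
def linear_economic_score(bid, all_bids, max_points=50.0):
    """Illustrative linear Economic Scoring Formula: the lowest bid gets
    max_points, the highest gets zero, others are interpolated linearly."""
    lowest, highest = min(all_bids), max(all_bids)
    if highest == lowest:          # all bidders tied on price
        return max_points
    return max_points * (highest - bid) / (highest - lowest)
```

Under such a rule, a bidder's score depends not only on its own price but on the whole field of bids, which is exactly why the scoring and position probability graphs described above are useful.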


Anticipating the number and identity of bidders has a significant influence on many theoretical results of the auction itself and on bidders' bidding behaviour. This is because a bidder who knows in advance which specific bidders are likely competitors has a head start when setting the bid price. However, despite these competitive implications, most previous studies have focused almost entirely on forecasting the number of bidders, and only a few authors have dealt with the identity dimension, qualitatively. Using a case study with immediate real-life applications, this paper develops a method for estimating every potential bidder's probability of participating in a future auction as a function of the tender economic size, removing the bias caused by the distribution of contract size opportunities. This way, a bidder or auctioneer will be able to estimate the likelihood that a specific group of key, previously identified bidders will take part in a future tender.
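A minimal sketch of the idea of estimating one bidder's participation probability as a function of tender size is given below. This is a generic kernel-smoothed estimate from that bidder's participation history, not the paper's bias-corrected method; the log-size scale, Gaussian kernel, and bandwidth are all illustrative assumptions.

```python
import numpy as np

def participation_probability(sizes, participated, grid, bandwidth=0.5):
    """Estimate P(bidder enters | tender size) on a log-size scale.

    sizes: economic sizes of past tenders; participated: 1 if the bidder
    entered each one, else 0; grid: sizes at which to evaluate.
    """
    log_sizes = np.log(np.asarray(sizes, dtype=float))
    log_grid = np.log(np.asarray(grid, dtype=float))
    took_part = np.asarray(participated, dtype=float)
    # Gaussian weight of each historical tender around every grid point
    w = np.exp(-0.5 * ((log_grid[:, None] - log_sizes[None, :]) / bandwidth) ** 2)
    return (w * took_part).sum(axis=1) / w.sum(axis=1)
```

Evaluated on a grid of tender sizes, this yields a participation-probability curve per bidder, which is the kind of object the method above produces (after its additional bias correction).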


This paper derives both lower and upper bounds for the probability distribution function of stationary ACD(p, q) processes. For the purpose of illustration, I specialize the results to the main parent distributions in duration analysis. Simulations show that the lower bound is much tighter than the upper bound.
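For reference, the standard ACD(p, q) specification of Engle and Russell models durations $x_i$ as conditionally scaled positive innovations (the paper's exact convention for $p$ and $q$ may differ):

```latex
x_i = \psi_i \varepsilon_i, \qquad
\psi_i = \omega + \sum_{j=1}^{p} \alpha_j x_{i-j} + \sum_{j=1}^{q} \beta_j \psi_{i-j},
```

where the $\varepsilon_i$ are i.i.d. positive innovations whose distribution (exponential, Weibull, etc.) plays the role of the "parent distribution" mentioned above.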


In this paper I will investigate the conditions under which a convex capacity (or a non-additive probability which exhibits uncertainty aversion) can be represented as a squeeze of an additive probability measure associated with an uncertainty aversion function. I will then present two alternative formulations of the Choquet integral (and extend these formulations to the Choquet expected utility) in a parametric approach that will enable me to carry out comparative statics exercises over the uncertainty aversion function in an easy way.
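For a finite state space, the Choquet integral at the heart of this abstract can be computed by the standard descending-sort formula. The sketch below assumes the capacity is given explicitly on subsets; the names are illustrative.

```python
def choquet_integral(values, capacity):
    """Discrete Choquet integral of values (dict: state -> payoff) with
    respect to capacity (dict: frozenset of states -> weight), where
    capacity[frozenset()] == 0 and capacity[all states] == 1."""
    states = sorted(values, key=values.get, reverse=True)
    total, prev, chain = 0.0, 0.0, set()
    for s in states:                       # walk states from best to worst
        chain.add(s)
        v = capacity[frozenset(chain)]
        total += values[s] * (v - prev)    # marginal capacity of adding s
        prev = v
    return total
```

For an additive capacity this reduces to the ordinary expected value; for a convex (uncertainty-averse) capacity it puts more weight on low outcomes, so the Choquet integral lies below the expectation.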


Problems of combinatorial optimization have engaged a large number of researchers in the search for approximative solutions, since it is generally accepted that they are unsolvable in polynomial time. Initially, these solutions were focused on heuristics; currently, metaheuristics are used more for this task, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of what is called an "Operon" heuristic for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology (Cluster Analysis and Principal Component Analysis); and the utilization of statistical analyses that are adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search in the space of solutions. The Traveling Salesman Problem (TSP) is the application adopted for a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by adopting a minimum limit on the coefficient of variation of the adequation (fitness) function of the individuals, calculated over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms and a Simulated Annealing algorithm. Three performance analyses of these algorithms are proposed. The first is accomplished through Logistic Regression, based on the probability of the algorithm under test finding an optimal solution for a TSP instance. The second is accomplished through Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is achieved. The third is accomplished by means of a non-parametric Analysis of Variance, considering the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with sizes of up to 1,655 cities. The first two experiments deal with the adjustment of four parameters used in the ProtoG algorithm in an attempt to improve its performance. The last four were undertaken to evaluate the performance of ProtoG in comparison to the three algorithms adopted. For these sixty-one instances, it was concluded on the grounds of statistical tests that there is evidence that ProtoG performs better than these three algorithms in fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which the performance of the algorithms was evaluated through the PES, the average PES obtained with ProtoG was less than 1% in almost half of these instances, reaching its greatest average, 3.52%, for an instance of 1,173 cities. Therefore, ProtoG can be considered a competitive algorithm for solving the TSP, since it is not rare to find average PESs greater than 10% reported in the literature for instances of this size.
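The PES figure of merit used in the third analysis is a direct reading of the definition in the abstract; the function name is illustrative.

```python
def percent_error_of_solution(found_cost, best_known_cost):
    """PES: percentage by which the tour cost found by an algorithm
    exceeds the best solution reported in the literature."""
    return 100.0 * (found_cost - best_known_cost) / best_known_cost
```

For example, a tour 10% longer than the best known one has a PES of 10, and a PES of 0 means the best known solution was matched.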


Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)


In this paper a new algorithmic Analog-to-Digital Converter is presented. The new topology uses the current-mode technique, which allows a large dynamic range, and can be implemented in a digital CMOS process. The proposed ADC is very small and can handle high sampling rates. Simulation results using a 1.2 µm CMOS process show that an 8-bit ADC can support a sampling rate of 50 MHz.
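The conversion principle of a generic algorithmic (cyclic) ADC can be sketched behaviourally: each cycle the residue is doubled, compared against the reference, and the comparison result becomes the next output bit. This models only the generic double-and-compare loop, not the current-mode circuit of the paper.

```python
def algorithmic_adc(vin, vref, bits=8):
    """Behavioural model of an algorithmic (cyclic) ADC for 0 <= vin < vref.
    Produces the digital code MSB-first over `bits` conversion cycles."""
    code, residue = 0, vin
    for _ in range(bits):
        residue *= 2                          # double the residue
        bit = 1 if residue >= vref else 0     # compare against reference
        residue -= bit * vref                 # subtract reference if bit set
        code = (code << 1) | bit              # shift bit into the code
    return code
```

Because one comparator and one doubling stage are reused across all cycles, the hardware stays small, which is the property the current-mode realization above exploits.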


The aim of the present study was to determine the classification error probabilities, as lean or obese, in hypercaloric diet-induced obesity, which depend on the variable used to characterize animal obesity. In addition, the misclassification probabilities in animals submitted to a normocaloric diet were also evaluated. Male Wistar rats were randomly distributed into two groups: normal diet (ND; n=31; 3.5 kcal/g) and hypercaloric diet (HD; n=31; 4.6 kcal/g). The ND group received commercial Labina rat feed and the HD animals a cycle of five hypercaloric diets over a 14-week period. The variables analysed were body weight, body composition, body weight to length ratio, Lee index, body mass index and misclassification probability. A 5% significance level was used. The hypercaloric pellet-diet cycle promoted increases in body weight, carcass fat, body weight to length ratio and Lee index. The total misclassification probabilities ranged from 19.21% to 40.91%. In conclusion, the results of this experiment show that misclassification probabilities occur when dietary manipulation is used to promote obesity in animals. This misjudgement ranges from 19.49% to 40.52% in the hypercaloric diet and from 18.94% to 41.30% in the normocaloric diet.
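Two of the obesity indices named above are simple functions of body weight and naso-anal length. The sketch below uses the usual definitions; the ×1000 scaling of the Lee index is one common convention and is an assumption here, since conventions vary between studies.

```python
def lee_index(weight_g, length_cm):
    """Lee index: cube root of body weight (g) over naso-anal length (cm),
    scaled by 1000 (scaling convention varies between studies)."""
    return 1000.0 * weight_g ** (1.0 / 3.0) / length_cm

def body_mass_index(weight_g, length_cm):
    """Rodent BMI: body weight (g) over squared naso-anal length (cm)."""
    return weight_g / length_cm ** 2
```

Classifying an animal as lean or obese then amounts to thresholding one of these indices, and the study above quantifies how often that thresholding misclassifies.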


Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)


An indirect estimate of consumable food and of the probability of acquiring food in a blowfly species, Chrysomya putoria, is presented. This alternative procedure combines three distinct models to estimate consumable food in the context of the exploitative competition experienced by immature individuals in blowfly populations. The relevant parameters are derived from data on pupal weight and survival and from estimates of density-independent larval mortality at twenty different larval densities. As part of this procedure, the probability of acquiring food per unit of time and the time taken to exhaust the food supply are also calculated. The procedure employed here may be valuable for estimations in insects whose immature stages develop inside the food substrate, where it is difficult to partial out confounding effects such as the separation of faeces. This procedure also has the advantage of taking into account the population dynamics of immatures living under crowded conditions, which are particularly characteristic of blowflies and other insects.


Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)


We compute the survival probability ⟨|S|²⟩ of large rapidity gaps (LRG) in a QCD-based eikonal model with a dynamical gluon mass, where this dynamical infrared mass scale represents the onset of nonperturbative contributions to diffractive hadron-hadron scattering. Since rapidity gaps can occur in the case of Higgs boson production via fusion of electroweak bosons, we focus on WW → H fusion processes and show that the resulting ⟨|S|²⟩ decreases as the energy of the incoming hadrons increases, in line with the available experimental data for LRG. We obtain ⟨|S|²⟩ = 27.6 ± 7.8% (18.2 ± 17.0%) at Tevatron (CERN-LHC) energy for a dynamical gluon mass m_g = 400 MeV.