978 results for Shortest path problem
Abstract:
BACKGROUND Illiteracy, a universal problem, limits the use of the most widely applied short cognitive tests. Our objective was to assess and compare the effectiveness and cost of three short cognitive tests applicable to illiterate people for cognitive impairment (CI) and dementia (DEM) screening. METHODS A phase III diagnostic test evaluation study was performed over one year in four Primary Care centers, prospectively including individuals with suspected CI or DEM. All underwent the Eurotest, Memory Alteration Test (M@T), and Phototest, applied in a balanced order. Clinical, functional, and cognitive studies were independently performed in a blinded fashion in a Cognitive Behavioral Neurology Unit, and the gold-standard diagnosis was established by consensus of expert neurologists on the basis of these results. Effectiveness was assessed as the proportion of correct diagnoses (diagnostic accuracy [DA]) and the kappa index of concordance (k) with respect to the gold-standard diagnoses. Costs were based on public prices at the time and hospital accounts. RESULTS The study included 139 individuals: 47 with DEM, 36 with CI, and 56 without CI. No significant differences in effectiveness were found among the tests. For DEM screening: Eurotest (k = 0.71 [0.59-0.83], DA = 0.87 [0.80-0.92]), M@T (k = 0.72 [0.60-0.84], DA = 0.87 [0.80-0.92]), Phototest (k = 0.70 [0.57-0.82], DA = 0.86 [0.79-0.91]). For CI screening: Eurotest (k = 0.67 [0.55-0.79], DA = 0.83 [0.76-0.89]), M@T (k = 0.52 [0.37-0.67], DA = 0.80 [0.72-0.86]), Phototest (k = 0.59 [0.46-0.72], DA = 0.79 [0.71-0.86]). There were no differences in the cost of DEM screening, but the cost of CI screening was significantly higher with M@T (330.7 ± 177.1 €, mean ± sd) than with Eurotest (294.1 ± 195.0 €) or Phototest (296.0 ± 196.5 €). Application time was shorter with Phototest (2.8 ± 0.8 min) than with Eurotest (7.1 ± 1.8 min) or M@T (6.8 ± 2.2 min).
CONCLUSIONS Eurotest, M@T, and Phototest are equally effective. Eurotest and Phototest are less expensive, and Phototest, requiring the shortest application time, is the most efficient.
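As a side note on the statistics reported above, here is a minimal sketch (with invented counts, not the study's data) of how diagnostic accuracy and Cohen's kappa are obtained from a 2x2 screening-versus-gold-standard table:

```python
# Illustrative only: the counts below are hypothetical, not from the study.
def accuracy_and_kappa(tp, fp, fn, tn):
    """Return (diagnostic accuracy, Cohen's kappa) for a 2x2 table."""
    n = tp + fp + fn + tn
    observed = (tp + tn) / n                     # proportion of correct diagnoses
    # Agreement expected by chance, from the marginal totals
    p_pos = ((tp + fp) / n) * ((tp + fn) / n)
    p_neg = ((fn + tn) / n) * ((fp + tn) / n)
    expected = p_pos + p_neg
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa

# Hypothetical screening outcome over 139 subjects
da, k = accuracy_and_kappa(tp=40, fp=7, fn=7, tn=85)
```

Diagnostic accuracy counts all correct calls, while kappa discounts the agreement that two raters would reach by chance alone, which is why the two numbers can rank tests differently.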
Abstract:
Background The 'database search problem', that is, the strengthening of a case - in terms of probative value - against an individual who is found as a result of a database search, has been approached during the last two decades with substantial mathematical analyses, accompanied by lively debate and centrally opposing conclusions. This represents a challenging obstacle in teaching but also hinders a balanced and coherent discussion of the topic within the wider scientific and legal community. This paper revisits and tracks the associated mathematical analyses in terms of Bayesian networks. Their derivation and discussion for capturing probabilistic arguments that explain the database search problem are outlined in detail. The resulting Bayesian networks offer a distinct view on the main debated issues, along with further clarity. Methods As a general framework for representing and analyzing formal arguments in probabilistic reasoning about uncertain target propositions (that is, whether or not a given individual is the source of a crime stain), this paper relies on graphical probability models, in particular, Bayesian networks. This graphical probability modeling approach is used to capture, within a single model, a series of key variables, such as the number of individuals in a database, the size of the population of potential crime stain sources, and the rarity of the corresponding analytical characteristics in a relevant population. Results This paper demonstrates the feasibility of deriving Bayesian network structures for analyzing, representing, and tracking the database search problem. The output of the proposed models can be shown to agree with existing but exclusively formulaic approaches. Conclusions The proposed Bayesian networks allow one to capture and analyze the currently most well-supported but reputedly counter-intuitive and difficult solution to the database search problem in a way that goes beyond the traditional, purely formulaic expressions. 
The method's graphical environment, along with its computational and probabilistic architectures, represents a rich package that provides analysts and discussants with additional modes of interaction, concise representation, and coherent communication.
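To illustrate the kind of computation such a network encodes, here is a numeric sketch under a deliberately simplified model (uniform prior over N potential sources and a single chance-match probability gamma are assumptions; the paper's networks are richer):

```python
# Simplified database-search posterior: N equally likely potential sources,
# a database of n of them is searched, exactly one member matches, and a
# non-source matches the trace by chance with probability gamma.
def posterior_source(N, n, gamma):
    """P(the matching database member is the source | search outcome)."""
    prior = 1.0 / N
    # Hypothesis A: the matching member is the source; the other n-1
    # database members are non-sources and do not match.
    lik_match = (1 - gamma) ** (n - 1)
    # Hypothesis B: the source is one of the N-n people outside the
    # database; the matching member then matched by chance.
    lik_outside = gamma * (1 - gamma) ** (n - 1)
    num = prior * lik_match
    den = num + (N - n) * prior * lik_outside
    return num / den

p = posterior_source(N=1_000_000, n=10_000, gamma=1e-6)
```

Under these assumptions the enumeration agrees with the closed form 1 / (1 + (N - n) * gamma), illustrating the point in the abstract that the network output can be shown to agree with the formulaic approaches.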
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. How to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP, which implies that the most restrictive QOS requirements must be applied to all services. Link utilization is therefore decreased, because unnecessarily stringent QOS is provided to all connections. With the segregation approach, the problem can be much simplified if different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). Resources may then not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate, and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
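As an illustration of the convolution idea (assuming a simple on/off source model, not the paper's exact traffic model), the probability of congestion can be computed by convolving per-connection demand distributions:

```python
import numpy as np

# Assumed model: each connection transmits at its peak rate with activity
# probability p, else at 0. Convolving the per-connection demand
# distributions gives the aggregate-demand distribution; the probability
# of congestion (PC) is the tail mass above the link capacity.
def congestion_probability(n_conn, peak, p, capacity):
    demand = np.zeros(n_conn * peak + 1)
    demand[0] = 1.0                        # start: zero aggregate demand
    single = np.zeros(peak + 1)
    single[0], single[peak] = 1 - p, p     # one on/off connection
    for _ in range(n_conn):
        demand = np.convolve(demand, single)[: n_conn * peak + 1]
    return demand[capacity + 1:].sum()     # P(aggregate demand > capacity)

pc = congestion_probability(n_conn=20, peak=2, p=0.3, capacity=16)
```

Accepting a new connection can then be decided by recomputing the PC with the candidate's distribution convolved in and comparing it against the CLP target.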
Abstract:
The statistical analysis of literary style is the part of stylometry that compares measurable characteristics in a text that are rarely controlled by the author with those in other texts. When the goal is to settle authorship questions, these characteristics should relate to the author's style and not to the genre, epoch, or editor, and they should be such that their variation between authors is larger than the variation within comparable texts from the same author. For an overview of the literature on stylometry and some of the techniques involved, see for example Mosteller and Wallace (1964, 82), Herdan (1964), Morton (1978), Holmes (1985), Oakes (1998) or Lebart, Salem and Berry (1998). Tirant lo Blanc, a chivalry book, is the main work in Catalan literature, and it was hailed as "the best book of its kind in the world" by Cervantes in Don Quixote. Considered by writers like Vargas Llosa or Dámaso Alonso to be the first modern novel in Europe, it has been translated several times into Spanish, Italian and French, with modern English translations by Rosenthal (1996) and La Fontaine (1993). The main body of this book was written between 1460 and 1465, but it was not printed until 1490. There is an intense and long-lasting debate around its authorship, sprouting from its first edition, whose introduction states that the whole book is the work of Martorell (1413?-1468), while at the end it is stated that the last one fourth of the book is by Galba (?-1490), written after the death of Martorell. Some of the authors that support the theory of single authorship are Riquer (1990), Chiner (1993) and Badia (1993), while some of those supporting the double authorship are Riquer (1947), Coromines (1956) and Ferrando (1995). For an overview of this debate, see Riquer (1990). Neither of the two candidate authors left any text comparable to the one under study, and therefore discriminant analysis cannot be used to help classify chapters by author.
By using sample texts encompassing about ten percent of the book, and looking at word length and at the use of 44 conjunctions, prepositions and articles, Ginebra and Cabos (1998) detect heterogeneities that might indicate the existence of two authors. By analyzing the diversity of the vocabulary, Riba and Ginebra (2000) estimate that stylistic boundary to be near chapter 383. Following the lead of the extensive literature, this paper looks into word length, the use of the most frequent words, and the use of vowels in each chapter of the book. Given that the features selected are categorical, this leads to three contingency tables of ordered rows and therefore to three sequences of multinomial observations. Section 2 explores these sequences graphically, observing a clear shift in their distribution. Section 3 describes the problem of the estimation of a sudden change-point in those sequences. In the following sections we propose various ways to estimate change-points in multinomial sequences: the method in Section 4 involves fitting models for polytomous data; the one in Section 5 fits gamma models onto the sequence of chi-square distances between each row profile and the average profile; the one in Section 6 fits models onto the sequence of values taken by the first component of the correspondence analysis, as well as onto sequences of other summary measures like the average word length. In Section 7 we fit models onto the marginal binomial sequences to identify the features that distinguish the chapters before and after that boundary. Most methods rely heavily on the use of generalized linear models.
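To make the change-point idea concrete, here is a simplified sketch on a single binomial sequence with invented counts (the paper works with multinomial sequences and generalized linear models, a richer setting than this):

```python
import math

# Maximum-likelihood change-point for a binomial sequence: choose the
# split that maximizes the log-likelihood with separate success rates
# before and after the candidate boundary.
def loglik(successes, trials):
    s, t = sum(successes), sum(trials)
    p = s / t
    if p in (0.0, 1.0):
        return 0.0          # degenerate segment contributes log(1) = 0
    return s * math.log(p) + (t - s) * math.log(1 - p)

def change_point(successes, trials):
    best_k, best_ll = None, -math.inf
    for k in range(1, len(successes)):       # boundary after position k
        ll = (loglik(successes[:k], trials[:k])
              + loglik(successes[k:], trials[k:]))
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

# Invented data: the success rate shifts from ~0.2 to ~0.4 after 5 chapters
succ = [20, 18, 22, 19, 21, 41, 39, 42, 40, 38]
tri = [100] * 10
```

On this toy data, `change_point(succ, tri)` recovers the boundary at position 5, where the rate shift was planted.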
Abstract:
Epipolar geometry is a key concept in computer vision, and estimation of the fundamental matrix is the only way to compute it. This article surveys several methods of fundamental matrix estimation, classified into linear methods, iterative methods and robust methods. All of these methods have been programmed and their accuracy analysed using real images. A summary, accompanied by experimental results, is given.
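A minimal sketch of the classical linear (eight-point) method, one instance of the linear approaches such a survey covers; Hartley normalization is omitted here for brevity:

```python
import numpy as np

# Stack the epipolar constraints x2^T F x1 = 0 into a system A f = 0,
# solve for f by SVD, then enforce the rank-2 constraint on F.
def eight_point(x1, x2):
    """x1, x2: (n, 2) arrays of matched image points, n >= 8."""
    A = np.array([[u2 * u1, u2 * v1, u2,
                   v2 * u1, v2 * v1, v2,
                   u1, v1, 1.0]
                  for (u1, v1), (u2, v2) in zip(x1, x2)])
    _, _, vt = np.linalg.svd(A)
    F = vt[-1].reshape(3, 3)               # null vector of A
    u, s, vt = np.linalg.svd(F)
    s[2] = 0.0                             # enforce rank 2
    return u @ np.diag(s) @ vt
```

Iterative and robust methods in such surveys typically start from this linear estimate and refine it against a geometric error, down-weighting outlier correspondences.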
Abstract:
In this paper, recovery methods applied at different network layers and time scales are used to enhance network reliability. Each layer deploys its own fault management methods; however, current recovery methods are applied to only a specific layer. New protection schemes, based on the proposed partial disjoint path algorithm, are defined in order to avoid protection duplication in a multi-layer scenario. The new protection schemes also encompass shared segment backup computation and shared risk link group identification. A complete set of experiments proves the efficiency of the proposed methods in comparison with previous ones, in terms of the resources used to protect the network, the failure recovery time, and the request rejection ratio.
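The basic idea behind link-disjoint path protection (a much simpler scheme than the paper's partial disjoint path algorithm, shown only for orientation) can be sketched as:

```python
import heapq

# Compute a primary path, then a backup that avoids the primary's links,
# so a single link failure cannot take down both.
def shortest_path(graph, src, dst, banned=frozenset()):
    """Dijkstra over an adjacency dict {u: {v: cost}}; 'banned' holds links."""
    pq, seen = [(0, src, [src])], set()
    while pq:
        cost, u, path = heapq.heappop(pq)
        if u == dst:
            return cost, path
        if u in seen:
            continue
        seen.add(u)
        for v, w in graph[u].items():
            if (u, v) not in banned and (v, u) not in banned:
                heapq.heappush(pq, (cost + w, v, path + [v]))
    return None                           # dst unreachable

graph = {"a": {"b": 1, "c": 2}, "b": {"a": 1, "d": 1},
         "c": {"a": 2, "d": 2}, "d": {"b": 1, "c": 2}}
primary = shortest_path(graph, "a", "d")
used = set(zip(primary[1], primary[1][1:]))   # links of the primary path
backup = shortest_path(graph, "a", "d", banned=used)
```

Schemes like the paper's relax full disjointness (allowing partially disjoint backups) precisely because a fully disjoint backup may not exist or may waste resources across layers.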
Abstract:
Hip or knee arthroplasty is proposed after osteoarthritis or an accident, and is decided upon after a long path of pain and decreased quality of life. This research explores the period of illness up until surgery. Twenty-four semi-structured interviews were conducted one month before surgery, and a thematic discourse analysis was performed. The diversity and complexity of the patient experience in a commonly performed surgical intervention underlines important topics that require attention in order to improve patient preparation and information prior to arthroplasty: information adapted to individual concerns, needs and representations. Psychological and physical acceptance is necessary for integration of the prosthesis.
Abstract:
The crisis and collapse of the metaphysical thought inherited from modernity leaves contemporary philosophy facing a new paradigm in which knowledge must be constructed without any identity or foundation. My project is structured as a descending journey that starts from a specific domain, the problem of the lack of foundation in contemporary political philosophy, in order to reach the true root of the general problem, which is none other than the very nature of philosophical language. The starting point is the question of the possibility of a political philosophy in post-metaphysical terms. Political philosophy, trapped between the forces of the unitary tyranny of the metaphysical concept and its practical dissolution in favor of instrumental reality, builds bridges toward aesthetics and deconstruction, whose final corollary is to question the very limits of political thought. The concept of the impolitical is a deconstructive way out of this dead end. Drawing on Esposito, Rancière, Nancy, and above all Massimo Cacciari, I have delved into the post-metaphysical paradigm that gives rise to this political negation of politics itself: politics as its own limits, relation as distance, and identity as silence. The keystone is clearly the contemporary inheritance and reception of Nietzsche and his critique of modern transcendentality, within the elaboration of a knowledge in constant shipwreck through the failure of the synthesis it yearns for. It is this inheritance that has made possible this contemporary negative thought: that of the Wittgensteinian game, and the deconstruction of value, which is turned into its own margin (Derrida). We thus define not only a political community based on the incommensurability of its members, who are at the same time empty of content (Musil), but also a model of language that is its own silence, a language in continuous struggle against itself.
My project is a rereading of this non-identitarian truth, in which the concept of dialogue takes on capital importance. The philosophy of music is presented here as fertile ground for the conceptual tools needed to develop it. Music is the negative language that only finds possibility in its own impossibility of synthetic content. Beyond the obligatory references to Schoenberg and Adorno, among others, the path begun by Bergson with the introduction of temporality into the discussion opens the door to the role of the event in this discourse on impossibility. Returning to philosophy, where philosophical language itself is already defined as impossible, the event reopens the old tension between writing and the living word, the true foundation of the problem and the vertex of the possibility of this impossible philosophy.
Abstract:
First: a continuous-time version of Kyle's model (Kyle 1985) of asset pricing with asymmetric information, known as Back's model (Back 1992), is studied. A larger class of price processes and of noise traders' processes is considered. The price process, as in Kyle's model, is allowed to depend on the path of the market order, and the noise traders' process is an inhomogeneous Lévy process. Solutions are found through the Hamilton-Jacobi-Bellman equations. With a risk-neutral insider, the price pressure is constant and there is no equilibrium in the presence of jumps; if the insider is risk-averse, there is no equilibrium in the presence of either jumps or drifts. The case in which the release time is unknown is also analysed, and a general relation is established between the problem of finding an equilibrium and that of the enlargement of filtrations. A random announcement time is also considered; in that case the market is not fully efficient, and an equilibrium exists if the sensitivity of prices with respect to the global demand decreases in time in accordance with the distribution of the random time. Second: power variations. We consider the asymptotic behavior of the power variation of processes of the form ∫_0^t u(s-) dS(s), where S is an alpha-stable process with index of stability 0 < α < 2 and the integral is an Itô integral. Stable convergence of the corresponding fluctuations is established. These results provide statistical tools to infer the process u from discrete observations. Third: a bond market is studied where the short rates r(t) evolve as an integral of g(t-s)σ(s) with respect to W(ds), where g and σ are deterministic and W is the stochastic Wiener measure. Processes of this type are particular cases of ambit processes and are in general not semimartingales.
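For intuition on the power-variation statistic, here is a sketch in the Brownian special case with a constant integrand (far simpler than the α-stable setting of the abstract; all parameters below are invented for illustration):

```python
import math
import random

# Realized power variation: sum of |X_{t_i} - X_{t_{i-1}}|^p over a grid.
# For p = 2 and X_t = integral of u dW, this converges to the integrated
# squared integrand -- the kind of statistic used to infer u from
# discrete observations.
def realized_power_variation(increments, p):
    return sum(abs(d) ** p for d in increments)

random.seed(0)
n = 100_000
u = 2.0   # constant integrand (assumed for the sketch)
# Discretized increments of X on [0, 1]: u times Brownian increments,
# each distributed N(0, 1/n).
incs = [u * random.gauss(0.0, math.sqrt(1.0 / n)) for _ in range(n)]
qv = realized_power_variation(incs, p=2)
```

Here `qv` concentrates around u² · 1 = 4 as the grid is refined; estimating u from `qv` is the discrete-observation inference the abstract refers to, extended there to stable integrators and general p.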
Abstract:
We formulate a necessary and sufficient condition for polynomials to be dense in a space of continuous functions on the real line, with respect to Bernstein's weighted uniform norm. Equivalently, for a positive finite measure μ on the real line we give a criterion for the density of polynomials in L^p(μ).
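For context, the norm in question is usually written as follows (this is the standard formulation of Bernstein's weighted approximation problem, not quoted from the paper):

```latex
% Bernstein's weighted uniform norm, for a weight W(x) \ge 1:
\|f\|_{W} \;=\; \sup_{x \in \mathbb{R}} \frac{|f(x)|}{W(x)}
% Density asks whether every continuous f with f/W vanishing at
% infinity can be approximated by polynomials in this norm; the
% L^p analogue asks for density of polynomials in L^p(\mu).
```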
Abstract:
Chagas disease, or American trypanosomiasis, is, together with the geohelminthiases, the neglected disease that causes the greatest loss of years of healthy life due to disability in Latin America. Its factors and determinants show that different contexts require different actions, whether preventing new cases or reducing the burden of disease. Control strategies must combine two general courses of action: prevention of transmission to avoid the occurrence of new cases (these measures are cost-effective), and timely diagnosis and treatment of infected individuals in order to prevent the clinical evolution of the disease and allow them to recover their health. All actions should be implemented as fully as possible and in an integrated way, to maximise their impact. Chagas disease cannot be eradicated, because of the demonstrated existence of infected wild triatomines in permanent contact with domestic cycles, which contributes to the occurrence of at least a few new cases. However, it is possible to interrupt the transmission of Trypanosoma cruzi over a large territory and to eliminate Chagas disease as a public health problem, with a dramatic reduction of the burden of the disease.