Abstract:
Winter weather in Iowa is often unpredictable and can adversely affect traffic flow. The Iowa Department of Transportation (Iowa DOT) attempts to lessen the impact of winter weather events on traffic speeds with various proactive maintenance operations. To assess the performance of these operations, it would be useful to have a model of expected speed reduction based on weather variables and normal maintenance schedules. Such a model would allow the Iowa DOT to identify situations in which speed reductions were much greater or smaller than expected for a given set of storm conditions, and to make modifications that improve efficiency and effectiveness. The objective of this work was to predict speed changes relative to baseline speed under normal conditions, based on nominal maintenance schedules and winter weather covariates (snow type, temperature, and wind speed) measured by roadside weather stations. This allows an assessment of the impact of winter weather covariates on traffic speed changes and an estimate of the effect of regular maintenance passes. The researchers chose events from Adair County, Iowa, fit a linear model incorporating the covariates above, and conducted a Bayesian analysis to estimate the model's parameters. In particular, the analysis produces a posterior distribution for the parameter representing the impact of maintenance on traffic speeds: the effect of maintenance is treated not as a known constant but as an uncertain quantity, and this distribution summarizes what the researchers know about it. The distributions for the effects of the winter weather covariates can be examined in the same way. Plots of observed and expected traffic speed changes allow a visual assessment of model fit. Future work involves expanding the model to many events at multiple locations, which would allow assessment of winter weather maintenance across varied situations and, eventually, identification of locations and times at which maintenance could be improved.
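A minimal sketch of the kind of Bayesian linear model the abstract describes, assuming placeholder covariate names, priors, and synthetic data (none of these details are given in the abstract); written with PyMC:

```python
# A minimal Bayesian linear model for speed change, sketched with PyMC.
# All variable names, priors, and data are illustrative assumptions,
# not the authors' actual specification.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n = 200
# Placeholder data standing in for roadside-station measurements.
temperature = rng.normal(-5, 4, n)      # deg C
wind_speed = rng.gamma(2.0, 3.0, n)     # m/s
snow = rng.binomial(1, 0.5, n)          # 1 = snow observed
passes = rng.poisson(2, n)              # maintenance passes so far
speed_change = (-4 - 0.8 * wind_speed - 6 * snow
                + 1.5 * passes + rng.normal(0, 3, n))

with pm.Model() as model:
    b0 = pm.Normal("intercept", 0, 20)
    b_temp = pm.Normal("b_temperature", 0, 10)
    b_wind = pm.Normal("b_wind", 0, 10)
    b_snow = pm.Normal("b_snow", 0, 10)
    b_maint = pm.Normal("b_maintenance", 0, 10)  # effect of one pass
    sigma = pm.HalfNormal("sigma", 10)
    mu = (b0 + b_temp * temperature + b_wind * wind_speed
          + b_snow * snow + b_maint * passes)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=speed_change)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0)

# The posterior for b_maintenance plays the role of the distribution
# discussed in the abstract: an uncertain, data-informed estimate of
# the effect of a maintenance pass, not a single fixed number.
print(idata.posterior["b_maintenance"].mean())
```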
Abstract:
The article is concerned with the formal definition of a largely unnoticed factor in narrative structure. It starts from three assumptions: (1) the semantics of a written text depend, among other factors, directly on its visual alignment in space; (2) the formal structure of a text has to match that of its spatial presentation; and (3) these assumptions also hold for narrative texts (which, in modern times, typically conceal their spatial dimensions behind a low-key linear layout). On this basis it is argued that, however low-key, the expected material shape of a given narrative determines how its author configures its plot. The 'implied book' thus denotes an author's historically assumable, not necessarily conscious, idea of how his text, still in the process of creation, will be dimensionally presented and, under these circumstances, visually absorbed. Assuming that an author's knowledge of this later (potentially) substantiated material form influences the composition, the implied book is to be understood as a text-genetically determined, structuring moment of the text. Historically reconstructed, it thus serves the methodical analysis of the structural characteristics of a completed text.
Abstract:
The use of the Bayes factor (BF), or likelihood ratio, as a metric to assess the probative value of forensic traces is largely supported by operational standards and recommendations in different forensic disciplines. However, progress towards more widespread consensus about foundational principles is still fragile, as it raises new problems about which views differ. It is not uncommon, for example, to encounter scientists who feel the need to compute the probability distribution of a given expression of evidential value (i.e., a BF), or to place intervals or significance probabilities on such a quantity. This article presents arguments to show that such views involve a misconception of principles and an abuse of language. The conclusion of the discussion is that, in a given case at hand, forensic scientists ought to offer a court of justice a single value for the BF, rather than an expression based on a distribution over a range of values.
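For reference, the quantity at issue is the standard Bayes factor for evidence E under two competing propositions H_p and H_d (a textbook formulation, not quoted from the article). The article's claim is that the expert should report the single number BF, not a distribution over possible BF values:

```latex
\mathrm{BF} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)},
\qquad
\underbrace{\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}}_{\text{posterior odds}}
  = \mathrm{BF} \cdot
\underbrace{\frac{\Pr(H_p)}{\Pr(H_d)}}_{\text{prior odds}}
```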
Abstract:
In this article I intend to show that certain aspects of A.N. Whitehead's philosophy of organism, and especially his epochal theory of time as mainly expounded in his well-known work Process and Reality, can serve to clarify the underlying assumptions that shape nonstandard mathematical theories as such, and also as metatheories of quantum mechanics. Concerning the latter issue, I point to an already significant body of research on nonstandard versions of quantum mechanics; two of these approaches are chosen to be critically presented in relation to the scope of this work. The main point of the paper is that, insofar as we can refer a nonstandard mathematical entity to a kind of axiomatic formalization essentially 'codifying' an underlying mental process indescribable as such by analytic means, we can apply certain principles of Whitehead's metaphysical scheme focused on the key notion of process, generally conceived as the becoming of actual entities. This is done in the sense of a unifying approach that provides an interpretation of nonstandard mathematical theories as such and also, in their metatheoretical status, as a formalization of the empirical-experimental context of quantum mechanics.
Abstract:
Optimization of quantum measurement processes plays a pivotal role in carrying out better (more accurate or less disruptive) measurements and experiments on a quantum system. In particular, convex optimization, i.e., identifying the extreme points of the convex sets and subsets of quantum measuring devices, plays an important part in quantum optimization, since the typical figures of merit for measuring processes are affine functionals. In this thesis, we discuss results determining the extreme quantum devices and their relevance, e.g., in quantum-compatibility-related questions. In particular, we see that a compatible device pair in which one device is extreme can be joined into a single apparatus in an essentially unique way. Moreover, we show that the question of whether a pair of quantum observables can be measured jointly can often be formulated in a weaker form when some of the observables involved are extreme. Another major line of research treated in this thesis deals with convex analysis of special restricted quantum device sets, covariance structures or, in particular, generalized imprimitivity systems. Some results on the structure of covariant observables and instruments are listed, as well as results identifying the extreme points of covariance structures in quantum theory. As a special case study, not published anywhere before, we study the structure of Euclidean-covariant localization observables for spin-0 particles. We also discuss the general form of Weyl-covariant phase-space instruments. Finally, certain optimality measures originating from convex geometry are introduced for quantum devices: boundariness, measuring how 'close' to the algebraic boundary of the device set a quantum apparatus is, and the robustness of incompatibility, quantifying the level of incompatibility of a quantum device pair by the highest amount of noise the pair tolerates without becoming compatible. Boundariness is further associated with minimum-error discrimination of quantum devices, and the robustness of incompatibility is shown to behave monotonically under certain compatibility-non-decreasing operations. Moreover, the value of the robustness of incompatibility is given for a few special device pairs.
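One common way to formalize the robustness of incompatibility mentioned above (a standard form in the literature; the thesis's exact definition may differ): the pair of devices is mixed with arbitrary noise devices N_1, N_2, and the robustness is the least noise weight at which the pair becomes compatible:

```latex
R(\mathsf{D}_1, \mathsf{D}_2)
  = \inf \left\{ t \ge 0 \;\middle|\;
      \exists\, \mathsf{N}_1, \mathsf{N}_2 :\;
      \tfrac{1}{1+t}\,\mathsf{D}_1 + \tfrac{t}{1+t}\,\mathsf{N}_1
      \ \text{and}\
      \tfrac{1}{1+t}\,\mathsf{D}_2 + \tfrac{t}{1+t}\,\mathsf{N}_2
      \ \text{are compatible} \right\}
```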
Abstract:
We provide an algorithm that automatically derives many provable theorems in the equational theory of allegories. This was accomplished by noticing properties of an existing decision algorithm that could be extended to produce a derivation in addition to a decision certificate. We also suggest improvements and corrections to previous research in order to motivate further work on a complete derivation mechanism. The results presented here are significant for those interested in relational theories, since we essentially have a subtheory in which automatic proof generation is possible. This is also relevant to program verification, since relations are well suited to describing the behaviour of computer programs. It is likely that extensions of the theory of allegories are also decidable and possibly suitable for further expansions of the algorithm presented here.
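A toy illustration of the general idea (not the authors' algorithm): a normalization procedure can be made to record every rewrite step, so that deciding an equation also yields a derivation rather than a bare yes/no answer. The fragment below covers only converse and composition, using two genuine allegory laws, R°° = R and (R;S)° = S°;R°:

```python
# Toy rewriter for a fragment of allegory terms, recording each step so
# the result is a derivation, not just a normal form. Illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str
@dataclass(frozen=True)
class Conv:            # R° (converse)
    arg: object
@dataclass(frozen=True)
class Comp:            # R ; S (composition)
    left: object
    right: object

def show(t):
    if isinstance(t, Var):  return t.name
    if isinstance(t, Conv): return f"({show(t.arg)})°"
    return f"({show(t.left)};{show(t.right)})"

def rewrite(t, trace):
    """Push converses inward using R°° = R and (R;S)° = S°;R°."""
    if isinstance(t, Conv) and isinstance(t.arg, Conv):
        trace.append(f"R°° = R        at {show(t)}")
        return rewrite(t.arg.arg, trace)
    if isinstance(t, Conv) and isinstance(t.arg, Comp):
        trace.append(f"(R;S)° = S°;R° at {show(t)}")
        return rewrite(Comp(Conv(t.arg.right), Conv(t.arg.left)), trace)
    if isinstance(t, Comp):
        return Comp(rewrite(t.left, trace), rewrite(t.right, trace))
    if isinstance(t, Conv):
        return Conv(rewrite(t.arg, trace))
    return t

def derive_equal(s, t):
    """Decide s = t in this fragment and return the supporting steps."""
    tr_s, tr_t = [], []
    equal = rewrite(s, tr_s) == rewrite(t, tr_t)
    return equal, tr_s + tr_t

R, S = Var("R"), Var("S")
ok, steps = derive_equal(Conv(Comp(R, S)), Comp(Conv(S), Conv(R)))
print(ok)           # True
for step in steps:  # the derivation certificate
    print(step)
```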
Abstract:
Philippe van Parijs (2003) has argued that an egalitarian ethos cannot be part of a post-Political Liberalism Rawlsian view of justice, because the demands of political justice are confined to principles for institutions of the basic structure alone. This paper argues, by contrast, that certain principles for individual conduct, including a principle requiring relatively advantaged individuals sometimes to make their economic choices with the aim of maximising the prospects of the least advantaged, are an integral part of a Rawlsian political conception of justice. It concludes that incentive payments will have a clearly limited role in a Rawlsian theory of justice.
Abstract:
Gowers, in his article on quasirandom groups, studies the question, posed by Babai and Sós, of whether there exists a constant $c>0$ such that every finite group $G$ has a product-free subset of size at least $c|G|$. By proving that, for every sufficiently large prime $p$, the group $PSL_2(\mathbb{F}_p)$ (whose order we denote $n$) has no product-free subset of size $c n^{8/9}$, he answers the question in the negative. We consider the problem for compact groups, and more particularly for the profinite groups $SL_k(\mathbb{Z}_p)$ and $Sp_{2k}(\mathbb{Z}_p)$. The first part of this thesis is devoted to obtaining exponential lower and upper bounds on the supremal measure of product-free sets. The proof requires first establishing a lower bound on the dimension of nontrivial representations of the finite groups $SL_k(\mathbb{Z}/p^n\mathbb{Z})$ and $Sp_{2k}(\mathbb{Z}/p^n\mathbb{Z})$. Our theorem extends the work of Landazuri and Seitz, who consider the minimal degree of representations of Chevalley groups over finite fields, while offering a simpler proof than theirs. The second part of the thesis concerns algebraic number theory. A monogenic polynomial $f$ is a monic irreducible polynomial with integer coefficients that generates a monogenic number field. For a given prime $q$, we show, using the Chebotarev density theorem, that the density of primes $p$ such that $t^q - p$ is monogenic is at least $(q-1)/q$. We also prove that, when $q=3$, the density of primes $p$ such that $\mathbb{Q}(\sqrt[3]{p})$ is not monogenic is at least $1/9$.
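For reference, the standard definitions behind the abstract (the notation $s(G)$ is ours, not the thesis's): a set is product-free when it contains no product of two of its own elements, and for a compact group the quantity bounded in the first part is the supremum of Haar measures of such sets:

```latex
A \subseteq G \ \text{is product-free}
  \iff ab \notin A \ \text{for all } a, b \in A,
\qquad
s(G) = \sup \{\, \mu(A) : A \subseteq G \ \text{measurable, product-free} \,\},
```

where $\mu$ is the Haar probability measure on $G$.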
Abstract:
It examines the aspects that matter for effective teaching and learning in schools. The resource is divided into three parts, reflecting the three fundamental tasks of teacher training: the first part focuses on helping teachers strengthen their knowledge and understanding of effective teaching and student learning; the second part focuses on the individual aspects of good classroom practice, from designing the learning experience with fundamental pedagogical differences in mind to establishing constructive relationships with students; the third part reflects on the teaching experience. This edition covers personalized learning, new developments in ICT, and interactive teaching. It also contains updated terminology.
Abstract:
The energy decomposition scheme proposed in a recent paper has been realized by performing numerical integrations. Sample calculations carried out for some simple molecules show excellent agreement with the chemical picture of the molecules, indicating that such an energy decomposition analysis can be useful for connecting quantum mechanics with genuine chemical concepts.
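The abstract does not describe the integration scheme itself; as a hedged illustration of how an energy decomposition by numerical integration can work, the sketch below splits a toy energy density into atomic contributions using Becke-style fuzzy-atom weights, a standard device in such analyses (all functions, grids, and parameters here are illustrative assumptions):

```python
# Illustrative fuzzy-atom partitioning of a model "energy density" on a
# grid: each grid point is shared among atoms via Becke-style weights,
# so the total integral splits into atomic contributions. The density
# is a toy stand-in; a real analysis integrates quantum-chemical
# energy density components instead.
import numpy as np

atoms = np.array([[0.0, 0.0, 0.0],      # positions (bohr), e.g. a diatomic
                  [1.4, 0.0, 0.0]])

def becke_weights(points, atoms, k=3):
    """w[a, p]: share of grid point p assigned to atom a; sums to 1."""
    d = np.linalg.norm(points[None, :, :] - atoms[:, None, :], axis=2)
    n = len(atoms)
    cell = np.ones((n, points.shape[0]))
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            mu = (d[a] - d[b]) / np.linalg.norm(atoms[a] - atoms[b])
            for _ in range(k):                  # Becke's iterated smoothing
                mu = 1.5 * mu - 0.5 * mu**3
            cell[a] *= 0.5 * (1.0 - mu)
    return cell / cell.sum(axis=0)

# Crude cubic grid and a toy energy density (sum of atomic exponentials).
g = np.linspace(-4, 5, 40)
pts = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
dv = (g[1] - g[0]) ** 3
dens = sum(np.exp(-2 * np.linalg.norm(pts - a, axis=1)) for a in atoms)

w = becke_weights(pts, atoms)
E_atomic = (w * dens * dv).sum(axis=1)   # per-atom pieces
print(E_atomic, E_atomic.sum())          # pieces recover the total integral
```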
Abstract:
Faced with the realities of a changing climate, decision makers in a wide variety of organisations are increasingly seeking quantitative predictions of regional and local climate. An important issue for these decision makers, and for organisations that fund climate research, is the potential for climate science to deliver improvements in such predictions, especially reductions in uncertainty. Uncertainty in climate predictions arises from three distinct sources: internal variability, model uncertainty and scenario uncertainty. Using data from a suite of climate models, we separate and quantify these sources. For predictions of changes in surface air temperature on decadal timescales and regional spatial scales, we show that uncertainty for the next few decades is dominated by sources (model uncertainty and internal variability) that are potentially reducible through progress in climate science. Furthermore, we find that model uncertainty is of greater importance than internal variability. Our findings have implications for managing adaptation to a changing climate. Because the costs of adaptation are very large, and greater uncertainty about future climate is likely to be associated with more expensive adaptation, reducing uncertainty in climate predictions is potentially of enormous economic value. We highlight the need for much more work to compare: (a) the cost of various degrees of adaptation, given current levels of uncertainty; and (b) the cost of new investments in climate science to reduce current levels of uncertainty. Our study also highlights the importance of targeting climate science investments on the most promising opportunities to reduce prediction uncertainty.
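A hedged sketch of the bookkeeping behind such a partition, assuming a hypothetical array `proj[model, scenario, member]` of projected warming for one region and lead time (the study's actual method involves fitting forced-response trends; this only illustrates the three-way split):

```python
# Partition ensemble spread in projected temperature change into the
# three sources named above. `proj` is synthetic, illustrative data.
import numpy as np

rng = np.random.default_rng(1)
n_model, n_scen, n_memb = 10, 4, 5
proj = (rng.normal(2.0, 0.5, (n_model, 1, 1))              # model differences
        + rng.normal(0.0, 0.3, (1, n_scen, 1))             # scenario differences
        + rng.normal(0.0, 0.4, (n_model, n_scen, n_memb))) # internal variability

# Internal variability: spread across ensemble members.
internal = proj.var(axis=2).mean()
# Model uncertainty: spread across models of each model's overall mean.
model_unc = proj.mean(axis=2).mean(axis=1).var()
# Scenario uncertainty: spread across scenarios of each scenario's mean.
scen_unc = proj.mean(axis=2).mean(axis=0).var()

total = internal + model_unc + scen_unc
for name, v in [("internal", internal), ("model", model_unc),
                ("scenario", scen_unc)]:
    print(f"{name:9s} {v / total:5.1%} of partitioned variance")
```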