915 results for Uncertainty in Illness Theory
Abstract:
Measuring productive efficiency provides information on the likely effects of regulatory reform. We present a Data Envelopment Analysis (DEA) of a sample of 38 vehicle inspection units under a concession regime, between the years 2000 and 2004. The differences in efficiency scores show the potential technical efficiency benefit of introducing some form of incentive regulation or of progressing towards liberalization. We also compute scale efficiency scores, showing that only units in territories with very low population density operate at a sub-optimal scale. Among those that operate at an optimal scale, there are significant differences in size; the largest ones operate in territories with the highest population density. This suggests that the introduction of new units in the most densely populated territories (a likely effect of some form of liberalization) would not be detrimental in terms of scale efficiency. We also find that inspection units belonging to a large, diversified firm show higher technical efficiency, reflecting economies of scale or scope at the firm level. Finally, we show that between 2002 and 2004, a period of high regulatory uncertainty in the sample’s region, technical change was almost zero. Regulatory reform should take due account of scale and diversification effects, while at the same time avoiding regulatory uncertainty.
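As an illustrative sketch of the kind of computation behind DEA efficiency scores (not the authors' implementation), the input-oriented, constant-returns-to-scale CCR model solves one small linear program per unit; the function name and toy data below are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR (constant returns to scale) DEA.
    X: (n, m) array of inputs, Y: (n, s) array of outputs.
    Returns one efficiency score in (0, 1] per unit."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for k in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(1 + n)
        c[0] = 1.0                      # minimize theta
        A_ub, b_ub = [], []
        for i in range(m):              # sum_j lam_j * X[j,i] <= theta * X[k,i]
            A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
            b_ub.append(0.0)
        for r in range(s):              # sum_j lam_j * Y[j,r] >= Y[k,r]
            A_ub.append(np.concatenate(([0.0], -Y[:, r])))
            b_ub.append(-Y[k, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
                      bounds=[(0, None)] * (1 + n), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# toy data: 4 units, one input, one output
X = np.array([[2.0], [4.0], [6.0], [8.0]])
Y = np.array([[2.0], [4.0], [3.0], [4.0]])
print(dea_ccr_input(X, Y))  # first two units efficient, last two not
```

A unit scores 1.0 when no convex combination of the other units can produce at least its outputs with proportionally fewer inputs.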
Abstract:
This paper provides a Bayesian parametric framework for tackling the problem of accessibility across space in urban theory. Adopting continuous variables in a probabilistic setting, we are able to associate the distribution density with Kendall's tau index and to replicate the general issues related to the role of proximity in a more general context. In addition, by referring to the Beta and Gamma distributions, we are able to introduce a differentiation feature in each spatial unit without incurring any a priori definition of territorial units. We also provide an empirical application of our theoretical setting to study the density distribution of the population across Massachusetts.
Abstract:
Forensic scientists working in 12 state or private laboratories participated in collaborative tests to improve the reliability of the presentation of DNA data at trial. These tests were motivated by the growing criticism of the power of DNA evidence. The experts' conclusions in the tests are presented and discussed in the context of the Bayesian approach to interpretation. The use of a Bayesian approach and subjective probabilities in trace evaluation permits, in an easy and intuitive manner, the integration into the decision procedure of any revision of the measure of uncertainty in the light of new information. Such an integration is especially useful with forensic evidence. Furthermore, we believe that this probabilistic model is a useful tool (a) to assist scientists in assessing the value of scientific evidence, (b) to help jurists interpret judicial facts and (c) to clarify the respective roles of scientists and of members of the court. Respondents to the survey were reluctant to apply this methodology in the assessment of DNA evidence.
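The Bayesian revision of uncertainty the abstract describes reduces to the odds form of Bayes' theorem: posterior odds = likelihood ratio x prior odds. A minimal sketch follows; the function name and the prior and likelihood-ratio values are invented for illustration, not taken from the study.

```python
def update_with_evidence(prior, likelihood_ratio):
    """Posterior probability of a hypothesis after observing evidence
    with the given likelihood ratio (Bayes' theorem in odds form)."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = likelihood_ratio * prior_odds
    return posterior_odds / (1.0 + posterior_odds)

# a weak prior revised by strong trace evidence (illustrative numbers)
p = update_with_evidence(prior=0.001, likelihood_ratio=100_000)
print(round(p, 4))  # prints 0.9901
```

Each new piece of evidence can be chained through the same update, which is what makes the approach attractive for sequential trace evaluation.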
Abstract:
Breast cancer is one of the most common cancers, affecting one in eight women during their lives. Survival rates have increased steadily thanks to early diagnosis with mammography screening and more efficient treatment strategies. Post-operative radiation therapy is a standard of care in the management of breast cancer and has been shown to efficiently reduce both the local recurrence rate and breast cancer mortality. Radiation therapy is, however, associated with some late effects for long-term survivors. Radiation-induced secondary cancer is a relatively rare but severe late effect of radiation therapy. Currently, radiotherapy plans are essentially optimized to maximize tumor control and minimize late deterministic effects (tissue reactions), which are mainly associated with high doses (> 1 Gy). With improved cure rates and new radiation therapy technologies, it is also important to evaluate and minimize secondary cancer risks for different treatment techniques. This is a particularly challenging task due to the large uncertainties in the dose-response relationship. In contrast with late deterministic effects, secondary cancers may be associated with much lower doses, and therefore out-of-field doses (also called peripheral doses), which are typically below 1 Gy, need to be determined accurately. Out-of-field doses result from patient scatter and head scatter from the treatment unit. These doses are particularly challenging to compute, and we characterized them by Monte Carlo (MC) calculation. A detailed MC model of the Siemens Primus linear accelerator has been thoroughly validated with measurements. We investigated the accuracy of such a model for retrospective dosimetry in epidemiological studies on secondary cancers. Considering that patients in such large studies could be treated on a variety of machines, we assessed the uncertainty in reconstructed peripheral dose due to the variability of peripheral dose among various linac geometries.
For large open fields (> 10 x 10 cm²), the uncertainty would be less than 50%, but for small fields and wedged fields the uncertainty in the reconstructed dose could rise up to a factor of 10. It was concluded that such a model could be used only for conventional treatments with large open fields. The MC model of the Siemens Primus linac was then used to compare out-of-field doses for different treatment techniques in a female whole-body CT-based phantom. Current techniques such as conformal wedge-based radiotherapy and hybrid IMRT were investigated and compared to older two-dimensional radiotherapy techniques. MC doses were also compared to those of a commercial Treatment Planning System (TPS). While the TPS is routinely used to determine the dose to the contralateral breast and the ipsilateral lung, which are mostly out of the treatment fields, we have shown that these doses may be highly inaccurate depending on the treatment technique investigated. MC shows that hybrid IMRT is dosimetrically similar to three-dimensional wedge-based radiotherapy within the field, but offers substantially reduced doses to out-of-field healthy organs. Finally, many different approaches to risk estimation from the literature were applied to the calculated MC dose distributions. Absolute risks varied substantially, as did the ratio of risks between two treatment techniques, reflecting the large uncertainties involved with current risk models. Despite all these uncertainties, the hybrid IMRT technique investigated resulted in systematically lower cancer risks than any of the other treatment techniques. More epidemiological studies with accurate dosimetry are required in the future to construct robust risk models. In the meantime, any treatment strategy that reduces out-of-field doses to healthy organs should be investigated. Electron radiotherapy might offer interesting possibilities in this regard.
Abstract:
We have established the methodological and theoretical foundations to investigate the question "Do stateless nations have the right to control their own communication space?". The research adapts the concept of a communication space to political theory, seeking its limits in individual rights and, from the perspective of liberalism 2, providing the justification for its control insofar as it is a platform that bears on the preservation and survival of a national culture. The first article and phase of the thesis is the adaptation and definition of the concept of communication space. So far, research has proposed different models of communication space, depending on whether the emphasis is on the distribution and production of material marked with the symbols of the national identity of the emitting society, or on the idea of a space of circulation of communicative flows fitted to a territory traditionally linked to a national identity or stateless nation. Likewise, a distinction is drawn between the emission dimension (flows from the territory out to the world) and the reception dimension (information flows received from the world into the territory, specifically by the citizen); the role of intervention of democratic institutions differs between the two dimensions, and therefore so do the rights affected and the theories or principles that deny or justify control of the communication space. The theories on the cognitive effects of the media have also been examined in order to relate them to nation building as symbolic and cultural cohesion. While the media cannot change minds immediately, they can shape a general national perception over the long term. A community is imagined, given the physical distance between its members, and social communication is, together with education, the main factor of nation building today.
Abstract:
In this study I try to explain the systemic problem of the low economic competitiveness of nuclear energy for the production of electricity by carrying out a biophysical analysis of its production process. Given that neither econometric approaches nor one-dimensional methods of energy analysis are effective, I introduce the concept of biophysical explanation as a quantitative analysis capable of handling the inherent ambiguity associated with the concept of energy. In particular, the quantities of energy considered relevant for the assessment can only be measured and aggregated after having agreed on a pre-analytical definition of a grammar characterizing a given set of finite transformations. Using this grammar it becomes possible to provide a biophysical explanation for the low economic competitiveness of nuclear energy in the production of electricity. When comparing the various unit operations of the process of production of electricity with nuclear energy to the analogous unit operations of the process of production of fossil energy, we see that the various phases of the process are the same. The only difference relates to the characteristics of the process associated with the generation of heat, which are completely different in the two systems. Since the cost of production of fossil energy provides the baseline of economic competitiveness of electricity, the (lack of) economic competitiveness of the production of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. In particular, the analysis focuses on fossil-fuel requirements and labor requirements for those phases that both nuclear plants and fossil energy plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes.
By adopting this approach, it becomes possible to explain the systemic low economic competitiveness of nuclear energy in the production of electricity, because of: (i) its dependence on oil, limiting its possible role as a carbon-free alternative; (ii) the choices made in relation to its fuel cycle, especially whether it includes reprocessing operations or not; (iii) the unavoidable uncertainty in the definition of the characteristics of its process; (iv) its large inertia (lack of flexibility) due to issues of time scale; and (v) its low power level.
Abstract:
In legal medicine, the post mortem interval (PMI) of interest covers the last 50 years. When only human skeletal remains are found, determining the PMI currently relies mostly on the experience of the forensic anthropologist, with few techniques available to help. Recently, several radiometric methods have been proposed to reveal the PMI. For instance, (14)C and (90)Sr bomb pulse dating cover the last 60 years and give reliable PMIs when teeth or bones are available. (232)Th series dating has also been proposed but requires a large amount of bone. In addition, (210)Pb dating is promising but is subject to diagenesis and to individual habits such as smoking, which must be handled carefully. Here we determine the PMI in 29 cases of forensic interest using the (90)Sr bomb pulse. In 12 cases, (210)Pb dating was added to narrow the PMI interval. In addition, anthropological investigations were carried out in 15 cases to compare anthropological expertise with the radiometric method. Results show that 10 of the 29 cases can be discarded as having no forensic interest (PMI > 50 years) based only on (90)Sr bomb pulse dating. For 10 other cases, the additional (210)Pb dating restricts the PMI uncertainty to a few years. In 15 cases, the anthropological investigations corroborate the radiometric PMI. This study also shows that diagenesis and inter-individual differences in radionuclide uptake represent the main sources of uncertainty in PMI determination using radiometric methods.
Abstract:
Climate change in the 21st century is a reality; there is abundant scientific evidence that warming of the climate system is unequivocal. Nevertheless, many uncertainties remain regarding the impacts that this global climate change may bring. The objective of this project is to study the possible future evolution of three climate variables, namely the near-surface diurnal temperature range (DTR), the near-surface mean temperature (MT) and monthly precipitation (PL_mes), and to assess the exposure that different land covers and different biogeographical regions of the European continent may experience under these possible patterns of change. To this end, Global Climate Models were used, which project climate variables and thereby allow the possible future climate to be anticipated. Using the Tetyn software application, the climate parameters were extracted from the datasets of the Tyndall Centre for Climate Change Research, for the future (TYN SC) and the past (CRU TS). The variables obtained were processed with geographic information system (GIS) tools to obtain the patterns of change of the variables for each land cover. The results show great variability, increasing over time, among the different climate models and scenarios considered, which highlights the uncertainty associated with climate modelling, with the generation of emissions scenarios, and with the dynamic and non-deterministic nature of the climate system. In general, however, they show that glaciers will be one of the land covers most exposed to climate change, and the Mediterranean one of the most vulnerable regions.
Abstract:
BACKGROUND AND OBJECTIVE: Deciding about treatment goals at the end of life is a frequent and difficult challenge for medical staff. As more health care institutions issue ethico-legal guidelines to their staff, the effects of such a guideline were investigated in a pilot project. PARTICIPANTS AND METHODS: Prospective evaluation study using the pre-post method. Physicians and nurses working in ten intensive care units of a university medical center in Germany answered a specially designed questionnaire before and one year after issuance of the guideline. RESULTS: 197 analyzable answers were obtained from the first (pre-guideline) and 251 from the second (post-guideline) survey (54% and 58% response rates, respectively). Initially the clinicians expressed their need for guidelines, advice on ethical problems, and continuing education. One year after introduction of the guideline, one third of the clinicians were familiar with the guideline's content and another third were aware of its existence. 90% of those who knew the document welcomed it. Explanation of the legal aspects was seen as its most useful element. The pre- and post-guideline comparison demonstrated that uncertainty in decision making and fear of legal consequences were reduced, while knowledge of legal aspects and the value given to advance directives increased. The residents derived the greatest benefit. CONCLUSION: By promoting knowledge of legal aspects and ethical considerations, guidelines given to medical staff can lead to greater certainty in end-of-life decision making.
Abstract:
The main subject that has guided the research of the past three years has been archaeometallurgy. This report is structured in three parts: the initial objectives, the work carried out each year, and the results of my research. Naturally, it is in this last section that the research work is emphasised and the three main areas of activity are set out. The first of them, a necessary one, is theoretical and methodological training in archaeometallurgy. The second is the study of elemental composition in the Ha B2 societies of western Europe. The last comprises the ongoing research on Argaric metallurgical production. A selection of annexes is also attached, with reports and publications written during the fellowship that offer detailed views of the different elements that make up the research.
Abstract:
We prove two-sided inequalities between the integral moduli of smoothness of a function on R^d / T^d and the weighted tail-type integrals of its Fourier transform/series. Sharpness of the obtained results is shown in particular by equivalence results for functions satisfying certain regularity conditions. Applications include a quantitative form of the Riemann-Lebesgue lemma as well as several other questions in approximation theory and the theory of function spaces.
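For reference, the kth integral modulus of smoothness appearing in such inequalities is the standard one (stated here for the reader's convenience; the paper's specific weights and tail-type integrals are not reproduced):

```latex
\omega_k(f, t)_p \;=\; \sup_{0 < |h| \le t} \bigl\| \Delta_h^k f \bigr\|_{L^p}, \qquad
\Delta_h^k f(x) \;=\; \sum_{j=0}^{k} (-1)^{k-j} \binom{k}{j}\, f(x + jh).
```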
Abstract:
The increase in mortality risk associated with long-term exposure to particulate air pollution is one of the most important, and best-characterised, effects of air pollution on health. This report presents estimates of the size of this effect on mortality in local authority areas in the UK, building upon the attributable fractions reported as an indicator in the public health outcomes framework for England. It discusses the concepts and assumptions underlying these calculations and gives information on how such estimates can be made. The estimates are expected to be useful to health and wellbeing boards when assessing local public health priorities, as well as to others working in the field of air quality and public health. The estimates of mortality burden are based on modelled annual average concentrations of fine particulate matter (PM2.5) in each local authority area originating from human activities. Local data on the adult population and adult mortality rates are also used. Central estimates of the fraction of mortality attributable to long-term exposure to current levels of anthropogenic (human-made) particulate air pollution range from around 2.5% in some local authorities in rural areas of Scotland and Northern Ireland, and between 3 and 5% in Wales, to over 8% in some London boroughs. Because of uncertainty in the increase in mortality risk associated with ambient PM2.5, the actual burdens associated with these modelled concentrations could range from approximately one-sixth to about double these figures. Thus, current levels of particulate air pollution have a considerable impact on public health. Measures to reduce levels of particulate air pollution, or to reduce exposure of the population to such pollution, are regarded as an important public health initiative.
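The attributable-fraction calculation described above can be sketched in a few lines. A 6% relative-risk increase per 10 µg/m³ of PM2.5 is the coefficient commonly cited for these UK burden estimates, but treat it, the function names and the example numbers as illustrative assumptions rather than the report's exact method:

```python
def attributable_fraction(pm25, rr_per_10=1.06):
    """Fraction of adult mortality attributable to PM2.5, assuming a
    log-linear concentration-response: RR = rr_per_10 ** (C / 10)."""
    rr = rr_per_10 ** (pm25 / 10.0)
    return (rr - 1.0) / rr

def attributable_deaths(pm25, adult_deaths, rr_per_10=1.06):
    """Mortality burden = attributable fraction x observed adult deaths."""
    return attributable_fraction(pm25, rr_per_10) * adult_deaths

# illustrative local authority: 10 ug/m3 anthropogenic PM2.5, 5000 adult deaths
print(round(attributable_fraction(10.0) * 100, 1))  # prints 5.7 (percent)
print(round(attributable_deaths(10.0, 5000)))       # prints 283
```

The one-sixth-to-double uncertainty range quoted in the report corresponds to varying `rr_per_10` across its confidence interval.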
Abstract:
Power law distributions, a well-known model in the theory of real random variables, characterize a wide variety of natural and man-made phenomena. The intensity of earthquakes, word frequencies, solar flares and the sizes of power outages are distributed according to a power law distribution. Recently, given the usage of power laws in the scientific community, several articles have been published criticizing the statistical methods used to estimate power law behaviour and establishing new techniques for its estimation with proven reliability. The main object of the present study is to gain a deeper understanding of this kind of distribution and its analysis, and to introduce the half-lives of the radioactive isotopes as a new candidate in nature following a power law distribution, as well as a "canonical laboratory" to test statistical methods appropriate for long-tailed distributions.
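One of the estimation techniques that the more recent literature converged on is the continuous maximum-likelihood estimator of the exponent; a minimal sketch on synthetic data follows (the function name is invented, and this is the standard MLE rather than anything specific to this study):

```python
import math
import random

def powerlaw_alpha_mle(xs, xmin):
    """Continuous power-law exponent MLE:
    alpha = 1 + n / sum(ln(x_i / xmin)), over the tail x_i >= xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# synthetic sample from a power law with alpha = 2.5, xmin = 1,
# drawn via inverse-CDF sampling: x = xmin * u ** (-1 / (alpha - 1))
random.seed(42)
xs = [(1.0 - random.random()) ** (-1.0 / 1.5) for _ in range(20_000)]
print(round(powerlaw_alpha_mle(xs, 1.0), 2))  # close to 2.5
```

Unlike least-squares fits to log-log histograms (one of the criticized methods), this estimator is consistent and has a known standard error of roughly (alpha - 1) / sqrt(n).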
Abstract:
This paper studies the limits of discrete-time repeated games with public monitoring. We solve and characterize the Abreu, Milgrom and Pearce (1991) problem. We find that in the "bad" ("good") news model the lower (higher) magnitude events suggest cooperation, i.e., zero punishment probability, while the higher (lower) magnitude events suggest defection, i.e., punishment with probability one. Public correlation is used to connect these two sets of signals and to make the enforceability constraint bind. The dynamic and limit behavior of the punishment probabilities for variations in ... (the discount rate) and ... (the time interval) are characterized, as well as the limit payoffs for all these scenarios (we also introduce uncertainty in the time domain). The obtained ... limits are, to the best of my knowledge, new. The obtained ... limits coincide with Fudenberg and Levine (2007) and Fudenberg and Olszewski (2011), with the exception that we clearly state the precise informational conditions that cause the limit to converge from above, to converge from below, or to degenerate. JEL: C73, D82, D86. KEYWORDS: Repeated Games, Frequent Monitoring, Random Public Monitoring, Moral Hazard, Stochastic Processes.
Abstract:
It can be assumed that the composition of Mercury's thin gas envelope (exosphere) is related to the composition of the planet's crustal materials. If this relationship is true, then inferences regarding the bulk chemistry of the planet might be made from a thorough exospheric study. The most vexing of all unsolved problems is the uncertainty in the source of each component. Historically, it has been believed that H and He come primarily from the solar wind, while Na and K originate from volatilized materials partitioned between Mercury's crust and meteoritic impactors. The processes that eject atoms and molecules into the exosphere of Mercury are generally considered to be thermal vaporization, photon-stimulated desorption (PSD), impact vaporization, and ion sputtering. Each of these processes has its own temporal and spatial dependence. The exosphere is strongly influenced by Mercury's highly elliptical orbit and rapid orbital speed. As a consequence the surface undergoes large fluctuations in temperature and experiences differences of insolation with longitude. We will discuss these processes but focus more on the expected surface composition and solar wind particle sputtering, which releases material like Ca and other elements from the surface minerals, and discuss the relevance of composition modelling.