998 results for application modalities
Application of standard and refined heat balance integral methods to one-dimensional Stefan problems
Abstract:
The work in this paper concerns the study of conventional and refined heat balance integral methods for a number of phase change problems. These include standard test problems, with one and with two phase changes, which have exact solutions that enable us to test the accuracy of the approximate solutions. We also consider situations where no analytical solution is available and compare these to numerical solutions. It is popular to use a quadratic profile as an approximation to the temperature, but we show that a cubic profile, seldom considered in the literature, is far more accurate in most circumstances. In addition, the refined integral method can give greater improvement still, and we develop a variation on this method which turns out to be optimal in some cases. We assess which integral method is better for various problems, showing that the answer depends largely on the specified boundary conditions.
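As a concrete illustration of the benchmarking step, the exact (Neumann) similarity solution of the classical one-phase Stefan melting problem can be computed by solving its transcendental equation numerically. This is a minimal stdlib-only sketch, not the authors' code; `stefan` denotes the Stefan number and `alpha` the thermal diffusivity.

```python
import math

def neumann_lambda(stefan, lo=1e-9, hi=5.0, tol=1e-12):
    """Solve lambda * exp(lambda^2) * erf(lambda) = St / sqrt(pi)
    by bisection; the left-hand side is increasing in lambda."""
    f = lambda lam: lam * math.exp(lam**2) * math.erf(lam) - stefan / math.sqrt(math.pi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def front_position(stefan, alpha, t):
    """Exact melt-front position s(t) = 2 * lambda * sqrt(alpha * t)."""
    return 2.0 * neumann_lambda(stefan) * math.sqrt(alpha * t)
```

Approximate front positions obtained from a quadratic or cubic heat balance integral profile can then be compared against `front_position` to quantify their accuracy.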
Abstract:
This paper investigates vulnerability to poverty in Haiti. Research on vulnerability in developing countries has been scarce owing to the heavy data requirements of vulnerability studies (e.g. panel data or long series of cross-sections). The methodology adopted here allows vulnerability to poverty to be assessed by exploiting the short panel structure of data nested at different levels. The decomposition method reveals that vulnerability in Haiti is largely a rural phenomenon and that schooling correlates negatively with vulnerability. Most importantly, among the different shocks affecting household income, meso-level shocks are found to be in general far more important than covariate shocks. This finding points to interesting policy implications for decentralized policies to alleviate vulnerability to poverty.
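The paper's multilevel estimator is not reproduced here, but the basic object in this literature, vulnerability as the probability that consumption falls below the poverty line given an estimated mean and variance of log consumption, can be sketched as follows (a stdlib-only illustration in the spirit of standard vulnerability-to-poverty measures; the function names are ours):

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def vulnerability(expected_log_c, sd_log_c, poverty_line):
    """P(consumption < poverty line), assuming log-normal consumption
    with the given mean and standard deviation of log consumption."""
    return normal_cdf((math.log(poverty_line) - expected_log_c) / sd_log_c)
```

Households are then classified as vulnerable when this probability exceeds a chosen threshold; note that, for a household whose expected consumption sits above the poverty line, a larger variance (more exposure to shocks) raises vulnerability.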
Abstract:
ABSTRACT: BACKGROUND: Cardiovascular magnetic resonance (CMR) has favorable characteristics for diagnostic evaluation and risk stratification of patients with known or suspected CAD, and its utilization in CAD detection is growing fast. However, data on its cost-effectiveness are scarce. The goal of this study is to compare the costs of two strategies for detecting significant coronary artery stenoses in patients with suspected coronary artery disease (CAD): 1) performing CMR first to assess myocardial ischemia and/or infarct scar, and referring only positive patients (defined by the presence of ischemia and/or infarct scar) to coronary angiography (CXA), versus 2) a hypothetical CXA performed in all patients as a single test to detect CAD. METHODS: A subgroup of the European CMR pilot registry was used, including 2,717 consecutive patients who underwent stress CMR. Of these patients, 21% were positive for CAD (ischemia and/or infarct scar), 73% were negative, and 6% were uncertain and underwent additional testing. Diagnostic costs were evaluated using the invoicing costs of each test performed. The cost analysis was performed from a health care payer perspective in the German, United Kingdom, Swiss, and United States health care settings. RESULTS: In the public sectors of the German, United Kingdom, and Swiss health care systems, cost savings from the CMR-driven strategy were 50%, 25%, and 23%, respectively, versus outpatient CXA. If CXA was carried out as an inpatient procedure, cost savings were 46%, 50%, and 48%, respectively. In the United States context, cost savings were 51% compared with inpatient CXA, but CMR was 8% more expensive than outpatient CXA. CONCLUSION: This analysis suggests that, from an economic perspective, the use of CMR should be encouraged as a management option for patients with suspected CAD.
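The cost comparison reduces to simple expected-value arithmetic per patient. The sketch below uses hypothetical unit costs (the registry's actual invoicing costs are not reproduced in the abstract); only the 21% positivity and 6% uncertainty rates come from the study.

```python
def cost_cmr_first(p_pos, p_uncertain, c_cmr, c_cxa, c_extra):
    """Expected cost per patient under the CMR-first strategy:
    everyone gets CMR; positives proceed to CXA; uncertain cases
    (simplistically) incur an additional-testing cost."""
    return c_cmr + p_pos * c_cxa + p_uncertain * c_extra

def cost_cxa_all(c_cxa):
    """Expected cost per patient when CXA is performed in all patients."""
    return c_cxa

# Hypothetical unit costs (not taken from the registry):
saving = 1.0 - cost_cmr_first(0.21, 0.06, 600, 2500, 800) / cost_cxa_all(2500)
```

With these invented prices the CMR-first strategy saves roughly half the cost of universal CXA, the same order of magnitude the study reports for inpatient CXA settings.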
Abstract:
Since 1895, when X-rays were discovered, ionizing radiation has been part of our life. Its use in medicine has brought significant health benefits to populations globally. The benefit of any diagnostic procedure is to reduce uncertainty about the patient's health. However, radiation exposure has potential detrimental effects, and radiation protection authorities have therefore become strict regarding the control of radiation risks.

There are various situations where the radiation risk needs to be evaluated. International authorities point to the increasing number of radiologic procedures and recommend population surveys. These surveys provide valuable data to public health authorities, helping them to prioritize and focus on the patient groups that are most highly exposed. Physicians, in turn, need to be aware of the radiation risks of diagnostic procedures in order to justify and optimize each procedure and inform the patient.

The aim of this work was to examine the different aspects of radiation protection and to investigate a new method for estimating patient radiation risks.

The first part of this work concerned radiation risk assessment from the regulatory authority's point of view. A population dose survey was performed to evaluate the annual population exposure. This survey determined the contribution of the different imaging modalities to the total collective dose, as well as the annual effective dose per caput. It revealed that although interventional procedures are not frequent, they contribute significantly to the collective dose. Among the main results of this work, interventional cardiology procedures were shown to be dose-intensive, so more attention should be paid to optimizing this exposure.

The second part of the project was related to patient- and physician-oriented risk assessment. In this part, interventional cardiology procedures were studied by means of Monte Carlo simulations. Organ radiation doses as well as effective doses were estimated. Cancer incidence risks for different organs were calculated for each sex and age at exposure using the lifetime attributable risks provided by the Biological Effects of Ionizing Radiation (BEIR) VII report. The advantages and disadvantages of the latter results were examined as an alternative method of estimating radiation risks. The results show that this method is the most accurate currently available for estimating radiation risks. The conclusions of this work may guide future studies in the field of radiation protection in medicine.
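The survey quantities in the first part, collective dose and annual effective dose per caput, follow from simple sums over exam types. A minimal sketch with invented frequencies and per-exam doses (the thesis' actual survey data are not reproduced here):

```python
def collective_dose_man_sv(exam_stats):
    """Collective effective dose S = sum_i n_i * E_i over exam types,
    where n_i is the annual exam count and E_i the mean effective dose
    per exam in mSv; the result is converted from man-mSv to man-Sv."""
    return sum(n * e for _, n, e in exam_stats) / 1000.0

# Hypothetical annual frequencies and mean effective doses (mSv):
exam_stats = [
    ("chest radiography",          1_000_000, 0.1),
    ("abdominal CT",                 150_000, 8.0),
    ("interventional cardiology",     20_000, 15.0),
]
population = 5_000_000
S = collective_dose_man_sv(exam_stats)    # man-Sv
per_caput_msv = 1000.0 * S / population   # annual effective dose per caput, mSv
```

With these invented numbers, interventional cardiology accounts for under 2% of the exams but almost a fifth of the collective dose, which is exactly the pattern the survey highlights.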
Abstract:
This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is increasingly used by scholars and practitioners. This article aims to offer those wishing to engage with securitization an alternative application of the theory, one that is sensitive to, and self-reflective about, the possible normative consequences of its employment. It argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes that have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formulation of securitization theory gives rise to a normative dilemma, situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles each proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration are prominent throughout.
Abstract:
Background: Intranasal administration of high amounts of allergen has been shown to induce tolerance and to reverse the allergic phenotype. However, the mechanisms of tolerance induction via the mucosal route are still unclear. Objectives: To characterize the therapeutic effects of intranasal application of ovalbumin (OVA) in a mouse model of bronchial inflammation, as well as the cellular and molecular mechanisms leading to protection upon re-exposure to allergen. Methods: After induction of bronchial inflammation, mice were treated intranasally with OVA and re-exposed to OVA aerosols 10 days later. Bronchoalveolar lavage fluid (BALF), T cell proliferation and cytokine secretion were examined. The respective roles of CD4(+)CD25(+) and CD4(+)CD25(-) T cells in the induction of tolerance were analysed. Results: Intranasal treatment with OVA drastically reduced inflammatory cell recruitment into BALF and bronchial hyperresponsiveness upon re-exposure to allergen. OVA-specific T cell proliferation and T(h)1 and T(h)2 cytokine production from lung and bronchial lymph nodes were both inhibited. Transfer of CD4(+)CD25(-) T cells, which strongly expressed membrane-bound transforming growth factor beta (mTGF beta), from tolerized mice protected asthmatic recipient mice from subsequent aerosol challenges. The presence of CD4(+)CD25(+)(Foxp3(+)) T cells during the process of tolerization was indispensable for CD4(+)CD25(-) T cells to acquire regulatory properties. Whereas the presence of IL-10 appeared dispensable in this model, suppression of CD4(+)CD25(-)mTGF beta(+) T cells in transfer experiments significantly impaired the down-regulation of airway inflammation. Conclusion: Nasal application of OVA in established asthma led to the induction of CD4(+)CD25(-)mTGF beta(+) T cells with regulatory properties, able to confer protection upon allergen re-exposure.
Abstract:
Since the management of atrial fibrillation may be difficult in the individual patient, our purpose was to develop simple clinical recommendations to help the general internist manage this common clinical problem. We performed a systematic review of the literature, with evaluation of the evidence and framing of graded recommendations. Atrial fibrillation affects some 1% of the population in Western countries and is linked to a significant increase in morbidity and mortality. Its management requires individualised evaluation of the risks and benefits of the therapeutic modalities, relying whenever possible on simple and validated tools. The two main decision points in clinical management are 1) whether or not to implement thromboembolic prevention therapy, and 2) whether preference should be given to a "rate control" or a "rhythm control" strategy. Thromboembolic prophylaxis should be prescribed after individualised risk assessment: for patients at risk, oral anticoagulation with warfarin decreases the rate of embolic complications by 60% and aspirin by 20%, at the expense of an increased incidence of haemorrhagic complications. "Rate control" and "rhythm control" strategies are probably equivalent, and the choice should also be made on an individualised basis. On the evidence from the literature, we propose specific tables and algorithms, with graded recommendations, to assist the physician in the clinical management of the individual atrial fibrillation patient.
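The abstract does not name the risk tools it tabulates; one widely used validated score for individualised thromboembolic risk assessment in atrial fibrillation is CHADS2, sketched here purely as an illustration and not as the authors' algorithm:

```python
def chads2(chf, hypertension, age, diabetes, prior_stroke_or_tia):
    """CHADS2 thromboembolic risk score: 1 point each for congestive
    heart failure, hypertension, age >= 75 and diabetes; 2 points for
    prior stroke or TIA. Higher scores favour oral anticoagulation."""
    score = int(chf) + int(hypertension) + int(age >= 75) + int(diabetes)
    score += 2 * int(prior_stroke_or_tia)
    return score
```

A score of 0 typically argues against anticoagulation, while higher scores tilt the individualised risk-benefit balance toward warfarin despite the haemorrhagic risk.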
Abstract:
Detailed field mapping and paleontological dating in the central and southeastern Nicoya Peninsula have revealed Late Cretaceous and Paleogene radiolarian-bearing siliceous mudstones. These rocks belong to two terranes (Matambú and Manzanillo) that are partially contemporaneous with the Nicoya Complex but genetically different from it. While the Nicoya Complex is formed exclusively of intraplate igneous rocks with associated radiolarites, the studied sections include variable amounts of arc-derived volcanic and terrigenous materials. These fore-arc terranes include mafic to intermediate volcaniclastics and associated pelagic and hemipelagic rocks rich in biogenic silica. Radiolarian preservation in these sediments is often enhanced by the presence of silica-saturated volcanic tuffs and debris. Seven out of 29 samples from different outcrops yielded relatively well-preserved radiolarian faunas. In total, these faunas comprise 60 species belonging to 34 genera, ranging in age from middle Turonian-Santonian to late Thanetian-Ypresian.
Abstract:
Analysis of gas emissions by the input-output subsystem approach provides detailed insight into pollution generation in an economy. Structural decomposition analysis, on the other hand, identifies the factors behind the changes in key variables over time. Extending the input-output subsystem model to account for the changes in these variables reveals the channels by which environmental burdens are caused and transmitted throughout the production system. In this paper we propose a decomposition of the changes in the components of CO2 emissions captured by an input-output subsystems representation. The empirical application is to the Spanish service sector, with economic and environmental data for the years 1990 and 2000. Our results show that services increased their CO2 emissions mainly because of a rise in the emissions generated by non-services to cover the final demand for services. In all service activities, the decomposed effects show a decrease in CO2 emissions due to falling emission coefficients (i.e., emissions per unit of output), compensated by an increase in emissions driven both by changes in the input-output coefficients and by the rise in demand for services. Finally, large asymmetries exist not only in the quantitative changes in the CO2 emissions of the various services but also in the decomposed effects of these changes. Keywords: structural decomposition analysis; input-output subsystems; CO2 emissions; service sector.
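The mechanics behind such a decomposition can be illustrated on a toy two-sector economy. The sketch below (our own stdlib-only illustration, not the paper's subsystem model) decomposes the change in emissions E = f' L y into emission-intensity, technology (Leontief) and final-demand effects, averaging the two polar decompositions so that the effects sum exactly to the total change:

```python
def mat_vec(A, x):
    """Matrix-vector product for nested-list matrices."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def leontief_inverse_2x2(A):
    """L = (I - A)^-1 for a 2x2 technical-coefficient matrix A."""
    a, b = 1 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1 - A[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def emissions(f, L, y):
    """Total emissions E = f' L y (f: emission coefficients,
    L: Leontief inverse, y: final demand)."""
    return dot(f, mat_vec(L, y))

def sda_polar(f0, L0, y0, f1, L1, y1):
    """Average of the two polar decompositions of dE = E1 - E0 into
    emission-intensity, technology and final-demand effects."""
    df = [b - a for a, b in zip(f0, f1)]
    dL = [[b - a for a, b in zip(r0, r1)] for r0, r1 in zip(L0, L1)]
    dy = [b - a for a, b in zip(y0, y1)]
    eff_f = 0.5 * (emissions(df, L0, y0) + emissions(df, L1, y1))
    eff_L = 0.5 * (emissions(f1, dL, y0) + emissions(f0, dL, y1))
    eff_y = 0.5 * (emissions(f1, L1, dy) + emissions(f0, L0, dy))
    return eff_f, eff_L, eff_y
```

The two-polar average is exact for three factors: the three effects add up to the observed change in emissions, with no residual term.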
Abstract:
Sm15 and Sm13 are recognized by antibodies from mice protectively vaccinated with tegumental membranes, suggesting a potential role in protective immunity. In order to raise antibodies for immunochemical investigations, the genes for these antigens were expressed in pGEX and pMal vectors so that comparisons could be made among different expression systems and different genes. The fusion proteins corresponding to several parts of the gene for the precursor of Sm15 failed to produce antibodies recognizing the parasite counterpart. On the other hand, antibodies raised against Sm13 MBP-fusion proteins recognized the 13 kDa tegumental protein. Thus the peculiarities of the gene of interest are important, and the choice of expression system must sometimes be made on an empirical basis.
Abstract:
In recent years traditional inequality measures have been used quite extensively to examine the international distribution of environmental indicators. A key characteristic of these measures is that each assigns different weights to changes occurring in different sections of the variable's distribution; consequently, the results they yield can be very different. Hence, we suggest using a range of well-recommended measures to achieve more robust results. We also provide an empirical test of the comparative behaviour of several suitable inequality measures and environmental indicators. Our findings support the hypothesis that in some cases the measures differ in both the sign and the size of the estimated evolution. JEL codes: D39; Q43; Q56. Keywords: international environment factor distribution; Kaya factors; inequality measurement
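The point that different inequality measures weight parts of the distribution differently can be made concrete with three standard measures (a stdlib-only sketch; the paper's exact measure set is not specified here):

```python
import math

def gini(x):
    """Gini coefficient: mean absolute difference over twice the mean."""
    n, mu = len(x), sum(x) / len(x)
    return sum(abs(a - b) for a in x for b in x) / (2 * n * n * mu)

def theil_t(x):
    """Theil T index, more sensitive to the top of the distribution."""
    mu = sum(x) / len(x)
    return sum((v / mu) * math.log(v / mu) for v in x) / len(x)

def atkinson_1(x):
    """Atkinson index with inequality aversion epsilon = 1:
    one minus the ratio of the geometric mean to the arithmetic mean."""
    mu = sum(x) / len(x)
    geo = math.exp(sum(math.log(v) for v in x) / len(x))
    return 1 - geo / mu
```

Ranking countries by per-capita emissions, say, under each measure and comparing the trends over time is the kind of robustness exercise the paper advocates.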
Abstract:
In a recent paper, Bermúdez [2009] used bivariate Poisson regression models for ratemaking in car insurance, including zero-inflated models to account for the excess of zeros and the overdispersion in the data set. In the present paper, we revisit this model in order to consider alternatives. We propose a 2-finite mixture of bivariate Poisson regression models to demonstrate that the overdispersion in the data requires more structure if it is to be taken into account, and that a simple zero-inflated bivariate Poisson model does not suffice. At the same time, we show that a finite mixture of bivariate Poisson regression models embraces the zero-inflated bivariate Poisson regression model as a special case. Additionally, we describe a model in which the mixing proportions depend on covariates, modelling the way in which each individual belongs to a separate cluster. Finally, an EM algorithm is provided to fit the models. These models are applied to the same automobile insurance claims data set as in Bermúdez [2009], and it is shown that the modelling of the data set can be improved considerably.
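The building block of all these models is the bivariate Poisson distribution obtained by trivariate reduction. The EM fitting procedure is not reproduced here; the sketch below (stdlib-only, our own naming) gives the probability mass function and its zero-inflated variant:

```python
import math

def bivariate_poisson_pmf(x, y, l1, l2, l3):
    """P(X=x, Y=y) for X = X1 + X3, Y = X2 + X3, with X1, X2, X3
    independent Poisson with means l1, l2, l3 (trivariate reduction).
    The shared component X3 induces correlation: Cov(X, Y) = l3."""
    base = math.exp(-(l1 + l2 + l3)) * l1**x * l2**y / (math.factorial(x) * math.factorial(y))
    s = sum(
        math.comb(x, k) * math.comb(y, k) * math.factorial(k) * (l3 / (l1 * l2))**k
        for k in range(min(x, y) + 1)
    )
    return base * s

def zi_bivariate_poisson_pmf(x, y, p, l1, l2, l3):
    """Zero-inflated variant: with probability p the pair is (0, 0)."""
    zero_mass = p if (x, y) == (0, 0) else 0.0
    return zero_mass + (1 - p) * bivariate_poisson_pmf(x, y, l1, l2, l3)
```

A 2-finite mixture replaces the single pmf by a weighted sum of two such pmfs with different parameters; the zero-inflated model is recovered when one mixture component degenerates to a point mass at (0, 0).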