926 results for Static average-case analysis


Relevance: 40.00%

Abstract:

In recent years, some epidemiologic studies have attributed adverse health effects not only to particles and sulfur dioxide but also to photochemical air pollutants (nitrogen dioxide and ozone). The effects are usually small, leading to some inconsistencies in the results of the studies, and the differing methodologic approaches used across studies have made it difficult to derive generic conclusions. We provide here a quantitative summary of the short-term effects of photochemical air pollutants on mortality in the seven Spanish cities involved in the EMECAM project, using generalized additive models in single- and multiple-pollutant analyses. Nitrogen dioxide and ozone data were provided by seven EMECAM cities (Barcelona, Gijón, Huelva, Madrid, Oviedo, Seville, and Valencia). Mortality indicators included daily total mortality from all causes excluding external causes, daily cardiovascular mortality, and daily respiratory mortality. Individual estimates, obtained from city-specific generalized additive Poisson autoregressive models, were combined by means of fixed effects models and, if significant heterogeneity among local estimates was found, also by random effects models. Significant positive associations were found between daily mortality (all causes and cardiovascular) and NO2 once the other air pollutants were taken into account. A 10 µg/m³ increase in the 24-hr average 1-day NO2 level was associated with a 0.43% increase in the daily number of deaths [95% confidence interval (CI), -0.003% to 0.86%] for all causes excluding external. Where relationships were significant, relative risks for cause-specific mortality were nearly twice those for total mortality for all the photochemical pollutants. Ozone was independently related only to daily cardiovascular mortality. No independent statistically significant relationship between photochemical air pollutants and respiratory mortality was found. The results of this study suggest that, at present levels of photochemical pollutants, people living in Spanish cities are exposed to health risks derived from air pollution.
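The pooling step described above (inverse-variance fixed effects, falling back to DerSimonian-Laird random effects when heterogeneity is detected) can be sketched as follows; the city-level coefficients and standard errors below are invented, not the EMECAM estimates:

```python
import math

def pool_fixed(betas, ses):
    """Inverse-variance fixed-effects pooling of per-city coefficients."""
    w = [1.0 / se**2 for se in ses]
    beta = sum(wi * bi for wi, bi in zip(w, betas)) / sum(w)
    return beta, math.sqrt(1.0 / sum(w))

def cochran_q(betas, ses):
    """Cochran's Q statistic for heterogeneity among local estimates."""
    beta_fe, _ = pool_fixed(betas, ses)
    return sum((b - beta_fe)**2 / se**2 for b, se in zip(betas, ses))

def pool_random(betas, ses):
    """DerSimonian-Laird random-effects pooling, used when Q signals heterogeneity."""
    k = len(betas)
    q = cochran_q(betas, ses)
    w = [1.0 / se**2 for se in ses]
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)           # between-city variance estimate
    w_star = [1.0 / (se**2 + tau2) for se in ses]
    beta = sum(wi * bi for wi, bi in zip(w_star, betas)) / sum(w_star)
    return beta, math.sqrt(1.0 / sum(w_star))

# Hypothetical per-city log-relative-risk estimates (per 10 ug/m3 NO2)
betas = [0.0051, 0.0032, 0.0060, 0.0041, 0.0029, 0.0048, 0.0038]
ses   = [0.0020, 0.0015, 0.0031, 0.0018, 0.0022, 0.0026, 0.0017]
beta_fe, se_fe = pool_fixed(betas, ses)
```

The random-effects pooled standard error is never smaller than the fixed-effects one, since the between-city variance is added to each city's sampling variance.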

Relevance: 40.00%

Abstract:

This paper addresses the application of a PCA analysis to categorical data prior to diagnosing a patient data set using a Case-Based Reasoning (CBR) system. The particularity is that standard PCA techniques are designed to deal with numerical attributes, but our medical data set contains many categorical attributes, so alternative methods such as RS-PCA are required. Thus, we propose to hybridize RS-PCA (Regular Simplex PCA) and a simple CBR. Results show that the hybrid system produces results similar to those obtained with the original attributes when diagnosing a medical data set. These results are quite promising, since they allow diagnosis with less computational effort and memory storage.
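A minimal sketch of the hybrid idea, substituting plain one-hot coding plus standard PCA for the RS-PCA step (one-hot coding also places each category at a simplex vertex); the records, diagnoses and attributes are invented for illustration:

```python
import numpy as np

# Hypothetical categorical patient records and diagnoses
records = [("fever", "high", "yes"), ("cough", "low", "no"),
           ("fever", "low", "yes"), ("cough", "high", "no")]
diagnoses = ["flu", "cold", "flu", "cold"]

# One-hot encode each categorical attribute
cats = [sorted({r[j] for r in records}) for j in range(len(records[0]))]
def encode(r):
    return np.concatenate([np.eye(len(c))[c.index(v)] for c, v in zip(cats, r)])
X = np.array([encode(r) for r in records])

# PCA via eigendecomposition of the covariance of the encoded data
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
W = vecs[:, ::-1][:, :2]               # keep the 2 leading components
Z = Xc @ W                             # reduced case base used by the CBR

def diagnose(query):
    """Simple CBR: retrieve the nearest stored case in the reduced space."""
    z = (encode(query) - X.mean(axis=0)) @ W
    return diagnoses[int(np.argmin(np.linalg.norm(Z - z, axis=1)))]
```

Retrieval then runs on 2 components instead of the full one-hot dimension, which is the source of the reduced computation and storage the abstract refers to.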

Relevance: 40.00%

Abstract:

Palliative care, which is intended to keep patients at home as long as possible, is increasingly proposed for patients who live at home, with their family, or in retirement homes. Although their condition is expected to have a lethal evolution, the patients (or, more often, their families or entourages) are sometimes confronted with sudden situations of respiratory distress, convulsions, hemorrhage, coma, anxiety, or pain. Prehospital emergency services are therefore often confronted with palliative care situations, situations in which medical teams are not skilled and therefore frequently feel awkward. We conducted a retrospective study of palliative care situations managed by prehospital emergency physicians (EPs) over a period of 8 months in 2012 in the urban region of Lausanne, in the State of Vaud, Switzerland. The prehospital EPs managed 1586 prehospital emergencies during the study period. We report 4 situations of respiratory distress or neurological disorders in advanced cancer patients, highlighting end-of-life and palliative care situations that may be encountered by prehospital emergency services. The similarity of the cases, the reasons leading to the involvement of prehospital EPs, and the ethical dilemmas illustrated by these situations are discussed. These situations highlight the need for more formal education in palliative care for EPs and prehospital emergency teams, and the need to fully communicate the planning and implementation of palliative care with patients and patients' family members.

Relevance: 40.00%

Abstract:

General Introduction. This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part (the fourth chapter) is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements, whether reciprocal (like the Europe Agreements (EAs) or NAFTA) or not (such as the GSP, AGOA, or EBA), it has been claimed that the benefits from improved market access keep falling short of the full potential benefits. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access of PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large degree. Part I. In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Céline Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it for the EU's Single List (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows.
First, it shows, in the case of PANEURO, that the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore changing all instruments into an MFC would bring improved transparency, much like the "tariffication" of NTBs. The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs.
In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system, and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of good value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the model for future EU-centered PTAs. First, I studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs (used before 1997) and the Single List RoOs (used since 1997). Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly allocate sales between the EU and the rest of the world by comparing producer prices on each market, I estimated the trade effects of the EU's RoOs.
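The three-step simulation can be illustrated with a toy model; the logistic form and all coefficients below are assumptions for the sketch, not the chapter's estimates:

```python
import math

# Hypothetical step-1 fit: utilization rate as a logistic function of the
# tariff-preference margin (pref, in %) and the Maximum Foreign Content
# (mfc, share of good value allowed to be foreign). Coefficients invented.
a, b_pref, b_mfc = -3.0, 0.2, 4.0

def utilization(pref, mfc):
    return 1.0 / (1.0 + math.exp(-(a + b_pref * pref + b_mfc * mfc)))

def simulated_mfc(pref, u_observed):
    """Step 2: invert the fitted relationship to find the MFC that reproduces,
    line by line, the utilization rate observed under the old array of RoOs."""
    logit = math.log(u_observed / (1.0 - u_observed))
    return (logit - a - b_pref * pref) / b_mfc

# Step 3: trade-weighted average of the simulated MFC across tariff lines
lines = [  # (preference margin %, observed utilization, trade weight) -- invented
    (5.0, 0.60, 0.5), (10.0, 0.80, 0.3), (2.0, 0.40, 0.2)]
mfcs = [simulated_mfc(p, u) for p, u, _ in lines]
avg_mfc = sum(m * w for m, (_, _, w) in zip(mfcs, lines)) / sum(w for _, _, w in lines)
```

The round trip (invert, then re-evaluate) reproduces each line's observed utilization rate exactly, which is the defining property of the simulated MFC.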
The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II. The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA), under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database of Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis. First, using Poisson and Negative Binomial regressions, the count of AD measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
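The count-data step (regressing revocation counts on initiations lagged five years with a Poisson model) can be sketched on synthetic data; the data-generating numbers below are invented, and the fit uses plain Newton-Raphson rather than any particular statistics package:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic yearly counts: revocations are generated with a five-year-lagged
# dependence on initiations, a stand-in for the chapter's AD database.
initiations = rng.poisson(20, size=25).astype(float)
lagged = initiations[:-5]                      # initiations at t - 5
revocations = rng.poisson(0.5 * lagged + 2.0)  # revocations at t

# Poisson regression log E[y] = b0 + b1 * x, fitted by Newton-Raphson
X = np.column_stack([np.ones_like(lagged), lagged])
y = revocations.astype(float)
beta = np.array([np.log(y.mean()), 0.0])       # safe starting point
for _ in range(50):
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)                      # score vector
    hess = X.T @ (X * mu[:, None])             # Fisher information
    beta = beta + np.linalg.solve(hess, grad)

b0, b1 = beta   # b1 > 0 indicates a five-year cycle of revocations
```

A one-for-one five-year cycle would correspond to revocations rising in lockstep with lagged initiations; the chapter's point is that the estimated coefficient, while positive and larger post-agreement, falls well short of that.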

Relevance: 40.00%

Abstract:

OBJECTIVE: Identify the direct cost of reprocessing the double and single cotton-woven drapes of the surgical LAP package. METHOD: A quantitative, exploratory, descriptive case study performed at a teaching hospital. The direct cost of reprocessing cotton-woven surgical drapes was calculated by multiplying the time spent by the professionals involved in reprocessing the unit by the direct cost of labor, and adding the cost of materials. The Brazilian currency (R$) originally used for the calculations was converted to US currency at the rate of US$0.42/R$. RESULTS: The average total cost per surgical LAP package was US$9.72, dominated by the cost of materials (US$8.70, or 89.65%). Notably, the average total cost of materials was mostly driven by the cost of the cotton-woven drapes themselves (US$7.99, or 91.90%). CONCLUSION: The knowledge gained will support discussions about replacing reusable cotton-woven surgical drapes with disposable ones, informing arguments about the advantages and disadvantages of this possibility in terms of human, material, structural, environmental and financial resources.
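The costing rule can be restated as a small sketch; every figure except the US$0.42/R$ conversion rate is hypothetical:

```python
# Costing rule from the study:
# direct cost = sum(time spent by each professional x direct labor cost) + materials,
# converted from R$ to US$. All inputs below are invented for illustration.
RATE_USD_PER_BRL = 0.42

def reprocessing_cost_usd(labor_steps, materials_brl):
    """labor_steps: list of (hours, hourly direct labor cost in R$)."""
    labor_brl = sum(hours * rate for hours, rate in labor_steps)
    return (labor_brl + materials_brl) * RATE_USD_PER_BRL

# Hypothetical LAP package: two reprocessing steps plus materials
cost = reprocessing_cost_usd([(0.20, 12.0), (0.15, 10.0)], 20.5)
```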

Relevance: 40.00%

Abstract:

Although cigarette smoking and alcohol consumption increase risk for head and neck cancers, there have been few attempts to model risks quantitatively and to formally evaluate cancer site-specific risks. The authors pooled data from 15 case-control studies and modeled the excess odds ratio (EOR) to assess risk by total exposure (pack-years and drink-years) and its modification by exposure rate (cigarettes/day and drinks/day). The smoking analysis included 1,761 laryngeal, 2,453 pharyngeal, and 1,990 oral cavity cancers, and the alcohol analysis included 2,551 laryngeal, 3,693 pharyngeal, and 3,116 oral cavity cancers, with over 8,000 controls. Above 15 cigarettes/day, the EOR/pack-year decreased with increasing cigarettes/day, suggesting that greater cigarettes/day for a shorter duration was less deleterious than fewer cigarettes/day for a longer duration. Estimates of EOR/pack-year were homogeneous across sites, while the effects of cigarettes/day varied, indicating that the greater laryngeal cancer risk derived from differential cigarettes/day effects and not pack-years. EOR/drink-year estimates increased through 10 drinks/day, suggesting that greater drinks/day for a shorter duration was more deleterious than fewer drinks/day for a longer duration. Above 10 drinks/day, data were limited. EOR/drink-year estimates varied by site, while drinks/day effects were homogeneous, indicating that the greater pharyngeal/oral cavity cancer risk with alcohol consumption derived from the differential effects of drink-years and not drinks/day.
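The EOR structure described above can be sketched as follows; the functional form of the rate modifier and all coefficients are invented for illustration, not the pooled estimates:

```python
import math

# Sketch of an excess odds ratio model with exposure-rate modification:
# OR = 1 + beta * pack_years * g(cigs_per_day). The knot at ~15 cigs/day
# mirrors the abstract's finding that EOR/pack-year declines above that rate.
beta = 0.02  # invented EOR per pack-year

def g(cigs_per_day, knot=15.0, decay=0.03):
    """Rate modifier: flat up to the knot, then declining EOR/pack-year."""
    excess = max(0.0, cigs_per_day - knot)
    return math.exp(-decay * excess)

def odds_ratio(pack_years, cigs_per_day):
    return 1.0 + beta * pack_years * g(cigs_per_day)

# Same 40 pack-years smoked intensely vs. slowly: the intense pattern yields a
# smaller OR, matching "greater cigarettes/day for a shorter duration was less
# deleterious than fewer cigarettes/day for a longer duration".
or_intense = odds_ratio(40, 40)   # 2 packs/day for 20 years
or_slow    = odds_ratio(40, 10)   # half a pack/day for 80 years (illustrative)
```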

Relevance: 40.00%

Abstract:

Short description of the proposed presentation (less than 100 words): This paper describes the interdisciplinary work done in Uspantán, Guatemala, a city vulnerable to natural hazards. We investigated local responses to landslides that happened in 2007 and 2010 and had a strong impact on the local community. We show a complete example of a systemic approach that incorporates physical, social and environmental aspects in order to understand risks. The objective of this work is to present the combination of social and geological data (mapping) and to describe the methodology used for the identification and assessment of risk. The article discusses both the limitations and the methodological challenges encountered when conducting interdisciplinary research.

Why it is important to present this topic at the Global Platform (less than 50 words): This work shows the benefits of addressing risk from an interdisciplinary perspective, in particular how integrating the social sciences can help identify new phenomena and natural hazards and assess risk. It gives a practical example of how one can integrate data from different fields.

What is innovative about this presentation? The use of mapping to combine qualitative and quantitative data. By coupling approaches, we could associate a hazard map with qualitative data gathered through interviews with the population. This map is an important document for the authorities: it allows them to be aware of the most dangerous zones, the affected families, and the places where it is most urgent to intervene.

Relevance: 40.00%

Abstract:

The case of two transition tables is considered, that is, two square asymmetric matrices of frequencies where the rows and columns of the matrices are the same objects observed at three different time points. Different ways of visualizing the tables, either separately or jointly, are examined. We generalize an existing idea, where a square matrix is decomposed into symmetric and skew-symmetric parts, to two matrices, leading to a decomposition into four components: (1) average symmetric, (2) average skew-symmetric, (3) symmetric difference from average, and (4) skew-symmetric difference from average. The method is illustrated with an artificial example and an example using real data from a study of changing values over three generations.
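The four-component decomposition can be written out directly; a minimal sketch with an artificial pair of tables (the numbers are invented):

```python
import numpy as np

def sym(M):  return (M + M.T) / 2
def skew(M): return (M - M.T) / 2

def four_components(T1, T2):
    """Decompose two square transition tables into (1) average symmetric,
    (2) average skew-symmetric, (3) symmetric difference from average and
    (4) skew-symmetric difference from average."""
    A = (T1 + T2) / 2          # average table
    D = (T1 - T2) / 2          # difference from the average
    return sym(A), skew(A), sym(D), skew(D)

# Artificial example: two 3x3 asymmetric frequency tables
T1 = np.array([[10., 3., 1.], [5., 8., 2.], [0., 4., 9.]])
T2 = np.array([[ 9., 2., 2.], [6., 7., 1.], [1., 5., 8.]])
c1, c2, c3, c4 = four_components(T1, T2)
```

The decomposition is exact: summing all four components recovers the first table, and flipping the sign of the two difference components recovers the second.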

Relevance: 40.00%

Abstract:

We review methods to estimate the average crystal (grain) size and the crystal (grain) size distribution in solid rocks. Average grain sizes often provide the basis for stress estimates or rheological calculations requiring the quantification of grain sizes in a rock's microstructure. The primary data for grain size estimation are either 1D (i.e. line-intercept methods), 2D (area analysis) or 3D (e.g. computed tomography, serial sectioning). These data have been subjected to different treatments over the years, and several studies assume a certain probability function (e.g. logarithmic, square root) to calculate statistical parameters such as the mean, median, mode or skewness of a crystal size distribution. The resulting average grain sizes have to be compatible across the different estimation approaches in order to be properly applied, for example, in paleo-piezometers or grain-size-sensitive flow laws. Such compatibility is tested for different data treatments using one- and two-dimensional measurements. We propose an empirical conversion matrix for different datasets. These conversion factors make different datasets compatible with each other even though the primary calculations were obtained in different ways. To report an average grain size in the case of unimodal grain size distributions, we propose using the area-weighted mean for 2D measurements and the volume-weighted mean for 3D measurements. The shape of the crystal size distribution (CSD) is important for studies of the nucleation and growth of minerals. The shape of the CSD of garnet populations is compared between different 2D and 3D measurements, namely serial sectioning and computed tomography.
The comparison of directly measured 3D data, stereological data and directly presented 2D data shows the problems of the quality of the smallest grain sizes and the overestimation of small grain sizes by stereological tools, depending on the type of CSD.
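The area- and volume-weighted means proposed for reporting average grain size can be sketched as follows, on an invented unimodal population; the ratio of two measures is what an empirical conversion factor of the kind proposed above captures:

```python
def area_weighted_mean(diameters):
    """Area-weighted mean grain size for 2D data: each grain's equivalent
    diameter is weighted by its sectional area (proportional to d**2)."""
    weights = [d**2 for d in diameters]
    return sum(w * d for w, d in zip(weights, diameters)) / sum(weights)

def volume_weighted_mean(diameters):
    """Volume-weighted analogue for 3D data (weights proportional to d**3)."""
    weights = [d**3 for d in diameters]
    return sum(w * d for w, d in zip(weights, diameters)) / sum(weights)

# Hypothetical unimodal grain population (equivalent diameters, microns)
grains = [8.0, 10.0, 12.0, 12.0, 14.0, 20.0]
arith = sum(grains) / len(grains)
aw = area_weighted_mean(grains)
vw = volume_weighted_mean(grains)
# Weighted means emphasize the large grains, so arith <= aw <= vw; an
# empirical conversion factor between two measures is simply their ratio.
conversion = aw / arith
```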

Relevance: 40.00%

Abstract:

Valid individualized case conceptualization methodologies, such as plan analysis, are rarely used for psychotherapeutic treatment conceptualization and planning in bipolar affective disorder (BD), even though data exist showing that psychotherapy interventions might be enhanced by applying such analyses to treatment planning for several groups of patients. We applied plan analysis as a research tool (Caspar, 1995) to N=30 inpatients presenting with BD, who were interviewed twice. Our study aimed to produce a prototypical plan structure encompassing the most relevant data from the 30 individual case conceptualizations. Special focus was given to links with emotions and coping plans. Inter-rater reliability of these plan analyses was considered sufficient. Results suggest the presence of two subtypes based on plan-analytic principles, emotion control and relationship control, along with a mixed form. These subtypes are discussed with regard to inherent plan-analytic conflicts, specific emotions and coping plans, as well as symptom level and type. Finally, conclusions are drawn for enhancing psychotherapeutic practice with BD patients, based on the motive-oriented therapeutic relationship.

Relevance: 40.00%

Abstract:

This paper tests some hypotheses about the determinants of the local tax structure. In particular, we focus on the effects that the deductibility of the property tax in the national income tax has on the relative use of the property tax and user charges. We deal with the incentive effects that local governments face regarding the different sources of revenue by means of a model in which the local tax structure and the level of public expenditure arise as a result of the maximizing behaviour of local politicians, subject to the economic effects of the tax system. We test the hypotheses developed with data on a set of Spanish municipalities during the period 1987-91. We find that tax deductibility provides incentives to raise revenue from the property tax but does not introduce a bias against user charges or in favor of overall spending growth.

Relevance: 40.00%

Abstract:

This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning amounts to an application that requires extensive quantitative information, along with means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering before focusing on particular numerical probability assignments. Analyses are proposed that intend to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations indicates that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute. This case serves to exemplify the possible use of data from various sources in casework and helps to discuss the difficulty of reconciling the depth of theoretical likelihood ratio developments with the limitations on the degree to which these developments can actually be applied in practice.
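The point that relative values can suffice can be made concrete with a minimal sketch (the probabilities and database counts below are hypothetical, not from the footwear case):

```python
# At source level the likelihood ratio is
#   LR = p(features | mark made by the suspect's shoe)
#      / p(features | mark made by some unknown shoe),
# with the denominator built from a database frequency. Rescaling every
# database count by the same factor leaves the LR unchanged, so only the
# relative values in the database matter.

def likelihood_ratio(p_numerator, count, total):
    """p_numerator: probability of the observed features given the suspect's
    shoe; count/total: database frequency of those features among other shoes."""
    return p_numerator / (count / total)

lr_a = likelihood_ratio(0.9, 12, 1200)    # database expressed in raw counts
lr_b = likelihood_ratio(0.9, 120, 12000)  # same database, every count scaled x10
```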


Relevance: 40.00%

Abstract:

With the trend in molecular epidemiology towards both genome-wide association studies and complex modelling, the need for large sample sizes to detect small effects and to allow for the estimation of many parameters within a model continues to increase. Unfortunately, most methods of association analysis have been restricted to either a family-based or a case-control design, resulting in the lack of synthesis of data from multiple studies. Transmission disequilibrium-type methods for detecting linkage disequilibrium from family data were developed as an effective way of preventing the detection of association due to population stratification. Because these methods condition on parental genotype, however, they have precluded the joint analysis of family and case-control data, although methods for case-control data may not protect against population stratification and do not allow for familial correlations. We present here an extension of a family-based association analysis method for continuous traits that will simultaneously test for, and if necessary control for, population stratification. We further extend this method to analyse binary traits (and therefore family and case-control data together) and accurately to estimate genetic effects in the population, even when using an ascertained family sample. Finally, we present the power of this binary extension for both family-only and joint family and case-control data, and demonstrate the accuracy of the association parameter and variance components in an ascertained family sample.
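The conditioning-on-parental-genotype idea that underlies transmission-disequilibrium methods can be sketched minimally; the transmission counts below are hypothetical:

```python
# For heterozygous parents, count transmissions (b) versus non-transmissions (c)
# of the candidate allele to affected offspring. Conditioning on parental
# genotype is what protects against population stratification: under no
# association, transmission and non-transmission are equally likely, giving a
# McNemar-type chi-square statistic.
def tdt_chi_square(b, c):
    return (b - c) ** 2 / (b + c)

chi2 = tdt_chi_square(45, 25)  # hypothetical counts
# chi2 is about 5.7, exceeding the 3.84 cutoff (chi-square, 1 df, alpha = 0.05)
```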