940 results for Methods: Data Analysis
Abstract:
Background: The literature shows how gender mandates contribute to differences in exposure and vulnerability to certain health risk factors. This paper presents the results of a study developed in the south of Spain, where research aimed at understanding men from a gender perspective is still limited. Objective: The aim of this paper is to explore the lay perceptions and meanings ascribed to the idea of masculinity, identifying ways in which gender displays are related to health. Design: The study is based on a mixed-methods data collection strategy typical of qualitative research. We performed a qualitative content analysis focused on manifest and latent content. Results: Our analysis showed that the relationship between masculinity and health was mainly defined with regard to behavioural explanations with an evident performative meaning. With regard to issues such as driving, the use of recreational drugs, aggressive behaviour, sexuality, and body image, important connections were established between manhood acts and health outcomes. Different ways of understanding and performing the male identity also emerged from the results. The findings revealed the implications of these aspects in the processes of change in the identity codes of men and women. Conclusions: The study provides insights into how the category ‘man’ is highly dependent on collective practices and performative acts. Consideration of how males perform manhood acts might be required in guidance on the development of programmes and policies aimed at addressing gender inequalities in health in a particular local context.
Abstract:
Fragile X syndrome is the most common inherited form of intellectual disability. Here we report on a study based on a collaborative registry, involving 12 Spanish centres, of molecular diagnostic tests in 1105 fragile X families comprising 5062 individuals, of whom 1655 carried a full mutation or were mosaic, three had deletions, 1840 had a premutation, and 102 had intermediate alleles. Two patients with the full mutation also had Klinefelter syndrome. We have used this registry to assess the risk of expansion from parents to children. For mothers with a premutation, the overall rate of allele expansion to a full mutation is 52.5%, and we found that this rate is higher for male than for female offspring (63.6% versus 45.6%; P < 0.001). Furthermore, in mothers with intermediate alleles (45-54 repeats), there were 10 cases of expansion to a premutation allele, and for the smallest premutation alleles (55-59 repeats), there was a 6.4% risk of expansion to a full mutation, with 56 repeats being the smallest allele that expanded to a full mutation allele in a single meiosis. Hence, in our series the risk for alleles of <59 repeats is somewhat higher than in other published series. These findings are important for genetic counselling.
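As a rough illustration of the male-versus-female offspring comparison reported above, the sketch below runs a chi-squared test on a 2x2 table of expansion outcomes; the counts are invented for demonstration and are not the registry data.

```python
# Hypothetical illustration: comparing expansion-to-full-mutation rates
# between male and female offspring of premutation carriers with a
# chi-squared test on a 2x2 contingency table. The counts below are
# invented for demonstration and are NOT the registry data.
from scipy.stats import chi2_contingency

#                  expanded, not expanded
male_offspring   = [318, 182]   # hypothetical counts
female_offspring = [228, 272]   # hypothetical counts

table = [male_offspring, female_offspring]
chi2, p_value, dof, expected = chi2_contingency(table)

rate_male = male_offspring[0] / sum(male_offspring)
rate_female = female_offspring[0] / sum(female_offspring)
print(f"expansion rate (male offspring):   {rate_male:.1%}")
print(f"expansion rate (female offspring): {rate_female:.1%}")
print(f"chi-squared p-value: {p_value:.4g}")
```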
Abstract:
In recent years, some epidemiologic studies have attributed adverse effects of air pollutants on health not only to particles and sulfur dioxide but also to photochemical air pollutants (nitrogen dioxide and ozone). The effects are usually small, leading to some inconsistencies in the results of the studies. Furthermore, the different methodologic approaches used across studies have made it difficult to derive generic conclusions. We provide here a quantitative summary of the short-term effects of photochemical air pollutants on mortality in seven Spanish cities involved in the EMECAM project, using generalized additive models from analyses of single and multiple pollutants. Nitrogen dioxide and ozone data were provided by seven EMECAM cities (Barcelona, Gijón, Huelva, Madrid, Oviedo, Seville, and Valencia). Mortality indicators included daily total mortality from all causes excluding external causes, daily cardiovascular mortality, and daily respiratory mortality. Individual estimates, obtained from city-specific generalized additive Poisson autoregressive models, were combined by means of fixed effects models and, if significant heterogeneity among local estimates was found, also by random effects models. Significant positive associations were found between daily mortality (all causes and cardiovascular) and NO2, once the rest of the air pollutants were taken into account. A 10 µg/m³ increase in the 24-hr average 1-day NO2 level was associated with an increase in the daily number of deaths of 0.43% [95% confidence interval (CI), -0.003% to 0.86%] for all causes excluding external. Where relationships were significant, relative risks for cause-specific mortality were nearly twice those for total mortality for all the photochemical pollutants. Ozone was independently related only to cardiovascular daily mortality. No independent statistically significant relationship between photochemical air pollutants and respiratory mortality was found. The results of this study suggest that, given the present levels of photochemical pollutants, people living in Spanish cities are exposed to health risks derived from air pollution.
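A minimal sketch of the kind of city-specific model described above (a Poisson regression for daily deaths with smooth terms for trend and temperature), not the exact EMECAM specification; the file and column names are assumptions, and the NO2 coefficient is converted into a percent increase per 10 µg/m³.

```python
# Simplified sketch (not the EMECAM specification): Poisson regression for
# daily death counts with regression splines for long-term trend and
# temperature, plus day-of-week effects. File and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("city_daily.csv")   # hypothetical file: date, deaths, no2, temp
df["t"] = np.arange(len(df))         # time index for the long-term trend
df["dow"] = pd.to_datetime(df["date"]).dt.dayofweek

model = smf.glm(
    "deaths ~ no2 + bs(t, df=12) + bs(temp, df=4) + C(dow)",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")   # robust errors as a crude guard against overdispersion

beta = model.params["no2"]
pct_per_10 = (np.exp(10 * beta) - 1) * 100   # % increase in deaths per 10 ug/m3
print(f"estimated increase in daily deaths per 10 ug/m3 NO2: {pct_per_10:.2f}%")
```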
Abstract:
The impact of the adequacy of empirical therapy on outcomes for patients with bloodstream infections (BSI) is key for determining whether adequate empirical coverage should be prioritized over other, more conservative approaches. Recent systematic reviews outlined the need for new studies in the field, using improved methodologies. We assessed the impact of inadequate empirical treatment on the mortality of patients with BSI in the present-day context, incorporating recent methodological recommendations. A prospective multicenter cohort study including all BSI episodes in adult patients was performed in 15 hospitals in Andalucía, Spain, over a 2-month period in 2006 to 2007. The main outcome variables were 14- and 30-day mortality. Adjusted analyses were performed by multivariate analysis and propensity score-based matching. Eight hundred one episodes were included. Inadequate empirical therapy was administered in 199 (24.8%) episodes; mortality at days 14 and 30 was 18.55% and 22.6%, respectively. After controlling for age, Charlson index, Pitt score, neutropenia, source, etiology, and presentation with severe sepsis or shock, inadequate empirical treatment was associated with increased mortality at days 14 and 30 (odds ratios [ORs], 2.12 and 1.56; 95% confidence intervals [95% CI], 1.34 to 3.34 and 1.01 to 2.40, respectively). The adjusted ORs after a propensity score-based matched analysis were 3.03 and 1.70 (95% CI, 1.60 to 5.74 and 0.98 to 2.98, respectively). In conclusion, inadequate empirical therapy is independently associated with increased mortality in patients with BSI. Programs to improve the quality of empirical therapy in patients with suspected BSI and to optimize definitive therapy should be implemented.
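A simplified sketch of propensity score-based matching in the spirit of the adjusted analysis above, not the authors' exact procedure; the covariate list, column names and 1:1 nearest-neighbour matching (with replacement, no caliper) are assumptions of the example.

```python
# Illustrative sketch of propensity score-based matching (not the authors'
# exact procedure): model the probability of receiving inadequate empirical
# therapy from baseline covariates, then match 1:1 on the estimated score.
# Column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("bsi_cohort.csv")  # hypothetical: covariates + inadequate + death30

covariates = ["age", "charlson", "pitt", "neutropenia", "severe_sepsis"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["inadequate"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["inadequate"] == 1]
controls = df[df["inadequate"] == 0]

# nearest-neighbour match each treated episode to one control on the score
# (matching with replacement and no caliper, to keep the sketch short)
nn = NearestNeighbors(n_neighbors=1).fit(controls[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = controls.iloc[idx.ravel()]

print("30-day mortality, inadequate therapy:", treated["death30"].mean())
print("30-day mortality, matched controls:  ", matched_controls["death30"].mean())
```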
Abstract:
Descriptive epidemiology research involves collecting data from large numbers of subjects. Obtaining these data requires approaches designed to achieve maximum participation or response rates among respondents possessing the desired information. We analyze participation and response rates in a population-based epidemiological study conducted through a telephone survey and identify factors implicated in consenting to participate. The rates found exceeded those reported in the literature, and they were higher for afternoon calls than for morning calls. Women and subjects older than 40 years were the most likely to answer the telephone. The study identified geographical differences, with higher response rates in districts in southern Spain that are not considered urbanized. This information may be helpful for designing more efficient community epidemiology projects.
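A minimal sketch, assuming a hypothetical call-attempt dataset, of how factors associated with agreeing to participate could be examined with a logistic regression; the column names are invented for the example.

```python
# Minimal sketch (hypothetical data layout): logistic regression of agreeing
# to participate on call time, respondent sex, age group and district.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

calls = pd.read_csv("call_attempts.csv")
# hypothetical columns: participated (0/1), call_time, sex, age_group, district

fit = smf.logit(
    "participated ~ C(call_time) + C(sex) + C(age_group) + C(district)",
    data=calls,
).fit()
print(fit.summary())
print(np.exp(fit.params).round(2))   # odds ratios for each factor
```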
Abstract:
Several eco-toxicological studies have shown that insectivorous mammals, due to their feeding habits, easily accumulate high amounts of pollutants in relation to other mammal species. To assess the bio-accumulation levels of toxic metals and their influence on essential metals, we quantified the concentration of 19 elements (Ca, K, Fe, B, P, S, Na, Al, Zn, Ba, Rb, Sr, Cu, Mn, Hg, Cd, Mo, Cr and Pb) in bones of 105 greater white-toothed shrews (Crocidura russula) from a polluted (Ebro Delta) and a control (Medas Islands) area. Since chemical contents of a bio-indicator are mainly compositional data, conventional statistical analyses currently used in eco-toxicology can give misleading results. Therefore, to improve the interpretation of the data obtained, we used statistical techniques for compositional data analysis to define groups of metals and to evaluate the relationships between them, from an inter-population viewpoint. Hypothesis testing on the adequate balance-coordinates allows us to confirm intuition-based hypotheses and some previous results. The main statistical goal was to test equal means of balance-coordinates for the two defined populations. After checking normality, one-way ANOVA or Mann-Whitney tests were carried out for the inter-group balances.
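The following sketch illustrates the balance-coordinate testing described above; the partition of elements into a "toxic" and an "essential" group, the file name and the data layout are illustrative assumptions, not the paper's actual grouping.

```python
# Hedged sketch of a balance-coordinate comparison between two populations.
# The element grouping and column names below are assumptions for the example.
import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("shrew_bones.csv")   # hypothetical: one column per element + "area"

toxic = ["Pb", "Cd", "Hg"]            # illustrative numerator group
essential = ["Ca", "Zn", "Cu", "Fe"]  # illustrative denominator group

def balance(row, num, den):
    """Isometric log-ratio balance between two groups of parts."""
    r, s = len(num), len(den)
    gm_num = np.exp(np.log(row[num]).mean())
    gm_den = np.exp(np.log(row[den]).mean())
    return np.sqrt(r * s / (r + s)) * np.log(gm_num / gm_den)

df["b_toxic_vs_essential"] = df.apply(balance, axis=1, num=toxic, den=essential)

polluted = df.loc[df["area"] == "Ebro Delta", "b_toxic_vs_essential"]
control = df.loc[df["area"] == "Medas Islands", "b_toxic_vs_essential"]

# normality check, then parametric or non-parametric comparison of the groups
if stats.shapiro(polluted).pvalue > 0.05 and stats.shapiro(control).pvalue > 0.05:
    print(stats.f_oneway(polluted, control))      # one-way ANOVA (two groups)
else:
    print(stats.mannwhitneyu(polluted, control))
```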
Abstract:
Pounamu (NZ jade), or nephrite, is a protected mineral in its natural form following the transfer of ownership back to Ngai Tahu under the Ngai Tahu (Pounamu Vesting) Act 1997. Any theft of nephrite is prosecutable under the Crimes Act 1961. Scientific evidence is essential in cases where origin is disputed. A robust method for discrimination of this material through the use of elemental analysis and compositional data analysis is required. Initial studies have characterised the variability within a given nephrite source. This has included investigation of both in situ outcrops and alluvial material. Methods for the discrimination of two geographically close nephrite sources are being developed.
Key Words: forensic, jade, nephrite, laser ablation, inductively coupled plasma mass spectrometry, multivariate analysis, elemental analysis, compositional data analysis
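As a hedged illustration of combining elemental and compositional data analysis for source discrimination, the sketch below applies a centred log-ratio (clr) transform followed by linear discriminant analysis; the file name, column layout and the use of LDA are assumptions of the example, not the method being developed in the study.

```python
# Illustrative sketch of source discrimination on compositional element data:
# centred log-ratio (clr) transform followed by linear discriminant analysis.
# File name, column layout and element list are assumptions for the example;
# zero concentrations would need a replacement step before taking logs.
import numpy as np
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

df = pd.read_csv("nephrite_icpms.csv")   # hypothetical: element columns + "source"
elements = [c for c in df.columns if c != "source"]

comp = df[elements].to_numpy(dtype=float)
comp = comp / comp.sum(axis=1, keepdims=True)                   # close the composition
clr = np.log(comp) - np.log(comp).mean(axis=1, keepdims=True)   # clr transform

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, clr, df["source"], cv=5)
print("cross-validated classification accuracy:", scores.mean().round(3))
```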
Abstract:
Purpose: To evaluate whether the correlation between in vitro bond strength data and estimated clinical retention rates of cervical restorations after two years depends on pooled data obtained from multicenter studies or single-test data. Materials and Methods: Pooled mean data for six dentin adhesive systems (Adper Prompt L-Pop, Clearfil SE, OptiBond FL, Prime & Bond NT, Single Bond, and Scotchbond Multipurpose) and four laboratory methods (macroshear, microshear, macrotensile, and microtensile bond strength tests) (Scherrer et al., 2010) were correlated to estimated pooled two-year retention rates of Class V restorations using the same adhesive systems. For bond strength data from a single test institute, the literature search in SCOPUS revealed one study that tested all six adhesive systems (microtensile) and two that tested five of the six systems (microtensile, macroshear). The correlation was determined with a database designed to perform a meta-analysis on the clinical performance of cervical restorations (Heintze et al., 2010). The clinical data were pooled and adjusted in a linear mixed model, taking the study effect, dentin preparation, type of isolation and bevelling of enamel into account. A regression analysis was carried out to evaluate the correlation between clinical and laboratory findings. Results: The results of the regression analysis for the pooled data revealed that only the macrotensile (adjusted R² = 0.86) and microtensile tests (adjusted R² = 0.64), but not the shear and microshear tests, correlated well with the clinical findings. As regards the data from a single test institute, the correlation was not statistically significant. Conclusion: Macrotensile and microtensile bond strength tests showed an adequate correlation with the retention rate of cervical restorations after two years. Bond strength tests should be carried out by different operators and/or research institutes to determine the reliability and technique sensitivity of the material under investigation.
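A toy sketch of the laboratory-versus-clinical correlation step only: regress pooled retention rates on pooled bond-strength means and read off the adjusted R². The six value pairs are invented placeholders, and the full analysis described above also adjusted the clinical data in a linear mixed model before this step.

```python
# Toy sketch of the regression between pooled bond-strength means and pooled
# two-year retention rates. The six (x, y) pairs are invented placeholders.
import numpy as np
import statsmodels.api as sm

bond_strength = np.array([18.0, 25.5, 31.2, 22.4, 27.8, 35.1])   # MPa, hypothetical
retention_2yr = np.array([0.71, 0.82, 0.90, 0.78, 0.85, 0.93])   # proportion, hypothetical

X = sm.add_constant(bond_strength)
fit = sm.OLS(retention_2yr, X).fit()
print("adjusted R^2:", round(fit.rsquared_adj, 2))
print(fit.params)   # intercept and slope of the laboratory-clinical relationship
```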
Abstract:
BACKGROUND: Solexa/Illumina short-read ultra-high throughput DNA sequencing technology produces millions of short tags (up to 36 bases) by parallel sequencing-by-synthesis of DNA colonies. The processing and statistical analysis of such high-throughput data poses new challenges; currently a fair proportion of the tags are routinely discarded due to an inability to match them to a reference sequence, thereby reducing the effective throughput of the technology. RESULTS: We propose a novel base calling algorithm using model-based clustering and probability theory to identify ambiguous bases and code them with IUPAC symbols. We also select optimal sub-tags using a score based on information content to remove uncertain bases towards the ends of the reads. CONCLUSION: We show that the method improves genome coverage and number of usable tags as compared with Solexa's data processing pipeline by an average of 15%. An R package is provided which allows fast and accurate base calling of Solexa's fluorescence intensity files and the production of informative diagnostic plots.
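A hedged sketch of the two ideas in the abstract: calling an IUPAC symbol for ambiguous bases from per-base probabilities, and trimming uncertain read ends with an information-content score. The thresholds and the toy probability matrix are illustrative assumptions, not the published algorithm (which uses model-based clustering of fluorescence intensities).

```python
# Hedged sketch: (1) call an IUPAC symbol from per-base probabilities when a
# single base is not clearly supported, and (2) score positions by
# information content to trim uncertain read ends. Thresholds are illustrative.
import numpy as np

IUPAC = {
    frozenset("A"): "A", frozenset("C"): "C", frozenset("G"): "G", frozenset("T"): "T",
    frozenset("AC"): "M", frozenset("AG"): "R", frozenset("AT"): "W",
    frozenset("CG"): "S", frozenset("CT"): "Y", frozenset("GT"): "K",
    frozenset("ACG"): "V", frozenset("ACT"): "H", frozenset("AGT"): "D",
    frozenset("CGT"): "B", frozenset("ACGT"): "N",
}
BASES = np.array(list("ACGT"))

def call_base(p, keep=0.5):
    """Call the smallest set of bases whose summed probability exceeds `keep`."""
    order = np.argsort(p)[::-1]
    chosen, total = [], 0.0
    for i in order:
        chosen.append(BASES[i])
        total += p[i]
        if total >= keep:
            break
    return IUPAC[frozenset(chosen)]

def information(p, eps=1e-12):
    """Information content in bits: 2 minus the Shannon entropy of the call."""
    p = np.clip(p, eps, 1.0)
    return 2.0 + np.sum(p * np.log2(p))

# toy 6-position read: rows are positions, columns are P(A), P(C), P(G), P(T)
probs = np.array([
    [0.97, 0.01, 0.01, 0.01],
    [0.02, 0.95, 0.02, 0.01],
    [0.48, 0.02, 0.48, 0.02],   # ambiguous A/G -> "R"
    [0.01, 0.01, 0.96, 0.02],
    [0.30, 0.25, 0.25, 0.20],   # very uncertain tail
    [0.28, 0.26, 0.24, 0.22],
])
read = "".join(call_base(p) for p in probs)
info = np.array([information(p) for p in probs])
end = len(info) - np.argmax((info > 0.5)[::-1])   # last position with enough information
print(read, "-> trimmed:", read[:end])
```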
Abstract:
Background: Recent reviews of randomized controlled trials have shown that pharmacist interventions improve cardiovascular disease (CVD) risk factors in outpatients. Various interventions were evaluated in different settings, and substantial heterogeneity was observed in the effect estimates. To better express uncertainty in the effect estimates, prediction intervals (PIs) have been proposed but are rarely reported. Objective: Pooling data from two systematic reviews, we estimated the effect of pharmacist interventions on systolic blood pressure (BP), computed PIs, and evaluated potential causes of heterogeneity. Methods: Data were pooled from systematic reviews assessing the effect of pharmacist interventions on CVD risk factors in patients with or without diabetes, respectively. Effects were estimated using random effects models. Results: Systolic BP was the outcome in 31 trials including 12 373 patients. Pharmacist interventions included patient educational interventions, patient-reminder systems, measurement of BP, medication management and feedback to physicians, or educational interventions for health care professionals. Pharmacist interventions were associated with a large reduction in systolic BP (-7.5 mmHg; 95% CI: -9.0 to -5.9). There was substantial heterogeneity (I² = 66%). The 95% PI ranged from -13.9 to -1.0 mmHg. The effect tended to be larger if the intervention was conducted in a community pharmacy and if the pharmacist intervened at least monthly. Conclusion: On average, the effect of pharmacist interventions on BP was substantial. However, the wide PI suggests that the effect differed between interventions, with some having modest effects and others very large effects on BP. Part of the heterogeneity could be due to differences in the setting and in the frequency of the interventions.
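A minimal sketch of a DerSimonian-Laird random-effects pooling with a Higgins-style 95% prediction interval; the per-trial effects and standard errors are invented and do not reproduce the review's data.

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling with a 95%
# prediction interval (using a t distribution on k-2 degrees of freedom).
# The per-trial effects (mmHg) and standard errors below are invented.
import numpy as np
from scipy import stats

y = np.array([-11.2, -4.3, -8.0, -6.1, -13.5, -2.9, -7.7])   # hypothetical trial effects
se = np.array([2.1, 1.5, 1.8, 2.4, 3.0, 1.2, 2.2])           # hypothetical standard errors

k = len(y)
w = 1 / se**2
mu_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - mu_fixed) ** 2)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))  # DL estimator

w_star = 1 / (se**2 + tau2)
mu = np.sum(w_star * y) / np.sum(w_star)       # random-effects pooled estimate
se_mu = np.sqrt(1 / np.sum(w_star))

ci = mu + np.array([-1, 1]) * stats.norm.ppf(0.975) * se_mu
pi = mu + np.array([-1, 1]) * stats.t.ppf(0.975, k - 2) * np.sqrt(tau2 + se_mu**2)
i2 = max(0.0, (Q - (k - 1)) / Q) * 100

print(f"pooled effect {mu:.1f} mmHg, 95% CI {ci.round(1)}, 95% PI {pi.round(1)}, I2 = {i2:.0f}%")
```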
Abstract:
Background: In a previous study, the European Organisation for Research and Treatment of Cancer (EORTC) reported a scoring system to predict survival of patients with low-grade gliomas (LGGs). A major issue in the diagnosis of brain tumors is the lack of agreement among pathologists. New models in patients with LGGs diagnosed by central pathology review are needed. Methods: Data from 339 EORTC patients with LGGs diagnosed by central pathology review were used to develop new prognostic models for progression-free survival (PFS) and overall survival (OS). Data from 450 patients with centrally diagnosed LGGs recruited into 2 large studies conducted by North American cooperative groups were used to validate the models. Results: Both PFS and OS were negatively influenced by the presence of baseline neurological deficits, a shorter time since first symptoms (<30 wk), an astrocytic tumor type, and tumors larger than 5 cm in diameter. Early irradiation improved PFS but not OS. Three risk groups have been identified (low, intermediate, and high) and validated. Conclusions: We have developed new prognostic models in a more homogeneous LGG population diagnosed by central pathology review. This population better fits with modern practice, where patients are enrolled in clinical trials based on central or panel pathology review. We could validate the models in a large, external, and independent dataset. The models can divide LGG patients into 3 risk groups and provide reliable individual survival predictions. Inclusion of other clinical and molecular factors might still improve the models' predictions.
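An illustrative sketch, not the published EORTC model: fit a Cox proportional hazards model on the prognostic factors named above and split patients into three risk groups by terciles of the linear predictor; the column names and the use of the lifelines package are assumptions of the example.

```python
# Illustrative sketch (not the published EORTC model): Cox regression on the
# prognostic factors listed in the abstract, with three risk groups defined
# from terciles of the estimated risk score. Column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("lgg_patients.csv")   # hypothetical: os_months, os_event + covariates

covariates = ["neuro_deficit", "symptoms_lt_30wk", "astrocytic", "diameter_gt_5cm"]
cph = CoxPHFitter()
cph.fit(df[covariates + ["os_months", "os_event"]],
        duration_col="os_months", event_col="os_event")
cph.print_summary()

# three risk groups from terciles of the linear predictor
risk_score = cph.predict_partial_hazard(df[covariates]).squeeze()
df["risk_group"] = pd.qcut(risk_score, 3, labels=["low", "intermediate", "high"])
print(df.groupby("risk_group")["os_event"].mean())   # crude event fraction per group
```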
Abstract:
General Introduction: This thesis can be divided into two main parts: the first one, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements -whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA-, it has been claimed that the benefits from improved market access keep falling short of the full potential benefits. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent. Part I: In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Céline Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, it shows, in the case of PANEURO, that the R-index is useful to summarize how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence to making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore, changing all instruments into an MFC would bring improved transparency, pretty much like the "tariffication" of NTBs.
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of the good's value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the potential model for future EU-centered PTAs. First, I have studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs, used before 1997, and the "Single List" RoOs, used since 1997. Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly mix sales between the EU and the rest of the world by comparing producer prices on each market, I have estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II: The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced in the Anti-Dumping Agreement (ADA) a mandatory "sunset-review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there was a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count data analysis and survival analysis.
First, using Poisson and Negative Binomial regressions, the count of AD measures' revocations is regressed on (inter alia) the count of "initiations" lagged five years. The analysis yields a coefficient on measures' initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
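A hedged sketch of the count-data step: yearly revocation counts regressed on initiations lagged five years, with the coefficient allowed to differ before and after the 1995 Agreement. The panel layout and column names are assumptions, and a Poisson specification is shown for brevity, with the Negative Binomial variant noted in a comment.

```python
# Hedged sketch of the count-data step: revocation counts regressed on
# initiations lagged five years, with a post-Agreement interaction.
# The panel layout and column names below are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("ad_measures_by_country_year.csv")
# hypothetical columns: country, year, revocations, initiations

panel = panel.sort_values(["country", "year"])
panel["initiations_lag5"] = panel.groupby("country")["initiations"].shift(5)
panel["post_agreement"] = (panel["year"] >= 1995).astype(int)

fit = smf.poisson(
    "revocations ~ initiations_lag5 * post_agreement + C(country)",
    data=panel.dropna(subset=["initiations_lag5"]),
).fit()
print(fit.summary())
# smf.negativebinomial with the same formula relaxes the Poisson
# equal mean-variance assumption if the counts are overdispersed
```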
Abstract:
Modern methods of compositional data analysis are not well known in biomedical research. Moreover, there appear to be few mathematical and statistical researchers working on compositional biomedical problems. Like the earth and environmental sciences, biomedicine has many problems in which the relevant scientific information is encoded in the relative abundance of key species or categories. I introduce three problems in cancer research in which analysis of compositions plays an important role. The problems involve 1) the classification of serum proteomic profiles for early detection of lung cancer, 2) inference of the relative amounts of different tissue types in a diagnostic tumor biopsy, and 3) the subcellular localization of the BRCA1 protein and its role in breast cancer patient prognosis. For each of these problems I outline a partial solution. However, none of these problems is "solved". I attempt to identify areas in which additional statistical development is needed, with the hope of encouraging more compositional data analysts to become involved in biomedical research.
Abstract:
The application of correspondence analysis to square asymmetric tables is often unsuccessful because of the strong role played by the diagonal entries of the matrix, obscuring the data off the diagonal. A simple modification of the centering of the matrix, coupled with the corresponding change in row and column masses and row and column metrics, allows the table to be decomposed into symmetric and skew-symmetric components, which can then be analyzed separately. The symmetric and skew-symmetric analyses can be performed using a simple correspondence analysis program if the data are set up in a special block format.
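The sketch below shows the algebra behind this approach: splitting a square table into symmetric and skew-symmetric parts, and arranging the table and its transpose in a 2x2 block; the toy matrix is invented, and the particular block arrangement shown is an assumption of the example rather than the paper's exact setup.

```python
# Sketch of the underlying algebra: any square table N splits into a symmetric
# part S and a skew-symmetric part K, and a 2x2 block arrangement of N and its
# transpose can be fed to an ordinary CA program. Toy data; the block layout
# shown is an assumption of this example.
import numpy as np

N = np.array([[50.,  3.,  7.],
              [12., 40.,  5.],
              [ 9., 15., 30.]])   # toy square asymmetric table

S = (N + N.T) / 2                 # symmetric component
K = (N - N.T) / 2                 # skew-symmetric component
assert np.allclose(N, S + K)

block = np.block([[N, N.T],
                  [N.T, N]])      # block format for a standard CA program
print("symmetric part:\n", S)
print("skew-symmetric part:\n", K)
print("block matrix shape:", block.shape)
```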
Abstract:
We consider the joint visualization of two matrices which have common rows and columns, for example multivariate data observed at two time points or split according to a dichotomous variable. Methods of interest include principal components analysis for interval-scaled data, or correspondence analysis for frequency data or ratio-scaled variables on commensurate scales. A simple result in matrix algebra shows that by setting up the matrices in a particular block format, matrix sum and difference components can be visualized. The case when we have more than two matrices is also discussed, and the methodology is applied to data from the International Social Survey Program.
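A minimal sketch, with invented data, of the matrix-algebra result referred to above: the singular values of the 2x2 block arrangement [[A, B], [B, A]] are exactly those of A + B together with those of A - B. The example uses plain SVD rather than a full PCA or CA with centering, masses and metrics, so it illustrates only the algebraic identity.

```python
# Toy illustration of the block-matrix result: the singular values of
# [[A, B], [B, A]] are the union of those of (A + B) and (A - B).
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((6, 4))   # e.g. the same variables observed at time 1
B = rng.random((6, 4))   # ... and at time 2

block = np.block([[A, B],
                  [B, A]])

s_sum = np.linalg.svd(A + B, compute_uv=False)     # "sum" components
s_dif = np.linalg.svd(A - B, compute_uv=False)     # "difference" components
s_block = np.linalg.svd(block, compute_uv=False)

# the block's singular values match the combined sum/difference sets
print(np.allclose(np.sort(s_block), np.sort(np.concatenate([s_sum, s_dif]))))
```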