906 results for "Exploratory statistical data analysis"


Abstract:

In standard multivariate statistical analysis, common hypotheses of interest concern changes in mean vectors and subvectors. In compositional data analysis it is now well established that compositional change is most readily described in terms of the simplicial operation of perturbation and that subcompositions replace the marginal concept of subvectors. To motivate the statistical developments of this paper we present two challenging compositional problems from food production processes. Against this background the relevance of perturbations and subcompositions can be clearly seen. Moreover, we can identify a number of hypotheses of interest involving the specification of particular perturbations or differences between perturbations, and also hypotheses of subcompositional stability. We identify the two problems as being the counterpart of the analysis of paired comparison or split plot experiments and of separate sample comparative experiments in the jargon of standard multivariate analysis. We then develop appropriate estimation and testing procedures for a complete lattice of relevant compositional hypotheses.
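A minimal sketch of the simplicial operations the abstract builds on, assuming compositions closed to 1 (illustrative only; not the authors' code):

```python
# Closure, perturbation, and subcomposition on the simplex (numpy only).
import numpy as np

def closure(x):
    """Rescale a positive vector so its parts sum to 1."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, p):
    """Perturbation: component-wise product followed by closure."""
    return closure(np.asarray(x) * np.asarray(p))

def subcomposition(x, idx):
    """Subcomposition: select the parts in `idx` and re-close."""
    return closure(np.asarray(x)[list(idx)])

x = closure([0.2, 0.3, 0.5])      # a 3-part composition
p = closure([1.1, 0.9, 1.0])      # a perturbation (itself a composition)
print(perturb(x, p))              # compositional "change" applied to x
print(subcomposition(x, [0, 2]))  # 2-part subcomposition of parts 1 and 3
```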

Abstract:

INTRODUCTION According to several series, hospital hyponutrition affects 30-50% of hospitalized patients. This high prevalence justifies the need for early detection from admission. There are several classical screening tools, but they show important limitations in their systematic application in daily clinical practice. OBJECTIVES To analyze the relationship between hyponutrition, detected by our screening method, and mortality, hospital stay, or re-admissions; to analyze, as well, the relationship between hyponutrition and the prescription of nutritional support; to compare different nutritional screening methods at admission on a random sample of hospitalized patients; and to validate the INFORNUT method for nutritional screening. MATERIAL AND METHODS In a phase prior to the study design, a retrospective analysis of data from 2003 was carried out to establish the situation of hyponutrition in Virgen de la Victoria Hospital, Malaga, gathering data from the MBDS (Minimal Basic Data Set), laboratory analyses of nutritional risk (FILNUT filter), and prescriptions of nutritional support. In the experimental phase, a cross-sectional cohort study was done with a random sample of 255 patients in May 2004. Anthropometric assessment, Subjective Global Assessment (SGA), Mini-Nutritional Assessment (MNA), Nutritional Risk Screening (NRS), Gassull's method, CONUT, and INFORNUT were performed. The settings of the INFORNUT filter were: albumin <3.5 g/dL, and/or total proteins <5 g/dL, and/or prealbumin <18 mg/dL, with or without total lymphocyte count <1,600 cells/mm³ and/or total cholesterol <180 mg/dL. To compare the different methods, a gold standard was created based on the recommendations of the SENPE on anthropometric and laboratory data. Statistical association was tested by the chi-squared test (α = 0.05) and agreement by the kappa (κ) index. RESULTS In the preliminary phase, the prevalence of hospital hyponutrition was 53.9%. One thousand six hundred and forty-four patients received nutritional support, of whom 66.9% suffered from hyponutrition. We also observed that hyponutrition is one of the factors favoring increased mortality (hyponourished patients 15.19% vs. non-hyponourished 2.58%), hospital stay (20.95 days vs. 8.75 days), and re-admissions (14.30% vs. 6%). The results of the experimental study are as follows: the prevalence of hyponutrition was 61% by the gold standard and 60% by INFORNUT. Agreement among INFORNUT, CONUT, and GASSULL is good or very good (κ = 0.67 for INFORNUT with CONUT; κ = 0.94 for INFORNUT with GASSULL), as is their agreement with the gold standard (INFORNUT κ = 0.83; CONUT κ = 0.64; GASSULL κ = 0.89). However, the structured tests (SGA, MNA, NRS) show low agreement with the gold standard and with the laboratory or mixed tests (Gassull), and only low to intermediate agreement with one another (κ = 0.489 for NRS with SGA). INFORNUT shows a sensitivity of 92.3%, a positive predictive value of 94.1%, and a specificity of 91.2%. After the filter phase, a preliminary report is issued, to which anthropometric and intake data are added to produce a Nutritional Risk Report. CONCLUSIONS The hyponutrition prevalence in our study (60%) is similar to that found by other authors. Hyponutrition is associated with increased mortality, hospital stay, and re-admission rates.
No existing tool has proven effective for early detection of hyponutrition in the hospital setting without important limitations to its applicability. FILNUT, the first phase of the INFORNUT filter process, is a valid tool: it has the sensitivity and specificity required for nutritional screening at admission. The main advantages of the process are early detection of patients at risk of hyponutrition; a teaching and awareness function for health care staff, involving them in the nutritional assessment of their patients; and the recording of the hyponutrition diagnosis and the need for nutritional support in the discharge report, to be registered by the Clinical Documentation Department. INFORNUT can therefore serve as a universal screening method with a good cost-effectiveness ratio.
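One possible reading of the FILNUT cut-offs above, as a toy rule (the field names and the treatment of unmeasured analytes are assumptions, not the published implementation):

```python
# Hypothetical sketch of the FILNUT laboratory filter; None = not measured.
def filnut_flag(albumin=None, total_protein=None, prealbumin=None,
                lymphocytes=None, cholesterol=None):
    """Return True if the laboratory profile suggests nutritional risk."""
    primary = [
        albumin is not None and albumin < 3.5,            # g/dL
        total_protein is not None and total_protein < 5,  # g/dL
        prealbumin is not None and prealbumin < 18,       # mg/dL
    ]
    # Lymphocyte count (<1,600 cells/mm3) and cholesterol (<180 mg/dL) are
    # read here as supportive criteria only ("with or without" in the
    # abstract), so the flag fires on any primary criterion.
    return any(primary)

print(filnut_flag(albumin=3.2))  # True: below the 3.5 g/dL cut-off
```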

Abstract:

A methodology of exploratory data analysis investigating the phenomenon of orographic precipitation enhancement is proposed. The precipitation observations obtained from three Swiss Doppler weather radars are analysed for the major precipitation event of August 2005 in the Alps. Image processing techniques are used to detect significant precipitation cells/pixels in radar images while filtering out spurious effects due to ground clutter. The contribution of topography to precipitation patterns is described by an extensive set of topographical descriptors computed from the digital elevation model at multiple spatial scales. Additionally, the motion vector field is derived from subsequent radar images and integrated into the set of topographic features to highlight the slopes exposed to the main flows. Exploratory data analysis with a recent spectral clustering algorithm shows that orographic precipitation cells are generated under specific flow and topographic conditions. The repeatability of precipitation patterns in particular spatial locations is found to be linked to specific local terrain shapes, e.g. at the tops of hills and on the upwind side of mountains. This methodology and our empirical findings for the Alpine region provide a basis for building computational data-driven models of orographic enhancement and triggering of precipitation.
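A generic illustration of the clustering step, assuming pixel-level feature vectors that combine topographic descriptors with flow information (feature names and scikit-learn settings are stand-ins, not the paper's pipeline):

```python
# Spectral clustering of radar pixels on joint topographic/flow features.
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
n = 300
features = np.column_stack([
    rng.normal(size=n),  # e.g. slope at some spatial scale
    rng.normal(size=n),  # e.g. curvature
    rng.normal(size=n),  # e.g. motion vector projected on terrain gradient
])

labels = SpectralClustering(n_clusters=3, affinity="nearest_neighbors",
                            random_state=0).fit_predict(features)
print(np.bincount(labels))  # number of pixels assigned to each cluster
```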

Abstract:

Analyzing functional data often leads to finding common factors, for which functional principal component analysis proves to be a useful tool to summarize and characterize the random variation in a function space. The representation in terms of eigenfunctions is optimal in the sense of L2 approximation. However, the eigenfunctions are not always directed towards an interesting and interpretable direction in the context of functional data and thus can obscure the underlying structure. To overcome this difficulty, an alternative to functional principal component analysis is proposed that produces directed components which may be more informative and easier to interpret. These structural components are similar to principal components, but are adapted to situations in which the domain of the function may be decomposed into disjoint intervals such that there is effectively independence between intervals and positive correlation within intervals. The approach is demonstrated with synthetic examples as well as real data. Properties for special cases are also studied.
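For reference, a minimal sketch of ordinary functional PCA, the baseline the proposed structural components depart from, on synthetic curves discretized on a common dense grid (no smoothing step; not the paper's method):

```python
# Functional PCA via eigendecomposition of the sample covariance on a grid.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 101)
# 50 synthetic curves: two random factors times fixed shapes, plus noise.
scores = rng.normal(size=(50, 2))
curves = scores @ np.vstack([np.sin(2*np.pi*t), np.cos(2*np.pi*t)]) \
         + 0.05 * rng.normal(size=(50, t.size))

centered = curves - curves.mean(axis=0)
cov = centered.T @ centered / (len(curves) - 1)
eigval, eigvec = np.linalg.eigh(cov)            # ascending order
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]  # leading components first
print(eigval[:3] / eigval.sum())                # share of variance explained
```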

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace it by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts (and thus the metric properties) should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
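A minimal sketch of the multiplicative replacement of rounded zeros for a composition closed to 1; the choice of delta is an assumption and would in practice reflect the detection limit:

```python
# Multiplicative zero replacement: zeros become a small delta and the
# non-zero parts are scaled so the total stays 1, which preserves the
# ratios (and hence the covariance structure) among the non-zero parts.
import numpy as np

def multiplicative_replacement(x, delta=1e-3):
    x = np.asarray(x, dtype=float)
    x = x / x.sum()                  # close to 1
    zeros = x == 0
    return np.where(zeros, delta, x * (1 - delta * zeros.sum()))

x = [0.1, 0.0, 0.5, 0.4]
y = multiplicative_replacement(x)
print(y, y.sum())                    # still sums to 1
```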

Abstract:

In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
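A minimal sketch of the ilr-plus-normal-kernel idea, assuming one particular orthonormal ilr basis and scipy's gaussian_kde (whose default bandwidth is a full, data-driven matrix); not the authors' implementation, and the returned density is on the ilr scale (no Jacobian back to the simplex):

```python
# KDE on the simplex: map compositions to R^(D-1) via ilr, then Gaussian KDE.
import numpy as np
from scipy.stats import gaussian_kde

def ilr(x):
    """ilr coordinates of compositions (rows of x) w.r.t. a Helmert-type basis."""
    x = np.asarray(x, dtype=float)
    logx = np.log(x)
    clr = logx - logx.mean(axis=1, keepdims=True)
    D = x.shape[1]
    V = np.zeros((D, D - 1))          # one common orthonormal basis choice
    for j in range(D - 1):
        V[: j + 1, j] = 1.0 / (j + 1)
        V[j + 1, j] = -1.0
        V[:, j] *= np.sqrt((j + 1) / (j + 2))
    return clr @ V

rng = np.random.default_rng(2)
comps = rng.dirichlet([4, 2, 3], size=200)  # synthetic 3-part compositions
z = ilr(comps)                              # points in R^2
kde = gaussian_kde(z.T)                     # full bandwidth matrix by default
print(kde(ilr(np.array([[0.4, 0.2, 0.4]])).T))  # density at one point (ilr scale)
```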

Abstract:

The quantitative estimation of Sea Surface Temperatures (SST) from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern core-top samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as the distance measure. Modern core-top datasets are characterised by a large number of zeros. The zero replacement was carried out by adopting a Bayesian approach based on a posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by means of a multiple approach considering the proxies correlation matrix, the Standardized Residual Sum of Squares, and the Mean Squared Distance. This new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea.
Keywords: modern analogues, Aitchison distance, proxies correlation matrix, Standardized Residual Sum of Squares
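A minimal sketch of ranking modern analogues by the Aitchison distance (Euclidean distance between clr-transformed compositions), assuming zeros have already been replaced, e.g. by the Bayesian approach mentioned above:

```python
# Aitchison distance between a fossil assemblage and modern core-top samples.
import numpy as np

def aitchison_distance(x, y):
    lx, ly = np.log(x), np.log(y)
    return np.linalg.norm((lx - lx.mean()) - (ly - ly.mean()))

fossil = np.array([0.30, 0.45, 0.25])        # fossil assemblage (closed)
coretops = np.array([[0.28, 0.50, 0.22],
                     [0.60, 0.20, 0.20]])    # modern core-top samples
d = [aitchison_distance(fossil, c) for c in coretops]
print(np.argsort(d))                         # nearest modern analogues first
```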

Abstract:

INTRODUCTION Monotherapy (MT) against HIV has undoubted theoretical advantages and a sound scientific basis. However, it is still controversial, and here we analyze the efficacy and safety of MT with ritonavir-boosted darunavir (DRV/r) in patients who received this treatment in our hospitals. MATERIALS AND METHODS Observational retrospective study including patients from 10 Andalusian hospitals who received DRV/r as MT and were followed for a minimum of 12 months. We carried out a descriptive statistical analysis of the profile of patients who had been prescribed MT and of the efficacy and safety observed, paying special attention to treatment failure and virological evolution. RESULTS DRV/r was prescribed to 604 patients, of whom 41.1% had a CD4 nadir <200/mmc. Of these patients, 33.1% had chronic hepatitis caused by HCV; they had received an average of five lines of previous treatment, with a history of treatment failure on analogues in 33%, on non-analogues in 22%, and on protease inhibitors (PIs) in 19.5%. 76.6% came to MT from a previous PI-based regimen. Simplification was the main criterion for starting MT in 81.5% of cases, and adverse effects in 18.5%. MT was maintained in 84% of cases, with only 4.8% virological failure (VF) with a viral load (VL) >200 copies/mL and an additional 3.6% of losses due to VF with a VL between 50 and 200 copies/mL. Thirty-three genotypes were performed after failure, with no findings of resistance mutations to DRV/r or other PIs. Only 23.7% of patients presented blips during the period of exposure to MT. Eighty-seven percent of all VL determinations were <50 copies/mL, and only 4.99% were >200 copies/mL. Although up to 14.9% registered an adverse event (AE) at some point, only 2.6% abandoned MT because of an AE and 1.2% by voluntary decision. Although average total and LDL cholesterol increased by 10 mg/dL after 2 years of follow-up, HDL cholesterol also rose by 3 mg/dL, and triglycerides (-14 mg/dL) and GPT (-6 IU/mL) decreased. The average CD4 lymphocyte count increased from 642 to 714/mm³ at 24 weeks. CONCLUSIONS In a very broad series of patients from clinical practice, the data from clinical trials were confirmed: MT with DRV as a de-escalation strategy is very safe, is associated with a negligible rate of adverse effects, and maintains good suppression of HIV replication. VF (whether >50 or >200 copies/mL) is always under 10% and in any case without consequences.

Abstract:

It is a well established fact that the entry of women into higher-level professional occupations has not resulted in their equal distribution within these occupations. Indeed, the emergence and persistence of horizontal and vertical gender segregation within the professions has been at the heart of the development of a range of alternative theoretical perspectives on both the "feminisation process" and the future of the "professions" more generally. Through an in-depth comparative analysis of the recent changes in the organisation and administration of the medical profession in Britain and France, this paper draws upon statistical data and biographical interviews with male and female general practitioners (GPs) in both countries in order to discuss and review a variety of approaches that have been adopted to explain and analyse the "feminisation" process of higher-level professions. Our conclusions review the theoretical debates in the light of the evidence we have presented. It is argued that, despite important elements of continuity in respect of gendered occupational structuring in both countries, national variations in both professional and domestic gendered architectures lead to different outcomes as far as the extent and patterns of internal occupational segregation are concerned. Both female and male doctors are currently seeking, with some effect, to resist the pressures of medicine on family life.

Abstract:

A new quantitative approach to mandibular sexual dimorphism, based on computer-aided image analysis and elliptical Fourier analysis of the mandibular outline in lateral view, is presented. This method was applied to a series of 117 dentulous mandibles from 69 male and 48 female individuals from Rhenish regions. Statistical discriminant analysis of the elliptical Fourier harmonics demonstrated significant sexual dimorphism, correctly classifying 97.1% of males and 91.7% of females, i.e. a higher proportion than in previous studies using classical metrical approaches. This original method opens interesting perspectives for increasing the accuracy of sex identification in current anthropological practice and in forensic procedures.
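A simplified illustration of the pipeline on toy outlines: plain complex-FFT shape descriptors (a simpler variant, not the full Kuhl-Giardina elliptical Fourier analysis used in the study) fed to a linear discriminant classifier:

```python
# Fourier outline descriptors + LDA on synthetic "mandible" outlines.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def fourier_descriptors(xy, n_harmonics=10):
    """First harmonics of a closed outline given as an (n, 2) point array."""
    z = xy[:, 0] + 1j * xy[:, 1]
    coeffs = np.fft.fft(z - z.mean())          # translation invariance
    coeffs = coeffs / np.abs(coeffs[1])        # scale invariance
    return np.abs(coeffs[1:n_harmonics + 1])   # rotation/start-point invariance

rng = np.random.default_rng(3)
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)

def outline(elong):  # toy outlines: noisy ellipses of varying elongation
    xy = np.column_stack([np.cos(t) * elong, np.sin(t)])
    return xy + 0.02 * rng.normal(size=xy.shape)

X = np.array([fourier_descriptors(outline(e))
              for e in np.r_[rng.normal(1.4, 0.05, 40),    # "male" shapes
                             rng.normal(1.2, 0.05, 40)]])  # "female" shapes
y = np.array([1] * 40 + [0] * 40)
lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.score(X, y))  # resubstitution accuracy on the toy data
```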

Abstract:

Pounamu (NZ jade), or nephrite, is a protected mineral in its natural form following the transfer of ownership back to Ngai Tahu under the Ngai Tahu (Pounamu Vesting) Act 1997. Any theft of nephrite is prosecutable under the Crimes Act 1961. Scientific evidence is essential in cases where origin is disputed. A robust method for discrimination of this material through the use of elemental analysis and compositional data analysis is required. Initial studies have characterised the variability within a given nephrite source. This has included investigation of both in situ outcrops and alluvial material. Methods for the discrimination of two geographically close nephrite sources are being developed.
Keywords: forensic, jade, nephrite, laser ablation, inductively coupled plasma mass spectrometry, multivariate analysis, elemental analysis, compositional data analysis
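A minimal sketch, on hypothetical data, of how compositional source discrimination might look: clr-transform the element concentrations, then fit a linear discriminant between two sources (not the study's actual elements or protocol):

```python
# clr transform + LDA for provenance discrimination on toy compositions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def clr(x):
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

rng = np.random.default_rng(4)
# Toy "element concentrations" for two sources, closed to a constant sum.
source_a = rng.dirichlet([8, 3, 2, 1], size=30)
source_b = rng.dirichlet([6, 4, 3, 1], size=30)
X = clr(np.vstack([source_a, source_b]))
y = np.array([0] * 30 + [1] * 30)

model = LinearDiscriminantAnalysis().fit(X, y)
unknown = clr(rng.dirichlet([8, 3, 2, 1], size=1))
print(model.predict(unknown))  # predicted source for an unknown sample
```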

Abstract:

Implementation of social investments through corporate foundations is growing, and it is therefore important to study their governance better. Governance is conceptualized as a set of control and incentive mechanisms to overcome the so-called agency conflicts that originate from the separation of ownership and management in for-profit organizations, a concept also applied to nonprofit institutions. It is argued that corporate foundations have characteristics both of companies and of civil society organizations, which distinguishes them from both types of organization. This paper presents a study in which a set of governance mechanisms, adapted from those identified by a literature review of corporate and nonprofit governance, was selected for examination. It is an exploratory, descriptive case study that analyzed data on eight organizations, collected from publications and interviews with their CEOs. The data analysis indicates that it is appropriate to distinguish the different organization types and to apply agency theory. The results indicate that the selected governance mechanisms may be adapted and used in corporate foundations. However, they are only partially applied in the observed cases, which suggests the need for further studies to consolidate these practices in such organizations.

Abstract:

Introduction: Patients with cystic fibrosis (CF) are more susceptible to pathogens like P. aeruginosa (PA). PA primo-infections require particular attention because, if eradication fails, lung deterioration is accelerated. The main aim of this study is to assess the rate of PA eradication under our particular protocol of inhaled tobramycin and oral ciprofloxacin, as there is no consensus in the literature on which eradication protocol is best. Methods: Retrospective single-centre study analysing data from June 1st 2007 to June 1st 2011 on patients whose primo-infections were treated exclusively with 3 × 28 days of inhaled tobramycin, with oral ciprofloxacin for the first and last 21 days. Success in eradication is defined as ≥3 negative bacteriologies over the 6 months after the beginning of the protocol; if ≥1 bacteriology is positive, we consider the eradication a failure. Results: Out of 41 patients, 18 were included in our analysis: 7 girls (38.9%) and 11 boys (61.1%) followed the eradication protocol. Boys had 12 primo-infections and girls had 8. Among these 20 primo-infections, 16 (80%) ended in successful eradication and 4 (20%) in failure. There was no statistically significant difference between these groups for age (t = 0.07, p = 0.94), FEV1% (t = 0.96, p = 0.41), or BMI (t = 1.35, p = 0.27). The rate of success was 100% for girls and 66.6% for boys. Conclusion: Our protocol achieved an overall eradication rate of 80%, with no statistically significant impact on FEV1% or BMI values. However, there is a sex difference in eradication rates between girls (100%) and boys (66.6%), which has not yet been reported in the literature and should be evaluated in further studies.
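The success criterion above expressed as a toy rule (a sketch, not the centre's records system):

```python
# Eradication criterion: >=3 bacteriologies within 6 months, all negative.
def eradication_success(results):
    """results: list of booleans (True = positive culture) within 6 months."""
    return len(results) >= 3 and not any(results)

print(eradication_success([False, False, False]))  # True: eradicated
print(eradication_success([False, True, False]))   # False: >=1 positive
```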

Abstract:

In 1993, Iowa Workforce Development (then the Department of Employment Services) conducted a survey to determine whether there was a gender gap in wages paid. The results of that survey indicated that women were paid 68 cents per dollar paid to men. We felt a need to determine whether this relationship had changed since the 1993 study. In 1999, the Commission on the Status of Women requested that Iowa Workforce Development conduct research to update the 1993 information. A survey, cosponsored by the Commission on the Status of Women and Iowa Workforce Development, was conducted in 1999. The results showed that women earned 73 percent of what men earned when both jobs were considered. (The survey asked respondents to provide information on a primary job and a secondary job.) The ratio for the primary job was 72 percent, while the ratio for the secondary job was 85 percent. Additional survey results detail the types of jobs respondents held, the types of companies for which they worked, and their education and experience levels. All of these characteristics can contribute to these ratios. While the large influx of women into the labor force may be over, it is still important to examine such information to determine whether future action is needed. We present these results with that goal in mind. We are indebted to those Iowans, female and male, who voluntarily completed the survey. This study was completed under the general direction of Judy Erickson. The report was written by Shazada Khan, Teresa Wageman, Ann Wagner, and Yvonne Younes, with administrative and technical assistance from Michael Blank, Margaret Lee, and Gary Wilson. The Iowa State University Statistical Lab provided sampling advice, data entry and coding, and data analysis.

Abstract:

General Introduction

This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South Trade Agreements, whether reciprocal (like the Europe Agreements (EAs) or NAFTA) or not (such as the GSP, AGOA, or EBA), it has been claimed that the benefits from improved market access keep falling short of their full potential. RoOs are largely regarded as a primary cause of the under-utilization of the improved market access of PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access in a statistically significant and quantitatively large proportion.

Part I

In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Celine Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of the restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying it and extending it to the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, in the case of PANEURO, the R-index is useful to summarize how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs.

The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also contribute to moving regionalism toward more openness and hence making it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Therefore, changing all instruments into an MFC would bring improved transparency, much like the "tariffication" of NTBs.
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system and explore the possibility of setting this unique instrument at a uniform rate across lines (a stylized sketch of these three steps is given at the end of Part I, below). This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of good value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests.

The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the model for future EU-centered PTAs. First, I studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs (used before 1997) and the "Single List" RoOs (used since 1997). Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly mix sales between the EU and the rest of the world by comparing producer prices on each market, I estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession.
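A stylized numerical sketch of the three-step exercise referenced above, under entirely hypothetical data and a logit functional form (neither the thesis data nor its estimator):

```python
# Step 1: fit utilization ~ preference + MFC; Step 2: invert line by line;
# Step 3: trade-weighted average of the simulated MFC.
import numpy as np

rng = np.random.default_rng(5)
n = 1000                                   # tariff lines
pref = rng.uniform(0, 0.15, n)             # tariff preference margin
mfc = rng.uniform(0.1, 0.6, n)             # "true" leniency (higher = laxer)
util = 1 / (1 + np.exp(-(-2 + 25 * pref + 5 * mfc)))  # utilization rate

# Step 1: estimate the logit relationship by least squares on the logit scale.
X = np.column_stack([np.ones(n), pref, mfc])
beta = np.linalg.lstsq(X, np.log(util / (1 - util)), rcond=None)[0]

# Step 2: invert, line by line, for the MFC matching each utilization rate.
logit_u = np.log(util / (1 - util))
mfc_sim = (logit_u - beta[0] - beta[1] * pref) / beta[2]

# Step 3: trade-weighted average of the simulated MFC across lines.
weights = rng.uniform(0, 1, n)
weights /= weights.sum()
print(np.average(mfc_sim, weights=weights))
```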
Part II

The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument that has the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years from their imposition and terminated unless there is a serious risk of a resumption of injurious dumping. The chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count data analysis and survival analysis.

First, using Poisson and Negative Binomial regressions, the count of AD measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries, and for the products' sector, we find a larger increase in the hazard rate for AD measures covered by the Agreement than for other measures.
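A stylized sketch of the two methods on simulated data (statsmodels for the Poisson count regression, lifelines for the Cox model; all coefficients and data are invented for illustration):

```python
# Count model: revocations_t ~ initiations_{t-5}; survival model: duration of
# AD measures with a post-agreement indicator shifting the hazard.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 200

# Poisson regression of revocations on five-year-lagged initiations.
init_lag5 = rng.poisson(10, n)
revocations = rng.poisson(np.exp(0.3 + 0.08 * init_lag5))
X = sm.add_constant(init_lag5.astype(float))
poisson_fit = sm.GLM(revocations, X, family=sm.families.Poisson()).fit()
print(poisson_fit.params)  # coefficient on lagged initiations

# Cox model: post-agreement measures die faster (shorter expected lifetime).
post = rng.integers(0, 2, n)
duration = rng.exponential(1 / (0.15 * np.exp(0.4 * post)))
df = pd.DataFrame({"duration": duration, "event": 1, "post": post})
cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
print(cph.params_)         # hazard ratio > 1 means shorter expected lifetime
```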