943 results for Weighted average power tests
Abstract:
Background: According to the Report on Carcinogens, formaldehyde ranks 25th in overall U.S. chemical production, with more than 5 million tons produced each year. Given its economic importance and widespread use, many people are exposed to formaldehyde environmentally and/or occupationally. The International Agency for Research on Cancer currently classifies formaldehyde as carcinogenic to humans (Group 1), based on sufficient evidence in humans and in experimental animals. Many in vitro studies have clearly indicated that formaldehyde can induce genotoxic effects in proliferating cultured mammalian cells. Furthermore, some in vivo studies have found changes in epithelial cells and in peripheral blood lymphocytes related to formaldehyde exposure. Methods: A study was carried out in Portugal on 80 workers occupationally exposed to formaldehyde vapours: 30 workers from a factory producing formaldehyde and formaldehyde-based resins, and 50 from 10 pathology and anatomy laboratories. A control group of 85 non-exposed subjects was also included. Exposure assessment was performed by simultaneously applying two air-monitoring techniques: NIOSH Method 2541 and photo-ionization detection equipment with synchronized video recording. Genotoxic effects were evaluated by applying the micronucleus test to exfoliated epithelial cells from the buccal mucosa and to peripheral blood lymphocytes. Results: Time-weighted average concentrations did not exceed the reference value (0.75 ppm) in either of the two occupational settings studied. Ceiling concentrations, on the other hand, were higher than the reference value (0.3 ppm) in both. The frequency of micronuclei in peripheral blood lymphocytes and in epithelial cells was significantly higher in both exposed groups than in the control group (p < 0.001). Moreover, the frequency of micronuclei in peripheral blood lymphocytes was significantly higher in the laboratory group than in the factory workers (p < 0.05).
A moderate positive correlation was found between the duration of occupational exposure to formaldehyde (years of exposure) and micronucleus frequency in peripheral blood lymphocytes (r = 0.401; p < 0.001) and in epithelial cells (r = 0.209; p < 0.01). Conclusions: The population studied is exposed to high peak concentrations of formaldehyde over long periods. These two aspects, cumulatively, may be the cause of the observed genotoxic effects. The association of these cytogenetic effects with formaldehyde exposure provides important information for the risk-assessment process and may also be used to assess health risks for exposed workers.
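The two exposure metrics compared above differ in kind: the TWA averages concentration over the shift, while the ceiling limit bounds every instantaneous reading. A minimal sketch of both checks, with made-up readings (only the 0.75 ppm TWA and 0.3 ppm ceiling reference values come from the abstract):

```python
def twa(readings):
    """Time-weighted average from (concentration_ppm, duration_h) pairs."""
    total_exposure = sum(c * t for c, t in readings)
    total_time = sum(t for _, t in readings)
    return total_exposure / total_time

def exceeds_ceiling(readings, ceiling_ppm):
    """True if any single reading exceeds the ceiling limit."""
    return any(c > ceiling_ppm for c, _ in readings)

# Illustrative 8-h shift: low background with short high peaks
shift = [(0.10, 6.0), (0.50, 1.5), (1.20, 0.5)]  # (ppm, hours)
print(round(twa(shift), 3))          # TWA stays below 0.75 ppm
print(exceeds_ceiling(shift, 0.3))   # yet peaks breach the ceiling
```

This reproduces the pattern reported: a compliant TWA can coexist with ceiling exceedances driven by brief peaks.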
Abstract:
Standardizing a workstation means nothing more than defining the best working method, to be followed by every operator who works at it. Once that method is defined, it is important for a company to know the productivity it can reach, a figure that can be derived from the method; this is where the study of methods and times comes in, more specifically the stopwatch time study. The application of this study was prompted by the need of IKEA Industry in Paços de Ferreira to take the next step in standardizing its workstations, area by area, and by the need to have one person in each area analysing the work being done and calculating the time of each routine. This document links the theoretical concepts the method requires (the whole set of formulas, constraints, analyses and weightings) with the industrial context where it was applied and the strategy IKEA developed to carry out the study. Of all existing methods, the stopwatch time study can be considered the most complete and complex, since it involves more than observing, recording and taking a weighted average of the observations. The method rests on a mathematical model that links a series of concepts and always takes the operator into account, whether in evaluating and analysing the tasks that demand the most effort from operators, physical or psychological, or in the personal-break times that companies are legally required to grant. This detail matters greatly, since standardization is often seen by operators as a punishment. The method's drawbacks lie in the degree of knowledge and observation skill demanded of the analyst who performs it.
In other words, an analyst performing this work needs to observe the work routine very carefully and to know where it starts, where it ends and everything that does not belong to it, before recording any times at all. Beyond that, the analyst must gauge the operators' working pace through observation, so that no one is penalized. Finally, the analyst must be highly available, in order to collect as many observations as possible. To ease this analysis, IKEA Industry created a file that compiles all the information related to the method, together with an explanation of every parameter the analyst must keep in mind. This worksheet was validated against the method, as shown throughout the document. One important detail: however reliable this method may be, as with any standardization method, the slightest change to the work routine immediately invalidates the routine's total time, making it necessary to run the study again. One advantage of the document IKEA created lies in its quick adaptation to such changes: if an element is added to or removed from the routine, it suffices to update the document and to observe and time the operators performing the new element, and a new standardized total time for the routine is defined almost automatically. This document was written for academic purposes, to complete an academic degree, but when applied in the company the study led to new hires, which in itself shows the advantages and impact it can have in an industrial setting. As for productivity, since the study's application was not carried out in time to be examined in this document, it could not be evaluated.
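The core calculation of a stopwatch time study (average observed time, levelled by the analyst's performance rating, plus the legally required allowances) can be sketched as follows. The rating and allowance figures are illustrative, not IKEA's:

```python
def standard_time(observed_times, rating_pct, allowance_pct):
    """Stopwatch time study: average the timed observations, level them by
    the analyst's performance rating, then add personal/fatigue allowances."""
    observed = sum(observed_times) / len(observed_times)   # average observation, s
    normal = observed * rating_pct / 100                   # basic (normal) time
    return normal * (1 + allowance_pct / 100)              # standard time, s

# Illustrative: 5 timings of one routine element, operator rated at 110%
# of normal pace, 12% allowance for personal needs and fatigue
cycles = [42.0, 40.5, 41.2, 43.1, 40.7]
print(round(standard_time(cycles, 110, 12), 1))
```

The operator-centred adjustments the abstract emphasizes appear here as the rating (pace judgment) and allowance (mandatory breaks) factors; any change to the routine's elements invalidates the total and requires re-timing.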
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Economics from the NOVA – School of Business and Economics
Abstract:
Background. Accurate quantification of the prevalence of human immunodeficiency virus type 1 (HIV-1) drug resistance in patients who are receiving antiretroviral therapy (ART) is difficult, and results from previous studies vary. We attempted to assess the prevalence and dynamics of resistance in a highly representative patient cohort from Switzerland. Methods. On the basis of genotypic resistance test results and clinical data, we grouped patients according to their risk of harboring resistant viruses. Estimates of resistance prevalence were calculated on the basis of either the proportion of individuals with virologic failure or confirmed drug resistance (lower estimate) or the frequency-weighted average of risk-group-specific probabilities for the presence of drug resistance mutations (upper estimate). Results. Lower and upper estimates of drug resistance prevalence in 8064 ART-exposed patients were 50% and 57% in 1999 and 37% and 45% in 2007, respectively. This decrease was driven by 2 mechanisms: loss to follow-up or death of high-risk patients exposed to mono- or dual-nucleoside reverse-transcriptase inhibitor therapy (lower estimates ranging from 72% to 75%) and continued enrollment of low-risk patients taking combination ART containing boosted protease inhibitors or nonnucleoside reverse-transcriptase inhibitors as first-line therapy (lower estimates ranging from 7% to 12%). A subset of 4184 participants (52%) had 1 study visit per year during 2002-2007. In this subset, lower and upper estimates increased from 45% to 49% and from 52% to 55%, respectively. Yearly increases in prevalence became smaller in later years. Conclusions. Contrary to earlier predictions, in situations of free access to drugs, close monitoring, and rapid introduction of new potent therapies, the emergence of drug-resistant viruses can be minimized at the population level.
Moreover, this study demonstrates the necessity of interpreting time trends in the context of evolving cohort populations.
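The lower and upper bounds described in Methods both reduce to frequency-weighted averages over risk groups; a sketch with hypothetical risk groups (the group sizes and probabilities below are invented for illustration, not the cohort's data):

```python
def prevalence_bounds(groups):
    """groups: (n_patients, p_confirmed, p_risk) per risk group, where
    p_confirmed is the fraction with virologic failure or confirmed
    resistance (lower estimate) and p_risk the estimated probability of
    harboring resistance mutations (upper estimate)."""
    total = sum(n for n, _, _ in groups)
    lower = sum(n * pc for n, pc, _ in groups) / total
    upper = sum(n * pr for n, _, pr in groups) / total  # frequency-weighted average
    return lower, upper

# Hypothetical cohort: high-risk mono/dual-NRTI-exposed patients vs.
# low-risk patients starting modern first-line combination ART
cohort = [(2000, 0.72, 0.85), (6000, 0.10, 0.20)]
lo, hi = prevalence_bounds(cohort)
print(round(lo, 3), round(hi, 3))
```

The cohort-composition effect the abstract describes falls out directly: shrinking the high-risk group and enrolling more low-risk patients pulls both weighted averages down even if group-specific probabilities are unchanged.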
Abstract:
The paper demonstrates that the ratio of the Yitzhaki (1994) to the conventional measure of between-group inequality is in general equal to one minus twice the weighted average probability that a random member of a richer (on average) group is poorer than a random member of a poorer (on average) group, and may therefore be interpreted as an index of stratification in its own right.
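The interpretation can be made concrete: given income lists for the two groups, the probability in question is estimable by exhaustive pairwise comparison. A sketch with illustrative incomes (not the paper's data or its estimator for survey weights):

```python
def overlap_probability(richer, poorer):
    """P(a random member of the richer-on-average group is poorer than a
    random member of the poorer-on-average group), by pairwise comparison."""
    pairs = [(r, p) for r in richer for p in poorer]
    return sum(r < p for r, p in pairs) / len(pairs)

def stratification_index(richer, poorer):
    """1 - 2 * overlap probability: equals 1 under perfect stratification
    (no overlap), and falls as the groups' income ranges interleave."""
    return 1 - 2 * overlap_probability(richer, poorer)

rich = [30, 40, 50, 60]
poor = [10, 20, 35, 45]
print(stratification_index(rich, poor))
```

With no overlap at all the index is 1, and the Yitzhaki and conventional between-group measures coincide; the more the distributions interleave, the smaller the ratio.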
Abstract:
As part of a project to use the long-lived (T½ = 1200 a) ¹⁶⁶ᵐHo as a reference source in its reference ionisation chamber, IRA standardised a commercially acquired solution of this nuclide using the 4πβ-γ coincidence and 4πγ (NaI) methods. The ¹⁶⁶ᵐHo solution supplied by Isotope Products Laboratories was measured to contain about 5% europium impurities (3% ¹⁵⁴Eu, 0.94% ¹⁵²Eu and 0.9% ¹⁵⁵Eu). Holmium therefore had to be separated from europium, and this was carried out by means of ion-exchange chromatography. The holmium fractions were collected without europium contamination: 162-h HPGe gamma measurements indicated no europium impurity (detection limits of 0.01% for ¹⁵²Eu and ¹⁵⁴Eu, and 0.03% for ¹⁵⁵Eu). The primary measurement of the purified ¹⁶⁶ᵐHo solution with the 4π (PC) β-γ coincidence technique was carried out at three gamma-energy settings: a window around the 184.4 keV peak and gamma thresholds at 121.8 and 637.3 keV. The results show very good self-consistency, and the activity concentration of the solution was evaluated to be 45.640 ± 0.098 kBq/g (0.21% with k = 1). The activity concentration of this solution was also measured by integral counting with a well-type 5″ × 5″ NaI(Tl) detector and efficiencies computed by Monte Carlo simulations using the GEANT code. These measurements were mutually consistent, and the resulting weighted average of the 4π NaI(Tl) method was found to agree within 0.15% with the result of the 4πβ-γ coincidence technique. An ampoule of this solution and the measured value of the concentration were submitted to the BIPM as a contribution to the Système International de Référence.
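A weighted average of repeated activity measurements is, in standard metrological practice, an inverse-variance weighted mean; a minimal sketch with hypothetical activity values (not the paper's raw data):

```python
def weighted_mean(values, uncertainties):
    """Inverse-variance weighted average of measurements, returning the
    combined value and its standard uncertainty."""
    weights = [1 / u ** 2 for u in uncertainties]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = (1 / sum(weights)) ** 0.5
    return mean, sigma

# Illustrative activity-concentration results in kBq/g (invented numbers
# in the vicinity of the reported 45.640 kBq/g, with invented uncertainties)
vals = [45.62, 45.66, 45.64]
uncs = [0.10, 0.12, 0.11]
m, s = weighted_mean(vals, uncs)
print(round(m, 3), round(s, 3))
```

The combined uncertainty is smaller than any single measurement's, which is why the weighted average is the natural way to pool the integral-counting results.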
Abstract:
This study examined the effects of intermittent hypoxic training (IHT) on skeletal muscle monocarboxylate lactate transporter (MCT) expression and anaerobic performance in trained athletes. Cyclists were assigned to one of two interventions, either normoxic (N; n = 8; 150 mmHg PIO2) or hypoxic (H; n = 10; ∼3000 m, 100 mmHg PIO2), over a three-week training period (5 sessions of 1 h to 1 h 30 min per week). Prior to and after training, an incremental exercise test to exhaustion (EXT) was performed in normoxia together with a 2-min time trial (TT). Biopsy samples from the vastus lateralis were analyzed for MCT1 and MCT4 using immunoblotting techniques. The peak power output (PPO) increased (p<0.05) after training (7.2% and 6.6% for N and H, respectively), but VO2max showed no significant change. The average power output in the TT improved significantly (7.3% and 6.4% for N and H, respectively). No differences were found in MCT1 or MCT4 protein content before and after training in either the N or the H group. These results indicate that there are no additional benefits of IHT compared with similar normoxic training: adding a hypoxic stimulus to a three-week training period is ineffective for improving anaerobic performance or MCT expression.
Abstract:
Linking the structural connectivity of brain circuits to their cooperative dynamics and emergent functions is a central aim of neuroscience research. Graph theory has recently been applied to study the structure-function relationship of networks, where dynamical similarity of different nodes has been turned into a "static" functional connection. However, the capability of the brain to adapt, learn and process external stimuli requires a constant dynamical functional rewiring between circuitries and cell assemblies. Hence, we must capture the changes of network functional connectivity over time. Multi-electrode array data present a unique challenge within this framework. We study the dynamics of gamma oscillations in acute slices of the somatosensory cortex from juvenile mice recorded by planar multi-electrode arrays. Bursts of gamma oscillatory activity lasting a few hundred milliseconds could be initiated only by brief trains of electrical stimulations applied at the deepest cortical layers and simultaneously delivered at multiple locations. Local field potentials were used to study the spatio-temporal properties and the instantaneous synchronization profile of the gamma oscillatory activity, combined with current source density (CSD) analysis. Pair-wise differences in the oscillation phase were used to determine the presence of instantaneous synchronization between the different sites of the circuitry during the oscillatory period. Despite variation in the duration of the oscillatory response over successive trials, they showed a constant average power, suggesting that the rate of expenditure of energy during the gamma bursts is consistent across repeated stimulations. Within each gamma burst, the functional connectivity map reflected the columnar organization of the neocortex. Over successive trials, an apparently random rearrangement of the functional connectivity was observed, with a more stable columnar than horizontal organization. 
This work reveals new features of evoked gamma oscillations in developing cortex.
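Pairwise phase differences of the kind used above are commonly summarized by a phase-locking value (the magnitude of the average unit phasor of the phase difference); this is a generic sketch with synthetic 40 Hz (gamma-band) phase ramps, not the recorded local field potentials or the paper's exact synchronization statistic:

```python
import cmath
import math

def phase_locking_value(phases_a, phases_b):
    """|mean of exp(i * (phi_a - phi_b))| over time: 1 means a constant
    phase difference (instantaneous synchronization), near 0 means none."""
    diffs = [cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b)]
    return abs(sum(diffs) / len(diffs))

# Two electrode sites oscillating at 40 Hz with a fixed phase lag of 0.5 rad
t = [k / 1000 for k in range(200)]                  # 200 ms sampled at 1 kHz
site1 = [2 * math.pi * 40 * x for x in t]
site2 = [2 * math.pi * 40 * x + 0.5 for x in t]
print(round(phase_locking_value(site1, site2), 6))  # fixed lag -> fully locked
```

A constant lag gives a value of 1 regardless of the lag's size, which is why phase differences, rather than raw signals, are the natural basis for the functional connectivity maps described above.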
Abstract:
While intermittent hypoxic training (IHT) has been reported to evoke cellular responses via hypoxia-inducible factors (HIFs) but without substantial performance benefits in endurance athletes, we hypothesized that repeated sprint training in hypoxia could enhance repeated sprint ability (RSA) performed in normoxia via improved glycolysis and O2 utilization. 40 trained subjects completed 8 cycling repeated-sprint sessions in hypoxia (RSH, 3000 m) or normoxia (RSN, 485 m). Before (Pre-) and after (Post-) training, muscular levels of selected mRNAs were analyzed from resting muscle biopsies, and RSA was tested until exhaustion (10-s sprints, work-to-rest ratio 1:2) with muscle perfusion assessed by near-infrared spectroscopy. From Pre- to Post-, the average power output of all sprints in the RSA test was increased (p<0.01) to the same extent (6% vs 7%, NS) in RSH and in RSN, but the number of sprints to exhaustion was increased in RSH (9.4 ± 4.8 vs. 13.0 ± 6.2 sprints, p<0.01) and not in RSN (9.3 ± 4.2 vs. 8.9 ± 3.5). mRNA concentrations of HIF-1α (+55%), carbonic anhydrase III (+35%) and monocarboxylate transporter-4 (+20%) were augmented (p<0.05), whereas mitochondrial transcription factor A (-40%), peroxisome proliferator-activated receptor gamma coactivator 1α (-23%) and monocarboxylate transporter-1 (-36%) were decreased (p<0.01), in RSH only. In addition, the changes in total hemoglobin variations (Δ[tHb]) during sprints throughout the RSA test increased to a greater extent (p<0.01) in RSH. Our findings show a larger improvement in repeated sprint performance in RSH than in RSN, with significant molecular adaptations and larger blood-perfusion variations in active muscles.
Abstract:
The simultaneous recording of scalp electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) can provide unique insights into the dynamics of human brain function, and the increased functional sensitivity offered by ultra-high-field fMRI opens exciting perspectives for the future of this multimodal approach. However, simultaneous recordings are susceptible to various types of artifacts, many of which scale with magnetic field strength and can seriously compromise both EEG and fMRI data quality in recordings above 3T. The aim of the present study was to implement and characterize an optimized setup for simultaneous EEG-fMRI in humans at 7T. The effects of EEG cable length and geometry for signal transmission between the cap and amplifiers were assessed in a phantom model, with specific attention to noise contributions from the MR scanner cold heads. Cable shortening (down to 12 cm from cap to amplifiers) and bundling effectively reduced environment noise by up to 84% in average power and 91% in inter-channel power variability. Subject safety was assessed and confirmed via numerical simulations of RF power distribution and temperature measurements on a phantom model, building on the limited existing literature at ultra-high field. MRI data degradation effects due to the EEG system were characterized via B0 and B1+ field mapping on a human volunteer, demonstrating important, although not prohibitive, B1 disruption effects. With the optimized setup, simultaneous EEG-fMRI acquisitions were performed on 5 healthy volunteers undergoing two visual paradigms: an eyes-open/eyes-closed task, and a visual evoked potential (VEP) paradigm using reversing-checkerboard stimulation. EEG data exhibited clear occipital alpha modulation and average VEPs, respectively, with concomitant BOLD signal changes.
On a single-trial level, alpha power variations could be observed with relative confidence on all trials; VEP detection was more limited, although statistically significant responses could be detected in more than 50% of trials for every subject. Overall, we conclude that the proposed setup is well suited for simultaneous EEG-fMRI at 7T.
Abstract:
Seasonal trends in litterfall and potential mineral return were studied in two cork-oak forest sites in the northeastern Iberian Peninsula. The estimated average litter production was 3.9 Mg ha⁻¹ yr⁻¹ for one site and 4.6 Mg ha⁻¹ yr⁻¹ for the other; these figures are similar to those reported for holm-oak (Quercus ilex) forests in the same area. Seasonal litterfall patterns were typical of Mediterranean forest ecosystems. Leaves accounted for 46 to 78% of the total dry matter. Their annual weighted-average mineral composition was low in macronutrients (N 8-9, K 4-5, Mg 0.8-1.3, Ca 9-10 and P 0.4-1 mg g⁻¹) and relatively high in micronutrients such as Mn (2-2.2 mg g⁻¹) or Fe (0.3-0.4 mg g⁻¹). Minimum N and P concentrations were found during the growth period. Estimates of potential mineral return for an annual cycle were N 38-52, P 2.1-5.2, K 20-28, Ca 44-53 and Mg 5.4-5.0 kg ha⁻¹, depending on site biomass and fertility.
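An annual weighted-average composition of this kind weights each collection period's nutrient concentration by that period's litterfall mass; a sketch with hypothetical monthly values (the masses and concentrations below are invented):

```python
def litterfall_weighted_mean(monthly):
    """Annual weighted-average nutrient concentration: each period's
    concentration weighted by its litterfall dry mass.
    monthly: (litterfall_mass, nutrient_concentration) pairs."""
    total_mass = sum(m for m, _ in monthly)
    return sum(m * c for m, c in monthly) / total_mass

# Hypothetical periods: (dry mass in g m^-2, N concentration in mg g^-1)
months = [(60.0, 9.0), (20.0, 8.0), (120.0, 8.5)]
print(litterfall_weighted_mean(months))
```

Weighting by mass rather than averaging the monthly concentrations directly keeps heavy-fall months (e.g. the spring leaf drop typical of evergreen oaks) from being underrepresented in the annual figure.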
Abstract:
The purpose of this study was to examine the relationship between skeletal muscle monocarboxylate transporter 1 and 4 (MCT1 and MCT4) expression, skeletal muscle oxidative capacity and endurance performance in trained cyclists. Ten well-trained cyclists (mean ± SD; age 24.4 ± 2.8 years, body mass 73.2 ± 8.3 kg, VO2max 58 ± 7 ml·kg⁻¹·min⁻¹) completed three endurance performance tasks [incremental exercise test to exhaustion, 2- and 10-min time trials (TT)]. In addition, a muscle biopsy sample from the vastus lateralis muscle was analysed for MCT1 and MCT4 expression levels together with the activity of citrate synthase (CS) and 3-hydroxyacyl-CoA dehydrogenase (HAD). There was a tendency for VO2max and peak power output obtained in the incremental exercise test to be correlated with MCT1 (r = -0.71 to -0.74; P < 0.06), but not MCT4. The average power output (Paverage) in the 2-min TT was significantly correlated with MCT4 (r = -0.74; P < 0.05) and HAD (r = -0.92; P < 0.01). Paverage in the 10-min TT was correlated only with CS activity (r = 0.68; P < 0.05). These results indicate that the relationship between MCT1 and MCT4 expression and cycle TT performance may be influenced by the length and intensity of the task.
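The r values reported here are ordinary Pearson coefficients; a self-contained sketch with hypothetical expression and power values, chosen only to produce a negative association like the ones above (not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical MCT4 expression (arbitrary units) vs 2-min TT power (W)
mct4 = [1.8, 2.4, 1.2, 2.9, 2.1, 1.5]
power = [410, 365, 452, 340, 388, 430]
print(round(pearson_r(mct4, power), 2))
```

A negative r here, as in the abstract, means cyclists with higher transporter expression tended to produce lower short-duration power in this made-up sample; correlation alone says nothing about direction of causation.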
Abstract:
PURPOSE: Repeated-sprint training in hypoxia (RSH) was recently shown to improve repeated-sprint ability (RSA) in cycling. This phenomenon is likely to reflect fiber-type-dependent, compensatory vasodilation, and therefore our hypothesis was that RSH is even more beneficial for activities involving upper-body muscles, such as double poling during cross-country skiing. METHODS: In a double-blinded fashion, 17 competitive cross-country skiers performed six sessions of repeated sprints (each consisting of four sets of five 10-s sprints, with 20-s intervals of recovery) either in normoxia (RSN, 300 m; FiO2, 20.9%; n = 8) or normobaric hypoxia (RSH, 3000 m; FiO2, 13.8%; n = 9). Before (pre) and after (post) training, performance was evaluated with an RSA test (10-s all-out sprints with 20-s recovery, until peak power output declined by 30%) and a simulated team sprint (3 × 3-min all-out with 3-min rest) on a double-poling ergometer. Triceps brachii oxygenation was measured by near-infrared spectroscopy. RESULTS: From pretraining to posttraining, peak power output in the RSA test was increased (P < 0.01) to the same extent (29% ± 13% vs 26% ± 18%, nonsignificant) in RSH and in RSN, whereas the number of sprints performed was enhanced in RSH (10.9 ± 5.2 vs 17.1 ± 6.8, P < 0.01) but not in RSN (11.6 ± 5.3 vs 11.7 ± 4.3, nonsignificant). In addition, the amplitude of total hemoglobin variations during sprints throughout the RSA test rose more in RSH (P < 0.01). Similarly, the average power output during all team sprints improved by 11% ± 9% in RSH and 15% ± 7% in RSN. CONCLUSIONS: Our findings reveal greater improvement in the performance of repeated double-poling sprints, together with larger variations in the perfusion of upper-body muscles, in RSH compared with RSN.
Abstract:
General Introduction This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite wide-ranging preferential access granted to developing countries by industrial ones under North-South trade agreements, whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA, it has been claimed that the benefits from improved market access keep falling short of their full potential. RoOs are widely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special-interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent. Part I In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Céline Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of the restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows.
First, it shows, in the case of PANEURO, that the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, it is shown that the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at that level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs. The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also help move regionalism toward more openness and hence make it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs by a single instrument: Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Changing all instruments into an MFC would therefore bring improved transparency, much like the "tariffication" of NTBs. The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs.
In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system, and explore the possibility of setting this unique instrument at a uniform rate across lines. This would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier. Only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of good value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests. The third chapter of the thesis considers whether the Europe Agreements of the EU, with the current sets of RoOs, could be the model for future EU-centered PTAs. First, I studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs (used before 1997) and the "single list" RoOs (used since 1997). Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly allocate sales between the EU and the rest of the world by comparing producer prices on each market, I estimated the trade effects of the EU RoOs.
The estimates suggest that much of the market access conferred by the EAs, outside sensitive sectors, was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession. Part II The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument that has the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA) under which anti-dumping measures should be reviewed no later than five years after their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis. First, using Poisson and negative binomial regressions, the count of AD-measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases.
Second, the survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
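Step 3 of the MFC exercise in the second chapter, the trade-weighted average across tariff lines, can be sketched as follows; the tariff lines below are hypothetical, chosen only to illustrate the mechanics:

```python
def trade_weighted_mfc(lines):
    """Trade-weighted average of the simulated Maximum Foreign Content
    across tariff lines.
    lines: (import_value, simulated_mfc_percent) per tariff line."""
    total_trade = sum(v for v, _ in lines)
    return sum(v * mfc for v, mfc in lines) / total_trade

# Hypothetical tariff lines: (import value, simulated MFC in % of good value)
tariff_lines = [(500.0, 20.0), (300.0, 30.0), (200.0, 30.0)]
print(trade_weighted_mfc(tariff_lines))  # → 25.0
```

Lines with larger trade flows pull the average toward their simulated MFC, so the overall equivalent summarizes the system as exporters actually experience it rather than line by line.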