967 results for transparency thresholds


Relevance:

10.00%

Publisher:

Abstract:

General Introduction

This thesis can be divided into two main parts: the first, corresponding to the first three chapters, studies Rules of Origin (RoOs) in Preferential Trade Agreements (PTAs); the second part, the fourth chapter, is concerned with Anti-Dumping (AD) measures. Despite the wide-ranging preferential access granted to developing countries by industrial ones under North-South trade agreements - whether reciprocal, like the Europe Agreements (EAs) or NAFTA, or not, such as the GSP, AGOA, or EBA - it has been claimed that the benefits from improved market access keep falling short of their full potential. RoOs are widely regarded as a primary cause of the under-utilization of the improved market access offered by PTAs. RoOs are the rules that determine the eligibility of goods for preferential treatment. Their economic justification is to prevent trade deflection, i.e. to prevent non-preferred exporters from using the tariff preferences. However, they are complex, cost-raising and cumbersome, and can be manipulated by organised special-interest groups. As a result, RoOs can restrain trade beyond what is needed to prevent trade deflection and hence restrict market access to a statistically significant and quantitatively large extent.

Part I

In order to further our understanding of the effects of RoOs in PTAs, the first chapter, written with Prof. Olivier Cadot, Céline Carrère and Prof. Jaime de Melo, describes and evaluates the RoOs governing EU and US PTAs. It draws on utilization-rate data for Mexican exports to the US in 2001 and on similar data for ACP exports to the EU in 2002. The paper makes two contributions. First, we construct an R-index of the restrictiveness of RoOs along the lines first proposed by Estevadeordal (2000) for NAFTA, modifying and extending it for the EU's single list (SL). This synthetic R-index is then used to compare RoOs under NAFTA and PANEURO. The two main findings of the chapter are as follows. First, in the case of PANEURO, the R-index is useful for summarizing how countries are differently affected by the same set of RoOs because of their different export baskets to the EU. Second, the R-index is a relatively reliable statistic in the sense that, subject to caveats, after controlling for the extent of tariff preference at the tariff-line level, it accounts for differences in utilization rates at the tariff-line level. Finally, together with utilization rates, the index can be used to estimate the total compliance costs of RoOs.

The second chapter proposes a reform of preferential RoOs with the aim of making them more transparent and less discriminatory. Such a reform would make preferential blocs more "cross-compatible" and would therefore facilitate cumulation. It would also help move regionalism toward more openness and hence make it more compatible with the multilateral trading system. It focuses on NAFTA, one of the most restrictive FTAs (see Estevadeordal and Suominen 2006), and proposes a way forward that is close in spirit to what the EU Commission is considering for the PANEURO system. In a nutshell, the idea is to replace the current array of RoOs with a single instrument: a Maximum Foreign Content (MFC). An MFC is a conceptually clear and transparent instrument, like a tariff. Converting all instruments into an MFC would therefore bring improved transparency, much like the "tariffication" of NTBs.
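Before turning to the methodology, here is a minimal sketch of the inversion at the heart of this exercise, assuming (rather than estimating) a logit relationship between utilization, tariff preference and foreign content; all coefficients and data below are illustrative stand-ins, not the chapter's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
pref = rng.uniform(0.0, 0.15, n)     # tariff preference margin per line
mfc = rng.uniform(0.10, 0.60, n)     # hypothetical per-line MFC (share of good value)
trade = rng.lognormal(0.0, 1.0, n)   # import values used as weights

# Step 1 (assumed here, estimated in the chapter): utilization is logit in
# the preference margin and the MFC.
b0, b_pref, b_mfc = -2.0, 25.0, 4.0
util = 1.0 / (1.0 + np.exp(-(b0 + b_pref * pref + b_mfc * mfc)))

# Step 2: invert the fitted relationship to back out, line by line, the MFC
# that reproduces each observed utilization rate.
mfc_sim = (np.log(util / (1.0 - util)) - b0 - b_pref * pref) / b_mfc

# Step 3: a trade-weighted average yields one summary MFC for the system,
# a candidate uniform rate across tariff lines.
print(f"simulated uniform MFC: {np.average(mfc_sim, weights=trade):.1%} of good value")
```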
The methodology for this exercise is as follows. In step 1, I estimate the relationship between utilization rates, tariff preferences and RoOs. In step 2, I retrieve the estimates and invert the relationship to get a simulated MFC that gives, line by line, the same utilization rate as the old array of RoOs. In step 3, I calculate the trade-weighted average of the simulated MFC across all lines to get an overall equivalent of the current system, and explore the possibility of setting this unique instrument at a uniform rate across lines. Uniformity would have two advantages. First, like a uniform tariff, a uniform MFC would make it difficult for lobbies to manipulate the instrument at the margin. This argument is standard in the political-economy literature and has been used time and again in support of reductions in the variance of tariffs (together with standard welfare considerations). Second, uniformity across lines is the only way to eliminate the indirect source of discrimination alluded to earlier: only if two countries face uniform RoOs and tariff preferences will they face uniform incentives irrespective of their initial export structure. The result of this exercise is striking: the average simulated MFC is 25% of good value, a very low (i.e. restrictive) level, confirming Estevadeordal and Suominen's critical assessment of NAFTA's RoOs. Adopting a uniform MFC would imply a relaxation from the benchmark level for sectors like chemicals or textiles & apparel, and a stiffening for wood products, paper and base metals. Overall, however, the changes are not drastic, suggesting perhaps only moderate resistance to change from special interests.

The third chapter of the thesis considers whether the Europe Agreements of the EU, with their current sets of RoOs, could be the model for future EU-centered PTAs. First, I studied and coded, at the six-digit level of the Harmonised System (HS), both the old RoOs (used before 1997) and the "single list" RoOs (used since 1997). Second, using a Constant Elasticity of Transformation function in which CEEC exporters smoothly mix sales between the EU and the rest of the world by comparing producer prices on each market, I estimated the trade effects of the EU RoOs. The estimates suggest that much of the market access conferred by the EAs - outside sensitive sectors - was undone by the cost-raising effects of RoOs. The chapter also contains an analysis of the evolution of the CEECs' trade with the EU from post-communism to accession.

Part II

The last chapter of the thesis is concerned with anti-dumping, another trade-policy instrument having the effect of reducing market access. In 1995, the Uruguay Round introduced into the Anti-Dumping Agreement (ADA) a mandatory "sunset review" clause (Article 11.3 ADA), under which anti-dumping measures should be reviewed no later than five years after their imposition and terminated unless there is a serious risk of resumption of injurious dumping. The last chapter, written with Prof. Olivier Cadot and Prof. Jaime de Melo, uses a new database on Anti-Dumping (AD) measures worldwide to assess whether the sunset-review agreement had any effect. The question we address is whether the WTO Agreement succeeded in imposing the discipline of a five-year cycle on AD measures and, ultimately, in curbing their length. Two methods are used: count-data analysis and survival analysis.
First, using Poisson and negative binomial regressions, the count of AD-measure revocations is regressed on (inter alia) the count of initiations lagged five years. The analysis yields a coefficient on initiations lagged five years that is larger and more precisely estimated after the agreement than before, suggesting some effect. However, the coefficient estimate is nowhere near the value that would give a one-for-one relationship between initiations and revocations after five years. We also find that (i) if the agreement affected EU AD practices, the effect went the wrong way, the five-year cycle being quantitatively weaker after the agreement than before; and (ii) the agreement had no visible effect on the United States except for a one-time peak in 2000, suggesting a mopping-up of old cases. Second, survival analysis of AD measures around the world suggests a shortening of their expected lifetime after the agreement, and this shortening effect (a downward shift in the survival function post-agreement) was larger and more significant for measures targeted at WTO members than for those targeted at non-members (for which WTO disciplines do not bind), suggesting that compliance was de jure. A difference-in-differences Cox regression confirms this diagnosis: controlling for the countries imposing the measures, for the investigated countries and for the products' sector, we find a larger increase in the hazard rate of AD measures covered by the Agreement than for other measures.
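A minimal sketch of such a difference-in-differences Cox regression on synthetic spell data, using the lifelines package; the variable names, effect sizes and censoring horizon are illustrative assumptions, not the chapter's actual specification:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
post = rng.integers(0, 2, n)                # measure imposed after the 1995 ADA
wto = rng.integers(0, 2, n)                 # target country is a WTO member
base = rng.exponential(7.0, n)              # baseline spell length in years
# Build in shorter spells post-agreement, more so for WTO-member targets.
dur = base * np.exp(-0.2 * post - 0.3 * post * wto)
event = (dur < 12).astype(int)              # right-censor spells at 12 years
dur = np.minimum(dur, 12.0)

df = pd.DataFrame({"duration": dur, "event": event, "post": post,
                   "wto": wto, "post_x_wto": post * wto})

# The coefficient on post_x_wto is the difference-in-differences term: did
# the revocation hazard rise more, post-agreement, for measures that WTO
# disciplines actually cover?
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()
```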

Relevance:

10.00%

Publisher:

Abstract:

Osteoporosis is well recognized as a public health problem in industrialized countries. Because new treatments efficiently decrease fracture risk, it is of major interest to identify the patients who would benefit from them. A diagnosis of osteoporosis is necessary before starting a specific treatment. This diagnosis is based on measurement of the skeleton (hip and spine) by dual X-ray absorptiometry, using the diagnostic criteria established by the World Health Organisation (WHO). In Switzerland, the indications for bone densitometry are limited to precise clinical situations, so the technique cannot be used for screening. For that purpose, peripheral measurements, and particularly quantitative ultrasound of bone, seem promising: several prospective studies have clearly shown their power to predict hip fracture risk in women aged over 65 years. To facilitate the clinical use of bone ultrasound, thresholds for hip fracture risk and for osteoporosis will shortly be published. This will integrate bone ultrasound into a global concept that includes bone densitometry and its indications, as well as the other risk factors for osteoporosis recognized by the Swiss Association against Osteoporosis (ASCO).
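For reference, the WHO densitometric criteria mentioned above are defined on the T-score, the number of standard deviations by which a patient's bone mineral density falls below the young-adult mean; a minimal encoding of the standard cut-offs:

```python
def who_bmd_category(t_score: float) -> str:
    """Classify a DXA T-score using the standard WHO cut-offs."""
    if t_score <= -2.5:
        return "osteoporosis"
    if t_score < -1.0:
        return "osteopenia (low bone mass)"
    return "normal"

print(who_bmd_category(-2.7))  # osteoporosis
```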

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: To compare the effect of a rat anti-VEGF antibody, administered either by topical or subconjunctival (SC) routes, in a rat model of corneal transplant rejection.

METHODS: Twenty-four rats underwent corneal transplantation and were randomized into four treatment groups (n=6 in each group). G1 and G2 received six SC injections (0.02 ml, 10 µg/ml) of denatured (G1) or active (G2) anti-VEGF from Day 0 to Day 21, every third day. G3 and G4 were instilled three times a day with denatured (G3) or active (G4) anti-VEGF drops (10 µg/ml) from Day 0 to Day 21. Corneal mean clinical scores (MCSs) of edema (E), transparency (T), and neovessels (nv) were recorded at Days 3, 9, 15, and 21. Neovessels were quantified after lectin staining of vessels on flat-mounted corneas.

RESULTS: Twenty-one days after surgery, MCSs differed significantly between G1 and G2, but not between G3 and G4, and the rejection rate was significantly reduced in rats receiving active antibodies regardless of the route of administration (G2=50%, G4=66.65% versus G1 and G3=100%; p<0.05). The mean surface of neovessels was significantly reduced in the groups treated with active anti-VEGF (G2, G4). However, anti-VEGF therapy did not completely suppress corneal neovessels.

CONCLUSIONS: Specific rat anti-VEGF antibodies significantly reduced neovascularization and subsequent corneal graft rejection. SC administration of the anti-VEGF antibody was more effective than topical instillation.

Relevance:

10.00%

Publisher:

Abstract:

The association between adiposity measures and dyslipidemia has seldom been assessed in a multi-population setting. We used health surveys conducted between 1990 and 1997 in 27 populations from Europe, Australia, New Zealand and Canada (WHO MONICA project), covering adults aged 35-64 years (n = 40,480). Dyslipidemia was defined as a total/HDL cholesterol ratio >6 in men and >5 in women. The overall prevalence of dyslipidemia was 25% in men and 23% in women. Logistic regression showed that dyslipidemia was strongly associated with body mass index (BMI) in men and with waist circumference (WC) in women, after adjusting for region, age and smoking. Among normal-weight men and women (BMI < 25 kg/m2), an increase in the odds of being dyslipidemic was observed between the lowest and highest WC quartiles (OR = 3.6, p < 0.001). Among obese men (BMI ≥ 30), the corresponding increase was smaller (OR = 1.2, p = 0.036); a similar weakening was observed among women. Classification tree analysis was performed to assign subjects to classes of risk for dyslipidemia. BMI thresholds (25.4 and 29.2 kg/m2) in men and WC thresholds (81.7 and 92.6 cm) in women came out at the first stages. High WC (>84.8 cm) in normal-weight men, menopause in women and regular smoking further defined subgroups at increased risk. In conclusion, standard categories of BMI and WC, or their combinations, do not lead to optimal risk stratification for dyslipidemia in middle-aged adults. Sex-specific adaptations are necessary, in particular taking into account abdominal obesity in normal-weight men, post-menopausal age in women and regular smoking in both sexes.
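The study's definition and the first splits of its classification tree translate directly into simple rules; a minimal sketch for men, where the function names and group labels are ours and the thresholds are those quoted above:

```python
def dyslipidemic(total_chol: float, hdl: float, male: bool) -> bool:
    """Study definition: total/HDL cholesterol ratio > 6 (men), > 5 (women)."""
    return total_chol / hdl > (6.0 if male else 5.0)

def risk_group_men(bmi: float, wc_cm: float) -> str:
    """First splits of the men's tree: BMI at 25.4 and 29.2 kg/m2, with high
    waist circumference (> 84.8 cm) flagging normal-weight men at risk."""
    if bmi < 25.4:
        return "higher risk" if wc_cm > 84.8 else "lower risk"
    return "intermediate risk" if bmi < 29.2 else "higher risk"
```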

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: To investigate the effect of intraocular straylight (IOS) induced by white opacity filters (WOF) on threshold measurements for the stimuli employed in three perimeters: standard automated perimetry (SAP), pulsar perimetry (PP) and the Moorfields motion displacement test (MDT).

METHODS: Four healthy young observers (24-28 years old) were tested six times with each perimeter, each time with one of five different WOFs and once without, inducing various levels of IOS (from 10% to 200%). The increase in IOS was measured with a straylight meter. The change in sensitivity from baseline was normalized, allowing comparison of standardized (z) scores (change divided by the SD of normative values) for each instrument.

RESULTS: SAP and PP thresholds were significantly affected (P < 0.001) by moderate to large increases in IOS (50%-200%). The drop from baseline with WOF 5 was approximately 5 dB in both SAP and PP, which represents a clinically significant loss; in contrast, the change with the MDT was on average 1 minute of arc, which is unlikely to indicate a clinically significant loss.

CONCLUSIONS: The Moorfields MDT is more robust to the effects of additional straylight than SAP or PP.
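The normalization described in the methods is a plain standardized score; a one-function sketch (the function name is ours):

```python
def z_change(baseline_sensitivity: float, filtered_sensitivity: float,
             sd_normative: float) -> float:
    """Change from baseline divided by the SD of the instrument's normative
    values, making SAP (dB), PP (dB) and MDT (min arc) scores comparable."""
    return (baseline_sensitivity - filtered_sensitivity) / sd_normative

print(z_change(30.0, 25.0, 2.0))  # a 5 dB drop equals 2.5 normative SDs
```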

Relevance:

10.00%

Publisher:

Abstract:

Cases of fraud have occurred frequently in the world market, and many kinds of professionals have been involved in them, including accountants. Accounting scandals, especially the most famous ones such as those at Enron and WorldCom, have heightened concern about the ethical conduct of accounting professionals. As a consequence, there is greater demand for transparency and reliability in the information these professionals provide. This concern aims, above all, to preserve the confidence of companies, investors, suppliers and society at large in the ethical responsibility of accountants, which has been tarnished by their involvement in detected frauds. The present study therefore set out to examine the ethical conduct of accountants when, in the exercise of their profession, they are confronted with issues related to fraud. To that end, it considered factors that may influence an individual's ethical decision-making, as captured by the decision-making model developed by Alves, and factors that may motivate an individual to commit fraud, as captured by the model developed by Cressey. To answer the research question, descriptive and statistical analyses of the data were carried out: frequency tables for the descriptive analysis, and Spearman's non-parametric test for the statistical analysis. The results show that most of the accountants in the sample recognize the moral issue embedded in the scenarios, disagree with the acts of the agents in each scenario, and classify those acts as serious or very serious. The research also revealed that these professionals lean toward the teleological current, since the intention to act is influenced by factors such as opportunity, rationalization and, above all, pressure. Some individual factors also influence the ethical stance of the accountants interviewed in this research.

Relevance:

10.00%

Publisher:

Abstract:

The transition to a single mode of income taxation constituted a profound modification of the incidence base and of the rules for determining the taxable base of income taxes, expressing a new taxpayer-tax authority relationship based on greater transparency and simpler procedures on the part of the Tax Administration, but also on greater accountability of taxpayers for their behaviour and declarations. The purpose of this work was to study how household incomes are taxed, namely the incidence of taxation on individual taxpayers, covering the taxation period, the methods used to determine the taxable base and the rates applied. To that end, theoretical and practical studies on personal income taxation were gathered, in particular the principles and rules in practice. A case study was carried out on the tax assessment of married taxpayers with two earners, based on model 6A forms from the Repartição de Finanças de São Vicente; its object was to separate the incomes of these taxpayers and compute the corresponding tax separately. The results of the study point to a quite satisfactory advantage for taxpayers when the tax is assessed separately.

Relevance:

10.00%

Publisher:

Abstract:

1 Insect pests, biological invasions and climate change are considered to represent major threats to biodiversity, ecosystem functioning, agriculture and forestry. Deriving hypotheses of contemporary and/or future potential distributions of insect pests and invasive species is becoming an important tool for predicting the spatial structure of potential threats.
2 The western corn rootworm (WCR) Diabrotica virgifera virgifera LeConte is a pest of maize in North America that has invaded Europe in recent years, resulting in economic costs in terms of maize yields on both continents. The present study aimed to estimate the dynamics of potential areas of invasion by the WCR under a climate change scenario in the Northern Hemisphere. The areas at risk under this scenario were assessed by comparing, using complementary approaches, the spatial projections of current and future areas of climatic favourability for the WCR. Spatial hypotheses were generated with respect to the presence records in the native range of the WCR and physiological thresholds from previous empirical studies.
3 We used a previously developed protocol specifically designed to estimate the climatic favourability of the WCR. We selected the most biologically relevant climatic predictors and then used multidimensional envelope (MDE) and Mahalanobis distance (MD) approaches to derive potential distributions under current and future climatic conditions.
4 The results showed a northward advancement of the upper physiological limit as a result of climate change, which might increase the strength of outbreaks at higher latitudes. In addition, both the MDE and MD outputs predict stable climatic favourability for the WCR in the core of the already invaded area in Europe, which suggests that this zone would continue to experience damage from this pest.
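A minimal sketch of the two favourability approaches on synthetic data; the climate matrix and predictor count are illustrative, and a real application would fit these to presence records and gridded climate layers:

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

# Rows: presence records in the native range; columns: selected predictors.
presence = np.random.default_rng(2).normal(size=(200, 4))
mu = presence.mean(axis=0)
vi = np.linalg.inv(np.cov(presence, rowvar=False))

def md_favourability(cell: np.ndarray) -> float:
    """Mahalanobis distance to the centroid of the native-range climate
    cloud; smaller = more favourable, and a distance threshold delimits
    the potential distribution."""
    return mahalanobis(cell, mu, vi)

# Multidimensional envelope (MDE): a cell is favourable only if every
# predictor lies within the range observed at presence records.
lo, hi = presence.min(axis=0), presence.max(axis=0)
def in_envelope(cell: np.ndarray) -> bool:
    return bool(np.all((cell >= lo) & (cell <= hi)))
```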

Relevance:

10.00%

Publisher:

Abstract:

The integrity of the cornea, the most anterior part of the eye, is indispensable for vision. Forty-five million individuals worldwide are bilaterally blind and another 135 million have severely impaired vision in both eyes because of loss of corneal transparency; treatments range from local medications to corneal transplants, and more recently to stem cell therapy. The corneal epithelium is a squamous epithelium that is constantly renewing, with a vertical turnover of 7 to 14 days in many mammals. Identification of slow-cycling cells (label-retaining cells) in the limbus of the mouse has led to the notion that the limbus is the niche for the stem cells responsible for the long-term renewal of the cornea; hence, the corneal epithelium is supposedly renewed by cells generated at and migrating from the limbus, in marked contrast to other squamous epithelia, in which each resident stem cell maintains a limited area of epithelium. Here we show that the corneal epithelium of the mouse can be serially transplanted, is self-maintained and contains oligopotent stem cells with the capacity to generate goblet cells if provided with a conjunctival environment. Furthermore, the entire ocular surface of the pig, including the cornea, contains oligopotent stem cells (holoclones) with the capacity to generate individual colonies of corneal and conjunctival cells. Therefore, the limbus is not the only niche for corneal stem cells, and corneal renewal is not different from that of other squamous epithelia. We propose a model that unifies our observations with the literature and explains why the limbal region is enriched in stem cells.

Relevance:

10.00%

Publisher:

Abstract:

A population register is an inventory of residents within a country, with their characteristics (date of birth, sex, marital status, etc.) and other socio-economic data, such as occupation or education. However, data on population are also stored in numerous other public registers such as tax, land, building and housing, military, foreigners, vehicles, etc. Altogether they contain vast amounts of personal and sensitive information. Access to public information is granted by law in many countries, but this transparency is generally subject to tensions with data protection laws. This paper proposes a framework to analyze data access (or protection) requirements, as well as a model of metadata for data exchange.
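The paper proposes a metadata model for data exchange; as a purely hypothetical illustration of what such a record might carry (all field names are ours, not the paper's):

```python
from dataclasses import dataclass, field

@dataclass
class RegisterExtractMetadata:
    """Hypothetical metadata accompanying an extract from a public register."""
    register: str                       # e.g. "population", "tax", "vehicles"
    attributes: list[str]               # variables released in this extract
    legal_basis: str                    # provision granting access
    sensitivity: str                    # e.g. "public", "personal", "sensitive"
    recipients: list[str] = field(default_factory=list)
    retention_days: int = 0             # 0 = no retention permitted
```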

Relevance:

10.00%

Publisher:

Abstract:

Secondary accident statistics can be useful for studying the impact of traffic incident management strategies. An easy-to-implement methodology is presented for classifying secondary accidents using data fusion of a police accident database with intranet incident reports. A current method for classifying secondary accidents uses a static threshold that represents the spatial and temporal region of influence of the primary accident, such as two miles and one hour. An accident is considered secondary if it occurs upstream from the primary accident and is within the duration and queue of the primary accident. However, using the static threshold may result in both false positives and negatives because accident queues are constantly varying. The methodology presented in this report seeks to improve upon this existing method by making the threshold dynamic. An incident progression curve is used to mark the end of the queue throughout the entire incident. Four steps in the development of incident progression curves are described. Step one is the processing of intranet incident reports. Step two is the filling in of incomplete incident reports. Step three is the nonlinear regression of incident progression curves. Step four is the merging of individual incident progression curves into one master curve. To illustrate this methodology, 5,514 accidents from Missouri freeways were analyzed. The results show that secondary accidents identified by dynamic versus static thresholds can differ by more than 30%.
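A minimal sketch of the dynamic rule; the progression curve below is a made-up stand-in for the report's regression-fitted curves, and mileposts are assumed to increase in the direction of travel:

```python
from datetime import datetime, timedelta

def queue_length_miles(elapsed: timedelta) -> float:
    """Stand-in incident progression curve: the queue grows, peaks near
    4 miles after 1 hour, and clears by 2 hours. The report instead fits
    these curves to incident reports by nonlinear regression."""
    h = elapsed.total_seconds() / 3600.0
    return max(0.0, 4.0 * h * (2.0 - h))

def is_secondary(primary_time: datetime, primary_mile: float,
                 accident_time: datetime, accident_mile: float) -> bool:
    """Dynamic classification: secondary if the crash occurs after the
    primary, upstream of it, and within the predicted queue, instead of
    within a fixed 2-mile/1-hour box."""
    elapsed = accident_time - primary_time
    if elapsed < timedelta(0):
        return False
    gap = primary_mile - accident_mile   # upstream = lower milepost here
    return 0.0 <= gap <= queue_length_miles(elapsed)
```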

Relevance:

10.00%

Publisher:

Abstract:

Introduction: We launched an investigator-initiated study (ISRCTN31181395) to evaluate the potential benefit of pharmacokinetic-guided dosage individualization of imatinib for leukaemia patients followed in the public and private sectors. Following approval by the research ethics committee (REC) of the coordinating centre, recruitment throughout Switzerland necessitated submitting the protocol to 11 cantonal RECs.

Materials and Methods: We analysed the requirements and evaluation procedures of the 12 RECs, with the associated costs.

Results: 1-18 copies of the dossier, in total 4,300 printed pages, were required (printing/posting costs: ~300 CHF) to meet the initial requirements. Meeting frequencies of the RECs ranged between 2 weeks and 2 months; the time from submission to first feedback took 2-75 days. Study approval was obtained from a chairman, a sub-committee or the full committee, the evaluation work being invoiced at 0-1,000 CHF (median: 750 CHF, total: 9,200 CHF). While 5 RECs gave immediate approval, the other 6 raised in total 38 queries before study release, mainly related to wording in the patient information, leading to 7 different final versions being approved. Submission tasks employed an investigator half-time over about 6 months.

Conclusion: While the necessity of clinical research evaluation by independent RECs is undisputed, there is a need for further harmonization and cooperation in evaluation procedures. The current administrative burden is complex, time-consuming and costly. A harmonized electronic application form, preferably compatible with other regulatory bodies and European countries, could increase transparency, improve communication, and encourage academic multi-centre clinical research in Switzerland.

Relevance:

10.00%

Publisher:

Abstract:

To what extent do Voting Advice Applications (VAAs) influence voting behaviour, and to what extent should providers be held accountable for such tools? This paper puts forward some empirical evidence from the Swiss VAA smartvote. The enormous popularity of smartvote in the last national elections in 2007 and the feedback of users and candidates lead us to the conclusion that smartvote is more than a toy and is likely to have an influence on voting decisions. Since Swiss citizens vote not only for parties but also for candidates, and the voting recommendation of smartvote is based on the political positions of the candidates, smartvote turns out to be particularly helpful. Political scientists must not keep their hands off such tools: scientific research is needed to understand their functioning and the possibilities of manipulating elections. On the basis of a legal study, we come to the conclusion that a science-driven way of setting up such tools is essential for their legitimacy. However, we do not believe that there is a single best way of setting up such a tool, and we rather support a market-like solution with different competing tools, provided they meet minimal standards such as transparency and equal access for all parties and candidates. Once the process of selecting candidates and parties is directly linked to the act of voting, all these questions will become even more salient.

Relevance:

10.00%

Publisher:

Abstract:

Background: Although CD4 cell count monitoring is used to decide when to start antiretroviral therapy in patients with HIV-1 infection, there are no evidence-based recommendations regarding its optimal frequency. It is common practice to monitor every 3 to 6 months, often coupled with viral load monitoring. We developed rules to guide the frequency of CD4 cell count monitoring in HIV infection before starting antiretroviral therapy, which we validated retrospectively in patients from the Swiss HIV Cohort Study.

Methodology/Principal Findings: We built two prediction rules (a "Snap-shot rule" for a single sample and a "Track-shot rule" for multiple determinations) based on a systematic review of published longitudinal analyses of CD4 cell count trajectories. We applied the rules to 2,608 untreated patients to classify their 18,061 CD4 counts as either justifiable or superfluous, according to whether their prior chance of meeting predetermined thresholds for starting treatment was >= 5% or < 5%. The percentage of measurements that both rules falsely deemed superfluous never exceeded 5%. Superfluous CD4 determinations represented 4%, 11%, and 39% of all actual determinations for treatment thresholds of 500, 350, and 200 x 10^6/L, respectively. The Track-shot rule was only marginally superior to the Snap-shot rule. Both rules lose usefulness for CD4 counts coming near the treatment threshold.

Conclusions/Significance: Frequent CD4 count monitoring of patients with CD4 counts well above the threshold for initiating therapy is unlikely to identify patients who require therapy. It appears sufficient to measure the CD4 cell count 1 year after a count > 650 for a threshold of 200, > 900 for 350, or > 1150 for 500 x 10^6/L, respectively. When CD4 counts fall below these limits, increased monitoring frequency becomes advisable. These rules offer guidance for efficient CD4 monitoring, particularly in resource-limited settings.
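The Snap-shot rule's published cut-offs translate into a one-line check; a minimal sketch using the numbers quoted above (counts in x 10^6 cells/L):

```python
# Safe single-count cut-offs from the abstract: above these, the next CD4
# measurement can wait a year for the given treatment threshold.
SNAPSHOT_CUTOFF = {200: 650, 350: 900, 500: 1150}

def repeat_within_a_year_superfluous(cd4: int, threshold: int) -> bool:
    """True if a repeat CD4 count within one year is superfluous, i.e. the
    prior chance of crossing the treatment threshold is below 5%."""
    return cd4 > SNAPSHOT_CUTOFF[threshold]

print(repeat_within_a_year_superfluous(700, 200))  # True: wait a year
print(repeat_within_a_year_superfluous(700, 350))  # False: monitor sooner
```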