64 results for Epidemiologic surveillance
Abstract:
There has always been a question mark over how best to integrate developing countries into the world trading system, and traditionally the WTO has used special and differential treatment (S&D) to do so. Since 1996, however, the WTO has been involved with the Aid for Trade (AfT) initiative, typically co-ordinated by the OECD and UN. This article first outlines the background to AfT since 1996 under the numerous agencies working in the area, highlighting how importance has always been placed on the monitoring and effectiveness of the process. It then assesses the various methods currently in use and the proposal of the WTO's Trade Policy Review Mechanism (TPRM) as a potential monitoring tool for AfT.
Abstract:
Passive equipment operating in the 30-300 GHz (millimeter-wave) band is compared with equipment operating in the 300 GHz-3 THz (submillimeter-wave) band. Equipment operating in the submillimeter band can measure distance as well as spectral information and has been used to address new opportunities in security. Solid-state spectral information is available in the submillimeter region, making it possible to identify materials, whereas in the millimeter region bulk optical properties determine the image contrast. The optical properties in the region from 30 GHz to 3 THz are discussed for some typical inorganic and organic solids. In the millimeter-wave region of the spectrum, obscurants such as poor weather, dust, and smoke can be penetrated and useful imagery generated for surveillance. In the 30 GHz-3 THz region, dielectrics such as plastic and cloth are also transparent, and the detection of contraband hidden under clothing is possible. A passive millimeter-wave imaging concept based on a folded Schmidt camera has been developed and applied to poor-weather navigation and security. The optical design uses a rotating mirror and is folded using polarization techniques. The design is very well corrected over a wide field of view, making it ideal for surveillance and security, and produces a relatively compact imager which minimizes the receiver count.
The incidence of adenocarcinoma in Barrett's oesophagus and an evaluation of endoscopic surveillance
Abstract:
Objective: Several surveillance definitions of influenza-like illness (ILI) have been proposed, based on the presence of symptoms. Symptom data can be obtained from patients, medical records, or both. Past research has found that agreement between health record data and self-report varies depending on the specific symptom. Therefore, we aimed to explore the implications of using data on influenza symptoms extracted from medical records, similar data collected prospectively from outpatients, and the combined data from both sources as predictors of laboratory-confirmed influenza. Methods: Using data from the Hutterite Influenza Prevention Study, we calculated: 1) the sensitivity, specificity and predictive values of individual symptoms within surveillance definitions; 2) how frequently surveillance definitions correlated with laboratory-confirmed influenza; and 3) the predictive value of surveillance definitions. Results: Of the 176 participants with reports from participants and medical records, 142 (81%) were tested for influenza and 37 (26%) were PCR positive for influenza. Fever (alone) and fever combined with cough and/or sore throat were highly correlated with being PCR positive for influenza for all data sources. ILI surveillance definitions based on symptom data from medical records only, or from both medical records and self-report, were better predictors of laboratory-confirmed influenza, with higher odds ratios and positive predictive values. Discussion: The choice of data source to determine ILI will depend on the patient population, outcome of interest, availability of data source, and use for clinical decision making, research, or surveillance. © Canadian Public Health Association, 2012.
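The sensitivity, specificity and predictive values mentioned in the Methods are standard 2x2-table quantities. A minimal sketch of how they are computed for a single symptom against PCR-confirmed influenza; the counts used here are purely illustrative, not the study's data:

```python
# Diagnostic metrics from a 2x2 table of symptom presence vs PCR result.
# tp: symptom present & PCR+, fp: symptom present & PCR-,
# fn: symptom absent & PCR+, tn: symptom absent & PCR-.
# All counts below are illustrative, not taken from the study.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, and NPV for one symptom."""
    sensitivity = tp / (tp + fn)   # symptom present among PCR-positives
    specificity = tn / (tn + fp)   # symptom absent among PCR-negatives
    ppv = tp / (tp + fp)           # PCR-positive among symptom-present
    npv = tn / (tn + fn)           # PCR-negative among symptom-absent
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = diagnostic_metrics(tp=30, fp=20, fn=7, tn=85)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} ppv={ppv:.2f} npv={npv:.2f}")
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of influenza in the tested population, which is why the abstract's comparison across data sources matters.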
Abstract:
To determine the incidence of giant retinal tear (GRT) in the United Kingdom and to provide epidemiologic data, clinical characteristics, treatment methods, and short-term outcomes in affected and fellow eyes. METHODS. Patients with a newly developed GRT (90° or greater in circumferential extent associated with posterior vitreous detachment) were identified prospectively over a 13-month period (January 2007-January 2008, inclusive) by active surveillance through the British Ophthalmic Surveillance Unit. Questionnaire-based data were obtained from reporting ophthalmologists at baseline and 12 months. RESULTS. Sixty patients (62 eyes) developed a new GRT, giving a U.K. annual incidence of 0.094 (95% CI 0.072-0.120) cases or 0.091 (95% CI 0.069-0.117) patients per 100,000. The GRTs were mostly idiopathic (54.8%), affecting middle-aged (mean, 42.2 years), white British (93.3%) males (71.7%), with presenting vision worse than 20/40 in 59.7%, foveal detachment in 45.2%, and proliferative vitreoretinopathy of grade C (PVR-C) or worse in 11.3%. Most eyes were managed by pars plana vitrectomy (93.5%) with laser retinopexy (52.5%) and silicone oil endotamponade (75.8%). Prophylactic 360° laser or cryotherapy was applied to 39.0% of the fellow eyes. At a mean follow-up of 11.3 months, eventual retinal reattachment was attained in 94.7%, although only 42.1% achieved vision of 20/40 or better. Neither GRT nor retinal detachment developed in any of the 19 nontraumatic, noniatrogenic, prophylactically treated fellow eyes. CONCLUSIONS. This study is the first population-based prospective effort to evaluate the epidemiology of GRT. Although only a minority presented with PVR-C and high retinal reattachment rates were achieved, fewer than half had vision sufficient for driving in the GRT eye.
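The incidence figures above (e.g. 0.091 patients per 100,000 per year from 60 patients over 13 months) come down to annualizing a surveillance count and dividing by the population at risk. A minimal sketch of the calculation, using a normal approximation to the Poisson count for the confidence interval and an assumed round UK population figure (the study's exact denominator and CI method are not reproduced here):

```python
import math

def incidence_per_100k(cases, period_months, population):
    """Annualized incidence per 100,000 with an approximate (normal) 95% CI.

    The CI treats the case count as Poisson and uses a normal
    approximation, count ± 1.96·sqrt(count), as a simple stand-in
    for an exact interval.
    """
    annualize = 12 / period_months
    rate = cases * annualize / population * 100_000
    half_width = 1.96 * math.sqrt(cases) * annualize / population * 100_000
    return rate, rate - half_width, rate + half_width

# 60 patients over 13 months; the population is an assumed round figure.
rate, lo, hi = incidence_per_100k(cases=60, period_months=13, population=61_000_000)
print(f"{rate:.3f} per 100,000 per year (approx. 95% CI {lo:.3f}-{hi:.3f})")
```

With these inputs the point estimate lands near the reported 0.091 per 100,000, and the approximate interval is close to the published 0.069-0.117.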
Abstract:
OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.
DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.
SETTING: Primary and secondary care.
PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).
INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with monitoring biennially and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg and IOP measured annually in the community.
MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).
RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings, by tonometry, true change was detected at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000. The 'NICE intensive' pathway also avoided glaucoma conversion, but NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs > £30,000. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.
CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.
FUNDING: The National Institute for Health Research Health Technology Assessment Programme.
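The comparisons in the Results above turn on the incremental cost-effectiveness ratio (ICER) judged against a willingness-to-pay threshold of £30,000 per QALY. A minimal sketch of that calculation, with purely illustrative cost and QALY figures (the study's actual values are not reproduced here):

```python
# ICER: extra cost per extra QALY gained when moving from a comparator
# pathway to a more effective one. All figures are illustrative.

def icer(cost_new, qalys_new, cost_old, qalys_old):
    """ICER in pounds per QALY; compare against a willingness-to-pay threshold."""
    return (cost_new - cost_old) / (qalys_new - qalys_old)

WTP_THRESHOLD = 30_000  # £/QALY, the threshold referenced in the study

ratio = icer(cost_new=5_200, qalys_new=14.10, cost_old=3_900, qalys_old=14.06)
print(f"ICER = £{ratio:,.0f} per QALY; below threshold: {ratio <= WTP_THRESHOLD}")
```

A pathway is "dominated", as in the Results, when it is both more costly and less effective than an alternative, in which case no ICER needs to be computed for it.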
Abstract:
AIM: To estimate the incidence of severe chemical corneal injuries in the UK and describe presenting clinical features and initial management.
METHODS: All patients with severe chemical corneal injury in the UK from December 2005 to November 2006 inclusive were prospectively identified using the British Ophthalmological Surveillance Unit. Reporting ophthalmologists provided information regarding presentation and follow-up.
RESULTS: Twelve cases were identified, giving a minimum estimated incidence of severe chemical corneal injury in the UK of 0.02 per 100,000. 66.7% of injuries were in males of working age, 50% occurred at work, and alkali was causative in 66.7%. Only one patient was wearing eye protection at the time of injury; 75% received immediate irrigation. Six patients required one or more surgical procedures, most commonly amniotic membrane graft. At 6 months' follow-up, the best-corrected visual acuity was 6/12 or better in five patients, and worse than 6/60 in two.
CONCLUSION: The incidence of severe chemical corneal injury in the UK is low. The cases that occur can require extended hospital treatment, with substantial ocular morbidity and visual sequelae. Current enforcement of eye protection in the workplace in the UK has probably contributed to a reduced incidence of severe ocular burns.
Abstract:
While the influence of temperature and moisture on the free-living stages of gastrointestinal nematodes has been described in detail, and evidence for global climate change is mounting, there have been only a few attempts to relate altered incidence or seasonal patterns of disease to climate change. Studies of this type have been completed for England, Scotland and Wales, but not for Northern Ireland (NI). Here we present an analysis of veterinary diagnostic data that relates three categories of gastrointestinal nematode infection in sheep to historical meteorological data for NI. The infections are: trichostrongylosis/teladorsagiosis (Teladorsagia/Trichostrongylus), strongyloidosis and nematodirosis. This study aims to provide a baseline for future climate change analyses and to provide basic information for the development of nematode control programmes. After identifying and evaluating possible sources of bias, climate change was found to be the most likely explanation for the observed patterns of change in parasite epidemiology, although other hypotheses could not be refuted. Seasonal rates of diagnosis showed a uniform year-round distribution for Teladorsagia and Trichostrongylus infections, suggesting consistent levels of larval survival throughout the year and extension of the traditionally expected seasonal transmission windows. Nematodirosis showed a higher level of autumn than spring infection, suggesting that suitable conditions for egg and larval development occurred after the spring infection period. Differences between regions within the Province were shown for strongyloidosis, with peaks of infection falling in the period September-November. For all three infection categories (trichostrongylosis/teladorsagiosis, strongyloidosis and nematodirosis), significant differences in the rates of diagnosis, and in the seasonality of disease, were identified between regions. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
Diabetes, in particular type 2, is associated with an increased incidence of cancer. Although the mortality attributable to cancer in type 2 diabetes is overshadowed by that due to cardiovascular disease, emerging data from epidemiologic studies suggest that insulin therapy may confer added risk for cancer, perhaps mediated by signaling through the IGF-1 (insulin-like growth factor-1) receptor. Co-administered metformin seems to mitigate the risk associated with insulin. A recent series of publications in Diabetologia addresses the possibility that glargine, the most widely used long-acting insulin analogue, may confer a greater risk than other insulin preparations, particularly for breast cancer. This has led to a heated controversy. Despite this, there is a consensus that the currently available data are not conclusive and should not be the basis for any change in practice. Further studies and more thorough surveillance of cancer in diabetes are needed to address this important issue.
Abstract:
This chapter describes an experimental system for the recognition of human faces from surveillance video. In surveillance applications, the system must be robust to changes in illumination, scale, pose and expression. The system must also be able to perform detection and recognition rapidly in real time. Our system detects faces using the Viola-Jones face detector, then extracts local features to build a shape-based feature vector. The feature vector is constructed from ratios of lengths and differences in tangents of angles, so as to be robust to changes in scale and rotations in-plane and out-of-plane. Consideration was given to improving the performance and accuracy of both the detection and recognition steps.
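The shape-based feature vector described above can be sketched as follows. The landmark names, pairings, and the exact combination of length ratios and angle terms are illustrative assumptions, not the chapter's exact design; the point is that length ratios cancel uniform scaling, and differences of angles cancel a common in-plane rotation:

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def angle(p, q):
    """Orientation of the segment from p to q, in radians."""
    return math.atan2(q[1] - p[1], q[0] - p[0])

def shape_features(lm):
    """Build a small feature vector from a dict of (x, y) facial landmarks.

    Length ratios are invariant to uniform scaling; the tangent of a
    difference of angles is invariant to a common in-plane rotation.
    """
    eye_span = dist(lm["left_eye"], lm["right_eye"])
    return [
        dist(lm["nose"], lm["mouth_l"]) / eye_span,
        dist(lm["nose"], lm["mouth_r"]) / eye_span,
        dist(lm["mouth_l"], lm["mouth_r"]) / eye_span,
        math.tan(angle(lm["left_eye"], lm["nose"])
                 - angle(lm["right_eye"], lm["nose"])),
    ]

# Illustrative landmark positions in image coordinates.
landmarks = {"left_eye": (30, 30), "right_eye": (70, 30),
             "nose": (50, 55), "mouth_l": (38, 75), "mouth_r": (62, 75)}
print(shape_features(landmarks))
```

Doubling every coordinate (a change of scale) leaves this vector unchanged, which is exactly the robustness property the chapter relies on for faces detected at varying distances from the camera.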
Abstract:
This paper examines the use of visual technologies by political activists in protest situations to monitor police conduct. Using interview data from Australian video activists, this paper seeks to understand the motivations, techniques and outcomes of video activism, and its relationship to counter-surveillance and police accountability. Our data also indicated that there have been significant transformations in the organization and deployment of counter-surveillance methods since 2000, when there were large-scale protests against the World Economic Forum meeting in Melbourne accompanied by a coordinated campaign that sought to document police misconduct. The paper identifies and examines two inter-related aspects of this: the act of filming and the process of disseminating this footage. It is noted that technological changes over the last decade have led to a proliferation of visual recording technologies, particularly mobile phone cameras, which have stimulated a corresponding proliferation of images. Analogous innovations in internet communications have stimulated a coterminous proliferation of potential outlets for images. Video footage provides activists with a valuable tool for safety and publicity. Nevertheless, we argue, video activism can have unintended consequences, including exposure to legal risks and the amplification of official surveillance. Activists are also often unable to control the political effects of their footage or the purposes for which it is used. We conclude by assessing the impact that transformations in both protest organization and media technologies might have on counter-surveillance techniques based on visual surveillance.