79 results for Epidemiological surveillance


Relevance:

20.00%

Publisher:

Abstract:

A simple new method permitting the simultaneous determination and confirmation of trace residues of 24 different growth promoters and metabolites by liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed and validated. The compounds were extracted from bovine tissue using acetonitrile; sodium sulphate was also added at this stage to aid purification. The resulting mixture was evaporated to approximately 1 ml, centrifuged at high speed, and an aliquot injected onto the LC-MS/MS system. The calculated CCα values ranged between 0.11 and 0.46 μg kg⁻¹; calculated CCβ values were in the range 0.19-0.79 μg kg⁻¹. Accuracy, measurement uncertainty, repeatability and linearity were also determined for each analyte. The analytical method was applied to a number of bovine tissue samples imported into Ireland from third countries. Progesterone was found in a number of samples at concentrations ranging between 0.28 and 30.30 μg kg⁻¹. α- and β-testosterone were also found in a number of samples, at concentrations ranging between 0.22 and 8.63 μg kg⁻¹ and between 0.16 and 2.08 μg kg⁻¹ respectively.
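The CCα and CCβ parameters reported above are the decision limit and detection capability of Commission Decision 2002/657/EC, the usual validation framework for residue methods of this kind (an assumption here: the abstract names the parameters but not the guideline). A minimal statement of the definitions for banned substances with no permitted limit:

```latex
% Decision limit (alpha = 1%) and detection capability (beta = 5%)
% for banned substances with no permitted limit (2002/657/EC):
\[
  CC_{\alpha} = \bar{x}_{\mathrm{blank}} + 2.33\, s_{\mathrm{blank}},
  \qquad
  CC_{\beta} = CC_{\alpha} + 1.64\, s
\]
```

Consistent with these definitions, the reported CCβ range (0.19-0.79 μg kg⁻¹) lies above the CCα range (0.11-0.46 μg kg⁻¹).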

Relevance:

20.00%

Publisher:

Abstract:

There has always been a question mark over how best to integrate developing countries into the world trading system, and traditionally the WTO has used special and differential treatment (S&D) to do so. Since 1996, however, the WTO has been involved with the Aid for Trade (AfT) initiative, typically co-ordinated by the OECD and UN. This article first outlines the background to AfT since 1996 under the numerous agencies working in the area, highlighting how importance has always been placed on the monitoring and effectiveness of the process. It then assesses the various methods currently in use and the proposal of the WTO's Trade Policy Review Mechanism (TPRM) as a potential monitoring tool for AfT.

Relevance:

20.00%

Publisher:

Abstract:

Passive equipment operating in the 30-300 GHz (millimeter-wave) band is compared with equipment operating in the 300 GHz-3 THz (submillimeter) band. Equipment operating in the submillimeter band can measure distance as well as spectral information, and has been used to address new opportunities in security. Solid-state spectral information is available in the submillimeter region, making it possible to identify materials, whereas in the millimeter region bulk optical properties determine the image contrast. The optical properties in the region from 30 GHz to 3 THz are discussed for some typical inorganic and organic solids. In the millimeter-wave region of the spectrum, obscurants such as poor weather, dust, and smoke can be penetrated and useful imagery generated for surveillance. In the 30 GHz-3 THz region, dielectrics such as plastic and cloth are also transparent, and the detection of contraband hidden under clothing is possible. A passive millimeter-wave imaging concept based on a folded Schmidt camera has been developed and applied to poor-weather navigation and security. The optical design uses a rotating mirror and is folded using polarization techniques. The design is very well corrected over a wide field of view, making it ideal for surveillance and security. This produces a relatively compact imager which minimizes the receiver count.
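The band labels follow directly from the wavelength-frequency relation λ = c/f; a short worked conversion using only the band edges quoted above:

```latex
\[
  \lambda = \frac{c}{f}: \quad
  30\ \mathrm{GHz} \rightarrow 10\ \mathrm{mm}, \quad
  300\ \mathrm{GHz} \rightarrow 1\ \mathrm{mm}, \quad
  3\ \mathrm{THz} \rightarrow 0.1\ \mathrm{mm}
\]
```

Hence 30-300 GHz corresponds to wavelengths of 10 mm down to 1 mm (millimeter wave), and 300 GHz-3 THz to 1 mm down to 0.1 mm (submillimeter).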

Relevance:

20.00%

Publisher:

Abstract:

A review of the medical records of 45 of 53 hospitalised patients with positive cultures for CTX-M-type ESBL-producing Escherichia coli between 1 January and 31 May 2004 was conducted. The mean age of the population studied was 73.1 (±14.6) years and the majority (55.6%) had been under the care of the internal medicine or elderly care service. In the majority (77.8%) of instances the isolate was attributed to a clinical infection rather than colonisation, and the commonest clinical specimen to yield the organism was urine, which was positive in 57.8% of patients. Acquisition of the organism was categorised as nosocomial in 68.9% of patients; in this subgroup, the median duration of inpatient stay prior to recovery of the organism was 24 (range 3-240) days. Haemodialysis dependence was the most common of the comorbidities evaluated. The mean number of antibiotics prescribed per patient in the 30 days prior to first isolation of the organism was 1.7 (range 0-4). Furthermore, the mean number of antibiotic-days of exposure per patient during this period was 13.9 (range 0-48). The most frequently received class of antibiotic was beta-lactam/beta-lactamase inhibitor combinations. Of 35 infections, 26 (74.3%) were successfully treated. Overall, 12 patients with infection died (34.3%); attributable mortality was presumed in seven (20%).
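To make the antibiotic-exposure metric concrete, here is a minimal Python sketch of how per-patient antibiotic-days over the 30-day lookback might be tallied. The record layout and the rule of counting each calendar day on any antibiotic once are assumptions for illustration; the study does not specify how overlapping courses were handled.

```python
from datetime import date, timedelta

# Hypothetical prescription records: (patient_id, course_start, course_end).
prescriptions = [
    ("p1", date(2004, 1, 3), date(2004, 1, 10)),
    ("p1", date(2004, 1, 8), date(2004, 1, 12)),  # overlaps the first course
    ("p2", date(2004, 1, 5), date(2004, 1, 6)),
]

def antibiotic_days(rows, patient_id, window_start, window_end):
    """Days on any antibiotic within the lookback window; a day covered
    by two concurrent courses is counted once."""
    days = set()
    for pid, start, end in rows:
        if pid != patient_id:
            continue
        day = max(start, window_start)
        last = min(end, window_end)
        while day <= last:
            days.add(day)
            day += timedelta(days=1)
    return len(days)

# 30-day window before a (hypothetical) first isolation date of 1 Feb 2004.
isolation = date(2004, 2, 1)
print(antibiotic_days(prescriptions, "p1", isolation - timedelta(days=30), isolation))
```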

Relevance:

20.00%

Publisher:

Abstract:

Objective: Several surveillance definitions of influenza-like illness (ILI) have been proposed, based on the presence of symptoms. Symptom data can be obtained from patients, medical records, or both. Past research has found that agreement between health record data and self-report varies depending on the specific symptom. Therefore, we aimed to explore the implications of using data on influenza symptoms extracted from medical records, similar data collected prospectively from outpatients, and the combined data from both sources as predictors of laboratory-confirmed influenza. Methods: Using data from the Hutterite Influenza Prevention Study, we calculated: 1) the sensitivity, specificity and predictive values of individual symptoms within surveillance definitions; 2) how frequently surveillance definitions correlated with laboratory-confirmed influenza; and 3) the predictive value of surveillance definitions. Results: Of the 176 participants with symptom reports from both self-report and medical records, 142 (81%) were tested for influenza and 37 (26%) were PCR positive for influenza. Fever (alone) and fever combined with cough and/or sore throat were highly correlated with being PCR positive for influenza for all data sources. ILI surveillance definitions based on symptom data from medical records only, or from both medical records and self-report, were better predictors of laboratory-confirmed influenza, with higher odds ratios and positive predictive values. Discussion: The choice of data source to determine ILI will depend on the patient population, the outcome of interest, the availability of the data source, and whether the purpose is clinical decision making, research, or surveillance. © Canadian Public Health Association, 2012.
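The sensitivity, specificity and predictive values named in the Methods reduce to simple ratios over a 2×2 table of definition-positive versus PCR-positive. A minimal Python sketch with illustrative counts (the marginals match the 142-tested / 37-PCR-positive figures above, but the cell split is invented):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and predictive values of a symptom-based
    ILI definition against PCR-confirmed influenza."""
    return {
        "sensitivity": tp / (tp + fn),  # PCR+ participants flagged by the definition
        "specificity": tn / (tn + fp),  # PCR- participants correctly not flagged
        "ppv": tp / (tp + fp),          # flagged participants who are truly PCR+
        "npv": tn / (tn + fn),          # unflagged participants who are truly PCR-
    }

# Illustrative cell counts only: 37 PCR+, 105 PCR-, 142 tested in total.
print(diagnostic_metrics(tp=25, fn=12, fp=20, tn=85))
```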

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.

DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.
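As a point of reference for the design, a discrete event simulation advances the clock from one scheduled event to the next rather than in fixed steps. A minimal Python sketch of that core loop (illustrative only; the study's model, with its monitoring pathways, treatment effects and cost accounting, is far richer):

```python
import heapq

# Event queue keyed by simulated time in years.
events = [(0.0, "baseline assessment"), (2.0, "monitoring visit"),
          (4.0, "monitoring visit"), (4.5, "conversion to glaucoma")]
heapq.heapify(events)

clock = 0.0
while events:
    clock, event = heapq.heappop(events)  # jump straight to the next event
    print(f"t = {clock:4.1f} y  {event}")
```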

SETTING: Primary and secondary care.

PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).

INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with biennial monitoring and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg, with IOP measured annually in the community.

MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).

RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings by tonometry, true change was detected at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000 per QALY. The 'NICE intensive' pathway also avoided glaucoma conversion, but NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs greater than £30,000 per QALY. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
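The dominance and threshold language above follows from the standard incremental cost-effectiveness ratio, with the £30,000-per-QALY willingness-to-pay threshold quoted in the abstract:

```latex
\[
  \mathrm{ICER} =
  \frac{C_{\mathrm{pathway}} - C_{\mathrm{comparator}}}
       {\mathrm{QALY}_{\mathrm{pathway}} - \mathrm{QALY}_{\mathrm{comparator}}}
\]
```

A pathway is conventionally considered cost-effective when its ICER falls below the threshold, and 'dominated' when it is both more costly and less effective than an alternative.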

LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon, which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.

CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.

FUNDING: The National Institute for Health Research Health Technology Assessment Programme.

Relevance:

20.00%

Publisher:

Abstract:

AIM: To estimate the incidence of severe chemical corneal injuries in the UK and describe presenting clinical features and initial management.

METHODS: All patients with severe chemical corneal injury in the UK from December 2005 to November 2006 inclusive were prospectively identified using the British Ophthalmological Surveillance Unit. Reporting ophthalmologists provided information regarding presentation and follow-up.

RESULTS: Twelve cases were identified, giving a minimum estimated incidence of severe chemical corneal injury in the UK of 0.02 per 100,000. Of these injuries, 66.7% were in males of working age, 50% occurred at work, and alkali was causative in 66.7%. Only one patient was wearing eye protection at the time of injury, and 75% received immediate irrigation. Six patients required one or more surgical procedures, most commonly amniotic membrane grafting. At 6 months' follow-up, the best-corrected visual acuity was 6/12 or better in five patients and worse than 6/60 in two.
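The incidence figure follows from dividing the 12 identified cases by the population at risk; assuming a UK population of roughly 60 million at the time (the abstract does not state the denominator used):

```latex
\[
  \frac{12\ \text{cases}}{\approx 6 \times 10^{7}\ \text{persons}}
  \times 10^{5} \approx 0.02\ \text{per}\ 100{,}000\ \text{per year}
\]
```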

CONCLUSION: The incidence of severe chemical corneal injury in the UK is low. The cases that occur can require extended hospital treatment, with substantial ocular morbidity and visual sequelae. Current enforcement of eye protection in the workplace in the UK has probably contributed to a reduced incidence of severe ocular burns.

Relevance:

20.00%

Publisher:

Abstract:

While the influence of temperature and moisture on the free-living stages of gastrointestinal nematodes has been described in detail, and evidence for global climate change is mounting, there have been only a few attempts to relate altered incidence or seasonal patterns of disease to climate change. Studies of this type have been completed for England, Scotland and Wales, but not for Northern Ireland (NI). Here we present an analysis of veterinary diagnostic data that relates three categories of gastrointestinal nematode infection in sheep to historical meteorological data for NI. The infections are: trichostrongylosis/teladorsagiosis (Teladorsagia/Trichostrongylus), strongyloidosis and nematodirosis. This study aims to provide a baseline for future climate change analyses and basic information for the development of nematode control programmes. After identifying and evaluating possible sources of bias, climate change was found to be the most likely explanation for the observed patterns of change in parasite epidemiology, although other hypotheses could not be refuted. Seasonal rates of diagnosis showed a uniform year-round distribution for Teladorsagia and Trichostrongylus infections, suggesting consistent levels of larval survival throughout the year and an extension of the traditionally expected seasonal transmission windows. Nematodirosis showed a higher level of autumn than spring infection, suggesting that suitable conditions for egg and larval development occurred after the spring infection period. Differences between regions within the Province were shown for strongyloidosis, with peaks of infection falling in the period September-November. For all three infection categories (trichostrongylosis/teladorsagiosis, strongyloidosis and nematodirosis), significant differences in the rates of diagnosis, and in the seasonality of disease, were identified between regions. © 2012 Elsevier B.V. All rights reserved.
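For readers wanting to reproduce a "seasonal rates of diagnosis" analysis of this kind, a minimal pandas sketch of the grouping step is given below; the column names and records are hypothetical, not those of the NI diagnostic database used in the study:

```python
import pandas as pd

# Hypothetical diagnostic submissions (illustrative layout only).
records = pd.DataFrame({
    "date": pd.to_datetime(["2005-04-12", "2005-10-03", "2006-09-21"]),
    "region": ["Down", "Antrim", "Down"],
    "category": ["nematodirosis", "strongyloidosis", "strongyloidosis"],
})

SEASONS = {12: "winter", 1: "winter", 2: "winter",
           3: "spring", 4: "spring", 5: "spring",
           6: "summer", 7: "summer", 8: "summer",
           9: "autumn", 10: "autumn", 11: "autumn"}

# Count diagnoses per infection category, region and season.
counts = (records.assign(season=records["date"].dt.month.map(SEASONS))
          .groupby(["category", "region", "season"])
          .size())
print(counts)
```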