24 results for injury data surveillance
in the QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
Objective: Several surveillance definitions of influenza-like illness (ILI) have been proposed, based on the presence of symptoms. Symptom data can be obtained from patients, medical records, or both. Past research has found that agreement between health-record data and self-report varies with the specific symptom. We therefore aimed to explore the implications of using data on influenza symptoms extracted from medical records, similar data collected prospectively from outpatients, and the combined data from both sources as predictors of laboratory-confirmed influenza. Methods: Using data from the Hutterite Influenza Prevention Study, we calculated: 1) the sensitivity, specificity and predictive values of individual symptoms within surveillance definitions; 2) how frequently surveillance definitions correlated with laboratory-confirmed influenza; and 3) the predictive value of surveillance definitions. Results: Of the 176 participants with symptom reports from both participants and medical records, 142 (81%) were tested for influenza and 37 (26%) were PCR positive for influenza. Fever (alone) and fever combined with cough and/or sore throat were highly correlated with being PCR positive for influenza for all data sources. ILI surveillance definitions based on symptom data from medical records only, or from both medical records and self-report, were better predictors of laboratory-confirmed influenza, with higher odds ratios and positive predictive values. Discussion: The choice of data source to determine ILI will depend on the patient population, outcome of interest, availability of the data source, and use for clinical decision making, research, or surveillance. © Canadian Public Health Association, 2012.
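The diagnostic-accuracy measures named in the methods (sensitivity, specificity and predictive values) all derive from a 2x2 table of symptom status against PCR result. A minimal sketch, with illustrative counts that are not the study's data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity and predictive values from a 2x2 table."""
    sensitivity = tp / (tp + fn)  # symptom present among PCR-positives
    specificity = tn / (tn + fp)  # symptom absent among PCR-negatives
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

# Illustrative counts only, not the Hutterite study data
se, sp, ppv, npv = diagnostic_accuracy(tp=30, fp=20, fn=7, tn=85)
```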
Abstract:
While the influence of temperature and moisture on the free-living stages of gastrointestinal nematodes has been described in detail, and evidence for global climate change is mounting, there have been only a few attempts to relate altered incidence or seasonal patterns of disease to climate change. Studies of this type have been completed for England, Scotland and Wales, but not for Northern Ireland (NI). Here we present an analysis of veterinary diagnostic data that relates three categories of gastrointestinal nematode infection in sheep to historical meteorological data for NI. The infections are: trichostrongylosis/teladorsagiosis (Teladorsagia/Trichostrongylus), strongyloidosis and nematodirosis. This study aims to provide a baseline for future climate change analyses and basic information for the development of nematode control programmes. After identifying and evaluating possible sources of bias, climate change was found to be the most likely explanation for the observed patterns of change in parasite epidemiology, although other hypotheses could not be refuted. Seasonal rates of diagnosis showed a uniform year-round distribution for Teladorsagia and Trichostrongylus infections, suggesting consistent levels of larval survival throughout the year and extension of the traditionally expected seasonal transmission windows. Nematodirosis showed a higher level of autumn than spring infection, suggesting that suitable conditions for egg and larval development occurred after the spring infection period. Differences between regions within the Province were shown for strongyloidosis, with peaks of infection falling in the period September-November. For all three infection categories (trichostrongylosis/teladorsagiosis, strongyloidosis and nematodirosis), significant differences in the rates of diagnosis, and in the seasonality of disease, were identified between regions. © 2012 Elsevier B.V. All rights reserved.
Abstract:
Diagnostic test sensitivity and specificity are probabilistic estimates with far-reaching implications for disease control, management and genetic studies. In the absence of 'gold standard' tests, traditional Bayesian latent class models may be used to assess diagnostic test accuracies through the comparison of two or more tests performed on the same groups of individuals. The aim of this study was to extend such models to estimate diagnostic test parameters and true cohort-specific prevalence, using disease surveillance data. The traditional Hui-Walter latent class methodology was extended to allow for features seen in such data, including (i) unrecorded data (i.e. data for a second test available only on a subset of the sampled population) and (ii) cohort-specific sensitivities and specificities. The model was applied with and without the modelling of conditional dependence between tests. The utility of the extended model was demonstrated through application to bovine tuberculosis surveillance data from Northern Ireland and the Republic of Ireland. Simulation coupled with re-sampling techniques demonstrated that the extended model has good predictive power to estimate the diagnostic parameters and true herd-level prevalence from surveillance data. Our methodology can aid in the interpretation of disease surveillance data, and the results can potentially refine disease control strategies.
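At the core of Hui-Walter-style latent class models is the relation linking the observed (apparent) prevalence to the latent true prevalence through a test's sensitivity and specificity. A minimal sketch of that relation and its inversion (the Rogan-Gladen estimator), with illustrative parameter values rather than values from the study:

```python
def apparent_prevalence(true_prev, se, sp):
    """P(test positive) = Se * p + (1 - Sp) * (1 - p)."""
    return se * true_prev + (1 - sp) * (1 - true_prev)

def rogan_gladen(apparent_prev, se, sp):
    """Invert the relation to back-calculate true prevalence."""
    return (apparent_prev + sp - 1) / (se + sp - 1)

# Illustrative values: a test with Se = 0.80, Sp = 0.99 on a herd
# with 10% true prevalence
ap = apparent_prevalence(true_prev=0.10, se=0.80, sp=0.99)
p_hat = rogan_gladen(ap, se=0.80, sp=0.99)  # recovers the true prevalence
```

The full latent class model places priors on these parameters and estimates them jointly across tests and cohorts; the sketch shows only the deterministic relation being modelled.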
Abstract:
BACKGROUND: Worldwide data for cancer survival are scarce. We aimed to initiate worldwide surveillance of cancer survival by central analysis of population-based registry data, as a metric of the effectiveness of health systems, and to inform global policy on cancer control.
METHODS: Individual tumour records were submitted by 279 population-based cancer registries in 67 countries for 25·7 million adults (age 15-99 years) and 75,000 children (age 0-14 years) diagnosed with cancer during 1995-2009 and followed up to Dec 31, 2009, or later. We looked at cancers of the stomach, colon, rectum, liver, lung, breast (women), cervix, ovary, and prostate in adults, and adult and childhood leukaemia. Standardised quality control procedures were applied; errors were corrected by the registry concerned. We estimated 5-year net survival, adjusted for background mortality in every country or region by age (single year), sex, and calendar year, and by race or ethnic origin in some countries. Estimates were age-standardised with the International Cancer Survival Standard weights.
FINDINGS: 5-year survival from colon, rectal, and breast cancers has increased steadily in most developed countries. For patients diagnosed during 2005-09, survival for colon and rectal cancer reached 60% or more in 22 countries around the world; for breast cancer, 5-year survival rose to 85% or higher in 17 countries worldwide. Liver and lung cancer remain lethal in all nations: for both cancers, 5-year survival is below 20% everywhere in Europe, in the range 15-19% in North America, and as low as 7-9% in Mongolia and Thailand. Striking rises in 5-year survival from prostate cancer have occurred in many countries: survival rose by 10-20% between 1995-99 and 2005-09 in 22 countries in South America, Asia, and Europe, but survival still varies widely around the world, from less than 60% in Bulgaria and Thailand to 95% or more in Brazil, Puerto Rico, and the USA. For cervical cancer, national estimates of 5-year survival range from less than 50% to more than 70%; regional variations are much wider, and improvements between 1995-99 and 2005-09 have generally been slight. For women diagnosed with ovarian cancer in 2005-09, 5-year survival was 40% or higher only in Ecuador, the USA, and 17 countries in Asia and Europe. 5-year survival for stomach cancer in 2005-09 was high (54-58%) in Japan and South Korea, compared with less than 40% in other countries. By contrast, 5-year survival from adult leukaemia in Japan and South Korea (18-23%) is lower than in most other countries. 5-year survival from childhood acute lymphoblastic leukaemia is less than 60% in several countries, but as high as 90% in Canada and four European countries, which suggests major deficiencies in the management of a largely curable disease.
INTERPRETATION: International comparison of survival trends reveals very wide differences that are likely to be attributable to differences in access to early diagnosis and optimum treatment. Continuous worldwide surveillance of cancer survival should become an indispensable source of information for cancer patients and researchers and a stimulus for politicians to improve health policy and health-care systems.
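The age-standardisation described in the methods is a weighted average of age-specific net survival using the International Cancer Survival Standard weights. A minimal sketch of the calculation; the survival figures and weights below are illustrative assumptions, not the ICSS weights or the study's estimates:

```python
def age_standardised_survival(age_specific_survival, weights):
    """Weighted average of age-specific net survival estimates."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(age_specific_survival, weights))

# Illustrative 5-year net survival for five age groups, and
# illustrative standard weights (not the actual ICSS weights)
survival = [0.90, 0.85, 0.75, 0.60, 0.45]
weights = [0.07, 0.12, 0.23, 0.29, 0.29]
std_survival = age_standardised_survival(survival, weights)
```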
Abstract:
OBJECTIVE: To investigate the role of recombinant bactericidal/permeability-increasing protein (rBPI21) in the attenuation of the sepsis syndrome and acute lung injury associated with lower limb ischemia-reperfusion (I/R) injury. SUMMARY BACKGROUND DATA: Gut-derived endotoxin has been implicated in the conversion of the sterile inflammatory response to a lethal sepsis syndrome after lower torso I/R injury. rBPI21 is a novel antiendotoxin therapy with proven benefit in sepsis. METHODS: Anesthetized ventilated swine underwent midline laparotomy and bilateral external iliac artery occlusion for 2 hours followed by 2.5 hours of reperfusion. Two groups (n = 6 per group) were randomized to receive, by intravenous infusion over 30 minutes at the start of reperfusion, either thaumatin, a control-protein preparation, at 2 mg/kg body weight, or rBPI21 at 2 mg/kg body weight. A control group (n = 6) underwent laparotomy without further treatment and was administered thaumatin at 2 mg/kg body weight after 2 hours of anesthesia. Blood from a carotid artery cannula was taken every half-hour for arterial blood gas analysis. Plasma was separated and stored at -70 degrees C for later determination of plasma tumor necrosis factor (TNF)-alpha and interleukin (IL)-6 by bioassay, and IL-8 by enzyme-linked immunosorbent assay (ELISA), as markers of systemic inflammation. Plasma endotoxin concentration was measured using ELISA. Lung tissue wet-to-dry weight ratio and myeloperoxidase concentration were used as markers of edema and neutrophil sequestration, respectively. Bronchoalveolar lavage protein concentration was measured by the bicinchoninic acid method as a measure of capillary-alveolar protein leak. The alveolar-arterial gradient was measured; a large gradient indicated impaired oxygen transport and hence lung injury.
RESULTS: Bilateral hind limb I/R injury significantly increased intestinal mucosal acidosis, intestinal permeability, portal endotoxemia, plasma IL-6 concentrations, circulating phagocytic cell priming and pulmonary leukosequestration, edema, capillary-alveolar protein leak, and impaired gas exchange. Conversely, pigs treated with rBPI21 2 mg/kg at the onset of reperfusion had significantly reduced intestinal mucosal acidosis, portal endotoxin concentrations, and circulating phagocytic cell priming and had significantly less pulmonary edema, leukosequestration, and respiratory failure. CONCLUSIONS: Endotoxin transmigration across a hyperpermeable gut barrier, phagocytic cell priming, and cytokinemia are key events of I/R injury, sepsis, and pulmonary dysfunction. This study shows that rBPI21 ameliorates these adverse effects and may provide a novel therapeutic approach for prevention of I/R-associated sepsis syndrome.
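The alveolar-arterial gradient used here as a lung-injury marker is conventionally computed from the alveolar gas equation. A minimal sketch of that standard formula; the blood-gas values are illustrative, not measurements from this study:

```python
def aa_gradient(fio2, pa_o2, pa_co2, patm=760.0, ph2o=47.0, rq=0.8):
    """Alveolar-arterial O2 gradient (mmHg) via the alveolar gas equation:
    PAO2 = FiO2 * (Patm - PH2O) - PaCO2 / RQ
    """
    pao2 = fio2 * (patm - ph2o) - pa_co2 / rq
    return pao2 - pa_o2

# Illustrative: ventilation on 100% O2, arterial PO2 300 mmHg, PCO2 40 mmHg
gradient = aa_gradient(fio2=1.0, pa_o2=300.0, pa_co2=40.0)
```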
Abstract:
Following brain injury there is often a prolonged period of deteriorating psychological condition, despite neurological stability or improvement. This is presumably consequent to the remission of anosognosia and the realisation of permanently worsened status. This change is hypothesised to be directed partially by the socially mediated processes which play a role in generating self-awareness and which here direct the reconstruction of the self as a permanently injured person. However, before we can understand this process of redevelopment, we need an unbiased technique to monitor self-awareness. Semi-structured interviews were conducted with 30 individuals with long-standing brain injuries to capture their spontaneous complaints and their level of insight into the implications of their difficulties. The focus was on what the participants said in their own words, and the extent to which self-knowledge of difficulties was spontaneously salient to the participants. Their responses were subjected to content analysis. Most participants were able to say that they had brain injuries and physical difficulties, many mentioned memory and attentional problems, and a few made references to a variety of emotional disturbances. Content analysis of data from unbiased interviews can reveal the extent to which people with brain injuries know about their difficulties. Social constructionist accounts of self-awareness and recovery are supported.
Abstract:
Although cerebral palsy (CP) is the most common cause of motor deficiency in young children, it occurs in only 2 to 3 per 1000 live births. In order to monitor prevalence rates, especially within subgroups (birthweight, clinical type), it is necessary to study large populations. A network of CP surveys and registers was formed in 14 centres in eight countries across Europe. Differences in prevalence rates of CP in the centres prior to any work on harmonization of data are reported. The subsequent process to standardize the definition of CP, inclusion/exclusion criteria, classification, and description of children with CP is outlined. The consensus that was reached on these issues will make it possible to monitor trends in CP rates, to provide a framework for collaborative research, and a basis for service planning among European countries.
Abstract:
OBJECTIVE: Despite recent increases in the volume of research in professional rugby union, there is little consensus on the epidemiology of injury in adolescent players. We undertook a systematic review to determine the incidence, severity, and nature of injury in adolescent rugby union players.
DATA SOURCES: In April 2009, we performed a computerized literature search on PubMed, Embase, and Cochrane Controlled Trials Register (via Ovid). Population-specific and patient-specific search terms were combined in the form of MEDLINE subject headings and key words (wound$ and injur$, rugby, adolescent$). These were supplemented with related-citation searches on PubMed and bibliographic tracking of primary and review articles.
STUDY SELECTION: Prospective epidemiologic studies in adolescent rugby union players.
DATA SYNTHESIS: A total of 15 studies were included, and the data were analyzed descriptively. Two independent reviewers extracted key study characteristics regarding the incidence, severity, and nature of injuries and the methodologic design.
CONCLUSIONS: Wide variations existed in the injury definitions and data collection procedures. The incidence of injury necessitating medical attention varied with the definition, from 27.5 to 129.8 injuries per 1000 match hours. The incidence of time-loss injury (>7 days) ranged from 0.96 to 1.6 per 1000 playing hours and from 11.4/1000 match hours (>1 day) to 12-22/1000 match hours (missed games). The highest incidence of concussion was 3.3/1000 playing hours. No catastrophic injuries were reported. The head and neck, upper limb, and lower limb were all common sites of injury, and trends were noted toward greater time loss due to upper limb fractures or dislocations and knee ligament injuries. Increasing age, the early part of the playing season, and the tackle situation were most closely associated with injury. Future injury-surveillance studies in rugby union must follow consensus guidelines to facilitate interstudy comparisons and provide further clarification as to where injury-prevention strategies should be focused.
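The per-1000-hour incidence rates quoted in the conclusions are simple exposure-adjusted rates: injury count divided by total player-exposure hours. A minimal sketch, with counts chosen only so the result reproduces the 27.5/1000 lower bound quoted above (they are not data from any included study):

```python
def incidence_per_1000_hours(injuries, exposure_hours):
    """Injury incidence per 1000 player-exposure hours."""
    return injuries / exposure_hours * 1000.0

# Illustrative: 55 match injuries across 2000 player-match hours
rate = incidence_per_1000_hours(injuries=55, exposure_hours=2000)
```

Differences in how exposure hours are counted (match vs training, squad size, match length) are one reason the review found such wide variation between studies.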
Abstract:
This paper describes a data model for content representation of temporal media in an IP-based sensor network. The model is formed by introducing the idea of semantic role from linguistics into the underlying concepts of formal event representation, with the aim of developing a common event model. The architecture of a prototype system for multi-camera surveillance, based on the proposed model, is described. The important aspects of the proposed model are its expressiveness, its ability to model content of temporal media, and its suitability for use with a natural language interface. It also provides a platform for temporal information fusion, as well as for organizing sensor annotations with the help of ontologies.
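The abstract describes the common event model only in outline. As an illustration of how semantic roles from linguistics might annotate a sensed event, a hypothetical sketch; every field name below is an assumption for illustration, not the authors' schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Event:
    """Hypothetical semantic-role annotation of a temporal-media event."""
    predicate: str                    # the action, e.g. "enter"
    agent: Optional[str] = None       # who performs the action
    patient: Optional[str] = None     # what the action affects
    location: Optional[str] = None    # where it happens
    t_start: float = 0.0              # seconds from stream start
    t_end: float = 0.0
    cameras: list = field(default_factory=list)  # sensors that observed it

ev = Event(predicate="enter", agent="person_12", location="lobby",
           t_start=8.4, t_end=11.0, cameras=["cam_03"])
```

A natural language interface could then map a query such as "who entered the lobby?" onto the agent role of matching events.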
Abstract:
OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.
DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.
SETTING: Primary and secondary care.
PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).
INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with monitoring biennially and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg and IOP measured annually in the community.
MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).
RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings by tonometry, true change was detected at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000. The 'NICE intensive' pathway also avoided glaucoma conversion, but NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs > £30,000. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.
CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.
FUNDING: The National Institute for Health Research Health Technology Assessment Programme.
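The cost-effectiveness comparisons above rest on the incremental cost-effectiveness ratio (ICER) judged against the £30,000-per-QALY threshold. A minimal sketch of the calculation; the cost and QALY figures are illustrative assumptions, not the study's estimates:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost per QALY gained: (dCost) / (dQALY)."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

THRESHOLD = 30_000  # GBP per QALY, upper end of the NICE reference range

# Illustrative: biennial monitoring vs a 'treat all' pathway
ratio = icer(cost_new=1200.0, cost_old=800.0, qaly_new=12.01, qaly_old=12.00)
cost_effective = ratio <= THRESHOLD
```

A pathway is "dominated" when it is both more costly and less effective than a comparator, so no ICER needs computing for it.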
Abstract:
AIM: To estimate the incidence of severe chemical corneal injuries in the UK and describe presenting clinical features and initial management.
METHODS: All patients with severe chemical corneal injury in the UK from December 2005 to November 2006 inclusive were prospectively identified using the British Ophthalmological Surveillance Unit. Reporting ophthalmologists provided information regarding presentation and follow-up.
RESULTS: Twelve cases were identified, giving a minimum estimated incidence of severe chemical corneal injury in the UK of 0.02 per 100,000. Two-thirds (66.7%) of injuries were in males of working age, 50% occurred at work, and alkali was causative in 66.7%. Only one patient was wearing eye protection at the time of injury; 75% received immediate irrigation. Six patients required one or more surgical procedures, most commonly amniotic membrane graft. At 6 months' follow-up, the best-corrected visual acuity was 6/12 or better in five patients, and worse than 6/60 in two.
CONCLUSION: The incidence of severe chemical corneal injury in the UK is low. The cases that occur can require extended hospital treatment, with substantial ocular morbidity and visual sequelae. Current enforcement of eye protection in the workplace in the UK has probably contributed to a reduced incidence of severe ocular burns.
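The headline incidence follows directly from the 12 identified cases and the population at risk. A minimal sketch of the arithmetic, taking the UK population as roughly 60 million (an assumption for illustration; the study will have used a specific denominator):

```python
def incidence_per_100k(cases, population):
    """Annual incidence per 100,000 population."""
    return cases / population * 100_000

# 12 cases over one year, UK population rounded to 60 million
rate = incidence_per_100k(cases=12, population=60_000_000)
```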
Abstract:
Aims/hypothesis: In previous studies we have shown that extravasated, modified LDL is associated with pericyte loss, an early feature of diabetic retinopathy (DR). Here we sought to determine detailed mechanisms of this LDL-induced pericyte loss.
Methods: Human retinal capillary pericytes (HRCP) were exposed to ‘highly-oxidised glycated’ LDL (HOG-LDL) (a model of extravasated and modified LDL) and to 4-hydroxynonenal or 7-ketocholesterol (components of oxidised LDL), or to native LDL for 1 to 24 h with or without 1 h of pretreatment with inhibitors of the following: (1) the scavenger receptor (polyinosinic acid); (2) oxidative stress (N-acetyl cysteine); (3) endoplasmic reticulum (ER) stress (4-phenyl butyric acid); and (4) mitochondrial dysfunction (cyclosporin A). Oxidative stress, ER stress, mitochondrial dysfunction, apoptosis and autophagy were assessed using techniques including western blotting, immunofluorescence, RT-PCR, flow cytometry and TUNEL assay. To assess the relevance of the results in vivo, immunohistochemistry was used to detect the ER stress chaperone, 78 kDa glucose-regulated protein, and the ER sensor, activating transcription factor 6, in retinas from a mouse model of DR that mimics exposure of the retina to elevated glucose and elevated LDL levels, and in retinas from human participants with and without diabetes and DR.
Results: Compared with native LDL, HOG-LDL activated oxidative and ER stress in HRCP, resulting in mitochondrial dysfunction, apoptosis and autophagy. In a mouse model of diabetes and hyperlipidaemia (vs mouse models of either condition alone), retinal ER stress was enhanced. ER stress was also enhanced in diabetic human retina and correlated with the severity of DR.
Conclusions/interpretation: Cell culture, animal, and human data suggest that oxidative stress and ER stress are induced by modified LDL, and are implicated in pericyte loss in DR.
Abstract:
Background: Acute lung injury (ALI) is a common, devastating clinical syndrome characterized by life-threatening respiratory failure requiring mechanical ventilation and by multiple organ failure. In vitro work, animal studies and pre-clinical data suggest that statins may be beneficial in ALI. The Hydroxymethylglutaryl-CoA reductase inhibition with simvastatin in Acute lung injury to Reduce Pulmonary dysfunction (HARP-2) trial is a multicenter, prospective, randomized, allocation-concealed, double-blind, placebo-controlled clinical trial which aims to test the hypothesis that treatment with simvastatin will improve clinical outcomes in patients with ALI.
Methods/Design: Patients fulfilling the American-European Consensus Conference definition of ALI will be randomized in a 1:1 ratio to receive enteral simvastatin 80 mg or placebo once daily for a maximum of 28 days. Allocation to randomized groups will be stratified with respect to hospital of recruitment and vasopressor requirement. Data will be recorded by participating ICUs until hospital discharge, and surviving patients will be followed up by post at 3, 6 and 12 months post randomization. The primary outcome is the number of ventilator-free days to day 28. Secondary outcomes are: change in oxygenation index and sequential organ failure assessment score up to day 28; number of non-pulmonary organ-failure-free days to day 28; critical care unit mortality; hospital mortality; 28-day and 12-month post-randomization mortality; health-related quality of life at discharge and at 3, 6 and 12 months post randomization; length of critical care unit and hospital stay; health service use up to 12 months post randomization; and safety. A total of 540 patients will be recruited from approximately 35 ICUs in the UK and Ireland. An economic evaluation will be conducted alongside the trial. Plasma and urine samples will be taken up to day 28 to investigate potential mechanisms by which simvastatin might act to improve clinical outcomes.
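Ventilator-free days to day 28, the trial's primary outcome, is conventionally defined as days alive and free of mechanical ventilation within the 28-day window, with patients who die before day 28 scored zero. A minimal sketch of that standard definition (not taken from the trial protocol):

```python
def ventilator_free_days(days_ventilated, survived_to_day_28, horizon=28):
    """Days alive and free of ventilation up to day 28.
    Non-survivors score 0 by convention, so death cannot look like benefit.
    """
    if not survived_to_day_28:
        return 0
    return max(horizon - days_ventilated, 0)

# A survivor ventilated for 10 days has 18 ventilator-free days
vfd = ventilator_free_days(days_ventilated=10, survived_to_day_28=True)
```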
Abstract:
This paper examines the use of visual technologies by political activists in protest situations to monitor police conduct. Using interview data with Australian video activists, this paper seeks to understand the motivations, techniques and outcomes of video activism, and its relationship to counter-surveillance and police accountability. Our data also indicated that there have been significant transformations in the organization and deployment of counter-surveillance methods since 2000, when there were large-scale protests against the World Economic Forum meeting in Melbourne accompanied by a coordinated campaign that sought to document police misconduct. The paper identifies and examines two inter-related aspects of this: the act of filming and the process of dissemination of this footage. It is noted that technological changes over the last decade have led to a proliferation of visual recording technologies, particularly mobile phone cameras, which have stimulated a corresponding proliferation of images. Analogous innovations in internet communications have stimulated a coterminous proliferation of potential outlets for images. Video footage provides activists with a valuable tool for safety and publicity. Nevertheless, we argue, video activism can have unintended consequences, including exposure to legal risks and the amplification of official surveillance. Activists are also often unable to control the political effects of their footage or the purposes to which it is put. We conclude by assessing the impact that transformations in both protest organization and media technologies might have for counter-surveillance techniques based on visual surveillance.
Abstract:
Data registration refers to a series of techniques for matching or bringing similar objects or datasets together into alignment. These techniques enjoy widespread use in a diverse variety of applications, such as video coding, tracking, object and face detection and recognition, surveillance and satellite imaging, medical image analysis and structure from motion. Registration methods are as numerous as their manifold uses, from pixel-level and block- or feature-based methods to Fourier-domain methods.
This book is focused on providing algorithms and image and video techniques for registration and quality performance metrics. The authors provide various assessment metrics for measuring registration quality alongside analyses of registration techniques, introducing and explaining both familiar and state-of-the-art registration methodologies used in a variety of targeted applications.
Key features:
- Provides a state-of-the-art review of image and video registration techniques, allowing readers to develop an understanding of how well the techniques perform by using specific quality assessment criteria
- Addresses a range of applications from familiar image and video processing domains to satellite and medical imaging among others, enabling readers to discover novel methodologies with utility in their own research
- Discusses quality evaluation metrics for each application domain with an interdisciplinary approach from different research perspectives
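Of the method families the book covers (pixel-level, block- or feature-based, and Fourier-domain), the Fourier-domain family can be illustrated by phase correlation, which recovers a translation between two images from the normalised cross-power spectrum. A minimal NumPy sketch, not drawn from the book itself:

```python
import numpy as np

def phase_correlation_shift(ref, moved):
    """Estimate the integer (dy, dx) such that moved ~= roll(ref, (dy, dx))."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moved)
    cross_power = np.conj(F1) * F2
    cross_power /= np.abs(cross_power) + 1e-12  # keep phase, discard magnitude
    corr = np.fft.ifft2(cross_power).real       # impulse at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map the peak position to a signed shift
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

# Demo: shift a random image by (5, -3) with wraparound and recover the shift
rng = np.random.default_rng(0)
img = rng.random((64, 64))
moved = np.roll(img, shift=(5, -3), axis=(0, 1))
shift = phase_correlation_shift(img, moved)
```

Because magnitude is normalised away, the correlation surface is a sharp impulse at the translation, which is what makes the method robust to uniform illumination changes; subpixel refinements interpolate around that peak.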