988 results for SURVEILLANCE NETWORK TRANSNET
Abstract:
INTRODUCTION: This study aimed to determine the epidemiology of the three most common nosocomial infections (NI), namely, sepsis, pneumonia, and urinary tract infection (UTI), in a pediatric intensive care unit (PICU) in a developing country and to define the risk factors associated with NI. METHODS: We performed a prospective study on the incidence of NI in a single PICU, between August 2009 and August 2010. Active surveillance according to the National Healthcare Safety Network (NHSN) was conducted in the unit, and children with NI (cases) were compared with a matched control group in a case-control fashion. RESULTS: We analyzed 172 patients; 22.1% had NI, 71.1% of whom acquired it in the unit. The incidence densities of sepsis, pneumonia, and UTI were 17.9, 11.4, and 4.3 per 1,000 patient-days, respectively. The most common agents in sepsis were Enterococcus faecalis and Escherichia coli (18% each); Staphylococcus epidermidis was isolated in 13% of cases. In pneumonia, Staphylococcus aureus was the most common cause (3.2%), and in UTI the most frequent agents were yeasts (33.3%). The presence of NI was associated with a long period of hospitalization, use of invasive devices (central venous catheter, nasogastric tube), and use of antibiotics. The last two were independent risk factors for NI. CONCLUSIONS: The incidence of NI acquired in this unit was high and was associated with extrinsic factors.
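For readers unfamiliar with the incidence-density measure used in this abstract, the sketch below shows the underlying arithmetic: events divided by patient-days of observation, scaled to 1,000. The counts are hypothetical and merely chosen to reproduce the reported sepsis rate; they are not the study's raw data.

```python
# Illustrative only: incidence density = events / patient-days x 1,000.
# The event and patient-day counts below are hypothetical, not the study's data.

def incidence_density(events: int, patient_days: int, per: int = 1000) -> float:
    """Number of events per `per` patient-days of observation."""
    return events / patient_days * per

# Example: 18 sepsis episodes observed over 1,005 patient-days of follow-up
print(round(incidence_density(18, 1005), 1))  # ~17.9 per 1,000 patient-days
```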
Abstract:
To evaluate the long-term impact of successive interventions on rates of methicillin-resistant Staphylococcus aureus (MRSA) colonization or infection and MRSA bacteremia in an endemic hospital-wide situation. DESIGN: Quasi-experimental, interrupted time-series analysis. The impact of the interventions was analyzed by use of segmented regression. Representative MRSA isolates were typed by use of pulsed-field gel electrophoresis. SETTING: A 950-bed teaching hospital in Seville, Spain. PATIENTS: All patients admitted to the hospital during the period from 1995 through 2008. METHODS: Three successive interventions were studied: (1) contact precautions, with no active surveillance for MRSA; (2) targeted active surveillance for MRSA in patients and healthcare workers in specific wards, prioritized according to clinical epidemiology data; and (3) targeted active surveillance for MRSA in patients admitted from other medical centers. RESULTS: Neither the preintervention rate of MRSA colonization or infection (0.56 cases per 1,000 patient-days [95% confidence interval {CI}, 0.49-0.62 cases per 1,000 patient-days]) nor the slope for the rate of MRSA colonization or infection changed significantly after the first intervention. The rate decreased significantly to 0.28 cases per 1,000 patient-days (95% CI, 0.17-0.40 cases per 1,000 patient-days) after the second intervention and to 0.07 cases per 1,000 patient-days (95% CI, 0.06-0.08 cases per 1,000 patient-days) after the third intervention, and the rate remained at a similar level for 8 years. The MRSA bacteremia rate decreased by 80%, whereas the rate of bacteremia due to methicillin-susceptible S. aureus did not change. Eighty-three percent of the MRSA isolates identified were clonally related. All MRSA isolates obtained from healthcare workers were clonally related to those recovered from patients who were in their care. CONCLUSION: Our data indicate that long-term control of endemic MRSA is feasible in tertiary care centers. The use of targeted active surveillance for MRSA in patients and healthcare workers in specific wards (identified by means of analysis of clinical epidemiology data) and the use of decolonization were key to the success of the program.
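The segmented-regression (interrupted time-series) analysis mentioned above can be illustrated with a minimal sketch. The rate series, intervention month and coefficients below are synthetic assumptions, not the study's data; the point is only to show the level-change and slope-change terms such a model estimates.

```python
# Minimal interrupted time-series (segmented regression) sketch on synthetic data.
# Not the study's dataset or code: rates, breakpoint and noise are illustrative.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(168)                 # 14 years of monthly rates (1995-2008)
intervention = 96                       # hypothetical month an intervention starts

# Synthetic series: stable level before, drop in level and slope after.
rate = (0.56
        - 0.20 * (months >= intervention)
        - 0.003 * np.maximum(months - intervention, 0)
        + rng.normal(0, 0.03, months.size))

# Design matrix: intercept, underlying trend, level change, slope change.
X = np.column_stack([
    np.ones_like(months, dtype=float),
    months,
    (months >= intervention).astype(float),
    np.maximum(months - intervention, 0),
])
coef, *_ = np.linalg.lstsq(X, rate, rcond=None)
print("baseline level, baseline slope, level change, slope change:", coef.round(3))
```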
Abstract:
BACKGROUND. Listeria monocytogenes is the third most frequent cause of bacterial meningitis. The aim of this study was to determine the incidence of, and risk factors associated with, acute community-acquired Lm meningitis in adult patients and to evaluate the clinical features, management, and outcome in this prospective case series. METHODS. A descriptive, prospective, multicentre study carried out in 9 hospitals in the Spanish Network for Research in Infectious Diseases (REIPI) over a 39-month period. All adult patients admitted to the participating hospitals with a diagnosis of acute community-acquired bacterial meningitis (Ac-ABM) were included in this study. All cases were diagnosed on the basis of a compatible clinical picture and a positive cerebrospinal fluid (CSF) culture or blood culture. The patients were followed up until death or discharge from hospital. RESULTS. Two hundred and seventy-eight patients with Ac-ABM were included. Forty-six episodes of Lm meningitis were identified in 46 adult patients. In the multivariate analysis, only age (OR 1.026; 95% CI 1.00-1.05; p = 0.042), immunosuppression (OR 2.520; 95% CI 1.05-6.00; p = 0.037), and CSF/blood glucose ratio (OR 39.42; 95% CI 4.01-387.50; p = 0.002) were independently associated with Lm meningitis. The classic triad of fever, neck stiffness and altered mental status was present in 21 (49%) patients, 32% had focal neurological findings at presentation, 12% presented with cerebellar dysfunction, and 9% had seizures. Twenty-nine (68%) patients were immunocompromised. Empirical antimicrobial therapy was intravenous ampicillin for 34 (79%) of 43 patients, combined with aminoglycosides in 11 (32%) of them. Definitive ampicillin plus gentamicin therapy was significantly associated with an unfavourable outcome (67% vs 28%; p = 0.024) and higher mortality (67% vs 32%; p = 0.040). The mortality rate was 28% (12 of 43 patients), and 5 of 31 (16.1%) surviving patients developed an adverse clinical outcome. CONCLUSIONS. Advanced age, immunosuppression, and a higher CSF/blood glucose ratio in patients with Ac-ABM should alert clinicians to a possible Lm aetiology. Furthermore, we observed a high incidence of acute community-acquired Lm meningitis in adults, and the addition of aminoglycosides to treatment should be avoided in order to improve patients' outcomes. Nevertheless, despite developments in intensive care and antimicrobial therapy, this entity is still a serious disease that carries high morbidity and mortality rates.
Abstract:
Chagas disease, named after Carlos Chagas, who first described it in 1909, exists only on the American continent. It is caused by a parasite, Trypanosoma cruzi, which is transmitted to humans by blood-sucking triatomine bugs and via blood transfusion. Chagas disease has two successive phases: acute and chronic. The acute phase lasts six to eight weeks. Several years after entering the chronic phase, 20-35% of infected individuals, depending on the geographical area, will develop irreversible lesions of the autonomic nervous system in the heart, oesophagus and colon, and of the peripheral nervous system. Data on the prevalence and distribution of Chagas disease improved in quality during the 1980s as a result of demographically representative cross-sectional studies in countries where accurate information was not previously available. A group of experts met in Brasilia in 1979 and devised standard protocols to carry out countrywide prevalence studies on human T. cruzi infection and triatomine house infestation. Thanks to a coordinated multi-country programme in the Southern Cone countries, the transmission of Chagas disease by vectors and via blood transfusion was interrupted in Uruguay in 1997, in Chile in 1999 and in Brazil in 2006; thus, the incidence of new infections by T. cruzi across the South American continent has decreased by 70%. Similar multi-country initiatives have been launched in the Andean countries and in Central America, and rapid progress has been reported towards the goal of interrupting the transmission of Chagas disease, as requested by a 1998 Resolution of the World Health Assembly. The cost-benefit analysis of investment in the vector control programme in Brazil indicates savings of US$17 in medical care and disability costs for each dollar spent on prevention, showing that the programme is a health investment with a very high return. Many well-known research institutions in Latin America were key elements of a worldwide network of laboratories that carried out basic and applied research supporting the planning and evaluation of national Chagas disease control programmes. The present article reviews the current epidemiological trends for Chagas disease in Latin America and the future challenges in terms of epidemiology, surveillance and health policy.
Abstract:
Groundwater management decisions must often be justified by quantitative aquifer models that account for the heterogeneity of hydraulic properties. Fractured aquifers are among the most heterogeneous and are very difficult to study. In such aquifers, connected fractures with millimetre-scale apertures can act as hydraulic conductors and therefore create highly localized flow. The general lack of information on the spatial distribution of fractures limits the ability to build quantitative flow and transport models. The data that condition these models are generally spatially limited and noisy, and they represent only indirect measurements of physical properties. These data limitations can be partly overcome by combining different types of data, such as hydrological data and ground-penetrating radar (GPR) data. Borehole GPR is a promising tool for identifying individual fractures up to a few tens of metres into the formation. In this thesis, I develop approaches to combine GPR with hydrological data in order to improve the characterization of fractured aquifers. Intensive hydrological investigations had already been carried out in three adjacent boreholes in a crystalline aquifer in Brittany (France). Nevertheless, the dimensions of the fractures and the 3-D geometry of the conductive fractures remained poorly known. To improve the characterization of the fracture network, I first propose an advanced GPR processing workflow that allows individual fractures to be imaged. The results show that the permeable fractures previously identified in the boreholes can be characterized geometrically far from the borehole, and that fractures that do not intersect the boreholes can also be identified. The results of a second study show that GPR data can track the transport of a saline tracer; in this way, the fractures belonging to the connected conductive network that dominates local flow and transport are identified. This is the first time that the transport of a saline tracer has been imaged over some ten metres in individual fractures. A third study confirms these results through repeated experiments and additional tracer tests in different parts of the local network. Furthermore, combining hydrological monitoring and GPR data provides evidence that temporal variations in the amplitude of GPR signals can inform us about relative changes in tracer concentrations in the formation. GPR and hydrological data are therefore complementary. I then propose a stochastic inversion approach to generate 3-D discrete fracture models that are conditioned on all available data while respecting their uncertainties. The stochastic generation of GPR-conditioned models is able to reproduce the observed hydraulic connections and their contributions to flow. The ensemble of conditioned models provides quantitative estimates of the dimensions and spatial organization of the hydraulically important fractures. This thesis clearly shows that GPR imaging is a useful tool for characterizing fractures.
Combining GPR measurements with hydrological data makes it possible to successfully condition the fracture network and to provide quantitative models. The approaches presented can be applied in other types of fractured rock formations where the rock is electrically resistive.
Abstract:
AIM: To provide insight into cancer registration coverage, data access and use in Europe. This contributes to data and infrastructure harmonisation and will foster a more prominent role for cancer registries (CRs) within public health, clinical policy and cancer research, whether within or outside the European Research Area. METHODS: During 2010-12 an extensive survey of cancer registration practices and data use was conducted among 161 population-based CRs across Europe. Responding registries (66%) operated in 33 countries, including 23 with national coverage. RESULTS: Population-based oncological surveillance started during the 1940s-50s in the northwest of Europe and from the 1970s to 1990s in other regions. European Union (EU) data protection regulations affected data access, especially in Germany and France, but less so in the Netherlands or Belgium. Regular reports were produced by CRs on incidence rates (95%), survival (60%) and stage for selected tumours (80%). Evaluation of cancer control and quality of care remained modest except in a few dedicated CRs. Activities evaluated included support of clinical audits, monitoring adherence to clinical guidelines, improvement of cancer care and evaluation of mass cancer screening. Evaluation of diagnostic imaging tools was only occasional. CONCLUSION: Most population-based CRs are well equipped for strengthening cancer surveillance across Europe. Data quality and intensity of use depend on the role the cancer registry plays in the political, oncomedical and public health setting within the country. Standard registration methodology has therefore not translated into equivalent advances in cancer prevention and mass screening, quality of care, and translational research on prognosis and survivorship across Europe. Further European collaboration remains essential to ensure access to data and comparability of the results.
Abstract:
Evolving antimicrobial resistance, coupled with a recent increase in incidence, highlights the importance of reducing gonococcal transmission. Establishing novel risk factors associated with gonorrhea facilitates the development of appropriate prevention and disease control strategies. Sexual Network Analysis (NA), a novel research technique used to further understand sexually transmitted infections, was used to identify network-based risk factors in a defined region of Ontario, Canada, experiencing an increase in the incidence of gonorrhea. Linear network structures were identified as important reservoirs of gonococcal transmission. Additionally, a significant association between a central network position and gonorrhea was observed. Centrally positioned participants were more likely to be younger, to report a greater number of risk factors, to engage in anonymous sex, to have had multiple sex partners in the past six months, and to have same-sex partners. The network-based risk factors identified through sexual NA, serving as a method of analyzing local surveillance data, support the development of strategies aimed at reducing gonococcal spread.
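As an illustration of the centrality measures such an analysis relies on, the sketch below computes degree and betweenness centrality on a small, entirely hypothetical contact network using networkx; the study's actual network data are not reproduced here.

```python
# Sketch of centrality computation on a hypothetical sexual-contact network.
# The edge list is invented; real surveillance data would replace it.
import networkx as nx

edges = [("A", "B"), ("B", "C"), ("C", "D"), ("C", "E"), ("E", "F")]  # chain-like structure
G = nx.Graph(edges)

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)

# Participants occupying central positions (high betweenness) appear first.
for node in sorted(G, key=lambda n: -betweenness[n]):
    print(node, round(degree[node], 2), round(betweenness[node], 2))
```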
Abstract:
We propose and estimate a financial distress model that explicitly accounts for the interactions, or spill-over effects, between financial institutions, through the use of a spatial contiguity matrix that is built from financial network data on interbank transactions. This setup of the financial distress model allows for the empirical validation of the importance of network externalities in determining financial distress, in addition to institution-specific and macroeconomic covariates. The relevance of this specification is that it simultaneously incorporates micro-prudential factors (Basel II) as well as macro-prudential and systemic factors (Basel III) as determinants of financial distress. Results indicate that network externalities are an important determinant of the financial health of financial institutions. The parameter that measures the effect of network externalities is both economically and statistically significant, and its inclusion as a risk factor reduces the importance of firm-specific variables such as the size or degree of leverage of the financial institution. In addition, we analyze the policy implications of the network factor model for capital requirements and deposit insurance pricing.
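A minimal sketch of the kind of network-externality regressor the abstract describes: counterparty distress weighted by a row-normalised interbank exposure matrix, placed alongside firm-specific covariates in a design matrix. The exposure matrix, distress scores and covariate values are invented for illustration and are not the paper's data or estimation procedure.

```python
# Illustrative network-externality term: W @ distress, with W a row-normalised
# interbank exposure matrix. All numbers are hypothetical.
import numpy as np

exposures = np.array([[0., 5., 1.],      # interbank exposures among three institutions
                      [2., 0., 4.],
                      [3., 1., 0.]])
W = exposures / exposures.sum(axis=1, keepdims=True)   # row-normalised weight matrix

distress = np.array([0.1, 0.6, 0.3])     # hypothetical institution-level distress scores
network_term = W @ distress              # exposure-weighted distress of counterparties

size = np.array([10.2, 8.7, 9.5])        # a firm-specific covariate, e.g. log assets
X = np.column_stack([np.ones(3), size, network_term])
print(X)  # design matrix: intercept, firm-specific factor, network externality
```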
Abstract:
Agri-environment schemes (AESs) have been implemented across EU member states in an attempt to reconcile agricultural production methods with protection of the environment and maintenance of the countryside. To determine the extent to which such policy objectives are being fulfilled, participating countries are obliged to monitor and evaluate the environmental, agricultural and socio-economic impacts of their AESs. However, few evaluations measure precise environmental outcomes and, critically, there are no agreed methodologies to evaluate the benefits of particular agri-environmental measures or to track the environmental consequences of changing agricultural practices. In response to these issues, the Agri-Environmental Footprint project developed a common methodology for assessing the environmental impact of European AESs. The Agri-Environmental Footprint Index (AFI) is a farm-level, adaptable methodology that aggregates measurements of agri-environmental indicators using Multi-Criteria Analysis (MCA) techniques. The method was developed specifically to allow assessment of differences in the environmental performance of farms according to participation in agri-environment schemes. The AFI methodology is constructed so that high values represent good environmental performance. This paper explores the use of the AFI methodology in combination with Farm Business Survey data collected in England for the Farm Accountancy Data Network (FADN), to test whether its use could be extended to the routine surveillance of the environmental performance of farming systems using established data sources. Overall, the aim was to measure the environmental impact of three different types of agriculture (arable, lowland livestock and upland livestock) in England and to identify differences in AFI due to participation in agri-environment schemes. However, because farm size, farmer age, level of education and region are also likely to influence the environmental performance of a holding, these factors were also considered. Application of the methodology revealed that only arable holdings participating in agri-environment schemes showed greater environmental performance, although responses differed between regions. Of the other explanatory variables explored, the key factors determining the environmental performance of lowland livestock holdings were farm size, farmer age and level of education. In contrast, the AFI value of upland livestock holdings differed only between regions. The paper demonstrates that the AFI methodology can be used readily with English FADN data and therefore has the potential to be applied more widely to similar data sources routinely collected across the EU-27 in a standardised manner.
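To make the idea of an MCA-style aggregated farm index concrete, the sketch below combines normalised indicator scores with weights into a single value in which higher means better environmental performance. The indicator names, weights and scores are hypothetical, not the AFI's actual indicator set or weighting scheme.

```python
# Minimal weighted-aggregation sketch in the spirit of a farm-level composite index.
# Indicators, weights and scores are invented for illustration only.

indicators = {"hedgerow_management": 7.0, "fertiliser_use": 4.0, "soil_cover": 9.0}  # raw 0-10 scores
weights = {"hedgerow_management": 0.5, "fertiliser_use": 0.3, "soil_cover": 0.2}     # sum to 1

def footprint_index(scores: dict, weights: dict, scale: float = 10.0) -> float:
    """Weighted sum of normalised indicator scores; higher = better performance."""
    return sum(weights[k] * (scores[k] / scale) for k in scores)

print(round(footprint_index(indicators, weights), 2))  # composite index for one holding
```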
Abstract:
Parasites threaten human and animal health globally. It is estimated that more than 60% of people on planet Earth carry at least one parasite, many of them several different species. Unfortunately, parasite studies suffer from duplications and inconsistencies between different investigator groups. Hence, groups need to collaborate in an integrated manner in areas including parasite control, improved therapy strategies, diagnostic and surveillance tools, and public awareness. Parasite studies will be better served if there is coordinated, multidisciplinary management of field data and samples among academic and non-academic organizations worldwide. In this paper we report the first 'Living organism-World Molecular Network', established with the cooperation of 167 parasitologists from 88 countries on all continents. This integrative approach, the 'Sarcoptes-World Molecular Network', seeks to harmonize Sarcoptes epidemiology, diagnosis, treatment, and molecular studies from all over the world, with the aim of decreasing mite infestations in humans and animals.
Abstract:
Sensitive, specific and timely surveillance is necessary to monitor progress towards measles elimination. We evaluated the performance of the sentinel and mandatory surveillance systems for measles in Switzerland over a 5-year period by comparing 145 sentinel and 740 mandatory notified cases. The higher proportion of physicians who reported at least one case per year in the sentinel system suggests underreporting in the recently introduced mandatory surveillance for measles. Accordingly, the latter reported 2-36-fold lower estimates for incidence rates than the sentinel surveillance. However, these estimates were only 0.6-12-fold lower when we considered confirmed cases alone, which indicates a higher specificity of the mandatory surveillance system. In contrast, the sentinel network, which covers 3.5% of all outpatient consultations, was weak and late in detecting a major national measles epidemic in 2003 and completely missed 2 of 10 cantonal outbreaks. Despite its better timeliness and greater sensitivity in case detection, the sentinel system, in the current situation of low incidence, is insufficient for measles control and for monitoring progress towards elimination.
Abstract:
The field of animal syndromic surveillance (SyS) is growing, with many systems being developed worldwide. Now is an appropriate time to share ideas and lessons learned from early SyS design and implementation. Based on our practical experience in animal health SyS, with additions from the public health and animal health SyS literature, we put forward for discussion a 6-step approach to designing SyS systems for livestock and poultry. The first step is to formalise policy and surveillance goals that take account of stakeholder expectations and reflect priority issues (1). Next, it is important to find consensus on national priority diseases and identify current surveillance gaps; the geographic, demographic, and temporal coverage of the system must be carefully assessed (2). A minimum dataset for SyS should then be defined that includes the essential data needed to achieve all surveillance objectives while minimizing the amount of data collected; one can then compile an inventory of the available data sources and evaluate each using the criteria developed (3). A list of syndromes should then be produced for all data sources, so that cases can be classified into syndrome classes and the data converted into time series (4). Based on the characteristics of the syndrome time series, the length of historic data available and the type of outbreaks the system must detect, different aberration detection algorithms can be tested (5). Finally, it is essential to develop a minimally acceptable response protocol for each statistical signal produced (6). Important outcomes of this pre-operational phase should be the building of a national network of experts and collective action and evaluation plans. While some of the more applied steps (4 and 5) are currently receiving consideration, more emphasis should be placed on the earlier conceptual steps by decision makers and surveillance developers (1-3).
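Steps 4 and 5 of this approach lend themselves to a small illustration: classify case records into syndrome classes, convert them into daily count series, and flag days whose counts exceed a moving baseline, in the spirit of simple EARS-type rules. The records, syndrome map, count series and threshold below are all assumptions made for the example, not an operational system.

```python
# Sketch of syndrome classification (step 4) and simple aberration detection (step 5).
# Records, syndrome map and counts are illustrative only.
from collections import defaultdict
from statistics import mean, stdev

records = [("2024-01-01", "diarrhoea"), ("2024-01-01", "cough"), ("2024-01-02", "diarrhoea")]
syndrome_map = {"diarrhoea": "gastrointestinal", "cough": "respiratory"}

# Step 4: classify raw clinical signs into syndrome classes and count per day.
counts = defaultdict(lambda: defaultdict(int))
for day, sign in records:
    counts[syndrome_map[sign]][day] += 1

# Step 5: flag days exceeding baseline mean + z standard deviations (moving window).
def aberrations(series, window=7, z=3.0):
    flags = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        if series[i] > mean(baseline) + z * (stdev(baseline) or 1.0):
            flags.append(i)
    return flags

gi_series = [2, 3, 2, 4, 3, 2, 3, 15]     # hypothetical daily gastrointestinal counts
print(aberrations(gi_series))             # -> [7]: the spike on the final day is flagged
```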
Abstract:
Clinical observations made by practitioners and reported using web- and mobile-based technologies may benefit disease surveillance by improving the timeliness of outbreak detection. Equinella is a voluntary electronic reporting and information system established for the early detection of infectious equine diseases in Switzerland. Sentinel veterinary practitioners have been able to report cases of non-notifiable diseases and clinical symptoms to an internet-based platform since November 2013. Telephone interviews were carried out during the first year to understand the motivating and constraining factors affecting voluntary reporting and the use of mobile devices in a sentinel network. We found that non-monetary incentives attract sentinel practitioners; however, insufficient understanding of the reporting system and of its relevance, as well as concerns over the electronic dissemination of health data were identified as potential challenges to sustainable reporting. Many practitioners are not yet aware of the advantages of mobile-based surveillance and may require some time to become accustomed to novel reporting methods. Finally, our study highlights the need for continued information feedback loops within voluntary sentinel networks.
Abstract:
Birth defects are the leading cause of infant mortality in the United States and are a major cause of lifetime disability. However, efforts to understand their causes have been hampered by a lack of population-specific data. During 1990–2004, 22 state legislatures responded to this need by proposing birth defects surveillance legislation (BDSL). The contrast between these states and those that did not pass BDSL provides an opportunity to better understand conditions associated with US public health policy diffusion. This study identifies key state-specific determinants that predict: (1) the introduction of birth defects surveillance legislation (BDSL) onto states' formal legislative agendas, and (2) the successful adoption of these laws. Secondary aims were to interpret these findings in a theoretically sound framework and to incorporate evidence from three analytical approaches. The study begins with a comparative case study of Texas and Oregon (states with divergent BDSL outcomes), including a review of historical documentation and content analysis of key informant interviews. After selecting and operationalizing explanatory variables suggested by the case study, Qualitative Comparative Analysis (QCA) was applied to publicly available data to describe important patterns of variation among 37 states. Results from logistic regression were compared to determine whether the two methods produced consistent findings. Themes emerging from the comparative case study included differing budgetary conditions and the significance of relationships within policy issue networks. However, the QCA and statistical analysis pointed to the importance of political parties and contrasting societal contexts. Notably, state policies that allow greater access to citizen-driven ballot initiatives were consistently associated with a lower likelihood of introducing BDSL. Methodologically, these results indicate that a case study approach, while important for eliciting valuable context-specific detail, may fail to detect the influence of overarching, systemic variables such as party competition. However, the QCA and statistical analyses were limited by a lack of existing data to operationalize policy issue networks, and thus may have downplayed the impact of personal interactions. This study contributes to the field of health policy studies in three ways. First, it emphasizes the importance of collegial and consistent relationships among policy issue network members. Second, it calls attention to political party systems in predicting policy outcomes. Finally, a novel approach to interpreting state data in a theoretically significant manner (QCA) has been demonstrated.
Abstract:
A real-time surveillance system for IP network cameras is presented. Motion, part-body, and whole-body detectors are efficiently combined to generate robust and fast detections, which feed multiple compressive trackers. The generated trajectories are then improved using a re-identification strategy for long-term operation.
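The detect-then-track pipeline described above can be sketched structurally as follows. This is not the authors' implementation: the matching rule, tracker and data are deliberately trivial stand-ins showing how detections from several detectors feed per-target trackers.

```python
# Structural sketch of a detect-then-track pipeline; all logic is a toy stand-in.
from dataclasses import dataclass, field
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # x, y, width, height

@dataclass
class SimpleTracker:
    track_id: int
    box: Box
    history: List[Box] = field(default_factory=list)

    def matches(self, det: Box, tol: int = 20) -> bool:
        # Naive position-tolerance association between a detection and this track.
        return abs(det[0] - self.box[0]) < tol and abs(det[1] - self.box[1]) < tol

    def update(self, det: Box) -> None:
        self.box = det
        self.history.append(det)

def fuse_and_track(detections_per_detector: List[List[Box]],
                   trackers: List[SimpleTracker]) -> List[SimpleTracker]:
    """Combine detections from several detectors and assign them to trackers."""
    for detections in detections_per_detector:           # motion, part-body, whole-body
        for det in detections:
            matched = next((t for t in trackers if t.matches(det)), None)
            if matched:
                matched.update(det)                       # feed an existing tracker
            else:
                trackers.append(SimpleTracker(len(trackers), det, [det]))  # new target
    return trackers

# One frame with a motion detection and an overlapping whole-body detection.
tracks = fuse_and_track([[(100, 50, 40, 80)], [(105, 52, 42, 82)]], [])
print([(t.track_id, t.box) for t in tracks])  # both detections merged into one track
```

In a real system the naive position-tolerance match would be replaced by the compressive trackers and appearance-based re-identification the abstract mentions.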