905 results for Video Surveillance System
Abstract:
Telecommunications and network technology is now the driving force behind the continued progress of world civilization. The design of new network infrastructures and the expansion of existing ones require improving the quality of service (QoS). Modeling the probabilistic and temporal characteristics of telecommunication systems is an integral part of modern algorithms for administering quality of service. At present, besides simulation models, analytical models in the form of queuing systems and queuing networks are widely used to assess quality parameters. Because of the limited mathematical tools available for models of these classes, the corresponding estimates of quality-of-service parameters are inadequate by definition, especially for models of telecommunication systems with packet transmission of multimedia real-time traffic.
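As a rough illustration of the analytical queuing models mentioned above, the sketch below computes steady-state QoS metrics for an M/M/1 queue, one of the simplest models of this class. It is not taken from the abstract, and the arrival and service rates are hypothetical.

# Illustrative only: an M/M/1 queue with hypothetical arrival rate `lam`
# (packets/s) and service rate `mu` (packets/s).

def mm1_metrics(lam: float, mu: float) -> dict:
    """Steady-state QoS metrics of an M/M/1 queue."""
    if lam >= mu:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = lam / mu  # server utilisation
    return {
        "utilisation": rho,
        "mean_packets_in_system": rho / (1 - rho),
        "mean_delay_s": 1 / (mu - lam),         # time in system per packet
        "mean_queue_wait_s": rho / (mu - lam),  # waiting time before service
    }

# e.g. 800 packets/s offered to a link that serves 1000 packets/s
print(mm1_metrics(lam=800.0, mu=1000.0))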
Abstract:
OBJECTIVES: This study aimed at investigating whether data from medical teleconsultations may contribute to influenza surveillance. METHODS: International Classification of Primary Care 2nd Edition (ICPC-2) codes were used to analyse the proportion of teleconsultations due to influenza-related symptoms. Results were compared with the weekly Swiss Sentinel reports. RESULTS: When using the ICPC-2 code for fever we could reproduce the seasonal influenza peaks of the winter seasons 07/08, 08/09 and 09/10 as depicted by the Sentinel data. For the pandemic influenza 09/10, we detected a much higher first peak in summer 2009 which correlated with a potential underreporting in the Sentinel system. CONCLUSIONS: ICPC-2 data from medical teleconsultations allows influenza surveillance in real time and correlates very well with the Swiss Sentinel system.
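The comparison described above can be sketched as follows. This is not the study's code; the column names, the weekly aggregation, and the use of ICPC-2 code A03 for fever (the abstract does not give the code) are assumptions.

# Assumed data layouts: `telecons` has one row per teleconsultation with
# columns 'date' and 'icpc2'; `sentinel` has weekly Swiss Sentinel counts
# in columns 'week' and 'ili_cases'.
import pandas as pd

def weekly_fever_proportion(telecons: pd.DataFrame, fever_code: str = "A03") -> pd.Series:
    """Proportion of teleconsultations per ISO week carrying the fever code."""
    weeks = pd.to_datetime(telecons["date"]).dt.strftime("%G-W%V")
    return (telecons["icpc2"] == fever_code).groupby(weeks).mean()

def correlation_with_sentinel(telecons: pd.DataFrame, sentinel: pd.DataFrame) -> float:
    """Pearson correlation between the teleconsultation signal and Sentinel counts."""
    fever = weekly_fever_proportion(telecons).rename("fever_share")
    merged = pd.concat([fever, sentinel.set_index("week")["ili_cases"]], axis=1).dropna()
    return merged["fever_share"].corr(merged["ili_cases"])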
Abstract:
We propose a new method, based on inertial sensors, to automatically measure at high frequency the durations of the main phases of ski jumping (i.e. take-off release, take-off, and early flight). The kinematics of the ski jumping movement were recorded by four inertial sensors, attached to the thigh and shank of junior athletes, for 40 jumps performed during indoor conditions and 36 jumps in field conditions. An algorithm was designed to detect temporal events from the recorded signals and to estimate the duration of each phase. These durations were evaluated against a reference camera-based motion capture system and by trainers conducting video observations. The precision for the take-off release and take-off durations (indoor < 39 ms, outdoor = 27 ms) can be considered technically valid for performance assessment. The errors for early flight duration (indoor = 22 ms, outdoor = 119 ms) were comparable to the trainers' variability and should be interpreted with caution. No significant changes in the error were noted between indoor and outdoor conditions, and individual jumping technique did not influence the error of take-off release and take-off. Therefore, the proposed system can provide valuable information for performance evaluation of ski jumpers during training sessions.
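The sketch below is not the authors' algorithm; it only illustrates how phase durations can be obtained once temporal events have been detected in an inertial-sensor signal sampled at a known rate. The 500 Hz sampling rate, the threshold values and the simple threshold-crossing detector are assumptions.

import numpy as np

FS = 500.0  # assumed sampling frequency in Hz

def first_crossing(signal: np.ndarray, threshold: float, start: int = 0) -> int:
    """Index of the first sample at or after `start` exceeding `threshold`."""
    hits = np.nonzero(signal[start:] > threshold)[0]
    if hits.size == 0:
        raise ValueError("no threshold crossing found")
    return start + int(hits[0])

def phase_durations_ms(signal: np.ndarray, thresholds=(0.5, 2.0, 4.0)) -> list:
    """Durations (ms) between consecutive hypothetical threshold-crossing events."""
    events, start = [], 0
    for th in thresholds:
        start = first_crossing(signal, th, start)
        events.append(start)
        start += 1
    return [(b - a) * 1000.0 / FS for a, b in zip(events, events[1:])]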
Abstract:
National malaria control programmes have the responsibility to develop a policy for malaria disease management based on a set of defined criteria such as efficacy, side effects, cost and compliance. These will fluctuate over time, and national guidelines will require periodic re-assessment and revision. Changing a drug policy is a major undertaking that can take several years before being fully operational. The standard methods on which a decision can be based are the in vivo and the in vitro tests. The latter allow a quantitative measurement of the drug response and the assessment of several drugs at once. However, in terms of drug policy change, their results might be difficult to interpret, although they may be used as an early warning system for second- or third-line drugs. The new WHO 14-day in vivo test mainly addresses the problem of treatment failure and of changes in haematological parameters in sick children. It gives valuable information on whether a drug still 'works'. None of these methods is well suited for large-scale studies. Molecular methods based on the detection of mutations in parasite molecules targeted by antimalarial drugs could be attractive tools for surveillance. However, their relationship with in vivo test results needs to be established.
Abstract:
The systematic collection of behavioural information is an important component of second-generation HIV surveillance. The extent of behavioural surveillance among injecting drug users (IDUs) in Europe was examined using data collected through a questionnaire sent to all 31 countries of the European Union and European Free Trade Association as part of a European-wide behavioural surveillance mapping study on HIV and other sexually transmitted infections. The questionnaire was returned by 28 countries during August to September 2008: 16 reported behavioural surveillance studies (two provided no further details). A total of 12 countries used repeated surveys for behavioural surveillance and five used their Treatment Demand Indicator system (three used both approaches). The data collected focused on drug use, injecting practices, testing for HIV and hepatitis C virus, and access to healthcare. Eight countries had set national indicators; three indicators were each reported by five countries: sharing of any injecting equipment, uptake of HIV testing and uptake of hepatitis C virus testing. The recall periods used varied. Seven countries reported conducting one-off behavioural surveys (in one country without a repeated survey, these resulted in an informal surveillance structure). All countries used convenience sampling, with service-based recruitment being the most common approach. Four countries had used respondent-driven sampling. Three fifths of the responding countries (18/28) reported behavioural surveillance activities among IDUs; however, harmonisation of behavioural surveillance indicators is needed.
Abstract:
The collection of data for the purpose of managing food safety includes both monitoring and surveillance. Monitoring is a system of collecting and disseminating data.
Abstract:
An ecological-evolutionary classification of Amazonian triatomines is proposed based on a revision of their main contemporary biogeographical patterns. Truly Amazonian triatomines include the Rhodniini, the Cavernicolini, and perhaps Eratyrus and some Bolboderini. The tribe Rhodniini comprises two major lineages (pictipes and robustus). The former gave rise to trans-Andean (pallescens) and Amazonian (pictipes) species groups, while the latter diversified within Amazonia (robustus group) and radiated to neighbouring ecoregions (Orinoco, Cerrado-Caatinga-Chaco, and Atlantic Forest). Three widely distributed Panstrongylus species probably occupied Amazonia secondarily, while a few Triatoma species include Amazonian populations that occur only in the fringes of the region. T. maculata probably represents a vicariant subset isolated from its parental lineage in the Caatinga-Cerrado system when moist forests closed a dry trans-Amazonian corridor. These diverse Amazonian triatomines display different degrees of synanthropism, defining a behavioural gradient from household invasion by adult triatomines to the stable colonisation of artificial structures. Anthropogenic ecological disturbance (driven by deforestation) is probably crucial in the onset of the process, but the fact that only a small fraction of species effectively colonises artificial environments suggests a role for evolution at the end of the gradient. Domestic infestation foci are restricted to drier subregions within Amazonia; thus, populations adapted to extremely humid rainforest microclimates may have limited chances of successfully colonising the slightly drier artificial microenvironments. These observations suggest several research avenues, from the use of climate data to map risk areas to the assessment of the synanthropic potential of individual vector species.
Abstract:
BACKGROUND. Listeria monocytogenes is the third most frequent cause of bacterial meningitis. The aim of this study was to determine the incidence and risk factors associated with the development of acute community-acquired Lm meningitis in adult patients and to evaluate the clinical features, management, and outcome in this prospective case series. METHODS. A descriptive, prospective, multicentre study carried out in 9 hospitals of the Spanish Network for Research in Infectious Diseases (REIPI) over a 39-month period. All adult patients admitted to the participating hospitals with a diagnosis of acute community-acquired bacterial meningitis (Ac-ABM) were included in this study. All these cases were diagnosed on the basis of a compatible clinical picture and a positive cerebrospinal fluid (CSF) culture or blood culture. The patients were followed up until death or discharge from hospital. RESULTS. Two hundred and seventy-eight patients with Ac-ABM were included. Forty-six episodes of Lm meningitis were identified in 46 adult patients. In the multivariate analysis, only age (OR 1.026; 95% CI 1.00-1.05; p = 0.042), immunosuppression (OR 2.520; 95% CI 1.05-6.00; p = 0.037), and CSF/blood glucose ratio (OR 39.42; 95% CI 4.01-387.50; p = 0.002) were independently associated with Lm meningitis. The classic triad of fever, neck stiffness and altered mental status was present in 21 (49%) patients, 32% had focal neurological findings at presentation, 12% presented cerebellar dysfunction, and 9% had seizures. Twenty-nine (68%) patients were immunocompromised. Empirical antimicrobial therapy was intravenous ampicillin for 34 (79%) of 43 patients, in 11 (32%) of them combined with aminoglycosides. Definitive ampicillin plus gentamicin therapy was significantly associated with an unfavourable outcome (67% vs 28%; p = 0.024) and higher mortality (67% vs 32%; p = 0.040). The mortality rate was 28% (12 of 43 patients) and 5 of 31 (16.1%) surviving patients developed an adverse clinical outcome. CONCLUSIONS. Advanced age, immunosuppression and a higher CSF/blood glucose ratio in patients with Ac-ABM must alert clinicians to a possible Lm aetiology. Furthermore, we observed a high incidence of acute community-acquired Lm meningitis in adults, and the addition of aminoglycosides to treatment should be avoided in order to improve patients' outcome. Nevertheless, despite developments in intensive care and antimicrobial therapy, this entity is still a serious disease that carries high morbidity and mortality rates.
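The adjusted odds ratios reported above come from a multivariate logistic regression; the sketch below shows one common way such estimates are obtained. It is not the study's analysis code, and the column names are assumptions.

# Assumed layout: one row per Ac-ABM episode with columns 'listeria' (0/1
# outcome), 'age' (years), 'immunosuppression' (0/1) and
# 'csf_blood_glucose_ratio'.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def adjusted_odds_ratios(df: pd.DataFrame) -> pd.DataFrame:
    """Fit a logistic model and return odds ratios with 95% confidence intervals."""
    fit = smf.logit(
        "listeria ~ age + immunosuppression + csf_blood_glucose_ratio", data=df
    ).fit(disp=False)
    out = pd.DataFrame({"OR": np.exp(fit.params)})
    out[["CI_low", "CI_high"]] = np.exp(fit.conf_int()).values
    return out.drop(index="Intercept")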
Abstract:
Chagas disease, named after Carlos Chagas, who first described it in 1909, exists only on the American continent. It is caused by a parasite, Trypanosoma cruzi, which is transmitted to humans by blood-sucking triatomine bugs and via blood transfusion. Chagas disease has two successive phases: acute and chronic. The acute phase lasts six to eight weeks. Several years after entering the chronic phase, 20-35% of infected individuals, depending on the geographical area, will develop irreversible lesions of the autonomic nervous system in the heart, oesophagus and colon, and of the peripheral nervous system. Data on the prevalence and distribution of Chagas disease improved in quality during the 1980s as a result of demographically representative cross-sectional studies in countries where accurate information was not previously available. A group of experts met in Brasilia in 1979 and devised standard protocols to carry out countrywide prevalence studies on human T. cruzi infection and triatomine house infestation. Thanks to a coordinated multi-country programme in the Southern Cone countries, the transmission of Chagas disease by vectors and via blood transfusion was interrupted in Uruguay in 1997, in Chile in 1999 and in Brazil in 2006; thus, the incidence of new infections by T. cruzi across the South American continent has decreased by 70%. Similar multi-country initiatives have been launched in the Andean countries and in Central America, and rapid progress has been reported towards the goal of interrupting the transmission of Chagas disease, as requested by a 1998 Resolution of the World Health Assembly. The cost-benefit analysis of investment in the vector control programme in Brazil indicates savings of US$17 in medical care and disability costs for each dollar spent on prevention, showing that the programme is a health investment with a very high return. Many well-known research institutions in Latin America were key elements of a worldwide network of laboratories that carried out basic and applied research supporting the planning and evaluation of national Chagas disease control programmes. The present article reviews the current epidemiological trends for Chagas disease in Latin America and the future challenges in terms of epidemiology, surveillance and health policy.
Abstract:
BACKGROUND: Surveillance is an essential element of surgical site infection (SSI) prevention. Few studies have evaluated the long-term effect of these programmes. AIM: To present data from a 13-year multicentre SSI surveillance programme from western and southern Switzerland. METHODS: Surveillance with post-discharge follow-up was performed according to the US National Nosocomial Infections Surveillance (NNIS) system methods. SSI rates were calculated for each surveyed type of surgery, overall and by year of participation in the programme. Risk factors for SSI and the effect of surveillance time on SSI rates were analysed by multiple logistic regression. FINDINGS: Overall SSI rates were 18.2% after 7411 colectomies, 6.4% after 6383 appendicectomies, 2.3% after 7411 cholecystectomies, 1.7% after 9933 herniorrhaphies, 1.6% after 6341 hip arthroplasties, and 1.3% after 3667 knee arthroplasties. The frequency of SSI detected after discharge varied between 21% for colectomy and 94% for knee arthroplasty. Independent risk factors for SSI differed between operations. The NNIS risk index was predictive of SSI in gastrointestinal surgery only. Laparoscopic technique was protective overall, but associated with higher rates of organ-space infections after appendicectomy. The duration of participation in the surveillance programme was not associated with a decreased SSI rate for any of the included procedures. CONCLUSION: These data confirm the effect of post-discharge surveillance on SSI rates and the protective effect of laparoscopy. There is a need to establish alternative case-mix adjustment methods. In contrast to other European programmes, no positive impact of surveillance duration on SSI rates was observed.
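The NNIS risk index mentioned above stratifies operations by patient and procedure factors; the sketch below shows its basic computation. It is an illustration rather than the programme's own code, and the T time passed in is a hypothetical, procedure-dependent value.

# Basic NNIS risk index (0-3): one point each for ASA score >= 3, a
# contaminated or dirty wound class, and an operation lasting longer than
# the procedure-specific T time (75th percentile of duration).

def nnis_risk_index(asa_score: int, wound_class: str, duration_min: float,
                    t_time_min: float) -> int:
    """Return the NNIS risk index for one operation."""
    points = 0
    if asa_score >= 3:
        points += 1
    if wound_class.lower() in ("contaminated", "dirty"):
        points += 1
    if duration_min > t_time_min:
        points += 1
    return points

# e.g. ASA 3 patient, clean-contaminated colectomy of 200 min with a T time of 180 min
print(nnis_risk_index(asa_score=3, wound_class="clean-contaminated",
                      duration_min=200, t_time_min=180))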
Abstract:
Risk factor surveillance is a complementary tool of morbidity and mortality surveillance that improves the likelihood that public health interventions are implemented in a timely fashion. The aim of this study was to identify population predictors of malaria outbreaks in endemic municipalities of Colombia with the goal of developing an early warning system for malaria outbreaks. We conducted a multiple-group, exploratory, ecological study at the municipal level. Each of the 290 municipalities with endemic malaria that we studied was classified according to the presence or absence of outbreaks. The measurement of variables was based on historic registries and logistic regression was performed to analyse the data. Altitude above sea level [odds ratio (OR) 3.65, 95% confidence interval (CI) 1.34-9.98], variability in rainfall (OR 1.85, 95% CI 1.40-2.44) and the proportion of inhabitants over 45 years of age (OR 0.17, 95% CI 0.08-0.38) were factors associated with malaria outbreaks in Colombian municipalities. The results suggest that environmental and demographic factors could have a significant ability to predict malaria outbreaks on the municipal level in Colombia. To advance the development of an early warning system, it will be necessary to adjust and standardise the collection of required data and to evaluate the accuracy of the forecast models.
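The study above used logistic regression on historic registry data; as a complementary illustration of the early-warning idea, the sketch below applies a simple mean-plus-two-standard-deviations alert threshold. This generic rule is not the authors' model, and the case counts are hypothetical.

import numpy as np

def outbreak_alert(history: np.ndarray, current_cases: int, z: float = 2.0) -> bool:
    """Flag a possible outbreak when the current weekly case count exceeds the
    historical mean plus `z` standard deviations for the same week of the year."""
    threshold = history.mean() + z * history.std(ddof=1)
    return current_cases > threshold

# e.g. counts for the same epidemiological week over five previous years
print(outbreak_alert(np.array([12, 9, 15, 11, 13]), current_cases=30))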
Abstract:
Epidemiological surveillance systems are essential and require efficient collaborations between family doctors and public health services. Such a system has to take into account the increase in the number of health problems to be studied. Information gathered at an individual level should imply decisions at a population level which in turn should impact on the individual patient. Epidemiological surveillance requires a well organized, representative and constantly revised system led by motivated, adequately trained doctors.
Abstract:
Countries could use the monitoring of drug resistance in malaria parasites as an effective early warning system to develop the timely response mechanisms that are required to avert the further spread of malaria. Drug resistance surveillance is essential in areas where no drug resistance has been reported, especially if neighbouring countries have previously reported resistance. Here, we present the results of a four-year surveillance program based on the sequencing of the pfcrt gene of Plasmodium falciparum populations from endemic areas of Honduras. All isolates were susceptible to chloroquine, as revealed by the pfcrt “CVMNK” genotype in codons 72-76.
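The genotype call described above amounts to translating codons 72-76 of pfcrt and comparing the resulting haplotype with known susceptible and resistant variants; the sketch below illustrates this and is not the study's pipeline. The input DNA string is hypothetical.

# CVMNK is the chloroquine-susceptible wild type at pfcrt codons 72-76;
# CVIET and SVMNT are well-described resistance-associated haplotypes.
from Bio.Seq import Seq

KNOWN_HAPLOTYPES = {
    "CVMNK": "chloroquine-susceptible (wild type)",
    "CVIET": "chloroquine-resistant",
    "SVMNT": "chloroquine-resistant",
}

def classify_pfcrt_72_76(dna_codons_72_76: str) -> str:
    """Translate the 15 nt covering codons 72-76 and label the haplotype."""
    haplotype = str(Seq(dna_codons_72_76).translate())
    return f"{haplotype}: {KNOWN_HAPLOTYPES.get(haplotype, 'unrecognised haplotype')}"

# hypothetical wild-type input: TGT GTA ATG AAT AAA -> C V M N K
print(classify_pfcrt_72_76("TGTGTAATGAATAAA"))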
Abstract:
We refer to Oswaldo Cruz’s reports dating from 1913 about the necessities of a healthcare system for the Brazilian Amazon Region and about the journey of Carlos Chagas to 27 locations in this region and the measures that would need to be adopted. We discuss the risks of endemicity of Chagas disease in the Amazon Region. We recommend that epidemiological surveillance of Chagas disease in the Brazilian Amazon Region and Pan-Amazon region should be implemented through continuous monitoring of the human population that lives in the area, their housing, the environment and the presence of triatomines. The monitoring should be performed with periodic seroepidemiological surveys, semi-annual visits to homes by health agents and the training of malaria microscopists and healthcare technicians to identify Trypanosoma cruzi from patients’ samples and T. cruzi infection rates among the triatomines caught. We recommend health promotion and control of Chagas disease through public health policies, especially through sanitary education regarding the risk factors for Chagas disease. Finally, we propose a healthcare system through base hospitals, intermediate-level units in the areas of the Brazilian Amazon Region and air transportation, considering the distances to be covered for medical care.
Abstract:
BACKGROUND: European Surveillance of Congenital Anomalies (EUROCAT) is a network of population-based congenital anomaly registries in Europe surveying more than 1 million births per year, or 25% of the births in the European Union. This paper describes the potential of the EUROCAT collaboration for pharmacoepidemiology and drug safety surveillance. METHODS: The 34 full members and 6 associate members of the EUROCAT network were sent a questionnaire about their data sources on drug exposure and on drug coding. Data on drug exposure during the first trimester available in the central EUROCAT database for the years 1996-2000 were summarised for 15 of the 25 responding full members. RESULTS: Of the 40 registries, 29 returned questionnaires (25 full and 4 associate members). Four of these registries do not collect data on maternal drug use. Of the full members, 15 registries use the EUROCAT drug code, 4 use the international ATC drug code, 3 use another coding system and 7 use a combination of these coding systems. Obstetric records are the most frequently used source of drug information for the registries, followed by interviews with the mother. Only one registry uses pharmacy data. The percentage of cases with drug exposure (excluding vitamins/minerals) varied from 4.4% to 26.0% among registries. The categories of drugs recorded varied widely between registries. CONCLUSIONS: Practices vary widely between registries regarding the recording of drug exposure information. EUROCAT has the potential to be an effective collaborative framework for contributing to post-marketing drug surveillance in relation to teratogenic effects, but work is needed to implement ATC drug coding more widely and to diversify the sources of information used to determine drug exposure in each registry.
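The exposure percentages reported above can, in principle, be reproduced from case-level data; the sketch below shows one way to do so under assumed column names, excluding vitamins and minerals via their ATC groups (A11 and A12). It is not EUROCAT's own code.

import pandas as pd

def exposure_percentage_by_registry(cases: pd.DataFrame) -> pd.Series:
    """Percentage of cases per registry with any first-trimester drug exposure,
    excluding vitamins (ATC A11) and minerals (ATC A12).

    Assumed layout: one row per case with columns 'registry' and 'atc_codes',
    the latter being a list of first-trimester ATC codes (possibly empty)."""
    exposed = cases["atc_codes"].apply(
        lambda codes: any(not c.startswith(("A11", "A12")) for c in codes)
    )
    return exposed.groupby(cases["registry"]).mean() * 100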