998 results for Monitoring stations
Abstract:
OBJECTIVE: This study aimed to survey current practices in European epilepsy monitoring units (EMUs) with emphasis on safety issues. METHODS: A 37-item questionnaire investigating characteristics and organization of EMUs, including measures for prevention and management of seizure-related serious adverse events (SAEs), was distributed to all identified European EMUs plus one located in Israel (N=150). RESULTS: Forty-eight (32%) EMUs, located in 18 countries, completed the questionnaire. The number of EMU beds is 1-2 in 43%, 3-4 in 34%, and 5-6 in 19% of EMUs; the number of staff physicians is 1-2 in 32%, 3-4 in 34%, and 5-6 in 19% of EMUs. Personnel operating in EMUs include epileptologists (in 69% of EMUs), clinical neurophysiologists trained in epilepsy (in 46% of EMUs), child neurologists (in 35% of EMUs), neurology and clinical neurophysiology residents (in 46% and in 8% of EMUs, respectively), and neurologists not trained in epilepsy (in 27% of EMUs). In 20% of EMUs, patient observation is only intermittent or limited to daytime hours; observation is primarily carried out by neurophysiology technicians and/or nurses (in 71% of EMUs) or by patients' relatives (in 40% of EMUs). Automatic detection systems are used for seizures in 15%, for body movements in 8%, for oxygen desaturation in 33%, and for ECG abnormalities in 17% of EMUs. Protocols are lacking for the management of acute seizures in 27% of EMUs, of status epilepticus in 21%, and of postictal psychoses in 87%. Injury prevention consists of bed protections in 96% of EMUs, whereas antisuffocation pillows are employed in 21%, and environmental protections in monitoring rooms and in bathrooms are implemented in 38% and in 25% of EMUs, respectively. The most common SAEs were status epilepticus (reported by 79% of EMUs), injuries (73%), and postictal psychoses (67%). CONCLUSIONS: All EMUs have faced different types of SAEs. Wide variation in practice patterns and the lack of protocols and precautions to ensure patient safety might promote the occurrence and severity of SAEs. Our findings highlight the need for standardized and shared protocols for the effective and safe management of patients in EMUs.
Abstract:
PURPOSE OF REVIEW: Multimodal monitoring (MMM) is routinely applied in neurointensive care. Unfortunately, there is no robust evidence on which MMM-derived physiologic variables are the most clinically relevant, how and when they should be monitored, and whether MMM impacts outcome. The complexity is even higher because, even once the data are continuously collected, the interpretation and integration of these complex physiologic events into targeted individualized care are still embryonic. RECENT FINDINGS: Recent clinical investigation has mainly focused on intracranial pressure, perfusion of the brain, and oxygen availability, along with electrophysiology. Moreover, a series of articles reviewing the available evidence on all the MMM tools and giving practical recommendations for bedside MMM has been published, along with other consensus documents on the role of neuromonitoring and electroencephalography in this setting. SUMMARY: MMM allows comprehensive exploration of the complex pathophysiology of acute brain damage and, depending on the configuration of the pathological condition being treated, the application of targeted individualized care. Unfortunately, we still lack robust evidence on how to better integrate MMM-derived information at the bedside to improve patient management. Advanced informatics is promising and may provide us with a supportive tool to interpret physiologic events and guide pathophysiology-based therapeutic decisions.
Abstract:
Controlling the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so specialized domain agents are independent of negotiation processes and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.
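The abstract describes the architecture only at a high level. Purely as an illustration, the sketch below models the core idea in Python: a system agent interprets a protocol as a sequence of tasks negotiated with specialized domain agents, while a database broker agent supplies patient history. All class and method names are hypothetical and are not taken from the paper.

```python
# Hypothetical, minimal sketch of the multi-agent idea described above.
# A monitoring (system) agent walks through a protocol and negotiates each
# step with the domain agent responsible for that medical service, while a
# database broker agent supplies the patient's clinical history.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class DatabaseBrokerAgent:
    """Domain agent mediating access to the clinical history."""
    records: Dict[str, Dict[str, str]] = field(default_factory=dict)

    def query(self, patient_id: str, item: str) -> str:
        return self.records.get(patient_id, {}).get(item, "unknown")


@dataclass
class ServiceAgent:
    """Specialized domain agent wrapping one medical service (e.g. radiology)."""
    name: str

    def negotiate(self, task: str) -> bool:
        # A real agent would check availability, contraindications, etc.
        print(f"[{self.name}] accepted task: {task}")
        return True


@dataclass
class ProtocolMonitorAgent:
    """System agent interpreting a protocol as negotiations with services."""
    services: Dict[str, ServiceAgent]
    broker: DatabaseBrokerAgent

    def run(self, patient_id: str, protocol: List[Tuple[str, str]]) -> bool:
        allergies = self.broker.query(patient_id, "allergies")
        print(f"Patient {patient_id}, known allergies: {allergies}")
        for service_name, task in protocol:
            if not self.services[service_name].negotiate(task):
                print(f"Protocol deviation detected at step: {task}")
                return False
        return True


if __name__ == "__main__":
    broker = DatabaseBrokerAgent({"p42": {"allergies": "penicillin"}})
    services = {s: ServiceAgent(s) for s in ("lab", "radiology", "pharmacy")}
    ProtocolMonitorAgent(services, broker).run(
        "p42", [("lab", "blood count"), ("radiology", "chest X-ray")])
```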
Abstract:
BACKGROUND: In acute respiratory failure, arterial blood gas analysis (ABG) is used to diagnose hypercapnia. Once non-invasive ventilation (NIV) is initiated, ABG should at least be repeated within 1 h to assess the PaCO2 response to treatment and help detect NIV failure. The main aim of this study was to assess whether measuring end-tidal CO2 (EtCO2) with a dedicated naso-buccal sensor during NIV could predict PaCO2 variation and/or PaCO2 absolute values. The additional aim was to assess whether active or passive prolonged expiratory maneuvers could improve the agreement between expiratory CO2 and PaCO2. METHODS: This is a prospective study in adult patients suffering from acute hypercapnic respiratory failure (PaCO2 ≥ 45 mmHg) treated with NIV. EtCO2 and expiratory CO2 values during active and passive expiratory maneuvers were measured using a dedicated naso-buccal sensor and compared to concomitant PaCO2 values. The agreement between two consecutive values of EtCO2 (delta EtCO2) and two consecutive values of PaCO2 (delta PaCO2), and between PaCO2 and concomitant expiratory CO2 values, was assessed using the Bland and Altman method adjusted for the effects of repeated measurements. RESULTS: Fifty-four datasets from a population of 11 patients (8 COPD and 3 non-COPD patients) were included in the analysis. PaCO2 values ranged from 39 to 80 mmHg, and EtCO2 from 12 to 68 mmHg. In the agreement between delta EtCO2 and delta PaCO2, bias was -0.3 mmHg, and limits of agreement were -17.8 and 17.2 mmHg. In the agreement between PaCO2 and EtCO2, bias was 14.7 mmHg, and limits of agreement were -6.6 and 36.1 mmHg. Adding active and passive expiration maneuvers did not improve PaCO2 prediction. CONCLUSIONS: During NIV delivered for acute hypercapnic respiratory failure, measuring EtCO2 with a dedicated naso-buccal sensor was inaccurate for predicting both PaCO2 and PaCO2 variations over time. Active and passive expiration maneuvers did not improve PaCO2 prediction. TRIAL REGISTRATION: ClinicalTrials.gov: NCT01489150.
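For context, the bias and limits of agreement reported above come from a Bland-Altman analysis: the mean of the paired differences and that mean plus or minus 1.96 times their standard deviation. The sketch below, using made-up EtCO2/PaCO2 pairs, shows the basic computation only; the study used the version adjusted for repeated measurements per patient, which this simplified example omits.

```python
# Basic (unadjusted) Bland-Altman sketch on fictitious EtCO2/PaCO2 pairs (mmHg).
import numpy as np

etco2 = np.array([38.0, 45.0, 52.0, 30.0, 60.0, 41.0])  # fictitious values
paco2 = np.array([50.0, 58.0, 66.0, 47.0, 74.0, 55.0])  # fictitious values

diff = paco2 - etco2                    # paired differences
bias = diff.mean()                      # mean difference (bias)
sd = diff.std(ddof=1)                   # sample SD of the differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"bias = {bias:.1f} mmHg")
print(f"limits of agreement = [{loa_low:.1f}, {loa_high:.1f}] mmHg")
```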
Abstract:
BACKGROUND: The need to contextualise wastewater-based figures about illicit drug consumption by comparing them with other indicators has been stressed by numerous studies. The objective of the present study was to further investigate the possibility of combining wastewater data with conventional statistics to assess the reliability of the former method and obtain a more balanced picture of illicit drug consumption in the investigated area. METHODS: Wastewater samples were collected between October 2013 and July 2014 in the metropolitan area of Lausanne (226,000 inhabitants), Switzerland. The loads of methadone, its metabolite 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine (EDDP), the exclusive metabolite of heroin, 6-monoacetylmorphine (6-MAM), and morphine were used to estimate the amounts of methadone and heroin consumed. RESULTS: Methadone consumption estimated from EDDP was in agreement with expectations. Heroin estimates based on 6-MAM loads were inconsistent. Estimates obtained from morphine loads, combined with prescription/sales data, were in agreement with figures derived from syringe distribution data and general population surveys. CONCLUSIONS: The results obtained for methadone made it possible to assess the reliability of the selected sampling strategy, supporting its ability to capture the consumption of a small cohort (i.e., 743 patients). Using morphine as a marker, in combination with prescription/sales data, estimates in accordance with other indicators of heroin use were obtained. Combining different sources of data strengthened the results and suggested that the different indicators (i.e., administration route, average dosage and number of consumers) contribute to depicting a realistic representation of the phenomenon in the investigated area. Heroin consumption was estimated at approximately 13 g/day (118 g/day at street level).
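The back-calculation behind such estimates is conceptually simple: divide the measured daily metabolite load by the fraction of the parent dose excreted as that metabolite, scale by the parent/metabolite molar-mass ratio, and normalize by the catchment population. The sketch below illustrates this with placeholder numbers; the excretion fraction, molar masses and load are illustrative assumptions, not figures used in the study.

```python
# Illustrative wastewater back-calculation: parent-drug consumption from the
# daily load of one urinary metabolite. All numeric parameters are placeholder
# assumptions, not values taken from the Lausanne study.

def estimate_consumption(metabolite_load_g_per_day: float,
                         excretion_fraction: float,
                         mw_parent: float,
                         mw_metabolite: float) -> float:
    """Return estimated parent-drug consumption in g/day."""
    return (metabolite_load_g_per_day / excretion_fraction
            * mw_parent / mw_metabolite)

if __name__ == "__main__":
    # Hypothetical example: 5 g/day of a metabolite at the treatment plant,
    # assuming 25% of the parent dose is excreted as that metabolite.
    consumption = estimate_consumption(5.0, 0.25, 300.0, 290.0)
    per_1000 = consumption / 226_000 * 1000   # catchment of 226,000 inhabitants
    print(f"estimated consumption: {consumption:.1f} g/day "
          f"({per_1000 * 1000:.0f} mg/day per 1000 inhabitants)")
```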
Abstract:
Dynamic adaptations of one's behavior by means of performance monitoring are a central function of the human executive system that is subject to considerable interindividual variation. Converging evidence from electrophysiological and neuroimaging studies in both animals and humans hints at the importance of the dopaminergic system for the regulation of performance monitoring. Here, we studied the impact of two polymorphisms affecting dopaminergic functioning in the prefrontal cortex [catechol-O-methyltransferase (COMT) Val108/158Met and dopamine D4 receptor (DRD4) single-nucleotide polymorphism (SNP)-521] on neurophysiological correlates of performance monitoring. We applied a modified version of a standard flanker task with an embedded stop-signal task to tap into the different functions involved, particularly error monitoring, conflict detection and inhibitory processes. Participants homozygous for the DRD4 T allele produced an increased error-related negativity after both choice errors and failed inhibitions compared with C-homozygotes. This was associated with pronounced compensatory behavior reflected in higher post-error slowing. No group differences were seen in the incompatibility N2, suggesting distinct effects of the DRD4 polymorphism on error monitoring processes. Additionally, participants homozygous for the COMT Val allele, with a thereby diminished prefrontal dopaminergic level, revealed increased prefrontal processing related to inhibitory functions, reflected in the enhanced stop-signal-related components N2 and P3a. The results extend previous findings from mainly behavioral and neuroimaging data on the relationship between dopaminergic genes and executive functions and present possible underlying mechanisms for the previously suggested association between these dopaminergic polymorphisms and psychiatric disorders such as schizophrenia or attention deficit hyperactivity disorder.
Abstract:
Water is vital to humans, and each of us needs at least 1.5 L of safe water a day to drink. Beginning as long ago as 1958, the World Health Organization (WHO) has published guidelines to help ensure water is safe to drink. Focused from the start on monitoring radionuclides in water, and continually cooperating with WHO, the International Organization for Standardization (ISO) has been publishing standards on radioactivity test methods since 1978. As reliable, comparable and "fit for purpose" results are an essential requirement for any public health decision based on radioactivity measurements, international standards of tested and validated radionuclide test methods are an important tool for the production of such measurements. This paper presents the ISO standards already published that could be used as normative references by testing laboratories in charge of radioactivity monitoring of drinking water, as well as those currently being drafted, and the prospect of standardized fast test methods in response to a nuclear accident.
Abstract:
An important task in environmental monitoring is to assess the current state of the environment and the changes caused to it by human activity, and to analyze and identify the relationships between them. Environmental change can be managed by collecting and analyzing data. This Master's thesis studied changes observed in aquatic vegetation using remotely acquired measurement data and image analysis methods. Aerial images of Lake Saimaa, the largest lake in Finland, taken in 1996 and 1999 were used for the environmental monitoring. The first stage of the image analysis is geometric correction, whose purpose is to align and register the acquired images into the same coordinate system. The second stage is to match the corresponding local areas and detect changes in the vegetation. Several approaches were used for vegetation detection, including supervised and unsupervised classification methods. The study used real, noisy measurement data, and the experiments carried out on it gave good results, indicating the success of the approach.
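As a rough illustration of the second stage described above (comparing co-registered images and detecting vegetation change), the sketch below clusters the per-pixel difference of two synthetic, already-registered grayscale rasters into "changed" and "unchanged" classes with unsupervised k-means. It is a toy example under stated assumptions, not the pipeline actually used in the thesis.

```python
# Toy unsupervised change-detection sketch: cluster the per-pixel difference
# of two co-registered grayscale rasters with k-means. Synthetic data stands
# in for the 1996 and 1999 aerial images.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
img_1996 = rng.normal(100.0, 10.0, size=(64, 64))   # synthetic raster
img_1999 = img_1996.copy()
img_1999[20:40, 20:40] += 40.0                       # simulated vegetation change

diff = np.abs(img_1999 - img_1996).reshape(-1, 1)    # per-pixel change magnitude
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(diff)

# Call the cluster with the larger mean difference the "changed" class.
changed = int(diff[labels == 1].mean() > diff[labels == 0].mean())
change_mask = (labels == changed).reshape(64, 64)
print(f"changed pixels: {change_mask.sum()} of {change_mask.size}")
```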
Abstract:
BACKGROUND & AIMS: Trace elements (TE) are involved in the immune and antioxidant defences, which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality control study aimed at assessing the economic impact of computer-reminded blood sampling versus risk-guided, on-demand monitoring of plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of 2 cohorts of patients admitted during 6-month periods in 2006 and 2009 to the ICU of a University hospital. INCLUSION CRITERIA: receipt of intravenous micronutrient supplements and/or TE sampling during the ICU stay. The TE samplings were triggered by a computerized reminder in 2006 versus guided by nutritionists in 2009. RESULTS: During the 2 periods, 636 patients met the inclusion criteria out of 2406 consecutive admissions, representing 29.7% and 24.9%, respectively, of the periods' admissions. The 2009 patients had higher SAPS2 scores (p = 0.02) and lower BMI compared to 2006 (p = 0.007). The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of the cohort, resulting in a 55% cost reduction. CONCLUSIONS: The monitoring of TE concentrations guided by a nutritionist reduced the sampling frequency and targeted the sickest, high-risk patients requiring adaptation of the nutritional prescription. This approach leads to cost reductions compared with an automated sampling prescription.
Abstract:
The citrus nursery tree is produced through the bud grafting process, in which the rootstock is usually grown from seed. The objective of this research was to evaluate, under two dissimilar environmental conditions, the viability and polyembryony expression of seeds of five citrus rootstocks stored for different periods under refrigeration. The rootstock varieties evaluated were: Rangpur lime (Citrus limonia Osb. cv. Limeira), Trifoliate orange (Poncirus trifoliata Raf. cv. Limeira), Citrumelo (P. trifoliata x C. paradisi Macf. cv. Swingle), Sunki mandarin (C. sunki Hort. ex Tanaka) and Volkamer lemon (C. volkameriana Ten. & Pasq. cv. Catania 2). The experimental design was randomized blocks in an 11 x 5 x 2 factorial scheme, evaluating, from time zero to the tenth month of storage, the five rootstock varieties in two environments: a germination and growth B.O.D.-type chamber (Biological Oxygen Demand - Eletrolab Brand Model FC 122) at 25 °C, and a greenhouse seedbed with partial temperature control (22 °C to 36 °C) and humidity control (75-85%). Each plot had 24 seeds in four replicates, using trays with substrate in the greenhouse and Petri dishes with filter paper in the B.O.D. chamber. The seed germination rate and polyembryony expression were evaluated monthly. It was concluded that Trifoliate and Citrumelo Swingle seeds can be stored for up to seven months, while Volkamer lemon, Rangpur lime and Sunki seeds can be stored for up to ten months. The polyembryony expression rate was slightly higher when measured in the greenhouse than in the B.O.D. chamber and remained stable in both environments until the seventh month, after which it dropped sharply. Citrumelo Swingle seeds expressed the highest polyembryony rate (18.8%), followed by Rangpur lime and Volkamer lemon (average value of 13.7%), Sunki (9.4%) and Trifoliate (3.2%). Despite some differences among varieties, the viability of stored rootstock seeds can be monitored either in the greenhouse or in the B.O.D. germination chamber, the latter being the faster and more suitable method.
Abstract:
PURPOSE: Adequate empirical antibiotic dose selection for critically ill burn patients is difficult due to extreme variability in drug pharmacokinetics. Therapeutic drug monitoring (TDM) may aid antibiotic prescription and implementation of initial empirical antimicrobial dosage recommendations. This study evaluated how the gradual introduction of TDM altered empirical dosages of meropenem and imipenem/cilastatin in our burn ICU. METHODS: Imipenem/cilastatin and meropenem use and daily empirical dosage at a five-bed burn ICU were analyzed retrospectively. Data for all burn admissions between 2001 and 2011 were extracted from the hospital's computerized information system. For each patient receiving a carbapenem, episodes of infection were reviewed and scored according to predefined criteria. Carbapenem trough serum levels were characterized. Prior to May 2007, TDM was available only by special request. Real-time carbapenem TDM was introduced in June 2007; it was initially available weekly and has been available 4 days a week since 2010. RESULTS: Of 365 patients, 229 (63%) received antibiotics (109 received carbapenems). Of 23 TDM determinations for imipenem/cilastatin, none exceeded the predefined upper limit and 11 (47.8%) were insufficient; the number of TDM requests was correlated with the daily dose (r=0.7). Inappropriate meropenem trough levels (30.4%) fell in similar numbers below and above the predefined limits. The introduction of real-time TDM increased the empirical dose of imipenem/cilastatin, but not of meropenem. CONCLUSIONS: Real-time carbapenem TDM availability significantly altered the empirical daily dosage of imipenem/cilastatin at our burn ICU. Further studies are needed to evaluate the individual impact of TDM-based antibiotic adjustment on infection outcomes in these patients.