946 results for Monitoring methods
Abstract:
Peri-insular hemispherotomy is a surgical technique used in the treatment of drug-resistant epilepsy of hemispheric origin. It is based on the exposure of the insula and semi-circular sulci, providing access to the lateral ventricle through a supra- and infra-insular window. From inside the ventricle, a parasagittal callosotomy is performed. The basal and medial portion of the frontal lobe is isolated. Projections to the anterior commissure are interrupted at the time of amygdala resection. The hippocampal tail and fimbria-fornix are disrupted posteriorly. We report our experience of 18 cases treated with this approach. More than half of them presented with congenital epilepsy. Neuronavigation was useful in precisely determining the center and extent of the craniotomy, as well as the direction of tractotomies and callosotomy, allowing minimal exposure and blood loss. Intra-operative monitoring by scalp EEG on the contralateral hemisphere was used to follow the progression of the number of interictal spikes during the disconnection procedure. Approximately 90% of patients were in Engel's Class I. We observed one patient who presented with transient postoperative neurological deterioration, probably due to CSF overdrainage, and documented one case of incomplete disconnection in a patient with hemimegalencephaly who needed a second operation. We observed a good correlation between a significant decrease in the number of spikes at the end of the procedure and seizure outcome. Peri-insular hemispherotomy provides a functional disconnection of the hemisphere with minimal resection of cerebral tissue. It is an efficient technique with a low complication rate. Intra-operative EEG monitoring might be used as a predictive factor of completeness of the disconnection and, consequently, seizure outcome.
Oral cancer treatments and adherence: medication event monitoring system assessment for capecitabine
Abstract:
Background: Oncological treatments are traditionally administered intravenously by qualified personnel. Oral formulations, which are developing rapidly, are preferred by patients and facilitate administration; however, they may increase non-adherence. In this ongoing study, 4 common oral chemotherapeutics are given to 50 patients (inclusion still in progress) divided into 4 groups. The aim is to evaluate adherence and offer these patients interdisciplinary support with the joint help of doctors and pharmacists. We present here the results for capecitabine. Materials and Methods: The final goal is to evaluate adherence in 50 patients split into 4 groups according to oral treatment (letrozole/exemestane, imatinib/sunitinib, capecitabine and temozolomide), using persistence and quality of execution as parameters. These parameters are evaluated using a medication event monitoring system (MEMS®) in addition to routine oncological visits and semi-structured interviews. Patients were monitored for the entire duration of treatment, up to a maximum of 1 year. Patient satisfaction was assessed at the end of the monitoring period using a standardized questionnaire. Results: The capecitabine group included 2 women and 8 men with a median age of 55 years (range: 36-77 years), monitored for an average duration of 100 days (range: 5-210 days). Persistence was 98% and quality of execution 95%. 5 patients underwent cyclic treatment (2 out of 3 weeks) and 5 patients continuous treatment. Toxicities higher than grade 1 were grade 2-3 hand-foot syndrome in 1 patient and grade 3 acute coronary syndrome in 1 patient, both without impact on adherence. Patients were satisfied with the interviews undergone during the study (57% useful, 28% very useful, 15% useless) and successfully integrated the MEMS® into their daily lives (57% very easily, 43% easily) according to the questionnaire results obtained at the end of the monitoring period.
Conclusion: Persistence and quality of execution observed in our capecitabine group were excellent and better than expected compared with previously published studies. The interdisciplinary approach allowed us to better identify and help patients with toxicities to maintain adherence. Overall, patients were satisfied with the global interdisciplinary follow-up. Longer follow-up will allow a better evaluation of our method and its impact. Interpretation of the results of patients in the other groups of this ongoing trial will provide information for a more detailed analysis.
Abstract:
OBJECT: Cerebrovascular pressure reactivity is the ability of cerebral vessels to respond to changes in transmural pressure. A cerebrovascular pressure reactivity index (PRx) can be determined as the moving correlation coefficient between mean intracranial pressure (ICP) and mean arterial blood pressure. METHODS: The authors analyzed a database consisting of 398 patients with head injuries who underwent continuous monitoring of cerebrovascular pressure reactivity. In 298 patients, the PRx was compared with a transcranial Doppler ultrasonography assessment of cerebrovascular autoregulation (the mean index [Mx]), in 17 patients with the PET-assessed static rate of autoregulation, and in 22 patients with the cerebral metabolic rate for O(2). Patient outcome was assessed 6 months after injury. RESULTS: There was a positive and significant association between the PRx and Mx (R(2) = 0.36, p < 0.001) and with the static rate of autoregulation (R(2) = 0.31, p = 0.02). A PRx > 0.35 was associated with a high mortality rate (> 50%). The PRx showed significant deterioration in refractory intracranial hypertension, was correlated with outcome, and was able to differentiate patients with good outcome, moderate disability, severe disability, and death. A plot of PRx against cerebral perfusion pressure (CPP) indicated a U-shaped curve, suggesting that both too low and too high CPP are associated with a disturbance in pressure reactivity and that an optimal CPP exists. Such an optimal CPP was confirmed in individual cases, and a greater difference between current and optimal CPP was associated with worse outcome (for patients who, on average, were treated below optimal CPP [R(2) = 0.53, p < 0.001] and for patients whose mean CPP was above optimal CPP [R(2) = -0.40, p < 0.05]). Following decompressive craniectomy, pressure reactivity initially worsened (median -0.03 [interquartile range -0.13 to 0.06] to 0.14 [interquartile range 0.12-0.22]; p < 0.01) and improved in the later postoperative course.
After therapeutic hypothermia, in 17 (70.8%) of 24 patients in whom rewarming exceeded the brain temperature threshold of 37 degrees C, ICP remained stable, but the average PRx increased to 0.32 (p < 0.0001), indicating significant derangement in cerebrovascular reactivity. CONCLUSIONS: The PRx is a secondary index derived from changes in ICP and arterial blood pressure and can be used as a surrogate marker of cerebrovascular impairment. In view of an autoregulation-guided CPP therapy, a continuous determination of a PRx is feasible, but its value has to be evaluated in a prospective controlled trial.
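The PRx described above is a moving Pearson correlation between time-averaged ICP and arterial blood pressure. A minimal sketch of that computation, assuming time-averaged samples (e.g. 10-second means) and a 30-sample window; the averaging interval and window length are illustrative assumptions, not values taken from this abstract:

```python
import numpy as np

def prx(icp, abp, window=30):
    """Moving Pearson correlation between mean ICP and mean ABP.

    icp, abp: 1-D sequences of time-averaged pressure samples
    (e.g. 10-s means). Returns one PRx value per full window.
    Window length and sample averaging are assumptions for
    illustration, not the study's exact settings.
    """
    icp = np.asarray(icp, dtype=float)
    abp = np.asarray(abp, dtype=float)
    out = []
    for start in range(len(icp) - window + 1):
        i = icp[start:start + window]
        a = abp[start:start + window]
        # Pearson correlation of the two windows
        out.append(np.corrcoef(i, a)[0, 1])
    return np.array(out)
```

With intact reactivity, slow ABP waves are buffered and PRx stays near zero or negative; a persistently positive PRx (e.g. > 0.35 in the series above) indicates passive transmission of ABP to ICP.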
Abstract:
BACKGROUND: Poor medication adherence is a frequent cause of treatment failure but is difficult to diagnose. In this study we evaluated the impact of measuring adherence to cinacalcet-HCl and phosphate binders in dialysis patients with uncontrolled secondary hyperparathyroidism. METHODS: 7 chronic dialysis patients with iPTH levels >= 300 pg/ml despite treatment with >= 60 mg cinacalcet-HCl were included. Medication adherence was measured using the "Medication Events Monitoring System" during 3 months, followed by another 3-month period without monitoring. The adherence results, as well as strategies to improve them, were discussed monthly with the patients. RESULTS: During monitoring, the percentage of prescribed doses taken was higher for cinacalcet-HCl (87.4%) and sevelamer (86.3%) than for calcium acetate (76.1%), as was the taking adherence (81.9% vs. 57.3% vs. 49.1%) but not the percentage of drug holidays (12.3% vs. 4.5% vs. 3.6%). Mean PO4 levels (from 2.24 +/- 0.6 mmol/l to 1.73 +/- 0.41 mmol/l; p = 0.14) and the Ca++ x PO4 product (4.73 +/- 1.43 to 3.41 +/- 1.04 mmol2/l2; p = 0.12) improved, and the iPTH level improved significantly from 916 +/- 618 pg/ml to 442 +/- 326 pg/ml (p = 0.04), without any change in medication. However, when drug monitoring was interrupted, all laboratory parameters worsened again. CONCLUSIONS: Assessment of drug adherence documented episodes of non-compliance and helped to avoid seemingly necessary dose increases.
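The MEMS metrics reported above (percentage of prescribed doses taken, taking adherence, drug holidays) are derived from the timestamped cap openings the device records. A minimal sketch, assuming the simplest operational definitions: taking adherence as the share of days with at least the prescribed number of openings, and a drug holiday as a day with no opening at all. These definitions are assumptions for illustration; published MEMS studies define the metrics in more detail.

```python
from collections import Counter
from datetime import date, timedelta

def mems_metrics(openings, start, end, doses_per_day=1):
    """Summarize MEMS cap openings between start and end (inclusive).

    openings: iterable of datetime.date values, one per recorded opening.
    Returns (doses_taken_pct, taking_adherence_pct, drug_holiday_pct).
    Operational definitions here are simplified assumptions.
    """
    per_day = Counter(openings)
    n_days = (end - start).days + 1
    days = [start + timedelta(days=d) for d in range(n_days)]
    prescribed = n_days * doses_per_day
    # Openings beyond the prescribed count on a day are not credited
    taken = sum(min(per_day.get(day, 0), doses_per_day) for day in days)
    # Days on which the full prescribed number of doses was taken
    correct = sum(per_day.get(day, 0) >= doses_per_day for day in days)
    # Days with no opening at all ("drug holidays")
    holidays = sum(per_day.get(day, 0) == 0 for day in days)
    return (100 * taken / prescribed,
            100 * correct / n_days,
            100 * holidays / n_days)
```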
Abstract:
Aims: Therapeutic Drug Monitoring (TDM) is an established tool to optimize pharmacotherapy with immunosuppressants, antibiotics, antiretroviral agents, anticonvulsants and psychotropic drugs. The TDM expert group of the Association of Neuropsychopharmacology and Pharmacopsychiatry recommended clinical guidelines for TDM of psychotropic drugs in 2004 and in 2011. They allocate 4 levels of recommendation based on studies reporting plasma concentrations and clinical outcomes. To evaluate the additional benefit for drugs without direct evidence for TDM, and to verify the recommendation levels of the expert group, the authors built a new rating scale. Methods: This rating scale included 28 items and was divided into 5 categories: efficacy, toxicity, pharmacokinetics, patient characteristics and cost effectiveness. A literature search was performed for 10 antidepressants, 10 antipsychotics, 8 drugs used in the treatment of substance-related disorders, and lithium; thereafter, a comparison with the assessment of the TDM expert group was carried out. Results: The antidepressants as well as the antipsychotics showed a high and significant correlation with the recommendations in the consensus guidelines. However, deviations were detected for the drugs used in the therapy of substance-related disorders, for which TDM is mostly not established yet. The results for the antidepressants and antipsychotics permit a classification of the reachable points: above 13, TDM strongly recommended; 10 to 13, TDM recommended; 8 to 10, TDM useful; and below 8, TDM potentially useful. Conclusion: These results suggest this rating scale is sensitive enough to detect the appropriateness of TDM for drug treatment. For those drugs for which TDM is not established, a more objective estimation is possible; thus the scoring helps to focus on the drugs most likely to require TDM.
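The score bands reported above map a rating-scale total to a recommendation level. A minimal sketch of that mapping; the band boundaries overlap in the abstract (10 and 8 appear in two bands), so the half-open intervals chosen here are an assumption:

```python
def tdm_recommendation(score):
    """Map a 28-item rating-scale total to a TDM recommendation level.

    Band boundaries assume half-open intervals, since the abstract's
    bands (above 13 / 10-13 / 8-10 / below 8) overlap at 10 and 8.
    """
    if score > 13:
        return "TDM strongly recommended"
    if score >= 10:
        return "TDM recommended"
    if score >= 8:
        return "TDM useful"
    return "TDM potentially useful"
```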
Abstract:
The generation of an antigen-specific T-lymphocyte response is a complex multi-step process. Upon T-cell receptor-mediated recognition of antigen presented by activated dendritic cells, naive T-lymphocytes enter a program of proliferation and differentiation, during the course of which they acquire effector functions and may ultimately become memory T-cells. A major goal of modern immunology is to precisely identify and characterize effector and memory T-cell subpopulations that may be most efficient in disease protection. Sensitive methods are required to address these questions in exceedingly low numbers of antigen-specific lymphocytes recovered from clinical samples, and not manipulated in vitro. We have developed new techniques to dissect immune responses against viral or tumor antigens. These allow the isolation of various subsets of antigen-specific T-cells (with major histocompatibility complex [MHC]-peptide multimers and five-color FACS sorting) and the monitoring of gene expression in individual cells (by five-cell reverse transcription-polymerase chain reaction [RT-PCR]). We can also follow their proliferative life history by flow-fluorescence in situ hybridization (FISH) analysis of average telomere length. Recently, using these tools, we have identified subpopulations of CD8+ T-lymphocytes with distinct proliferative history and partial effector-like properties. Our data suggest that these subsets descend from recently activated T-cells and are committed to become differentiated effector T-lymphocytes.
Abstract:
Drug resistance is one of the principal obstacles blocking worldwide malaria control. In Colombia, malaria remains a major public health concern and drug-resistant parasites have been reported. In vitro drug susceptibility assays are a useful tool for monitoring the emergence and spread of drug-resistant Plasmodium falciparum. The present study was conducted as a proof of concept for an antimalarial drug resistance surveillance network based on in vitro susceptibility testing in Colombia. Sentinel laboratories were set up in three malaria-endemic areas. The histidine-rich protein 2 enzyme-linked immunosorbent assay and schizont maturation methods were used to assess the susceptibility of fresh P. falciparum isolates to six antimalarial drugs. This study demonstrates that an antimalarial drug resistance surveillance network based on in vitro methods is feasible in the field with the participation of a research institute, local health institutions and universities. It could also serve as a model for a regional surveillance network. Preliminary susceptibility results showed widespread chloroquine resistance, which was consistent with previous reports for the Pacific region. However, high susceptibility to dihydroartemisinin and lumefantrine, compounds currently used for treatment in the country, was also reported. The implementation process identified critical points and opportunities for the improvement of network sustainability strategies.
Abstract:
Nonimmediate drug hypersensitivity reactions (DHRs) are difficult to manage in daily clinical practice, mainly owing to their heterogeneous clinical manifestations and the lack of selective biological markers. In vitro methods are necessary to establish a diagnosis, especially given the low sensitivity of skin tests and the inherent risks of drug provocation testing. In vitro evaluation of nonimmediate DHRs must include approaches that can be applied during the different phases of the reaction. During the acute phase, monitoring markers in both skin and peripheral blood helps to discriminate between immediate and nonimmediate DHRs with cutaneous responses and to distinguish between reactions that, although they present similar clinical symptoms, are produced by different immunological mechanisms and therefore have a different treatment and prognosis. During the resolution phase, in vitro testing is used to detect the response of T cells to drug stimulation; however, this approach has certain limitations, such as the lack of validated studies assessing sensitivity. Moreover, in vitro tests indicate an immune response that is not always related to a DHR. In this review, members of the Immunology and Drug Allergy Committee of the Spanish Society of Allergy and Clinical Immunology (SEAIC) provide an overview of the most widely used in vitro tests for evaluating nonimmediate DHRs.
Abstract:
Background: Amplitude-integrated electroencephalogram (aEEG) is increasingly used for neuromonitoring in preterms. We aimed to quantify the effects of gestational age (GA), postnatal age (PNA), and other perinatal factors on the development of aEEG early after birth in very preterm newborns with normal cerebral ultrasounds. Methods: Continuous aEEG was prospectively performed in 96 newborns (mean GA: 29.5 (range: 24.4-31.9) wk, birth weight 1,260 (580-2,120) g) during the first 96 h of life. aEEG tracings were qualitatively (maturity scores) and quantitatively (amplitudes) evaluated using preestablished criteria. Results: A significant increase in all aEEG measures was observed between day 1 and day 4 and for increasing GA (P < 0.001). The effect of PNA on aEEG development was 6.4- to 11.3-fold higher than that of GA. In multivariate regression, GA and PNA were associated with increased qualitative and quantitative aEEG measures, whereas small-for-GA status was independently associated with increased maximum aEEG amplitude (P = 0.003). Morphine administration negatively affected all aEEG measures (P < 0.05), and caffeine administration negatively affected qualitative aEEG measures (P = 0.02). Conclusion: During the first few days after birth, aEEG activity in very preterm infants develops significantly and is strongly influenced by PNA. Perinatal factors may alter the early aEEG tracing and interfere with its interpretation.
Abstract:
A new ambulatory method of monitoring physical activities in Parkinson's disease (PD) patients is proposed based on a portable data-logger with three body-fixed inertial sensors. A group of ten PD patients treated with subthalamic nucleus deep brain stimulation (STN-DBS) and ten normal control subjects followed a protocol of typical daily activities, and the whole period of the measurement was recorded by video. Walking periods were recognized using two sensors on the shanks, and lying periods were detected using a sensor on the trunk. By calculating kinematic features of the trunk movements during the transitions between sitting and standing postures and using a statistical classifier, sit-to-stand (SiSt) and stand-to-sit (StSi) transitions were detected and separated from other body movements. Finally, a fuzzy classifier used this information to detect periods of sitting and standing. The proposed method showed a high sensitivity and specificity for the detection of basic body postures: sitting, standing, lying, and walking periods, both in PD patients and healthy subjects. We found significant differences in parameters related to SiSt and StSi transitions between PD patients and controls, and also between PD patients with and without STN-DBS turned on. We conclude that our method provides a simple, accurate, and effective means to objectively quantify physical activities in both normal subjects and PD patients and may prove useful to assess the level of motor function in the latter.
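The posture detection above combines shank and trunk sensors with statistical and fuzzy classifiers. As a much simpler illustration of the underlying idea, the trunk sensor alone can separate lying from upright postures at rest from the direction of gravity in the sensor frame. This sketch is not the study's method; the sensor axis convention and the 60-degree threshold are assumptions:

```python
import math

def trunk_posture(ax, ay, az, lying_deg=60.0):
    """Classify lying vs upright from a trunk accelerometer at rest.

    ax, ay, az: gravity components in the sensor frame, with the x axis
    assumed to point along the trunk (head-ward when standing).
    Returns 'lying' when the trunk axis tilts more than lying_deg from
    vertical. A simplified illustration only; the study fused shank and
    trunk sensors with statistical and fuzzy classifiers.
    """
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    # Angle between the trunk axis and the measured gravity vector
    cos_tilt = max(-1.0, min(1.0, ax / norm))
    tilt = math.degrees(math.acos(cos_tilt))
    return "lying" if tilt > lying_deg else "upright"
```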
Abstract:
The development of forensic intelligence relies on the expression of suitable models that better represent the contribution of forensic intelligence in relation to the criminal justice system, policing and security. Such models assist in comparing and evaluating methods and new technologies, provide transparency and foster the development of new applications. Interestingly, strong similarities between two separate projects focusing on specific forensic science areas were recently observed. These observations have led to the induction of a general model (Part I) that could guide the use of any forensic science case data in an intelligence perspective. The present article builds upon this general approach by focusing on decisional and organisational issues. The article investigates the comparison process and evaluation system that lie at the heart of the forensic intelligence framework, advocating scientific decision criteria and a structured but flexible and dynamic architecture. These building blocks are crucial and clearly lie within the expertise of forensic scientists. However, this is only part of the problem. Forensic intelligence includes other blocks with their respective interactions, decision points and tensions (e.g. regarding how to guide detection and how to integrate forensic information with other information). Formalising these blocks identifies many questions and potential answers. Addressing these questions is essential for the progress of the discipline. Such a process requires clarifying the role and place of the forensic scientist within the whole process and their relationship to other stakeholders.
Abstract:
In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
Abstract:
Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows a detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Independently from the origin of the images and the type of surface change, a correct processing of such data implies the adoption of flexible, robust and possibly nonlinear methods, to correctly account for the complex statistical relationships characterizing the pixels of the images. This thesis deals with the development and the application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored and fundamental solutions for change detection problems are provided. In the first part, change detection with user supervision has been considered. In a first application, a nonlinear classifier has been applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel has been injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what he believes has changed or not. In the second part, a completely automatic and unsupervised method for precise binary detection of changes has been proposed. The technique allows a very accurate mapping without any user intervention, resulting particularly useful when readiness and reaction times of the system are a crucial constraint. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches to transform the pair of bi-temporal images and reduce their differences unrelated to changes in land cover are studied.
The methods align the distributions of the images, so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or the spectral information content. This opens the doors to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
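The unsupervised binary change detection discussed above is often benchmarked against a simple pixel-wise baseline: take the magnitude of the spectral difference between co-registered acquisitions and threshold it. A minimal sketch of that baseline; the mean-plus-k-standard-deviations thresholding rule is a common simple choice, not the method developed in the thesis:

```python
import numpy as np

def change_map(img1, img2, k=1.5):
    """Baseline unsupervised change detection by image differencing.

    img1, img2: co-registered arrays of shape (rows, cols, bands).
    Computes the per-pixel magnitude of the spectral difference and
    thresholds it at mean + k * std of the magnitudes. Returns a
    boolean change mask. A simple baseline for illustration only.
    """
    diff = np.asarray(img2, dtype=float) - np.asarray(img1, dtype=float)
    # Euclidean norm of the spectral difference at each pixel
    mag = np.sqrt((diff ** 2).sum(axis=-1))
    thr = mag.mean() + k * mag.std()
    return mag > thr
```

Real pre-change and post-change images need radiometric alignment first, which is exactly the distribution-shift problem the thesis addresses.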
Abstract:
OBJECTIVES: Recommendations for EEG monitoring in the ICU are lacking. The Neurointensive Care Section of the ESICM assembled a multidisciplinary group to establish consensus recommendations on the use of EEG in the ICU. METHODS: A systematic review was performed and 42 studies were included. Data were extracted using the PICO approach, including: (a) population, i.e. ICU patients with at least one of the following: traumatic brain injury, subarachnoid hemorrhage, intracerebral hemorrhage, stroke, coma after cardiac arrest, septic and metabolic encephalopathy, encephalitis, and status epilepticus; (b) intervention, i.e. EEG monitoring of at least 30 min duration; (c) control, i.e. intermittent vs. continuous EEG, as no studies compared patients with a specific clinical condition, with and without EEG monitoring; (d) outcome endpoints, i.e. seizure detection, ischemia detection, and prognostication. After selection, evidence was classified and recommendations developed using the GRADE system. RECOMMENDATIONS: The panel recommends EEG in generalized convulsive status epilepticus and to rule out nonconvulsive seizures in brain-injured patients and in comatose ICU patients without primary brain injury who have unexplained and persistent altered consciousness. We suggest EEG to detect ischemia in comatose patients with subarachnoid hemorrhage and to improve prognostication of coma after cardiac arrest. We recommend continuous over intermittent EEG for refractory status epilepticus and suggest it for patients with status epilepticus and suspected ongoing seizures and for comatose patients with unexplained and persistent altered consciousness. CONCLUSIONS: EEG monitoring is an important diagnostic tool for specific indications. Further data are necessary to understand its potential for ischemia assessment and coma prognostication.
Abstract:
Immunotherapy of melanoma aims to mobilize cytolytic CD8+ T cells, which play a central role in protective immunity. Despite numerous clinical vaccine trials, only a few patients exhibited strong antigen-specific T-cell activation, stressing the need to improve vaccine strategies. For rational development, we propose to focus on molecularly defined vaccine components and evaluate their immunogenicity with highly reproducible and standardized methods for ex vivo immune monitoring. Careful immunogenicity comparison of vaccine formulations in phase I/II studies allows the selection of optimized vaccines for subsequent clinical efficacy testing in large-scale phase III trials.