24 results for Cell Monitoring

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

70.00%

Publisher:

Abstract:

BACKGROUND HIV-1 viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not universally available. We examined monitoring of first-line ART and switching to second-line ART in sub-Saharan Africa, 2004-2013. METHODS Adult HIV-1-infected patients starting combination ART in 16 countries were included. Switching was defined as a change from a non-nucleoside reverse-transcriptase inhibitor (NNRTI)-based regimen to a protease inhibitor (PI)-based regimen, with a change of ≥1 NRTI. Virological and immunological failures were defined per World Health Organization criteria. We calculated cumulative probabilities of switching and hazard ratios with 95% confidence intervals (CI) comparing routine VL monitoring, targeted VL monitoring, CD4 cell monitoring and clinical monitoring, adjusted for programme and individual characteristics. FINDINGS Of 297,825 eligible patients, 10,352 (3.5%) switched during 782,412 person-years of follow-up. Compared with CD4 monitoring, hazard ratios for switching were 3.15 (95% CI 2.92-3.40) for routine VL monitoring, 1.21 (1.13-1.30) for targeted VL monitoring and 0.49 (0.43-0.56) for clinical monitoring. Overall, 58.0% of patients with confirmed virological failure and 19.3% of patients with confirmed immunological failure switched within 2 years. Among patients who switched, the percentage with evidence of treatment failure based on a single CD4 or VL measurement ranged from 32.1% with clinical monitoring to 84.3% with targeted VL monitoring. Median CD4 counts at switching were 215 cells/µl under routine VL monitoring but lower under the other monitoring strategies (114-133 cells/µl). INTERPRETATION Overall, few patients switched to second-line ART, and in the absence of routine viral load monitoring switching occurred late. Switching was more common and occurred earlier with targeted or routine viral load testing.
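
The adjusted hazard ratios for switching reported above can be illustrated with a standard Cox proportional hazards model in which the monitoring strategy enters as a categorical covariate. The sketch below is a minimal, hypothetical illustration using the Python lifelines package; the input file and column names (monitoring, followup_years, switched, age_at_art_start, sex_male) are assumptions, not the study's actual variables or analysis code.

    # Minimal sketch: adjusted hazard ratios for switching by monitoring strategy.
    # Assumes a patient-level DataFrame with hypothetical column names.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("patients.csv")  # hypothetical patient-level data set

    # One-hot encode the monitoring strategy; CD4 monitoring is the reference group.
    monitoring = pd.get_dummies(df["monitoring"], prefix="mon").astype(float)
    monitoring = monitoring.drop(columns=["mon_CD4"])

    model_data = pd.concat(
        [df[["followup_years", "switched", "age_at_art_start", "sex_male"]], monitoring],
        axis=1,
    )

    cph = CoxPHFitter()
    cph.fit(model_data, duration_col="followup_years", event_col="switched")
    cph.print_summary()  # exp(coef) column: hazard ratios for switching with 95% CIs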

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: The flower gene has been previously linked to the elimination of slowly dividing epithelial cells during development in a process known as "cell competition." During cell competition, different isoforms of the Flower protein are displayed at the cell membrane and reveal the reduced fitness of slowly proliferating cells, which are therefore recognized, eliminated, and replaced by their normally dividing neighbors. This mechanism acts as a "cell quality" control in proliferating tissues. RESULTS: Here, we use the Drosophila eye as a model to study how unwanted neurons are culled during retina development and find that flower is required and sufficient for the recognition and elimination of supernumerary postmitotic neurons contained within incomplete ommatidial units. This constitutes the first description of the "Flower Code" functioning as a cell selection mechanism in postmitotic cells and the first report of a physiological role for this cell quality control machinery. CONCLUSIONS: Our results show that the "Flower Code" is a general system to reveal cell fitness and that it may play similar roles in creating optimal neural networks in higher organisms. The Flower Code seems to be a more general mechanism for cell monitoring and selection than previously recognized.

Relevance:

40.00%

Publisher:

Abstract:

Background Although CD4 cell count monitoring is used to decide when to start antiretroviral therapy in patients with HIV-1 infection, there are no evidence-based recommendations regarding its optimal frequency. It is common practice to monitor every 3 to 6 months, often coupled with viral load monitoring. We developed rules to guide the frequency of CD4 cell count monitoring in HIV infection before starting antiretroviral therapy, and validated them retrospectively in patients from the Swiss HIV Cohort Study. Methodology/Principal Findings We built two prediction rules ("Snap-shot rule" for a single sample and "Track-shot rule" for multiple determinations) based on a systematic review of published longitudinal analyses of CD4 cell count trajectories. We applied the rules in 2608 untreated patients to classify their 18,061 CD4 counts as either justifiable or superfluous, according to their prior ≥5% or <5% chance of meeting predetermined thresholds for starting treatment. The percentage of measurements that both rules falsely deemed superfluous never exceeded 5%. Superfluous CD4 determinations represented 4%, 11%, and 39% of all actual determinations for treatment thresholds of 500, 350, and 200×10⁶/L, respectively. The Track-shot rule was only marginally superior to the Snap-shot rule. Both rules lose usefulness as CD4 counts approach the treatment threshold. Conclusions/Significance Frequent CD4 count monitoring of patients with CD4 counts well above the threshold for initiating therapy is unlikely to identify patients who require therapy. It appears sufficient to measure the CD4 cell count 1 year after a count >650 for a threshold of 200, >900 for 350, or >1150 for 500×10⁶/L, respectively. When CD4 counts fall below these limits, increased monitoring frequency becomes advisable. These rules offer guidance for efficient CD4 monitoring, particularly in resource-limited settings.
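
The conclusion of the abstract translates directly into a simple decision function: given the treatment threshold in use and the most recent CD4 count, it indicates whether the next measurement can safely wait about a year. The sketch below is an illustrative reading of the thresholds quoted above (counts in ×10⁶ cells/L), not the authors' published algorithm; the function name and structure are assumptions.

    # Illustrative sketch of the "Snap-shot"-style rule quoted in the conclusions:
    # a repeat CD4 measurement in ~1 year appears sufficient when the current count
    # is comfortably above the treatment threshold. Counts are in x10^6 cells/L.
    SNAPSHOT_LIMITS = {200: 650, 350: 900, 500: 1150}  # threshold -> "safe" CD4 count

    def next_cd4_in_one_year(current_cd4: int, treatment_threshold: int) -> bool:
        """Return True if, per the quoted rule of thumb, the next CD4 measurement
        can wait about a year; False if closer monitoring is advisable."""
        return current_cd4 > SNAPSHOT_LIMITS[treatment_threshold]

    # Example: with a 350-cell treatment threshold, a count of 720 falls below
    # the 900-cell limit, so closer monitoring is advisable; 980 does not.
    print(next_cd4_in_one_year(720, 350))  # False
    print(next_cd4_in_one_year(980, 350))  # True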

Relevance:

40.00%

Publisher:

Abstract:

Worldwide, 700,000 infants are infected annually with HIV-1, most of them in resource-limited settings. Care for these children requires simple, inexpensive tests. We evaluated HIV-1 p24 antigen for antiretroviral treatment (ART) monitoring in children. p24 (by boosted enzyme-linked immunosorbent assay of heated plasma) and HIV-1 RNA were measured prospectively in 24 HIV-1-infected children receiving ART. p24 and HIV-1 RNA concentrations, and their changes between consecutive visits, were related to the respective CD4+ changes. Median age at study entry was 7.6 years and median follow-up 47.2 months, yielding a median of 18 visits at a median interval of 2.8 months. There were 399 complete visit data sets and 375 interval data sets. Controlling for variation between individuals, there was a positive relationship between concentrations of HIV-1 RNA and p24 (P < 0.0001). While controlling for initial CD4+ count, age, sex, days since start of ART, and days between visits, the relative change in CD4+ count between two successive visits was negatively related to the corresponding relative change in HIV-1 RNA (P = 0.009), but not to the initial HIV-1 RNA concentration (P = 0.94). Similarly, we found a negative relationship with the relative change in p24 over the interval (P < 0.0001), whereas the initial p24 concentration showed only a trend (P = 0.08). Statistical support for the p24 model and the HIV-1 RNA model was similar. p24 may be an accurate low-cost alternative for monitoring ART in pediatric HIV-1 infection.
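
Relationships of this kind (a within-patient association between marker changes while "controlling for variation between individuals") are typically estimated with a mixed-effects model that includes a random effect per child. The sketch below shows one way such a model could be specified in Python with statsmodels; the data file, column names, and exact covariate list are hypothetical, taken loosely from the abstract rather than from the authors' analysis.

    # Minimal sketch: within-patient association between the relative change in
    # CD4+ count and the relative change in p24 between consecutive visits,
    # with a random intercept per child. Column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    visits = pd.read_csv("interval_data.csv")  # hypothetical per-interval data set

    model = smf.mixedlm(
        "cd4_rel_change ~ p24_rel_change + baseline_cd4 + age + sex "
        "+ days_on_art + days_between_visits",
        data=visits,
        groups=visits["child_id"],   # random intercept per child
    )
    result = model.fit()
    print(result.summary())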

Relevance:

40.00%

Publisher:

Abstract:

Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, together with their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point of no return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.

Relevance:

30.00%

Publisher:

Abstract:

Large numbers of functionally competent T cells are required to protect against diseases for which antibody-based vaccines have consistently failed (1), as is the case for many chronic viral infections and solid tumors. Therefore, therapeutic vaccines aim at the induction of strong antigen-specific T-cell responses. Novel adjuvants have considerably improved the capacity of synthetic vaccines to activate T cells, but more research is necessary to identify optimal compositions of potent vaccine formulations. Consequently, there is a great need to develop accurate methods for the efficient identification of antigen-specific T cells and the assessment of their functional characteristics directly ex vivo. In this regard, hundreds of clinical vaccination trials have been implemented during the last 15 years, and monitoring techniques have become increasingly standardized.

Relevance:

30.00%

Publisher:

Abstract:

In low-income settings, treatment failure is often identified using CD4 cell count monitoring. Consequently, patients remain on a failing regimen, resulting in a higher risk of transmission. We investigated the benefit of routine viral load monitoring for reducing HIV transmission.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To compare outcomes of antiretroviral therapy (ART) in South Africa, where viral load monitoring is routine, with those in Malawi and Zambia, where monitoring is based on CD4 cell counts. Methods: We included 18,706 adult patients starting ART in South Africa and 80,937 patients in Zambia or Malawi. We examined CD4 responses in models for repeated measures, and the probability of switching to second-line regimens, mortality and loss to follow-up in multistate models, measuring time from 6 months after the start of ART. Results: In South Africa, 9.8% [95% confidence interval (CI) 9.1–10.5] had switched at 3 years, 1.3% (95% CI 0.9–1.6) remained on failing first-line regimens, 9.2% (95% CI 8.5–9.8) were lost to follow-up and 4.3% (95% CI 3.9–4.8) had died. In Malawi and Zambia, more patients were on a failing first-line regimen [3.7% (95% CI 3.6–3.9)], fewer patients had switched [2.1% (95% CI 2.0–2.3)] and more patients were lost to follow-up [15.3% (95% CI 15.0–15.6)] or had died [6.3% (95% CI 6.0–6.5)]. Median CD4 cell counts were lower in South Africa at the start of ART (93 vs. 132 cells/μl; P < 0.001) but higher after 3 years (425 vs. 383 cells/μl; P < 0.001). The hazard ratio comparing South Africa with Malawi and Zambia, after adjusting for age, sex, first-line regimen and CD4 cell count, was 0.58 (0.50–0.66) for death and 0.53 (0.48–0.58) for loss to follow-up. Conclusion: Over 3 years of ART, mortality was lower in South Africa than in Malawi or Zambia. The more favourable outcome in South Africa might be explained by viral load monitoring leading to earlier detection of treatment failure, adherence counselling and timelier switching to second-line ART.
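
The three-year percentages for switching, death and loss to follow-up come from multistate modelling, in which patients move between mutually exclusive states. A closely related quantity, the cumulative incidence of switching with death and loss to follow-up treated as competing events, can be estimated with the Aalen-Johansen estimator; the sketch below uses the Python lifelines implementation with hypothetical event codes and column names, and is not the study's actual multistate model.

    # Minimal sketch: cumulative incidence of switching to second-line ART,
    # treating death and loss to follow-up as competing events (a simplification
    # of the paper's multistate model). Hypothetical event coding:
    # 0 = censored/still on first line, 1 = switched, 2 = died, 3 = lost to follow-up.
    import pandas as pd
    from lifelines import AalenJohansenFitter

    df = pd.read_csv("art_outcomes.csv")  # hypothetical patient-level data

    ajf = AalenJohansenFitter()
    ajf.fit(df["years_from_month6"], df["event"], event_of_interest=1)

    # Estimated probability of having switched by 3 years after the 6-month landmark.
    print(ajf.cumulative_density_.loc[:3.0].tail(1))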

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: We wished to investigate the toxicity of four immunosuppressant and antimetabolic drugs, which are known to influence postoperative wound healing, on three different human ocular cell lines. METHODS: Acute toxicity to cyclosporin A, azathioprine, mitomycin C and daunorubicin was assessed in Chang cells by monitoring their uptake of propidium iodide during a 3-h period. Chronic toxicity was assessed by monitoring the proliferation and viability of subconfluent cultures of Chang cells, human corneal endothelial cells (HCECs) and retinal pigmented epithelial (RPE) cells after continuous exposure to the drugs for 7 days. RESULTS: Acute toxicity testing revealed no obvious effects. However, the chronic toxicity tests disclosed a narrow concentration range over which cell proliferation decreased dramatically but calcein metabolism was sustained. Although the three lines reacted similarly to each agent, HCECs were the most vulnerable to daunorubicin and mitomycin. At a daunorubicin concentration of 0.05 µg/ml, a 75% decrease in calcein metabolism (P < 0.001) and a ≥95% cell loss (P < 0.001) were observed. At a mitomycin concentration of 0.01 µg/ml, cell density decreased by 61% (P < 0.001) without a change in calcein metabolism, but at 0.1 µg/ml the latter parameter decreased to 12% (P = 0.00014). At this concentration the proliferation of Chang and RPE cells decreased by more than 50%, whilst calcein metabolism was largely sustained. Cyclosporin inhibited cell proliferation moderately at lower concentrations (<5 µg/ml; P = 0.05) and substantially at higher ones, with a corresponding decline in calcein metabolism. Azathioprine induced a profound decrease in both parameters at concentrations above 5 µg/ml. CONCLUSION: Daunorubicin, cyclosporin and azathioprine could be used to inhibit excessive intraocular scarring after glaucoma and vitreoretinal surgery without overly reducing cell viability. The attributes of immunosuppressants lie in their combined antiproliferative and immunomodulatory effects.

Relevance:

30.00%

Publisher:

Abstract:

Multiparameter cerebral monitoring has been widely applied in traumatic brain injury to study posttraumatic pathophysiology and to manage head-injured patients (e.g., combining O₂ and pH sensors with cerebral microdialysis). Because a comprehensive approach towards understanding injury processes will also require functional measures, we have added electrophysiology to these monitoring modalities by attaching a recording electrode to the microdialysis probe. These dual-function (microdialysis/electrophysiology) probes were placed in rats following experimental fluid percussion brain injuries, and in a series of severely head-injured human patients. Electrical activity (cell firing, EEG) was monitored concurrently with microdialysis sampling of extracellular glutamate, glucose and lactate. Electrophysiological parameters (firing rate, serial correlation, field potential occurrences) were analyzed offline and compared to dialysate concentrations. In rats, these probes demonstrated an injury-induced suppression of neuronal firing (from a control level of 2.87 to 0.41 spikes/sec postinjury), which was associated with increases in extracellular glutamate and lactate, and decreases in glucose levels. When placed in human patients, the probes detected sparse and slowly firing cells (mean = 0.21 spikes/sec), with most units (70%) exhibiting a lack of serial correlation in the spike train. In some patients, spontaneous field potentials were observed, suggesting synchronously firing neuronal populations. In both the experimental and clinical applications, the addition of the recording electrode did not appreciably affect the performance of the microdialysis probe. The results suggest that this technique provides a functional monitoring capability which cannot be obtained when electrophysiology is measured with surface or epidural EEG alone.
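
The electrophysiological parameters mentioned (firing rate and serial correlation of the spike train) are straightforward to compute offline from recorded spike times. The sketch below is a generic numpy illustration of those two quantities, not the analysis code used in the study; the spike-time array is hypothetical.

    # Minimal sketch: offline firing rate and lag-1 serial correlation of the
    # inter-spike intervals (ISIs) from one unit's recorded spike times (seconds).
    import numpy as np

    spike_times = np.array([0.12, 0.58, 1.31, 1.90, 2.77, 3.05, 4.40])  # hypothetical

    duration = spike_times[-1] - spike_times[0]
    firing_rate = (len(spike_times) - 1) / duration          # spikes per second

    isi = np.diff(spike_times)                                # inter-spike intervals
    serial_corr = np.corrcoef(isi[:-1], isi[1:])[0, 1]        # lag-1 serial correlation

    print(f"firing rate: {firing_rate:.2f} spikes/sec")
    print(f"lag-1 ISI serial correlation: {serial_corr:.2f}")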

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: The incidence of bloodstream infection (BSI) in extracorporeal life support (ECLS) is reported to be between 0.9 and 19.5%. In January 2006, the Extracorporeal Life Support Organization (ELSO) reported an overall incidence of 8.78%, distributed as follows: respiratory: 6.5% (neonatal), 20.8% (pediatric); cardiac: 8.2% (neonatal) and 12.6% (pediatric). METHOD: At BC Children's Hospital (BCCH), daily surveillance blood cultures (BC) are performed and antibiotic prophylaxis is not routinely recommended. Positive BC (BC+) were reviewed, including resistance profiles, collection time of BC+, time to positivity and mortality. White blood cell count, absolute neutrophil count, immature/total ratio, platelet count, fibrinogen and lactate were analyzed 48, 24 and 0 h prior to BSI. A univariate linear regression analysis was performed. RESULTS: From 1999 to 2005, 89 patients underwent ECLS. After exclusions, 84 patients were reviewed. The attack rate was 22.6% (19 BSI), or 13.1% after exclusion of coagulase-negative staphylococci (n = 8). BSI patients were significantly longer on ECLS (157 h) compared to the no-BSI group (127 h, 95% CI: 106-148). Six BSI patients died on ECLS (35%; 4 congenital diaphragmatic hernias, 1 hypoplastic left heart syndrome and 1 after a tetralogy repair). BCCH survival was 71% on ECLS and 58% at discharge, which is comparable to previous reports. No patient died primarily because of BSI. No BSI predictor was identified, although lactate may show a decreasing trend before BSI (P = 0.102). CONCLUSION: Compared with the ELSO data, the studied BSI incidence was higher, with comparable mortality. We speculate that our BSI rate is explained by underreporting of "contaminants" in the literature, the use of broad-spectrum antibiotic prophylaxis and a higher yield with daily monitoring BC. We support daily surveillance blood cultures as an alternative to antibiotic prophylaxis in the management of patients on ECLS.
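
The univariate screen for BSI predictors (each laboratory marker at 48, 24 and 0 h before the positive culture) can be illustrated with an ordinary least-squares regression of the marker against time. The sketch below uses scipy's linregress on hypothetical lactate values for a single patient; it is a generic illustration, not the study's statistical code.

    # Minimal sketch: does lactate trend downward over the 48 h before a
    # bloodstream infection? Ordinary least-squares fit of marker vs. time.
    # The values below are hypothetical.
    from scipy.stats import linregress

    hours_before_bsi = [-48, -24, 0]
    lactate_mmol_l = [2.4, 1.9, 1.6]   # hypothetical measurements

    fit = linregress(hours_before_bsi, lactate_mmol_l)
    print(f"slope = {fit.slope:.3f} mmol/L per hour, p = {fit.pvalue:.3f}")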

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: To describe temporal trends in baseline clinical characteristics, initial treatment regimens and monitoring of patients starting antiretroviral therapy (ART) in resource-limited settings. METHODS: We analysed data from 17 ART programmes in 12 countries in sub-Saharan Africa, South America and Asia. Patients aged 16 years or older with a documented date of starting highly active ART (HAART) were included. Data were analysed by calculating medians, interquartile ranges (IQR) and percentages by region and time period. Not all centres provided data for 2006, so 2005 and 2006 were combined. RESULTS: A total of 36,715 patients who started ART in 1996-2006 were included in the analysis. Patient numbers increased substantially in sub-Saharan Africa and Asia, and the number of initial regimens declined to four and five, respectively, in 2005-2006. In South America, 20 regimens were used in 2005-2006. A combination of 3TC/D4T/NVP was used for 56% of African patients and 42% of Asian patients; AZT/3TC/EFV was used in 33% of patients in South America. The median baseline CD4 count increased in recent years, to 122 cells/µl (IQR 53-194) in 2005-2006 in Africa, 134 cells/µl (IQR 72-191) in Asia, and 197 cells/µl (IQR 61-277) in South America, but 77%, 78% and 51%, respectively, started with <200 cells/µl in 2005-2006. In all regions baseline CD4 cell counts were higher in women than in men: the differences were 22 cells/µl in Africa, 65 cells/µl in Asia and 10 cells/µl in South America. In 2005-2006 a viral load at 6 months was available for 21% of patients in Africa, 8% of patients in Asia and 73% of patients in South America. The corresponding figures for 6-month CD4 cell counts were 74%, 77% and 81%. CONCLUSIONS: The public health approach to providing ART proposed by the World Health Organization has been implemented in sub-Saharan Africa and Asia. Although CD4 cell counts at the start of ART have increased in recent years, most patients continue to start with counts well below the recommended threshold. Particular attention should be paid to more timely initiation of ART in HIV-infected men.
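
The descriptive analysis here (medians and interquartile ranges of baseline CD4 count by region and calendar period, and the percentage starting below 200 cells/µl) is a routine grouped-summary computation. The sketch below shows one way to produce such a table with pandas; the input file and column names are hypothetical.

    # Minimal sketch: median and IQR of baseline CD4 count by region and period,
    # plus the percentage starting ART below 200 cells/ul. Column names hypothetical.
    import pandas as pd

    df = pd.read_csv("art_baseline.csv")  # columns: region, period, baseline_cd4

    summary = df.groupby(["region", "period"])["baseline_cd4"].agg(
        median="median",
        q1=lambda s: s.quantile(0.25),
        q3=lambda s: s.quantile(0.75),
        pct_below_200=lambda s: 100 * (s < 200).mean(),
    )
    print(summary.round(1))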

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: In high-income countries, viral load is routinely measured to detect failure of antiretroviral therapy (ART) and guide switching to second-line ART. Viral load monitoring is not generally available in resource-limited settings. We examined switching from nonnucleoside reverse transcriptase inhibitor (NNRTI)-based first-line regimens to protease inhibitor-based regimens in Africa, South America and Asia. DESIGN AND METHODS: Multicohort study of 17 ART programmes. All sites monitored CD4 cell count and had access to second-line ART, and 10 sites monitored viral load. We compared times to switching and CD4 cell counts at switching, and obtained adjusted hazard ratios for switching (aHRs) with 95% confidence intervals (CIs) from random-effects Weibull models. RESULTS: A total of 20,113 patients, including 6369 (31.7%) patients from the 10 programmes with access to viral load monitoring, were analysed; 576 patients (2.9%) switched. Low CD4 cell counts at ART initiation were associated with switching in all programmes. Median time to switching was 16.3 months [interquartile range (IQR) 10.1-26.6] in programmes with viral load monitoring and 21.8 months (IQR 14.0-21.8) in programmes without viral load monitoring (P < 0.001). Median CD4 cell counts at switching were 161 cells/µl (IQR 77-265) in programmes with viral load monitoring and 102 cells/µl (IQR 44-181) in programmes without viral load monitoring (P < 0.001). Switching was more common in programmes with viral load monitoring during months 7-18 after starting ART (aHR 1.38; 95% CI 0.97-1.98), similar during months 19-30 (aHR 0.97; 95% CI 0.58-1.60) and less common during months 31-42 (aHR 0.29; 95% CI 0.11-0.79). CONCLUSION: In resource-limited settings, switching to second-line regimens tends to occur earlier and at higher CD4 cell counts in ART programmes with viral load monitoring than in programmes without viral load monitoring.
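
Time-to-switching here was modelled with random-effects Weibull models. As a simplified illustration only, the sketch below fits a plain Weibull accelerated-failure-time model with the Python lifelines package, with no programme-level random effect and with hypothetical column names; it is not the study's model.

    # Minimal sketch: Weibull time-to-switching model (simplified: no random
    # effect for ART programme, unlike the study). Column names are hypothetical.
    import pandas as pd
    from lifelines import WeibullAFTFitter

    df = pd.read_csv("switching.csv")
    covariates = df[["months_to_switch_or_censor", "switched",
                     "vl_monitoring", "baseline_cd4", "age", "sex_male"]]

    aft = WeibullAFTFitter()
    aft.fit(covariates,
            duration_col="months_to_switch_or_censor",
            event_col="switched")
    aft.print_summary()  # the vl_monitoring coefficient reflects earlier/later switching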

Relevance:

30.00%

Publisher:

Abstract:

Inhibition of ErbB2 (HER2) with monoclonal antibodies, an effective therapy in some forms of breast cancer, is associated with cardiotoxicity, the pathophysiology of which is poorly understood. Recent data suggest that dual inhibition of ErbB1 (EGFR) and ErbB2 signaling is more efficient in cancer therapy; however, the cardiac safety of this therapeutic approach is unknown. We therefore tested an ErbB1 (CGP059326) and an ErbB1/ErbB2 (PKI166) tyrosine kinase inhibitor in an in-vitro system of adult rat ventricular cardiomyocytes and assessed their effects on (1) cell viability, (2) myofibrillar structure, (3) contractile function, and (4) MAPK and Akt signaling, alone or in combination with Doxorubicin. Neither CGP nor PKI induced cardiomyocyte necrosis or apoptosis. PKI, but not CGP, caused myofibrillar structural damage that was additive to that induced by Doxorubicin at clinically relevant doses. These changes were associated with an inhibition of excitation-contraction coupling. PKI, but not CGP, decreased p-Erk1/2, suggesting a role for this MAP-kinase signaling pathway in the maintenance of myofibrils. These data indicate that the ErbB2 signaling pathway is critical for the maintenance of myofibrillar structure and function. Clinical studies using ErbB2-targeted inhibitors for the treatment of cancer should be designed to include careful monitoring for cardiac dysfunction.

Relevance:

30.00%

Publisher:

Abstract:

In this single-center, cross-sectional study, we evaluated 44 very long-term survivors with a median follow-up of 17.5 years (range, 11-26 years) after hematopoietic stem cell transplantation. We assessed the telomere length difference in human leukocyte antigen-identical donor and recipient sibling pairs and searched for its relationship with clinical factors. Telomere length (in kb, mean ± SD) was significantly shorter in all recipient blood cell subsets compared with the corresponding donor cells (P < .01): granulocytes (6.5 ± 0.9 vs 7.1 ± 0.9), naive and memory T cells (5.7 ± 1.2 vs 6.6 ± 1.2 and 5.2 ± 1.0 vs 5.7 ± 0.9, respectively), B cells (7.1 ± 1.1 vs 7.8 ± 1.1), and natural killer/natural killer T cells (4.8 ± 1.0 vs 5.6 ± 1.3). Chronic graft-versus-host disease (P < .04) and a female donor (P < .04) were associated with a greater difference in telomere length between donor and recipient. Critically short telomeres have been described in degenerative diseases and secondary malignancies. If the hypothesis that accelerated telomere shortening contributes to such complications can be confirmed, identification of recipients at risk for cellular senescence could become part of the monitoring of long-term survivors after hematopoietic stem cell transplantation.
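
The donor-recipient comparison is a paired design (each recipient is matched to an HLA-identical sibling donor), so the significance tests reported correspond to paired-sample comparisons of telomere length per cell subset. The sketch below is a generic illustration with scipy using hypothetical values, not the study's analysis.

    # Minimal sketch: paired comparison of telomere length (kb) in granulocytes,
    # recipient vs. matched sibling donor. Values are hypothetical.
    import numpy as np
    from scipy.stats import ttest_rel, wilcoxon

    recipient_kb = np.array([6.1, 6.8, 5.9, 7.0, 6.3, 6.6])
    donor_kb     = np.array([6.9, 7.3, 6.5, 7.6, 7.0, 7.1])

    t_stat, p_t = ttest_rel(recipient_kb, donor_kb)      # parametric paired test
    w_stat, p_w = wilcoxon(recipient_kb, donor_kb)       # non-parametric alternative

    print(f"mean difference = {np.mean(recipient_kb - donor_kb):.2f} kb")
    print(f"paired t-test p = {p_t:.3f}; Wilcoxon p = {p_w:.3f}")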