212 results for count models


Relevance: 20.00%

Abstract:

PURPOSE: Diabetic retinopathy (DR) is a leading cause of blindness, yet pertinent animal models are uncommon. The sand rat (Psammomys obesus), which exhibits diet-induced metabolic syndrome, might constitute a relevant model. METHODS: Adult P. obesus (n = 39) were maintained in captivity for 4 to 7 months and fed either vegetation-based diets (n = 13) or standard rat chow (n = 26). Although plant-fed animals exhibited uniform body weight and blood glucose levels over time, nearly 60% of rat chow-raised animals developed diabetes-like symptoms (test group). Animals were killed, and their eyes and vitreous were processed for immunochemistry. RESULTS: Compared with plant-fed animals, diabetic animals showed many abnormal vascular features, including vasodilation, tortuosity, and pericyte loss within the blood vessels; hyperproteinemia and elevated ratios of proangiogenic to antiangiogenic growth factors in the vitreous; and blood-retinal barrier breakdown. Furthermore, there were statistically significant decreases in retinal cell layer thicknesses and densities, accompanied by profound alterations in glia (downregulation of glutamine synthetase and glutamate-aspartate transporter, upregulation of glial fibrillary acidic protein) and many neurons (reduced expression of protein kinase Cα and Cξ in bipolar cells, axonal degeneration in ganglion cells). Cone photoreceptors were particularly affected, with reduced expression of short- and mid-/long-wavelength opsins. Nondiabetic animals fed the hypercaloric diet showed intermediate values. CONCLUSIONS: Simple dietary modulation of P. obesus induces a rapid and severe phenotype closely resembling human type 2 DR. This species presents a valuable novel experimental model for probing the neural (especially cone photoreceptor) pathogenic modifications that are difficult to study in humans and for screening therapeutic strategies.

Relevance: 20.00%

Abstract:

Background: Although CD4 cell count monitoring is used to decide when to start antiretroviral therapy in patients with HIV-1 infection, there are no evidence-based recommendations regarding its optimal frequency. It is common practice to monitor every 3 to 6 months, often coupled with viral load monitoring. We developed rules to guide the frequency of CD4 cell count monitoring in HIV infection before starting antiretroviral therapy, which we validated retrospectively in patients from the Swiss HIV Cohort Study. Methodology/Principal Findings: We built two prediction rules ("Snap-shot rule" for a single sample and "Track-shot rule" for multiple determinations) based on a systematic review of published longitudinal analyses of CD4 cell count trajectories. We applied the rules in 2608 untreated patients to classify their 18 061 CD4 counts as either justifiable or superfluous, according to their prior ≥5% or <5% chance of meeting predetermined thresholds for starting treatment. The percentage of measurements that both rules falsely deemed superfluous never exceeded 5%. Superfluous CD4 determinations represented 4%, 11%, and 39% of all actual determinations for treatment thresholds of 500, 350, and 200 × 10⁶/L, respectively. The Track-shot rule was only marginally superior to the Snap-shot rule. Both rules lose usefulness as CD4 counts approach the treatment threshold. Conclusions/Significance: Frequent CD4 count monitoring of patients with CD4 counts well above the threshold for initiating therapy is unlikely to identify patients who require therapy. It appears sufficient to measure the CD4 cell count 1 year after a count >650 for a threshold of 200, >900 for 350, or >1150 for 500 × 10⁶/L, respectively. When CD4 counts fall below these limits, increased monitoring frequency becomes advisable. These rules offer guidance for efficient CD4 monitoring, particularly in resource-limited settings.
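As an illustration only (not the published rule itself), the Snap-shot thresholds reported in this abstract can be sketched as a small lookup: a count above 650, 900, or 1150 × 10⁶/L (for treatment thresholds of 200, 350, and 500, respectively) justifies waiting a year before the next measurement, while counts below those limits call for the usual 3-to-6-month schedule. Function and interval values here are hypothetical glue around the abstract's numbers.

```python
# Snap-shot thresholds from the abstract: for a given treatment-initiation
# threshold (x10^6 cells/L), a current count above the limit justifies
# yearly monitoring; below it, frequent monitoring resumes.
SNAPSHOT_LIMITS = {200: 650, 350: 900, 500: 1150}

def next_cd4_check(current_count, treatment_threshold):
    """Suggested months until the next CD4 measurement (illustrative sketch)."""
    limit = SNAPSHOT_LIMITS[treatment_threshold]
    if current_count > limit:
        return 12   # well above threshold: a yearly check appears sufficient
    return 3        # nearing threshold: 3-6 month monitoring is advisable

print(next_cd4_check(700, 200))   # -> 12
print(next_cd4_check(600, 350))   # -> 3
```

The 3-month return value for counts below the limit simply encodes the "common practice" interval the abstract mentions; the study itself only validated when monitoring could safely be deferred.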

Relevance: 20.00%

Abstract:

BACKGROUND: Higher nighttime blood pressure (BP) and the loss of nocturnal dipping of BP are associated with an increased risk for cardiovascular events. However, the determinants of the loss of nocturnal BP dipping are only beginning to be understood. We investigated whether different indicators of physical activity were associated with the loss of nocturnal dipping of BP. METHODS: We conducted a cross-sectional study of 103 patients referred for 24-hour ambulatory monitoring of BP. We measured these patients' step count (SC), active energy expenditure (AEE), and total energy expenditure simultaneously, using actigraphs. RESULTS: In our study population of 103 patients, most of whom were hypertensive, SC and AEE were associated with nighttime systolic BP in univariate (SC, r = -0.28, P < 0.01; AEE, r = -0.20, P = 0.046) and multivariate linear regression analyses (SC, coefficient beta = -5.37, P < 0.001; AEE, coefficient beta = -0.24, P < 0.01). Step count was associated with both systolic (r = 0.23, P = 0.018) and diastolic (r = 0.20, P = 0.045) BP dipping. Nighttime systolic BP decreased progressively across the categories of sedentary, moderately active, and active participants (125 mm Hg, 116 mm Hg, and 112 mm Hg, respectively; P = 0.002). The degree of BP dipping increased progressively across the same three categories of activity (8.9%, 14.6%, and 18.6%, respectively, P = 0.002, for systolic BP; 12.8%, 18.1%, and 22.2%, respectively, P = 0.006, for diastolic BP). CONCLUSIONS: Step count is continuously associated with nighttime systolic BP and with the degree of BP dipping, independently of 24-hour mean BP. The combined use of an actigraph for measuring indicators of physical activity and a device for 24-hour measurement of ambulatory BP may help identify patients at increased risk for cardiovascular events in whom increased physical activity toward higher target levels may be recommended.

Relevance: 20.00%

Abstract:

Colon carcinoma multicellular spheroids were incubated in vitro with radiolabelled MAbs. The more rapid penetration of fragments as compared with intact MAbs was clearly demonstrated. For the study of antibody localization in tumors in vivo, the model of nude mice with ligated kidneys was used. Although very artificial, this model allowed us to demonstrate that, in the absence of urinary excretion, Fab fragments accumulated more rapidly in the tumor than intact MAbs and disappeared faster from the blood. This difference was less striking for F(ab')2 fragments. In the liver, a decreased accumulation of both types of fragments as compared with intact MAbs was observed. Concerning radioimmunotherapy, we think that Fab fragments are not useful because their half-life in the circulation and in the tumor is too short and because they will probably be too toxic for the kidneys. Intact MAbs and F(ab')2 fragments each have their advantages. Intact MAbs show the highest tumor accumulation in mice without ligated kidneys; however, they remain mostly at the periphery of tumor nodules, as shown by autoradiography. F(ab')2 fragments have been found to penetrate deeper into the tumor and to accumulate less in the liver. It might therefore be an advantage to combine intact MAbs with F(ab')2 fragments, so that two different regions of the tumor could be attacked, while in normal tissues toxicity would be distributed across different organs: to the liver with intact MAbs and to the kidneys with F(ab')2 fragments.

Relevance: 20.00%

Abstract:

Due to practical difficulties in obtaining direct genetic estimates of effective sizes, conservation biologists have to rely on so-called 'demographic models', which combine life-history and mating-system parameters with F-statistics in order to produce indirect estimates of effective sizes. However, for the same practical reasons that prevent direct genetic estimates, the accuracy of demographic models is difficult to evaluate. Here we use individual-based, genetically explicit computer simulations to investigate the accuracy of two such demographic models aimed at investigating the hierarchical structure of populations. We show that, by and large, these models provide good estimates under a wide range of mating systems and dispersal patterns. However, one of the models should be avoided whenever the focal species' breeding system approaches monogamy with no sex bias in dispersal, or when substructure within social groups is suspected, because effective sizes may then be strongly overestimated. The timing during the life cycle at which F-statistics are evaluated is also of crucial importance, and attention should be paid to it when designing field sampling, since different demographic models assume different timings. Our study shows that individual-based, genetically explicit models provide a promising way of evaluating the accuracy of demographic models of effective size and of delineating their field of applicability.

Relevance: 20.00%

Abstract:

PURPOSE: To develop a score predicting the risk of adverse events (AEs) in pediatric patients with cancer who experience fever and neutropenia (FN) and to evaluate its performance. PATIENTS AND METHODS: Pediatric patients with cancer presenting with FN induced by nonmyeloablative chemotherapy were observed in a prospective multicenter study. A score predicting the risk of future AEs (ie, serious medical complication, microbiologically defined infection, radiologically confirmed pneumonia) was developed from a multivariate mixed logistic regression model. Its cross-validated predictive performance was compared with that of published risk prediction rules. RESULTS: An AE was reported in 122 (29%) of 423 FN episodes. In 57 episodes (13%), the first AE was known only after reassessment after 8 to 24 hours of inpatient management. Predicting AEs at reassessment was better than prediction at presentation with FN. A differential leukocyte count did not increase the predictive performance. The score predicting future AEs in 358 episodes without known AE at reassessment used the following four variables: preceding chemotherapy more intensive than acute lymphoblastic leukemia maintenance (weight = 4), hemoglobin ≥90 g/L (weight = 5), leukocyte count less than 0.3 G/L (weight = 3), and platelet count less than 50 G/L (weight = 3). A score (sum of weights) ≥9 predicted future AEs. The cross-validated performance of this score exceeded the performance of published risk prediction rules. At an overall sensitivity of 92%, 35% of the episodes were classified as low risk, with a specificity of 45% and a negative predictive value of 93%. CONCLUSION: This score, based on four routinely accessible characteristics, accurately identifies pediatric patients with cancer and FN who are at risk for AEs after reassessment.
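The four-variable score described above is simple enough to sketch directly from the weights given in the abstract. The function names and argument layout below are hypothetical; only the weights (4, 5, 3, 3) and the ≥9 decision threshold come from the text.

```python
# Illustrative sketch of the reassessment score from the abstract:
# four routinely accessible variables, each contributing a fixed weight.
def fn_risk_score(intensive_chemo, hemoglobin_g_l, leukocytes_g_l, platelets_g_l):
    """Sum of weights for a pediatric FN episode at reassessment."""
    score = 0
    if intensive_chemo:            # chemo more intensive than ALL maintenance
        score += 4
    if hemoglobin_g_l >= 90:       # hemoglobin >= 90 g/L
        score += 5
    if leukocytes_g_l < 0.3:       # leukocyte count < 0.3 G/L
        score += 3
    if platelets_g_l < 50:         # platelet count < 50 G/L
        score += 3
    return score

def predicts_adverse_event(score):
    return score >= 9              # score >= 9 predicted future AEs

s = fn_risk_score(True, 95, 0.2, 40)   # 4 + 5 + 3 + 3 = 15
print(s, predicts_adverse_event(s))    # -> 15 True
```

Note that the score is a triage aid with a high negative predictive value (93% in the abstract), not a diagnostic test; episodes below the cutoff were classified as low risk, not risk-free.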

Relevance: 20.00%

Abstract:

OBJECTIVES: To investigate the prevalence of transmitted drug-resistant human immunodeficiency virus (TDR) and factors associated with TDR, and to compare virological and CD4 count responses to combination antiretroviral therapy. METHODS: In this study, 525 mostly chronically infected EuroSIDA patients were included who had genotypic resistance tests performed on plasma samples collected while antiretroviral therapy naive. TDR was defined as at least one resistance mutation from a list proposed for genotypic TDR surveillance. Multivariable logistic regression was used to analyze factors associated with detection of TDR and with virological (viral load <500 copies/mL) and CD4 count (≥50% increase) responses to combination antiretroviral therapy at months 6-12. RESULTS: The overall prevalence of TDR was 11.4%, which was stable over 1996-2004. There were no significant differences in virological suppression (those resistant to at least one drug prescribed versus susceptible), adjusted odds ratio: 0.68 (95% confidence interval: 0.27 to 1.71; P = 0.408), or CD4 count response, adjusted odds ratio: 1.65 (95% confidence interval: 0.73 to 3.73; P = 0.231). CONCLUSIONS: The prevalence of TDR in antiretroviral-naive patients was found to be in line with other European studies. No significant differences were found in virological and CD4 count responses after initiation of first-line combination antiretroviral therapy between resistant and susceptible patients, possibly due to the small number of patients with resistance and the consequently low power.

Relevance: 20.00%

Abstract:

BACKGROUND: The outcome of Kaposi sarcoma varies. While many patients do well on highly active antiretroviral therapy, others have progressive disease and need chemotherapy. In order to predict which patients are at risk of unfavorable evolution, we established a prognostic score. METHOD: A survival analysis (Kaplan-Meier method; Cox proportional hazards models) of 144 patients with Kaposi sarcoma prospectively included in the Swiss HIV Cohort Study from January 1996 to December 2004 was conducted. OUTCOME ANALYZED: use of chemotherapy or death. VARIABLES ANALYZED: demographics, tumor staging [T0 or T1 (16)], CD4 cell counts and HIV-1 RNA concentration, human herpesvirus 8 (HHV8) DNA in plasma, and serological titers to latent and lytic antigens. RESULTS: Of 144 patients, 54 needed chemotherapy or died. In the univariate analysis, tumor stage T1, CD4 cell count below 200 cells/μl, positive HHV8 DNA, and absence of antibodies against the HHV8 lytic antigen at the time of diagnosis were significantly associated with a bad outcome. Using multivariate analysis, the following variables were associated with an increased risk of unfavorable outcome: T1 [hazard ratio (HR) 5.22; 95% confidence interval (CI) 2.97-9.18], CD4 cell count below 200 cells/μl (HR 2.33; 95% CI 1.22-4.45), and positive HHV8 DNA (HR 2.14; 95% CI 1.79-2.85). We created a score with these variables ranging from 0 to 4: T1 stage counted for two points, CD4 cell count below 200 cells/μl for one point, and positive HHV8 viral load for one point. Each point increase was associated with an HR of 2.26 (95% CI 1.79-2.85). CONCLUSION: In the multivariate analysis, staging (T1), CD4 cell count (<200 cells/μl), and positive HHV8 DNA in plasma at the time of diagnosis predict evolution towards death or the need for chemotherapy.
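The 0-4 point score defined above can be sketched in a few lines; the function name is hypothetical, while the point values (T1 = 2, CD4 < 200 cells/μl = 1, detectable plasma HHV8 DNA = 1) and the per-point hazard ratio of 2.26 are taken from the abstract.

```python
# Illustrative sketch of the Kaposi sarcoma prognostic score (range 0-4).
def ks_prognostic_score(stage_t1, cd4_below_200, hhv8_dna_positive):
    """T1 stage counts two points; CD4 < 200 cells/ul and HHV8 DNA+ one each."""
    return 2 * bool(stage_t1) + bool(cd4_below_200) + bool(hhv8_dna_positive)

HR_PER_POINT = 2.26   # hazard ratio per one-point increase (95% CI 1.79-2.85)

score = ks_prognostic_score(stage_t1=True, cd4_below_200=True,
                            hhv8_dna_positive=False)
print(score)                   # -> 3
print(HR_PER_POINT ** score)   # relative hazard vs. score 0 (illustrative)
```

Raising the per-point hazard ratio to the power of the score, as in the last line, is only a rough way to visualize how risk compounds across points; the study reports the HR per point, not a validated cumulative-risk formula.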

Relevance: 20.00%

Abstract:

The detection of Parkinson's disease (PD) in its preclinical stages, prior to outright neurodegeneration, is essential to the development of neuroprotective therapies and could reduce the number of misdiagnosed patients. However, early diagnosis is currently hampered by a lack of reliable biomarkers. ¹H magnetic resonance spectroscopy (MRS) offers a noninvasive measure of brain metabolite levels that allows the identification of such potential biomarkers. This study aimed to use MRS on an ultrahigh-field 14.1 T magnet to explore the striatal metabolic changes occurring in two different rat models of the disease. Rats lesioned by the injection of 6-hydroxydopamine (6-OHDA) in the medial forebrain bundle were used to model a complete nigrostriatal lesion, while a genetic model based on the nigral injection of an adeno-associated viral (AAV) vector coding for human α-synuclein was used to model progressive neurodegeneration and dopaminergic neuron dysfunction, thereby replicating conditions closer to the early pathological stages of PD. MRS measurements in the striatum of the 6-OHDA rats revealed significant decreases in glutamate and N-acetyl-aspartate levels and a significant increase in the GABA level in the ipsilateral hemisphere compared with the contralateral one, while the αSyn-overexpressing rats showed a significant increase in the striatal GABA level only. We therefore conclude that MRS measurements of striatal GABA levels could allow for the detection of early nigrostriatal defects prior to outright neurodegeneration and, as such, offer great potential as a sensitive biomarker of presymptomatic PD.

Relevance: 20.00%

Abstract:

Cannabinoid receptor 1 (CB1 receptor) controls several neuronal functions, including neurotransmitter release, synaptic plasticity, gene expression, and neuronal viability. Downregulation of CB1 expression in the basal ganglia of patients with Huntington's disease (HD) and animal models represents one of the earliest molecular events induced by mutant huntingtin (mHtt). This early disruption of neuronal CB1 signaling is thought to contribute to HD symptoms and neurodegeneration. Here we determined whether CB1 downregulation measured in patients with HD and mouse models was ubiquitous or restricted to specific striatal neuronal subpopulations. Using unbiased semi-quantitative immunohistochemistry, we confirmed previous studies showing that CB1 expression is downregulated in medium spiny neurons of the indirect pathway, and found that CB1 is also downregulated in neuropeptide Y (NPY)/neuronal nitric oxide synthase (nNOS)-expressing interneurons while remaining unchanged in parvalbumin- and calretinin-expressing interneurons. CB1 downregulation in striatal NPY/nNOS-expressing interneurons occurs in R6/2 mice, HdhQ150/Q150 mice, and the caudate nucleus of patients with HD. In R6/2 mice, CB1 downregulation in NPY/nNOS-expressing interneurons correlates with diffuse expression of mHtt in the soma. This downregulation also occludes the ability of cannabinoid agonists to activate the pro-survival signaling molecule cAMP response element-binding protein in NPY/nNOS-expressing interneurons. Loss of CB1 signaling in NPY/nNOS-expressing interneurons could contribute to the impairment of basal ganglia functions linked to HD.

Relevance: 20.00%

Abstract:

Although dispersal is recognized as a key issue in several fields of population biology (such as behavioral ecology, population genetics, metapopulation dynamics or evolutionary modeling), these disciplines focus on different aspects of the concept and often make different implicit assumptions regarding migration models. Using simulations, we investigate how such assumptions translate into effective gene flow and fixation probability of selected alleles. Assumptions regarding migration type (e.g. source-sink, resident pre-emption, or balanced dispersal) and patterns (e.g. stepping-stone versus island dispersal) have large impacts when demes differ in sizes or selective pressures. The effects of fragmentation, as well as the spatial localization of newly arising mutations, also strongly depend on migration type and patterns. Migration rate also matters: depending on the migration type, fixation probabilities at an intermediate migration rate may lie outside the range defined by the low- and high-migration limits when demes differ in sizes. Given the extreme sensitivity of fixation probability to characteristics of dispersal, we underline the importance of making explicit (and documenting empirically) the crucial ecological/behavioral assumptions underlying migration models.

Relevance: 20.00%

Abstract:

In the last two decades, interest in species distribution models (SDMs) of plants and animals has grown dramatically. Recent advances in SDMs allow us to potentially forecast anthropogenic effects on patterns of biodiversity at different spatial scales. However, some limitations still preclude the use of SDMs in many theoretical and practical applications. Here, we provide an overview of recent advances in this field, discuss the ecological principles and assumptions underpinning SDMs, and highlight critical limitations and decisions inherent in the construction and evaluation of SDMs. Particular emphasis is given to the use of SDMs for the assessment of climate change impacts and conservation management issues. We suggest new avenues for incorporating species migration, population dynamics, biotic interactions and community ecology into SDMs at multiple spatial scales. Addressing all these issues requires a better integration of SDMs with ecological theory.