870 results for Maturity (Individuals)
Abstract:
Learned irrelevance (LIrr) refers to a form of selective learning that develops as a result of prior noncorrelated exposures of the predicted and predictor stimuli. In learning situations that depend on the associative link between the predicted and predictor stimuli, LIrr is expressed as a retardation of learning. It represents a form of modulation of learning by selective attention. Given the relevance of selective attention impairment to both positive and cognitive schizophrenia symptoms, the question remains whether LIrr impairment represents a state (relating to symptom manifestation) or trait (relating to schizophrenia endophenotypes) marker of human psychosis. We examined this by evaluating the expression of LIrr in an associative learning paradigm in (1) asymptomatic first-degree relatives of schizophrenia patients (SZ-relatives) and in (2) individuals exhibiting prodromal signs of psychosis ("ultrahigh risk" [UHR] patients) in each case relative to demographically matched healthy control subjects. There was no evidence for aberrant LIrr in SZ-relatives, but LIrr as well as associative learning were attenuated in UHR patients. It is concluded that LIrr deficiency in conjunction with a learning impairment might be a useful state marker predictive of psychotic state but a relatively weak link to a potential schizophrenia endophenotype.
Abstract:
Background and Aims: Data on the influence of calibration on the accuracy of continuous glucose monitoring (CGM) are scarce. The aim of the present study was to investigate whether the time point of calibration influences sensor accuracy and whether this effect differs according to glycemic level. Subjects and Methods: Two CGM sensors were inserted simultaneously into the abdomen, one on either side, of 20 individuals with type 1 diabetes. One sensor was calibrated predominantly using preprandial glucose (calibration(PRE)); the other was calibrated predominantly using postprandial glucose (calibration(POST)). At least three additional glucose values per day were obtained for the analysis of accuracy. Sensor readings were divided into four categories according to the glycemic range of the reference values (low, ≤4 mmol/L; euglycemic, 4.1-7 mmol/L; hyperglycemic I, 7.1-14 mmol/L; and hyperglycemic II, >14 mmol/L). Results: The overall mean±SEM absolute relative difference (MARD) between capillary reference values and sensor readings was 18.3±0.8% for calibration(PRE) and 21.9±1.2% for calibration(POST) (P<0.001). MARD according to glycemic range was 47.4±6.5% (low), 17.4±1.3% (euglycemic), 15.0±0.8% (hyperglycemic I), and 17.7±1.9% (hyperglycemic II) for calibration(PRE) and 67.5±9.5% (low), 24.2±1.8% (euglycemic), 15.5±0.9% (hyperglycemic I), and 15.3±1.9% (hyperglycemic II) for calibration(POST). In the low and euglycemic ranges, MARD was significantly lower for calibration(PRE) than for calibration(POST) (P=0.007 and P<0.001, respectively). Conclusions: Sensor calibration based predominantly on preprandial glucose resulted in significantly higher overall sensor accuracy than predominantly postprandial calibration. The difference was most pronounced in the hypo- and euglycemic reference ranges, whereas the two calibration patterns were comparable in the hyperglycemic range.
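The accuracy metric and glycemic categories used in this study are simple to compute. As an illustration, here is a minimal Python sketch (function and variable names are my own, not from the paper) that calculates MARD from paired reference/sensor readings and assigns a reference value to the study's four glycemic ranges:

```python
def mard(reference, sensor):
    """Mean absolute relative difference (%) between paired readings."""
    if len(reference) != len(sensor) or not reference:
        raise ValueError("need equal-length, non-empty series")
    # Relative difference of each sensor reading vs. its reference value
    diffs = [abs(s - r) / r * 100.0 for r, s in zip(reference, sensor)]
    return sum(diffs) / len(diffs)

def glycemic_range(value_mmol_l):
    """Categorise a reference value using the study's cut-offs (mmol/L)."""
    if value_mmol_l <= 4.0:
        return "low"
    if value_mmol_l <= 7.0:
        return "euglycemic"
    if value_mmol_l <= 14.0:
        return "hyperglycemic I"
    return "hyperglycemic II"
```

For example, paired reference/sensor readings of 5.0 vs. 5.5 mmol/L and 10.0 vs. 9.0 mmol/L give a MARD of 10%.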
Abstract:
A high prevalence of gastroesophageal reflux disease (GERD) has been observed in individuals with cerebral palsy (CP), and GERD is one of the main risk factors for dental erosion. This study aimed to evaluate, in a group of 46 non-institutionalized individuals with CP aged 3 to 13 years, the presence of GERD, variables related to dental erosion and associated with GERD (dietary consumption, gastrointestinal symptoms, bruxism), and salivary flow rate.
Abstract:
Objective. To investigate the relationship between coping and atherothrombotic biomarkers of an increased cardiovascular disease (CVD) risk in the elderly. Methods. We studied 136 elderly caregiving and noncaregiving men and women who completed the Ways of Coping Checklist to assess problem-focused coping, seeking social support (SSS), blamed self, wishful thinking, and avoidance coping. They had circulating levels of 12 biomarkers measured. We also probed for potential mediator and moderator variables (chronic stress, affect, health behavior, autonomic activity) for the relation between coping and biomarkers. Results. After controlling for demographic and CVD risk factors, greater use of SSS was associated with elevated levels of serum amyloid A (P = 0.001), C-reactive protein (CRP) (P = 0.002), vascular cellular adhesion molecule (VCAM)-1 (P = 0.021), and D-dimer (P = 0.032). There were several moderator effects. For instance, greater use of SSS was associated with elevated VCAM-1 (P < 0.001) and CRP (P = 0.001) levels in subjects with low levels of perceived social support and positive affect, respectively. The other coping styles were not significantly associated with any biomarker. Conclusions. Greater use of SSS might compromise cardiovascular health through atherothrombotic mechanisms, including elevated inflammation (i.e., serum amyloid A, CRP, VCAM-1) and coagulation (i.e., D-dimer) activity. Moderating variables need to be considered in this relationship.
Abstract:
BACKGROUND Current guidelines give recommendations for preferred combination antiretroviral therapy (cART). We investigated factors influencing the choice of initial cART in clinical practice and its outcome. METHODS We analyzed treatment-naive adults with human immunodeficiency virus (HIV) infection participating in the Swiss HIV Cohort Study and starting cART from January 1, 2005, through December 31, 2009. The primary end point was the choice of the initial antiretroviral regimen. Secondary end points were virologic suppression, the increase in CD4 cell counts from baseline, and treatment modification within 12 months after starting treatment. RESULTS A total of 1957 patients were analyzed. Tenofovir-emtricitabine (TDF-FTC)-efavirenz was the most frequently prescribed cART (29.9%), followed by TDF-FTC-lopinavir/r (16.9%), TDF-FTC-atazanavir/r (12.9%), zidovudine-lamivudine (ZDV-3TC)-lopinavir/r (12.8%), and abacavir/lamivudine (ABC-3TC)-efavirenz (5.7%). Differences in prescription were noted among different Swiss HIV Cohort Study sites (P < .001). In multivariate analysis, compared with TDF-FTC-efavirenz, starting TDF-FTC-lopinavir/r was associated with prior AIDS (relative risk ratio, 2.78; 95% CI, 1.78-4.35), HIV-RNA greater than 100 000 copies/mL (1.53; 1.07-2.18), and CD4 greater than 350 cells/μL (1.67; 1.04-2.70); TDF-FTC-atazanavir/r with a depressive disorder (1.77; 1.04-3.01), HIV-RNA greater than 100 000 copies/mL (1.54; 1.05-2.25), and an opiate substitution program (2.76; 1.09-7.00); and ZDV-3TC-lopinavir/r with female sex (3.89; 2.39-6.31) and CD4 cell counts greater than 350 cells/μL (4.50; 2.58-7.86). At 12 months, 1715 patients (87.6%) achieved viral load less than 50 copies/mL and CD4 cell counts increased by a median (interquartile range) of 173 (89-269) cells/μL. Virologic suppression was more likely with TDF-FTC-efavirenz, and CD4 increase was higher with ZDV-3TC-lopinavir/r. 
No differences in outcome were observed among Swiss HIV Cohort Study sites. CONCLUSIONS Large differences in prescription, but not in outcome, were observed among study sites. A trend toward individualized cART was noted, suggesting that the initial cART is significantly influenced by physicians' preferences and patient characteristics. Our study highlights the need for evidence-based data to determine the best initial regimen for different HIV-infected persons.
Abstract:
Increasing evidence suggests that the basic foundations of the self lie in the brain systems that represent the body. Specific sensorimotor stimulation has been shown to alter the bodily self. However, little is known about how disconnection of the brain from the body affects the phenomenological sense of the body and the self. Spinal cord injury (SCI) patients, who exhibit massively reduced somatomotor processing below the lesion in the absence of brain damage, are suitable for testing the influence of body signals on two important components of the self: the sense of disembodiment and body ownership. We recruited 30 SCI patients and 16 healthy participants and evaluated the following parameters: (i) depersonalization symptoms, using the Cambridge Depersonalization Scale (CDS), and (ii) measures of body ownership, as quantified by the rubber hand illusion (RHI) paradigm. We found higher CDS scores in SCI patients, indicating increased detachment from the body and internal bodily sensations, and decreasing global body ownership with higher lesion level. The RHI paradigm revealed no difference in illusory ownership of the hand between SCI patients and controls. However, the typical proprioceptive drift was absent in SCI patients with intact tactile sensation on the hand, which might be related to cortical reorganization in these patients. These results suggest that disconnection of somatomotor inputs to the brain due to spinal cord lesions results in a disturbed sense of an embodied self. Furthermore, plasticity-related cortical changes might influence the dynamics of the bodily self.
Abstract:
Telephone communication is a challenge for many hearing-impaired individuals. One important technical reason for this difficulty is the restricted frequency range (0.3-3.4 kHz) of conventional landline telephones. Internet telephony (voice over Internet protocol [VoIP]) is transmitted with a larger frequency range (0.1-8 kHz) and therefore includes more frequencies relevant to speech perception. According to a recently published, laboratory-based study, the theoretical advantage of ideal VoIP conditions over conventional telephone quality has translated into improved speech perception by hearing-impaired individuals. However, the speech perception benefits of nonideal VoIP network conditions, which may occur in daily life, have not been explored. VoIP use cannot be recommended to hearing-impaired individuals before its potential under more realistic conditions has been examined.
Abstract:
With a virus such as the Human Immunodeficiency Virus (HIV), which has infected millions of people worldwide, many of whom are unaware that they are infected, it is vital to understand how the virus works and how it functions at the molecular level. Because there is currently no vaccine and no way to eradicate the virus from an infected person, any information about how the virus interacts with its host greatly improves our understanding of HIV and brings scientists one step closer to combating such a destructive virus. Thousands of HIV sequences are available in many online databases for public use. Attributes linked to each sequence include the viral load within the host and the patient's current disease severity. Being able to predict a patient's stage of infection is valuable, as it could aid in treatment decisions and proper medication use. Our approach of analyzing region-specific amino acid composition for selected genes predicted patient disease state with an accuracy of up to 85.4%. Moreover, we output a set of sequence-based classification rules that may prove useful for diagnosing the expected clinical outcome of the infected patient.
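The core feature this approach relies on, amino acid composition, can be sketched in a few lines of Python. This is a hedged illustration of the general technique (per-residue fractions over a sequence region), not the authors' actual pipeline; the names are hypothetical:

```python
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def aa_composition(sequence):
    """Fraction of each standard amino acid in a protein sequence region."""
    # Count only standard residues, ignoring gaps or ambiguity codes
    counts = Counter(c for c in sequence.upper() if c in AMINO_ACIDS)
    total = sum(counts.values())
    if total == 0:
        raise ValueError("no standard residues found")
    return {aa: counts[aa] / total for aa in AMINO_ACIDS}
```

A classifier (for instance, decision rules over these 20 fractions computed per gene region) would then be trained against sequences labeled with the patient's disease state.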
Abstract:
Steroidogenic factor-1 (SF-1/NR5A1) is a nuclear receptor that regulates adrenal and reproductive development and function. NR5A1 mutations have been detected in 46,XY individuals with disorders of sex development (DSD) but apparently normal adrenal function, and in 46,XX women with normal sexual development yet primary ovarian insufficiency (POI).
Abstract:
BACKGROUND: Morbidity and mortality of individuals coinfected with HIV and hepatitis C virus (HCV) is often determined by the course of their HCV infection. Only a selected proportion of those in need of HCV treatment are studied in randomized controlled trials (RCTs). We analysed the prevalence of HCV infection in a large cohort, the number of individuals requiring treatment, the eligibility for HCV treatment, and the outcome of combination therapy with pegylated interferon-α and ribavirin in routine practice. METHODS: We analysed prescription patterns of HCV treatment and treatment outcomes among participants from the Swiss HIV Cohort Study with detectable hepatitis C viraemia (between January 2001 and October 2004). Efficacy was measured by the number of patients with undetectable HCV RNA at the end of therapy (EOTR) and at 6 months after treatment termination (SVR). Intention-to-continue-treatment principles were used. RESULTS: A total of 2150 of 7048 (30.5%) participants were coinfected with HCV; HCV RNA was detected in 60%, and not assessed in 26%, of HCV-antibody-positive individuals. One hundred and sixty (12.5%) of HCV-RNA-positive patients started treatment. In patients infected with HCV genotypes 1/4 or 2/3, EOTR was achieved in 43.3% and 81.2% of patients, respectively, and SVR rates were 28.4% and 51.8%, respectively. More than 50% of the HCV-treated patients would have been excluded from two large published RCTs due to demographic, clinical and laboratory criteria. CONCLUSIONS: Despite clinical and psychosocial obstacles encountered in clinical practice, HCV treatment in HIV-coinfected individuals is feasible, with results similar to those obtained in RCTs.
Abstract:
BACKGROUND: Patients coinfected with hepatitis C virus (HCV) and HIV experience higher mortality rates than patients infected with HIV alone. We designed a study to determine whether risks for later mortality are similar for HCV-positive and HCV-negative individuals when subjects are stratified on the basis of baseline CD4+ T-cell counts. METHODS: Antiretroviral-naive individuals who initiated highly active antiretroviral therapy (HAART) between 1996 and 2002 were included in the study. HCV-positive and HCV-negative individuals were stratified separately by baseline CD4+ T-cell counts in 50 cells/μl increments. Cox proportional hazards regression was used to model the effect of these strata with other variables on survival. RESULTS: CD4+ T-cell strata below 200 cells/μl, but not above, imparted an increased relative hazard (RH) of mortality for both HCV-positive and HCV-negative individuals. Among HCV-positive individuals, after adjustment for baseline age, HIV RNA levels, history of injection drug use and adherence to therapy, only CD4+ T-cell strata of <50 cells/μl (RH=4.60; 95% confidence interval [CI] 2.72-7.76) and 50-199 cells/μl (RH=2.49; 95% CI 1.63-3.81) were significantly associated with increased mortality when compared with those initiating therapy at cell counts >500 cells/μl. The same baseline CD4+ T-cell strata were found for HCV-negative individuals. CONCLUSION: In a within-groups analysis, the baseline CD4+ T-cell strata that are associated with increased RHs for mortality are the same for HCV-positive and HCV-negative individuals initiating HAART. However, a between-groups analysis reveals a higher absolute mortality risk for HCV-positive individuals.
Abstract:
In natural history studies of chronic disease, it is of interest to understand the evolution of key variables that measure aspects of disease progression. This is particularly true for immunological variables in persons infected with the Human Immunodeficiency Virus (HIV). The natural timescale for such studies is time since infection. However, most data available for analysis arise from prevalent cohorts, where the date of infection is unknown for most or all individuals. As a result, standard curve fitting algorithms are not immediately applicable. Here we propose two methods to circumvent this difficulty. The first uses repeated measurement data to provide information not only on the level of the variable of interest, but also on its rate of change, while the second uses an estimate of the expected time since infection. Both methods are based on the principal curves algorithm of Hastie and Stuetzle, and are applied to data from a prevalent cohort of HIV-infected homosexual men, giving estimates of the average pattern of CD4+ lymphocyte decline. These methods are applicable to natural history studies using data from prevalent cohorts where the time of disease origin is uncertain, provided certain ancillary information is available from external sources.
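The first method's key ingredient, using each subject's repeated measurements to estimate both the level of the variable and its rate of change, can be illustrated with a per-subject least-squares sketch in Python. This shows only that building block under the assumption of a simple linear trend per subject; it is not the Hastie-Stuetzle principal curves algorithm itself, and the names are hypothetical:

```python
def level_and_slope(times, values):
    """Ordinary least-squares fit of value ~ a + b*time for one subject.

    Returns the fitted level at the mean observation time and the slope
    (rate of change), e.g. CD4+ lymphocytes lost per year of follow-up.
    """
    n = len(times)
    if n < 2 or n != len(values):
        raise ValueError("need >= 2 paired repeated measurements")
    tbar = sum(times) / n
    vbar = sum(values) / n
    sxx = sum((t - tbar) ** 2 for t in times)
    if sxx == 0:
        raise ValueError("observation times must not all be identical")
    slope = sum((t - tbar) * (v - vbar)
                for t, v in zip(times, values)) / sxx
    # The fitted line passes through (tbar, vbar), so vbar is the level
    # at the mean observation time.
    return vbar, slope
```

Pooling such (level, rate-of-change) pairs across a prevalent cohort is what allows an average decline pattern to be reconstructed even when each subject's time since infection is unknown.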