Abstract:
Objective: There is convincing evidence that phonological, orthographic and semantic processes influence children's ability to learn to read and spell words. So far, only a few studies have investigated the influence of implicit learning on literacy skills. Children are sensitive to the statistics of their learning environment. Through frequent reading they acquire implicit knowledge about the frequency of letter patterns in written words, and they use this knowledge during reading and spelling. Additionally, semantic connections facilitate the storing of words in memory. The aim of this intervention study was therefore to implement a word-picture training based on statistical and semantic learning. Furthermore, we aimed to examine the training effects on reading and spelling in comparison with an auditory-visual matching training and a working-memory training program. Participants and Methods: One hundred and thirty-two children aged between 8 and 11 years completed three weekly training sessions of 12 minutes over 8 weeks, and were assessed on reading, spelling, working memory and intelligence before and after training. Results: Overall, the word-picture training and the auditory-visual matching training led to substantial gains in reading and spelling performance compared with the working-memory training. Although children both with and without learning difficulties improved in reading and spelling after the word-picture training, the program had differential effects for the two groups: children with learning difficulties profited more in spelling than children without learning difficulties, whereas children without learning difficulties benefited more in word comprehension. Conclusions: These findings highlight the need for frequent reading training with semantic connections in order to support the acquisition of literacy skills.
Abstract:
In 2005, two ice cores with lengths to bedrock of 58.7 and 57.6 m, respectively, were recovered from the Miaoergou flat-topped glacier (43°03′19″ N, 94°19′21″ E; 4512 m a.s.l.), eastern Tien Shan. ²¹⁰Pb dating of one of the ice cores (57.6 m) was performed, and an age of AD 1851 ± 6 at a depth of 35.2 m w.e. was determined. For the period AD 1851-2005, a mean annual net accumulation of 229 ± 7 mm w.e. a⁻¹ was calculated. At the nearby oasis city of Hami (~80 km from the Miaoergou flat-topped glacier) the annual precipitation rate is 38 mm w.e. a⁻¹, hence glacial meltwater is a major water supply for local residents. The surface activity concentration of excess ²¹⁰Pb was found to be ~400 mBq kg⁻¹, which is higher than observed at other continental sites such as Belukha, Russia, and Tsambagarav, Mongolia, which have surface activity concentrations of 280 mBq kg⁻¹. The ²¹⁰Pb dating agrees well with the chronological sequence deduced from annual-layer counting based on the seasonalities of δ¹⁸O and trace metals for the period AD 1953-2005, and with beta-activity horizons resulting from atmospheric nuclear testing during the period AD 1962-63. We conclude that ²¹⁰Pb analysis is a suitable method for obtaining a continuous dating of the Miaoergou ice core over ~160 years, and that it can also be applied to other ice cores recovered from the mountains of western China.
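As a rough illustration of the figures above, the sketch below computes the mean annual net accumulation implied by the ²¹⁰Pb-dated horizon and shows the basic decay-based age calculation (half-life ≈ 22.3 a). Only the surface activity, horizon depth and dates are taken from the abstract; the activity value in the decay example is hypothetical.

```python
import math

# 210Pb decay constant (half-life ~22.3 years)
HALF_LIFE = 22.3
LAMBDA = math.log(2) / HALF_LIFE

def pb210_age(surface_activity, activity_at_depth):
    """Age (years) of a layer from the decay of unsupported 210Pb,
    assuming a constant initial activity at the surface."""
    return math.log(surface_activity / activity_at_depth) / LAMBDA

# Mean annual net accumulation implied by the dated horizon:
# 35.2 m w.e. deposited between AD 1851 and the drilling year 2005.
depth_we = 35.2          # m w.e. at the AD 1851 horizon
years = 2005 - 1851      # 154 years
print(f"mean accumulation ~ {1000 * depth_we / years:.0f} mm w.e. per year")  # ~229

# Illustrative decay-based age (hypothetical activity measured at depth):
print(f"a layer at 30 mBq/kg would be ~ {pb210_age(400, 30):.0f} years old")
```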
Abstract:
Historical information is always relevant for clinical trial design. Additionally, if incorporated in the analysis of a new trial, historical data make it possible to reduce the number of subjects. This decreases costs and trial duration, facilitates recruitment, and may be more ethical. Yet, under prior-data conflict, an overly optimistic use of historical data may be inappropriate. We address this challenge by deriving a Bayesian meta-analytic-predictive prior from historical data, which is then combined with the new data. This prospective approach is equivalent to a meta-analytic-combined analysis of historical and new data if parameters are exchangeable across trials. The prospective Bayesian version requires a good approximation of the meta-analytic-predictive prior, which is not available analytically. We propose two- or three-component mixtures of standard priors, which allow for good approximations and, for the one-parameter exponential family, straightforward posterior calculations. Moreover, since one of the mixture components is usually vague, mixture priors will often be heavy-tailed and therefore robust. Further robustness and a more rapid reaction to prior-data conflicts can be achieved by adding an extra weakly informative mixture component. Use of historical prior information is particularly attractive for adaptive trials, as the randomization ratio can then be changed in case of prior-data conflict. Both frequentist operating characteristics and posterior summaries for various data scenarios show that these designs have desirable properties. We illustrate the methodology for a phase II proof-of-concept trial with historical controls from four studies. Robust meta-analytic-predictive priors alleviate prior-data conflicts; they should encourage better and more frequent use of historical data in clinical trials.
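To make the mixture-prior machinery concrete, here is a minimal Python sketch for a binary endpoint: a two-component mixture of conjugate Beta priors (an informative, MAP-like component plus a vague one) is updated with new trial data, using the fact that a mixture of conjugate priors remains a mixture after updating, with weights re-weighted by each component's marginal likelihood. The weights, Beta parameters and data below are hypothetical and not taken from the paper.

```python
from math import lgamma, exp

def log_beta(a, b):
    """Log of the Beta function B(a, b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def update_beta_mixture(weights, params, r, n):
    """Posterior of a mixture-of-Beta prior after observing r responders in n patients.

    weights: prior mixture weights (sum to 1)
    params:  list of (a, b) Beta parameters, one per component
    Each component is updated conjugately; the new weights are proportional to the
    old weight times that component's marginal likelihood of the data (the binomial
    coefficient is common to all components and cancels in the normalisation).
    """
    new_params = [(a + r, b + n - r) for (a, b) in params]
    log_marglik = [log_beta(a + r, b + n - r) - log_beta(a, b) for (a, b) in params]
    unnorm = [w * exp(lm) for w, lm in zip(weights, log_marglik)]
    total = sum(unnorm)
    return [u / total for u in unnorm], new_params

# Hypothetical robust prior: an informative, MAP-like Beta(15, 35) component
# plus a vague Beta(1, 1) component that takes over under prior-data conflict.
weights, params = [0.8, 0.2], [(15.0, 35.0), (1.0, 1.0)]
print(update_beta_mixture(weights, params, r=12, n=20))
```

Under prior-data conflict (here, 12/20 responders against an informative prior centred near 0.3), the vague component's posterior weight grows, which is what makes the mixture prior robust.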
Abstract:
Objective: There is evidence that children suffer ongoing post-concussive symptoms (PCS) after mild traumatic brain injury (mTBI). However, results concerning neuropsychological outcome after mTBI are controversial. Our aim was therefore to examine group differences regarding neuropsychological outcome and PCS. Additionally, we explored the influence of current and pre-injury everyday attention problems on neuropsychological outcome in children after mTBI. Method: In a prospective short-term longitudinal study, 40 children (aged 6-16 years) after mTBI and 38 children after orthopedic injury (OI) underwent neuropsychological, socio-behavioral and PCS assessments in the acute stage and at 1 week, 4 weeks, and 4 months after the injury. Results: Parents of children after mTBI observed significantly more PCS than parents of children after OI, especially in the acute stage. Our results revealed no neuropsychological or socio-behavioral differences between the two groups over time. However, in children after mTBI, elevated levels of everyday attention problems were negatively correlated with neuropsychological performance. Furthermore, pre-injury everyday attention problems had a negative influence on neuropsychological performance in children after mTBI. Conclusion: In accordance with earlier studies, parents of children after mTBI initially observed significantly more PCS than parents of children after OI. There were no neuropsychological or socio-behavioral group differences between children after mTBI and OI in the post-acute period. However, our exploratory findings indicate that current and pre-injury everyday attention problems were negatively associated with neuropsychological performance in children after mTBI.
Abstract:
INTRODUCTION According to reports from observational databases, classic AIDS-defining opportunistic infections (ADOIs) occur in patients with CD4 counts above 500/µL, both on and off cART. Adjudication of these events is usually not performed. However, ADOIs are often used as endpoints, for example in analyses on when to start cART. MATERIALS AND METHODS In the Swiss HIV Cohort Study (SHCS) database, we identified 91 cases of ADOIs that occurred from 1996 onwards in patients with a nearest CD4 count >500/µL. Cases of tuberculosis and recurrent bacterial pneumonia were excluded, as they also occur in non-immunocompromised patients. Chart review was performed in 82 cases; in 50 of these we identified CD4 counts from six months before until one month after the ADOI and had chart review material allowing an in-depth review. In these 50 cases, we assessed whether (1) the ADOI fulfilled the SHCS diagnostic criteria (www.shcs.ch), and (2) HIV infection with CD4 >500/µL was the main immune-compromising condition causing the ADOI. Cases were adjudicated by two experienced clinicians who had to agree on the interpretation. RESULTS More than 13,000 participants were followed in the SHCS during the period of interest. Twenty-four (48%) of the 50 chart-reviewed patients with an ADOI and CD4 >500/µL had HIV RNA <400 copies/mL at the time of the ADOI. Among the 50 cases, candida oesophagitis was the most frequent ADOI, occurring in 30 patients (60%), followed by pneumocystis pneumonia and chronic ulcerative HSV disease (Table 1). Overall, chronic HIV infection with a CD4 count >500/µL was the likely explanation for the ADOI in only seven cases (14%). Other reasons (Table 1) were ADOIs occurring during primary HIV infection in 5 cases (10%), unmasking IRIS in 1 case (2%), chronic HIV infection with CD4 counts <500/µL near the ADOI in 13 cases (26%), a diagnosis not fulfilling SHCS diagnostic criteria in 7 cases (14%) and, most importantly, other additional immune-compromising conditions such as immunosuppressive drugs in 14 cases (34%). CONCLUSIONS In patients with CD4 counts >500/µL, chronic HIV infection is the cause of ADOIs in only a minority of cases. Other immune-compromising conditions are the more likely explanation in one-third of the patients, especially in cases of candida oesophagitis. ADOIs in HIV patients with high CD4 counts should be used as endpoints only with great caution in studies based on observational databases.
Abstract:
INTRODUCTION Proteinuria (PTU) is an important marker for the development and progression of renal disease, cardiovascular disease and death, but there is limited information about the prevalence of, and factors associated with, confirmed PTU in predominantly white European HIV-positive persons, especially in those with an estimated glomerular filtration rate (eGFR) >60 mL/min/1.73 m². PATIENTS AND METHODS Baseline was defined as the first of two consecutive dipstick urine protein (DUP) measurements during prospective follow-up after 1/6/2011 (when systematic data collection began). PTU was defined as two consecutive DUP >1+ (>30 mg/dL) >3 months apart; persons with eGFR <60 at either DUP measurement were excluded. Logistic regression was used to investigate factors associated with PTU. RESULTS A total of 1,640 persons were included; participants were mainly white (n=1,517, 92.5%), male (n=1,296, 79.0%) and men who have sex with men (n=809, 49.3%). Median age at baseline was 45 years (IQR 37-52), and median CD4 count was 570/mm³ (IQR 406-760). The median baseline date was 2/12 (IQR 11/11-6/12), and median eGFR was 99 mL/min/1.73 m² (IQR 88-109). Sixty-nine persons had PTU (4.2%, 95% CI 3.2-4.7%). Persons with diabetes had increased odds of PTU, as did those with a prior non-AIDS (1) or AIDS event and those with prior exposure to indinavir. Among females, those with a normal eGFR (>90) and those with prior abacavir use had lower odds of PTU (Figure 1). CONCLUSIONS One in 25 persons with eGFR >60 had confirmed proteinuria at baseline. Factors associated with PTU were similar to those associated with CKD. The lack of association with antiretrovirals, particularly tenofovir, may be due to the cross-sectional design of this study, and additional follow-up is required to address progression to PTU in those without PTU at baseline. It may also suggest that other markers are needed to capture, at higher eGFRs, the deteriorating renal function associated with antiretrovirals. Our findings suggest PTU is an early marker for impaired renal function.
Abstract:
INTRODUCTION Rates of both TB/HIV co-infection and multidrug-resistant (MDR) TB are increasing in Eastern Europe (EE). Data on the clinical management of TB/HIV co-infected patients are scarce. Our aim was to study the clinical characteristics of TB/HIV patients in Europe and Latin America (LA) at TB diagnosis, to identify factors associated with MDR-TB and to assess the activity of initial TB treatment regimens given the results of drug-susceptibility testing (DST). MATERIAL AND METHODS We enrolled 1,413 TB/HIV patients from 62 clinics in 19 countries in EE, Western Europe (WE), Southern Europe (SE) and LA from January 2011 to December 2013. Among patients who completed DST within the first month of TB therapy, we linked initial TB treatment regimens to the DST results and calculated the distribution of patients receiving 0, 1, 2, 3 and ≥4 active drugs in each region. Risk factors for MDR-TB were identified in logistic regression models. RESULTS Significant differences were observed between EE (n=844), WE (n=152), SE (n=164) and LA (n=253) in the use of combination antiretroviral therapy (cART) at TB diagnosis (17%, 40%, 44% and 35%; p<0.0001), in definite TB diagnosis (culture and/or PCR positive for Mycobacterium tuberculosis; 47%, 71%, 72% and 40%; p<0.0001) and in MDR-TB prevalence (34%, 3%, 3% and 11%; p<0.0001 among those with DST results). A history of injecting drug use [adjusted OR (aOR) 2.03, 95% CI 1.00-4.09], prior TB treatment (aOR 3.42, 95% CI 1.88-6.22) and living in EE (aOR 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. For the 569 patients with available DST results, the initial TB treatment contained ≥3 active drugs in 64% of patients in EE compared with 90-94% of patients in the other regions (Figure 1a). Had the patients received standard initial therapy [rifampicin, isoniazid, pyrazinamide, ethambutol (RHZE)], the corresponding proportions would have been 64% vs. 86-97%, respectively (Figure 1b). CONCLUSIONS In EE, TB/HIV patients less often received cART, less often had a definite TB diagnosis and more often had MDR-TB compared with other parts of Europe and LA. Initial TB therapy in EE was sub-optimal, with less than two-thirds of patients receiving at least three active drugs, and improved compliance with standard RHZE treatment does not appear to be the solution. Improved management of TB/HIV patients requires routine use of DST, initial TB therapy according to prevailing resistance patterns and more widespread use of cART.
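The "number of active drugs" summary can be illustrated with a short sketch: for each patient, drugs in the initial regimen count as active only if the DST reports the isolate as susceptible to them. The regimen and susceptibility pattern below are hypothetical, and treating untested drugs as not active is an assumption, not necessarily the study's rule.

```python
def count_active_drugs(regimen, dst):
    """Count drugs in the initial TB regimen to which the isolate is
    susceptible ("S") according to the drug-susceptibility test (DST).
    Drugs without a DST result are treated here as not counted."""
    return sum(1 for drug in regimen if dst.get(drug) == "S")

# Hypothetical example: standard RHZE regimen, isolate resistant to
# rifampicin and isoniazid (i.e. MDR-TB) -> only 2 active drugs.
regimen = ["rifampicin", "isoniazid", "pyrazinamide", "ethambutol"]
dst = {"rifampicin": "R", "isoniazid": "R", "pyrazinamide": "S", "ethambutol": "S"}
print(count_active_drugs(regimen, dst))  # -> 2
```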
Abstract:
Introduction: HIV-1 viral escape in the cerebrospinal fluid (CSF) despite viral suppression in plasma is rare [1,2]. We describe the case of a 50-year-old HIV-1 infected patient who was diagnosed with HIV-1 in 1995. Antiretroviral therapy (ART) was started in 1998 with a CD4 T cell count of 71 cells/µL and an HIV viremia of 46,000 copies/mL. ART with zidovudine (AZT), lamivudine (3TC) and efavirenz achieved full viral suppression. After the patient had interrupted ART for two years, treatment was re-introduced with tenofovir (TDF), emtricitabine (FTC) and ritonavir-boosted atazanavir (ATV/r). This regimen suppressed HIV-1 in plasma for nine years, and CD4 cells stabilized around 600 cells/µL. Since July 2013, the patient had complained of severe gait ataxia and decreased concentration. Materials and Methods: In addition to a neurological examination, two lumbar punctures, a cerebral MRI and neuropsychological testing were performed. HIV-1 viral load in plasma and in CSF was quantified using Cobas TaqMan HIV-1 version 2.0 (Cobas AmpliPrep, Roche Diagnostics, Basel, Switzerland) with a detection limit of 20 copies/mL. Drug resistance mutations in HIV-1 reverse transcriptase and protease were evaluated using bulk sequencing. Results: The CSF in January 2014 showed a pleocytosis with 75 cells/µL (100% mononuclear) and 1,184 HIV-1 RNA copies/mL, while HIV-1 in plasma was below 20 copies/mL. Resistance testing of the CSF HIV-1 RNA showed two NRTI resistance-associated mutations (M184V and K65R) and one NNRTI resistance-associated mutation (K103N). The cerebral MRI showed increased signal on T2-weighted images in the subcortical and periventricular white matter, the basal ganglia and the thalamus. Four months after ART intensification with AZT, 3TC, boosted darunavir and raltegravir, the CSF cell count normalized to 1 cell/µL and the HIV viral load was suppressed. The neurological symptoms improved; however, equilibrium disturbances and impaired memory persisted. The neuropsychological evaluation confirmed neurocognitive impairments in executive functions, attention, working and nonverbal memory, speed of information processing, visuospatial abilities and motor skills. Conclusions: Neurological complaints in HIV-1 infected patients should prompt further investigation of the CSF, including measurement of HIV viral load and genotypic resistance testing, since isolated replication of drug-resistant HIV variants can, in rare cases, occur despite viral suppression in plasma. Optimizing ART by using drugs with improved CNS penetration may achieve viral suppression in the CSF with improvement of neurological symptoms.
Abstract:
INTRODUCTION HIV-infected pregnant women are very likely to engage in HIV medical care to prevent transmission of HIV to their newborn. After delivery, however, childcare and competing commitments might lead to disengagement from HIV care. The aim of this study was to quantify loss to follow-up (LTFU) from HIV care after delivery and to identify risk factors for LTFU. METHODS We used data on 719 pregnancies within the Swiss HIV Cohort Study from 1996 to 2012 for which information on follow-up visits was available. Two LTFU events were defined: no clinical visit for >180 days and no visit for >360 days in the year after delivery. Logistic regression analysis was used to identify risk factors for an LTFU event after delivery. RESULTS Median maternal age at delivery was 32 years (IQR 28-36); 357 (49%) women were black, 280 (39%) white, 56 (8%) Asian and 4% of other ethnicities. One hundred and seven (15%) women reported any history of injecting drug use (IDU). The majority of women (524, 73%) had received their HIV diagnosis before pregnancy; most of those (413, 79%) had lived with diagnosed HIV for longer than three years and two-thirds (342, 65%) were already on antiretroviral therapy (ART) at the time of conception. Of the 181 women diagnosed during pregnancy by a screening test, 80 (44%) were diagnosed in the first trimester, 67 (37%) in the second and 34 (19%) in the third trimester. Of the 357 (69%) women who had been seen in HIV medical care during the three months before conception, 93% achieved an undetectable HIV viral load (VL) at delivery. Of the 62 (12%) women whose last medical visit was more than six months before conception, only 72% achieved an undetectable VL (p=0.001). Overall, 247 (34%) women were LTFU for over 180 days in the year after delivery and 86 (12%) women were LTFU for over 360 days, with 43 (50%) of those women returning later. Being LTFU for 180 days was significantly associated with a history of intravenous drug use (aOR 1.73, 95% CI 1.09-2.77, p=0.021) and with not achieving an undetectable VL at delivery (aOR 1.79, 95% CI 1.03-3.11, p=0.040), after adjusting for maternal age, ethnicity, time of HIV diagnosis and being on ART at conception. CONCLUSIONS Women with a history of IDU and women with a detectable VL at delivery were more likely to be LTFU after delivery. This is of concern with regard to their own health, as well as the risk for sexual partners and subsequent pregnancies. Further strategies should be developed to enhance retention in medical care beyond pregnancy.
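A minimal sketch of how the 180-day LTFU definition might be operationalised from visit dates, under one plausible reading (no gap longer than 180 days between delivery, any subsequent visit and the end of the first post-partum year); the dates and the boundary handling are illustrative assumptions, not the study's exact algorithm.

```python
from datetime import date, timedelta

def lost_to_follow_up(delivery, visits, gap_days=180, window_days=365):
    """True if, within window_days after delivery, there is any period
    longer than gap_days without a clinical visit (one plausible reading
    of the study's 180-day LTFU definition)."""
    window_end = delivery + timedelta(days=window_days)
    in_window = sorted(v for v in visits if delivery < v <= window_end)
    timeline = [delivery] + in_window + [window_end]
    longest_gap = max((b - a).days for a, b in zip(timeline, timeline[1:]))
    return longest_gap > gap_days

# Hypothetical example: one visit two months after delivery, then none.
print(lost_to_follow_up(date(2010, 3, 1), [date(2010, 5, 1)]))  # -> True
```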
Abstract:
Determining the expected age at a potential ice-core drilling site on a polar ice sheet generally depends on a combination of information from remote-sensing methods, estimates of current accumulation and modelling. This leaves irreducible uncertainties in retrieving an undisturbed ice core of the desired age. Although recently perfected radar techniques will improve the picture of the ice sheet below future drilling sites, rapid prospective drillings could further increase the success of deep drilling projects. Here we design and explore a drilling system for a minimum-size rapid-access hole. The advantages of a small hole are the low demand for drilling fluid, the low overall weight of the equipment, fast installation and de-installation, and low costs. We show that, in theory, drilling of a 20 mm hole to a depth of 3000 m is possible in ~4 days. First concepts have been realized and verified in the field. Both the drill cuttings and the hole itself can be used to characterize the properties of the ice sheet and its potential to provide a trustworthy palaeo-record. A candidate drilling site could be explored in ~2 weeks, which would enable the characterization of several sites in one summer season.
Abstract:
Background: Disturbed interpersonal communication is a core problem in schizophrenia. Patients with schizophrenia often appear disconnected and "out of sync" when interacting with others. This may involve perception, cognition, motor behavior, and nonverbal expressiveness. Although this phenomenon is well known from clinical observation, mainstream research has neglected the area, and corresponding theoretical concepts, statistical methods, and assessment tools were missing. Recent research has shown, however, that objective, video-based measures can be used to reliably quantify nonverbal behavior in schizophrenia. Newly developed algorithms allow for a calculation of movement synchrony. We found that the objective amount of movement of patients with schizophrenia during social interactions was closely related to the symptom profiles of these patients (Kupper et al., 2010). Over and above the mere amount of movement, the degree of synchrony between patients and healthy interactants may be indicative of various problems in the domain of interpersonal communication and social cognition. Methods: Building on our earlier study, head movement synchrony was assessed objectively (using Motion Energy Analysis, MEA) in 378 brief, videotaped role-play scenes involving 27 stabilized outpatients diagnosed with paranoid-type schizophrenia. Results: Lower head movement synchrony was indicative of symptoms (negative symptoms, but also conceptual disorganization and lack of insight), as well as of verbal memory, patients' self-evaluation of competence, and social functioning. Many of these relationships remained significant even when corrected for the patients' amount of movement. Conclusion: The results suggest that nonverbal synchrony may be an objective and sensitive indicator of symptom severity, cognition and social functioning.
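As an illustration of how movement synchrony of this kind can be computed, the sketch below cross-correlates two motion-energy time series (e.g. patient and interactant head movement from MEA) within short windows and across a small range of lags, averaging the absolute peak correlations. This is a generic windowed cross-correlation scheme in the spirit of the approach described, not the authors' exact algorithm, and all parameter values are illustrative.

```python
import numpy as np

def windowed_synchrony(x, y, window=125, max_lag=25):
    """Generic synchrony estimate for two equally long motion-energy series
    sampled at the same rate: within each non-overlapping window, take the
    largest absolute Pearson correlation over lags up to max_lag frames,
    then average those peak values across windows."""
    peaks = []
    for start in range(max_lag, len(x) - window - max_lag, window):
        xs = x[start:start + window]
        if np.std(xs) == 0:
            continue
        best = 0.0
        for lag in range(-max_lag, max_lag + 1):
            ys = y[start + lag:start + lag + window]
            if np.std(ys) == 0:
                continue
            best = max(best, abs(np.corrcoef(xs, ys)[0, 1]))
        peaks.append(best)
    return float(np.mean(peaks)) if peaks else float("nan")

# Hypothetical usage with two simulated motion-energy traces (~100 s at 25 fps):
rng = np.random.default_rng(0)
patient, interactant = rng.random(2500), rng.random(2500)
print(windowed_synchrony(patient, interactant))
```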