813 results for Early Years
Abstract:
BACKGROUND The use of combination antiretroviral therapy (cART) comprising three antiretroviral medications from at least two classes of drugs is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment regardless of immunologic thresholds or the clinical condition for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage or CD4 counts ≤ 750 cells/mm³ or per cent CD4 ≤ 25%). This Cochrane review assesses the currently available evidence regarding the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies that followed children from enrolment through the start of cART and while on cART. DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial, and both compared initiation of cART regardless of clinical or immunological criteria with deferred initiation until per cent CD4 dropped to <15%. The first trial was conducted in Thailand and the second in Thailand and Cambodia. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included.
The adjusted hazard ratio for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16). AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials in support of either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues such as the retention in care of children in ART programmes in resource-limited settings will need to be considered when formulating the WHO 2013 recommendations.
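The effect measures reported above (e.g., RR 2.9; 95% CI 0.12 to 68.9 for one death versus none) follow the standard log-scale calculation for a relative risk from a 2x2 table. The Python sketch below illustrates that calculation with a 0.5 continuity correction for the zero cell; the equal split of the 122 children into two arms of 61 is a hypothetical placeholder (the abstract does not give the per-arm numbers), and the review's own correction convention is not stated, so the output only approximates the published interval.

    import math

    def relative_risk(events_a, n_a, events_b, n_b, cc=0.5):
        # Relative risk with a Wald 95% CI on the log scale; a 0.5 continuity
        # correction is added to every cell when an event count is zero
        # (one common convention, not necessarily the review's).
        if events_a == 0 or events_b == 0:
            events_a, events_b = events_a + cc, events_b + cc
            n_a, n_b = n_a + 2 * cc, n_b + 2 * cc
        rr = (events_a / n_a) / (events_b / n_b)
        se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
        return rr, math.exp(math.log(rr) - 1.96 * se), math.exp(math.log(rr) + 1.96 * se)

    # Hypothetical 61/61 split of the 122 children: 1 death (immediate) vs 0 (deferred).
    print(relative_risk(1, 61, 0, 61))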
Abstract:
BACKGROUND Coronary atherosclerosis begins early in life, but acute coronary syndromes in adults aged <30 years are rare. We aimed to investigate the rate of occurrence, clinical and angiographic characteristics, and long-term clinical outcome of acute coronary syndrome (ACS) in young patients who were referred to two Swiss hospitals. METHODS From 1994 to 2010, data on all patients with ACS aged <30 years were retrospectively retrieved from our database and the patients were contacted by phone or at a physician's visit. Baseline, lesion and procedural characteristics, and clinical outcome were compared between patients in whom an underlying atypical aetiology was found (non-ATS group; ATS: atherosclerosis) and patients in whom no such aetiology was detected (ATS group). The clinical endpoint was freedom from any major adverse cardiac event (MACE) during follow-up. RESULTS A total of 27 young patients with ACS aged <30 years were admitted during the study period. They accounted for 0.05% of all coronary angiograms performed. Mean patient age was 26.8 ± 3.5 years and 22 patients (81%) were men. Current smoking (81%) and dyslipidaemia (59%) were the most frequent risk factors. Typical chest pain (n = 23; 85%) and ST-segment elevation myocardial infarction (STEMI; n = 18 [67%]) were the most frequent presentations. The ATS group consisted of 17 patients (63%) and the non-ATS group of 10 patients (37%). Hereditary thrombophilia was the most frequently encountered atypical aetiology (n = 4; 15%). At 5 years, the mortality and MACE rates were 7% and 19%, respectively. CONCLUSION ACS in young patients is an uncommon condition with a variety of possible aetiologies and distinct risk factors. In-hospital and 5-year clinical outcomes are satisfactory.
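The 5-year mortality and MACE rates quoted above come from time-to-event follow-up of a small cohort. As a hedged illustration only, the sketch below estimates a 5-year event-free probability with a Kaplan-Meier fit (using the lifelines package) on synthetic follow-up data; the durations, the event indicators, and the choice of Kaplan-Meier estimation are assumptions for demonstration, not the study's actual data or stated method.

    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(0)

    # Synthetic follow-up times (years) and MACE indicators for 27 patients;
    # purely illustrative, not the study data.
    durations = rng.uniform(0.5, 16.0, size=27)
    events = rng.binomial(1, 0.2, size=27)

    kmf = KaplanMeierFitter()
    kmf.fit(durations, event_observed=events, label="MACE-free survival")

    # Estimated probability of remaining free of MACE at 5 years.
    print(float(kmf.predict(5.0)))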
Abstract:
OBJECTIVE Little information is available on the early course of hypertension in type 1 diabetes. The aim of our study, therefore, was to document circadian blood pressure profiles in patients with a diabetes duration of up to 20 years and relate daytime and nighttime blood pressure to duration of diabetes, BMI, insulin therapy, and HbA1c. RESEARCH DESIGN AND METHODS Ambulatory profiles of 24-h blood pressure were recorded in 354 pediatric patients with type 1 diabetes (age 14.6 ± 4.2 years, duration of diabetes 5.6 ± 5.0 years, follow-up for up to 9 years). A total of 1,011 profiles were available for analysis from patients not receiving antihypertensive medication. RESULTS Although daytime mean systolic pressure was significantly elevated in diabetic subjects (+3.1 mmHg; P < 0.0001), daytime diastolic pressure was not different from the height- and sex-adjusted normal range (+0.1 mmHg, NS). In contrast, both systolic and diastolic nighttime values were clearly elevated (+7.2 and +4.2 mmHg; P < 0.0001), and nocturnal dipping was reduced (P < 0.0001). Systolic blood pressure was related to overweight in all patients, while diastolic blood pressure was related to metabolic control in young adults. Blood pressure variability was significantly lower in girls compared with boys (P < 0.01). During follow-up, no increase of blood pressure was noted; however, diastolic nocturnal dipping decreased significantly (P < 0.03). Mean daytime blood pressure was significantly related to office blood pressure (r = +0.54 for systolic and r = +0.40 for diastolic pressure); however, hypertension was confirmed by ambulatory blood pressure measurement in only 32% of patients with elevated office blood pressure. CONCLUSIONS During the early course of type 1 diabetes, daytime blood pressure is higher compared with that of healthy control subjects. The elevation of nocturnal values is even more pronounced and nocturnal dipping is reduced. The frequency of white-coat hypertension is high among adolescents with diabetes, and ambulatory blood pressure monitoring avoids unnecessary antihypertensive treatment.
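Nocturnal dipping, the reduction of which is reported above, is conventionally expressed as the percentage fall in mean blood pressure from daytime to nighttime, with a fall below about 10% often classed as "non-dipping". The short sketch below computes that quantity; the example readings are illustrative numbers, not values from the study.

    def nocturnal_dip_percent(day_mean, night_mean):
        # Percentage fall in mean blood pressure from daytime to nighttime.
        return (1.0 - night_mean / day_mean) * 100.0

    # Illustrative systolic means in mmHg (not study data).
    day_sbp, night_sbp = 118.0, 110.0
    dip = nocturnal_dip_percent(day_sbp, night_sbp)
    print(f"dip = {dip:.1f}% (a fall of <10% is commonly classed as non-dipping)")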
Abstract:
Aims The effects of a system based on minimally trained first responders (FR) dispatched simultaneously with the emergency medical services (EMS) of the local hospital in a mixed urban and rural area in Northwestern Switzerland were examined. Methods and results In this prospective study, 500 volunteer fire fighters received 4 hours of training in basic life support with automated external defibrillation (AED). FR and EMS were simultaneously dispatched in a two-tier rescue system. During the years 2001–2008, response times, resuscitation interventions and outcomes were monitored. 1334 emergencies were included. The FR reached the patients (mean age 60.4 ± 19 years; 65% male) within 6 ± 3 min after emergency calls compared to 12 ± 5 min for the EMS (p < 0.0001). Seventy-six percent of the 297 out-of-hospital cardiac arrests (OHCAs) occurred at home. Only 3 emergencies with resuscitation attempts occurred at the main railway station equipped with an on-site AED. FR were on the scene before arrival of the EMS in 1166 (87.4%) cases. Of these, the FR used an AED in 611 patients for monitoring or defibrillation. CPR was initiated by the FR in 164 cases (68.9% of 238 resuscitated patients). 124 patients were defibrillated, of whom 93 (75.0%) were defibrillated first by the FR. Eighteen patients (of whom 13 were defibrillated by the FR) were discharged from hospital in good neurological condition. Conclusions Minimally trained fire fighters integrated into an EMS as FR contributed substantially to an increase in the survival rate after OHCA in a mixed urban and rural area.
Abstract:
Background: Dementia is a multifaceted disorder that impairs cognitive functions, such as memory, language, and the executive functions necessary to plan, organize, and prioritize tasks required for goal-directed behaviors. In most cases, individuals with dementia experience difficulties interacting with physical and social environments. The purpose of this study was to establish the ecological validity and initial construct validity of a fire evacuation Virtual Reality Day-Out Task (VR-DOT) environment, based on performance profiles, as a screening tool for early dementia. Objective: The objectives were (1) to examine the relationships among the performances of 3 groups of participants in the VR-DOT and traditional neuropsychological tests employed to assess executive functions, and (2) to compare the performance of participants with mild Alzheimer’s-type dementia (AD) to that of participants with amnestic single-domain mild cognitive impairment (MCI) and healthy controls in the VR-DOT and traditional neuropsychological tests used to assess executive functions. We hypothesized that the 2 cognitively impaired groups would have distinct performance profiles and show significantly impaired independent functioning in activities of daily living (ADL) compared to the healthy controls. Methods: The study population included 3 groups: 72 healthy control elderly participants, 65 amnestic MCI participants, and 68 mild AD participants. A natural user interface framework based on a fire evacuation VR-DOT environment was used to assess the physical and cognitive abilities of seniors over 3 years. VR-DOT focuses on the subtle errors and patterns in performing everyday activities and has the advantage of not depending on a subjective rating by an individual person. We further assessed functional capacity with neuropsychological tests (including measures of attention, memory, working memory, executive functions, language, and depression). We also evaluated finger tapping, grip strength, stride length, gait speed, and chair stands, both separately and while participants performed VR-DOTs, in order to correlate these measures with VR-DOT performance, because performance while navigating a virtual environment is a valid and reliable indicator of cognitive decline in elderly persons. Results: The mild AD group was more impaired than the amnestic MCI group, and both were more impaired than healthy controls. The novel VR-DOT functional index correlated strongly with standard cognitive and functional measurements, such as Mini-Mental State Examination (MMSE; rho=0.26, P=.01) and Bristol Activities of Daily Living (ADL) scale scores (rho=0.32, P=.001). Conclusions: Functional impairment is a defining characteristic of predementia and is partly dependent on the degree of cognitive impairment. The novel virtual reality measures of functional ability seem more sensitive to functional impairment than qualitative measures in predementia, thus accurately differentiating affected individuals from healthy controls. We conclude that VR-DOT is an effective tool for discriminating predementia and mild AD from controls by detecting differences in terms of errors, omissions, and perseverations while measuring ADL functional ability.
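The correlations reported for the VR-DOT functional index (rho=0.26 with MMSE, rho=0.32 with the Bristol ADL scale) are Spearman rank correlations. The sketch below shows how such a coefficient and its p-value are obtained with scipy; the scores are synthetic and the variable names are hypothetical, so the output will not reproduce the published values.

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(42)
    n = 205  # 72 controls + 65 MCI + 68 mild AD participants

    # Synthetic, weakly related scores for illustration only.
    vr_dot_index = rng.normal(size=n)                              # hypothetical VR-DOT functional index
    mmse = 24 + 0.8 * vr_dot_index + rng.normal(scale=3, size=n)   # hypothetical MMSE scores

    rho, p_value = spearmanr(vr_dot_index, mmse)
    print(f"Spearman rho = {rho:.2f}, P = {p_value:.3f}")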
Abstract:
IMPORTANCE This study addresses the value of patients' reported symptoms as markers of tumor recurrence after definitive therapy for head and neck squamous cell carcinoma. OBJECTIVE To evaluate the correlation between patients' symptoms and objective findings in the diagnosis of local and/or regional recurrences of head and neck squamous cell carcinomas in the first 2 years of follow-up. DESIGN Retrospective single-institution study of a prospectively collected database. SETTING Regional hospital. PARTICIPANTS We reviewed the clinical records of patients treated for oral cavity, oropharyngeal, laryngeal, and hypopharyngeal carcinomas between January 1, 2008, and December 31, 2009, with a minimum follow-up of 2 years. MAIN OUTCOMES AND MEASURES Correlation between symptoms and oncologic status (recurrence vs remission) in the posttreatment period. RESULTS Of the 101 patients included, 30 had recurrences. Pain, odynophagia, and dysphonia were independently correlated with recurrence (odds ratios, 16.07, 11.20, and 5.90, respectively; P < .001). New-onset symptoms had the best correlation with recurrences. Correlation was better at 6 to 12 and 18 to 21 months after therapy and in patients initially treated unimodally (P < .05). Primary stage and tumor site had no effect. CONCLUSIONS AND RELEVANCE The correlation between symptoms and oncologic status is low during substantial periods within the first 2 years of follow-up. New-onset symptoms, especially pain, odynophagia, or dysphonia, correlate better with tumor recurrence, especially in patients treated unimodally.
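The symptom-recurrence associations above are reported as odds ratios. The sketch below computes an odds ratio and a Wald 95% confidence interval from a 2x2 table of symptom presence by oncologic status; the counts are placeholders for illustration and are not taken from the study.

    import math

    def odds_ratio(a, b, c, d):
        # 2x2 table:            recurrence   remission
        #   symptom present          a            b
        #   symptom absent           c            d
        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - 1.96 * se_log_or)
        hi = math.exp(math.log(or_) + 1.96 * se_log_or)
        return or_, lo, hi

    # Placeholder counts only (e.g., pain vs recurrence among 101 patients).
    print(odds_ratio(18, 10, 12, 61))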
Abstract:
OBJECTIVES In 2003 the International Breast Cancer Study Group (IBCSG) initiated the TEXT and SOFT randomized phase III trials to answer two questions concerning adjuvant treatment for premenopausal women with endocrine-responsive early breast cancer: (1) What is the role of aromatase inhibitors (AI) for women treated with ovarian function suppression (OFS)? (2) What is the role of OFS for women who remain premenopausal and are treated with tamoxifen? METHODS TEXT randomized patients to receive exemestane or tamoxifen with OFS. SOFT randomized patients to receive exemestane with OFS, tamoxifen with OFS, or tamoxifen alone. Treatment was for 5 years from randomization. RESULTS TEXT and SOFT successfully met their enrollment goals in 2011. The 5738 enrolled women had lower-risk disease and lower observed disease-free survival (DFS) event rates than anticipated. Consequently, 7 and 13 additional years of follow-up for TEXT and SOFT, respectively, were required to reach the targeted DFS events (median follow-up about 10.5 and 15 years). To provide timely answers, protocol amendments in 2011 specified analyses based on chronological time and median follow-up. To assess the AI question, exemestane + OFS versus tamoxifen + OFS, a combined analysis of TEXT and SOFT became the primary analysis (n = 4717). The OFS question became the primary analysis from SOFT, assessing the unique comparison of tamoxifen + OFS versus tamoxifen alone (n = 2045). The first reports are anticipated in mid- and late-2014. CONCLUSIONS We present the original designs of TEXT and SOFT and adaptations to ensure timely answers to two questions concerning optimal adjuvant endocrine treatment for premenopausal women with endocrine-responsive breast cancer. Trial Registration: TEXT, Clinicaltrials.gov NCT00066703; SOFT, Clinicaltrials.gov NCT00066690.
Abstract:
PURPOSE In patients with hormone-dependent postmenopausal breast cancer, standard adjuvant therapy involves 5 years of the nonsteroidal aromatase inhibitors anastrozole and letrozole. The steroidal inhibitor exemestane is partially non-cross-resistant with nonsteroidal aromatase inhibitors, is a mild androgen, and could prove superior to anastrozole regarding efficacy and toxicity, specifically with less bone loss. PATIENTS AND METHODS We designed an open-label, randomized, phase III trial of 5 years of exemestane versus anastrozole, with a two-sided test of superiority to detect a 2.4% improvement with exemestane in 5-year event-free survival (EFS). Secondary objectives included assessment of overall survival, distant disease-free survival, incidence of contralateral new primary breast cancer, and safety. RESULTS In the study, 7,576 women (median age, 64.1 years) were enrolled. At a median follow-up of 4.1 years, 4-year EFS was 91% for exemestane and 91.2% for anastrozole (stratified hazard ratio, 1.02; 95% CI, 0.87 to 1.18; P = .85). Overall survival, distant disease-free survival, and disease-specific survival were also similar. In all, 31.6% of patients discontinued treatment as a result of adverse effects, concomitant disease, or study refusal. Osteoporosis/osteopenia, hypertriglyceridemia, vaginal bleeding, and hypercholesterolemia were less frequent on exemestane, whereas mild liver function abnormalities and rare episodes of atrial fibrillation were less frequent on anastrozole. Vasomotor and musculoskeletal symptoms were similar between arms. CONCLUSION This first comparison of steroidal and nonsteroidal classes of aromatase inhibitors showed neither to be superior in terms of breast cancer outcomes as 5-year initial adjuvant therapy for postmenopausal breast cancer by two-way test. Less toxicity on bone is compatible with one hypothesis behind MA.27 but requires confirmation. Exemestane should be considered another option as up-front adjuvant therapy for postmenopausal hormone receptor-positive breast cancer.
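The primary comparison above is summarized as a stratified hazard ratio of 1.02 (95% CI, 0.87 to 1.18; P = .85). Because a Wald interval on the log scale implies the standard error, those three published numbers are enough to recover an approximate z statistic and two-sided p-value, as sketched below; the result is only approximate because the published estimates are rounded and the trial's actual test was stratified.

    import math
    from scipy.stats import norm

    hr, ci_lo, ci_hi = 1.02, 0.87, 1.18   # values reported in the abstract

    # Width of the 95% CI on the log scale equals 2 * 1.96 * SE(log HR).
    se_log_hr = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)
    z = math.log(hr) / se_log_hr
    p_two_sided = 2 * (1 - norm.cdf(abs(z)))

    print(f"SE(log HR) ~ {se_log_hr:.3f}, z ~ {z:.2f}, two-sided P ~ {p_two_sided:.2f}")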
Abstract:
We examined the effects of ostracism in early adolescent populations using the cyberball paradigm (Williams, Cheung, & Choi, 2000). Ninety-one Swiss school students, aged 10–14 years, were randomly assigned to the ostracism (24 girls, 23 boys) or the inclusion (23 girls, 21 boys) condition and were led to believe that they were playing cyberball with two other same-sex students. In reality, the other players were computer-generated confederates. We assessed self-reported levels of mood before and after playing the game, as well as sense of belonging, self-esteem, meaningful existence, and control after the game. Compared to nonostracized students, adolescents in the ostracism condition reported significantly lower levels of positive mood after playing the game. Furthermore, they reported a lower sense of belonging and lower levels of self-esteem, meaningful existence, and control. The present results from a non-English-speaking sample correspond well to the few earlier findings in adolescent and adult populations by suggesting that even brief periods of ostracism by unknown others can lead to a significant decrease in well-being in these age groups.
Abstract:
In this introductory paper we summarize the history and achievements of the Potrok Aike maar lake Sediment Archive Drilling prOject (PASADO), an interdisciplinary project embedded in the International Continental Scientific Drilling Program (ICDP). The stringent multiproxy approach adopted in this research, combined with radiocarbon and luminescence dating, provided the opportunity to synthesize a large body of hydrologically relevant data from Laguna Potrok Aike (southern Patagonia, Argentina). At this site, lake level was high from 51 ka until the early Holocene, when the Southern Hemisphere Westerlies (SHW) were located further to the north. At 9.3 ka cal. BP the SHW moved southward and over the latitude of the study area (52° S), causing a pronounced negative water balance with a lake-level decrease of more than 50 m. Two millennia later, the SHW diminished in intensity and lake level rose to a subsequent maximum during the Little Ice Age. Since the 20th century, a strengthening of the SHW has increased the evaporative stress, resulting in a more negative water balance. A comparison of our data with other hydrological fluctuations at a regional scale in south-eastern Patagonia provides new insights and also calls for better chronologies and high-resolution records of climate variability.
Abstract:
The aim of this study was (1) to examine whether childhood BMI is a significant predictor of restrained eating in preadolescents, (2) to investigate gender differences in restrained and emotional eating, and (3) to determine whether emotional problems and body esteem were related to eating problems of preadolescents. In this longitudinal study with two measurement points, data from 428 children (50% female) were used. At time 1 (t1), children were on average 5.9 years old. BMI was assessed using objective measures. At time 2 (t2), participants were 12 years old. The adolescents and their parents completed questionnaires assessing restrained and emotional eating, body esteem, emotional problems, and BMI. Multiple regression analysis showed that restrained eating was significantly predicted by t1 BMI, by change in BMI between t1 and t2, and by t2 body esteem. Emotional eating was, as expected, not predicted by t1 BMI, but was associated with t2 body esteem and t2 emotional problems. Gender was not a significant predictor. The stability of BMI between childhood and preadolescence and its ability to predict restrained eating suggest that it is important to start prevention of overweight, body dissatisfaction and disordered eating at an early age.
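The analysis described above is a multiple regression of restrained eating on t1 BMI, the change in BMI between t1 and t2, t2 body esteem, and gender. The sketch below fits such a model with statsmodels on synthetic data; the variable names, the data-generating values, and the exact predictor set are assumptions for illustration, not the study's dataset or full model.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 428  # sample size reported in the abstract

    # Synthetic data for illustration only.
    df = pd.DataFrame({
        "bmi_t1": rng.normal(15.5, 1.8, n),
        "bmi_change": rng.normal(3.0, 1.5, n),
        "body_esteem_t2": rng.normal(0.0, 1.0, n),
        "gender": rng.integers(0, 2, n),
    })
    df["restrained_eating"] = (
        0.15 * df["bmi_t1"] + 0.20 * df["bmi_change"]
        - 0.30 * df["body_esteem_t2"] + rng.normal(0.0, 1.0, n)
    )

    model = smf.ols(
        "restrained_eating ~ bmi_t1 + bmi_change + body_esteem_t2 + gender",
        data=df,
    ).fit()
    print(model.summary())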
Abstract:
This study uses the widths, the spacing and the grain-size pattern of Oligo/Miocene alluvial fan conglomerates in the central segment of the Swiss Alpine foreland to reconstruct the topographic development of the Alps. These data are analysed with models of longitudinal stream profile development, to propose that the Alpine topography evolved from an early transient state where streams adjusted to rock uplift by headward retreat, to a mature phase where any changes in rock uplift were accommodated by vertical incision. The first stage comprises the time interval between ca 31 Ma and 22 Ma, when the Alpine streams deposited many small fans with a lateral spacing of <30 km in the north Alpine foreland. As the range evolved, the streams joined and the fans coalesced into a few large depositional systems with a lateral spacing of ca 80 to 100 km at 22 Ma. The models used here suggest that the overall elevation of the Alps increased rapidly within <5 Myr. The variability in pebble size increased either due to variations in sediment supply, enhanced orographic effects, or preferentially due to a change towards a stormier palaeoclimate. By 22 Ma, only two large rivers carried material into the foreland fans, suggesting that the major Alpine streams had established themselves. This second phase of stable drainage network was maintained until ca 5 Ma, when the uplift and erosion of the Molasse started and streams were redirected both in the Alps and in the foreland. This study illustrates that sedimentological archives of foreland basins can be used to reconstruct the chronology of the topographic development of mountain belts. It is suggested that the finite elevation of mountainous landscapes is reached early during orogeny and can be maintained for millions of years, provided that erosion is efficient.
Abstract:
Hereditary breast and ovarian cancer (HBOC) is caused by a mutation in the BRCA1 or BRCA2 genes. Women with a BRCA1/2 mutation are at increased risk for breast and ovarian cancer and often develop cancer at an earlier age than the general population. However, some women with a BRCA1/2 mutation do not develop breast or ovarian cancer under the age of 50 years. There have been no specific studies of BRCA-positive women with no cancer prior to age 50; therefore, this study investigated these women with respect to reproductive risk factors, BMI, tumor pathology, screening history, risk-reducing surgeries, and family history. In the cohort studied, 241 women were diagnosed with cancer prior to age 50, 92 were diagnosed with cancer at age 50 or older, and 20 women were over age 50 with no cancer. Data were stratified based on BRCA1 and BRCA2 mutation status. Within the cohorts we investigated differences between women who developed cancer prior to age 50 and those who developed cancer at age 50 or older. We also investigated the differences between women who developed cancer at age 50 or older and those who were age 50 or older with no cancer. Of the 92 women with a BRCA1/2 mutation who developed cancer at age 50 or older, 46 developed ovarian cancer first, 45 developed breast cancer, and one had breast and ovarian cancer diagnosed synchronously. BRCA2 carriers diagnosed at age 50 or older were more likely to have ER/PR-negative breast tumors than BRCA2 carriers diagnosed before age 50. This is consistent with one other study that has been performed. Ashkenazi Jewish women with a BRCA1 mutation were more likely to be diagnosed at age 50 or older than women of other ethnicities. Hispanic women with a BRCA2 mutation were more likely to be diagnosed prior to age 50 than women of other ethnicities. No differences in reproductive factors or BMI were observed. Further characterization of BRCA-positive women with no cancer prior to age 50 may aid in identifying factors important in the development of breast or ovarian cancer.
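Several of the group contrasts above (for example, ER/PR-negative tumors in BRCA2 carriers diagnosed at age 50 or older versus before 50) are comparisons of proportions between two groups. One standard way to test such a difference is Fisher's exact test, sketched below with scipy; the counts are placeholders and the study's actual statistical tests are not specified in the abstract.

    from scipy.stats import fisher_exact

    # Rows: BRCA2 carriers diagnosed at >=50 vs <50.
    # Columns: ER/PR-negative vs ER/PR-positive tumors.
    # Placeholder counts for illustration only.
    table = [[12, 18],
             [20, 90]]

    odds_ratio, p_value = fisher_exact(table)
    print(f"OR = {odds_ratio:.2f}, P = {p_value:.3f}")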
Abstract:
The lithostratigraphic framework of Lake Van, eastern Turkey, has been systematically analysed to document the sedimentary evolution and the environmental history of the lake during the past ca 600,000 years. The lithostratigraphy and chemostratigraphy of a 219 m long drill core from Lake Van serve to separate global climate oscillations from local factors caused by tectonic and volcanic activity. An age model was established based on the climatostratigraphic alignment of chemical and lithological signatures, validated by 40Ar/39Ar ages. The drilled sequence consists of ca 76% lacustrine carbonaceous clayey silt, ca 2% fluvial deposits, ca 17% volcaniclastic deposits and 5% gaps. Six lacustrine lithotypes were separated from the fluvial and event deposits, such as volcaniclastics (ca 300 layers) and graded beds (ca 375 layers), and their depositional environments are documented. These lithotypes are: (i) graded beds frequently intercalated with varved clayey silts, reflecting rising lake levels during the terminations; (ii) varved clayey silts, reflecting strong seasonality and an intralake oxic–anoxic boundary, for example, lake-level highstands during interglacials/interstadials; (iii) CaCO3-rich banded sediments, representative of a lowering of the oxic–anoxic boundary, for example, lake-level decreases during glacial inceptions; (iv) CaCO3-poor banded and mottled clayey silts, reflecting an oxic–anoxic boundary close to the sediment–water interface, for example, lake-level lowstands during glacials/stadials; (v) diatomaceous muds, deposited during the initial stage of the lake as a freshwater system; and (vi) fluvial sands and gravels, indicating the initial flooding of the lake basin. The recurrence of lithologies (i) to (iv) follows the past five glacial/interglacial cycles. A 20 m thick disturbed unit reflects an interval of major tectonic activity in Lake Van at ca 414 ka BP.
Abstract:
In recent years, scholars have identified Early Iron Age Kinneret as belonging either to the kingdom of Geshur or at least as being part of an early Aramaean polity. It is the purpose of this paper to reexamine the archaeological evidence for such an assumption and to critically test the currently available data against this hypothesis.