94 results for Josefo, Flávio, 37 or 8-ca. 100

at Université de Lausanne, Switzerland


Relevance: 100.00%

Publisher:

Abstract:

This study aimed to compare oxygen uptake (V̇O2), hormone and plasma metabolite responses during the 30 min after submaximal incremental exercise (Incr) performed at the same relative/absolute exercise intensity and duration in lean (L) and obese (O) men. Eight L and 8 O men (BMI: 22.9±0.4; 37.2±1.8 kg·m(-2)) completed Incr and were then seated for 30 min. V̇O2 was monitored during the first 10 min and from the 25th to the 30th minute of recovery. Blood samples were drawn for the determination of hormone (catecholamines, insulin) and plasma metabolite (NEFA, glycerol) concentrations. Excess post-exercise oxygen consumption (EPOC) magnitude during the first 10 min was similar in O and in L (3.5±0.4; 3.4±0.3 liters, respectively, p=0.86). When normalized to percent change (V̇O2END=100%), %V̇O2END during recovery was significantly higher from 90-120 s in O than in L (p≤0.04). There were no significant differences in catecholamines (p≥0.24), whereas insulin was significantly higher in O than in L during recovery (p=0.01). The time-course of glycerol was similar from 10-30 min of recovery (-42% for L; -41% for O, p=0.85), whereas significantly different patterns of NEFA were found from 10-30 min of recovery between groups (-18% for L; +8% for O, p=0.03). Despite similar EPOC, a difference in V̇O2 modulation between groups was observed, likely due to faster initial rates of V̇O2 decline in L than in O. The different patterns of NEFA between groups may suggest a lower NEFA reesterification during recovery in O, which was not involved in the rapid EPOC component.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: We investigated the incidence and outcome of progressive multifocal leukoencephalopathy (PML) in human immunodeficiency virus (HIV)-infected individuals before and after the introduction of combination antiretroviral therapy (cART) in 1996. METHODS: From 1988 through 2007, 226 cases of PML were reported to the Swiss HIV Cohort Study. By chart review, we confirmed 186 cases and recorded all-cause and PML-attributable mortality. For the survival analysis, 25 patients with postmortem diagnosis and 2 without CD4+ T cell counts were excluded, leaving a total of 159 patients (89 before 1996 and 70 during 1996-2007). RESULTS: The incidence rate of PML decreased from 0.24 cases per 100 patient-years (PY; 95% confidence interval [CI], 0.20-0.29 cases per 100 PY) before 1996 to 0.06 cases per 100 PY (95% CI, 0.04-0.10 cases per 100 PY) from 1996 onward. Patients who received a diagnosis before 1996 had a higher frequency of prior acquired immunodeficiency syndrome-defining conditions (P = .007) but similar CD4+ T cell counts (60 vs. 71 cells/microL; P = .25), compared with patients who received a diagnosis during 1996 or thereafter. The median time to PML-attributable death was 71 days (interquartile range, 44-140 days), compared with 90 days (interquartile range, 54-313 days) for all-cause mortality. The PML-attributable 1-year mortality rate decreased from 82.3 cases per 100 PY (95% CI, 58.8-115.1 cases per 100 PY) during the pre-cART era to 37.6 cases per 100 PY (95% CI, 23.4-60.5 cases per 100 PY) during the cART era. In multivariate models, cART was the only factor associated with lower PML-attributable mortality (hazard ratio, 0.18; 95% CI, 0.07-0.50; P < .001), whereas all-cause mortality was associated with baseline CD4+ T cell count (hazard ratio per increase of 100 cells/microL, 0.52; 95% CI, 0.32-0.85; P = .010) and cART use (hazard ratio, 0.37; 95% CI, 0.19-0.75; P = .006).
CONCLUSIONS: cART reduced the incidence and PML-attributable 1-year mortality, regardless of baseline CD4+ T cell count, whereas overall mortality was dependent on cART use and baseline CD4+ T cell count.
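The incidence figures quoted above (cases per 100 person-years with 95% CIs) follow a standard epidemiological calculation. A minimal sketch, assuming a simple Wald interval on the log-rate scale; the exact CI method of the cited study is not stated here, and the function name and example counts are hypothetical:

```python
import math

def incidence_per_100py(cases, person_years, z=1.96):
    """Incidence rate per 100 person-years with an approximate 95% CI.

    Uses a Wald interval on the log-rate scale (SE of log rate ~ 1/sqrt(cases)),
    which assumes cases > 0. Generic epidemiology sketch, not the cohort
    study's actual code.
    """
    rate = cases * 100.0 / person_years
    se_log = 1.0 / math.sqrt(cases)
    lower = rate * math.exp(-z * se_log)
    upper = rate * math.exp(z * se_log)
    return rate, lower, upper

# Hypothetical illustration: 12 cases observed over 5000 person-years
rate, lo, hi = incidence_per_100py(12, 5000)
# ≈ 0.24 (0.14-0.42) cases per 100 PY
```

Narrower intervals require more cases: with few events, the 1/sqrt(cases) term dominates and the CI spans a several-fold range around the point estimate, as in the post-1996 estimate above.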

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Adverse effects of combination antiretroviral therapy (CART) commonly result in treatment modification and poor adherence. METHODS: We investigated predictors of toxicity-related treatment modification during the first year of CART in 1318 antiretroviral-naive human immunodeficiency virus (HIV)-infected individuals from the Swiss HIV Cohort Study who began treatment between January 1, 2005, and June 30, 2008. RESULTS: The total rate of treatment modification was 41.5 (95% confidence interval [CI], 37.6-45.8) per 100 person-years. Of these, switches or discontinuations because of drug toxicity occurred at a rate of 22.4 (95% CI, 19.5-25.6) per 100 person-years. The most frequent toxic effects were gastrointestinal tract intolerance (28.9%), hypersensitivity (18.3%), central nervous system adverse events (17.3%), and hepatic events (11.5%). In the multivariate analysis, combined zidovudine and lamivudine (hazard ratio [HR], 2.71 [95% CI, 1.95-3.83]; P < .001), nevirapine (1.95 [1.01-3.81]; P = .050), comedication for an opportunistic infection (2.24 [1.19-4.21]; P = .01), advanced age (1.21 [1.03-1.40] per 10-year increase; P = .02), female sex (1.68 [1.14-2.48]; P = .009), nonwhite ethnicity (1.71 [1.18-2.47]; P = .005), higher baseline CD4 cell count (1.19 [1.10-1.28] per 100/microL increase; P < .001), and HIV-RNA of more than 5.0 log(10) copies/mL (1.47 [1.10-1.97]; P = .009) were associated with higher rates of treatment modification. Almost 90% of individuals with treatment-limiting toxic effects were switched to a new regimen, and 85% achieved virologic suppression to less than 50 copies/mL at 12 months compared with 87% of those continuing CART (P = .56). CONCLUSIONS: Drug toxicity remains a frequent reason for treatment modification; however, it does not affect treatment success. Close monitoring and management of adverse effects and drug-drug interactions are crucial for the durability of CART.

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVES: Direct-acting antiviral agents (DAAs) have become the standard of care for the treatment of chronic hepatitis C virus (HCV) infection. We aimed to assess treatment uptake and efficacy in routine clinical settings among HIV/HCV coinfected patients after the introduction of the first generation DAAs. METHODS: Data on all Swiss HIV Cohort Study (SHCS) participants starting HCV protease inhibitor (PI) treatment between September 2011 and August 2013 were collected prospectively. The uptake and efficacy of HCV therapy were compared with those in the time period before the availability of PIs. RESULTS: Upon approval of PI treatment in Switzerland in September 2011, 516 SHCS participants had chronic HCV genotype 1 infection. Of these, 57 (11%) started HCV treatment during the following 2 years with either telaprevir, faldaprevir or boceprevir. Twenty-seven (47%) patients were treatment-naïve, nine (16%) were patients with relapse and 21 (37%) were partial or null responders. Twenty-nine (57%) had advanced fibrosis and 15 (29%) had cirrhosis. End-of-treatment virological response was 84% in treatment-naïve patients, 88% in patients with relapse and 62% in previous nonresponders. Sustained virological response was 78%, 86% and 40% in treatment-naïve patients, patients with relapse and nonresponders, respectively. Treatment uptake was similar before (3.8 per 100 patient-years) and after (6.1 per 100 patient-years) the introduction of PIs, while treatment efficacy increased considerably after the introduction of PIs. CONCLUSIONS: The introduction of PI-based HCV treatment in HIV/HCV-coinfected patients improved virological response rates, while treatment uptake remained low. Therefore, the introduction of PIs into the clinical routine was beneficial at the individual level, but had only a modest effect on the burden of HCV infection at the population level.

Relevance: 100.00%

Publisher:

Abstract:

Experiments were conducted with adult male rats to investigate the effects of dietary calcium (Ca) restriction upon intake and tissue distribution of cadmium (Cd), and Cd-metallothionein (Mt) synthesis. Four groups of animals were fed either a low-Ca, semisynthetic diet (0.1% Ca) or the same diet supplemented with 0.8% Ca (normal diet). The caloric intake was similar in all groups. Two groups (low-Ca and normal diet) were used as controls, and two groups (low-Ca and normal diet) received 100 mg/l Cd (as CdCl2) in drinking water. Cd levels in liver, kidney, spleen and red cells were measured in all animals after 8 weeks of treatment. Concomitantly, Mt levels in plasma, liver and kidney were evaluated by radioimmunoassay. Ca deficiency entailed marked and significant increases in accumulation of Cd and synthesis of Mt in all assayed tissues. It is concluded that dietary Ca restriction, independent of caloric intake, enhances Cd intestinal absorption and tissue accumulation, which is followed by increased tissue Mt synthesis.

Relevance: 100.00%

Publisher:

Abstract:

Continental-scale assessments of 21st century global impacts of climate change on biodiversity have forecasted range contractions for many species. These coarse resolution studies are, however, of limited relevance for projecting risks to biodiversity in mountain systems, where pronounced microclimatic variation could allow species to persist locally, and are ill-suited for assessment of species-specific threat in particular regions. Here, we assess the impacts of climate change on 2632 plant species across all major European mountain ranges, using high-resolution (ca. 100 m) species samples and data expressing four future climate scenarios. Projected habitat loss is greater for species distributed at higher elevations; depending on the climate scenario, we find 36-55% of alpine species, 31-51% of subalpine species and 19-46% of montane species lose more than 80% of their suitable habitat by 2070-2100. While our high-resolution analyses consistently indicate marked levels of threat to cold-adapted mountain florae across Europe, they also reveal unequal distribution of this threat across the various mountain ranges. Impacts on florae from regions projected to undergo increased warming accompanied by decreased precipitation, such as the Pyrenees and the Eastern Austrian Alps, will likely be greater than on florae in regions where the increase in temperature is less pronounced and rainfall increases concomitantly, such as in the Norwegian Scandes and the Scottish Highlands. This suggests that change in precipitation, not only warming, plays an important role in determining the potential impacts of climate change on vegetation.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Antiretroviral compounds have been predominantly studied in human immunodeficiency virus type 1 (HIV-1) subtype B, but only ~10% of infections worldwide are caused by this subtype. The analysis of the impact of different HIV subtypes on treatment outcome is important. METHODS: The effect of HIV-1 subtype B and non-B on the time to virological failure while taking combination antiretroviral therapy (cART) was analyzed. Other studies that have addressed this question were limited by the strong correlation between subtype and ethnicity. Our analysis was restricted to white patients from the Swiss HIV Cohort Study who started cART between 1996 and 2009. Cox regression models were performed; adjusted for age, sex, transmission category, first cART, baseline CD4 cell counts, and HIV RNA levels; and stratified for previous mono/dual nucleoside reverse-transcriptase inhibitor treatment. RESULTS: Included in our study were 4729 patients infected with subtype B and 539 with non-B subtypes. The most prevalent non-B subtypes were CRF02_AG (23.8%), A (23.4%), C (12.8%), and CRF01_AE (12.6%). The incidence of virological failure was higher in patients with subtype B (4.3 failures/100 person-years; 95% confidence interval [CI], 4.0-4.5) compared with non-B (1.8 failures/100 person-years; 95% CI, 1.4-2.4). Cox regression models confirmed that patients infected with non-B subtypes had a lower risk of virological failure than those infected with subtype B (univariable hazard ratio [HR], 0.39 [95% CI, .30-.52; P < .001]; multivariable HR, 0.68 [95% CI, .51-.91; P = .009]). In particular, subtypes A and CRF02_AG revealed improved outcomes (multivariable HR, 0.54 [95% CI, .29-.98] and 0.39 [95% CI, .19-.79], respectively). CONCLUSIONS: Improved virological outcomes among patients infected with non-B subtypes invalidate concerns that these individuals are at a disadvantage because drugs have been designed primarily for subtype B infections.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Adherence to guidelines is associated with improved outcomes of patients with acute coronary syndrome (ACS). Clinical registries developed to assess quality of care at discharge often do not collect the reasons for non-prescription of proven efficacious preventive medication in Continental Europe. In a prospective cohort of patients hospitalized for an ACS, we aimed at measuring the rate of recommended treatment at discharge, using pre-specified quality indicators recommended in cardiologic guidelines and including systematic collection of reasons for non-prescription of preventive medications. METHODS: In a prospective cohort with 1260 patients hospitalized for ACS, we measured the rate of recommended treatment at discharge in 4 academic centers in Switzerland. Performance measures for medication at discharge were pre-specified according to guidelines, systematically collected for all patients and included in a centralized database. RESULTS: Six hundred and eighty-eight patients (54.6%) were discharged with a main diagnosis of STEMI, 491 (39%) of NSTEMI and 81 (6.4%) of unstable angina. Mean age was 64 years and 21.3% were women. 94.6% were prescribed angiotensin converting enzyme inhibitors/angiotensin II receptor blockers at discharge when only considering raw prescription rates, but this increased to 99.5% when including reasons for non-prescription. For statins, rates increased from 98% to 98.6% when including reasons for non-prescription, and for beta-blockers, from 82% to 93%. For aspirin, rates further increased from 99.4% to 100%, and from 99.8% to 100% for P2Y12 inhibitors. CONCLUSIONS: We found a very high adherence to ACS guidelines for drug prescriptions at discharge when including reasons for non-prescription of drug therapy. For beta-blockers, prescription rates were suboptimal, even after taking into account reasons for non-prescription.
In an era of improving quality of care to achieve 100% prescription rates at discharge unless contra-indicated, pre-specification of reasons for non-prescription of cardiovascular preventive medication makes it possible to identify remaining gaps in quality of care at discharge. TRIAL REGISTRATION: ClinicalTrials.gov NCT01000701.
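The distinction between raw and reason-adjusted prescription rates reduces to simple proportions. A minimal sketch, with a hypothetical helper and hypothetical denominators chosen to reproduce the beta-blocker figures quoted above; this is not code from the cited registry:

```python
def prescription_rates(prescribed, documented_reason, total):
    """Raw vs reason-adjusted prescription rates at discharge (percent).

    The adjusted rate counts patients with a documented reason for
    non-prescription (e.g. contraindication) as appropriately managed.
    Hypothetical illustration only.
    """
    raw = prescribed * 100.0 / total
    adjusted = (prescribed + documented_reason) * 100.0 / total
    return raw, adjusted

# Hypothetical: beta-blockers in 1000 patients, 820 prescribed,
# 110 with a documented reason for non-prescription
raw, adj = prescription_rates(820, 110, 1000)
# raw = 82.0, adj = 93.0
```

The gap between the two rates isolates patients who neither received the drug nor had a documented reason, which is the residual quality-of-care gap the study measures.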

Relevance: 100.00%

Publisher:

Abstract:

Introduction: The use of bioabsorbable materials for orthopaedic use and traumatic fracture fixation in children has been poorly investigated in the literature, and the effects on growing bones seem contradictory. The aim of the study is to compare the clinical and radiological results and evolution between bioabsorbable and traditional K-wires for the treatment of elbow epiphyseal fractures in children.
Method: From Jan. 2008 to Dec. 2009, 21 children with similar fractures and ages were separated into two groups according to the mode of fracture fixation: a bioabsorbable K-wire group and a traditional K-wire group. Follow-up was done at 3, 6 and 12 months post-operatively. Range of motion and elbow stability were measured for all patients. The radiological evolution of the two groups was compared in terms of consolidation, osseous resorption and radiolucencies. The clinical results were compared according to the Mayo Elbow Performance score. The contralateral elbow was compared with the injured elbow in the two groups.
Results: In the bioabsorbable K-wire group there were 10 children, 5 girls and 5 boys, with an average age of 9.5 years (range 5 to 14 years). There were 7 external condylar fractures and 3 epitrochlear fractures. In the traditional K-wire group there were 11 children, 2 girls and 9 boys, with an average age of 7.6 years (range 4 to 14 years). There were 10 external condylar fractures and 1 epitrochlear fracture. At first follow-up, the Mayo Elbow Performance score was 93.8 (85-100) for the bioabsorbable K-wire group and 95.5 (85-100) for the traditional K-wire group. In two children from the bioabsorbable K-wire group there were transitory radiolucencies along the wire tract on the x-ray, without clinical manifestations. We did not see any premature closure of the growth cartilage.
Discussion: There are no significant differences in terms of clinical and radiological outcome between the two groups. The use of bioabsorbable pins seems to be a good alternative to removable traditional materials, avoiding a second operation.

Relevance: 100.00%

Publisher:

Abstract:

Sleep and wakefulness are complex behaviors that are influenced by many genetic and environmental factors, which are beginning to be discovered. The contribution of genetic components to sleep disorders is also increasingly recognized as important. Point mutations in the prion protein, period 2, and the prepro-hypocretin/orexin gene have been found as the cause of a few sleep disorders, but the possibility that other gene defects may contribute to the pathophysiology of major sleep disorders warrants in-depth investigation. However, single gene disorders are rare, and most common disorders are complex in terms of their genetic susceptibility, environmental effects, gene-gene, and gene-environment interactions. We review here the current progress in the genetics of normal and pathological sleep.

Relevance: 100.00%

Publisher:

Abstract:

Background and Aims: Discriminating irritable bowel syndrome (IBS) from inflammatory bowel disease (IBD) can be a clinical challenge as symptoms can overlap. We and others have recently shown that fecal calprotectin (FC) is more accurate for discriminating IBS from IBD compared to C-reactive protein (CRP) and blood leukocytes. Data on the biomarkers used in daily gastroenterological practice are lacking. We therefore aimed to assess which biomarkers are used by gastroenterologists in their daily practice for discriminating IBS from IBD.
Methods: A questionnaire was sent to all board-certified gastroenterologists in Switzerland, focusing on demographic information, the number of IBS patients treated in the period from May 2009 to April 2010, and the specific biomarkers evaluated for discriminating IBS from IBD.
Results: The response rate was 57% (153/270). Mean physician's age was 50±9 years and mean duration of gastroenterologic practice 14±8 years; 52% of respondents were working in private practice and 48% in hospitals. Thirty-nine percent had taken care of more than 100 IBS patients in the last 12 months, 37% had seen 41-100, and 24% had seen 1-40 IBS patients. Gastroenterologists in private practice more frequently took care of at least 40 IBS patients in a year compared to hospital-based gastroenterologists (P<0.001). The following biomarkers were determined for discriminating IBS from IBD: CRP 100%, FC 79%, hemogram (red blood cells and leukocytes) 70%, iron status (ferritin, transferrin saturation) 59%, erythrocyte sedimentation rate 2.7%, protein electrophoresis 0.7%, and alpha-1 antitrypsin clearance 0.7%. There was a trend toward using FC more often in private practice than in hospital (P = 0.08). Twenty-four percent of gastroenterologists had used FC in the workup of more than 70% of patients classified as IBS, 22% had used FC in 30-70% of IBS patients, 39% in less than 30%, and 15% had never used FC for the work-up of suspected IBS. Eighty-nine percent of gastroenterologists considered FC to be superior to CRP for discriminating IBS from IBD, 87% thought that patients' compliance with fecal sampling is high, and 51% judged the fee of USD 60 for an FC test as appropriate.
Conclusions: FC is widely used in clinical practice to discriminate IBS from IBD. In accordance with the scientific evidence, the majority of gastroenterologists consider FC to be more accurate than CRP for discriminating IBS from IBD. Gastroenterologists in private practice take care of significantly more IBS patients than colleagues in hospital.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Biliary tract cancer (BTC) is a rare cancer in Europe and North America, characterized by wide geographic variation, with high incidence in some areas of Latin America and Asia. MATERIALS AND METHODS: BTC mortality and incidence have been updated according to recent data, using joinpoint regression analysis. RESULTS: Since the 1980s, decreasing trends in BTC mortality rates (age-standardized, world standard population) were observed in the European Union as a whole, in Australia, Canada, Hong Kong, Israel, New Zealand, and the United States, and high-risk countries such as Japan and Venezuela. Joinpoint regression analysis indicates that decreasing trends were more favorable over recent calendar periods. High-mortality rates are, however, still evident in central and eastern Europe (4-5/100,000 women), Japan (4/100,000 women), and Chile (16.6/100,000 women). Incidence rates identified other high-risk areas in India (8.5/100,000 women), Korea (5.6/100,000 women), and Shanghai, China (5.2/100,000 women). CONCLUSIONS: The decreasing BTC mortality trends essentially reflect more widespread and earlier adoption of cholecystectomy in several countries, since gallstones are the major risk factor for BTC. There are, however, high-risk areas, mainly from South America and India, where access to gall-bladder surgery remains inadequate.

Relevance: 100.00%

Publisher:

Abstract:

The present study examined the bottom-up influence of emotional context on response inhibition, an issue that remains largely unstudied in children. Thus, 62 participants, aged from 6 to 13 years old, were assessed with three stop signal tasks: one with circles, one with neutral faces, and one with emotional faces (happy and sad). Results showed that emotional context altered response inhibition ability in childhood. However, no interaction between age and emotional influence on response inhibition was found. Positive emotions were recognized faster than negative emotions, but the valence did not have a significant influence on response inhibition abilities.

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Survival after pancreatic head adenocarcinoma surgery is determined by tumor characteristics, resection margins, and adjuvant chemotherapy. Few studies have analyzed the long-term impact of postoperative morbidity. The aim of the present study was to assess the impact of postoperative complications on long-term survival after pancreaticoduodenectomy for cancer. METHODS: Of 294 consecutive pancreatectomies performed between January 2000 and July 2011, a total of 101 pancreatic head resections for pancreatic ductal adenocarcinoma were retrospectively analyzed. Postoperative complications were classified on a five-grade validated scale and were correlated with long-term survival. Grade IIIb to IVb complications were defined as severe. RESULTS: Postoperative mortality and morbidity were 5 and 57 %, respectively. Severe postoperative complications occurred in 16 patients (16 %). Median overall survival was 1.4 years. Significant prognostic factors of survival were the N-stage of the tumor (median survival 3.4 years for N0 vs. 1.3 years for N1, p = 0.018) and R status of the resection (median survival 1.6 years for R0 vs. 1.2 years for R1, p = 0.038). Median survival after severe postoperative complications was decreased from 1.9 to 1.2 years (p = 0.06). Median survival for N0 or N1 tumor or after R0 resection was not influenced by the occurrence and severity of complications, but patients with a R1 resection and severe complications showed a worsened median survival of 0.6 vs. 2.0 years without severe complications (p = 0.0005). CONCLUSIONS: Postoperative severe morbidity per se had no impact on long-term survival except in patients with R1 tumor resection. These results suggest that severe complications after R1 resection predict poor outcome.

Relevance: 100.00%

Publisher:

Abstract:

ABSTRACT: More and more families are traveling to tropical destinations, exposing themselves to infectious agents and tropical diseases they do not encounter at home. We studied 157 children (0-16 years) and their parents departing for the tropics, all of whom attended a pre-travel clinic and were generally compliant with the advice given. The incidence rates of common illnesses in children and adults were 16.9 (14.3-19.7) and 15.1 (12.7-17.8) episodes/100 person-weeks, respectively. Diarrhea, abdominal pain and fever were the most frequent complaints. There was no significant difference in the incidence of morbid episodes between children and adults except for fever (more frequent in children). Most episodes occurred within the first ten days of travel. The similar incidence of morbidity in children and adults, together with the benign nature of the episodes, challenges the view that it is unwise to travel with young children.