959 results for Trost, Kirk
Abstract:
Ships’ protests have been used for centuries as legal documents to record and detail damages and to indemnify captains against fault. We use them in this article, along with data extracted through forensic synoptic analysis (McNally, 1994, 2004), to identify a tropical or subtropical system in the North Atlantic Ocean in 1785, and show that they are viable sources of meteorological information. By comparing a damaging storm in New England in 1996, which included an offshore tropical system, with the one reconstructed for 1785, we demonstrate that the tropical system identified in a ship’s protest played a significant role in the 1785 storm. With both forensic reconstruction and anecdotal evidence, we are able to assess that these storms are remarkably similar. The recurrence rate calculated in previous studies of the 1996 storm is 400–500 years. We suggest that reconstruction of additional years in the 1700s would provide the basis for a reanalysis of recurrence rates, with implications for future insurance and reinsurance rates. The application of this methodology to the new data source can also be used to extend the hurricane database in the North Atlantic basin, and elsewhere, much further back into history than is currently available.
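The 400–500-year recurrence rate quoted above can be read through standard return-period arithmetic. A minimal sketch, where the 450-year figure is an illustrative midpoint and the horizon is an assumed planning window, not values from the article:

```python
# Hypothetical illustration: converting a storm "return period" T (years)
# into occurrence probabilities. Standard return-period arithmetic, not
# an analysis from the article itself.

def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    """P(>=1 event in `horizon_years`) given annual probability 1/T."""
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** horizon_years

# With a 450-year return period, the chance of at least one such storm
# in a 100-year window comes out to roughly 20%.
print(round(prob_at_least_one(450, 100), 3))
```

This is why reanalysing recurrence rates matters for (re)insurance: a modest change in T shifts the century-scale exceedance probability appreciably.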
Abstract:
High-brightness electron sources are of great importance for the operation of hard X-ray free-electron lasers. Field emission cathodes based on double-gate metallic field emitter arrays (FEAs) can potentially offer higher brightness than the currently used sources. We report on the successful application of electron beam lithography to the fabrication of large-scale single-gate as well as double-gate FEAs. We demonstrate operational high-density single-gate FEAs with sub-micron pitch and a total number of tips of up to 10⁶, as well as large-scale double-gate FEAs with large collimation gate apertures. Details of the design, the fabrication procedure and successful measurements of the emission current from the single- and double-gate cathodes are presented.
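The quoted tip count follows directly from the array pitch and footprint. A small sketch of that arithmetic, where the specific dimensions (a 1 mm square array at 1 µm pitch) are illustrative assumptions, not values from the paper:

```python
# Hedged sketch: relating array pitch to total emitter count. The
# "up to 10^6 tips" figure is consistent with, e.g., a 1 mm x 1 mm
# array at 1 um pitch; these dimensions are assumptions for
# illustration only. Working in integer micrometres keeps it exact.

def tip_count(array_side_um: int, pitch_um: int) -> int:
    """Number of emitters in a square array of given side and pitch."""
    per_side = array_side_um // pitch_um
    return per_side * per_side

assert tip_count(1000, 1) == 1_000_000  # sub-micron-pitch regime: 10^6 tips
```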
Abstract:
Patients suffering from bipolar affective disorder show deficits in working memory functions. In a previous functional magnetic resonance imaging study, we observed an abnormal hyperactivity of the amygdala in bipolar patients during articulatory rehearsal in verbal working memory. In the present study, we investigated the dynamic neurofunctional interactions between the right amygdala and the brain systems that underlie verbal working memory in both bipolar patients and healthy controls. In total, 18 euthymic bipolar patients and 18 healthy controls performed a modified version of the Sternberg item-recognition (working memory) task. We used the psychophysiological interaction approach to assess functional connectivity between the right amygdala and the brain regions involved in verbal working memory. In healthy subjects, we found significant negative functional interactions between the right amygdala and multiple cortical brain areas involved in verbal working memory. In comparison with the healthy control subjects, bipolar patients exhibited significantly reduced functional interactions of the right amygdala particularly with the right-hemispheric, i.e., ipsilateral, cortical regions supporting verbal working memory. Together with our previous finding of amygdala hyperactivity in bipolar patients during verbal rehearsal, the present results suggest that a disturbed right-hemispheric “cognitive–emotional” interaction between the amygdala and cortical brain regions underlying working memory may be responsible for amygdala hyperactivation and for verbal working memory deficits in bipolar patients.
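The psychophysiological interaction (PPI) approach named above boils down to adding a seed-by-task interaction regressor to the voxelwise model. A toy sketch, assuming a hypothetical amygdala seed timeseries and block design; a real analysis (e.g. in SPM) would also deconvolve the seed signal and re-convolve the product with the haemodynamic response:

```python
# Minimal PPI sketch: the interaction regressor is the (mean-centred)
# element-wise product of the seed timeseries and the task vector.
# All data here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_scans = 200
seed = rng.standard_normal(n_scans)          # stand-in amygdala BOLD signal
task = np.tile([1.0] * 10 + [0.0] * 10, 10)  # rehearsal blocks on/off

ppi = (seed - seed.mean()) * (task - task.mean())  # interaction regressor

# Design matrix: [seed, task, interaction, constant]. A negative fitted
# weight on `ppi` at a cortical voxel corresponds to the negative
# functional interaction reported in healthy controls.
X = np.column_stack([seed, task, ppi, np.ones(n_scans)])
print(X.shape)  # (200, 4)
```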
Abstract:
INTRODUCTION Proteinuria (PTU) is an important marker for the development and progression of renal disease, cardiovascular disease and death, but there is limited information about the prevalence of, and factors associated with, confirmed PTU in predominantly white European HIV+ persons, especially in those with an estimated glomerular filtration rate (eGFR) >60 mL/min/1.73 m². PATIENTS AND METHODS Baseline was defined as the first of two consecutive dipstick urine protein (DPU) measurements during prospective follow-up after 1/6/2011 (when systematic data collection began). PTU was defined as two consecutive DPU >1+ (>30 mg/dL) >3 months apart; persons with eGFR <60 at either DPU measurement were excluded. Logistic regression investigated factors associated with PTU. RESULTS A total of 1,640 persons were included; participants were mainly white (n=1,517, 92.5%), male (n=1,296, 79.0%) and men who have sex with men (n=809, 49.3%). Median age at baseline was 45 years (IQR 37-52), and median CD4 count was 570/mm³ (IQR 406-760). The median baseline date was 2/12 (IQR 11/11-6/12), and median eGFR was 99 mL/min/1.73 m² (IQR 88-109). Sixty-nine persons had PTU (4.2%, 95% CI 3.2-4.7%). Persons with diabetes had increased odds of PTU, as did those with a prior non-AIDS or AIDS event and those with prior exposure to indinavir. Among females, those with a normal eGFR (>90) and those with prior abacavir use had lower odds of PTU (Figure 1). CONCLUSIONS One in 25 persons with eGFR >60 had confirmed proteinuria at baseline. Factors associated with PTU were similar to those associated with CKD. The lack of association with antiretrovirals, particularly tenofovir, may be due to the cross-sectional design of this study, and additional follow-up is required to address progression to PTU in those without PTU at baseline. It may also suggest that other markers may be needed at higher eGFRs to capture the deteriorating renal function associated with antiretrovirals.
Our findings suggest PTU is an early marker for impaired renal function.
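The headline prevalence (69/1,640 = 4.2%) is a simple binomial proportion. A sketch of one standard way to attach a confidence interval; the abstract does not state which CI method was used, so the Wilson score interval below is an assumption and its bounds need not match the printed interval exactly:

```python
# Wilson score interval for a binomial proportion k/n; one common
# choice of CI method, used here for illustration only.
from math import sqrt

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score CI for the proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(69, 1640)
print(f"{69/1640:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```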
Abstract:
INTRODUCTION Rates of both TB/HIV co-infection and multidrug-resistant (MDR) TB are increasing in Eastern Europe (EE). Data on the clinical management of TB/HIV co-infected patients are scarce. Our aim was to study the clinical characteristics of TB/HIV patients in Europe and Latin America (LA) at TB diagnosis, identify factors associated with MDR-TB and assess the activity of initial TB treatment regimens given the results of drug-susceptibility tests (DST). MATERIAL AND METHODS We enrolled 1,413 TB/HIV patients from 62 clinics in 19 countries in EE, Western Europe (WE), Southern Europe (SE) and LA from January 2011 to December 2013. Among patients who completed DST within the first month of TB therapy, we linked initial TB treatment regimens to the DST results and calculated the distribution of patients receiving 0, 1, 2, 3 and ≥4 active drugs in each region. Risk factors for MDR-TB were identified in logistic regression models. RESULTS Significant differences were observed between EE (n=844), WE (n=152), SE (n=164) and LA (n=253) in use of combination antiretroviral therapy (cART) at TB diagnosis (17%, 40%, 44% and 35%, p<0.0001), a definite TB diagnosis (culture and/or PCR positive for Mycobacterium tuberculosis; 47%, 71%, 72% and 40%, p<0.0001) and MDR-TB prevalence (34%, 3%, 3% and 11%, p<0.0001 among those with DST results). A history of injecting drug use [adjusted OR (aOR) = 2.03, 95% CI 1.00-4.09], prior TB treatment (aOR = 3.42, 95% CI 1.88-6.22) and living in EE (aOR = 7.19, 95% CI 3.28-15.78) were associated with MDR-TB. For the 569 patients with available DST results, the initial TB treatment contained ≥3 active drugs in 64% of patients in EE compared with 90-94% of patients in the other regions (Figure 1a). Had the patients received standard initial therapy [rifampicin, isoniazid, pyrazinamide, ethambutol (RHZE)], the corresponding proportions would have been 64% vs. 86-97%, respectively (Figure 1b).
CONCLUSIONS In EE, TB/HIV patients less often received cART, less often had a definitive TB diagnosis and more often had MDR-TB compared with other parts of Europe and LA. Initial TB therapy in EE was sub-optimal, with less than two-thirds of patients receiving at least three active drugs, and improved compliance with standard RHZE treatment does not seem to be the solution. Improved management of TB/HIV patients requires routine use of DST, initial TB therapy according to prevailing resistance patterns and more widespread use of cART.
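The "number of active drugs" tally described above amounts to intersecting a patient's regimen with the DST-confirmed susceptibility profile. A minimal sketch; the drug names are the standard RHZE set from the abstract, while the resistance profile is a hypothetical example, not patient data from the study:

```python
# Count the drugs in an initial regimen to which the isolate is still
# susceptible, given drug-susceptibility test (DST) results.

RHZE = {"rifampicin", "isoniazid", "pyrazinamide", "ethambutol"}

def active_drugs(regimen: set[str], resistant: set[str]) -> int:
    """Drugs in the regimen to which the isolate is NOT resistant."""
    return len(regimen - resistant)

# An MDR isolate (resistant to at least rifampicin and isoniazid)
# leaves only 2 of the 4 standard RHZE drugs active -- below the
# >=3-active-drugs threshold used in the analysis:
assert active_drugs(RHZE, {"rifampicin", "isoniazid"}) == 2
```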
Abstract:
author Andrea Sennerto
Abstract:
BACKGROUND Dual antiplatelet therapy is recommended after coronary stenting to prevent thrombotic complications, yet the benefits and risks of treatment beyond 1 year are uncertain. METHODS Patients were enrolled after they had undergone a coronary stent procedure in which a drug-eluting stent was placed. After 12 months of treatment with a thienopyridine drug (clopidogrel or prasugrel) and aspirin, patients were randomly assigned to continue receiving thienopyridine treatment or to receive placebo for another 18 months; all patients continued receiving aspirin. The coprimary efficacy end points were stent thrombosis and major adverse cardiovascular and cerebrovascular events (a composite of death, myocardial infarction, or stroke) during the period from 12 to 30 months. The primary safety end point was moderate or severe bleeding. RESULTS A total of 9961 patients were randomly assigned to continue thienopyridine treatment or to receive placebo. Continued treatment with thienopyridine, as compared with placebo, reduced the rates of stent thrombosis (0.4% vs. 1.4%; hazard ratio, 0.29 [95% confidence interval {CI}, 0.17 to 0.48]; P<0.001) and major adverse cardiovascular and cerebrovascular events (4.3% vs. 5.9%; hazard ratio, 0.71 [95% CI, 0.59 to 0.85]; P<0.001). The rate of myocardial infarction was lower with thienopyridine treatment than with placebo (2.1% vs. 4.1%; hazard ratio, 0.47; P<0.001). The rate of death from any cause was 2.0% in the group that continued thienopyridine therapy and 1.5% in the placebo group (hazard ratio, 1.36 [95% CI, 1.00 to 1.85]; P=0.05). The rate of moderate or severe bleeding was increased with continued thienopyridine treatment (2.5% vs. 1.6%, P=0.001). An elevated risk of stent thrombosis and myocardial infarction was observed in both groups during the 3 months after discontinuation of thienopyridine treatment. 
CONCLUSIONS Dual antiplatelet therapy beyond 1 year after placement of a drug-eluting stent, as compared with aspirin therapy alone, significantly reduced the risks of stent thrombosis and major adverse cardiovascular and cerebrovascular events but was associated with an increased risk of bleeding. (Funded by a consortium of eight device and drug manufacturers and others; DAPT ClinicalTrials.gov number, NCT00977938.).
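The per-arm event rates reported above can be summarised as absolute risk differences and number needed to treat. This is standard arithmetic on the published percentages, shown here as a sketch, not an analysis from the trial itself:

```python
# Absolute risk reduction (ARR) and number needed to treat (NNT) from
# two event rates given in percent.

def arr_and_nnt(rate_control: float, rate_treated: float) -> tuple[float, float]:
    """ARR in percentage points, and NNT = 100 / ARR."""
    arr = rate_control - rate_treated
    return arr, 100.0 / arr

# Stent thrombosis: 1.4% (placebo) vs 0.4% (continued thienopyridine),
# i.e. roughly a 1.0-point absolute reduction and an NNT near 100 over
# months 12-30.
arr, nnt = arr_and_nnt(1.4, 0.4)
print(round(arr, 2), round(nnt))
```

The same function applied to the bleeding rates (2.5% vs 1.6%, treated arm worse) gives the corresponding number needed to harm, which is how the trade-off in the conclusion is usually quantified.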
Abstract:
set to music and arranged for one voice with piano accompaniment by S. T. Friedland
Abstract:
OBJECTIVES In Europe and elsewhere, health inequalities among HIV-positive individuals are of concern. We investigated late HIV diagnosis and late initiation of combination antiretroviral therapy (cART) by educational level, a proxy of socioeconomic position. DESIGN AND METHODS We used data from nine HIV cohorts within COHERE in Austria, France, Greece, Italy, Spain and Switzerland, collecting data on level of education in categories of the UNESCO/International Standard Classification of Education standard classification: non-completed basic, basic, secondary and tertiary education. We included individuals diagnosed with HIV between 1996 and 2011, aged at least 16 years, with known educational level and at least one CD4 cell count within 6 months of HIV diagnosis. We examined trends by educational level in presentation with advanced HIV disease (AHD) (CD4 <200 cells/μl or AIDS within 6 months) using logistic regression, and the distribution of CD4 cell count at cART initiation, overall and among presenters without AHD, using median regression. RESULTS Among 15,414 individuals, 52, 45, 37 and 31% with non-completed basic, basic, secondary and tertiary education, respectively, presented with AHD (P trend <0.001). Compared with patients with tertiary education, adjusted odds ratios of AHD were 1.72 (95% confidence interval 1.48-2.00) for non-completed basic, 1.39 (1.24-1.56) for basic and 1.20 (1.08-1.34) for secondary education (P < 0.001). In unadjusted and adjusted analyses, median CD4 cell count at cART initiation was lower with poorer educational level. CONCLUSIONS Socioeconomic inequalities in delayed HIV diagnosis and initiation of cART are present in European countries with universal healthcare systems, and individuals with lower educational level do not benefit equally from timely cART initiation.
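The adjusted odds ratios quoted above come from a fitted logistic model; an OR with its CI is obtained from the log-odds coefficient via the exponential transform. A sketch of that conversion; the standard error below is a hypothetical value chosen for illustration, not a figure from the study:

```python
# Convert a logistic-regression coefficient (log-odds) and its standard
# error into an odds ratio with a Wald 95% CI: OR = exp(beta),
# CI = exp(beta +/- 1.96*SE).
from math import exp, log

def or_with_ci(beta: float, se: float, z: float = 1.96) -> tuple[float, float, float]:
    return exp(beta), exp(beta - z * se), exp(beta + z * se)

beta = log(1.72)  # log-odds corresponding to the quoted aOR of 1.72
se = 0.077        # hypothetical standard error, for illustration only
print(tuple(round(v, 2) for v in or_with_ci(beta, se)))
```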
Abstract:
OBJECTIVES The aim of the study was to investigate the organization and delivery of HIV and tuberculosis (TB) health care and to analyse potential differences between treatment centres in Eastern (EE) and Western Europe (WE). METHODS Thirty-eight European HIV and TB treatment centres participating in the TB:HIV study within EuroCoord completed a survey on health care management for coinfected patients in 2013 (EE: 17 respondents; WE: 21; 76% of all TB:HIV centres). Descriptive statistics were obtained for regional comparisons. The reported data on health care strategies were compared with actual clinical practice at patient level via data derived from the TB:HIV study. RESULTS Respondent centres in EE comprised Belarus (n = 3), Estonia (1), Georgia (1), Latvia (1), Lithuania (1), Poland (4), Romania (1), the Russian Federation (4) and Ukraine (1); those in WE comprised Belgium (1), Denmark (1), France (1), Italy (7), Spain (2), Switzerland (1) and the UK (8). Compared with WE, treatment of HIV and TB in EE was less often located at the same site (47% in EE versus 100% in WE; P < 0.001) and less often provided by the same doctors (41% versus 90%, respectively; P = 0.002), whereas regular screening of HIV-infected patients for TB (80% versus 40%, respectively; P = 0.037) and directly observed treatment (88% versus 20%, respectively; P < 0.001) were more common in EE. The reported availability of rifabutin and second- and third-line anti-TB drugs was lower, and opioid substitution therapy (OST) was available at fewer centres, in EE compared with WE (53% versus 100%, respectively; P < 0.001). CONCLUSIONS Major differences exist between EE and WE in relation to the organization and delivery of health care for HIV/TB-coinfected patients and the availability of anti-TB drugs and OST. Significant discrepancies between reported and actual clinical practices were found in EE.
Abstract:
BACKGROUND To cover the shortage of cadaveric organs, new approaches to expand the donor pool are needed. Here we report on a case of domino liver transplantation (DLT) using an organ harvested from a compound heterozygous patient with primary hyperoxaluria (PHO), who underwent combined liver and kidney transplantation. The DLT recipient developed early renal failure with oxaluria. The time to the progression to oxalosis with renal failure in such situations is unknown, but, based on animal data, we hypothesize that calcineurin inhibitors may play a detrimental role. METHODS A cadaveric liver and kidney transplantation was performed in a 52-year-old male with PHO. His liver was used for a 64-year-old patient with a non-resectable, but limited, cholangiocarcinoma. RESULTS While the course of the PHO donor was uneventful, the DLT recipient developed early post-operative, dialysis-dependent renal failure with hyperoxaluria. Histology of a kidney biopsy revealed massive calcium oxalate crystal deposition as the leading aetiological cause. CONCLUSIONS DLT using PHO organs for marginal recipients represents a possible therapeutic approach regarding graft function of the liver. However, it may negatively alter the renal outcome of the recipient in an unpredictable manner, especially with concomitant use of cyclosporin. Therefore, we suggest that, although DLT should be promoted, PHO organs are better excluded from such procedures.
Abstract:
PURPOSE To evaluate risk factors for survival in a large international cohort of patients with primary urethral cancer (PUC). METHODS A series of 154 patients (109 men, 45 women) were diagnosed with PUC in ten referral centers between 1993 and 2012. Kaplan-Meier analysis with log-rank test was used to investigate various potential prognostic factors for recurrence-free survival (RFS) and overall survival (OS). Multivariable models were constructed to evaluate independent risk factors for recurrence and death. RESULTS Median age at definitive treatment was 66 years (IQR 58-76). Histology was urothelial carcinoma in 72 (47%), squamous cell carcinoma in 46 (30%), adenocarcinoma in 17 (11%), and mixed and other histology in 11 (7%) and nine (6%), respectively. A high degree of concordance between clinical and pathologic nodal staging (cN+/cN0 vs. pN+/pN0; p < 0.001) was noted. For clinical nodal staging, the corresponding sensitivity, specificity, and overall accuracy for predicting pathologic nodal stage were 92.8, 92.3, and 92.4%, respectively. In multivariable Cox regression analysis for patients staged cM0 at initial diagnosis, RFS was significantly associated with clinical nodal stage (p < 0.001), tumor location (p < 0.001), and age (p = 0.001), whereas clinical nodal stage was the only independent predictor for OS (p = 0.026). CONCLUSIONS These data suggest that clinical nodal stage is a critical parameter for outcomes in PUC.
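The sensitivity, specificity, and accuracy figures above are computed from a 2x2 table of clinical versus pathologic nodal stage. The abstract does not give the underlying counts, so the confusion-matrix entries in this sketch are hypothetical and serve only to show how the three metrics are defined:

```python
# Sensitivity, specificity, and overall accuracy from a 2x2 table of
# clinical (cN) vs. pathologic (pN) nodal stage. Counts are invented
# for illustration; they are not the study's data.

def staging_metrics(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float, float]:
    sens = tp / (tp + fn)                    # cN+ among pN+ patients
    spec = tn / (tn + fp)                    # cN0 among pN0 patients
    acc = (tp + tn) / (tp + fp + fn + tn)    # correctly staged overall
    return sens, spec, acc

sens, spec, acc = staging_metrics(tp=39, fp=9, fn=3, tn=103)
print(f"sens {sens:.1%}, spec {spec:.1%}, acc {acc:.1%}")
```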
Abstract:
OBJECTIVES Rates of TB/HIV coinfection and multidrug-resistant (MDR) TB are increasing in Eastern Europe (EE). We aimed to study clinical characteristics, factors associated with MDR-TB and the predicted activity of empiric anti-TB treatment at the time of TB diagnosis among TB/HIV coinfected patients in EE, Western Europe (WE) and Latin America (LA). DESIGN AND METHODS Between January 1, 2011, and December 31, 2013, 1,413 TB/HIV patients (62 clinics in 19 countries in EE, WE, Southern Europe (SE), and LA) were enrolled. RESULTS Significant differences were observed between EE (N = 844), WE (N = 152), SE (N = 164), and LA (N = 253) in the proportion of patients with a definite TB diagnosis (47%, 71%, 72% and 40%, p<0.0001), MDR-TB (40%, 5%, 3% and 15%, p<0.0001), and use of combination antiretroviral therapy (cART) (17%, 40%, 44% and 35%, p<0.0001). Injecting drug use (adjusted OR (aOR) = 2.03 (95% CI 1.00-4.09)), prior anti-TB treatment (3.42 (1.88-6.22)), and living in EE (7.19 (3.28-15.78)) were associated with MDR-TB. Among 585 patients with drug susceptibility test (DST) results, the empiric (i.e. without knowledge of the DST results) anti-TB treatment included ≥3 active drugs in 66% of participants in EE compared with 90-96% in other regions (p<0.0001). CONCLUSIONS In EE, TB/HIV patients were less likely to receive a definite TB diagnosis, more likely to harbour MDR-TB and commonly received empiric anti-TB treatment with reduced activity. Improved management of TB/HIV patients in EE requires better access to TB diagnostics including DSTs, empiric anti-TB therapy directed at both susceptible and MDR-TB, and more widespread use of cART.
Abstract:
OBJECTIVES This study sought to compare rates of stent thrombosis and major adverse cardiac and cerebrovascular events (MACCE) (composite of death, myocardial infarction, or stroke) after coronary stenting with drug-eluting stents (DES) versus bare-metal stents (BMS) in patients who participated in the DAPT (Dual Antiplatelet Therapy) study, an international multicenter randomized trial comparing 30 versus 12 months of dual antiplatelet therapy in subjects undergoing coronary stenting with either DES or BMS. BACKGROUND Despite antirestenotic efficacy of coronary DES compared with BMS, the relative risk of stent thrombosis and adverse cardiovascular events is unclear. Many clinicians perceive BMS to be associated with fewer adverse ischemic events and to require shorter-duration dual antiplatelet therapy than DES. METHODS Prospective propensity-matched analysis of subjects enrolled into a randomized trial of dual antiplatelet therapy duration was performed. DES- and BMS-treated subjects were propensity-score matched in a many-to-one fashion. The study design was observational for all subjects 0 to 12 months following stenting. A subset of eligible subjects without major ischemic or bleeding events were randomized at 12 months to continued thienopyridine versus placebo; all subjects were followed through 33 months. RESULTS Among 10,026 propensity-matched subjects, DES-treated subjects (n = 8,308) had a lower rate of stent thrombosis through 33 months compared with BMS-treated subjects (n = 1,718, 1.7% vs. 2.6%; weighted risk difference -1.1%, p = 0.01) and a noninferior rate of MACCE (11.4% vs. 13.2%, respectively, weighted risk difference -1.8%, p = 0.053, noninferiority p < 0.001). CONCLUSIONS DES-treated subjects have long-term rates of stent thrombosis that are lower than BMS-treated subjects. (The Dual Antiplatelet Therapy Study [DAPT study]; NCT00977938).
Abstract:
IMPORTANCE Despite antirestenotic efficacy of coronary drug-eluting stents (DES) compared with bare metal stents (BMS), the relative risk of stent thrombosis and adverse cardiovascular events is unclear. Although dual antiplatelet therapy (DAPT) beyond 1 year provides ischemic event protection after DES, ischemic event risk is perceived to be less after BMS, and the appropriate duration of DAPT after BMS is unknown. OBJECTIVE To compare (1) rates of stent thrombosis and major adverse cardiac and cerebrovascular events (MACCE; composite of death, myocardial infarction, or stroke) after 30 vs 12 months of thienopyridine in patients treated with BMS taking aspirin and (2) treatment duration effect within the combined cohorts of randomized patients treated with DES or BMS as prespecified secondary analyses. DESIGN, SETTING, AND PARTICIPANTS International, multicenter, randomized, double-blinded, placebo-controlled trial comparing extended (30-month) thienopyridine vs placebo in patients taking aspirin who completed 12 months of DAPT without bleeding or ischemic events after receiving stents. The study was initiated in August 2009 with the last follow-up visit in May 2014. INTERVENTIONS Continued thienopyridine or placebo at months 12 through 30 after stent placement, in 11,648 randomized patients treated with aspirin, of whom 1687 received BMS and 9961 DES. MAIN OUTCOMES AND MEASURES Stent thrombosis, MACCE, and moderate or severe bleeding. RESULTS Among 1687 patients treated with BMS who were randomized to continued thienopyridine vs placebo, rates of stent thrombosis were 0.50% vs 1.11% (n = 4 vs 9; hazard ratio [HR], 0.49; 95% CI, 0.15-1.64; P = .24), rates of MACCE were 4.04% vs 4.69% (n = 33 vs 38; HR, 0.92; 95% CI, 0.57-1.47; P = .72), and rates of moderate/severe bleeding were 2.03% vs 0.90% (n = 16 vs 7; P = .07), respectively.
Among all 11,648 randomized patients (both BMS and DES), stent thrombosis rates were 0.41% vs 1.32% (n = 23 vs 74; HR, 0.31; 95% CI, 0.19-0.50; P < .001), rates of MACCE were 4.29% vs 5.74% (n = 244 vs 323; HR, 0.73; 95% CI, 0.62-0.87; P < .001), and rates of moderate/severe bleeding were 2.45% vs 1.47% (n = 135 vs 80; P < .001). CONCLUSIONS AND RELEVANCE Among patients undergoing coronary stent placement with BMS and who tolerated 12 months of thienopyridine, continuing thienopyridine for an additional 18 months compared with placebo did not result in statistically significant differences in rates of stent thrombosis, MACCE, or moderate or severe bleeding. However, the BMS subset may have been underpowered to identify such differences, and further trials are suggested. TRIAL REGISTRATION clinicaltrials.gov Identifier: NCT00977938.