999 results for Impact of trawling
Abstract:
BACKGROUND: Blocking leukocyte function-associated antigen (LFA)-1 in organ transplant recipients prolongs allograft survival. However, the precise mechanisms underlying the therapeutic potential of LFA-1 blockade in preventing chronic rejection are not fully elucidated. Cardiac allograft vasculopathy (CAV) is the preeminent cause of late cardiac allograft failure, characterized histologically by concentric intimal hyperplasia. METHODS: Anti-LFA-1 monoclonal antibody was used in a multiple minor antigen-mismatched, BALB.B (H-2b) to C57BL/6 (H-2b) cardiac allograft model. Endogenous donor-specific CD8 T cells were tracked using major histocompatibility complex multimers against the immunodominant H4, H7, H13, H28, and H60 minor antigens. RESULTS: LFA-1 blockade prevented acute rejection and preserved palpable beating quality with reduced CD8 T-cell graft infiltration. Interestingly, the reduced CD8 T-cell infiltration reflected diminished T-cell expansion rather than decreased trafficking. LFA-1 blockade significantly suppressed the clonal expansion of minor histocompatibility antigen-specific CD8 T cells during the expansion and contraction phases. CAV development was evaluated by morphometric analysis on postoperative day 100. LFA-1 blockade profoundly attenuated neointimal hyperplasia (61.6 vs 23.8%; P < 0.05), the proportion of CAV-affected vessels (55.3 vs 15.9%; P < 0.05), and myocardial fibrosis (grade 3.29 vs 1.8; P < 0.05). Finally, short-term LFA-1 blockade promoted long-term donor-specific regulation, which resulted in attenuated transplant arteriosclerosis. CONCLUSIONS: Taken together, LFA-1 blockade inhibits the initial expansion of endogenous alloreactive T cells and induces greater regulation. Such a mechanism supports a pulse tolerance induction strategy with anti-LFA-1 rather than long-term treatment.
Abstract:
Antigenically evolving pathogens such as influenza viruses are difficult to control owing to their ability to evade host immunity by producing immune escape variants. Experimental studies have repeatedly demonstrated that viral immune escape variants emerge more often from immunized hosts than from naive hosts. This empirical relationship between host immune status and within-host immune escape is not fully understood theoretically, nor has its impact on antigenic evolution at the population level been evaluated. Here, we show that this relationship can be understood as a trade-off between the probability that a new antigenic variant is produced and the level of viraemia it reaches within a host. Scaling up this intra-host level trade-off to a simple population level model, we obtain a distribution for variant persistence times that is consistent with influenza A/H3N2 antigenic variant data. At the within-host level, our results show that target cell limitation, or a functional equivalent, provides a parsimonious explanation for how host immune status drives the generation of immune escape mutants. At the population level, our analysis also offers an alternative explanation for the observed tempo of antigenic evolution, namely that the production rate of immune escape variants is driven by the accumulation of herd immunity. Overall, our results suggest that disease control strategies should be further assessed by considering the impact that increased immunity--through vaccination--has on the production of new antigenic variants.
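The target cell limitation mechanism invoked above is commonly formalised with the standard target-cell-limited within-host model; the equations below give that generic textbook model for context and are not necessarily the authors' exact formulation.

\frac{dT}{dt} = -\beta T V, \qquad \frac{dI}{dt} = \beta T V - \delta I, \qquad \frac{dV}{dt} = p I - c V

Here T denotes uninfected target cells, I infected cells, and V free virus; \beta is the infection rate, \delta the infected-cell death rate, p the virion production rate, and c the virion clearance rate. Within this framework, stronger pre-existing immunity lowers the viraemia reached by the resident strain, reducing the number of replication events in which an escape mutant can be generated while leaving more target cells available for any mutant that does arise, which is one way to read the trade-off described above.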
Abstract:
Tropical cyclones (TCs) are among the most devastating weather systems affecting the United States and Central America (USCA). Here we show that the Interdecadal Pacific Oscillation (IPO) strongly modulates TC activity over the North Atlantic (NA) and eastern North Pacific (eNP). During positive IPO phases, fewer (more) TCs were observed over the NA (eNP), likely due to the presence of stronger (weaker) vertical wind shear and the resulting changes in genesis potential. Furthermore, TCs over the NA tend to track farther east and recurve at lower latitudes during positive IPO phases. Such variations are largely determined by changes in steering flow rather than changes in genesis locations. Over the eNP, smaller track variations are observed across IPO phases, with stable, westward-moving TCs prevailing. These findings have substantial implications for understanding decadal to interdecadal fluctuations in the risk of TC landfalls along USCA coasts.
Abstract:
Recent research has cast doubt on the potential for various electoral reforms to increase voter turnout. In this article, we examine the effectiveness of preregistration laws, which allow young citizens to register before being eligible to vote. We use two empirical approaches to evaluate the impact of preregistration on youth turnout. First, we implement difference-in-differences and lag models to bracket the causal effect of preregistration implementation using the 2000-2012 Current Population Survey. Second, focusing on the state of Florida, we leverage a discontinuity based on date of birth to estimate the effect of increased preregistration exposure on the turnout of young registrants. In both approaches, we find that preregistration increases voter turnout, with equal effectiveness for various subgroups in the electorate. More broadly, the observed patterns suggest that campaign context and supporting institutions may help to determine when and if electoral reforms are effective.
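For readers unfamiliar with the difference-in-differences approach mentioned above, a generic two-way fixed-effects specification is sketched below; the variable names are illustrative and do not reproduce the article's actual model.

\text{Turnout}_{ist} = \alpha + \beta \,(\text{PreReg}_{s} \times \text{Post}_{t}) + \gamma_{s} + \delta_{t} + \varepsilon_{ist}

Here \gamma_{s} and \delta_{t} are state and year fixed effects, and \beta brackets the change in turnout for individuals in states that adopted preregistration, after adoption, relative to states that did not.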
Abstract:
OBJECTIVES: Although the Dietary Approaches to Stop Hypertension (DASH) diet lowers blood pressure in adults with hypertension, how kidney function affects this response is not known. We evaluated whether estimated glomerular filtration rate (eGFR) modifies the effect of the DASH diet on blood pressure, markers of mineral metabolism, and markers of kidney function. METHODS: Secondary analysis of the DASH-Sodium trial, a multicenter, randomized, controlled human feeding study that evaluated the blood pressure-lowering effect of the DASH diet at three levels of sodium intake. Data from 92 participants with pre-hypertension or stage 1 hypertension during the 3450 mg/day sodium diet assignment contributed to this analysis. Stored frozen plasma and urine specimens were used to measure kidney-related laboratory outcomes. RESULTS: Effects of the DASH diet on blood pressure, phosphorus, intact parathyroid hormone, creatinine, and albuminuria were not modified by baseline eGFR (mean 84.5 ± 18.0 ml/min/1.73 m(2), range 44.1 to 138.6 ml/min/1.73 m(2)) or by the presence of chronic kidney disease (13% of participants). CONCLUSIONS: The impact of the DASH diet on blood pressure, markers of mineral metabolism, and markers of kidney function does not appear to be modified by eGFR in this small subset of DASH-Sodium trial participants with relatively preserved kidney function. Whether a greater reduction in eGFR modifies the effects of DASH on kidney-related measures remains to be determined. A larger study in individuals with more advanced kidney disease is needed to establish the efficacy and safety of the DASH diet in this patient population.
Abstract:
In-hospital worsening heart failure represents a clinical scenario wherein a patient hospitalized for acute heart failure experiences a worsening of their condition, requiring escalation of therapy. Worsening heart failure is associated with worse in-hospital and postdischarge outcomes. Worsening heart failure is increasingly being used as an endpoint or combined endpoint in clinical trials, as it is unique to episodes of acute heart failure and captures an important event during the inpatient course. While prediction models have been developed to identify worsening heart failure, there are no known FDA-approved medications associated with decreased worsening heart failure. Continued study is warranted.
Abstract:
Previous authors have suggested a higher likelihood for industry-sponsored (IS) studies to have positive outcomes than non-IS studies, though publication bias was believed to be a likely confounder. We attempted to control for the latter by using a prepublication database to compare the primary outcomes of recent trials based on sponsorship. We used the "advanced search" feature of the ClinicalTrials.gov website to identify recently completed phase III studies involving a pharmaceutical agent or device for which primary data were available. Studies were categorized as either National Institutes of Health (NIH) sponsored or IS. Results were labeled "favorable" if they favored the intervention under investigation or "unfavorable" if the intervention fared worse than standard medical treatment. We also performed an independent literature search to identify cardiovascular trials as a case example and again categorized them as IS versus NIH sponsored. A total of 226 studies sponsored by the NIH were found. When these were compared with the latest 226 IS studies, IS studies were almost 4 times more likely to report a positive outcome (odds ratio [OR] 3.90, 95% confidence interval [CI] 2.6087 to 5.9680, p <0.0001). As a case example of a specialty, we also identified 25 NIH-sponsored and 215 IS cardiovascular trials, most focusing on hypertension therapy (31.6%) and anticoagulation (17.9%). IS studies were 7 times more likely to report favorable outcomes (OR 7.54, 95% CI 2.19 to 25.94, p = 0.0014). They were also considerably less likely to report unfavorable outcomes (OR 0.11, 95% CI 0.04 to 0.26, p <0.0001). In conclusion, the outcomes of large clinical studies, especially cardiovascular trials, differ considerably on the basis of their funding source, and publication bias appears to have limited influence on these findings.
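As a reminder of how odds ratios and confidence intervals of the kind reported above are conventionally derived from a 2 x 2 table of favorable versus unfavorable outcomes by sponsor type, the standard formulas are:

OR = \frac{a d}{b c}, \qquad 95\% \text{ CI} = \exp\!\left( \ln OR \pm 1.96 \sqrt{ \frac{1}{a} + \frac{1}{b} + \frac{1}{c} + \frac{1}{d} } \right)

where a-d are the generic cell counts of the table; they are placeholders, not the study's actual counts, which are not given in the abstract.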
Abstract:
Prenatal nicotine exposure (PNE) is linked to a large number of psychiatric disorders, including attention deficit hyperactivity disorder (ADHD). Current literature suggests that core deficits observed in ADHD reflect abnormal inhibitory control governed by the prefrontal cortex (PFC) of the brain. The PFC is structurally altered by PNE, but it is still unclear how neural firing is affected during tasks that test behavioral inhibition, such as the stop-signal task, or whether neural correlates of inhibitory control are affected after PNE in awake, behaving animals. To address these questions, we recorded from single medial PFC (mPFC) neurons in control and PNE rats as they performed our stop-signal task. We found that PNE rats were faster on all trial types and were less likely to inhibit the behavioral response on STOP trials. Neurons in mPFC fired more strongly on STOP trials, and their activity was correlated with accuracy and reaction time. Although the number of neurons exhibiting significant modulation during task performance did not differ between groups, overall activity in PNE rats was reduced. We conclude that PNE makes rats impulsive and reduces firing in mPFC neurons that carry signals related to response inhibition.
Impact of tumor board recommendations on treatment outcome for locally advanced head and neck cancer
Abstract:
Background/Aims: To identify physician selection factors in the treatment of locally advanced head and neck cancer and how treatment outcome is affected by Tumor Board recommendations. Methods: A retrospective analysis of 213 patients treated for locally advanced head and neck cancer at a single institution was performed. All treatments followed Tumor Board recommendations: 115 patients had chemotherapy and radiation, and 98 patients received postoperative radiation. Patient characteristics, treatment toxicity, locoregional control, and survival were compared between these two treatment groups. Patient survival was also compared with survival data reported in randomized studies of locally advanced head and neck cancer. Results: There were no differences in comorbidity factors or T and N stages between the two groups. Significantly more patients with oropharyngeal tumors received chemoradiation, whereas patients with oral cavity tumors predominantly received postoperative radiation (p < 0.0001). Grade 3-4 toxicity rates during treatment were 48% and 87% for the postoperative radiation and chemoradiation groups, respectively (p = 0.0001). There were no differences in survival, locoregional recurrences, or distant metastases between the two groups. Patient survival was comparable to survival rates reported in randomized studies of locally advanced head and neck cancer. Conclusion: Disease site remained the key determining factor for treatment selection. Multidisciplinary approaches provided optimal treatment outcomes for locally advanced head and neck cancer, with overall survival in these patients being comparable to that reported in randomized clinical trials.
Abstract:
The International Maritime Organisation (IMO) has adopted the use of computer simulation to assist in the assessment of the assembly time for passenger ships. A key parameter required for this analysis, and specified as part of the IMO guidelines, is the passenger response time distribution. It is demonstrated in this paper that the IMO-specified response time distribution assumes an unrealistic mathematical form. This unrealistic form can lead to serious congestion issues being overlooked in the evacuation analysis and to incorrect conclusions concerning the suitability of the vessel design. In light of these results, it is vital that IMO undertake research to generate passenger response time data suitable for use in evacuation analysis of passenger ships. Until this type of data becomes readily available, it is strongly recommended that, rather than continuing to use the artificial and unrepresentative form of the response time distribution, IMO adopt plausible and more realistic response time data derived from land-based applications.
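To illustrate how the assumed shape of the response time distribution can change predicted assembly times, here is a minimal Monte Carlo sketch; the uniform and log-normal parameters and the fixed travel time are purely illustrative assumptions, not values taken from the IMO guidelines or from this paper.

import numpy as np

rng = np.random.default_rng(0)
n_passengers = 1000

# Two illustrative response time models (seconds); parameters are assumptions.
uniform_response = rng.uniform(0, 300, n_passengers)                         # flat distribution
lognormal_response = rng.lognormal(mean=4.0, sigma=0.9, size=n_passengers)   # long-tailed distribution

travel_time = 240.0  # assumed constant walking time to the assembly station (s)

for label, response in [("uniform", uniform_response), ("log-normal", lognormal_response)]:
    total = response + travel_time
    print(f"{label}: 95th percentile assembly time = {np.percentile(total, 95):.0f} s, "
          f"max = {total.max():.0f} s")

Even with identical travel times, the long tail of the second distribution pushes the upper percentiles of total assembly time well beyond those of the flat distribution, which is the kind of effect a poorly chosen response time model can hide.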
Abstract:
This study investigates the use of computer-modelled versus directly experimentally determined fire hazard data for assessing survivability within buildings using evacuation models incorporating Fractional Effective Dose (FED) models. The objective is to establish a link between effluent toxicity, measured using a variety of small- and large-scale tests, and building evacuation. For the scenarios under consideration, fire simulation is typically used to determine the time at which non-survivable conditions develop within the enclosure, for example, when the smoke or toxic effluent layer falls below a critical height deemed detrimental to evacuation, or when the radiative fluxes reach a critical value leading to the onset of flashover. The evacuation calculation would then be used to determine whether people within the structure could evacuate before these critical conditions develop.
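For context, a Fractional Effective Dose calculation typically takes the following general form (the standard Purser-style expression, not necessarily the exact model implemented in this study):

\mathrm{FED}(t) = \sum_{i} \int_{0}^{t} \frac{C_{i}(\tau)}{(C\,t)_{\mathrm{crit},i}} \, d\tau

where C_{i} is the concentration of toxicant i along an occupant's path and (C\,t)_{\mathrm{crit},i} is the exposure dose of that toxicant associated with incapacitation; incapacitation is usually assumed once FED reaches 1.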
Abstract:
This work explores the impact of response time distributions on high-rise building evacuation. The analysis utilises response times extracted from printed accounts and interviews of evacuees from the WTC North Tower evacuation of 11 September 2001. Evacuation simulations produced using these “real” response time distributions are compared with simulations produced using instant and engineering response time distributions. Results suggest that while typical engineering approximations to the response time distribution may produce reasonable evacuation times for up to 90% of the building population, this approach may underestimate total evacuation times by as much as 61%. These observations are applicable to situations involving large high-rise buildings in which travel times are generally expected to be greater than response times.
Abstract:
This paper examines the influence of exit availability on evacuation time for a narrow-body aircraft under certification trial conditions using computer simulation. A narrow-body aircraft which has previously passed the certification trial is used as the test configuration. While maintaining the certification requirement of using 50% of the available exits, six different exit configurations are examined. These include the standard certification configuration (one exit from each exit pair) and five other exit configurations based on commonly occurring exit combinations found in accidents. These configurations are based on data derived from the AASK database, and the evacuation simulations are performed using the airEXODUS evacuation simulation software. The results show that the certification practice of using half the available exits, predominantly down one side of the aircraft, is neither statistically relevant nor challenging. For the aircraft cabin layout examined, the exit configuration used in the certification trial produces the shortest egress times. Furthermore, three of the six exit combinations investigated result in predicted egress times in excess of 90 seconds, suggesting that the aircraft would not satisfy the certification requirement under these conditions.