897 results for Effects and Usages
Abstract:
The introduction of advanced welding methods as an alternative joining process to riveting in the manufacture of primary aircraft structure has the potential to realize reductions in both manufacturing costs and structural weight. However, welding processes can introduce undesirable residual stresses and distortions in the final fabricated components, as well as localized loss of mechanical properties at the weld joints. The aim of this research is to determine and characterize the key process effects of advanced welding assembly methods on stiffened panel static strength performance. This in-depth understanding of the relationships between welding process effects and buckling and collapse strength is required to achieve manufacturing cost reductions without introducing structural analysis uncertainties and hence conservative, over-designed welded panels. The current work is focused at the sub-component level and examines the static strength of friction stir welded multi-stiffener panels. The experimental and computational studies undertaken have demonstrated that local skin buckling is predominantly influenced by the magnitude of welding-induced residual stresses and associated geometric distortions, whereas panel collapse behavior is sensitive to the lateral width of the physically joined skin and stiffener flange material, the strength of material in the heat-affected zone, as well as the magnitude of the welding-induced residual stresses. Copyright © 2006 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
Abstract:
One of the most influential explanations of voting behaviour is based on economic factors: when the economy is doing well, voters reward the incumbent government, and when the economy is doing badly, voters punish the incumbent. This reward-punishment model is thought to be particularly appropriate at second-order contests such as European Parliament elections. Yet operationalising this economic voting model using citizens' perceptions of economic performance may suffer from endogeneity problems if citizens' perceptions are in fact a function of their party preferences rather than a cause of them. Thus, this article models a 'strict' version of economic voting in which citizens' economic perceptions are purged of partisan effects, and only that portion of citizens' economic perceptions that is caused by the real-world economy is used as a predictor of voting. Using data on voting at the 2004 European Parliament elections for 23 European Union electorates, the article finds some, but limited, evidence for economic voting that is dependent on both voter sophistication and clarity of responsibility for the economy within any country. First, only politically sophisticated voters' subjective economic assessments are in fact grounded in economic reality. Second, the portion of subjective economic assessments that is a function of the real-world economy is a significant predictor of voting only in single-party government contexts where there can be a clear attribution of responsibility. For coalition government contexts, the article finds essentially no impact of the real economy via economic perceptions on vote choice, at least at European Parliament elections.
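The sketch below gives a minimal illustration of the two-stage 'purging' strategy described above: perceptions are first regressed on objective economic conditions, and only the fitted, economy-driven portion enters the vote-choice model. This is not the article's actual specification; the data file and variable names (econ_perception, gdp_growth, unemployment, vote_incumbent, sophistication) are hypothetical.

```python
# Sketch of a 'strict' economic-voting operationalisation: keep only the part
# of economic perceptions explained by the real economy, then use that purged
# measure to predict incumbent voting. All column names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ep2004_survey.csv")  # hypothetical survey + macro dataset

# Stage 1: purge perceptions of partisan effects by retaining only the part
# explained by objective national economic conditions.
stage1 = smf.ols("econ_perception ~ gdp_growth + unemployment", data=df).fit()
df["perception_hat"] = stage1.fittedvalues

# Stage 2: use the purged perception as the economic-voting predictor.
stage2 = smf.logit("vote_incumbent ~ perception_hat + sophistication", data=df).fit()
print(stage2.summary())
```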
Abstract:
Adenosine is a ubiquitous molecule present in every cell of the human body. It has a wide range of physiological functions mediated predominantly through specific cell surface adenosine receptors. Adenosine has both pro- and anti-inflammatory effects and acts on inflammatory and resident immune cells and antioxidant enzymes. The elevation of adenosine in the bronchoalveolar lavage (BAL) fluid of asthmatics combined with its bronchoconstrictor effect on the airways in asthmatics has led to increased research into the contribution of adenosine in the pathophysiology of inflammation and asthma. This review looks at the airway response to adenosine and at the interaction of adenosine with mast cells and basophils.
Abstract:
Previous research suggests that the digital cushion, a shock-absorbing structure in the claw, plays an important role in protecting cattle from lameness. This study aimed to assess the degree to which nutritional factors influence the composition of the digital cushion. This involved quantifying lipid content and fatty acid composition differences in digital cushion tissue from cattle offered diets with different amounts of linseed. Forty-six bulls were allocated to 1 of 4 treatments, which were applied for an average of 140 +/- 27 d during the finishing period. The treatments consisted of a linseed supplement offered once daily on top of the basal diet (grass silage:concentrate) at 0, 400, 800, or 1,200 g of supplement/animal per day. For each treatment, the concentrate offered was adjusted to ensure that total estimated ME intake was constant across treatments. Target BW at slaughter was 540 kg. Legs were collected in 3 batches after 120, 147 and 185 d on experiment. Six samples of the digital cushion were dissected from the right lateral hind claw of each animal. Lipids were extracted and expressed as a proportion of fresh tissue, and fatty acid composition of the digital cushion was determined by gas chromatography. Data were analyzed by ANOVA, with diet, location within the digital cushion, and their interactions as fixed effects and fat content (grams per 100 g of tissue) as a covariate. Linear or quadratic contrasts were examined. The lipid content of digital cushion tissue differed between sampling locations (P
Abstract:
Background: Drug scenes within several countries have changed in recent years to incorporate a range of licit psychoactive products, collectively known as “legal highs.” Hundreds of different legal high products have been described in the literature. Many of these products contain synthetic stimulants that allegedly “mirror” the effects of some illicit drugs. In 2009–2010, growing concern by the UK and Irish governments focused on mephedrone, a synthetic stimulant that had become embedded within several drug scenes in Britain and Ireland. In April 2010, mephedrone and related cathinone derivatives were banned under the UK’s Misuse of Drugs Act 1971. Setting aside “worst case scenarios” that have been portrayed by UK and Irish media, little is known about mephedrone use from the consumer’s perspective. The purpose of this paper was to (1) explore respondents’ experiences with mephedrone, (2) examine users’ perceptions about the safety of mephedrone, and primarily to (3) examine sources of mephedrone supply during the pre- and post-ban periods.
Methods: Semi-structured interviews were conducted with 23 adults who had used mephedrone during 2009–2010. Data collection occurred in May and June 2010, following the ban on mephedrone. A total of 20 of the 23 respondents had used mephedrone during the post-ban period, and the vast majority had prior experience with ecstasy or cocaine. Respondents’ ages ranged from 19 to 51, approximately half of the sample were female, and the majority (19 of 23) were employed in full- or part-time work.
Results: Most respondents reported positive experiences with mephedrone, and for some, the substance emerged as a drug of choice. None of the respondents reported that the once-legal status of mephedrone implied that it was safe to use. Very few respondents reported purchasing mephedrone from street-based or on-line headshops during the pre-ban period, and these decisions were guided in part by respondents’ attempts to avoid “drug user” identities. Most respondents purchased or obtained mephedrone from friends or dealers, and mephedrone was widely available during the 10-week period following the ban. Respondents reported a greater reliance on dealers and a change in mephedrone packaging following the criminalisation of mephedrone.
Conclusion: The findings are discussed in the context of what appears to be a rapidly changing mephedrone market. We discuss the possible implications of criminalising mephedrone, including potential displacement effects and the development of an illicit market.
Abstract:
This article examines the state regulation of sexual offenders in the particular context of pre-employment vetting. A succession of statutory frameworks has been put in place, culminating in the Safeguarding Vulnerable Groups Act 2006, to prevent unsuitable individuals from working with the vulnerable, and children in particular. Contemporary legislative and policy developments are set against a backdrop of broader concerns in the area of crime and justice, namely risk regulation, preventative governance and ‘precautionary logic.’ Proponents of these approaches have largely ignored concerns over their feasibility. This article addresses this fissure within the specific field of vetting. It is argued that ‘hyper innovation’ and state over-extension in this area are particularly problematic and have resulted in exceptionally uncertain and unsafe policies. These difficulties relate principally to unrealistic public expectations about the state’s ability to control crime; unintended and ambiguous policy effects; and ultimately the failure of the state to deliver on its self-imposed regulatory mandate to effectively manage risk.
Abstract:
Background: When cure is impossible, cancer treatment should focus on both length and quality of life. Maximisation of time without toxic effects could be one effective strategy to achieve both of these goals. The COIN trial assessed preplanned treatment holidays in advanced colorectal cancer to achieve this aim. Methods: COIN was a randomised controlled trial in patients with previously untreated advanced colorectal cancer. Patients received either continuous oxaliplatin and fluoropyrimidine combination (arm A), continuous chemotherapy plus cetuximab (arm B), or intermittent (arm C) chemotherapy. In arms A and B, treatment continued until development of progressive disease, cumulative toxic effects, or the patient chose to stop. In arm C, patients who had not progressed at their 12-week scan started a chemotherapy-free interval until evidence of disease progression, when the same treatment was restarted. Randomisation was done centrally (via telephone) by the MRC Clinical Trials Unit using minimisation. Treatment allocation was not masked. The comparison of arms A and B is described in a companion paper. Here, we compare arms A and C, with the primary objective of establishing whether overall survival on intermittent therapy was non-inferior to that on continuous therapy, with a predefined non-inferiority boundary of 1·162. Intention-to-treat (ITT) and per-protocol analyses were done. This trial is registered, ISRCTN27286448. Findings: 1630 patients were randomly assigned to treatment groups (815 to continuous and 815 to intermittent therapy). Median survival in the ITT population (n=815 in both groups) was 15·8 months (IQR 9·4—26·1) in arm A and 14·4 months (8·0—24·7) in arm C (hazard ratio [HR] 1·084, 80% CI 1·008—1·165). In the per-protocol population (arm A, n=467; arm C, n=511), median survival was 19·6 months (13·0—28·1) in arm A and 18·0 months (12·1—29·3) in arm C (HR 1·087, 0·986—1·198). The upper limits of CIs for HRs in both analyses were greater than the predefined non-inferiority boundary. Preplanned subgroup analyses in the per-protocol population showed that a raised baseline platelet count, defined as 400 000 per µL or higher (271 [28%] of 978 patients), was associated with poor survival with intermittent chemotherapy: the HR for comparison of arm C and arm A in patients with a normal platelet count was 0·96 (95% CI 0·80—1·15, p=0·66), versus 1·54 (1·17—2·03, p=0·0018) in patients with a raised platelet count (p=0·0027 for interaction). In the per-protocol population, more patients on continuous than on intermittent treatment had grade 3 or worse haematological toxic effects (72 [15%] vs 60 [12%]), whereas nausea and vomiting were more common on intermittent treatment (11 [2%] vs 43 [8%]). Grade 3 or worse peripheral neuropathy (126 [27%] vs 25 [5%]) and hand—foot syndrome (21 [4%] vs 15 [3%]) were more frequent on continuous than on intermittent treatment. Interpretation: Although this trial did not show non-inferiority of intermittent compared with continuous chemotherapy for advanced colorectal cancer in terms of overall survival, chemotherapy-free intervals remain a treatment option for some patients with advanced colorectal cancer, offering reduced time on chemotherapy, reduced cumulative toxic effects, and improved quality of life. 
Subgroup analyses suggest that patients with normal baseline platelet counts could gain the benefits of intermittent chemotherapy without detriment in survival, whereas those with raised baseline platelet counts have impaired survival and quality of life with intermittent chemotherapy and should not receive a treatment break.
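As a minimal illustration of the non-inferiority criterion described above, the sketch below compares the upper confidence limits of the reported hazard ratios with the predefined margin of 1.162. The numbers are taken from the abstract; the code is illustrative and not part of the trial's analysis.

```python
# Non-inferiority check: intermittent therapy (arm C) is non-inferior to
# continuous therapy (arm A) only if the upper confidence limit of the
# hazard ratio stays below the predefined margin of 1.162.
NON_INFERIORITY_MARGIN = 1.162

analyses = {
    "ITT":          {"hr": 1.084, "ci_upper": 1.165},  # 80% CI upper limit
    "per-protocol": {"hr": 1.087, "ci_upper": 1.198},
}

for name, result in analyses.items():
    non_inferior = result["ci_upper"] < NON_INFERIORITY_MARGIN
    print(f"{name}: HR = {result['hr']}, upper CI limit = {result['ci_upper']}, "
          f"non-inferior = {non_inferior}")
# Both upper limits exceed 1.162, so non-inferiority was not demonstrated.
```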
Abstract:
We performed a meta-analysis to estimate the magnitude of C3 gene polymorphism effects, and their possible mode of action, on age-related macular degeneration (AMD). The meta-analysis included 16 studies for rs2230199 and 7 studies for rs1047286. Data extraction and risk of bias assessments were performed in duplicate, and heterogeneity and publication bias were explored. There was moderate evidence for association between both polymorphisms and AMD in individuals of European descent. For rs2230199, patients with CG and GG genotypes were 1.44 (95% CI: 1.33 – 1.56) and 1.88 (95% CI: 1.59 – 2.23) times more likely to have AMD than patients with CC genotype. For rs1047286, those with GA and AA genotypes had 1.27 (95% CI: 1.15 – 1.41) and 1.70 (95% CI: 1.27 – 2.11) times higher risk of AMD than those with GG genotypes. These gene effects suggested an additive model. The population attributable risks for the GG/GC and AA/GA genotypes are approximately 5-10%. Stratification of studies on the basis of ethnicity indicates that these variants are very infrequent in Asian populations and the significance of the effect observed is based largely on the high frequency of these variants within individuals of European descent. This meta-analysis supports the association between C3 and AMD and provides a robust estimate of the genetic risk.
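The population attributable risk figures quoted above can be illustrated with Levin's formula, treating the reported odds ratios as approximations of relative risk. The carrier frequencies in this sketch are assumed values chosen for illustration; they are not estimates from the meta-analysis.

```python
# Levin's population attributable risk: PAR = p(RR - 1) / (1 + p(RR - 1)),
# where p is the exposure (carrier) frequency and RR the relative risk.
def attributable_risk(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Odds ratios from the abstract (rs2230199); carrier frequencies are assumed.
print(f"PAR, CG carriers: {attributable_risk(0.25, 1.44):.1%}")  # ~10%
print(f"PAR, GG carriers: {attributable_risk(0.03, 1.88):.1%}")  # ~3%
```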
Abstract:
Background: This is an update of a previous review (McGuinness 2006). Hypertension and cognitive impairment are prevalent in older people. Hypertension is a direct risk factor for vascular dementia (VaD), and recent studies have suggested that hypertension affects the prevalence of Alzheimer's disease (AD). Does treatment of hypertension therefore prevent cognitive decline?
Objectives: To assess the effects of blood pressure lowering treatments for the prevention of dementia and cognitive decline in patients with hypertension but no history of cerebrovascular disease.
Search strategy: The Specialized Register of the Cochrane Dementia and Cognitive Improvement Group, The Cochrane Library, MEDLINE, EMBASE, PsycINFO, CINAHL, LILACS as well as many trials databases and grey literature sources were searched on 13 February 2008 using the terms: hypertens$ OR anti-hypertens$.
Selection criteria: Randomized, double-blind, placebo controlled trials in which pharmacological or non-pharmacological interventions to lower blood pressure were given for at least six months.
Data collection and analysis: Two independent reviewers assessed trial quality and extracted data. The following outcomes were assessed: incidence of dementia, cognitive change from baseline, blood pressure level, incidence and severity of side effects and quality of life.
Main results: Four trials including 15,936 hypertensive subjects were identified. Average age was 75.4 years. Mean blood pressure at entry across the studies was 171/86 mmHg. The combined result of the four trials reporting incidence of dementia indicated no significant difference between treatment and placebo (236/7767 versus 259/7660, Odds Ratio (OR) = 0.89, 95% CI 0.74, 1.07), and there was considerable heterogeneity between the trials. The combined results from the three trials reporting change in Mini Mental State Examination (MMSE) did not indicate a benefit from treatment (Weighted Mean Difference (WMD) = 0.42, 95% CI 0.30, 0.53). Both systolic and diastolic blood pressure levels were reduced significantly in the three trials assessing this outcome (WMD = -10.22, 95% CI -10.78, -9.66 for systolic blood pressure; WMD = -4.28, 95% CI -4.58, -3.98 for diastolic blood pressure). Three trials reported adverse effects requiring discontinuation of treatment and the combined results indicated no significant difference (OR = 1.01, 95% CI 0.92, 1.11). When analysed separately, however, patients on placebo in Syst Eur 1997 were more likely to discontinue treatment due to side effects; the converse was true in SHEP 1991. Quality of life data could not be analysed in the four studies. Analysis of the included studies in this review was problematic as many of the control subjects received antihypertensive treatment because their blood pressures exceeded pre-set values. In most cases the study became a comparison between the study drug and a usual antihypertensive regimen.
Authors' conclusions: There is no convincing evidence from the trials identified that blood pressure lowering in late life prevents the development of dementia or cognitive impairment in hypertensive patients with no apparent prior cerebrovascular disease. There were significant problems identified with analysing the data, however, due to the number of patients lost to follow-up and the number of placebo patients who received active treatment. This introduced bias. More robust results may be obtained by conducting a meta-analysis using individual patient data.
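As a rough check on the pooled dementia-incidence result quoted in the main results, the sketch below computes a crude odds ratio and Wald confidence interval directly from the combined event counts. The review itself pooled trial-level estimates, so this simple 2x2 calculation only approximates the reported OR of 0.89 (95% CI 0.74, 1.07).

```python
# Crude odds ratio from the pooled counts quoted above:
# 236 dementia cases of 7767 on treatment vs 259 of 7660 on placebo.
import math

a, n_treat = 236, 7767          # cases / total, treatment arms
c, n_placebo = 259, 7660        # cases / total, placebo arms
b, d = n_treat - a, n_placebo - c

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.3f} (95% CI {lower:.3f} to {upper:.3f})")
# Prints roughly OR = 0.895 (95% CI 0.748 to 1.071).
```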
Abstract:
Research into the targeting of drug substances to a specific disease site has enjoyed sustained activity for many decades. The reason for such fervent activity is the considerable clinical advantage that can be gained when the delivery system plays a pivotal role in determining where the drug is deposited. When compared to conventional formulations where no such control exists, such as parenteral and oral systems, a sophisticated targeting device can reduce side effects and limit collateral damage to surrounding normal tissue. Nowhere is this more important than in oncology, where dose-limiting side effects are an ever-present difficulty. In this review, the types of colloidal carrier commonly used in targeted drug delivery are discussed, such as gold and polymeric colloids. In particular, the process of attaching targeting capabilities is considered, with reference to antibody technologies used as the targeting motifs. Nanotechnology has brought together a means to carry both a drug and a targeting ligand in self-contained constructs, and their applications to both clinical therapy and diagnosis are discussed.
Abstract:
The nuclear accident in Chernobyl in 1986 is a dramatic example of the type of incidents that are characteristic of a risk society. The consequences of the incident are indeterminate, the causes complex and future developments unpredictable. Nothing can compensate for its effects and it affects a broad population indiscriminately. This paper examines the lived experience of those who experienced biographical disruption as residents of the region on the basis of qualitative case studies carried out in 2003 in the Chernobyl regions of Russia, Ukraine and Belarus. Our analysis indicates that informants tend to view their future as highly uncertain and unpredictable; they experience uncertainty about whether they are already contaminated, and they have to take hazardous decisions about where to go and what to eat. Fear, rumours and experts compete in supplying information to residents about the actual and potential consequences of the disaster, but there is little trust in, and only limited awareness of, the information that is provided. Most informants continue with their lives and do what they must or even what they like, even where the risks are known. They often describe their behaviour as being due to economic circumstances; where there is extreme poverty, even hazardous food sources are better than none. Unlike previous studies, we identify a pronounced tendency among informants not to separate the problems associated with the disaster from the hardships that have resulted from the break-up of the USSR, with both events creating a deep-seated sense of resignation and fatalism. Although most informants hold their governments to blame for lack of information, support and preventive measures, there is little or no collective action to have these put in place. This contrasts with previous research which has suggested that populations affected by disasters attribute crucial significance to that incident and, as a consequence, become increasingly politicized with regard to related policy agendas.
Abstract:
The Wing-Kristofferson (WK) model of movement timing emphasises the separation of central timer and motor processes. Several studies of repetitive timing have shown that the increase in variability at longer intervals is attributable to timer processes; however, relatively little is known about the way motor aspects of timing are affected by task movement constraints. In the present study, we examined timing variability in finger tapping with differences in interval to assess central timer effects, and with differences in movement amplitude to assess motor implementation effects. We then investigated whether effects of motor timing observed at the point of response (flexion offset/tap) are also evident in extension, which would suggest that both phases are subject to timing control. Eleven participants performed bimanual simultaneous tapping at two target intervals (400, 600 ms), with the index finger of each hand performing movements of equal (3 or 6 cm) or unequal amplitude (left hand 3 cm, right hand 6 cm and vice versa). As expected, timer variability increased with the mean interval but showed only small, non-systematic effects with changes in movement amplitude. Motor implementation variability was greater in unequal amplitude conditions. The same pattern of motor variability was observed at both the flexion and extension phases of movement. These results suggest that intervals are generated by a central timer triggering a series of events at the motor output level, including flexion and the following extension, which are explicitly represented in the timing system.
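For readers unfamiliar with the Wing-Kristofferson decomposition assumed in this abstract, the sketch below shows the standard moment estimators: under the model, inter-tap interval variance equals timer variance plus twice the motor-delay variance, and the lag-1 autocovariance equals minus the motor-delay variance. The function and the simulated tapping series are illustrative and do not use the study's data.

```python
# Wing-Kristofferson two-level timing model: I_n = C_n + M_(n+1) - M_n, with
# independent timer intervals C and motor delays M. Under the model,
#   Var(I) = var_timer + 2 * var_motor   and   autocov(lag 1) = -var_motor.
import numpy as np

def wk_decomposition(tap_times):
    """Estimate timer and motor-implementation variance from tap onsets (ms)."""
    intervals = np.diff(np.asarray(tap_times, dtype=float))
    centred = intervals - intervals.mean()
    gamma0 = np.mean(centred * centred)            # total interval variance
    gamma1 = np.mean(centred[:-1] * centred[1:])   # lag-1 autocovariance
    var_motor = max(-gamma1, 0.0)                  # model predicts gamma1 <= 0
    var_timer = max(gamma0 - 2.0 * var_motor, 0.0)
    return var_timer, var_motor

# Illustrative use with a simulated 400 ms tapping series (not study data).
rng = np.random.default_rng(0)
timer = rng.normal(400, 15, size=200)              # central timer intervals
motor = rng.normal(0, 8, size=201)                 # one motor delay per tap
taps = np.cumsum(np.concatenate(([0.0], timer))) + motor
print(wk_decomposition(taps))                      # expected near (225, 64) ms^2
```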
Abstract:
Maternal diabetes mellitus is associated with increased teratogenesis, which can occur in pregestational type 1 and type 2 diabetes. Cardiac defects and neural tube defects are the most common malformations observed in fetuses of pregestational diabetic mothers. The exact mechanism by which diabetes exerts its teratogenic effects and induces embryonic malformations is unclear. Whereas the sequelae of maternal pregestational diabetes, such as modulating insulin levels, altered fat levels, and increased reactive oxygen species, may play a role in fetal damage during diabetic pregnancy, hyperglycemia is thought to be the primary teratogen, causing particularly adverse effects on cardiovascular development. Fetal cardiac defects are associated with raised maternal glycosylated hemoglobin levels and are up to five times more likely in infants of mothers with pregestational diabetes compared with those without diabetes. The resulting anomalies are varied and include transposition of the great arteries, mitral and pulmonary atresia, double-outlet right ventricle, tetralogy of Fallot, and fetal cardiomyopathy.
Abstract:
Aim. This paper is a report of a study exploring and comparing the experience of men and women with colorectal cancer at diagnosis and during surgery.
Background. Men have higher incidence and mortality rates for nearly all cancers and frequently use health behaviours that reflect their masculinity. There has been minimal investigation into the influence of gender on the experience of a ‘shared’ cancer.
Methods. From November 2006 to November 2008, a qualitative study was conducted involving 38 individuals (24 men, 14 women) with colorectal cancer. Data were generated using semi-structured interviews at four time points over an 18-month period. This paper reports the participants’ experience at diagnosis and during surgery (time point 1) with the purpose of examining the impact of gender on this experience.
Findings. In general, men appeared more accepting of their diagnosis. The majority of females seemed more emotional and more affected by the physical side effects. However, there was variation in both gender groups, with some men and women portraying both ‘masculine’ and ‘feminine’ traits. There was also individual variation in relation to context.
Conclusions. It appears that many men may have been experiencing side effects and/or psychological distress that they were reluctant to discuss, particularly as some men portrayed typical ‘masculine’ traits in public, but felt able to open up in private. Nurses should not make assumptions based on the traditional view of masculinity, and should determine how each man wants to deal with their diagnosis and not presume that all men need to ‘open up’ about their illness.