36 results for In-cylinder Pressure Analysis
in CentAUR: Central Archive University of Reading - UK
Abstract:
There is an urgent need to treat individuals with high blood pressure (BP) with effective dietary strategies. Previous studies suggest a small, but significant decrease in BP after lactotripeptides (LTP) ingestion, although the data are inconsistent. The study aim was to perform a comprehensive meta-analysis of data from all relevant randomised controlled trials (RCT). Medline, Cochrane library, EMBASE and Web of Science were searched until May 2014. Eligibility criteria were RCT that examined the effects of LTP on BP in adults, with systolic BP (SBP) and diastolic BP (DBP) as outcome measures. Thirty RCT met the inclusion criteria, which resulted in 33 sets of data. The pooled treatment effect for SBP was −2.95 mmHg (95% CI: −4.17, −1.73; p < 0.001), and for DBP was −1.51 mmHg (95% CI: −2.21, −0.80; p < 0.001). Sub-group analyses revealed that reduction of BP in Japanese studies was significantly greater, compared with European studies (p = 0.002 for SBP and p < 0.001 for DBP). The 24-h ambulatory BP (AMBP) response to LTP supplementation was statistically non-significant (p = 0.101 for SBP and p = 0.166 for DBP). Both publication bias and "small-study effect" were identified, which shifted the treatment effect towards a less significant SBP and a non-significant DBP reduction after LTP consumption. LTP may be effective in BP reduction, especially in Japanese individuals; however, sub-group and meta-regression analyses and statistically significant publication biases suggest inconsistencies.
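The pooled treatment effects quoted above come from standard inverse-variance meta-analysis. A minimal sketch of fixed-effect pooling, using hypothetical per-study effects and standard errors (not the trial data from the review):

```python
import math

def pool_fixed_effect(effects, ses):
    """Inverse-variance (fixed-effect) pooled estimate with a 95% CI.

    effects -- per-study treatment effects (e.g. mmHg change in SBP)
    ses     -- their standard errors
    """
    weights = [1.0 / se ** 2 for se in ses]  # precision weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical per-study SBP effects (mmHg) and their standard errors
est, (lo, hi) = pool_fixed_effect([-3.1, -2.4, -4.0], [0.8, 1.1, 1.5])
# est is approximately -3.04 mmHg
```

A random-effects model, as typically used when between-study heterogeneity is present, would additionally estimate a between-study variance and widen the weights accordingly.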
Abstract:
Background: Although a large number of randomized controlled trials (RCTs) have examined the impact of the n-3 (ω-3) fatty acids EPA (20:5n-3) and DHA (22:6n-3) on blood pressure and vascular function, the majority have used doses of EPA+DHA of > 3 g per d, which are unlikely to be achieved by diet manipulation. Objective: The objective was to examine, using a retrospective analysis from a multi-center RCT, the impact of recommended, dietarily achievable EPA+DHA intakes on systolic and diastolic blood pressure and microvascular function in UK adults. Design: Healthy men and women (n = 312) completed a double-blind, placebo-controlled RCT consuming control oil, or fish oil providing 0.7 g or 1.8 g EPA+DHA per d, in random order each for 8 wk. Fasting blood pressure and microvascular function (using laser Doppler iontophoresis) were assessed and plasma collected for the quantification of markers of vascular function. Participants were retrospectively genotyped for the eNOS rs1799983 variant. Results: No impact of n-3 fatty acid treatment or any treatment × eNOS genotype interactions were evident in the group as a whole for any of the clinical or biochemical outcomes. Assessment of response according to hypertension status at baseline indicated a significant (P=0.046) fish oil-induced reduction (mean 5 mmHg) in systolic blood pressure specifically in those with isolated systolic hypertension (n=31). No dose response was observed. Conclusions: These findings indicate that, in those with isolated systolic hypertension, daily doses of EPA+DHA as low as 0.7 g bring about clinically meaningful blood pressure reductions which, at a population level, would be associated with lower cardiovascular disease risk. Confirmation of findings in an RCT where participants are prospectively recruited on the basis of blood pressure status is required to draw definitive conclusions. The Journal of Nutrition NUTRITION/2015/220475 Version 4
Abstract:
This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of engagement and important issues such as the balance of power between adults and children, training, support, ethical considerations, time and resources. We argue that involving children in data analysis processes can have several benefits, including enabling a greater understanding of children's perspectives and helping to prioritise children's agendas in policy and practice. (C) 2007 The Author(s). Journal compilation (C) 2007 National Children's Bureau.
Abstract:
Standard form contracts are typically developed through a negotiated consensus, unless they are proffered by one specific interest group. Previously published plans of work and other descriptions of the processes in construction projects tend to focus on operational issues, or they tend to be prepared from the point of view of one or other of the dominant interest groups. Legal practice in the UK permits those who draft contracts to define their terms as they choose. There are no definitive rulings from the courts that give an indication as to the detailed responsibilities of project participants. The science of terminology offers useful guidance for discovering and describing terms and their meanings in their practical context, but has never been used for defining terms for responsibilities of participants in the construction project management process. Organizational analysis enables the management task to be deconstructed into its elemental parts in order that effective organizational structures can be developed. Organizational mapping offers a useful technique for reducing text-based descriptions of project management roles and responsibilities to a comparable basis. Research was carried out by means of a desk study, detailed analysis of nine plans of work and focus groups representing all aspects of the construction industry. No published plan of work offers definitive guidance. There is an enormous amount of variety in the way that terms are used for identifying responsibilities of project participants. A catalogue of concepts and terms (a “Terminology”) has been compiled and indexed to enable those who draft contracts to choose the most appropriate titles for project participants. The purpose of this terminology is to enable the selection and justification of appropriate terms in order to help define roles. 
The terminology brings an unprecedented clarity to the description of roles and responsibilities in construction projects and, as such, will be helpful for anyone seeking to assemble a team and specify roles for project participants.
Abstract:
Assaying a large number of genetic markers from patients in clinical trials is now possible in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker, once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and the individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright (c) 2007 John Wiley & Sons, Ltd.
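The permutation step of the fixed-sample omnibus test can be sketched as follows. For illustration, the per-SNP F-statistics are replaced by a crude mean-difference interaction contrast; the data, sample sizes and variable names are all hypothetical. Shuffling treatment labels preserves the correlation structure among SNPs, which is what makes the global p-value valid.

```python
import random

def interaction_stat(y, treat, snp):
    """Crude treatment-by-genotype interaction statistic: the absolute
    difference in the treatment effect between the two genotype groups.
    (Stands in for the per-SNP F-statistic of a proper interaction test.)"""
    def mean(v):
        return sum(v) / len(v) if v else 0.0
    eff = []
    for g in (0, 1):
        t1 = [yi for yi, ti, gi in zip(y, treat, snp) if gi == g and ti == 1]
        t0 = [yi for yi, ti, gi in zip(y, treat, snp) if gi == g and ti == 0]
        eff.append(mean(t1) - mean(t0))
    return abs(eff[1] - eff[0])

def omnibus_pvalue(y, treat, snps, n_perm=500, seed=1):
    """Global p-value for any gene-drug interaction: the observed
    max-over-SNPs statistic is compared with its permutation null,
    obtained by shuffling treatment labels."""
    rng = random.Random(seed)
    obs = max(interaction_stat(y, treat, s) for s in snps)
    t = list(treat)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(t)
        if max(interaction_stat(y, t, s) for s in snps) >= obs:
            count += 1
    return (count + 1) / (n_perm + 1)  # add-one correction keeps p > 0

# Toy data: 40 subjects, binary treatment, two correlated binary SNPs
rng = random.Random(7)
y = [rng.gauss(0, 1) for _ in range(40)]
treat = [i % 2 for i in range(40)]
snps = [[(i // 2) % 2 for i in range(40)], [(i // 4) % 2 for i in range(40)]]
p_global = omnibus_pvalue(y, treat, snps, n_perm=200)
```

The sequential extension described in the abstract would re-use the same permutation machinery at each interim look to obtain stopping boundaries that control the overall type I error rate.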
Recent developments in genetic data analysis: what can they tell us about human demographic history?
Abstract:
Over the last decade, a number of new methods of population genetic analysis based on likelihood have been introduced. This review describes and explains the general statistical techniques that have recently been used, and discusses the underlying population genetic models. Experimental papers that use these methods to infer human demographic and phylogeographic history are reviewed. It appears that the use of likelihood has hitherto had little impact in the field of human population genetics, which is still primarily driven by more traditional approaches. However, with the current uncertainty about the effects of natural selection, population structure and ascertainment of single-nucleotide polymorphism markers, it is suggested that likelihood-based methods may have a greater impact in the future.
Abstract:
Human D-2Long (D-2L) and D-2Short (D-2S) dopamine receptor isoforms were modified at their N-terminus by the addition of a human immunodeficiency virus (HIV) or a FLAG epitope tag. The receptors were then expressed in Spodoptera frugiperda 9 (Sf9) cells using the baculovirus system, and their oligomerization was investigated by means of co-immunoprecipitation and time-resolved fluorescence resonance energy transfer (FRET). [H-3] Spiperone labelled D-2 receptors in membranes prepared from Sf9 cells expressing epitope-tagged D-2L or D-2S receptors, with a pKd value of approximately 10. Co-immunoprecipitation using antibodies specific for the tags showed constitutive homo-oligomerization of D-2L and D-2S receptors in Sf9 cells. When the FLAG-tagged D-2S and HIV-tagged D-2L receptors were co-expressed, co-immunoprecipitation showed that the two isoforms can also form hetero-oligomers in Sf9 cells. Time-resolved FRET with europium and XL665-labelled antibodies was applied to whole Sf9 cells and to membranes from Sf9 cells expressing epitope-tagged D-2 receptors. In both cases, constitutive homo-oligomers were revealed for D-2L and D-2S isoforms. Time-resolved FRET also revealed constitutive homo-oligomers in HEK293 cells expressing FLAG-tagged D-2S receptors. The D-2 receptor ligands dopamine, R-(-)-propylnorapomorphine, and raclopride did not affect oligomerization of D-2L and D-2S in Sf9 and HEK293 cells. Human D-2 dopamine receptors can therefore form constitutive oligomers in Sf9 cells and in HEK293 cells that can be detected by different approaches, and D-2 oligomerization in these cells is not regulated by ligands.
Abstract:
Consumers increasingly demand convenience foods of the highest quality in terms of natural flavor and taste, which are free from additives and preservatives. This demand has triggered the need for the development of a number of nonthermal approaches to food processing, of which high-pressure technology has proven to be very valuable. A number of recent publications have demonstrated novel and diverse uses of this technology. Its novel features, which include destruction of microorganisms at room temperature or lower, have made the technology commercially attractive. Enzymes and spore-forming bacteria can be inactivated by the application of pressure-thermal combinations. This review aims to identify the opportunities and challenges associated with this technology. In addition to discussing the effects of high pressure on food components, this review covers the combined effects of high-pressure processing with gamma irradiation, alternating current, ultrasound, and carbon dioxide or anti-microbial treatment. Further, the applications of this technology in various sectors (fruits and vegetables, dairy and meat processing) have been dealt with extensively. The integration of high pressure with other mature processing operations such as blanching, dehydration, osmotic dehydration, rehydration, frying, freezing/thawing and solid-liquid extraction has been shown to open up new processing options. The key challenges identified include: heat-transfer problems and the resulting non-uniformity in processing, obtaining reliable and reproducible data for process validation, lack of detailed knowledge about the interaction between high pressure and a number of food constituents, and packaging and statutory issues.
Abstract:
Background: Several lines of evidence suggest that the dietary isoflavone genistein (Gen) has beneficial effects with regard to cardiovascular disease, and in particular on aspects related to blood pressure and angiogenesis. The biological action of Gen may be, at least in part, attributed to its ability to affect cell signalling and response. However, so far, most of the molecular mechanisms underlying the activity of Gen in the endothelium are unknown. Methods and results: To examine the transcriptional response to 2.5 µM Gen in primary human endothelial cells (HUVEC), we applied cDNA array technology both under baseline conditions and after treatment with the pro-atherogenic stimulus, copper-oxidized LDL. The alteration of the expression patterns of individual transcripts was substantiated using either RT-PCR or Northern blotting. Gen significantly affected the expression of genes encoding for proteins centrally involved in vascular tone, such as endothelin-converting enzyme-1, endothelin-2, estrogen-related receptor α and atrial natriuretic peptide receptor A precursor. Furthermore, Gen countered the effect of oxLDL on mRNA levels encoding for vascular endothelial growth factor 165 receptors, types 1 and 2. Conclusions: Our data indicate that physiologically achievable levels of Gen change the expression of mRNA encoding for proteins involved in the control of blood pressure under baseline conditions and reduce the angiogenic response to oxLDL in the endothelium. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
In this paper, we address issues in segmentation of remotely sensed LIDAR (LIght Detection And Ranging) data. The LIDAR data, which were captured by airborne laser scanner, contain 2.5-dimensional (2.5D) terrain surface height information, e.g. houses, vegetation, flat field, river, basin, etc. Our aim in this paper is to segment ground (flat field) from non-ground (houses and high vegetation) in hilly urban areas. By projecting the 2.5D data onto a surface, we obtain a texture map as a grey-level image. Based on the image, Gabor wavelet filters are applied to generate Gabor wavelet features. These features are then grouped into various windows. Among these windows, a combination of their first- and second-order statistics is used as a measure to determine the surface properties. The test results have shown that ground areas can successfully be segmented from LIDAR data. Most buildings and high vegetation can be detected. In addition, the Gabor wavelet transform can partially remove hill or slope effects in the original data by tuning the Gabor parameters.
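The feature-extraction stage described above (Gabor kernel, filtering of the height image, per-window first- and second-order statistics) can be sketched as follows. The kernel parameters, window size and toy height map are illustrative assumptions, not the paper's actual settings.

```python
import math

def gabor_kernel(freq, theta, sigma=2.0, size=9):
    """Real-valued Gabor kernel: Gaussian envelope times a cosine carrier
    oriented at angle theta with spatial frequency freq."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            env = math.exp(-(x * x + y * y) / (2 * sigma * sigma))
            row.append(env * math.cos(2 * math.pi * freq * xr))
        kernel.append(row)
    return kernel

def filter_image(image, kernel):
    """Correlate a 2-D height map with the kernel (zero padding at borders)."""
    h, w = len(image), len(image[0])
    half = len(kernel) // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc = 0.0
            for di in range(-half, half + 1):
                for dj in range(-half, half + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < h and 0 <= jj < w:
                        acc += image[ii][jj] * kernel[di + half][dj + half]
            out[i][j] = acc
    return out

def window_stats(response, win=8):
    """First- and second-order statistics (mean, std) of the filter
    response in each non-overlapping win x win window."""
    feats = []
    for i in range(0, len(response) - win + 1, win):
        for j in range(0, len(response[0]) - win + 1, win):
            vals = [response[r][c]
                    for r in range(i, i + win) for c in range(j, j + win)]
            m = sum(vals) / len(vals)
            var = sum((v - m) ** 2 for v in vals) / len(vals)
            feats.append((m, math.sqrt(var)))
    return feats

# Toy 16x16 height map: a flat "roof" over flat ground
height = [[1.0 if r < 8 else 0.0 for _ in range(16)] for r in range(16)]
kern = gabor_kernel(freq=0.25, theta=0.0)
feats = window_stats(filter_image(height, kern), win=8)  # 4 (mean, std) pairs
```

In the paper's pipeline these window statistics would then feed a classifier or thresholding rule separating ground from non-ground windows.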
Abstract:
A role for sequential test procedures is emerging in genetic and epidemiological studies using banked biological resources. This stems from the methodology's potential for improved use of information relative to comparable fixed sample designs. Studies in which cost, time and ethics feature prominently are particularly suited to a sequential approach. In this paper sequential procedures for matched case–control studies with binary data will be investigated and assessed. Design issues such as sample size evaluation and error rates are identified and addressed. The methodology is illustrated and evaluated using both real and simulated data sets.
Abstract:
A significant challenge in the prediction of climate change impacts on ecosystems and biodiversity is quantifying the sources of uncertainty that emerge within and between different models. Statistical species niche models have grown in popularity, yet no single best technique has been identified, reflecting differing performance in different situations. Our aim was to quantify uncertainties associated with the application of two complementary modelling techniques. Generalised linear mixed models (GLMM) and generalised additive mixed models (GAMM) were used to model the realised niche of ombrotrophic Sphagnum species in British peatlands. These models were then used to predict changes in Sphagnum cover between 2020 and 2050 based on projections of climate change and atmospheric deposition of nitrogen and sulphur. Over 90% of the variation in the GLMM predictions was due to niche model parameter uncertainty, dropping to 14% for the GAMM. After having covaried out other factors, average variation in predicted values of Sphagnum cover across UK peatlands was the next largest source of variation (8% for the GLMM and 86% for the GAMM). The better performance of the GAMM needs to be weighed against its tendency to overfit the training data. While our niche models are only a first approximation, we used them to undertake a preliminary evaluation of the relative importance of climate change and nitrogen and sulphur deposition, and of the geographic locations of the largest expected changes in Sphagnum cover. Predicted changes in cover were all small (generally <1% in an average 4 m² unit area) but also highly uncertain. Peatlands expected to be most affected by climate change in combination with atmospheric pollution were Dartmoor, Brecon Beacons and the western Lake District.
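The split of prediction variance into parameter uncertainty versus between-site variation follows the law of total variance. A sketch with a hypothetical one-covariate logistic niche model (not the fitted GLMM/GAMM, whose structure and data are not given here):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def variance_components(sites, param_draws):
    """Law-of-total-variance split of prediction variance:
    within  = mean over sites of the variance across parameter draws
              (parameter uncertainty),
    between = variance across sites of the mean prediction
              (site-to-site variation)."""
    preds = [[sigmoid(b0 + b1 * x) for (b0, b1) in param_draws]
             for x in sites]
    def mean(v):
        return sum(v) / len(v)
    def var(v):
        m = mean(v)
        return sum((p - m) ** 2 for p in v) / len(v)
    within = mean([var(row) for row in preds])
    between = var([mean(row) for row in preds])
    return within, between

# Hypothetical inputs: a site-level climate covariate and
# posterior-style draws of the model's intercept and slope
sites = [0.1 * i for i in range(10)]
draws = [(-1.0 + 0.1 * j, 1.0) for j in range(5)]
within, between = variance_components(sites, draws)
```

With equal numbers of draws per site the two components sum exactly to the total variance of all predictions, so reporting each as a percentage of their sum reproduces the kind of partition quoted in the abstract (e.g. 90% parameter uncertainty for the GLMM).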