96 results for Prats, Modest

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: In 1997, the US Food and Drug Administration passed a unique ruling that allowed oat bran to be registered as the first cholesterol-reducing food at a dosage of 3 g beta-glucan/d. OBJECTIVE: The effects of a low dose of oat bran in the background diet only were investigated in volunteers with mild-to-moderate hyperlipidemia. DESIGN: The study had a double-blind, placebo-controlled, randomized, parallel design. Sixty-two healthy men (n = 31) and women (n = 31) were randomly allocated to consume either 20 g oat bran concentrate (OBC; containing 3 g beta-glucan) or 20 g wheat bran (control) daily for 8 wk. Fasting blood samples were collected at weeks -1, 0, 4, 8, and 12. A subgroup (n = 17) was studied postprandially after consumption of 2 meals (containing no OBC or wheat bran) at baseline and after supplementation. Fasting plasma samples were analyzed for total cholesterol, HDL cholesterol, triacylglycerol, glucose, and insulin. LDL cholesterol was estimated by using the Friedewald formula. The postprandial samples were analyzed for triacylglycerol, glucose, and insulin. RESULTS: No significant difference was observed in fasting plasma cholesterol, LDL cholesterol, glucose, or insulin between the OBC and wheat-bran groups. HDL-cholesterol concentrations fell significantly from weeks 0 to 8 in the OBC group (P = 0.05). There was a significant increase in fasting glucose concentrations after both OBC (P = 0.03) and wheat-bran (P = 0.02) consumption. No significant difference was found between the OBC and wheat-bran groups in any of the postprandial variables measured. CONCLUSIONS: A low dosage of beta-glucan (3 g/d) did not significantly reduce total cholesterol or LDL cholesterol in volunteers with plasma cholesterol concentrations representative of a middle-aged UK population.
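The Friedewald formula mentioned above estimates LDL cholesterol from the other fasting lipids rather than measuring it directly; a minimal sketch of that standard calculation, in mmol/L units and not reproduced from the paper itself, is:

    def friedewald_ldl_mmol(total_chol, hdl_chol, triacylglycerol):
        """Estimate LDL cholesterol (mmol/L) with the Friedewald formula:
        LDL-C = TC - HDL-C - TG/2.2 (divide TG by 5 when working in mg/dL).
        The estimate is considered unreliable when TG exceeds ~4.5 mmol/L."""
        if triacylglycerol > 4.5:
            raise ValueError("Friedewald estimate unreliable for TG > 4.5 mmol/L")
        return total_chol - hdl_chol - triacylglycerol / 2.2

    # e.g. friedewald_ldl_mmol(6.2, 1.3, 1.8) -> approximately 4.08 mmol/L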

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Apolipoprotein E (APOE) genotype is reported to influence a person's fasting lipid profile and potentially the response to dietary fat manipulation. The impact of APOE genotype on the responsiveness to meals of varying fat composition is unknown. OBJECTIVE: We examined the effect of meals containing 50 g of fat rich in saturated fatty acids (SFAs), unsaturated fatty acids (UNSATs), or SFAs with fish oil (SFA-FO) on postprandial lipemia. METHOD: A randomized, controlled, test meal study was performed in men recruited according to APOE genotype (n = 10 APOE3/E3, n = 11 APOE3/E4). RESULTS: For the serum apoE response (meal × genotype interaction P = 0.038), concentrations were on average 8% lower after the UNSAT than the SFA-FO meal in APOE4 carriers (P = 0.015) only. In the genotype groups combined, there was a delay in the time to reach maximum triacylglycerol (TG) concentration (mean ± SEM: 313 ± 25 vs. 266 ± 27 min) and higher maximum nonesterified fatty acid (0.73 ± 0.05 vs. 0.60 ± 0.03 mmol/L) and glucose (7.92 ± 0.22 vs. 7.25 ± 0.22 mmol/L) concentrations after the SFA than the UNSAT meal, respectively (P ≤ 0.05). In the Svedberg flotation rate 60-400 TG-rich lipoprotein fraction, meal × genotype interactions were observed for the incremental area under the curve (IAUC) of the TG (P = 0.038) and apoE (P = 0.016) responses, with a 58% lower apoE IAUC after the UNSAT than the SFA meal (P = 0.017) in the E4 carriers. CONCLUSIONS: Our data indicate that APOE genotype had a modest impact on the postprandial response to meals of varying fat composition in normolipidemic men. The physiologic importance of greater apoE concentrations after the SFA-rich meals in APOE4 carriers may reflect an impact on TG-rich lipoprotein clearance from the circulation. This trial was registered at clinicaltrials.gov as NCT01522482.
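The incremental area under the curve (IAUC) reported above is conventionally the trapezoidal area measured above the fasting (time-zero) concentration; a minimal sketch under that assumption (the paper's exact variant, e.g. its handling of dips below baseline, is not stated here) is:

    def incremental_auc(times_min, concentrations):
        """Trapezoidal incremental area under the curve above baseline.
        Dips below the baseline concentration are truncated to zero,
        which is one common convention."""
        baseline = concentrations[0]
        increments = [max(c - baseline, 0.0) for c in concentrations]
        iauc = 0.0
        for i in range(1, len(times_min)):
            width = times_min[i] - times_min[i - 1]
            iauc += 0.5 * (increments[i - 1] + increments[i]) * width
        return iauc

    # e.g. incremental_auc([0, 60, 120, 180], [1.0, 1.8, 2.2, 1.4])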

Relevance:

10.00%

Publisher:

Abstract:

AEA Technology has provided an assessment of the probability of α-mode containment failure for the Sizewell B PWR. After a preliminary review of the available methodologies, it was decided to use the probabilistic approach described in the paper, based on an extension of the methodology developed by Theofanous et al. (Nucl. Sci. Eng. 97 (1987) 259–325). The input to the assessment comprises 12 probability distributions; the bases for the quantification of these distributions are discussed. The α-mode assessment performed for the Sizewell B PWR has demonstrated the practicality of the event-tree method with input data represented by probability distributions. The assessment itself has drawn attention to a number of topics, which may be plant and sequence dependent, and has indicated the importance of melt relocation scenarios. The α-mode failure probability following an accident that leads to core melt relocation to the lower head for the Sizewell B PWR has been assessed as a few parts in 10 000, on the basis of current information. This assessment has been the first to consider elevated pressures (6 MPa and 15 MPa) besides atmospheric pressure, but the results suggest only a modest sensitivity to system pressure.
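As a purely illustrative sketch of an event-tree calculation whose branch probabilities are themselves uncertain, the toy Monte Carlo below propagates two placeholder input distributions to a failure probability; the distributions, the two-branch tree, and all numbers are hypothetical and do not correspond to the twelve distributions of the Sizewell B assessment:

    import random

    def toy_alpha_mode_probability(n_samples=100_000, seed=1):
        """Propagate uncertain branch probabilities through a toy
        two-branch event tree by Monte Carlo sampling."""
        random.seed(seed)
        total = 0.0
        for _ in range(n_samples):
            p_relocation = random.betavariate(2, 8)    # placeholder distribution
            p_energetic = random.betavariate(1, 200)   # placeholder distribution
            total += p_relocation * p_energetic        # failure requires both branches
        return total / n_samples

    print(toy_alpha_mode_probability())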

Relevance:

10.00%

Publisher:

Abstract:

We show that small quantities of 1,3:2,4-di(4-chlorobenzylidene) sorbitol dispersed in poly(epsilon-caprolactone) provide a very effective self-assembling nanoscale framework which, combined with a flow field, yields extremely high levels of polymer crystal orientation. During modest shear flow of the polymer melt, the additive forms highly extended nanoparticles which adopt a preferred alignment with respect to the flow field. On cooling, polymer crystallisation is directed by these particles. This chloro-substituted dibenzylidene sorbitol is considerably more effective at directing the crystal growth of poly(epsilon-caprolactone) than the unsubstituted compound.

Relevance:

10.00%

Publisher:

Abstract:

The simulated annealing approach to structure solution from powder diffraction data, as implemented in the DASH program, is easily amenable to parallelization at the individual run level. Modest increases in speed of execution can therefore be achieved by distributing individual DASH runs across the cores of a multi-core CPU.
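A minimal sketch of this run-level parallelism, with a trivial stand-in objective where a real workflow would launch one independent DASH job per worker, might look like:

    import random
    from concurrent.futures import ProcessPoolExecutor

    def single_annealing_run(seed):
        """Stand-in for one independent simulated-annealing run;
        each run uses its own random seed and shares no state."""
        rng = random.Random(seed)
        state, best, temperature = rng.uniform(-5, 5), float("inf"), 1.0
        while temperature > 1e-3:
            candidate = state + rng.gauss(0, temperature)
            # accept improvements, or worse moves with a temperature-dependent chance
            if abs(candidate) < abs(state) or rng.random() < temperature:
                state = candidate
            best = min(best, abs(state))
            temperature *= 0.99
        return best

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:            # one run per available core
            results = list(pool.map(single_annealing_run, range(8)))
        print(min(results))                            # keep the best run overall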

Relevance:

10.00%

Publisher:

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and the analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys, one in which the design had four stages and was balanced and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper.
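The accumulation step described above (summing the components of variance from the shortest separating distance upwards to give a rough variogram) can be sketched as follows; the stage distances and component values in the example are illustrative, and the Fortran and REML code referred to in the paper are not reproduced here:

    def accumulated_variogram(components_by_lag):
        """Rough variogram from the variance components of a nested survey.
        `components_by_lag` maps each stage's separating distance to its
        estimated component of variance; the semivariance at a given lag
        is the running sum of components up to and including that lag."""
        gamma, running_total = {}, 0.0
        for lag in sorted(components_by_lag):
            running_total += components_by_lag[lag]
            gamma[lag] = running_total
        return gamma

    # e.g. a balanced four-stage design with lags in metres (illustrative values)
    print(accumulated_variogram({10: 0.8, 100: 0.5, 1000: 0.3, 10000: 0.2}))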

Relevance:

10.00%

Publisher:

Abstract:

In previous work we found that Cp2TiCl2 and its corresponding tamoxifen derivative, Titanocene tamoxifen, show an unexpected proliferative effect on the hormone-dependent breast cancer cell line MCF-7. To check whether this behavior is a general trend for titanocene derivatives, we tested two other titanocene derivatives, Titanocene Y and Titanocene K, on this cell line. Interestingly, these two titanocene complexes behave in totally different manners. Titanocene K is highly proliferative on MCF-7 cells even at low concentrations (0.5 μM), thus behaving almost like Cp2TiCl2. This proliferative effect is also observed in the presence of bovine serum albumin (BSA). In contrast, Titanocene Y alone has almost no effect on MCF-7 at a concentration of 10 μM, but exhibits a significant dose-dependent cytotoxic effect of up to 50% when incubated with BSA (20-50 μg/mL). This confirms the crucial role played by binding to serum proteins in the expression of the in vivo cytotoxicity of the titanocene complexes. Bis[(p-methoxybenzyl)cyclopentadienyl]iron(II) (Ferrocene Y) was synthesized by the hydridolithiation reaction of 6-p-anisylfulvene with LiBEt3H followed by transmetallation with iron dichloride. This complex, which was characterized by single-crystal X-ray diffraction, contains the robust ferrocenyl unit instead of Ti associated with easily displaced leaving groups such as chlorine, and shows only modest cytotoxicity against MCF-7 or MDA-MB-231 cells.

Relevance:

10.00%

Publisher:

Abstract:

We investigate thin films of cylinder-forming diblock copolymer confined between electrically charged parallel plates, using self-consistent-field theory (SCFT) combined with an exact treatment for linear dielectric materials. Our study focuses on the competition between the surface interactions, which tend to orient cylinder domains parallel to the plates, and the electric field, which favors a perpendicular orientation. The effect of the electric field on the relative stability of the competing morphologies is demonstrated with equilibrium phase diagrams, calculated with the aid of a weak-field approximation. As hoped, modest electric fields are shown to have a significant stabilizing effect on perpendicular cylinders, particularly for thicker films. Our improved SCFT-based treatment removes most of the approximations implemented by previous approaches, thereby managing to resolve outstanding qualitative inconsistencies among different approximation schemes.

Relevance:

10.00%

Publisher:

Abstract:

The commonly held view of the conditions in the North Atlantic at the last glacial maximum, based on the interpretation of proxy records, is of large-scale cooling compared to today, limited deep convection, and extensive sea ice, all associated with a southward-displaced and weakened overturning thermohaline circulation (THC) in the North Atlantic. Not all studies support that view; in particular, the "strength of the overturning circulation" is contentious and is a quantity that is difficult to determine even for the present day. Quasi-equilibrium simulations with coupled climate models forced by glacial boundary conditions have produced differing results, as have inferences made from proxy records. Most studies suggest a weaker circulation, some suggest little or no change, and a few suggest a stronger circulation. Here results are presented from a three-dimensional climate model, the Hadley Centre Coupled Model version 3 (HadCM3), of the coupled atmosphere - ocean - sea ice system suggesting, in a qualitative sense, that these diverging views could all have occurred at different times during the last glacial period, with different modes existing at different times. One mode might have been characterized by an active THC associated with moderate temperatures in the North Atlantic and a modest expanse of sea ice. The other mode, perhaps forced by large inputs of meltwater from the continental ice sheets into the northern North Atlantic, might have been characterized by a sluggish THC associated with very cold conditions around the North Atlantic and a large areal cover of sea ice. The authors' model simulation of such a mode, forced by a large input of freshwater, bears several of the characteristics of the Climate: Long-range Investigation, Mapping, and Prediction (CLIMAP) Project's reconstruction of glacial sea surface temperature and sea ice extent.

Relevance:

10.00%

Publisher:

Abstract:

We separate and quantify the sources of uncertainty in projections of regional (~2,500 km) precipitation changes for the twenty-first century using the CMIP3 multi-model ensemble, allowing a direct comparison with a similar analysis for regional temperature changes. For decadal means of seasonal mean precipitation, internal variability is the dominant uncertainty for predictions of the first decade everywhere, and for many regions until the third decade ahead. Model uncertainty is generally the dominant source of uncertainty for longer lead times. Scenario uncertainty is found to be small or negligible for all regions and lead times, apart from close to the poles at the end of the century. For the global mean, model uncertainty dominates at all lead times. The signal-to-noise ratio (S/N) of the precipitation projections is highest at the poles but less than 1 almost everywhere else, and is far lower than for temperature projections. In particular, the tropics have the highest S/N for temperature, but the lowest for precipitation. We also estimate a ‘potential S/N’ by assuming that model uncertainty could be reduced to zero, and show that, for regional precipitation, the gains in S/N are fairly modest, especially for predictions of the next few decades. This finding suggests that adaptation decisions will need to be made in the context of high uncertainty concerning regional changes in precipitation. The potential to narrow uncertainty in regional temperature projections is far greater. These conclusions on S/N are for the current generation of models; the real signal may be larger or smaller than the CMIP3 multi-model mean. Also note that the S/N for extreme precipitation, which is more relevant for many climate impacts, may be larger than for the seasonal mean precipitation considered here.
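The three sources of uncertainty above can be illustrated with a deliberately simplified variance partition over a nested ensemble (scenario → model → initial-condition members); the smoothing and signal-fitting steps of the published method are omitted, and the data layout is hypothetical:

    import statistics

    def partition_uncertainty(projections):
        """Toy partition of projection spread for one region and lead time.
        `projections[scenario][model]` is a list of decadal-mean changes
        from different initial-condition members."""
        # internal variability: member-to-member variance, averaged over models
        internal = statistics.mean(
            statistics.pvariance(members)
            for models in projections.values()
            for members in models.values()
        )
        # model uncertainty: spread of model means, averaged over scenarios
        model = statistics.mean(
            statistics.pvariance([statistics.mean(m) for m in models.values()])
            for models in projections.values()
        )
        # scenario uncertainty: spread of the per-scenario multi-model means
        scenario = statistics.pvariance([
            statistics.mean([statistics.mean(m) for m in models.values()])
            for models in projections.values()
        ])
        signal = statistics.mean(
            statistics.mean(m)
            for models in projections.values()
            for m in models.values()
        )
        total = internal + model + scenario
        return {"internal": internal, "model": model, "scenario": scenario,
                "signal_to_noise": signal / total ** 0.5 if total else float("inf")}

    # e.g. partition_uncertainty({"A2": {"m1": [1.0, 1.2], "m2": [0.8, 0.9]},
    #                             "B1": {"m1": [0.7, 0.8], "m2": [0.5, 0.6]}})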

Relevance:

10.00%

Publisher:

Abstract:

Objective: To evaluate CBTp delivered by non-expert therapists, using CBT-relevant measures. Methods: Participants (N=74) were randomised into immediate therapy or waiting list control groups. The therapy group was offered six months of therapy and followed up three months later. The waiting list group received therapy after waiting nine months (becoming the delayed therapy group). Results: Depression improved in the combined therapy group at both the end of therapy and follow-up. Other significant effects were found in only one of the two therapy groups (positive symptoms; cognitive flexibility; uncontrollability of thoughts) or at one of the two time points (end of therapy: PANSS general symptoms, anxiety, suicidal ideation, social functioning, resistance to voices; follow-up: power beliefs about voices, negative symptoms). There was no difference in costs between the groups. Conclusions: The only robust improvement was in depression. Nevertheless, there were further encouraging but modest improvements in both emotional and cognitive variables, in addition to psychotic symptoms.

Relevance:

10.00%

Publisher:

Abstract:

This paper argues that the Uruguay Round Agreement on Agriculture (URAA) introduced the market liberal paradigm as the ideational underpinning of the new farm trade regime. Though the immediate consequences in terms of limitations on agricultural support and protection were very modest, the Agreement did impact on the way in which domestic farm policy evolves. It forced EU agricultural policy makers to consider the agricultural negotiations when reforming the Common Agricultural Policy (CAP). The new paradigm in global farm trade resulted in a process of institutional layering in which concerns raised in the World Trade Organization (WTO) were gradually incorporated in EU agricultural institutions. This has resulted in gradual reform of the CAP in which policy instruments have been changed in order to make the CAP more WTO compatible. The underlying paradigm, the state-assisted paradigm, has been sustained though it has been rephrased by introducing the concept of multifunctionality.

Relevance:

10.00%

Publisher:

Abstract:

To improve the welfare of the rural poor and keep them in the countryside, the government of Botswana has been spending 40% of the value of agricultural GDP on agricultural support services. But can investment make smallholder agriculture prosperous in such adverse conditions? This paper derives an answer by applying a two-output, six-input stochastic translog distance function, with inefficiency effects and biased technical change, to panel data for the 18 districts and the commercial agricultural sector from 1979 to 1996. This model demonstrates that herds are the most important input, followed by draft power, land and seeds. Multilateral indices for technical change, technical efficiency and total factor productivity (TFP) show that the technology level of the commercial agricultural sector is more than six times that of traditional agriculture and that the gap has been increasing, due to technological regression in traditional agriculture and modest progress in commercial agriculture. Since the levels of efficiency are similar, the same pattern is repeated by the TFP indices. This result highlights the policy dilemma of the trade-off between efficiency and equity objectives.

Relevance:

10.00%

Publisher:

Abstract:

Recent studies of the current state of rural education and training (RET) systems in sub-Saharan Africa have assessed their ability to provide for the learning needs essential for more knowledgeable and productive small-scale rural households. These are most necessary if the endemic causes of rural poverty (poor nutrition, lack of sustainable livelihoods, etc.) are to be overcome. A brief historical background and analysis of the major current constraints to improvement in the sector are discussed. Paramount among those factors leading to its present 'malaise' is the lack of a whole-systems perspective and the absence of any coherent policy framework in most countries. There is evidence of some recent innovations, both in the public sector and through the work of non-governmental organisations (NGOs), civil society organisations (CSOs) and other private bodies. These provide hope of a new sense of direction that could lead towards meaningful 'revitalisation' of the sector. A suggested framework offers 10 key steps which, it is argued, could largely be achieved with modest internal resources and very little external support, provided that the necessary leadership and managerial capacities are in place.