810 results for incremental EM
Abstract:
Recent research shows that because they rely on separate goals, cognitions about not performing a behaviour are not simple opposites of cognitions about performing the same behaviour. Using this perspective, two studies (N = 758 & N = 104) examined the psycho-social determinants of reduction in resource consumption. Results showed that goals associated with reducing versus not reducing resource consumption were not simple opposites (Study 1). Additionally, the discriminant validity of the Theory of Planned Behaviour constructs associated with reducing versus not reducing resource consumption was demonstrated (Studies 1 & 2). Moreover, results revealed the incremental validity of both Intentions (to reduce and to not reduce resource consumption) for predicting a series of behaviours (Studies 1 & 2). Finally, results indicated a mediation role for the importance of ecological dimensions on the effect of both Intentions on a mock TV choice and a mediation role for the importance of non-ecological dimensions on the effect of the Intention of not reducing on the same TV choice. Discussion is organized around the consequences, at both theoretical and applied levels, of considering separate motivational systems for reducing and not reducing resource consumption.
Abstract:
Food security is one of this century’s key global challenges. By 2050 the world will require increased crop production in order to feed its predicted 9 billion people. This must be done in the face of changing consumption patterns, the impacts of climate change and the growing scarcity of water and land. Crop production methods will also have to sustain the environment, preserve natural resources and support livelihoods of farmers and rural populations around the world. There is a pressing need for the ‘sustainable intensification’ of global agriculture in which yields are increased without adverse environmental impact and without the cultivation of more land. Addressing the need to secure a food supply for the whole world requires an urgent international effort with a clear sense of long-term challenges and possibilities. Biological science, especially publicly funded science, must play a vital role in the sustainable intensification of food crop production. The UK has a responsibility and the capacity to take a leading role in providing a range of scientific solutions to mitigate potential food shortages. This will require significant funding of cross-disciplinary science for food security. The constraints on food crop production are well understood, but differ widely across regions. The availability of water and good soils are major limiting factors. Significant losses in crop yields occur due to pests, diseases and weed competition. The effects of climate change will further exacerbate the stresses on crop plants, potentially leading to dramatic yield reductions. Maintaining and enhancing the diversity of crop genetic resources is vital to facilitate crop breeding and thereby enhance the resilience of food crop production. Addressing these constraints requires technologies and approaches that are underpinned by good science.
Some of these technologies build on existing knowledge, while others are completely radical approaches, drawing on genomics and high-throughput analysis. Novel research methods have the potential to contribute to food crop production through both genetic improvement of crops and new crop and soil management practices. Genetic improvements to crops can occur through breeding or genetic modification to introduce a range of desirable traits. The application of genetic methods has the potential to refine existing crops and provide incremental improvements. These methods also have the potential to introduce radical and highly significant improvements to crops by increasing photosynthetic efficiency, reducing the need for nitrogen or other fertilisers and unlocking some of the unrealised potential of crop genomes. The science of crop management and agricultural practice also needs to be given particular emphasis as part of a food security grand challenge. These approaches can address key constraints in existing crop varieties and can be applied widely. Current approaches to maximising production within agricultural systems are unsustainable; new methodologies that utilise all elements of the agricultural system are needed, including better soil management and enhancement and exploitation of populations of beneficial soil microbes. Agronomy, soil science and agroecology—the relevant sciences—have been neglected in recent years. Past debates about the use of new technologies for agriculture have tended to adopt an either/or approach, emphasising the merits of particular agricultural systems or technological approaches and the downsides of others. This has been seen most obviously with respect to genetically modified (GM) crops, the use of pesticides and the arguments for and against organic modes of production. These debates have failed to acknowledge that there is no technological panacea for the global challenge of sustainable and secure global food production.
There will always be trade-offs and local complexities. This report considers both new crop varieties and appropriate agroecological crop and soil management practices and adopts an inclusive approach. No techniques or technologies should be ruled out. Global agriculture demands a diversity of approaches, specific to crops, localities, cultures and other circumstances. Such diversity demands that the breadth of relevant scientific enquiry is equally diverse, and that science needs to be combined with social, economic and political perspectives. In addition to supporting high-quality science, the UK needs to maintain and build its capacity to innovate, in collaboration with international and national research centres. UK scientists and agronomists have in the past played a leading role in disciplines relevant to agriculture, but training in agricultural sciences and related topics has recently suffered from a lack of policy attention and support. Agricultural extension services, connecting farmers with new innovations, have been similarly neglected in the UK and elsewhere. There is a major need to review the support for and provision of extension services, particularly in developing countries. The governance of innovation for agriculture needs to maximise opportunities for increasing production, while at the same time protecting societies, economies and the environment from negative side effects. Regulatory systems need to improve their assessment of benefits. Horizon scanning will ensure proactive consideration of technological options by governments. Assessment of benefits, risks and uncertainties should be seen broadly, and should include the wider impacts of new technologies and practices on economies and societies. Public and stakeholder dialogue—with NGOs, scientists and farmers in particular—needs to be a part of all governance frameworks.
Abstract:
This article explores the nature and impact of path dependence in British rail coal haulage before 1939. It examines the factors which locked Britain's railways into a system of small coal wagons with highly fragmented ownership, the cost penalties of this system, and the reasons that attempts at modernization were unsuccessful. The analysis highlights the importance of decentralized ownership of a highly durable installed base of complementary infrastructure. Technical and institutional interrelatedness blocked incremental modernization, while the political requirement to compensate private wagon owners for the loss of their wagon stock made wholesale rationalization financially unattractive.
Abstract:
Background: Medication errors are common in primary care and are associated with considerable risk of patient harm. We tested whether a pharmacist-led, information technology-based intervention was more effective than simple feedback in reducing the number of patients at risk from hazardous prescribing and inadequate blood-test monitoring of medicines 6 months after the intervention. Methods: In this pragmatic, cluster randomised trial general practices in the UK were stratified by research site and list size, and randomly assigned by a web-based randomisation service in block sizes of two or four to one of two groups. The practices were allocated to either computer-generated simple feedback for at-risk patients (control) or a pharmacist-led information technology intervention (PINCER), composed of feedback, educational outreach, and dedicated support. The allocation was masked to general practices, patients, pharmacists, researchers, and statisticians. Primary outcomes were the proportions of patients at 6 months after the intervention who had had any of three clinically important errors: non-selective non-steroidal anti-inflammatory drugs (NSAIDs) prescribed to those with a history of peptic ulcer without co-prescription of a proton-pump inhibitor; β blockers prescribed to those with a history of asthma; long-term prescription of angiotensin converting enzyme (ACE) inhibitor or loop diuretics to those 75 years or older without assessment of urea and electrolytes in the preceding 15 months. The cost per error avoided was estimated by incremental cost-effectiveness analysis. This study is registered with Controlled-Trials.com, number ISRCTN21785299. Findings: 72 general practices with a combined list size of 480 942 patients were randomised.
At 6 months’ follow-up, patients in the PINCER group were significantly less likely to have been prescribed a non-selective NSAID if they had a history of peptic ulcer without gastroprotection (OR 0∙58, 95% CI 0∙38–0∙89); a β blocker if they had asthma (0∙73, 0∙58–0∙91); or an ACE inhibitor or loop diuretic without appropriate monitoring (0∙51, 0∙34–0∙78). PINCER has a 95% probability of being cost effective if the decision-maker’s ceiling willingness to pay reaches £75 per error avoided at 6 months. Interpretation: The PINCER intervention is an effective method for reducing a range of medication errors in general practices with computerised clinical records. Funding: Patient Safety Research Portfolio, Department of Health, England.
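The cost per error avoided reported above is an incremental cost-effectiveness ratio: the extra cost of the intervention divided by the extra errors it avoids relative to the control arm. A minimal sketch of the calculation (the inputs below are hypothetical illustrative numbers, not trial data):

```python
def icer(cost_intervention, cost_control, errors_intervention, errors_control):
    """Incremental cost-effectiveness ratio: extra cost per error avoided.

    Costs are totals per arm; errors are counts of at-risk patients
    observed under each arm (hypothetical inputs for illustration).
    """
    delta_cost = cost_intervention - cost_control
    errors_avoided = errors_control - errors_intervention
    if errors_avoided <= 0:
        raise ValueError("intervention avoided no errors; ICER undefined")
    return delta_cost / errors_avoided

# An intervention costing £600 more that avoids 10 errors costs £60 per error
cost_per_error = icer(1000.0, 400.0, 10, 20)  # → 60.0
```

A decision-maker's willingness-to-pay threshold (here £75 per error avoided) is then compared against this ratio.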
Abstract:
This paper applies a reading of the postmodernisation of law to the incremental reform of agricultural holdings legislation over the last century. In charting the shifting legal basis of agricultural tenancies, from ‘black letter’ positivism to the cultural contextuality of sumptuary law, the paper theorises that the underlying political imperative has been allied to the changing significance of property ownership and use. Rather than reflecting the long-term official desire to maintain the let sector in British agriculture, however, the paper argues that this process has had other aims. In particular, it has been about an annexation of law to legitimise the retention of landowner power while presenting a rhetorical ‘democratisation’ of farming, away from its plutocratic associations and towards a new narrative of ‘depersonalised’ business.
Abstract:
Cloud imagery is not currently used in numerical weather prediction (NWP) to extract the type of dynamical information that experienced forecasters have extracted subjectively for many years. For example, rapidly developing mid-latitude cyclones have characteristic signatures in the cloud imagery that are most fully appreciated from a sequence of images rather than from a single image. The Met Office is currently developing a technique to extract dynamical development information from satellite imagery using their full incremental 4D-Var (four-dimensional variational data assimilation) system. We investigate a simplified form of this technique in a fully nonlinear framework. We convert information on the vertical wind field, w(z), and profiles of temperature, T(z, t), and total water content, qt(z, t), as functions of height, z, and time, t, to a single brightness temperature by defining a 2D (vertical and time) variational assimilation testbed. The profiles of w, T and qt are updated using a simple vertical advection scheme. We define a basic cloud scheme to obtain the fractional cloud amount and, when combined with the temperature field, we convert this information into a brightness temperature, having developed a simple radiative transfer scheme. With the exception of some matrix inversion routines, all our code is developed from scratch. Throughout the development process we test all aspects of our 2D assimilation system, and then run identical twin experiments to try to recover information on the vertical velocity from a sequence of observations of brightness temperature. This thesis contains a comprehensive description of our nonlinear models and assimilation system, and the first experimental results.
Abstract:
A new incremental four-dimensional variational (4D-Var) data assimilation algorithm is introduced. The algorithm does not require the computationally expensive integrations with the nonlinear model in the outer loops. Nonlinearity is accounted for by modifying the linearization trajectory of the observation operator based on integrations with the tangent linear (TL) model. This allows us to update the linearization trajectory of the observation operator in the inner loops at negligible computational cost. As a result the distinction between inner and outer loops is no longer necessary. The key idea on which the proposed 4D-Var method is based is that by using Gaussian quadrature it is possible to get an exact correspondence between the nonlinear time evolution of perturbations and the time evolution in the TL model. It is shown that J-point Gaussian quadrature can be used to derive the exact adjoint-based observation impact equations and furthermore that it is straightforward to account for the effect of multiple outer loops in these equations if the proposed 4D-Var method is used. The method is illustrated using a three-level quasi-geostrophic model and the Lorenz (1996) model.
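The exactness property of Gaussian quadrature that the method exploits can be checked directly: an n-point Gauss–Legendre rule integrates polynomials up to degree 2n − 1 exactly. A small sketch illustrating the quadrature property only, not the 4D-Var algorithm itself:

```python
import numpy as np

# 3-point Gauss-Legendre rule on [-1, 1]: exact for polynomials up to degree 5
nodes, weights = np.polynomial.legendre.leggauss(3)

def gauss_quad(f):
    """Approximate the integral of f over [-1, 1] with the 3-point rule."""
    return float(np.sum(weights * f(nodes)))

# The integral of x**4 over [-1, 1] is exactly 2/5; the 3-point rule
# reproduces it to machine precision because deg(x**4) = 4 <= 5.
value = gauss_quad(lambda x: x**4)  # → 0.4 (up to rounding)
```

It is this exact reproduction of low-degree polynomial behaviour that allows a quadrature-based scheme to match the nonlinear evolution of perturbations with tangent-linear integrations.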
Abstract:
Objective: An exaggerated postprandial triacylglycerol (TAG) response is an important determinant of cardiovascular disease risk. With increased recognition of the role of leptin in systemic macronutrient metabolism, we used a candidate gene approach to examine the impact of the common leptin receptor (LEPR) Gln223Arg polymorphism (rs1137101) on postprandial lipaemia. Methods and results: Healthy adults (n = 251) underwent a sequential meal postprandial investigation, in which blood samples were taken at regular intervals after a test breakfast (t = 0) and lunch (t = 330 min). Fasting total- and low-density lipoprotein cholesterol were 9% lower in the ArgArg than GlnArg group (P < 0.04), whereas fasting TAG was 27% lower in the ArgArg than GlnGln group (P < 0.02). The magnitude of the postprandial TAG response was also significantly lower in the ArgArg compared with the GlnArg and GlnGln genotypes, with a 26% lower area under the curve (AUC) and incremental AUC in the ArgArg individuals (P ≤ 0.023). Genotype*gender interactions were evident for fasting and postprandial TAG responses (P < 0.05), with the genotype effect only evident in males. Regression analysis indicated that the LEPR genotype and genotype*gender interactions were independent predictors of the TAG AUC, accounting for 6.3% of the variance. Our main findings were replicated in the independent LIPGENE-Cordoba postprandial cohort of metabolic syndrome subjects (n = 75), with a 52% lower TAG AUC in the ArgArg than GlnGln male subjects (P = 0.018). Conclusion: We report for the first time that the common LEPR Gln223Arg genotype is an important predictor of postprandial TAG in males. The mechanistic basis of these associations remains to be determined.
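The AUC and incremental AUC used to summarise postprandial responses are standard trapezoid-rule integrals of the concentration time course; the incremental version subtracts the fasting (t = 0) value before integrating. A minimal sketch, noting that some iAUC definitions additionally clip negative increments to zero:

```python
def trapezoid(y, x):
    """Trapezoid-rule integral of samples y taken at times x."""
    return sum((x[i + 1] - x[i]) * (y[i + 1] + y[i]) / 2.0
               for i in range(len(x) - 1))

def auc_and_iauc(times, conc):
    """Total AUC, and incremental AUC above the fasting baseline conc[0]."""
    auc = trapezoid(conc, times)
    iauc = trapezoid([c - conc[0] for c in conc], times)
    return auc, iauc

# A concentration that rises from 1.0 to 2.0 and falls back over 2 h
auc, iauc = auc_and_iauc([0.0, 1.0, 2.0], [1.0, 2.0, 1.0])  # → (3.0, 1.0)
```

The iAUC isolates the meal-induced excursion, which is why it can diverge from total AUC when fasting levels differ between genotype groups.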
Abstract:
Objective: Studies have started to question whether a specific component or combinations of metabolic syndrome (MetS) components may be more important in relation to cardiovascular disease risk. Our aim was to examine the impact of the presence of raised fasting glucose as a MetS component on postprandial lipaemia. Methods: Men classified with the MetS underwent a sequential test meal investigation, in which blood samples were taken at regular intervals after a test breakfast (t=0 min) and lunch (t=330 min). Lipids, glucose and insulin were measured in the fasting and postprandial samples. Results: MetS subjects with 3 or 4 components were subdivided into those without (n=34) and with (n=23) fasting hyperglycaemia (≥ 5.6 mmol/l), irrespective of the combination of components. Fasting lipids and insulin were similar in the two groups, with glucose significantly higher in the men with glucose as a MetS component (P<0.001). Following the test meals, there was a higher maximum concentration (maxC), area under the curve (AUC) and incremental AUC (P≤0.016) for the postprandial triacylglycerol (TAG) response in men with fasting hyperglycaemia. Greater glucose AUC (P<0.001) and insulin maxC (P=0.010) was also observed in these individuals after the test meals. Multivariate regression analysis revealed fasting glucose to be an important predictor of the postprandial TAG and glucose response. Conclusion: Our data analysis has revealed a greater impairment of postprandial TAG than glucose response in MetS subjects with raised fasting glucose. The worsening of postprandial lipaemic control may contribute to the greater CVD risk reported in individuals with MetS component combinations which include hyperglycaemia.
Abstract:
Remote sensing observations often have correlated errors, but the correlations are typically ignored in data assimilation for numerical weather prediction. The assumption of zero correlations is often used with data thinning methods, resulting in a loss of information. As operational centres move towards higher-resolution forecasting, there is a requirement to retain data providing detail on appropriate scales. Thus an alternative approach to dealing with observation error correlations is needed. In this article, we consider several approaches to approximating observation error correlation matrices: diagonal approximations, eigendecomposition approximations and Markov matrices. These approximations are applied in incremental variational assimilation experiments with a 1-D shallow water model using synthetic observations. Our experiments quantify analysis accuracy in comparison with a reference or ‘truth’ trajectory, as well as with analyses using the ‘true’ observation error covariance matrix. We show that it is often better to include an approximate correlation structure in the observation error covariance matrix than to incorrectly assume error independence. Furthermore, by choosing a suitable matrix approximation, it is feasible and computationally cheap to include error correlation structure in a variational data assimilation algorithm.
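Two of the approximations mentioned are easy to state concretely: a first-order Markov matrix models correlations that decay exponentially with observation separation, and an eigendecomposition approximation retains only the leading eigenpairs of the full matrix. A sketch under the assumption of evenly spaced 1-D observations (illustrative, not the paper's exact experimental setup):

```python
import numpy as np

def markov_correlation(n, rho):
    """First-order Markov correlation matrix: C[i, j] = rho ** |i - j|."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def leading_eig_approx(C, k):
    """Low-rank approximation of C built from its k leading eigenpairs."""
    w, v = np.linalg.eigh(C)          # eigenvalues in ascending order
    w, v = w[::-1], v[:, ::-1]        # reorder to descending
    return (v[:, :k] * w[:k]) @ v[:, :k].T
```

Both forms are cheap to store and apply, which is what makes including approximate correlation structure feasible inside a variational minimisation; a diagonal approximation is simply the limiting case that discards all off-diagonal terms.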
Abstract:
We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.
Abstract:
Dietary nitrate, from beetroot, has been reported to lower blood pressure (BP) by the sequential reduction of nitrate to nitrite and further to NO in the circulation. However, the impact of beetroot on microvascular vasodilation and arterial stiffness is unknown. In addition, beetroot is consumed by only 4.5% of the UK population, whereas bread is a staple component of the diet. Thus, we investigated the acute effects of beetroot bread (BB) on microvascular vasodilation, arterial stiffness, and BP in healthy participants. Twenty-three healthy men received 200 g bread containing 100 g beetroot (1.1 mmol nitrate) or 200 g control white bread (CB; 0 g beetroot, 0.01 mmol nitrate) in an acute, randomized, open-label, controlled crossover trial. The primary outcome was postprandial microvascular vasodilation measured by laser Doppler iontophoresis and the secondary outcomes were arterial stiffness measured by Pulse Wave Analysis and Velocity and ambulatory BP measured at regular intervals for a total period of 6 h. Plasma nitrate and nitrite were measured at regular intervals for a total period of 7 h. The incremental area under the curve (0-6 h after ingestion of bread) for endothelium-independent vasodilation was greater (P = 0.017) and lower for diastolic BP (DBP; P = 0.032) but not systolic (P = 0.99) BP after BB compared with CB. These effects occurred in conjunction with increases in plasma and urinary nitrate (P < 0.0001) and nitrite (P < 0.001). BB acutely increased endothelium-independent vasodilation and decreased DBP. Therefore, enriching bread with beetroot may be a suitable vehicle to increase intakes of cardioprotective beetroot in the diet and may provide new therapeutic perspectives in the management of hypertension.
Abstract:
Mathematics in Defence 2011 Abstract. We review transreal arithmetic and present transcomplex arithmetic. These arithmetics have no exceptions. This leads to incremental improvements in computer hardware and software. For example, the range of real numbers, encoded by floating-point bits, is doubled when all of the Not-a-Number (NaN) states, in IEEE 754 arithmetic, are replaced with real numbers. The task of programming such systems is simplified and made safer by discarding the unordered relational operator, leaving only the operators less-than, equal-to, and greater-than. The advantages of using a transarithmetic in a computation, or transcomputation as we prefer to call it, may be had by making small changes to compilers and processor designs. However, radical change is possible by exploiting the reliability of transcomputations to make pipelined dataflow machines with a large number of cores. Our initial designs are for a machine with order one million cores. Such a machine can complete the execution of multiple in-line programs each clock tick.
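The flavour of exception-free arithmetic can be sketched in software: division is made total by mapping x/0 to signed infinity and 0/0 to the definite value nullity (Φ), instead of raising an error or producing an unordered NaN. This is a toy model only; the proposal above concerns hardware number encodings, not this Python sketch, and "Phi" is just a stand-in token for nullity:

```python
NULLITY = "Phi"  # stand-in for transreal nullity, the definite value of 0/0

def transreal_div(a, b):
    """Total division: defined for every pair of inputs, never raises."""
    if b != 0:
        return a / b
    if a > 0:
        return float("inf")
    if a < 0:
        return float("-inf")
    return NULLITY  # 0/0 -> nullity, an ordered value rather than NaN
```

Because every result is an ordered value, comparisons need only less-than, equal-to, and greater-than, which is the simplification the abstract refers to.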
Abstract:
We present an efficient method of combining wide angle neutron scattering data with detailed atomistic models, allowing us to perform a quantitative and qualitative mapping of the organisation of the chain conformation in both glass and liquid phases. The structural refinement method presented in this work is based on the exploitation of the intrachain features of the diffraction pattern and its intimate linkage with atomistic models by the use of internal coordinates for bond lengths, valence angles and torsion rotations. Atomic connectivity is defined through these coordinates, which are in turn assigned by pre-defined probability distributions, thus allowing for the models in question to be built stochastically. Incremental variation of these coordinates allows for the construction of models that minimise the differences between the observed and calculated structure factors. We present a series of neutron scattering data of 1,2 polybutadiene in the region 120–400 K. Analysis of the experimental data yields bond lengths for C-C and C=C of 1.54 Å and 1.35 Å respectively. Valence angles of the backbone were found to be at 112° and the torsion distributions are characterised by five rotational states, a three-fold trans-skew± for the backbone and gauche± for the vinyl group. Rotational states of the vinyl group were found to be equally populated, indicating a largely atactic chain. The two backbone torsion angles exhibit different behaviour with respect to temperature of their trans population, with one of them adopting an almost all-trans sequence. Consequently the resulting configuration leads to a rather persistent chain, as indicated by the value of the characteristic ratio extrapolated from the model. We compare our results with theoretical predictions, computer simulations, RIS models and previously reported experimental results.
Abstract:
The Plaut, McClelland, Seidenberg and Patterson (1996) connectionist model of reading was evaluated at two points early in its training against reading data collected from British children on two occasions during their first year of literacy instruction. First, the network’s non-word reading was poor relative to word reading when compared with the children. Second, the network made more non-lexical than lexical errors, the opposite pattern to the children. Three adaptations were made to the training of the network to bring it closer to the learning environment of a child: an incremental training regime was adopted; the network was trained on grapheme–phoneme correspondences; and a training corpus based on words found in children’s early reading materials was used. The modifications caused a sharp improvement in non-word reading, relative to word reading, resulting in a near perfect match to the children’s data on this measure. The modified network, however, continued to make predominantly non-lexical errors, although evidence from a small-scale implementation of the full triangle framework suggests that this limitation stems from the lack of a semantic pathway. Taken together, these results suggest that, when properly trained, connectionist models of word reading can offer insights into key aspects of reading development in children.