Abstract:
Controlled human intervention trials are required to confirm the hypothesis that dietary fat quality may influence insulin action. The aim was to develop a food-exchange model, suitable for use in free-living volunteers, to investigate the effects of four experimental diets distinct in fat quantity and quality: high-SFA (HSFA), high-MUFA (HMUFA) and two low-fat (LF) diets, one supplemented with 1.24 g EPA and DHA/d (LFn-3). A theoretical food-exchange model was developed. The average quantity of exchangeable fat was calculated as the sum of fat provided by added fats (spreads and oils), milk, cheese, biscuits, cakes, buns and pastries, using data from the National Diet and Nutrition Survey of UK adults. Most of the exchangeable fat was replaced by specifically designed study foods. Also critical to the model was the use of carbohydrate exchanges to ensure the diets were isoenergetic. Volunteers from eight centres across Europe completed the dietary intervention. Results indicated that compositional targets were largely achieved, with significant differences in fat quantity between the high-fat diets (39.9 (SEM 0.6) and 38.9 (SEM 0.51) percentage energy (%E) from fat for the HSFA and HMUFA diets respectively) and the low-fat diets (29.6 (SEM 0.6) and 29.1 (SEM 0.5) %E from fat for the LF and LFn-3 diets respectively), and in fat quality (17.5 (SEM 0.3) and 10.4 (SEM 0.2) %E from SFA, and 12.7 (SEM 0.3) and 18.7 (SEM 0.4) %E from MUFA, for the HSFA and HMUFA diets respectively). In conclusion, a robust, flexible food-exchange model was developed and implemented successfully in the LIPGENE dietary intervention trial.
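The isoenergetic carbohydrate-exchange idea can be illustrated with a short, hypothetical calculation. The Atwater energy factors are standard; the intake figures below are invented for illustration and are not LIPGENE data:

```python
# Hypothetical sketch: percentage of energy (%E) from fat for a day's intake.
# Intake figures are illustrative, not LIPGENE study data.
ATWATER = {"fat": 9.0, "carbohydrate": 4.0, "protein": 4.0}  # kcal per gram

def percent_energy_from_fat(fat_g, carb_g, protein_g):
    """Return %E from fat using Atwater factors."""
    total_kcal = (fat_g * ATWATER["fat"]
                  + carb_g * ATWATER["carbohydrate"]
                  + protein_g * ATWATER["protein"])
    return 100.0 * fat_g * ATWATER["fat"] / total_kcal

# Exchanging 10 g of fat (90 kcal) for 22.5 g of carbohydrate (90 kcal)
# keeps the diet isoenergetic while lowering %E from fat.
high_fat = percent_energy_from_fat(fat_g=100, carb_g=250.0, protein_g=90)
low_fat = percent_energy_from_fat(fat_g=90, carb_g=272.5, protein_g=90)
```

Both hypothetical diets supply 2260 kcal, but %E from fat drops from roughly 39.8 to 35.8, which is the kind of shift the carbohydrate exchanges in the model are designed to produce.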
Abstract:
The cheese industry has continually sought a robust method to monitor milk coagulation. Measurement of whey separation is also critical to control cheese moisture content, which affects quality. The objective of this study was to demonstrate that an online optical sensor detecting light backscatter in a vat could be applied to monitor both coagulation and syneresis during cheesemaking. A prototype sensor having a large field of view (LFV) relative to curd particle size was constructed. Temperature, cutting time, and calcium chloride addition were varied to evaluate the response of the sensor over a wide range of coagulation and syneresis rates. The LFV sensor response was related to casein micelle aggregation and curd firming during coagulation and to changes in curd moisture and whey fat contents during syneresis. The LFV sensor has potential as an online, continuous sensor technology for monitoring both coagulation and syneresis during cheesemaking.
Abstract:
This article summarises recent revisions to the investment development path (IDP) as postulated by Narula and Dunning (2010). The IDP provides a framework to understand the dynamic interaction between foreign direct investment (FDI) and economic development. The revisions take into account some recent changes in the global economic environment. This paper argues that studies based on the IDP should adopt a broader perspective, encompassing the idiosyncratic economic structure of countries as well as the heterogeneous nature of FDI. It is critical to understand the complex forces and interactions that determine the turning points in a country’s IDP, and to more explicitly acknowledge the role of historical, social and political circumstances in hindering or promoting FDI. We discuss some of the implications for Eastern European countries and provide some guidelines for future research.
Abstract:
This study examines criteria for the existence of two stable states of the Atlantic Meridional Overturning Circulation (AMOC) using a combination of theory and simulations from a numerical coupled atmosphere–ocean climate model. By formulating a simple collection of state parameters and their relationships, the authors reconstruct the North Atlantic Deep Water (NADW) OFF state behavior under a varying external salt-flux forcing. This part (Part I) of the paper examines the steady-state solution, which gives insight into the mechanisms that sustain the NADW OFF state in this coupled model; Part II deals with the transient behavior predicted by the evolution equation. The nonlinear behavior of the Antarctic Intermediate Water (AAIW) reverse cell is critical to the OFF state. Higher Atlantic salinity leads both to a reduced AAIW reverse cell and to a greater vertical salinity gradient in the South Atlantic. The former tends to reduce Atlantic salt export to the Southern Ocean, while the latter tends to increase it. These competing effects produce a nonlinear response of Atlantic salinity and salt export to salt forcing, and the existence of maxima in these quantities. Thus the authors obtain a natural and accurate analytical saddle-node condition for the maximal surface salt flux for which a NADW OFF state exists. By contrast, the bistability indicator proposed by De Vries and Weber does not generally work in this model. It is applicable only when the effect of the AAIW reverse cell on the Atlantic salt budget is weak.
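The saddle-node argument can be written schematically in generic notation (these symbols are illustrative, not the authors' own). If the Atlantic salt export $E$ is a nonlinear function of Atlantic salinity $S$ with a maximum, a steady OFF state requires the imposed surface salt flux $F$ to balance it:

```latex
F = E(S), \qquad \left.\frac{dE}{dS}\right|_{S=S^{*}} = 0 ,
```

so steady OFF-state solutions exist only for $F \le E(S^{*})$, and the saddle-node sits at the maximal flux $F_{\max} = E(S^{*})$, where the two solution branches merge and disappear.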
Abstract:
Starch is the most widespread and abundant storage carbohydrate in crops and its production is critical to both crop yield and quality. As regards the starch content in the seeds of crop plants, there are distinct differences between grasses (Poaceae) and dicots. However, few studies have described the evolutionary pattern of genes in the starch biosynthetic pathway in these two groups of plants. In this study, therefore, an attempt was made to compare the evolutionary rate, gene duplication and selective pattern of the key genes involved in this pathway between the two groups, using five grasses and five dicots as materials. The results showed (i) distinct differences in patterns of gene duplication and loss between grasses and dicots: duplication in grasses mainly occurred prior to the divergence of grasses, whereas duplication mostly occurred in individual species within the dicots, and there is less gene loss in grasses than in dicots; (ii) a considerably higher evolutionary rate in grasses than in dicots in most gene families analyzed; (iii) evidence of a different selective pattern between grasses and dicots: positive selection may have occurred asymmetrically in grasses in some gene families, e.g. the AGPase small subunit. Therefore, we deduced that gene duplication contributes to, and a higher evolutionary rate is associated with, the higher starch content in grasses. In addition, two novel aspects of the evolution of the starch biosynthetic pathway were observed.
Abstract:
Wernicke’s aphasia (WA) is the classical neurological model of comprehension impairment and, as a result, the posterior temporal lobe is assumed to be critical to semantic cognition. This conclusion is potentially complicated by (a) the existence of patient groups with semantic impairment following damage to other brain regions (semantic dementia and semantic aphasia) and (b) an ongoing debate about the underlying causes of comprehension impairment in WA. By directly comparing these three patient groups for the first time, we demonstrate that the comprehension impairment in Wernicke’s aphasia is best accounted for by dual deficits in acoustic-phonological analysis (associated with pSTG) and semantic cognition (associated with pMTG and angular gyrus). The WA group were impaired on both nonverbal and verbal comprehension assessments, consistent with a generalised semantic impairment. This semantic deficit was most similar in nature to that of the semantic aphasia group, suggestive of a disruption to semantic control processes. In addition, only the WA group showed a strong effect of input modality on comprehension, with accuracy decreasing considerably as acoustic-phonological requirements increased. These results deviate from traditional accounts, which emphasise a single impairment, and instead implicate two deficits underlying the comprehension disorder in WA.
Abstract:
Effectively preparing and planning for Customer Relationship Management (CRM) strategy is critical to CRM implementation success. The lack of a common, systematic way to implement CRM means that focus must be placed on the pre-implementation stage to improve the chance of success. Although existing CRM implementation approaches evidence the need to concentrate mostly on the pre-implementation stage, they fail to address some key issues, which raises the need for a generic framework that addresses CRM strategy analysis. This paper proposes a framework to support effective CRM pre-implementation strategy development.
Abstract:
Purpose – The purpose of this paper is to demonstrate key strategic decisions involved in turning around a large multinational operating in a dynamic market. Design/methodology/approach – The paper is based on analysis of archival documents and a semi-structured interview with the chairman of the company credited with its rescue. Findings – Turnaround is complex and involves both planned and emergent strategies. The progress is non-linear, requiring adjustment and change in the direction of travel. Top management credibility and vision are critical to success. Rescue is only possible if the company has a strong cash-generative business among its businesses. The speed of decision making, decisiveness and the ability to implement strategy are among the key ingredients of success. Originality/value – Turnaround is an under-researched area in strategy. This paper contributes to a better understanding in this important area and bridges the gap between theory and practice. It provides a practical view and demonstrates how a leading executive with significant expertise and a successful turnaround track record deals with the inherent dilemmas of turnaround.
Abstract:
Area-wide development viability appraisals are undertaken to determine the economic feasibility of policy targets in relation to planning obligations. Essentially, development viability appraisals consist of a series of residual valuations of hypothetical development sites across a local authority area at a particular point in time. The valuations incorporate the estimated financial implications of the proposed level of planning obligations. To determine viability, the output land values are benchmarked against threshold land value, and therefore the basis on which this threshold is established, and the level at which it is set, are critical to development viability appraisal at the policy-setting (area-wide) level. Essentially it is an estimate of the value at which a landowner would be prepared to sell. If the estimated site values are higher than the threshold land value, the policy target is considered viable. This paper investigates the effectiveness of existing methods of determining threshold land value. These methods are tested against the relationship between development value and costs. Modelling reveals that a threshold land value that is not related to shifts in development value renders marginal sites unviable and fails to collect proportionate planning obligations from high-value/low-cost sites. Testing the model against national average house prices and build costs reveals the high degree of volatility in residual land values over time and underlines the importance of making threshold land value relative to the main driver of this volatility, namely development value.
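The benchmarking step can be sketched as a toy residual valuation. The function names and figures below are hypothetical illustrations, not the paper's model:

```python
# Hypothetical residual valuation for one site (all figures illustrative).
def residual_land_value(gdv, build_cost, fees, developer_profit, planning_obligations):
    """Residual land value: gross development value minus all costs,
    required profit and the proposed planning obligations."""
    return gdv - build_cost - fees - developer_profit - planning_obligations

def policy_is_viable(residual, threshold_land_value):
    """The policy target is viable on this site if the residual meets the threshold."""
    return residual >= threshold_land_value

residual = residual_land_value(gdv=10_000_000, build_cost=6_000_000,
                               fees=500_000, developer_profit=1_500_000,
                               planning_obligations=800_000)
viable = policy_is_viable(residual, threshold_land_value=1_000_000)
```

Note how leveraged the residual is: in this illustration a 10% fall in development value (gdv of 9,000,000) cuts the residual from 1,200,000 to 200,000, turning the site unviable against a fixed threshold, which is the volatility problem a development-value-linked threshold is meant to address.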
Abstract:
With the growing number and significance of urban meteorological networks (UMNs) across the world, it is becoming critical to establish a standard metadata protocol. Indeed, a review of existing UMNs indicates large variations in the quality, quantity, and availability of metadata containing technical information (i.e., equipment, communication methods) and network practices (i.e., quality assurance/quality control and data management procedures). Without such metadata, the utility of UMNs is greatly compromised. There is a need to bring together the currently disparate sets of guidelines to ensure informed and well-documented future deployments. This should significantly improve the quality, and therefore the applicability, of the high-resolution data available from such networks. Here, the first metadata protocol for UMNs is proposed, drawing on current recommendations for urban climate stations and identified best practice in existing networks.
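As an illustration only, a minimal station-metadata record of the kind such a protocol might standardise could look like the following. All field names and values are assumptions for the sketch, not the protocol actually proposed in the paper:

```python
# Hypothetical minimal metadata record for one UMN station.
# Field names are illustrative, not the proposed protocol.
from dataclasses import dataclass, field

@dataclass
class StationMetadata:
    station_id: str
    latitude_deg: float
    longitude_deg: float
    sensor_height_m: float        # instrument height above ground
    instrument_model: str         # equipment (technical information)
    communication_method: str     # e.g. GPRS, Wi-Fi (technical information)
    qa_qc_procedure: str          # network practice: QA/QC routine
    notes: list[str] = field(default_factory=list)

station = StationMetadata("UMN-001", 52.48, -1.90, 3.0,
                          "ExampleTemp-100", "GPRS", "daily range check")
```

Recording even this much per station would address the variation in metadata quality and availability that the review identifies.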
Abstract:
Forgetting immediate physical reality and having awareness of one's location in the simulated world are critical to enjoyment and performance in virtual environments, be it an interactive 3D game such as Quake or an online virtual 3D community space such as Second Life. Answers to the question "where am I?" at two levels, whether the locus is in the immediate real world as opposed to the virtual world, and whether one is aware of the spatial co-ordinates of that locus, hold the key to any virtual 3D experience. While 3D environments, especially virtual environments, and their impact on spatial comprehension have been studied in disciplines such as architecture, it is difficult to determine the relative contributions of specific attributes such as screen size or stereoscopy towards spatial comprehension, since most studies treat the technology as a monolith (box-centered). Using as its theoretical basis the variable-centered approach put forth by Nass and Mason (1990), which breaks down the technology into its component variables and their corresponding values, this paper looks at the contributions to spatial comprehension and presence of five variables common to most virtual environments (stereoscopy, screen size, field of view, level of realism and level of detail). The variable-centered approach can be daunting, as an increase in the number of variables can exponentially increase the number of conditions and resources required. We overcome this drawback by using a fractional factorial design for the experiment. This study has completed the first wave of data collection; the next phase starts in January 2007 and is expected to complete by February 2007. Theoretical and practical implications of the study are discussed.
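The half-fraction idea can be sketched as follows. A 2^(5-1) design with defining relation I = ABCDE is a standard construction for five two-level factors and is an assumption here, since the abstract does not state which fraction was used:

```python
# Sketch of a 2^(5-1) fractional factorial design for five two-level factors
# (here labelled after the study's variables; the specific fraction is an
# assumption, not necessarily the design used in the study).
from itertools import product
from math import prod

factors = ["stereoscopy", "screen_size", "field_of_view", "realism", "detail"]

# A full factorial needs 2**5 = 32 conditions. The half fraction with
# defining relation I = ABCDE keeps only runs whose five coded levels
# (+1 / -1) multiply to +1, halving the number of conditions to 16.
design = [run for run in product([-1, +1], repeat=5) if prod(run) == +1]
```

This is exactly the resource saving the paper appeals to: the number of experimental conditions grows as 2^k with k factors, and each added generator halves it again.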
Abstract:
The rise of food security up international political, societal and academic agendas has led to increasing interest in novel means of improving primary food production and reducing waste. There are, however, also many ‘post-farm gate’ activities that are critical to food security, including processing, packaging, distributing, retailing, cooking and consuming. These activities all affect a range of important food security elements, notably availability, affordability and other aspects of access, nutrition and safety. Addressing the challenge of universal food security, in the context of a number of other policy goals (e.g. social, economic and environmental sustainability), is of keen interest to a range of UK stakeholders but requires an up-to-date evidence base and continuous innovation. An exercise was therefore conducted, under the auspices of the UK Global Food Security Programme, to identify priority research questions with a focus on the UK food system (though the outcomes may be broadly applicable to other developed nations). Emphasis was placed on incorporating a wide range of perspectives (‘world views’) from different stakeholder groups: policy, private sector, non-governmental organisations, advocacy groups and academia. A total of 456 individuals submitted 820 questions, from which 100 were selected by a process of online voting and a three-stage workshop voting exercise. These 100 final questions were sorted into 10 themes and the ‘top’ question for each theme identified by a further voting exercise. This step also allowed four different stakeholder groups to select the top 7–8 questions from their perspectives. Results of these voting exercises are presented. It is clear from the wide range of questions prioritised in this exercise that the different stakeholder groups identified specific research needs on a range of post-farm gate activities and food security outcomes.
Evidence needs related to food affordability, nutrition and food safety (all key elements of food security) featured highly in the exercise. While there were some questions relating to climate impacts on production, other important topics for food security (e.g. trade, transport, preference and cultural needs) were not viewed as strongly by the participants.
Abstract:
The Walker circulation is one of the major components of the large-scale tropical atmospheric circulation, and variations in its strength are critical to equatorial Pacific Ocean circulation. It has been argued in the literature that during the 20th century the Walker circulation weakened, and that this weakening was attributable to anthropogenic climate change. By using updated observations, we show that there has been a rapid interdecadal enhancement of the Walker circulation since the late 1990s. Associated with this enhancement are enhanced precipitation in the tropical western Pacific, anomalous westerlies in the upper troposphere, descent in the central and eastern tropical Pacific, and anomalous surface easterlies in the western and central tropical Pacific. The characteristics of associated oceanic changes are a strengthened thermocline slope and an enhanced zonal SST gradient across the tropical Pacific. Many characteristics of these changes are similar to those associated with the mid-1970s climate shift, with an opposite sign. We also show that the interdecadal variability of the Walker circulation in the tropical Pacific is inversely correlated with the interdecadal variability of the zonal circulation in the tropical Atlantic. An enhancement of the Walker circulation in the tropical Pacific is associated with a weakening zonal circulation in the tropical Atlantic and vice versa, implying an inter-Atlantic-Pacific connection of the zonal overturning circulation variation. Whether these recent changes will be sustained is not yet clear, but our research highlights the importance of understanding the interdecadal variability, as well as the long-term trends, that influence tropical circulation.
Abstract:
The long observational record is critical to our understanding of the Earth’s climate, but most observing systems were not developed with a climate objective in mind. As a result, tremendous efforts have gone into assessing and reprocessing the data records to improve their usefulness in climate studies. The purpose of this paper is both to review recent progress in reprocessing and reanalyzing observations, and to summarize the challenges that must be overcome in order to improve our understanding of climate and variability. Reprocessing improves data quality through more scrutiny and improved retrieval techniques for individual observing systems, while reanalysis merges many disparate observations with models through data assimilation; both aim to provide a climatology of Earth processes. Many challenges remain, such as tracking the improvement of processing algorithms and limited spatial coverage. Reanalyses have fostered significant research, yet reliable global trends in many physical fields are not yet attainable, despite significant advances in data assimilation and numerical modeling. Oceanic reanalyses have made significant advances in recent years, but are discussed here only in terms of progress toward integrated Earth system analyses. Climate data sets are generally adequate for process studies and large-scale climate variability. Communication of the strengths, limitations and uncertainties of reprocessed observations and reanalysis data, not only among the community of developers but also with the extended research community, including new generations of researchers and decision makers, is crucial for further advancement of the observational data records. It must be emphasized that careful investigation of the data and processing methods is required to use the observations appropriately.
Abstract:
Observational evidence is scarce concerning the distribution of plant pathogen population sizes or densities as a function of time-scale or spatial scale. For wild pathosystems we can only get indirect evidence from evolutionary patterns and the consequences of biological invasions. We have little or no evidence bearing on extermination of hosts by pathogens, or successful escape of a host from a pathogen. Evidence over the last couple of centuries from crops suggests that the abundance of particular pathogens in the spectrum affecting a given host can vary hugely on decadal timescales. However, this may be an artefact of domestication and intensive cultivation. Host-pathogen dynamics can be formulated mathematically fairly easily, for example as SIR-type differential-equation or difference-equation models, and this has been the (successful) focus of recent work in crops. “Long-term” is then discussed in terms of the time taken to relax from a perturbation to the asymptotic state. However, both host and pathogen dynamics are driven by environmental factors as well as their mutual interactions, and both host and pathogen co-evolve, and evolve in response to external factors. We have virtually no information about the importance and natural role of higher trophic levels (hyperpathogens) and competitors, but they could also induce long-scale fluctuations in the abundance of pathogens on particular hosts. In wild pathosystems the host distribution cannot be modelled as either a uniform density or even a uniform distribution of fields (which could then be treated as individuals). Patterns of short-term density dependence and the detail of host distribution are therefore critical to long-term dynamics. Host density distributions are not usually scale-free, but are rarely uniform or clearly structured on a single scale.
In a (multiply structured) metapopulation with coevolution and external disturbances, it could well be the case that the time required to attain equilibrium (if it exists) based on conditions stable over a specified time-scale is longer than that time-scale. Alternatively, local equilibria may be reached fairly rapidly following perturbations, but the meta-population equilibrium may be attained only very slowly. In either case, meta-stability on various time-scales is more relevant than equilibrium concepts in explaining observed patterns.