88 results for Quality Model


Relevance:

30.00%

Publisher:

Abstract:

Soil organic matter (SOM) is one of the main global carbon pools. It is a measure of soil quality, as its presence increases carbon sequestration and improves physical and chemical soil properties. The determination and characterisation of humic substances give essential information about the maturity, stresses and health of soils. However, determining the exact nature and molecular structure of these substances has proven difficult. Several complex techniques exist to characterise SOM and its mineralisation and humification processes. One of the most widely accepted for its accuracy is nuclear magnetic resonance (NMR) spectroscopy. Despite its efficacy, NMR requires significant economic resources, equipment, material and time. Proxy measures such as the fluorescence index (FI), cold- and hot-water extractable carbon (CWC and HWC) and SUVA-254 have the potential to characterise SOM and, in combination, provide qualitative and quantitative data on SOM and its processes. Spanish and British agricultural cambisols were used to measure SOM quality and to determine whether similarities were found between optical techniques and 1H NMR results in these two regions with contrasting climatic conditions. High correlations (p < 0.001) were found between the specific aromatic fraction measured with 1H NMR and SUVA-254 (Rs = 0.95) and HWC (Rs = 0.90), which could be described using a linear model. A high correlation between FI and the aromatic fraction measured with 1H NMR (Rs = −0.976) was also observed. In view of our results, optical measures have the potential, in combination, to predict the aromatic fraction of SOM without the need for expensive and time-consuming techniques.
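A minimal sketch of the kind of analysis described above (Spearman rank correlation between an optical proxy and the 1H NMR aromatic fraction, plus a simple linear model): the arrays and units below are illustrative placeholders, not data from the study.

```python
# Sketch of the correlation and linear-model step described above.
# All values are illustrative placeholders, not data from the study.
import numpy as np
from scipy.stats import spearmanr

# Hypothetical paired measurements for a set of soil samples
suva254 = np.array([1.2, 1.8, 2.1, 2.9, 3.4, 4.0])            # optical proxy
aromatic_nmr = np.array([8.0, 11.5, 13.0, 17.2, 20.1, 23.5])  # % of 1H NMR signal

# Spearman rank correlation (the Rs values quoted in the abstract)
rs, p = spearmanr(suva254, aromatic_nmr)
print(f"Spearman Rs = {rs:.2f}, p = {p:.4f}")

# Simple linear model predicting the aromatic fraction from the optical proxy
slope, intercept = np.polyfit(suva254, aromatic_nmr, deg=1)
print(f"aromatic fraction ≈ {slope:.2f} × SUVA-254 + {intercept:.2f}")
```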

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was first to evaluate the benefits of including Jersey milk in Holstein-Friesian milk for the Cheddar cheese-making process and, secondly, using the data gathered, to identify the effects and relative importance of a wide range of milk components on milk coagulation properties and the cheese-making process. Blending Jersey and Holstein-Friesian milk led to quadratic trends in the size of casein micelles and fat globules and in coagulation properties. However, this was not found to affect the cheese-making process. Including Jersey milk was found, on a pilot scale, to increase cheese yield (up to +35%) but did not affect cheese quality, which was defined as compliance with the legal requirements of cheese composition, cheese texture, colour and grading scores. Profitability increased linearly with the inclusion of Jersey milk (up to 11.18 p L-1 of milk). The commercial trials supported the pilot plant findings, demonstrating that including Jersey milk increased cheese yield without having a negative impact on cheese quality, despite the inherent challenges of scaling up such a process commercially. The successful use of a large array of milk components to model the cheese-making process challenged the commonly accepted view that fat, protein and casein content and the protein-to-fat ratio are the main contributors to the process: other components, such as casein micelle size and fat globule size, were also found to play a key role, with small casein micelles and large fat globules reducing coagulation time, improving curd firmness and fat recovery, and influencing cheese moisture and fat content. The findings of this thesis indicated that milk suitability for Cheddar making could be improved by the inclusion of Jersey milk and that more compositional factors need to be taken into account when judging milk suitability.
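As a purely hypothetical illustration of the quadratic blending trends mentioned above, the sketch below fits a second-order polynomial of an invented coagulation property against the proportion of Jersey milk; it is not the study's dataset or model.

```python
# Hypothetical illustration of fitting a quadratic trend of a coagulation
# property against the proportion of Jersey milk in the blend. Data invented.
import numpy as np

jersey_fraction = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # proportion of Jersey milk
curd_firmness = np.array([30.0, 34.5, 37.0, 37.5, 36.0])  # arbitrary units, illustrative

# Second-order polynomial fit (quadratic trend)
coeffs = np.polyfit(jersey_fraction, curd_firmness, deg=2)
quadratic = np.poly1d(coeffs)
print("fitted model:", quadratic)
print("predicted firmness at 50 % Jersey:", quadratic(0.5))
```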

Relevance:

30.00%

Publisher:

Abstract:

This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of the transactions whose data were captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used in that processing. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the particular OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk gaps in semantics between the information captured by OLTP systems and the information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context, as part of the modelling of information systems, was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context; the encoding of facts with context into data with business rules; the storage and sourcing of data with business rules; the decoding of data with business rules back into facts with context; and the recall of facts with associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity to implement and use multi-purpose databases and business-rules stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as per the data captured by the respective OLTP system.
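The idea of storing facts together with their context and capture-time business rules can be illustrated with a small, hypothetical data structure; the field names and layout below are assumptions for illustration only and are not the UBIRQ model's actual schema.

```python
# Hypothetical sketch: a business fact stored together with its context and the
# business rule applied at capture time, so OLTP capture and OLAP recall can
# share the same semantics. Field names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class BusinessRule:
    rule_id: str
    description: str
    expression: str          # e.g. a validation or derivation expression

@dataclass
class TransactionFact:
    fact_id: str
    measure: float                      # the recorded fact (e.g. order value)
    context: dict                       # who/what/where/when of the transaction
    captured_by: BusinessRule           # rule applied at capture time (OLTP)
    captured_at: datetime = field(default_factory=datetime.utcnow)

rule = BusinessRule("R42", "Order value includes VAT at 20 %", "net * 1.20")
fact = TransactionFact("T1001", 120.0,
                       {"customer": "C77", "channel": "web"}, rule)

# An OLAP-style query over such records can recall the rule alongside the fact,
# preserving the context of the original transaction.
print(fact.measure, fact.captured_by.description)
```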

Relevance:

30.00%

Publisher:

Abstract:

Customers will not continue to pay for a service if it is perceived to be of poor quality and/or of no value. With a paradigm shift towards business dependence on service-orientated IS solutions [1], it is critical that alignment exists between service definition, delivery and customer expectation if businesses are to ensure customer satisfaction. Services, and micro-service development, offer businesses a flexible structure for solution innovation; however, constant changes in technology and in business and societal expectations mean that an iterative analysis solution is required to i) determine whether provider services adequately meet customer segment needs and expectations, and ii) help guide business service innovation and development. In this paper, by incorporating multiple models, we propose a series of steps to help identify and prioritise service gaps. Moreover, the authors propose the Dual Semiosis Analysis Model, a tool that highlights where within the symbiotic customer/provider semiosis process requirements misinterpretation and/or service provision deficiencies occur. This paper offers the reader a powerful customer-centric tool, designed to help business managers highlight both which services are critical to customer quality perception and where future innovation effort should be focused.
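A minimal, hypothetical sketch of the gap-identification step described above: each service attribute's gap is scored as expectation minus perceived delivery, weighted by importance, and ranked. The attributes, weights and scores are invented for illustration and are not part of the Dual Semiosis Analysis Model itself.

```python
# Hypothetical service-gap scoring: expectation minus perception, weighted by
# how important each attribute is to the customer segment. Illustrative only.
service_attributes = {
    # attribute:        (importance, expectation, perceived delivery) on 1-7 scales
    "responsiveness":   (0.30, 6.5, 4.8),
    "reliability":      (0.40, 6.8, 6.1),
    "personalisation":  (0.15, 5.0, 4.9),
    "transparency":     (0.15, 6.0, 3.9),
}

gaps = {
    name: importance * (expected - perceived)
    for name, (importance, expected, perceived) in service_attributes.items()
}

# The largest weighted gaps are the strongest candidates for service innovation
for name, gap in sorted(gaps.items(), key=lambda item: item[1], reverse=True):
    print(f"{name:15s} weighted gap = {gap:.2f}")
```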

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Using continuing professional development (CPD) as part of the revalidation of pharmacy professionals has been proposed in the UK but not implemented. We developed a CPD Outcomes Framework (‘the framework’) for scoring CPD records, where the score range was -100 to +150 based on the demonstrable relevance and impact of the CPD on practice. OBJECTIVE: This exploratory study aimed to test the outcome of training people to use the framework through distance-learning material (active intervention), by comparing CPD scores before and after training. SETTING: Pharmacy professionals were recruited in the UK in Reading, Banbury, Southampton, Kingston-upon-Thames and Guildford in 2009. METHOD: We conducted a randomised, double-blinded, parallel-group, before-and-after study. The control group simply received information on the new CPD requirements through the post; the active intervention group also received the framework and associated training. Altogether 48 participants (25 control, 23 active) completed the study. All participants submitted CPD records to the research team before and after receiving the posted resources. The records (n=226) were scored blindly by the researchers using the framework. A subgroup of CPD records (n=96) submitted first (before stage) and rewritten (after stage) was analysed separately. MAIN OUTCOME MEASURE: Scores for CPD records received before and after distributing the group-dependent material through the post. RESULTS: Using a linear-regression model, both analyses found an increase in CPD scores in favour of the active intervention group. For the complete set of records, the effect was a mean difference of 9.9 (95% CI = 0.4 to 19.3), p-value = 0.04. For the subgroup of rewritten records, the effect was a mean difference of 17.3 (95% CI = 5.6 to 28.9), p-value = 0.0048. CONCLUSION: The intervention improved participants’ CPD behaviour. Training pharmacy professionals to use the framework resulted in better CPD activities and CPD records, potentially helpful for the revalidation of pharmacy professionals. IMPACT:
• Using a bespoke Continuing Professional Development outcomes framework improves the value of pharmacy professionals’ CPD activities and CPD records, with the potential to improve patient care.
• The CPD outcomes framework could be helpful to pharmacy professionals internationally who want to improve the quality of their CPD activities and CPD records.
• Regulators and officials across Europe and beyond can assess the suitability of the CPD outcomes framework for use in pharmacy CPD and revalidation in their own setting.
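A hedged sketch of the kind of linear-regression comparison reported above, modelling the after-score on group while adjusting for the before-score; the data frame, column names and values are hypothetical and this is not the study's actual analysis code.

```python
# Hypothetical sketch of an ANCOVA-style linear regression comparing CPD scores
# between groups while adjusting for baseline scores. All data are invented.
import pandas as pd
import statsmodels.formula.api as smf

records = pd.DataFrame({
    "group":        ["control"] * 4 + ["active"] * 4,
    "score_before": [10, 25, 40, 5, 15, 30, 45, 10],
    "score_after":  [12, 27, 38, 8, 30, 45, 60, 22],
})

model = smf.ols("score_after ~ score_before + C(group, Treatment('control'))",
                data=records).fit()
# The coefficient on the group term estimates the intervention effect
print(model.summary())
```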

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a summary of the work done within the European Union's Seventh Framework Programme project ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants). ECLIPSE had a unique systematic concept for designing a realistic and effective mitigation scenario for short-lived climate pollutants (SLCPs; methane, aerosols and ozone, and their precursor species) and quantifying its climate and air quality impacts, and this paper presents the results in the context of this overarching strategy. The first step in ECLIPSE was to create a new emission inventory based on current legislation (CLE) for the recent past and until 2050. Substantial progress compared to previous work was made by including previously unaccounted-for source types, such as flaring of gas associated with oil production, and wick lamps. These emission data were used for present-day reference simulations with four advanced Earth system models (ESMs) and six chemistry transport models (CTMs). The model simulations were compared with a variety of ground-based and satellite observational data sets from Asia, Europe and the Arctic. It was found that the models still underestimate the measured seasonality of aerosols in the Arctic, but to a lesser extent than in previous studies. Problems likely related to the emissions were identified for northern Russia and India in particular. To estimate the climate impacts of SLCPs, ECLIPSE followed two paths of research: the first path calculated radiative forcing (RF) values for a large matrix of SLCP species emissions, for different seasons and regions independently. Based on these RF calculations, the Global Temperature change Potential metric for a time horizon of 20 years (GTP20) was calculated for each SLCP emission type. This climate metric was then used in an integrated assessment model to identify all emission mitigation measures with a beneficial air quality and short-term (20-year) climate impact. These measures together defined an SLCP mitigation (MIT) scenario. Compared to CLE, the MIT scenario would reduce global methane (CH4) and black carbon (BC) emissions by about 50% and 80%, respectively. For CH4, measures on shale gas production, waste management and coal mines were most important. For non-CH4 SLCPs, elimination of high-emitting vehicles and wick lamps, as well as reducing emissions from gas flaring, coal and biomass stoves, agricultural waste, solvents and diesel engines, were most important. These measures lead to large reductions in calculated surface concentrations of ozone and particulate matter. We estimate that in the EU the loss of statistical life expectancy due to air pollution was 7.5 months in 2010, which will be reduced to 5.2 months by 2030 in the CLE scenario. The MIT scenario would reduce this value by another 0.9 months, to 4.3 months. Substantially larger reductions due to the mitigation are found for China (1.8 months) and India (11–12 months). The climate metrics cannot fully quantify the climate response. Therefore, a second research path was taken. Transient climate ensemble simulations with the four ESMs were run for the CLE and MIT scenarios to determine the climate impacts of the mitigation. In these simulations, the CLE scenario resulted in a surface temperature increase of 0.70 ± 0.14 K between the years 2006 and 2050.
For the decade 2041–2050, the warming was reduced by 0.22 ± 0.07 K in the MIT scenario, and this result was in almost exact agreement with the response calculated based on the emission metrics (reduced warming of 0.22 ± 0.09 K). The metrics calculations suggest that non-CH4 SLCPs contribute ~22% to this response and CH4 78%. This could not be fully confirmed by the transient simulations, which attributed about 90% of the temperature response to CH4 reductions. Attribution of the observed temperature response to non-CH4 SLCP emission reductions, and to BC specifically, is hampered in the transient simulations by the small forcing and the co-emitted species of the emission basket chosen. Nevertheless, an important conclusion is that our mitigation basket as a whole would lead to clear benefits for both air quality and climate. The climate response from BC reductions in our study is smaller than reported previously, possibly because our study is one of the first to use fully coupled climate models, in which unforced variability and sea-ice responses cause relatively strong temperature fluctuations that may counteract (and thus mask) the impacts of small emission reductions. The temperature responses to the mitigation were generally stronger over the continents than over the oceans and, with a warming reduction of 0.44 (0.39–0.49) K, largest over the Arctic. Our calculations suggest particularly beneficial climate responses in southern Europe, where surface warming was reduced by about 0.3 K and precipitation rates were increased by about 15 (6–21) mm yr−1 (more than 4% of total precipitation) from spring to autumn. Thus, the mitigation could help to alleviate expected future drought and water shortages in the Mediterranean area. We also report other important results of the ECLIPSE project.
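A small, hypothetical illustration of how a metric-based estimate of avoided warming can be assembled, i.e. summing emission reductions weighted by absolute GTP-type coefficients for a 20-year horizon; every number below is a placeholder, not an ECLIPSE value.

```python
# Hypothetical metric-based estimate of the temperature response: emission
# reductions weighted by absolute GTP coefficients for a 20-year horizon.
# All coefficients and reductions are invented placeholders.
agtp20 = {               # K per Tg of emission, illustrative values only
    "CH4": 5e-5,
    "BC":  2e-4,
    "OC": -8e-5,         # scattering aerosol: negative forcing, hence negative AGTP
}
emission_reduction = {   # Tg of avoided emissions in a mitigation basket (invented)
    "CH4": 150.0,
    "BC":  4.0,
    "OC":  6.0,
}

delta_t = sum(agtp20[s] * emission_reduction[s] for s in agtp20)
for s in agtp20:
    share = agtp20[s] * emission_reduction[s] / delta_t
    print(f"{s}: {share:.0%} of the estimated response")
print(f"estimated avoided warming ≈ {delta_t:.4f} K")
```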

Relevance:

30.00%

Publisher:

Abstract:

In many business schools, the field of strategic management has been elevated to the same status as more traditional subject areas such as finance, marketing and organizational behaviour. However, the field is rather unclearly delineated at present, as a result of the heavy usage of borrowed theories, a phenomenon we discuss in this article. For strategic management to become a legitimate subject area, truly on a par with the more conventional fields taught in business schools, we recommend much stronger selectivity when borrowing theories from areas of scholarly inquiry other than management as the foundation of empirical work. We propose a new model consisting of seven quality tests to assess whether proper selectivity is being applied when ‘importing’ concepts from fields other than management. Our perspective has major implications both for future, evidence-based strategic management research and for the field's key stakeholders, such as strategy teachers, practitioners and policy makers, who rely on research outputs from strategy scholars.

Relevance:

30.00%

Publisher:

Abstract:

Current state-of-the-art global climate models produce different values for Earth’s mean temperature. When comparing simulations with each other and with observations, it is standard practice to compare temperature anomalies with respect to a reference period. It is not always appreciated that the choice of reference period can affect conclusions, both about the skill of simulations of past climate and about the magnitude of expected future changes in climate. For example, observed global temperatures over the past decade are towards the lower end of the range of CMIP5 simulations irrespective of which reference period is used, but exactly where they lie in the model distribution varies with the choice of reference period. Additionally, we demonstrate that projections of when particular temperature levels are reached, for example 2 K above ‘pre-industrial’, change by up to a decade depending on the choice of reference period. In this article we discuss some of the key issues that arise when using anomalies relative to a reference period to generate climate projections. We highlight that there is no perfect choice of reference period. When evaluating models against observations, a long reference period should generally be used, but how long depends on the quality of the observations available. The IPCC AR5 choice of a 1986–2005 reference period for future global temperature projections was reasonable, but a case-by-case approach is needed for different purposes and when assessing projections of different climate variables. Finally, we recommend that any studies that involve the use of a reference period should explicitly examine the robustness of their conclusions to alternative choices.
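A brief, hypothetical sketch of why the choice of reference period shifts where a series sits relative to others: anomalies from the same synthetic temperature record are computed against two different baselines, and the two versions differ by a constant offset. The data are invented.

```python
# Synthetic illustration: the same temperature series expressed as anomalies
# relative to two different reference periods. All data are invented.
import numpy as np

years = np.arange(1961, 2021)
rng = np.random.default_rng(0)
temperature = 14.0 + 0.015 * (years - 1961) + rng.normal(0, 0.1, years.size)

def anomalies(series, years, start, end):
    """Anomalies relative to the mean over the reference period [start, end]."""
    baseline = series[(years >= start) & (years <= end)].mean()
    return series - baseline

anom_6190 = anomalies(temperature, years, 1961, 1990)
anom_8605 = anomalies(temperature, years, 1986, 2005)

# The two anomaly series differ by a constant offset equal to the difference
# between the baseline means, which is what moves a record up or down relative
# to an ensemble spread.
print(f"offset between baselines: {anom_6190[-1] - anom_8605[-1]:+.3f} K")
```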

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to present a model of the translation of ideas that are particularly important for the organization and its context, called mythical ideas. Design/methodology/approach – The study is based on ethnographic research. Findings – It is found that change processes based on mythical ideas are especially dynamic but also very vulnerable. The consequences of failure can be vital for the organization and its environment. Originality/value – The paper explores the outcomes to which the translation of a mythical idea can lead. The findings are of value for people involved in organizational change processes.

Relevance:

30.00%

Publisher:

Abstract:

This paper addresses one of the issues in contemporary globalisation theory: the extent to which there is ‘one best way’ in which business can be done and organisations managed. It uses Czarniawska’s ‘Travels of Ideas’ model as an organising framework to present and understand how the concept of ‘Quality’, so important in contemporary approaches to manufacturing and services and their management, travelled to, and impinged on, a newly opened vehicle assembly plant in Poland. The extent to which new meanings were mutually created in the process of translation is discussed, using ethnographic reporting and analysis techniques commonly used in diffusion research. Parallels between the process of translation, as an idea becomes embedded in a new cultural location, and the processes that contemporary research has identified as important to organisational learning are briefly discussed in conclusion.

Relevance:

30.00%

Publisher:

Abstract:

Previous versions of the Consortium for Small-scale Modelling (COSMO) numerical weather prediction model have used a constant sea-ice surface temperature, but observations show a high degree of variability on sub-daily timescales. To account for this, we have implemented a thermodynamic sea-ice module in COSMO and performed simulations at resolutions of 15 km and 5 km for the Laptev Sea area in April 2008. The temporal and spatial variability of surface and 2-m air temperature is verified against four automatic weather stations deployed along the edge of the western New Siberian polynya during the Transdrift XIII-2 expedition and against surface temperature charts derived from Moderate Resolution Imaging Spectroradiometer (MODIS) satellite data. A remarkable agreement between the new model results and these observations demonstrates that the implemented sea-ice module can be applied for short-range simulations. Prescribing the polynya areas daily, our COSMO simulations provide a high-resolution and high-quality atmospheric data set for the Laptev Sea for the period 14-30 April 2008. Based on this data set, we derive a mean total sea-ice production rate of 0.53 km3/day for all Laptev Sea polynyas under the assumption that the polynyas are ice-free, and a rate of 0.30 km3/day if a 10-cm thin-ice layer is assumed. Our results indicate that ice production in Laptev Sea polynyas has been overestimated in previous studies.
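For orientation, a standard back-of-the-envelope relation for polynya ice production divides the net surface heat loss by the volumetric latent heat of fusion. The sketch below uses this textbook relation with invented flux and area values; it is not the COSMO-based calculation used in the study.

```python
# Hypothetical bulk estimate of polynya ice production:
#   volume production rate = net heat loss / (ice density * latent heat of fusion)
# Flux and area values below are invented for illustration.
RHO_ICE = 910.0        # kg m-3, density of sea ice
L_FUSION = 3.34e5      # J kg-1, latent heat of fusion

def ice_production_km3_per_day(net_heat_loss_w_m2, polynya_area_km2):
    """Volume of new ice (km^3/day) formed when the given heat flux is lost
    from an open-water (or thin-ice) polynya of the given area."""
    area_m2 = polynya_area_km2 * 1e6
    volume_m3_per_s = net_heat_loss_w_m2 * area_m2 / (RHO_ICE * L_FUSION)
    return volume_m3_per_s * 86400 / 1e9

# e.g. 400 W m-2 heat loss over 4000 km2 of open water (numbers invented)
print(f"{ice_production_km3_per_day(400.0, 4000.0):.3f} km^3/day")
```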

Relevance:

30.00%

Publisher:

Abstract:

A data insertion method, in which a dispersion model is initialized from ash properties derived from a series of satellite observations, is used to model the 8 May 2010 Eyjafjallajökull volcanic ash cloud, which extended from Iceland to northern Spain. We also briefly discuss the application of this method to the April 2010 phase of the Eyjafjallajökull eruption and to the May 2011 Grímsvötn eruption. An advantage of this method is that very little knowledge about the eruption itself is required, because some of the usual eruption source parameters are not used. The method may therefore be useful for remote volcanoes where good satellite observations of the erupted material are available but little is known about the properties of the actual eruption. It does, however, have a number of limitations related to the quality and availability of the observations. We demonstrate that, using certain configurations, the data insertion method is able to capture the structure of a thin filament of ash extending over northern Spain that is not fully captured by other modeling methods. It also verifies well against the satellite observations according to the quantitative object-based quality metric SAL (structure, amplitude, location) and the spatial coverage metric, the Figure of Merit in Space.
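The Figure of Merit in Space mentioned above is commonly defined as the ratio of the intersection to the union of the observed and modelled cloud areas; a minimal sketch on boolean grids is given below, with the grids themselves invented for illustration.

```python
# Figure of Merit in Space (FMS): spatial overlap between modelled and observed
# cloud areas, expressed as intersection over union. Grids below are invented.
import numpy as np

def figure_of_merit_in_space(model_mask, obs_mask):
    """FMS = area(model AND obs) / area(model OR obs), in percent."""
    intersection = np.logical_and(model_mask, obs_mask).sum()
    union = np.logical_or(model_mask, obs_mask).sum()
    return 100.0 * intersection / union if union else 0.0

model_mask = np.zeros((5, 5), dtype=bool)
obs_mask = np.zeros((5, 5), dtype=bool)
model_mask[1:4, 1:4] = True   # grid cells where the model places ash
obs_mask[2:5, 1:4] = True     # grid cells where ash is observed

print(f"FMS = {figure_of_merit_in_space(model_mask, obs_mask):.1f} %")
```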

Relevance:

30.00%

Publisher:

Abstract:

European air quality legislation has reduced emissions of air pollutants across Europe since the 1970s, affecting air quality, human health and regional climate. We used a coupled composition-climate model to simulate the impacts of European air quality legislation and technology measures implemented between 1970 and 2010. We contrast simulations using two emission scenarios: one with actual emissions in 2010 and the other with the emissions that would have occurred in 2010 in the absence of technological improvements and end-of-pipe treatment measures in the energy, industrial and road transport sectors. European emissions of sulphur dioxide, black carbon (BC) and organic carbon in 2010 are 53%, 59% and 32% lower, respectively, than the emissions that would have occurred in 2010 in the absence of legislative and technology measures. These emission reductions decreased simulated European annual mean concentrations of fine particulate matter (PM2.5) by 35%, sulphate by 44%, BC by 56% and particulate organic matter by 23%. The reduction in PM2.5 concentrations is calculated to have prevented 80 000 (37 000–116 000, at 95% confidence intervals) premature deaths annually across the European Union, resulting in a perceived financial benefit to society of US$232 billion annually (1.4% of 2010 EU GDP). The reduction in aerosol concentrations due to legislative and technology measures caused a positive change in the aerosol radiative effect at the top of the atmosphere, reduced atmospheric absorption and increased the amount of solar radiation incident at the surface over Europe. We used an energy budget approximation to estimate that these changes in the radiative balance have increased European annual mean surface temperatures and precipitation by 0.45 ± 0.11 °C and 13 ± 0.8 mm yr−1, respectively. Our results show that the implementation of European legislation and technological improvements to reduce the emission of air pollutants has improved air quality and human health over Europe, as well as having an unintended impact on the regional radiative balance and climate.
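A sketch of a generic energy-budget approximation of the kind referred to above, scaling a change in radiative forcing by an assumed sensitivity parameter; the sensitivity value and forcing change are placeholders and are not taken from the study.

```python
# Generic energy-budget approximation: temperature response equals the change
# in radiative forcing scaled by a sensitivity parameter. Values are invented.
CLIMATE_SENSITIVITY = 0.8   # K per (W m-2), an assumed illustrative value

def temperature_response(delta_forcing_w_m2, sensitivity=CLIMATE_SENSITIVITY):
    """Approximate temperature change for a given change in radiative forcing."""
    return sensitivity * delta_forcing_w_m2

# e.g. a +0.5 W m-2 change in the regional radiative balance (invented)
print(f"ΔT ≈ {temperature_response(0.5):.2f} K")
```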