88 results for Quality Model
Abstract:
This study proposes a conceptual model for customer experience quality and its impact on customer relationship outcomes. Customer experience is conceptualized as the customer’s subjective response to the holistic direct and indirect encounter with the firm, and customer experience quality as its perceived excellence or superiority. Using the repertory grid technique in 40 interviews in B2B and B2C contexts, the authors find that customer experience quality is judged with respect to its contribution to value-in-use, and hence propose that value-in-use mediates between experience quality and relationship outcomes. Experience quality includes evaluations not just of the firm’s products and services but also of peer-to-peer and complementary supplier encounters. In assessing experience quality in B2B contexts, customers place a greater emphasis on firm practices that focus on understanding and delivering value-in-use than is generally the case in B2C contexts. Implications for practitioners’ customer insight processes and future research directions are suggested.
Abstract:
Foundation construction is a key determinant of success in construction engineering. Among the many deep-excavation methods, the diaphragm wall method is used more frequently in Taiwan than anywhere else in the world. The traditional approach to sequencing diaphragm wall unit construction activities is to establish each phase of the sequence by heuristics. However, this creates conflicts between the final engineering phase and unit construction, and adversely affects the planned construction time. To avoid this situation, we apply management science to diaphragm wall unit construction and formulate the sequencing task as a multi-objective combinatorial optimization problem. Because the mathematical model is multi-objective and combinatorially explosive (an NP-complete problem), we propose a 2-type Self-Learning Neural Network (SLNN) to solve the sequencing problem for N = 12, 24 and 36 diaphragm wall units. To assess the reliability of the results, this study compares the SLNN with a random search method. Testing shows that the SLNN is superior to random search in both solution quality and solving efficiency.
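The random-search baseline mentioned in the abstract can be sketched as follows. This is an illustrative sketch only: the cost function below is a hypothetical stand-in (the paper's actual multi-objective formulation is not given in the abstract), and it is not the SLNN itself.

```python
import random

def sequencing_cost(order):
    """Hypothetical stand-in for the multi-objective cost of a unit
    construction sequence: penalises building adjacent units
    consecutively, plus total positional drift from a reference order."""
    adjacency_penalty = sum(1 for a, b in zip(order, order[1:])
                            if abs(a - b) == 1)
    drift = sum(abs(pos - unit) for pos, unit in enumerate(order))
    return adjacency_penalty + drift

def random_search(n_units, n_trials, seed=0):
    """Random-search baseline: sample random permutations of the
    units and keep the lowest-cost sequence seen."""
    rng = random.Random(seed)
    best_order, best_cost = None, float("inf")
    for _ in range(n_trials):
        order = list(range(n_units))
        rng.shuffle(order)
        cost = sequencing_cost(order)
        if cost < best_cost:
            best_order, best_cost = order, cost
    return best_order, best_cost

order, cost = random_search(n_units=12, n_trials=2000)
```

A learning-based solver such as the SLNN would be compared against exactly this kind of baseline on the same cost function for N = 12, 24 and 36.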
Abstract:
As part of a large European coastal operational oceanography project (ECOOP), we have developed a web portal for the display and comparison of model and in situ marine data. The distributed model and in situ datasets are accessed via an Open Geospatial Consortium Web Map Service (WMS) and Web Feature Service (WFS) respectively. These services were developed independently and readily integrated for the purposes of the ECOOP project, illustrating the ease of interoperability resulting from adherence to international standards. The key feature of the portal is the ability to display co-plotted timeseries of the in situ and model data and the quantification of misfits between the two. By using standards-based web technology we allow the user to quickly and easily explore over twenty model data feeds and compare these with dozens of in situ data feeds without being concerned with the low level details of differing file formats or the physical location of the data. Scientific and operational benefits to this work include model validation, quality control of observations, data assimilation and decision support in near real time. In these areas it is essential to be able to bring different data streams together from often disparate locations.
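The portal's "quantification of misfits" between co-plotted model and in situ timeseries can be illustrated with simple bias and RMSE statistics. This is a sketch under assumed conventions (the abstract does not specify which misfit metrics the portal computes); `None` entries stand for gaps in either feed.

```python
import math

def misfit_stats(model_series, insitu_series):
    """Quantify model/in-situ misfit for co-located timeseries:
    mean bias (model minus observation) and root-mean-square error,
    skipping timesteps where either series has a gap."""
    pairs = [(m, o) for m, o in zip(model_series, insitu_series)
             if m is not None and o is not None]
    diffs = [m - o for m, o in pairs]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical sea-surface temperature feeds (degrees C)
bias, rmse = misfit_stats([10.2, 10.5, None, 11.0],
                          [10.0, 10.4, 10.8, 11.4])
```

In the portal setting, the two series would come from a WMS model feed and a WFS in situ feed, interpolated to common timestamps before comparison.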
Abstract:
During April-May 2010 volcanic ash clouds from the Icelandic Eyjafjallajökull volcano reached Europe causing an unprecedented disruption of the EUR/NAT region airspace. Civil aviation authorities banned all flight operations because of the threat posed by volcanic ash to modern turbine aircraft. New quantitative airborne ash mass concentration thresholds, still under discussion, were adopted for discerning regions contaminated by ash. This has implications for ash dispersal models routinely used to forecast the evolution of ash clouds. In this new context, quantitative model validation and assessment of the accuracies of current state-of-the-art models is of paramount importance. The passage of volcanic ash clouds over central Europe, a territory hosting a dense network of meteorological and air quality observatories, generated a quantity of observations unusual for volcanic clouds. From the ground, the cloud was observed by aerosol lidars, lidar ceilometers, sun photometers, other remote-sensing instruments and in-situ collectors. From the air, sondes and multiple aircraft missions also took extremely valuable in-situ and remote-sensing measurements. These measurements constitute an excellent database for model validation. Here we validate the FALL3D ash dispersal model by comparing model results with ground and airplane-based measurements obtained during the initial 14–23 April 2010 Eyjafjallajökull explosive phase. We run the model at high spatial resolution using as input hourly-averaged observed heights of the eruption column and the total grain size distribution reconstructed from field observations. Model results are then compared against remote ground-based and in-situ aircraft-based measurements, including lidar ceilometers from the German Meteorological Service, aerosol lidars and sun photometers from the EARLINET and AERONET networks, and flight missions of the German DLR Falcon aircraft.
We find good quantitative agreement, with an error similar to the spread in the observations (though depending on the method used to estimate mass eruption rate) for both airborne and ground mass concentration. Such verification results help us understand and constrain the accuracy and reliability of ash transport models and are of enormous relevance for designing future operational mitigation strategies at Volcanic Ash Advisory Centers.
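The validation criterion above (model error comparable to the observational spread) can be sketched numerically. The concentration values below are hypothetical, and the specific comparison (mean absolute error vs. observation standard deviation) is one reasonable reading of the criterion, not the paper's exact metric.

```python
import statistics

def error_within_spread(model_vals, obs_vals):
    """Compare the mean absolute model-observation error with the
    spread (standard deviation) of the observations, as a qualitative
    check that model error is similar to observational scatter."""
    errors = [abs(m - o) for m, o in zip(model_vals, obs_vals)]
    mean_error = statistics.mean(errors)
    obs_spread = statistics.stdev(obs_vals)
    return mean_error, obs_spread, mean_error <= obs_spread

# Hypothetical ash mass concentrations (micrograms per cubic metre)
mean_err, spread, comparable = error_within_spread(
    [105.0, 115.0, 85.0, 108.0, 95.0],   # modelled values
    [100.0, 120.0, 80.0, 110.0, 90.0])   # observed values
```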
Abstract:
This article presents the results of a study that explored the human side of the multimedia experience. We propose a model that assesses quality variation from three distinct levels: the network, the media and the content levels; and from two views: the technical and the user perspective. By facilitating parameter variation at each of the quality levels and from each of the perspectives, we were able to examine their impact on user quality perception. Results show that a significant reduction in frame rate does not proportionally reduce the user's understanding of the presentation independent of technical parameters, that multimedia content type significantly impacts user information assimilation, user level of enjoyment, and user perception of quality, and that the device display type impacts user information assimilation and user perception of quality. Finally, to ensure the transfer of information, low-level abstraction (network-level) parameters, such as delay and jitter, should be adapted; to maintain the user's level of enjoyment, high-level abstraction quality parameters (content-level), such as the appropriate use of display screens, should be adapted.
Abstract:
Distributed multimedia supports a symbiotic infotainment duality, i.e. the ability to transfer information to the user while also providing the user with a level of satisfaction. As multimedia is ultimately produced for the education and/or enjoyment of viewers, the user's perspective on presentation quality is surely as important as objective Quality of Service (QoS) technical parameters in defining distributed multimedia quality. To measure the user perspective of multimedia video quality extensively, we introduce an extended model of distributed multimedia quality that segregates quality into three discrete levels: the network level, the media level and the content level, using two distinct quality perspectives: the user perspective and the technical perspective. Since experimental questionnaires do not provide continuous monitoring of user attention, eye tracking was used in our study to provide a better understanding of the role the human element plays in the reception, analysis and synthesis of multimedia data. Results showed that video content adaptation results in disparity in user video eye-paths when: i) no single/obvious point of focus exists; or ii) the point of attention changes dramatically. Accordingly, appropriate technical- and user-perspective parameter adaptation is implemented for all quality abstractions of our model, i.e. the network level (via simulated delay and jitter), the media level (via a technical- and user-perspective manipulated region-of-interest attentive display) and the content level (via display type and video clip type). Our work has shown that user perception of distributed multimedia quality cannot be achieved by means of purely technical-perspective QoS parameter adaptation.
Abstract:
The assessment of the potential landscape impacts of the latest Common Agricultural Policy reforms constitutes a challenge for policy makers and requires the development of models that can reliably project the likely spatial distribution of land uses. The aim of this study is to investigate the impact of the 2003 CAP reforms on land uses and rural landscapes across England. For this purpose we modified an existing economic model of agriculture, the Land-Use Allocation Model (LUAM), to provide outputs at a scale appropriate for informing a semi-quantitative landscape assessment at the level of 'Joint Character Areas' (JCAs). Overall a decline in the cereal and oilseed production area is projected, but intensive arable production will persist in specific locations (East of England, East Midlands and South East), having ongoing negative effects on the character of many JCAs. The impacts of de-coupling will be far more profound on the livestock sector: extensification of production will occur in traditional mixed farming regions (e.g. the South West), with a partial displacement of cattle by sheep in the upland regions, and an increase in sheep numbers is expected in the lowlands (South East, Eastern and East Midlands). This extensification process will positively affect those JCAs with mixed farming conditions, but it will have negative impacts on the JCAs of historically low-intensity farming (e.g. the uplands of the north-west) because they will suffer from under-management and land idling. Our analysis shows that the territorialisation between intensively and extensively farmed agricultural landscapes will continue.
Abstract:
Export coefficient modelling was used to model the impact of agriculture on nitrogen and phosphorus loading on the surface waters of two contrasting agricultural catchments. The model was originally developed for the Windrush catchment where the highly reactive Jurassic limestone aquifer underlying the catchment is well connected to the surface drainage network, allowing the system to be modelled using uniform export coefficients for each nutrient source in the catchment, regardless of proximity to the surface drainage network. In the Slapton catchment, the hydrological path-ways are dominated by surface and lateral shallow subsurface flow, requiring modification of the export coefficient model to incorporate a distance-decay component in the export coefficients. The modified model was calibrated against observed total nitrogen and total phosphorus loads delivered to Slapton Ley from inflowing streams in its catchment. Sensitivity analysis was conducted to isolate the key controls on nutrient export in the modified model. The model was validated against long-term records of water quality, and was found to be accurate in its predictions and sensitive to both temporal and spatial changes in agricultural practice in the catchment. The model was then used to forecast the potential reduction in nutrient loading on Slapton Ley associated with a range of catchment management strategies. The best practicable environmental option (BPEO) was found to be spatial redistribution of high nutrient export risk sources to areas of the catchment with the greatest intrinsic nutrient retention capacity.
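The export coefficient model, and the distance-decay modification described above, can be sketched as a simple sum over nutrient sources. All coefficient, area and distance values below are illustrative placeholders, not the calibrated values from the study.

```python
import math

def nutrient_load(sources, decay_rate=0.0):
    """Export coefficient model sketch: total nutrient load delivered
    to surface waters is the sum over sources of
    export_coefficient * source_magnitude, optionally attenuated by
    exp(-decay_rate * distance) to the drainage network (the
    distance-decay modification used for the Slapton catchment)."""
    return sum(coeff * magnitude * math.exp(-decay_rate * distance)
               for coeff, magnitude, distance in sources)

# (export coefficient kg/ha/yr, area ha, distance to stream km) - hypothetical
sources = [(14.0, 120.0, 0.5), (2.1, 300.0, 1.2), (0.4, 80.0, 2.0)]

uniform = nutrient_load(sources)              # Windrush-style: uniform coefficients
distance_decay = nutrient_load(sources, 0.8)  # Slapton-style: distance-decay applied
```

With the decay applied, sources far from the drainage network contribute less, which is why the modified model becomes sensitive to the spatial arrangement of high-risk sources in the catchment.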
Abstract:
Until recently, pollution control in rural drainage basins of the UK consisted solely of water treatment at the point of abstraction. However, prevention of agricultural pollution at source is now a realistic option given the possibility of financing the necessary changes in land use through modification of the Common Agricultural Policy. This paper uses a nutrient export coefficient model to examine the cost of land-use change in relation to improvement of water quality. Catchment-wide schemes and local protection measures are considered. Modelling results underline the need for integrated management of entire drainage basins. A wide range of benefits may accrue from land-use change, including enhanced habitats for wildlife as well as better drinking water.
Abstract:
Let θ denote the level of quality inherent in a food product that is delivered to some terminal market. In this paper, I characterize allocations over θ and provide an economic rationale for regulating safety and quality standards in the food system. Zusman and Bockstael investigate the theoretical foundations for imposing standards and stress the importance of providing a tractable conceptual foundation. Despite a wealth of contributions that are mainly empirical (for reviews of these works see, respectively, Caswell and Antle), there have been relatively few attempts to model formally the linkages between farm and food markets when food quality and consumer safety are at issue. Here, I attempt to provide such a framework, building on key contributions in the theoretical literature and linking them in a simple model of quality determination in a vertically related marketing channel. The food-marketing model is due to Gardner. Spence provides a foundation for Pareto-improving intervention in a deterministic model of quality provision, and Leland, building on the classic paper by Akerlof, investigates licensing and minimum standards when the information structure is incomplete. Linking these ideas in a satisfactory model of the food markets is the main objective of the paper.
Abstract:
Data quality is a difficult notion to define precisely, and different communities have different views and understandings of the subject. This causes confusion, a lack of harmonization of data across communities and omission of vital quality information. For some existing data infrastructures, data quality standards cannot address the problem adequately and cannot fulfil all user needs or cover all concepts of data quality. In this study, we discuss some philosophical issues on data quality. We identify actual user needs on data quality, review existing standards and specifications on data quality, and propose an integrated model for data quality in the field of Earth observation (EO). We also propose a practical mechanism for applying the integrated quality information model to a large number of datasets through metadata inheritance. While our data quality management approach is in the domain of EO, we believe that the ideas and methodologies for data quality management can be applied to wider domains and disciplines to facilitate quality-enabled scientific research.
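The proposed "metadata inheritance" mechanism, by which a large number of datasets acquire quality information from a parent record, can be sketched as a precedence merge. The field names below are illustrative, not drawn from any specific EO metadata standard.

```python
def inherit_quality(collection_quality, dataset_quality):
    """Metadata-inheritance sketch: a dataset inherits the quality
    record of its parent collection, with the dataset's own entries
    taking precedence over inherited ones."""
    merged = dict(collection_quality)   # start from the parent's record
    merged.update(dataset_quality)      # dataset-level entries override
    return merged

# Hypothetical quality records for an EO collection and one dataset in it
collection = {"lineage": "Level-2 processor v4", "completeness": "98%"}
dataset = {"completeness": "91%", "cloud_cover": "12%"}

merged = inherit_quality(collection, dataset)
```

The practical benefit is that common quality statements (e.g. lineage) are recorded once at collection level, while per-dataset deviations remain expressible.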
Abstract:
The evaluation of the quality and usefulness of climate modeling systems is dependent upon an assessment of both the limited predictability of the climate system and the uncertainties stemming from model formulation. In this study a methodology is presented that is suited to assess the performance of a regional climate model (RCM), based on its ability to represent the natural interannual variability on monthly and seasonal timescales. The methodology involves carrying out multiyear ensemble simulations (to assess the predictability bounds within which the model can be evaluated against observations) and multiyear sensitivity experiments using different model formulations (to assess the model uncertainty). As an example application, experiments driven by assimilated lateral boundary conditions and sea surface temperatures from the ECMWF Reanalysis Project (ERA-15, 1979–1993) were conducted. While the ensemble experiment demonstrates that the predictability of the regional climate varies strongly between different seasons and regions, being weakest during the summer and over continental regions, important sensitivities of the modeling system to parameterization choices are uncovered. In particular, compensating mechanisms related to the long-term representation of the water cycle are revealed, in which summer dry and hot conditions at the surface, resulting from insufficient evaporation, can persist despite insufficient net solar radiation (a result of unrealistic cloud-radiative feedbacks).
Abstract:
The formulation and performance of the Met Office visibility analysis and prediction system are described. The visibility diagnostic within the limited-area Unified Model is a function of humidity and a prognostic aerosol content. The aerosol model includes advection, industrial and general urban sources, plus boundary-layer mixing and removal by rain. The assimilation is a 3-dimensional variational scheme in which the visibility observation operator is a very nonlinear function of humidity, aerosol and temperature. A quality control scheme for visibility data is included. Visibility observations can give rise to humidity increments of significant magnitude compared with the direct impact of humidity observations. We present the results of sensitivity studies which show the contribution of different components of the system to improved skill in visibility forecasts. Visibility assimilation is most important within the first 6-12 hours of the forecast and for visibilities below 1 km, while modelling of aerosol sources and advection is important for slightly higher visibilities (1-5 km) and is still significant at longer forecast times.
Abstract:
Iatrogenic errors and patient safety in clinical processes are an increasing concern. The quality of process information in hardcopy or electronic form can heavily influence clinical behaviour and decision making errors. Little work has been undertaken to assess the safety impact of clinical process planning documents guiding the clinical actions and decisions. This paper investigates the clinical process documents used in elective surgery and their impact on latent and active clinical errors. Eight clinicians from a large health trust underwent extensive semi-structured interviews to understand their use of clinical documents, and their perceived impact on errors and patient safety. Samples of the key types of document used were analysed. Theories of latent organisational and active errors from the literature were combined with the EDA semiotics model of behaviour and decision making to propose the EDA Error Model. This model enabled us to identify perceptual, evaluation, knowledge and action error types and approaches to reducing their causes. The EDA error model was then used to analyse sample documents and identify error sources and controls. Types of knowledge artefact structures used in the documents were identified and assessed in terms of safety impact. This approach was combined with analysis of the questionnaire findings using existing error knowledge from the literature. The results identified a number of document and knowledge artefact issues that give rise to latent and active errors and also issues concerning medical culture and teamwork together with recommendations for further work.