942 results for Quality analysis
Abstract:
Objectives: To describe what is known of quality of life for colorectal cancer patients, to review what has been done in the Australian setting and to identify emerging directions for future research to address current gaps in knowledge. Method: A literature search (using Medline, PsycINFO, CINAHL and Sociological Abstracts) was conducted and 41 articles were identified for review. Results: Three key areas relating to quality of life in colorectal cancer patients emerged from the literature review: the definition and measurement of quality of life; predictors of quality of life; and the relationship of quality of life to survival. Results of existing studies are inconsistent in relation to quality of life over time and its relationship to survival. Small sample sizes and methodological limitations make interpretation difficult. Conclusions: There is a need for large-scale, longitudinal, population-based studies describing the quality of life experienced by colorectal cancer patients and its determinants. Measurement of and simultaneous adjustment for potential confounding factors would productively advance knowledge in this area, as would an analysis of the economic cost of morbidity to the community and an assessment of the cost-effectiveness of proposed interventions. Implications: As the Australian population ages, the prevalence of colorectal cancer within the community will increase. This burden of disease is a priority area for public health research. An improved understanding of quality of life and its predictors will inform the development and design of supportive interventions for those affected by the disease.
Abstract:
In Part 1 of this paper a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors that are deliberately introduced into the simulation code are correctly detected and isolated. Copyright (C) 2003 John Wiley & Sons, Ltd.
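A minimal sketch of how such a subspace comparison might look in practice, assuming a rank-based reading of the classification (the function names and the specific rank criteria are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def rank(M, tol=1e-8):
    """Numerical rank via singular values."""
    s = np.linalg.svd(M, compute_uv=False)
    return int((s > tol * s.max()).sum()) if s.size else 0

def classify_errors(R, features):
    """Classify each candidate error's feature matrix against residuals R.

    R        : (n, m) matrix whose columns span the observed residual subspace.
    features : dict mapping error name -> (n, k) feature matrix.

    Assumed criteria: an error is 'impossible' if its feature matrix adds
    directions the residuals do not contain; 'definite' if the remaining
    admissible errors cannot span the residual subspace without it;
    otherwise 'possible'.
    """
    r_R = rank(R)
    status = {}
    for name, F in features.items():
        if rank(np.hstack([R, F])) > r_R:
            status[name] = "impossible"
    admissible = {n: F for n, F in features.items()
                  if status.get(n) != "impossible"}
    for name in admissible:
        others = [F for n, F in admissible.items() if n != name]
        combined = np.hstack(others) if others else np.zeros((R.shape[0], 0))
        # The other errors cover the residuals iff adding R's columns
        # does not increase the rank of their combined feature matrix.
        covers = rank(np.hstack([combined, R])) == rank(combined)
        status[name] = "possible" if covers else "definite"
    return status
```

Resolving the remaining 'possible' entries would then fall to something like the paper's dynamic subset testing, which this sketch does not attempt.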
Abstract:
Measurement of Health-Related Quality of Life (HRQoL) of the elderly requires instruments with demonstrated sensitivity, reliability, and validity, particularly with the increasing proportion of older people entering the health care system. This article reports the psychometric properties of the 12-item Assessment of Quality of Life (AQoL) instrument in chronically ill community-dwelling elderly people with an 18-month follow-up. Comparator instruments included the SF-36 and the OARS. Construct validity of the AQoL was strong when examined via factor analysis and convergent and divergent validity against other scales. Receiver Operating Characteristic (ROC) curve analyses and relative efficiency estimates indicated that the AQoL is sensitive and responsive and has the strongest predictive validity for nursing home entry. It was also sensitive to economic prediction over the follow-up. Given these robust psychometric properties and the brevity of the scale, AQoL appears to be a suitable instrument for epidemiologic studies where HRQoL and utility data are required from elderly populations. (C) 2003 Elsevier Science Inc. All rights reserved.
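An ROC analysis of this kind reduces to a few lines; the data below are invented for illustration and do not reproduce the study's variables:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical data: lower AQoL utility scores flag poorer health-related QoL.
aqol_score = np.array([0.81, 0.42, 0.65, 0.30, 0.35, 0.55, 0.24, 0.90])
entered_nursing_home = np.array([0, 1, 0, 1, 0, 1, 1, 0])

# Negate scores so that higher values indicate higher predicted risk.
auc = roc_auc_score(entered_nursing_home, -aqol_score)
fpr, tpr, thresholds = roc_curve(entered_nursing_home, -aqol_score)
print(f"AUC for predicting nursing home entry: {auc:.2f}")
```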
Abstract:
Regional commodity forecasts are being used increasingly in agricultural industries to enhance their risk management and decision-making processes. These commodity forecasts are probabilistic in nature and are often integrated with a seasonal climate forecast system. The climate forecast system is based on a subset of analogue years drawn from the full climatological distribution. In this study we sought to measure forecast quality for such an integrated system. We investigated the quality of a commodity (i.e. wheat and sugar) forecast based on a subset of analogue years in relation to a standard reference forecast based on the full climatological set. We derived three key dimensions of forecast quality for such probabilistic forecasts: reliability, distribution shift, and change in dispersion. A measure of reliability was required to ensure no bias in the forecast distribution. This was assessed via the slope of the reliability plot, which was derived from examination of probability levels of forecasts and associated frequencies of realizations. The other two dimensions related to changes in features of the forecast distribution relative to the reference distribution. The relationship of 13 published accuracy/skill measures to these dimensions of forecast quality was assessed using principal component analysis in case studies of commodity forecasting based on seasonal climate forecasts for the wheat and sugar industries in Australia. There were two orthogonal dimensions of forecast quality: one associated with distribution shift relative to the reference distribution and the other associated with relative distribution dispersion. Although the conventional quality measures aligned with these dimensions, none measured both adequately. We conclude that a multi-dimensional approach to assessment of forecast quality is required and that simple measures of reliability, distribution shift, and change in dispersion provide a means for such assessment. The analysis presented was also relevant to measuring quality of probabilistic seasonal climate forecasting systems. The importance of retaining a focus on the probabilistic nature of the forecast and avoiding simplifying, but erroneous, distortions was discussed in relation to applying this new forecast quality assessment paradigm to seasonal climate forecasts. Copyright (C) 2003 Royal Meteorological Society.
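One plausible construction of the reliability measure described above, sketched in Python (function and variable names are assumptions; the paper's exact procedure may differ):

```python
import numpy as np

def reliability_slope(forecast_ensembles, outcomes,
                      levels=np.arange(0.1, 1.0, 0.1)):
    """Slope of the reliability plot for probabilistic forecasts.

    forecast_ensembles : list of 1-D arrays, each the analogue-year ensemble
                         issued on one forecast occasion.
    outcomes           : realized value for each occasion.

    For each probability level p we count how often the realization fell below
    the forecast's p-quantile; a perfectly reliable system gives a 1:1 line,
    so a fitted slope near 1 indicates an unbiased forecast distribution.
    """
    freqs = []
    for p in levels:
        hits = [y <= np.quantile(ens, p)
                for ens, y in zip(forecast_ensembles, outcomes)]
        freqs.append(np.mean(hits))
    slope = np.polyfit(levels, freqs, 1)[0]  # least-squares slope
    return slope, np.asarray(freqs)
```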
Abstract:
In this paper, we revisit the surface mass excess in adsorption studies and investigate the role of the volume of the adsorbed phase and its density in the analysis of supercritical gas adsorption in non-porous as well as microporous solids. For many supercritical fluids tested (krypton, argon, nitrogen, methane) on many different carbonaceous solids, it is found that the volume of the adsorbed phase is confined mostly to a geometrical volume having a thickness of up to a few molecular diameters. At high pressure the adsorbed phase density is also found to be very close to, but never equal to or greater than, the liquid phase density. (C) 2003 Elsevier Science Ltd. All rights reserved.
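For reference, the surface mass excess discussed here is related to the absolute adsorbed amount by the standard Gibbs identity (with V_a the adsorbed-phase volume, rho_a its mean density, and rho_g the bulk gas density):

```latex
n^{\mathrm{ex}} = n^{\mathrm{abs}} - \rho_g(P,T)\,V_a
               = V_a\bigl(\rho_a - \rho_g(P,T)\bigr)
```

Since rho_a stays below the liquid-phase density while rho_g keeps growing with pressure, the second form shows why the measured excess can level off and decline at high pressure.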
Abstract:
The use of a fitted parameter watershed model to address water quantity and quality management issues requires that it be calibrated under a wide range of hydrologic conditions. However, rarely does model calibration result in a unique parameter set. Parameter nonuniqueness can lead to predictive nonuniqueness. The extent of model predictive uncertainty should be investigated if management decisions are to be based on model projections. Using models built for four neighboring watersheds in the Neuse River Basin of North Carolina, the application of the automated parameter optimization software PEST in conjunction with the Hydrological Simulation Program-FORTRAN (HSPF) is demonstrated. Parameter nonuniqueness is illustrated, and a method is presented for calculating many different sets of parameters, all of which acceptably calibrate a watershed model. A regularization methodology is discussed in which models for similar watersheds can be calibrated simultaneously. Using this method, parameter differences between watershed models can be minimized while maintaining fit between model outputs and field observations. In recognition of the fact that parameter nonuniqueness and predictive uncertainty are inherent to the modeling process, PEST's nonlinear predictive analysis functionality is then used to explore the extent of model predictive uncertainty.
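A toy illustration of the regularized simultaneous-calibration idea, in the spirit of a Tikhonov penalty on inter-watershed parameter differences (this is not PEST itself; the model, data, and weight are invented for the sketch):

```python
import numpy as np
from scipy.optimize import least_squares

def joint_residuals(theta_flat, simulators, observations, n_params, weight):
    """Stacked residuals for calibrating similar watersheds simultaneously.

    simulators   : list of callables mapping a parameter vector to simulated
                   outputs for one watershed.
    observations : list of observed output arrays, one per watershed.
    weight       : penalty weight on differences between neighbouring
                   watersheds' parameters (the regularization idea).
    """
    thetas = theta_flat.reshape(len(simulators), n_params)
    fit = [sim(t) - obs
           for sim, t, obs in zip(simulators, thetas, observations)]
    penalty = [weight * (thetas[i] - thetas[i + 1])
               for i in range(len(thetas) - 1)]
    return np.concatenate(fit + penalty)

# Hypothetical two-parameter rainfall-runoff toy model, illustration only.
rain = np.linspace(0.0, 10.0, 25)
def make_sim(seed):
    rng = np.random.default_rng(seed)
    truth = np.array([1.2, 0.4]) + rng.normal(0, 0.05, 2)
    obs = truth[0] * rain + truth[1] * rain**2
    return (lambda t: t[0] * rain + t[1] * rain**2), obs

sims, obs = zip(*[make_sim(s) for s in range(4)])
x0 = np.ones(4 * 2)
sol = least_squares(joint_residuals, x0,
                    args=(list(sims), list(obs), 2, 5.0))
print(sol.x.reshape(4, 2))  # one parameter pair per watershed
```

Raising the weight pulls the four parameter sets together at some cost in fit, which is exactly the trade-off the abstract describes.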
Abstract:
Fungal diseases are important factors limiting common bean yield. White mold is one of the main diseases caused by soil pathogens. The objective of this study was to quantify, by spectrophotometry, the distribution of a fungicide solution sprayed into the canopy of bean plants using a boom sprayer with and without air assistance. The experiment was arranged in a 2 x 2 x 2 factorial (two types of nozzles, two application rates, and air assistance on or off) randomized block design with four replications. Air assistance influenced the deposition of the solution on the bean plants, and yield increased significantly with the higher application rate and with air assistance on the boom sprayer.
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfil increasing requirements within shorter cycle times and under rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the part thickness which, in reality, can vary locally; for reasons of complexity, however, a constant thickness value is almost always defined throughout the entire part. On the other hand, correct consideration of thickness is a key enabler for precise fracture analysis within FEM. Thus, availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
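A minimal sketch of the nearest-neighbour variant (the ray-tracing variant would instead intersect a ray cast along the normal with the opposite surface; all names here are illustrative, not the paper's implementation):

```python
import numpy as np
from scipy.spatial import cKDTree

def thickness_nearest_neighbour(centroids, normals, opposite_points):
    """Per-element thickness estimate by nearest-neighbour search.

    centroids       : (n, 3) sample points on one CAD surface or midsurface.
    normals         : (n, 3) unit normals at those points.
    opposite_points : (m, 3) sampling of the opposite CAD surface.

    The nearest opposite-surface point gives a lower-bound distance;
    projecting the offset onto the normal reduces the bias on curved
    or inclined regions.
    """
    tree = cKDTree(opposite_points)
    dist, idx = tree.query(centroids)            # plain nearest neighbour
    offset = opposite_points[idx] - centroids
    projected = np.abs(np.einsum("ij,ij->i", offset, normals))
    return dist, projected
```

Each variant errs in different geometric arrangements (e.g. near fillets or tapers), which is why a linear combination of the two estimates is attractive.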
Abstract:
Graphical user interfaces (GUIs) are critical components of today's software. Given their increased relevance, the correctness and usability of GUIs are becoming essential. This paper describes the latest results in the development of our tool to reverse engineer the GUI layer of interactive computing systems. We use static analysis techniques to generate models of user interface behaviour from source code. Models help in graphical user interface inspection by allowing designers to concentrate on its more important aspects. One particular type of model that the tool is able to generate is the state machine. The paper shows how graph theory can be useful when applied to these models. A number of metrics and algorithms are used in the analysis of aspects of the user interface's quality. The ultimate goal of the tool is to enable the analysis of interactive systems through inspection of GUI source code.
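A small illustration of the kind of graph-theoretic analysis described, using an invented state machine (networkx stands in here for whatever graph machinery the tool actually uses):

```python
import networkx as nx

# Hypothetical GUI behaviour model: nodes are windows/dialogs, edges are
# transitions triggered by event handlers found via static analysis.
gui = nx.DiGraph()
gui.add_edges_from([
    ("Login", "Main"), ("Main", "Settings"), ("Settings", "Main"),
    ("Main", "About"), ("About", "Main"), ("Main", "Exit"),
])

# Example metrics usable as GUI quality indicators:
print(nx.shortest_path_length(gui, "Login", "Exit"))  # steps to reach exit
print([n for n in gui if gui.out_degree(n) == 0])     # dead-end states
unreachable = set(gui) - set(nx.descendants(gui, "Login")) - {"Login"}
print(unreachable)                                    # unreachable from login
```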
Abstract:
When developing interactive applications, considering the correctness of graphical user interface (GUI) code is essential. GUIs are critical components of today's software, and contemporary software tools do not provide enough support for ensuring the quality of GUI code. GUIsurfer, a GUI reverse engineering tool, enables evaluation of behavioral properties of user interfaces. It performs static analysis of GUI code, generating state machines that can help in the evaluation of interactive applications. This paper describes the design, software architecture, and use of GUIsurfer through an example. The tool is easily re-targetable, and support is available for Java/Swing and WxHaskell. The paper sets the ground for a generalization effort to consider rich internet applications, exploring GWT, a user interface programming toolkit for web applications.
Abstract:
Purpose – Casting defects are usually easy to characterize, but eradicating them can be a difficult task. In many cases, defects are caused by the combined effect of different factors, whose identification is often difficult. Moreover, the real non-quality costs are usually unknown, or even neglected. This paper describes the development of a modular tool for quality improvement in foundries; its main objective is to present the application's potential and the foundry process areas that are covered and taken into account. Design/methodology/approach – The integrated model was conceived as an expert system, designated Qualifound, which performs both qualitative and quantitative analyses. For the qualitative analysis mode, the nomenclature and description of defects are based on the classification suggested by the International Committee of Foundry Technical Associations. Thus, a database of defects was established, enabling one to associate defects with the relevant process operations and to identify their possible causes. The quantitative analysis mode deals with the numbers of produced and rejected castings and includes the calculation of non-quality costs. Findings – The validation of Qualifound was carried out in a Portuguese foundry whose quality system had been certified according to the ISO 9000 standards. Qualifound was used in every management area, and it was concluded that the application had the technological requisites to provide the information the foundry management needed to improve process quality. Originality/value – The paper presents a successful application of an informatics tool to quality improvement in foundries.
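A minimal sketch of the quantitative mode's bookkeeping (the cost formula is an illustrative assumption, not Qualifound's actual calculation):

```python
def non_quality_cost(produced, rejected, unit_cost,
                     rework_fraction=0.0, rework_cost=0.0):
    """Reject rate and non-quality cost for a production period.

    produced, rejected : casting counts for the period.
    unit_cost          : full production cost of one casting.
    rework_fraction    : share of rejects reworked rather than scrapped.
    rework_cost        : cost of reworking one casting.
    """
    reject_rate = rejected / produced
    scrapped = rejected * (1.0 - rework_fraction)
    reworked = rejected * rework_fraction
    cost = scrapped * unit_cost + reworked * rework_cost
    return reject_rate, cost

rate, cost = non_quality_cost(produced=12_000, rejected=540, unit_cost=8.5,
                              rework_fraction=0.3, rework_cost=2.0)
print(f"reject rate = {rate:.1%}, non-quality cost = {cost:.2f}")
```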
Abstract:
The textile industry has a long tradition in Portugal and, despite the current economic crisis, remains one of the country's most important sectors. It has always assumed a prominent role in terms of employment and holds a relevant position within the Portuguese economy. The lack of quality, and the lower prices that other countries offer, cause the loss of clients; quality is nowadays a key tool for survival in the textile sector. To undertake our analysis, we made use of an existing database of 55 firms belonging to the textile industry, namely to the manufacturing sector. A new survey, based on the original one, was created and sent to 5 firms. Besides the survey, we also sent the firms a few questions in order to extract more information about the actual situation of the textile industry in our country. Several tables, graphs and pie charts were produced to help shed light on our findings. This research was conducted in order to determine the importance of quality in the consolidation of textile firms in the north of Portugal. Most firms in our sample feel that quality improvement, business benefits, mobilizing employees' knowledge and business image are important, and that competition is very intense, driven mainly by price rather than by differentiation of product or service. Their quality programs have contributed to improving their competitive position and their overall performance. The majority of the firms in our sample undertake TQM measures for quality purposes, to meet customer expectations and to prevent errors. For all the firms surveyed, quality is certainly very important to survival.
Abstract:
This article presents a research work whose goal was to achieve a model for the evaluation of data quality in institutional websites of health units in a broad and balanced way. We carried out a literature review of the available approaches for the evaluation of website content quality, in order to identify the most recurrent dimensions and attributes, and we also carried out a Delphi method process with experts in order to reach an adequate set of attributes and their respective weights for the measurement of content quality. The results obtained revealed a high level of consensus among the experts who participated in the Delphi process. Moreover, the different statistical analyses and techniques implemented are robust and lend confidence to our results and to the model obtained.
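Measurement with such a model reduces to a weighted sum; the attributes and weights below are invented placeholders, not the model's actual Delphi results:

```python
# Hypothetical attribute weights from a Delphi round (illustrative values).
weights = {"accuracy": 0.30, "currency": 0.25,
           "completeness": 0.25, "readability": 0.20}
scores = {"accuracy": 4, "currency": 3,
          "completeness": 5, "readability": 4}  # expert ratings, 1-5 scale

quality = sum(weights[a] * scores[a] for a in weights)
print(f"weighted content-quality score: {quality:.2f} / 5")
```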
Abstract:
This paper assesses the validity and reliability of two instruments measuring quality of service, the SERVPERF and SERVQUAL scales, replicated in a novel cultural setting: a Portuguese energy company. To provide insights and strategies for managerial intervention, a relation between customers' satisfaction and quality of service is established. The empirical study suggests a superior convergent and predictive validity of the SERVPERF scale for measuring quality of service in this setting when compared to SERVQUAL. The main differences between this study and previous ones are that it relies on confirmatory factor analysis, validates the instruments using the same measures suggested by their creators, and extends the line of research to a novel cultural setting, a Portuguese energy company. Concerning the relationship between service quality and customers' satisfaction, all of the quality of service attributes correlate almost equally with the satisfaction attributes, with a lower weight for tangibles.