942 results for Quality models


Relevance: 30.00%

Abstract:

A fuzzy waste-load allocation model, FWLAM, is developed for water quality management of a river system using fuzzy multiple-objective optimization. An important feature of this model is its capability to incorporate the aspirations and conflicting objectives of the pollution control agency and dischargers. The vagueness associated with specifying the water quality criteria and fraction removal levels is modeled in a fuzzy framework. The goals related to the pollution control agency and dischargers are expressed as fuzzy sets. The membership functions of these fuzzy sets are considered to represent the variation of satisfaction levels of the pollution control agency and dischargers in attaining their respective goals. Two formulations—namely, the MAX-MIN and MAX-BIAS formulations—are proposed for FWLAM. The MAX-MIN formulation maximizes the minimum satisfaction level in the system. The MAX-BIAS formulation maximizes a bias measure, giving a solution that favors the dischargers. Maximization of the bias measure attempts to keep the satisfaction levels of the dischargers away from the minimum satisfaction level and that of the pollution control agency close to the minimum satisfaction level. Most of the conventional water quality management models use waste treatment cost curves that are uncertain and nonlinear. Unlike such models, FWLAM avoids the use of cost curves. Further, the model provides the flexibility for the pollution control agency and dischargers to specify their aspirations independently.
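
As a concrete illustration of the MAX-MIN idea described above, the sketch below casts a two-discharger problem with linear membership functions as a linear program that maximizes the minimum satisfaction level. The removal bounds, the linear membership shapes, and the two-discharger setup are illustrative assumptions, not the formulation published for FWLAM.

```python
# A minimal sketch of a MAX-MIN fuzzy formulation, assuming linear membership
# functions; variable names, bounds, and the two-discharger setup are hypothetical.
from scipy.optimize import linprog

# Decision variables: fractional removal levels x1, x2 for two dischargers,
# plus the minimum satisfaction level lam to be maximized.
# PCA goal: higher removal is better   -> mu_pca = (x - x_lo) / (x_hi - x_lo)
# Discharger goal: lower removal is better -> mu_dis = (x_hi - x) / (x_hi - x_lo)
x_lo, x_hi = 0.30, 0.90          # hypothetical acceptable removal range
s = x_hi - x_lo

# Variables ordered as [x1, x2, lam]; maximizing lam == minimizing -lam.
c = [0.0, 0.0, -1.0]

# Constraints mu(x) >= lam rewritten in the form A_ub @ z <= b_ub.
A_ub = [
    [-1.0 / s, 0.0, 1.0], [0.0, -1.0 / s, 1.0],   # pollution control agency goals
    [ 1.0 / s, 0.0, 1.0], [0.0,  1.0 / s, 1.0],   # discharger goals
]
b_ub = [-x_lo / s, -x_lo / s, x_hi / s, x_hi / s]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(x_lo, x_hi), (x_lo, x_hi), (0.0, 1.0)])
x1, x2, lam = res.x
print(f"removal levels: {x1:.2f}, {x2:.2f}; minimum satisfaction: {lam:.2f}")
```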

Relevance: 30.00%

Abstract:

With a growing population and fast urbanization in Australia, maintaining water quality is a challenging task. It is essential to develop an appropriate statistical methodology for analyzing water quality data in order to draw valid conclusions and hence provide useful advice for water management. This paper develops robust rank-based procedures for analyzing nonnormally distributed data collected over time at different sites. To take account of temporal correlations of the observations within sites, we consider the optimally combined estimating functions proposed by Wang and Zhu (Biometrika, 93:459-464, 2006), which lead to more efficient parameter estimation. Furthermore, we apply the induced smoothing method to reduce the computational burden. Smoothing leads to easy calculation of the parameter estimates and their variance-covariance matrix. Analysis of water quality data for Total Iron and Total Cyanophytes shows the differences between traditional generalized linear mixed models and rank regression models. Our analysis also demonstrates the advantages of the rank regression models for analyzing nonnormal data.
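
As a rough illustration of rank-based regression in general, the sketch below estimates slopes by minimizing Jaeckel's dispersion with Wilcoxon scores on simulated heavy-tailed data. The temporal-correlation weighting of Wang and Zhu and the induced smoothing step described above are omitted, so this is only a simplified stand-in for the paper's procedure.

```python
# A minimal sketch of rank-based (Wilcoxon-score) regression via the Jaeckel
# dispersion criterion; data are simulated for illustration only.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import rankdata

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.standard_t(df=3, size=n)  # heavy-tailed errors

def wilcoxon_dispersion(beta, X, y):
    """Jaeckel dispersion: Wilcoxon scores of residual ranks times residuals."""
    e = y - X @ beta
    r = rankdata(e)
    a = np.sqrt(12.0) * (r / (len(e) + 1.0) - 0.5)
    return np.sum(a * e)

fit = minimize(wilcoxon_dispersion, x0=np.zeros(2), args=(X, y), method="Nelder-Mead")
print("rank-regression slopes:", np.round(fit.x, 3))
```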

Relevance: 30.00%

Abstract:

Environmental data usually include measurements, such as water quality data, that fall below detection limits because of limitations of the instruments or of certain analytical methods used. The fact that some responses are not detected needs to be properly taken into account in the statistical analysis of such data. However, it is well known that analyzing a data set with detection limits is challenging, and we often have to rely on traditional parametric methods or simple imputation methods. Distributional assumptions can lead to biased inference, and justification of distributions is often not possible when the data are correlated and there is a large proportion of data below detection limits. The extent of bias is usually unknown. To draw valid conclusions and hence provide useful advice for environmental management authorities, it is essential to develop and apply an appropriate statistical methodology. This paper proposes rank-based procedures for analyzing non-normally distributed data collected at different sites over a period of time in the presence of multiple detection limits. To take account of temporal correlations within each site, we propose an optimal linear combination of estimating functions and apply the induced smoothing method to reduce the computational burden. Finally, we apply the proposed method to water quality data collected in the Susquehanna River Basin in the United States of America, which clearly demonstrates the advantages of the rank regression models.
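
One common practical device for rank methods with multiple detection limits is to re-censor all observations at the largest detection limit so that non-detects share tied minimum ranks; the sketch below shows that preparation step on made-up numbers. It is an assumption about how such data could be handled, not the paper's exact procedure.

```python
# A minimal sketch of re-censoring observations at the largest detection limit
# before ranking; values and detection limits are invented for illustration.
import numpy as np
from scipy.stats import rankdata

values = np.array([0.8, 1.2, 0.3, 2.5, 0.4, 3.1, 0.2, 1.9])
detection_limit = np.array([0.5, 0.5, 0.5, 1.0, 1.0, 1.0, 0.3, 0.3])  # varies by method
censored = values < detection_limit                # non-detects

max_dl = detection_limit[censored].max()           # highest detection limit among non-detects
recensored = np.where(values < max_dl, max_dl, values)
ranks = rankdata(recensored)                       # censored values share tied minimum ranks
print(ranks)
```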

Relevance: 30.00%

Abstract:

Spectral data were collected from intact and ground kernels using three instruments (with Si-PbS, Si, and InGaAs detectors), operating over different regions of the spectrum (between 400 and 2500 nm) and employing transmittance, interactance, and reflectance sample presentation strategies. Kernels were assessed on the basis of oil and water content, and with respect to the defect categories of insect damage, rancidity, discoloration, mould growth, germination, and decomposition. Predictive model performance statistics for oil content models were acceptable on all instruments (R2 > 0.98; RMSECV < 2.5%, which is similar to the reference analysis error), although the model for the instrument employing reflectance optics was inferior to the models developed for the instruments employing transmission optics. The spectral positions of the calibration coefficients were consistent with absorbance due to the third overtones of CH2 stretching. Calibration models for moisture content in ground samples were acceptable on all instruments (R2 > 0.97; RMSECV < 0.2%), whereas calibration models for intact kernels were relatively poor. Calibration coefficients were more highly weighted around 1360, 740 and 840 nm, consistent with absorbance due to O-H stretching overtones and combination bands. Intact kernels with brown centres or rancidity could be discriminated from each other and from sound kernels using principal component analysis. Part kernels affected by insect damage, discoloration, mould growth, germination, and decomposition could be discriminated from sound kernels. However, discrimination among these defect categories was not distinct and could not be validated on an independent set. It is concluded that there is good potential for a low-cost Si photodiode array instrument to be employed to identify some quality defects of intact macadamia kernels, to quantify oil and moisture content of kernels in the process laboratory, and to quantify oil content in-line. Further work is required to examine the robustness of predictive models across different populations, including growing districts, cultivars and times of harvest.
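
For readers unfamiliar with the calibration statistics quoted above, the sketch below shows how the R2 and RMSECV of a partial least squares calibration are typically obtained by cross-validation. The simulated spectra, the reference oil values, and the choice of eight PLS factors are placeholders rather than the study's data or settings.

```python
# A minimal sketch of a PLS calibration for oil content from NIR spectra with
# cross-validation; the spectra and reference values are simulated placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, cross_val_predict

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 120, 300
spectra = rng.normal(size=(n_samples, n_wavelengths))
oil = 70 + 3 * spectra[:, 50] + rng.normal(scale=0.8, size=n_samples)   # % oil (simulated)

pls = PLSRegression(n_components=8)
cv = KFold(n_splits=10, shuffle=True, random_state=0)
pred = cross_val_predict(pls, spectra, oil, cv=cv).ravel()

rmsecv = np.sqrt(np.mean((pred - oil) ** 2))
r2 = np.corrcoef(pred, oil)[0, 1] ** 2
print(f"RMSECV = {rmsecv:.2f} %   R2 = {r2:.3f}")
```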

Relevance: 30.00%

Abstract:

The robustness of multivariate calibration models, based on near infrared spectroscopy, for the assessment of total soluble solids (TSS) and dry matter (DM) of intact mandarin fruit (Citrus reticulata cv. Imperial) was assessed. TSS calibration model performance was validated in terms of prediction of populations of fruit not in the original population (different harvest days from a single tree, different harvest localities, different harvest seasons). Of these, calibration performance was most affected by validation across seasons (signal to noise statistic on root mean squared error of prediction of 3.8, compared with 20 and 13 for locality and harvest day, respectively). Procedures for sample selection from the validation population for addition to the calibration population (‘model updating’) were considered for both TSS and DM models. Random selection from the validation group worked as well as more sophisticated selection procedures, with approximately 20 samples required. Models that were developed using samples at a range of temperatures were robust in validation for TSS and DM.
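
The model-updating procedure described above can be illustrated as follows: roughly 20 randomly chosen samples from the new population are appended to the calibration set and the model is refit. The PLS model type, the number of factors, and the simulated arrays are illustrative assumptions, not the study's data.

```python
# A minimal sketch of "model updating" by random sample selection from a new
# population; all arrays are simulated placeholders, not the mandarin data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
X_cal, y_cal = rng.normal(size=(150, 100)), rng.normal(loc=12, size=150)        # original season
X_new, y_new = rng.normal(size=(80, 100)) + 0.3, rng.normal(loc=11, size=80)    # new season

def rmsep(model, X, y):
    return float(np.sqrt(np.mean((model.predict(X).ravel() - y) ** 2)))

base = PLSRegression(n_components=6).fit(X_cal, y_cal)

# Randomly recruit ~20 samples from the new population into the calibration set.
idx = rng.choice(len(X_new), size=20, replace=False)
X_upd = np.vstack([X_cal, X_new[idx]])
y_upd = np.concatenate([y_cal, y_new[idx]])
updated = PLSRegression(n_components=6).fit(X_upd, y_upd)

mask = np.ones(len(X_new), bool)
mask[idx] = False                                   # evaluate on held-out new-season samples
print("RMSEP before update:", round(rmsep(base, X_new[mask], y_new[mask]), 2))
print("RMSEP after update: ", round(rmsep(updated, X_new[mask], y_new[mask]), 2))
```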

Relevance: 30.00%

Abstract:

Predictive models based on near infra-red spectroscopy for the assessment of fruit internal quality attributes must exhibit a degree of robustness across the parameters of variety, district and time to be of practical use in fruit grading. At the time this thesis was initiated, while there were a number of published reports on the development of near infra-red based calibration models for the assessment of internal quality attributes of intact fruit, there were no reports of the reliability ("robustness") of such models across time, cultivars or growing regions. As existing published reports varied in the instrumentation employed, a re-analysis of existing data was not possible. An instrument platform, based on partial transmittance optics, a halogen light source and a Zeiss MMS 1 detector operating in the short wavelength near infra-red region, was developed for use in the assessment of intact fruit. This platform was used to assess populations of macadamia kernels, melons and mandarin fruit for total soluble solids, dry matter and oil concentration. Calibration procedures were optimised and robustness was assessed across growing areas, time of harvest, season and variety. In general, global modified partial least squares regression (MPLS) calibration models based on derivatised absorbance data were better than either multiple linear regression or 'local' MPLS models in the prediction of independent validation populations. Robustness was most affected by growing season, relative to growing district or variety. Various calibration updating procedures were evaluated in terms of calibration robustness. Random selection of samples from the validation population for addition to the calibration population was equivalent to or better than other methods of sample addition (methods based on the Mahalanobis distance of samples from either the centroid of the population or neighbourhood samples). In these exercises the global Mahalanobis distance (GH) was calculated by applying the scores and loadings from the calibration population to the independent validation population. In practice, it is recommended that model predictive performance be monitored in terms of predicted sample GH, with model updating using as few as 10 samples from the new population undertaken when the average GH value exceeds 1.0.
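
The GH-based monitoring recommendation can be sketched as follows: new-sample scores are obtained with the calibration loadings, a Mahalanobis distance is computed against the calibration score distribution, and updating is triggered when the average GH exceeds 1.0. The score-space convention, the division by the number of factors, and the simulated data are assumptions for illustration only.

```python
# A minimal sketch of monitoring new samples by global Mahalanobis distance (GH)
# in PLS score space; data and conventions are illustrative, not the thesis setup.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
X_cal, y_cal = rng.normal(size=(200, 120)), rng.normal(size=200)
X_new = 1.6 * rng.normal(size=(50, 120))            # a drifted new population

k = 8
pls = PLSRegression(n_components=k).fit(X_cal, y_cal)
T_cal = pls.transform(X_cal)                         # calibration scores
T_new = pls.transform(X_new)                         # new-sample scores via the same loadings

cov_inv = np.linalg.inv(np.cov(T_cal, rowvar=False))
centred = T_new - T_cal.mean(axis=0)
gh = np.einsum("ij,jk,ik->i", centred, cov_inv, centred) / k   # GH per sample

print("mean GH:", round(float(gh.mean()), 2))
if gh.mean() > 1.0:
    print("average GH exceeds 1.0 -> add ~10 new-population samples and refit")
```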

Relevance: 30.00%

Abstract:

Absenteeism is one of the major problems of Indian industries. It necessitates the employment of more manpower than the jobs require, resulting in increased manpower costs, and it lowers the efficiency of plant operation through lowered performance and higher rejects. It also causes machine idleness if extra manpower is not hired, resulting in disrupted work schedules and assignments. Several studies have investigated the causes of absenteeism and their remedies (for example, Vaid 1967), as well as the relationship between absenteeism and turnover, with a suggested model for diagnosis and treatment (Hawk 1976). However, production foremen and supervisors face the operating task of determining how many extra operatives are to be hired in order to stave off the adverse effects of absenteeism on the man-machine system. This paper deals with a class of reserve manpower models based on the reject allowance model familiar in the quality control literature. The present study considers, in addition to absenteeism, machine failures and the graded nature of manpower found in production systems, and seeks to find the optimal reserve manpower through computer simulation.
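
A simulation of the kind described above can be sketched as follows: for each candidate reserve size, daily absenteeism and machine availability are sampled and the trade-off between reserve wages and idle-machine losses is evaluated. The single-grade workforce, the rates, and the cost ratios are hypothetical simplifications of the paper's graded-manpower model.

```python
# A minimal Monte Carlo sketch of sizing reserve manpower against absenteeism
# and machine failures; all rates and costs are hypothetical, and the graded
# manpower structure of the paper is collapsed to a single grade for brevity.
import numpy as np

rng = np.random.default_rng(4)
N_JOBS, P_ABSENT, P_MACHINE_UP = 50, 0.08, 0.95
COST_RESERVE, COST_IDLE_MACHINE = 1.0, 4.0          # relative daily costs
DAYS = 10_000

def expected_daily_cost(reserve):
    absent = rng.binomial(N_JOBS, P_ABSENT, size=DAYS)
    machines_up = rng.binomial(N_JOBS, P_MACHINE_UP, size=DAYS)
    workers = N_JOBS - absent + reserve
    manned = np.minimum(workers, machines_up)        # machines that can actually run
    idle = machines_up - manned
    return reserve * COST_RESERVE + COST_IDLE_MACHINE * idle.mean()

costs = {r: expected_daily_cost(r) for r in range(0, 11)}
best = min(costs, key=costs.get)
print("reserve size with lowest expected cost:", best)
```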

Relevance: 30.00%

Abstract:

Objectives: To review models of care for older adults with cancer, with a focus on the role of the oncology nurse in geriatric oncology care. International exemplars of geriatric oncology nursing care are discussed. Data sources: Published peer-reviewed literature, web-based resources, professional society materials, and the authors' experience. Conclusion: Nursing care for older patients with cancer is complex and requires integrating knowledge from multiple disciplines, blending the sciences of geriatrics, oncology, and nursing, and recognizing the dimensions of quality of life. Implications for Nursing Practice: Oncology nurses can benefit from learning key skills of comprehensive geriatric screening and assessment to improve the care they provide for older adults with cancer.

Relevance: 30.00%

Abstract:

The quality of species distribution models (SDMs) relies to a large degree on the quality of the input data, from bioclimatic indices to environmental and habitat descriptors (Austin, 2002). Recent reviews of SDM techniques have sought to optimize predictive performance (e.g. Elith et al., 2006). In general, SDMs employ one of three approaches to variable selection. The simplest approach relies on the expert to select the variables, as in environmental niche models (Nix, 1986) or a generalized linear model without variable selection (Miller and Franklin, 2002). A second approach explicitly incorporates variable selection into model fitting, which allows examination of particular combinations of variables. Examples include generalized linear or additive models with variable selection (Hastie et al., 2002), and classification trees with complexity- or model-based pruning (Breiman et al., 1984; Zeileis, 2008). A third approach uses model averaging to summarize the overall contribution of a variable, without considering particular combinations. Examples include neural networks, boosted or bagged regression trees, and Maximum Entropy, as compared in Elith et al. (2006). Typically, users of SDMs will either consider a small number of variable sets, via the first approach, or else supply all of the candidate variables (often numbering more than a hundred) to the second or third approaches. Bayesian SDMs exist, with several methods for eliciting and encoding priors on model parameters (see review in Low Choy et al., 2010). However, few methods have been published for informative variable selection; one example is Bayesian trees (O'Leary, 2008). Here we report an elicitation protocol that helps make explicit a priori expert judgements on the quality of candidate variables. This protocol can be flexibly applied to any of the three approaches to variable selection described above, Bayesian or otherwise. We demonstrate how this information can be obtained and then used to guide variable selection in classical or machine learning SDMs, or to define priors within Bayesian SDMs.
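
A minimal sketch of the general idea of letting elicited expert judgements guide variable selection is given below: candidate covariates scored below an agreed quality threshold are dropped before a classical SDM is fit. The covariate names, scores, and threshold are hypothetical and the sketch does not reproduce the elicitation protocol reported here.

```python
# A minimal sketch of expert-quality-score filtering of candidate variables
# before fitting a logistic-regression SDM; all names, scores, and data are
# hypothetical illustrations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
candidates = ["annual_rainfall", "max_temp", "soil_ph", "distance_to_roads"]
expert_quality = {"annual_rainfall": 0.9, "max_temp": 0.8,
                  "soil_ph": 0.6, "distance_to_roads": 0.2}   # elicited a priori scores

X = rng.normal(size=(300, len(candidates)))
presence = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=300) > 0).astype(int)

keep = [i for i, name in enumerate(candidates) if expert_quality[name] >= 0.5]
sdm = LogisticRegression().fit(X[:, keep], presence)
print("retained variables:", [candidates[i] for i in keep])
print("coefficients:", np.round(sdm.coef_.ravel(), 2))
```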

Relevance: 30.00%

Abstract:

The impact of erroneous genotypes having passed standard quality control (QC) can be severe in genome-wide association studies, genotype imputation, and estimation of heritability and prediction of genetic risk based on single nucleotide polymorphisms (SNPs). To detect such genotyping errors, a simple two-locus QC method, based on the difference in the test statistic of association between single SNPs and pairs of SNPs, was developed and applied. The proposed approach could detect many problematic SNPs with statistical significance in real data, even when standard single-SNP QC analyses fail to detect them. Depending on the data set used, the number of erroneous SNPs that were not filtered out by standard single-SNP QC but were detected by the proposed approach varied from a few hundred to thousands. Using simulated data, it was shown that the proposed method was powerful and performed better than other existing methods tested. The power of the proposed approach to detect erroneous genotypes was approximately 80% for a 3% error rate per SNP. This novel QC approach is easy to implement and computationally efficient, and it can lead to better-quality genotypes for subsequent genotype-phenotype investigations.
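
The flavour of a two-locus contrast can be sketched as below: for each SNP, the single-SNP association statistic is compared with the statistic obtained when a neighbouring SNP is added, and SNPs with unusually large discrepancies are flagged. The trait model, the neighbour pairing, and the flagging threshold are illustrative assumptions rather than the published test.

```python
# A minimal sketch of contrasting single-SNP and two-SNP association statistics;
# genotypes, trait, and threshold are simulated placeholders, not the method's exact test.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n, m = 500, 50
geno = rng.binomial(2, 0.3, size=(n, m)).astype(float)      # 0/1/2 genotype codes
trait = 0.4 * geno[:, 10] + rng.normal(size=n)

def assoc_chi2(y, X):
    """Wald chi-square for the first column of X in an OLS fit with intercept."""
    fit = sm.OLS(y, sm.add_constant(X)).fit()
    return float((fit.params[1] / fit.bse[1]) ** 2)

diffs = []
for j in range(m - 1):
    single = assoc_chi2(trait, geno[:, [j]])
    paired = assoc_chi2(trait, geno[:, [j, j + 1]])          # same SNP with a neighbour added
    diffs.append(abs(single - paired))

flagged = [j for j, d in enumerate(diffs) if d > np.mean(diffs) + 3 * np.std(diffs)]
print("SNPs flagged for follow-up QC:", flagged)
```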

Relevance: 30.00%

Abstract:

The safety of food has become an issue of increasing interest to consumers and the media. It has also become a source of concern, as the amount of information on the risks related to food safety continues to expand. Today, risk and safety are permanent elements within the concept of food quality. Safety, in particular, is the attribute that consumers find very difficult to assess. The literature reviewed in this study covers three main themes: traceability; consumer behaviour related to quality and safety issues and the perception of risk; and valuation methods. The empirical scope of the study was restricted to beef, because the beef labelling system enables reliable tracing of the origin of beef, as well as of attributes related to safety, environmental friendliness and animal welfare. The purpose of this study was to examine what kind of information flows are required to ensure quality and safety in the food chain for beef, and who should produce that information. Studying consumers' willingness to pay makes it possible to determine whether consumers consider the quantity of information available on the safety and quality of beef sufficient. One of the main findings of this study was that the majority of Finnish consumers (73%) regard increased quality information as beneficial. These benefits were assessed using the contingent valuation method. The results showed that those who were willing to pay for increased information on the quality and safety of beef would accept an average price increase of 24% per kilogram. The results showed that certain risk factors impact consumer willingness to pay. If the respondents considered genetic modification of food or foodborne zoonotic diseases to be harmful or extremely harmful risk factors in food, they were more likely to be willing to pay for quality information. The results produced by the models thus confirmed the premise that certain food-related risks affect willingness to pay for beef quality information. The results also showed that safety-related quality cues are significant to consumers. In the first place, consumers would like to receive information on the control of zoonotic diseases that are contagious to humans. Similarly, other process-control-related information ranked high among the top responses. Information on any potential genetic modification was also considered important, even though genetic modification was not regarded as a high risk factor.
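
The link between perceived risks and willingness to pay described above is often modelled with a binary-choice regression; the sketch below fits a logit of a willingness-to-pay indicator on two perceived-risk scores. The survey variables and data are hypothetical stand-ins for the study's contingent valuation material.

```python
# A minimal sketch of a logit model relating perceived food risks to willingness
# to pay for quality information; variables and data are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 400
gm_risk = rng.integers(1, 6, n)           # perceived harm of genetic modification (1-5)
zoonosis_risk = rng.integers(1, 6, n)     # perceived harm of foodborne zoonoses (1-5)
latent = -2.0 + 0.4 * gm_risk + 0.5 * zoonosis_risk + rng.logistic(size=n)
wtp = (latent > 0).astype(int)            # 1 = willing to pay for quality information

X = sm.add_constant(np.column_stack([gm_risk, zoonosis_risk]))
logit = sm.Logit(wtp, X).fit(disp=False)
print(logit.params)                       # positive signs: higher perceived risk, higher WTP
```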

Relevance: 30.00%

Abstract:

The treatment of large segmental bone defects remains a significant clinical challenge. Due to limitations surrounding the use of bone grafts, tissue-engineered constructs for the repair of large bone defects could offer an alternative. Before translation of any newly developed tissue engineering (TE) approach to the clinic, efficacy of the treatment must be shown in a validated preclinical large animal model. Currently, biomechanical testing, histology, and microcomputed tomography are performed to assess the quality and quantity of the regenerated bone. However, in vivo monitoring of the progression of healing is seldom performed, although it could reveal important information regarding the time to restoration of mechanical function and the acceleration of regeneration. Furthermore, since the mechanical environment is known to influence bone regeneration, and limb loading of the animals can be poorly controlled, characterizing activity and load history could make it possible to explain variability in the acquired data sets and to identify potential outliers caused by abnormal loading. Many approaches have been devised to monitor the progression of healing and characterize the mechanical environment in fracture healing studies. In this article, we review previous methods and share results of recent work by our group toward developing and implementing a comprehensive biomechanical monitoring system to study bone regeneration in preclinical TE studies.

Relevance: 30.00%

Abstract:

Grape drying is a slow and energy-intensive process because the waxy peel has low permeability to moisture. Therefore, chemical and physical pretreatments of the peel are applied before drying in order to facilitate water diffusion. However, they cause heterogeneity in wax removal and problems during shelf life. In this paper an alternative abrasive pretreatment of the grape peel, intended to enhance the drying rate and preserve the samples, was applied to Red Globe grapes. Convective drying experiments were carried out at 40-70 °C and at an air velocity of 2.3 m/s. The effect of the abrasive wax pretreatment on the drying kinetics and quality parameters of the raisins was investigated. The results were compared with those of samples pretreated by dipping in an alkaline ethyl oleate solution and with untreated grapes. All the dried samples were darker than the fresh ones and shrunken. The samples pretreated by peel abrasion and dried at 50 °C showed the smallest color changes, the least shrinkage and the best rehydration capacity. The drying kinetics and shrinkage curves were also analyzed using some commonly available empirical models.
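
As an example of the commonly available empirical drying models mentioned above, the sketch below fits the Page thin-layer model MR = exp(-k*t^n) to a moisture-ratio series. The data points are illustrative, not the Red Globe measurements, and the Page model is only one of the candidate equations such studies compare.

```python
# A minimal sketch of fitting the Page thin-layer drying model to illustrative
# time/moisture-ratio data; not the study's measurements.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.0, 2, 4, 8, 12, 18, 24, 32, 40])                       # drying time, h
mr = np.array([1.0, 0.86, 0.74, 0.55, 0.42, 0.29, 0.20, 0.12, 0.08])   # moisture ratio

def page(t, k, n):
    return np.exp(-k * t ** n)

(k, n), _ = curve_fit(page, t, mr, p0=(0.05, 1.0), bounds=(0, np.inf))
rmse = np.sqrt(np.mean((page(t, k, n) - mr) ** 2))
print(f"k = {k:.4f}, n = {n:.3f}, RMSE = {rmse:.4f}")
```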

Relevance: 30.00%

Abstract:

Uncertainty plays an important role in water quality management problems. The major sources of uncertainty in a water quality management problem are the random nature of hydrologic variables and the imprecision (fuzziness) associated with the goals of the dischargers and pollution control agencies (PCA). Many Waste Load Allocation (WLA) problems are solved by considering these two sources of uncertainty. Apart from randomness and fuzziness, missing data in the time series of a hydrologic variable may result in additional uncertainty due to partial ignorance. These uncertainties render the input parameters imprecise in water quality decision making. In this paper an Imprecise Fuzzy Waste Load Allocation Model (IFWLAM) is developed for water quality management of a river system subject to uncertainty arising from partial ignorance. In a WLA problem, both randomness and imprecision can be addressed simultaneously through the fuzzy risk of low water quality. A methodology is developed for the computation of imprecise fuzzy risk of low water quality when the parameters are characterized by uncertainty due to partial ignorance. A Monte Carlo simulation is performed to evaluate the imprecise fuzzy risk of low water quality by considering the input variables as imprecise. Fuzzy multiobjective optimization is used to formulate the multiobjective model. The model developed is based on a fuzzy multiobjective optimization problem with max-min as the operator. This usually does not result in a unique solution but gives multiple solutions. Two optimization models are developed to capture all the decision alternatives or multiple solutions. The objective of the two optimization models is to obtain a range of fractional removal levels for the dischargers, such that the resultant fuzzy risk will be within acceptable limits. Specification of a range for the fractional removal levels enhances flexibility in decision making. The methodology is demonstrated with a case study of the Tunga-Bhadra river system in India.
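
The Monte Carlo evaluation of fuzzy risk can be sketched in a simplified form: the risk is estimated as the expected membership of a simulated water quality response in a fuzzy "low water quality" set. The dissolved-oxygen example, the membership breakpoints, and the sampling distribution below are illustrative assumptions and do not capture the imprecise (interval-valued) treatment developed in the paper.

```python
# A minimal Monte Carlo sketch of fuzzy risk of low water quality, taken here as
# the expected membership in a fuzzy "low quality" set; all numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(8)

def mu_low_quality(do):
    """Fuzzy membership of 'low water quality' from dissolved oxygen (mg/L)."""
    return np.clip((6.0 - do) / (6.0 - 4.0), 0.0, 1.0)   # 1 below 4 mg/L, 0 above 6 mg/L

# Hydrologic uncertainty: simulate the dissolved-oxygen response at a checkpoint.
do_samples = rng.normal(loc=5.5, scale=0.7, size=100_000)
fuzzy_risk = mu_low_quality(do_samples).mean()
print(f"fuzzy risk of low water quality: {fuzzy_risk:.3f}")
```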

Relevance: 30.00%

Abstract:

Modern-day weather forecasting is highly dependent on Numerical Weather Prediction (NWP) models as the main data source. The evolving state of the atmosphere with time can be numerically predicted by solving a set of hydrodynamic equations, if the initial state is known. However, such a modelling approach always contains approximations that by and large depend on the purpose of use and the resolution of the models. Present-day NWP systems operate with horizontal model resolutions in the range from about 40 km to 10 km. Recently, the aim has been to reach operationally to scales of 1-4 km. This requires fewer approximations in the model equations, more complex treatment of physical processes and, furthermore, more computing power. This thesis concentrates on the physical parameterization methods used in high-resolution NWP models. The main emphasis is on the validation of the grid-size-dependent convection parameterization in the High Resolution Limited Area Model (HIRLAM) and on a comprehensive intercomparison of radiative-flux parameterizations. In addition, the problems related to wind prediction near the coastline are addressed with high-resolution meso-scale models. The grid-size-dependent convection parameterization is clearly beneficial for NWP models operating with a dense grid. Results show that the current convection scheme in HIRLAM is still applicable down to a 5.6 km grid size. However, with further improved model resolution, the tendency of the model to overestimate strong precipitation intensities increases in all the experiment runs. For the clear-sky longwave radiation parameterization, schemes used in NWP models provide much better results in comparison with simple empirical schemes. On the other hand, for the shortwave part of the spectrum, the empirical schemes are more competitive in producing fairly accurate surface fluxes. Overall, even the complex radiation parameterization schemes used in NWP models seem to be slightly too transparent for both longwave and shortwave radiation in clear-sky conditions. For cloudy conditions, simple cloud correction functions are tested. In the case of longwave radiation, the empirical cloud correction methods provide rather accurate results, whereas for shortwave radiation the benefit is only marginal. Idealised high-resolution two-dimensional meso-scale model experiments suggest that the reason for the observed formation of the afternoon low-level jet (LLJ) over the Gulf of Finland is an inertial oscillation mechanism, when the large-scale flow is from the south-east or west. The LLJ is further enhanced by the sea-breeze circulation. A three-dimensional HIRLAM experiment, with a 7.7 km grid size, is able to generate a similar LLJ flow structure to that suggested by the 2D experiments and observations. It is also pointed out that improved model resolution does not necessarily lead to better wind forecasts in the statistical sense. In nested systems, the quality of the large-scale host model is very important, especially if the inner meso-scale model domain is small.
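
As an example of the simple empirical clear-sky longwave schemes referred to above, the sketch below implements a Brunt-type formula for the downward longwave flux; the coefficients are common textbook values and are not taken from the thesis.

```python
# A minimal sketch of a Brunt-type empirical clear-sky downward longwave scheme;
# the coefficients a and b are typical textbook values, used here only for illustration.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def downward_longwave_clear_sky(t_air_k, vapour_pressure_hpa, a=0.52, b=0.065):
    """Brunt-type estimate of the clear-sky downward longwave flux (W m^-2)."""
    emissivity = a + b * vapour_pressure_hpa ** 0.5
    return emissivity * SIGMA * t_air_k ** 4

# Example: 10 degrees C air temperature and 10 hPa vapour pressure.
print(round(downward_longwave_clear_sky(283.15, 10.0), 1), "W m^-2")
```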