885 results for fuzzy based evaluation method
Abstract:
In stored grains, deposits lower than the theoretical insecticide doses, and with great variation, are frequently found. The objective of this work was to study the effectiveness of the standard method (ISO 5682/1-1996) employed to evaluate hydraulic nozzles used in stored corn and wheat grain protection experiments. The transversal volumetric distribution and droplet spectrum of a model TJ-60 8002EVS nozzle were determined in order to calibrate a spraying system for an application rate of 5 L/t and to obtain theoretical concentrations of 10 and 0.5 mg/kg of fenitrothion and esfenvalerate, respectively. After treatment, the corn and wheat grains were processed and deposition was analyzed by gas chromatography. Insecticide deposition was not influenced by the type of grain and depended only on the insecticide applied. The deposits on the grains reached only 42.1 and 38.2% of the intended theoretical values for the fenitrothion and esfenvalerate concentrations, respectively. These results demonstrate the ineffectiveness of the standard evaluation method for hydraulic nozzles employed in stored grain protection experiments.
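As a quick illustration of the arithmetic behind the recovery figures above, the following sketch (not the authors' code; the measured concentrations are assumed values back-calculated from the reported percentages) expresses a measured deposit as a percentage of the intended theoretical concentration.

```python
# Illustrative recovery calculation: deposit recovered on the grain as a
# percentage of the intended theoretical insecticide concentration.
def recovery_percent(measured_mg_per_kg: float, theoretical_mg_per_kg: float) -> float:
    """Measured deposit as a percentage of the theoretical target concentration."""
    return 100.0 * measured_mg_per_kg / theoretical_mg_per_kg

# Assumed measured values consistent with the abstract's 42.1% and 38.2% figures:
print(recovery_percent(4.21, 10.0))    # fenitrothion, ~42.1% of the 10 mg/kg target
print(recovery_percent(0.191, 0.5))    # esfenvalerate, ~38.2% of the 0.5 mg/kg target
```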
Abstract:
A new wavelet-based adaptive framework for solving population balance equations (PBEs) is proposed in this work. The technique is general, powerful and efficient, without the need for prior assumptions about the characteristics of the processes. Because number densities vary steeply across the size range, a new strategy is developed to select the optimal order of resolution and the collocation points based on an interpolating wavelet transform (IWT). The proposed technique has been tested for size-independent agglomeration, agglomeration with a linear summation kernel and agglomeration with a nonlinear kernel. In all cases, the predicted and analytical particle size distributions (PSDs) are in excellent agreement. Further work on the solution of the general population balance equations with nucleation, growth and agglomeration and the solution of steady-state population balance equations will be presented in this framework. (C) 2002 Elsevier Science B.V. All rights reserved.
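For reference, the pure-agglomeration population balance equation that such collocation schemes discretize can be written in its standard continuous form (generic notation, not reproduced from the paper), with n(v, t) the number density at particle volume v and β the agglomeration kernel:

```latex
\frac{\partial n(v,t)}{\partial t}
  = \frac{1}{2}\int_{0}^{v} \beta(v - v',\, v')\, n(v - v', t)\, n(v', t)\, dv'
  \;-\; n(v,t)\int_{0}^{\infty} \beta(v, v')\, n(v', t)\, dv'
```

The birth term counts collisions that produce a particle of volume v; the death term removes particles of volume v that agglomerate with any other particle.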
Abstract:
In this paper, a novel mixed-integer nonlinear approach is proposed to solve the short-term hydro scheduling problem in the day-ahead electricity market, considering not only head-dependency, but also start/stop of units, discontinuous operating regions and discharge ramping constraints. Results from a case study based on one of the main Portuguese cascaded hydro energy systems are presented, showing that the proposed mixed-integer nonlinear approach is proficient. Conclusions are duly drawn. (C) 2010 Elsevier Ltd. All rights reserved.
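A minimal sketch of the kind of head-dependent short-term hydro scheduling model described above, in generic notation (this is not the paper's exact formulation): maximize day-ahead market revenue, with power a nonlinear function of discharge q, head h(v) and a binary commitment variable u, subject to water balance over the cascade and discharge ramping limits. Here λ_t is the hourly price, v storage, s spillage, a inflow and U(i) the set of plants upstream of plant i.

```latex
\max_{q,\,s,\,u}\ \sum_{t=1}^{T}\sum_{i=1}^{I} \lambda_{t}\, p_{i,t},
\qquad
p_{i,t} = \rho\, g\, \eta_{i}\, q_{i,t}\, h_{i}(v_{i,t})\, u_{i,t},
\qquad u_{i,t}\in\{0,1\},
\\[4pt]
v_{i,t+1} = v_{i,t} + a_{i,t} - q_{i,t} - s_{i,t}
          + \sum_{m\in U(i)} \bigl(q_{m,t} + s_{m,t}\bigr),
\qquad
\lvert q_{i,t} - q_{i,t-1}\rvert \le \Delta q_{i}.
```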
Abstract:
This paper presents a novel efficiency-based evaluation of sentence and word aligners. This assessment is critical for making reliable use of aligners in industrial scenarios. The evaluation shows that the resources required by the aligners differ rather broadly. Subsequently, we establish limitation mechanisms on a set of aligners deployed as web services. These results, paired with the quality expected from the aligners, allow providers to choose the most appropriate aligner according to the task at hand.
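A minimal sketch of how such an efficiency-based comparison might be instrumented (the command name and files are hypothetical; this is not the evaluation harness used in the paper): run each aligner as an external process and record wall-clock time and peak child memory.

```python
# Hypothetical profiling wrapper for an externally invoked aligner (Unix only,
# because it relies on the `resource` module).
import resource
import subprocess
import time

def profile_aligner(cmd: list[str]) -> dict:
    """Run an aligner command; report wall-clock seconds and peak child memory."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    elapsed = time.perf_counter() - start
    peak = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss  # KiB on Linux
    return {"seconds": elapsed, "peak_memory_kb": peak}

# Example with a hypothetical aligner binary and corpus files:
# print(profile_aligner(["my_aligner", "--source", "corpus.en", "--target", "corpus.fr"]))
```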
Abstract:
In this article, the objective is to demonstrate the effects of different decision styles on strategic decisions and, likewise, on an organization. The technique presented in the study is based on the transformation of linguistic variables into numerical value intervals. In this model, the study benefits from fuzzy logic methodology and fuzzy numbers. This fuzzy approach allows us to examine the relations between decision-making styles and strategic management processes under uncertainty. The purpose is to provide results that may help companies exercise the most appropriate decision-making style for their different strategic management processes. The study leaves open further research topics that may be applied to other decision-making areas within the strategic management process.
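One common way to turn linguistic ratings into numerical intervals is through triangular fuzzy numbers; the sketch below (scale values and the centroid defuzzification are illustrative assumptions, not the authors' exact scheme) shows the basic mechanics.

```python
# Illustrative mapping of linguistic ratings to triangular fuzzy numbers (a, b, c)
# on a 0-1 scale, with centroid defuzzification and simple averaging.
TRIANGULAR = {
    "very low":  (0.00, 0.00, 0.25),
    "low":       (0.00, 0.25, 0.50),
    "medium":    (0.25, 0.50, 0.75),
    "high":      (0.50, 0.75, 1.00),
    "very high": (0.75, 1.00, 1.00),
}

def centroid(tfn: tuple[float, float, float]) -> float:
    """Centroid (crisp value) of a triangular fuzzy number."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def aggregate(ratings: list[str]) -> float:
    """Average the defuzzified values of several linguistic ratings."""
    return sum(centroid(TRIANGULAR[r]) for r in ratings) / len(ratings)

# Example: three raters assess how well a decision style fits a planning process.
print(aggregate(["high", "medium", "very high"]))
```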
Abstract:
The ability of a PCR-based restriction fragment length polymorphism (RFLP) analysis of the cytochrome b (mtDNA) to distinguish Apodemus alpicola from two other Apodemus species was investigated. The partial sequencing of the cytochrome b allowed the identification of one enzyme as being potentially diagnostic. This was supported by an analysis of 131 specimens previously identified using morphometric and/or allozymic data, indicating that the PCR-based RFLP method provides a rapid and reliable tool for distinguishing A. alpicola from its two co-occurring congenerics. The method is applicable to samples taken in the field for ecological studies, and could easily be adapted to the identification of museum samples.
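The logic of such a PCR-RFLP diagnostic can be illustrated with a toy in-silico digest (the sequences and recognition site below are hypothetical, not the diagnostic enzyme identified in the study): species differ in the fragment-length pattern obtained after cutting the cytochrome b amplicon.

```python
# Toy in-silico restriction digest: cut a sequence at every occurrence of a
# recognition site (cut placed after the site for simplicity) and return
# fragment lengths, which stand in for the banding pattern on a gel.
def digest(seq: str, site: str) -> list[int]:
    fragments, start = [], 0
    pos = seq.find(site)
    while pos != -1:
        fragments.append(pos + len(site) - start)
        start = pos + len(site)
        pos = seq.find(site, start)
    fragments.append(len(seq) - start)
    return fragments

amplicon_a = "ATGCGAATTCGGTACCGAATTCTTAA"   # hypothetical amplicon of species A
amplicon_b = "ATGCGGGTACCGGGTACCTTAA"       # hypothetical amplicon of species B
print(digest(amplicon_a, "GAATTC"))          # cut twice: three fragments
print(digest(amplicon_b, "GAATTC"))          # uncut: single fragment
```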
Abstract:
A regularization method based on the non-extensive maximum entropy principle is devised. Special emphasis is given to the q = 1/2 case. We show that, when the residual principle is considered as a constraint, the q = 1/2 generalized distribution of Tsallis yields a regularized solution for ill-conditioned problems. The regularized distribution so devised is endowed with a component corresponding to the well-known regularized solution of Tikhonov (1977).
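For context, the quantities involved can be written schematically as follows (generic notation, not taken from the paper): the Tsallis non-extensive entropy of a distribution, its maximization under the residual (discrepancy) constraint with noise level δ, and the classical Tikhonov-regularized solution whose form the q = 1/2 solution is reported to contain as a component.

```latex
S_q[x] = \frac{1 - \sum_i x_i^{\,q}}{q - 1},
\qquad
\max_{x}\ S_{1/2}[x] \ \ \text{s.t.}\ \ \lVert Ax - b \rVert^{2} \le \delta,
\qquad
x_{\lambda}^{\mathrm{Tik}} = \arg\min_{x}\ \lVert Ax - b \rVert^{2} + \lambda \lVert x \rVert^{2}.
```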
Abstract:
This paper analyzes the possibilities of integrating cost information and engineering design. Special emphasis is on finding the potential of using the activity-based costing (ABC) method when formulating cost information for the needs of design engineers. This paper suggests that ABC is more useful than traditional job order costing, but the negative issue is that ABC models easily become too complicated, i.e. expensive to build and maintain, and difficult to use. For engineering design the most suitable elements of ABC are recognizing the activities of the company, constructing activity chains, identifying resources, activity and cost drivers, as well as calculating accurate product costs. ABC systems including numerous cost drivers can become complex. Therefore, a comprehensive ABC-based cost information system for the use of design engineers should be considered critically. Combining the suitable ideas of ABC with engineering-oriented thinking could give competent results.
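The core ABC calculation that would be exposed to design engineers is simple in principle; the sketch below (activities, driver rates and quantities are hypothetical) assigns overhead to a product through activity cost drivers on top of its direct cost.

```python
# Minimal activity-based costing sketch: product cost = direct cost plus
# overhead assigned via activity cost drivers (rate per driver unit * usage).
ACTIVITY_RATES = {              # hypothetical cost per unit of driver
    "machining hour": 80.0,
    "setup": 150.0,
    "design change order": 200.0,
}

def product_cost(direct_cost: float, driver_use: dict[str, float]) -> float:
    overhead = sum(ACTIVITY_RATES[d] * qty for d, qty in driver_use.items())
    return direct_cost + overhead

# Example: a part with 12 machining hours, 2 setups and 1 design change order.
print(product_cost(500.0, {"machining hour": 12, "setup": 2, "design change order": 1}))
```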
Abstract:
Building on the instrumental model of group conflict (IMGC), the present experiment investigates support for discriminatory and meritocratic methods of selection at university in a sample of local and immigrant students. Results showed that local students supported to a larger extent a selection method that favors them over immigrants, compared with a method that consists in selecting the best applicants without considering their origin. Supporting the assumption of the IMGC, this effect was stronger for locals who perceived immigrants as competing for resources. Immigrant students supported the meritocratic selection method more strongly than the one that discriminated against them. However, contrasting with the assumption of the IMGC, this effect was only present in students who perceived immigrants as weakly competing for locals' resources. Results demonstrate that selection methods used at university can be perceived differently depending on students' origin. Further, they suggest that the mechanisms underlying the perception of discriminatory and meritocratic selection methods differ between local and immigrant students. Hence, the present experiment makes a theoretical contribution to the IMGC by delimiting its assumptions to the ingroup facing a competitive situation with a relevant outgroup. Practical implications for universities' recruitment policies are discussed.
Abstract:
Connectivity analysis on diffusion MRI data of the whole brain suffers from distortions caused by the standard echo-planar imaging acquisition strategies. These images show characteristic geometrical deformations and signal destruction that are an important drawback limiting the success of tractography algorithms. Several retrospective correction techniques are readily available. In this work, we use a digital phantom designed for the evaluation of connectivity pipelines. We subject the phantom to a "theoretically correct" and plausible deformation that resembles the artifact under investigation. We correct the data back with three standard methodologies (namely fieldmap-based, reversed encoding-based, and registration-based). Finally, we rank the methods based on their geometrical accuracy, the dropout compensation, and their impact on the resulting connectivity matrices.
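A toy version of such a ranking might combine the three criteria into a single score, as in the sketch below (the metric definitions and weights are assumptions for illustration, not the paper's protocol):

```python
# Hypothetical scoring of a distortion-correction method: lower residual
# displacement and higher dropout recovery / connectivity similarity are better.
import numpy as np

def connectivity_similarity(conn: np.ndarray, truth: np.ndarray) -> float:
    """Pearson correlation between the upper triangles of two connectivity matrices."""
    iu = np.triu_indices_from(truth, k=1)
    return float(np.corrcoef(conn[iu], truth[iu])[0, 1])

def score(mean_displacement_mm: float, recovered_fraction: float,
          conn: np.ndarray, truth: np.ndarray) -> float:
    return -mean_displacement_mm + recovered_fraction + connectivity_similarity(conn, truth)

# Toy example with a 3x3 ground-truth matrix from the digital phantom:
truth = np.array([[0.0, 1.0, 0.2], [1.0, 0.0, 0.5], [0.2, 0.5, 0.0]])
fieldmap_conn = truth * 0.9
print(score(1.2, 0.8, fieldmap_conn, truth))
```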
Abstract:
A Kriging interpolation method is combined with an object-based evaluation measure to assess the ability of the UK Met Office's dispersion and weather prediction models to predict the evolution of a plume of tracer as it was transported across Europe. The object-based evaluation method, SAL, considers aspects of the Structure, Amplitude and Location of the pollutant field. The SAL method is able to quantify errors in the predicted size and shape of the pollutant plume, through the structure component, the over- or under-prediction of the pollutant concentrations, through the amplitude component, and the position of the pollutant plume, through the location component. The quantitative results of the SAL evaluation are similar for both models and close to a subjective visual inspection of the predictions. A negative structure component for both models, throughout the entire 60 hour plume dispersion simulation, indicates that the modelled plumes are too small and/or too peaked compared to the observed plume at all times. The amplitude component for both models is strongly positive at the start of the simulation, indicating that surface concentrations are over-predicted by both models for the first 24 hours, but modelled concentrations are within a factor of 2 of the observations at later times. Finally, for both models, the location component is small for the first 48 hours after the start of the tracer release, indicating that the modelled plumes are situated close to the observed plume early on in the simulation, but this plume location error grows at later times. The SAL methodology has also been used to identify differences in the transport of pollution in the dispersion and weather prediction models. The convection scheme in the weather prediction model is found to transport more pollution vertically out of the boundary layer into the free troposphere than the dispersion model convection scheme resulting in lower pollutant concentrations near the surface and hence a better forecast for this case study.
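Two of the SAL components are straightforward to compute from gridded fields; the sketch below implements the amplitude component and the first (centre-of-mass) part of the location component in a simplified form (the structure component and the second location term are omitted; this follows the general SAL definitions rather than the exact implementation used in the study).

```python
# Simplified SAL-style diagnostics for a modelled vs. observed pollutant field.
import numpy as np

def amplitude(mod: np.ndarray, obs: np.ndarray) -> float:
    """A: normalized difference of domain-mean concentrations, in [-2, 2]."""
    m, o = mod.mean(), obs.mean()
    return (m - o) / (0.5 * (m + o))

def location_l1(mod: np.ndarray, obs: np.ndarray) -> float:
    """L1: distance between the fields' centres of mass, scaled by the domain diagonal."""
    def centre_of_mass(field):
        y, x = np.indices(field.shape)
        total = field.sum()
        return np.array([(y * field).sum() / total, (x * field).sum() / total])
    d = np.hypot(*mod.shape)   # largest possible distance within the domain
    return float(np.linalg.norm(centre_of_mass(mod) - centre_of_mass(obs)) / d)

# Toy example: a modelled plume that is smaller, more peaked and slightly displaced.
obs = np.zeros((50, 50)); obs[20:30, 20:30] = 1.0
mod = np.zeros((50, 50)); mod[24:28, 26:30] = 4.0
print(amplitude(mod, obs), location_l1(mod, obs))
```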
Abstract:
Despite the importance of dust aerosol in the Earth system, state-of-the-art models show a large variety for North African dust emission. This study presents a systematic evaluation of dust-emitting winds in 30 years of the historical model simulation with the UK Met Office Earth-system model HadGEM2-ES for the Coupled Model Intercomparison Project Phase 5. Isolating the effect of winds on dust emission and using an automated detection for nocturnal low-level jets (NLLJs) allow an in-depth evaluation of the model performance for dust emission from a meteorological perspective. The findings highlight that NLLJs are a key driver for dust emission in HadGEM2-ES in terms of occurrence frequency and strength. The annually and spatially averaged occurrence frequency of NLLJs is similar in HadGEM2-ES and ERA-Interim from the European Centre for Medium-Range Weather Forecasts. Compared to ERA-Interim, a stronger pressure ridge over northern Africa in winter and the southward displaced heat low in summer result in differences in location and strength of NLLJs. In particular, the larger geostrophic winds associated with the stronger ridge have a strengthening effect on NLLJs over parts of West Africa in winter. Stronger NLLJs in summer may rather result from the artificially increased mixing coefficient under stable stratification in ERA-Interim, which is weaker in HadGEM2-ES. NLLJs in the Bodélé Depression are affected by stronger synoptic-scale pressure gradients in HadGEM2-ES. Wintertime geostrophic winds can even be so strong that the associated vertical wind shear prevents the formation of NLLJs. These results call for further model improvements in the synoptic-scale dynamics and the physical parametrization of the nocturnal stable boundary layer to better represent dust-emitting processes in the atmospheric model. The new approach could be used for identifying systematic behavior in other models with respect to meteorological processes for dust emission. This would help to improve dust emission simulations and contribute to decreasing the currently large uncertainty in climate change projections with respect to dust aerosol.
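An automated NLLJ detection of the kind referred to above typically scans nighttime wind profiles for a low-level wind-speed maximum with a marked decrease aloft; the sketch below is a hypothetical, simplified criterion (thresholds are assumptions, not the algorithm used in the study).

```python
# Simplified nocturnal low-level jet (NLLJ) check on a single wind profile.
import numpy as np

def is_nllj(height_m: np.ndarray, wind_speed: np.ndarray,
            max_jet_height: float = 1500.0, min_falloff: float = 2.0) -> bool:
    """True if the strongest wind below `max_jet_height` exceeds the weakest wind
    above the jet core by at least `min_falloff` m/s."""
    low = height_m <= max_jet_height
    core = int(np.argmax(np.where(low, wind_speed, -np.inf)))
    above = wind_speed[height_m > height_m[core]]
    return above.size > 0 and wind_speed[core] - above.min() >= min_falloff

# Toy 00 UTC profile with a jet core near 400 m:
z = np.array([100.0, 250.0, 400.0, 700.0, 1000.0, 2000.0])
v = np.array([6.0, 10.0, 13.0, 9.0, 8.0, 11.0])
print(is_nllj(z, v))   # True: 13 m/s core falling to 8 m/s above
```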