14 results for comparative methods
at Universidad Politécnica de Madrid
Abstract:
The Spanish Agricultural Insurance System, through the insurance covering drought damage to pastures grazed by livestock under an extensive regime (insurance line 133), applies remote sensing via a vegetation index (NDVI) in order to solve the loss-adjustment problems that arise when the quantity and quality of drought-affected pasture must be determined. This drought insurance for pastures is therefore the main instrument for meeting the cost of the supplementary feed needed by breeding livestock during drought. In the districts of Vitigudino, Trujillo and Valle de los Pedroches (Spain), the evolution of the pasture drought insurance from 2006 to 2010 was compared with a mathematical model of pasture growth driven by ecophysiological and environmental variables. Adding together the ten-day periods of extreme drought and mild drought, the mathematical model counted a larger number of ten-day periods than those reported by Agroseguro. The recommendation is to compare the pasture growth curves with the NDVI evolution curves in order to adjust both models.
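The NDVI on which this insurance line relies is a standard ratio of near-infrared and red reflectance. A minimal sketch follows; the reflectance values are illustrative, not taken from the study:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

# Illustrative reflectance values (not from the study):
print(ndvi(0.50, 0.08))  # dense green pasture -> about 0.72
print(ndvi(0.30, 0.20))  # drought-stressed pasture -> 0.2
```

Healthy vegetation reflects strongly in the near infrared and absorbs red light, so drought stress lowers the index, which is what makes it usable for loss adjustment.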
Abstract:
Case-based reasoning (CBR) is a unique tool for the evaluation of possible failure of firms (EOPFOF) because of its ease of interpretation and implementation. Ensemble computing, a variation of group decision making in society, provides a potential means of improving the predictive performance of CBR-based EOPFOF. This research aims to integrate bagging and proportion case-basing with CBR to generate a proportion bagging CBR method for EOPFOF. Diverse multiple case bases are first produced by multiple case-basing, in which a volume parameter is introduced to control the size of each case base. Then, the classic case retrieval algorithm is implemented to generate diverse member CBR predictors. Majority voting, the most frequently used mechanism in ensemble computing, is finally used to aggregate the outputs of the member CBR predictors in order to produce the final prediction of the CBR ensemble. In an empirical experiment, we statistically validated the results of the CBR ensemble from multiple case bases by comparing them with those of multivariate discriminant analysis, logistic regression, classic CBR, the best member CBR predictor and a bagging CBR ensemble. The results for Chinese EOPFOF three years prior to failure indicate that the new CBR ensemble, which significantly improved CBR's predictive ability, outperformed all the comparative methods.
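The ensemble procedure described above — multiple case bases controlled by a volume parameter, a classic-retrieval predictor per case base, and majority voting — can be sketched as follows. The toy data, the 1-nearest-neighbour retrieval and the parameter values are illustrative assumptions, not the authors' exact configuration:

```python
import random
from collections import Counter

def retrieve_predict(case_base, query, k=1):
    """Classic case retrieval: majority label among the k nearest stored cases."""
    nearest = sorted(case_base,
                     key=lambda c: sum((a - b) ** 2 for a, b in zip(c[0], query)))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

def bagging_cbr_predict(full_case_base, query, n_members=9, volume=0.7, rng=None):
    """Proportion bagging CBR: bootstrap several case bases, vote over members."""
    rng = rng or random.Random(0)
    size = int(volume * len(full_case_base))  # volume parameter sets case-base size
    votes = []
    for _ in range(n_members):
        member_base = [rng.choice(full_case_base) for _ in range(size)]  # bootstrap
        votes.append(retrieve_predict(member_base, query))
    return Counter(votes).most_common(1)[0][0]  # majority voting

# Toy data: (features, label) where label 1 = failure, 0 = healthy firm.
cases = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.8, 0.9), 1)]
print(bagging_cbr_predict(cases, (0.85, 0.85)))  # majority vote over 9 members
```

A real implementation would use the authors' similarity measure and case-base volumes; the point here is only the structure: bootstrap case bases, per-member retrieval, then a majority vote.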
Abstract:
Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering community has focused on the weighted mean difference method. However, other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines to indicate which method is best for use in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for use in the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects that they include, their variance and effect size. We empirically calculated the reliability and statistical power in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it does require more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable with other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
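Two of the aggregation methods compared — the inverse-variance weighted mean difference and the (log-scale) parametric response ratio — can be sketched with their textbook formulas; these are standard definitions, not necessarily the exact variants used in the simulation:

```python
import math

def weighted_mean_difference(effects, variances):
    """Inverse-variance pooled effect: sum(w_i * d_i) / sum(w_i), with w_i = 1 / v_i."""
    weights = [1.0 / v for v in variances]
    return sum(w * d for w, d in zip(weights, effects)) / sum(weights)

def log_response_ratio(mean_treatment, mean_control):
    """Parametric response ratio, conventionally analysed on the log scale."""
    return math.log(mean_treatment / mean_control)

# Three hypothetical experiments: mean differences and their variances.
pooled = weighted_mean_difference([0.5, 0.3, 0.8], [0.04, 0.09, 0.16])
print(pooled)  # precise (low-variance) studies dominate the pooled estimate
```

Note that WMD needs the per-study variances, while the response ratio needs only the group means, which is why it remains usable when variances are not reported.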
Abstract:
This paper presents a study of the effectiveness of three different algorithms for the parallelization of logic programs based on compile-time detection of independence among goals. The algorithms are embedded in a complete parallelizing compiler, which incorporates different abstract interpretation-based program analyses. The complete system shows the task of automatic program parallelization to be practical. The trade-offs involved in using each of the algorithms in this task are studied experimentally, their weaknesses are identified, and possible improvements are discussed.
Abstract:
An investigation was undertaken consisting of a state-of-the-art review and comparative analysis of currently available methods for calculating the structural stability of wave walls in sloping breakwaters. A total of six design schemes are addressed. The conditions under which the formulations apply, and the ranges of validity explicitly indicated by their authors, are given. The lack of definition in the parameters to be used, and the aspects not taken into account in the original investigations, are discussed, and the results of this analysis are given in a final table.
Abstract:
The increasing number of works on surface texture characterization based on 3D information makes it worthwhile to rethink traditional methods based on two-dimensional measurements from profiles. This work compares results between measurements obtained using two- and three-dimensional methods. It uses three kinds of data sources: reference surfaces, randomly generated surfaces and measured surfaces. Preliminary results are presented. These results must be completed so as to cover a wider range of possibilities according to the manufacturing process and the measurement instrumentation, since results can vary quite significantly between them.
Abstract:
Two different methods of analysis of plate bending, FEM and BM, are discussed in this paper. The plate behaviour is assumed to be represented by linear thin plate theory, where the Poisson-Kirchhoff assumption holds. The BM, based on a weighted mean square error technique, produced good results for the problem of plate bending. The computational effort demanded by the BM is smaller than that needed in a FEM analysis for the same level of accuracy. The general applicability of the FEM cannot be matched by the BM. In particular, different types of geometry (plates of arbitrary geometry) need a similar but not identical treatment in the BM. However, this loss of generality is counterbalanced by the computational efficiency gained with the BM.
Abstract:
This paper presents an extensive and useful comparison of existing formulas to estimate wave forces on crown walls. The paper also provides valuable insights into crown wall behaviour, suggesting the use of formulas for preliminary sizing and recommending, in any case, tests on a physical model in order to confirm the final design. The authors helpfully advise using more than one method to obtain results closer to reality, always taking into account the test conditions under which each formula was developed.
Abstract:
Ionoluminescence (IL) of the two SiO2 phases, amorphous silica and crystalline quartz, has been comparatively investigated in this work, in order to learn about the structural defects generated by ion irradiation and the role of crystalline order in the damage processes. Irradiations have been performed with Cl at 10 MeV and Br at 15 MeV, corresponding to the electronic stopping regime (i.e., where the electronic stopping power Se is dominant) and well above the amorphization threshold. The light-emission kinetics for the two main emission bands, located at 1.9 eV (652 nm) and 2.7 eV (459 nm), has been measured under the same ion irradiation conditions as a function of fluence for both silica and quartz. The role of the electronic stopping power has also been investigated and discussed within current views of electronic damage. Our experiments provide a rich phenomenological background that should help to elucidate the mechanisms responsible for light emission and defect creation.
Abstract:
The new Spanish Building Acoustics Regulation establishes values and limits for the different acoustic magnitudes, whose fulfilment can be verified by means of field measurements. In this sense, an essential aspect of a field measurement is to give both the measured magnitude and the uncertainty associated with that magnitude. In the calculation of the uncertainty it is very usual to follow the uncertainty propagation method as described in the Guide to the Expression of Uncertainty in Measurement (GUM). Another option is numerical calculation based on the distribution propagation method by means of Monte Carlo simulation. In fact, at this stage, several publications can be found that develop this latter method using different software programs. In the present work, we used Excel for the Monte Carlo simulation of the uncertainty associated with the different magnitudes derived from the field measurements following ISO 140-4, 140-5 and 140-7. We compare the results with those obtained by the uncertainty propagation method. Although both methods give similar values, some small differences have been observed. Arguments to explain such differences include the asymmetry of the probability distributions associated with the input magnitudes and the overestimation of the uncertainty when following the GUM.
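The distribution propagation method used in the paper can be sketched generically in a few lines. The measurand below (a level difference D = L1 - L2) and its standard uncertainties are illustrative assumptions, not magnitudes from ISO 140:

```python
import random
import statistics

def monte_carlo_uncertainty(model, inputs, n=100_000, seed=1):
    """Propagate input distributions through `model`.

    `inputs` is a list of (mean, standard_uncertainty) pairs; each input is
    sampled from a normal distribution here (asymmetric distributions could be
    substituted, which is precisely where Monte Carlo and GUM may diverge).
    Returns the mean and standard deviation of the simulated output.
    """
    rng = random.Random(seed)
    outputs = [model(*(rng.gauss(m, u) for m, u in inputs)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

# Illustrative level difference D = L1 - L2 (dB) with made-up uncertainties.
mean, std = monte_carlo_uncertainty(lambda l1, l2: l1 - l2, [(75.0, 0.5), (45.0, 0.3)])
# For this linear model, GUM propagation gives u = sqrt(0.5**2 + 0.3**2) ~ 0.58,
# and the Monte Carlo standard deviation agrees closely.
print(mean, std)
```

For linear models with symmetric input distributions the two methods coincide, as above; the small differences the abstract reports arise when the inputs are asymmetric or the model is non-linear.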
Abstract:
The European Higher Education Area, launched in 1999 with the Bologna Declaration, has given unprecedented scale and agility to the transformation process undertaken by European universities. However, the change has been more profound and drastic with regard to the use of new technologies both inside and outside the classroom. This article focuses on the study and analysis of the history of technology within university education and its impact on teachers, students and teaching methods. All the elements that have been significant and innovative in the teaching process throughout that history have been analyzed, from the use of blackboard and chalk during lectures, through slide projectors and transparencies, to the use of electronic whiteboards and the Internet nowadays. The study is complemented with two types of surveys performed among teachers and students during the school years 1999-2011 in the School of Civil Engineering at the Polytechnic University of Madrid. The pros and cons of each of the techniques and methodologies used in the learning process over the last decades are described, unfolding how they have affected the teacher, who has evolved from writing on a blackboard to projecting onto a screen; the student, who has evolved from taking handwritten notes to downloading information or searching the Internet; and the educational process, which has evolved from the lecture to collaborative learning and project-based learning. It is unknown how the process of learning will evolve in the future, but we do know the consequences that some of the multimedia technologies are having on teachers, students and the learning process. It is our goal as teachers to keep ourselves up to date, in order to offer the student adequate technical content while providing proper motivation through the use of new technologies.
The study provides a forecast of the evolution of multimedia within the classroom and the renewal of the education process, which, in our view, will set the basis for the future learning process within the context of this new interactive era.
Abstract:
An analysis and comparison of daily and yearly solar irradiation from the satellite CM SAF database and a set of 301 stations from the Spanish SIAR network is performed using data from 2010 and 2011. This analysis is completed with the comparison of the estimations of effective irradiation incident on three different tilted planes (fixed, two-axis tracking, north-south horizontal axis) using irradiation from these two data sources. Finally, a new map of yearly values of irradiation, both on the horizontal plane and on inclined planes, is produced by mixing both sources with geostatistical techniques (kriging with external drift, KED). The Mean Absolute Difference (MAD) between CM SAF and SIAR is approximately 4% for the irradiation on the horizontal plane and lies between 5% and 6% for the irradiation incident on the inclined planes. The MAD between KED and SIAR, and between KED and CM SAF, is approximately 3% for the irradiation on the horizontal plane and lies between 3% and 4% for the irradiation incident on the inclined planes. The methods have been implemented using free software, available as supplementary material, and the data sources are freely available without restrictions.
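A Mean Absolute Difference of the kind reported above can be sketched as follows; normalising by the mean of the reference series to obtain a percentage is an assumption here, not necessarily the paper's exact definition:

```python
def mean_absolute_difference(estimates, references):
    """Relative MAD (%) between two irradiation series, normalised by the
    mean of the reference series."""
    abs_diffs = [abs(e - r) for e, r in zip(estimates, references)]
    ref_mean = sum(references) / len(references)
    return 100.0 * (sum(abs_diffs) / len(abs_diffs)) / ref_mean

# Illustrative daily irradiation values (kWh/m^2) from two hypothetical sources.
cmsaf_values = [5.1, 6.0, 4.8, 5.5]
siar_values = [5.0, 5.8, 5.0, 5.4]
print(mean_absolute_difference(cmsaf_values, siar_values))  # ~ 2.83 %
```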
Abstract:
CEN standards have helped to harmonize analytical methods for substrate analysis. However, for special substrates or constituents their applicability might be limited. In this paper a comparative study of the application of CEN standards to samples of pine bark and vermiculite has been carried out. For composted pine bark, extending the equilibrium period to 72 hours instead of 48 might increase the accuracy of the determination of physical parameters according to EN 13041. For vermiculite, we suggest pycnometry as a feasible technique for the determination of particle density (PD), as the determination of organic matter (OM) required by EN 13041 for the calculation of the PD seems not to be applicable to this kind of material.