32 results for Methods engineering.

at Universidad Politécnica de Madrid


Relevance:

30.00%

Publisher:

Abstract:

We present a remote sensing observational method for measuring the spatio-temporal dynamics of ocean waves. Variational techniques are used to recover a coherent space-time reconstruction of oceanic sea states from stereo video imagery. The stereoscopic reconstruction problem is expressed in a variational optimization framework: we design an energy functional whose minimizer is the desired temporal sequence of wave heights. The functional combines photometric observations with spatial and temporal regularizers. A nested iterative scheme is devised to numerically solve, via 3-D multigrid methods, the system of partial differential equations resulting from the optimality condition of the energy functional. The output of our method is the coherent, simultaneous estimation of the wave surface height and radiance at multiple snapshots. We demonstrate our algorithm on real data collected off-shore, perform statistical and spectral analyses, and present a comparison with an existing sequential method.
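The core idea of minimizing an energy that combines a data term with regularizers can be sketched on a toy problem. The fragment below is an illustrative sketch only, not the paper's algorithm: it denoises a 1-D "height profile" by gradient descent on E(h) = Σ(h − obs)² + λ·Σ(h[i+1] − h[i])², a much simpler stand-in for the photometric and regularization terms described above; all data and parameters are made up.

```python
import math
import random

random.seed(1)
n, lam, step, iters = 64, 2.0, 0.02, 2000
truth = [math.sin(2 * math.pi * i / n) for i in range(n)]
obs = [t + random.gauss(0, 0.3) for t in truth]   # noisy "photometric" data

def energy(h):
    # Data-fidelity term plus a first-difference smoothness regularizer.
    data = sum((h[i] - obs[i]) ** 2 for i in range(n))
    reg = sum((h[i + 1] - h[i]) ** 2 for i in range(n - 1))
    return data + lam * reg

h = obs[:]                 # initialize with the raw observations
e0 = energy(h)
for _ in range(iters):
    # Gradient of the data term, then add the regularizer's contribution.
    g = [2 * (h[i] - obs[i]) for i in range(n)]
    for i in range(n - 1):
        d = 2 * lam * (h[i + 1] - h[i])
        g[i] -= d
        g[i + 1] += d
    h = [h[i] - step * g[i] for i in range(n)]

print(energy(h) < e0)      # the regularized estimate has lower energy
```

The paper solves the resulting optimality PDEs with 3-D multigrid rather than plain gradient descent; the sketch only illustrates the variational formulation.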

Relevance:

30.00%

Publisher:

Abstract:

All meta-analyses should include a heterogeneity analysis. Even so, it is not easy to decide whether a set of studies is homogeneous or heterogeneous because of the low statistical power of the statistics usually employed (typically the Q test). Objective: Determine a set of rules enabling SE researchers to find out, based on the characteristics of the experiments to be aggregated, whether or not it is feasible to accurately detect heterogeneity. Method: Evaluate the statistical power of heterogeneity detection methods using a Monte Carlo simulation process. Results: The Q test is not powerful when the meta-analysis contains up to a total of about 200 experimental subjects and the effect size difference is less than 1. Conclusions: The Q test cannot be used as a decision-making criterion for meta-analysis in small sample settings like SE. Random effects models should be used instead of fixed effects models, and caution should be exercised when applying Q test-mediated decomposition into subgroups.
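For readers unfamiliar with the test in question, this is a minimal sketch of Cochran's Q computed over a handful of invented studies. The effect sizes and variances below are made up; note how visibly different effects can still fail to reject homogeneity, which is the low-power behaviour the abstract describes.

```python
# Per-study effect sizes d_i and sampling variances v_i (made-up data).
effects = [0.40, 0.55, 0.10, 0.80]
variances = [0.10, 0.12, 0.08, 0.15]

weights = [1.0 / v for v in variances]    # fixed-effect weights w_i = 1/v_i
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
Q = sum(w * (d - pooled) ** 2 for w, d in zip(weights, effects))
df = len(effects) - 1

# Under homogeneity Q follows a chi-square distribution with k-1 df;
# the 5% critical value for df = 3 is about 7.81.
print(round(Q, 2), Q > 7.81)
```

Despite effects ranging from 0.10 to 0.80, Q stays well below the critical value with only four small studies, so the test fails to flag heterogeneity.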

Relevance:

30.00%

Publisher:

Abstract:

Background: Several meta-analysis methods can be used to quantitatively combine the results of a group of experiments, including the weighted mean difference (WMD), statistical vote counting (SVC), the parametric response ratio (RR) and the non-parametric response ratio (NPRR). The software engineering community has focused on the weighted mean difference method. However, the other meta-analysis methods have distinct strengths, such as being usable when variances are not reported. There are as yet no guidelines indicating which method is best in each case. Aim: Compile a set of rules that SE researchers can use to ascertain which aggregation method is best for use in the synthesis phase of a systematic review. Method: Monte Carlo simulation varying the number of experiments in the meta-analyses, the number of subjects they include, their variance and effect size; the reliability and statistical power were calculated empirically in each case. Results: WMD is generally reliable if the variance is low, whereas its power depends on the effect size and the number of subjects per meta-analysis; the reliability of RR is generally unaffected by changes in variance, but it requires more subjects than WMD to be powerful; NPRR is the most reliable method, but it is not very powerful; SVC behaves well when the effect size is moderate, but is less reliable with other effect sizes. Detailed tables of results are annexed. Conclusions: Before undertaking statistical aggregation in software engineering, it is worthwhile checking whether there is any appreciable difference in the reliability and power of the methods. If there is, software engineers should select the method that optimizes both parameters.
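Two of the estimators compared can be sketched in a few lines. The study summaries below are invented, and the equal-weight combination of log response ratios is only one common fallback (useful precisely when variances are unavailable), not necessarily the weighting scheme evaluated in the paper.

```python
import math

# Toy per-study summaries: (treatment mean, control mean, pooled SD, n per group).
studies = [(12.0, 10.0, 3.0, 10), (11.0, 10.5, 2.5, 8), (13.0, 10.0, 4.0, 12)]

# Weighted mean difference: weight each raw difference mt - mc by the
# inverse of its sampling variance var_i = sd^2 * (1/nT + 1/nC).
num = den = 0.0
for mt, mc, sd, n in studies:
    var = sd * sd * (2.0 / n)
    w = 1.0 / var
    num += w * (mt - mc)
    den += w
wmd = num / den

# Parametric response ratio: combine ln(mean_T / mean_C); with equal
# weights this needs no reported variances at all.
lnrr = sum(math.log(mt / mc) for mt, mc, _, _ in studies) / len(studies)
rr = math.exp(lnrr)

print(round(wmd, 2), round(rr, 2))
```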

Relevance:

30.00%

Publisher:

Abstract:

In the early 1990s, ontology development was something of an art: ontology developers had no clear guidelines on how to build ontologies, only some design criteria to follow. Work on principles, methods and methodologies, together with supporting technologies and languages, turned ontology development into an engineering discipline, so-called Ontology Engineering. Ontology Engineering refers to the set of activities that concern the ontology development process and the ontology life cycle, the methods and methodologies for building ontologies, and the tool suites and languages that support them. Thanks to the work done in the Ontology Engineering field, the development of ontologies within and between teams has increased and improved, as has the possibility of reusing ontologies in other developments and in final applications. Currently, ontologies are widely used in (a) Knowledge Engineering, Artificial Intelligence and Computer Science, (b) applications related to knowledge management, natural language processing, e-commerce, intelligent information integration, information retrieval, database design and integration, bio-informatics, and education, and (c) the Semantic Web, the Semantic Grid, and the Linked Data initiative. In this paper, we provide an overview of Ontology Engineering, covering the most outstanding and widely used methodologies, languages and tools for building ontologies. In addition, we briefly discuss how all these elements can be used in the Linked Data initiative.

Relevance:

30.00%

Publisher:

Abstract:

Laminated glass is composed of two glass layers and a thin intermediate PVB layer, whose viscoelastic behaviour strongly influences the dynamic response. While natural frequencies are relatively easily identified even with simplified FE models, damping ratios are not identified with such ease. In order to determine to what extent external factors influence damping identification, different tests have been carried out. The external factors considered, apart from temperature, are the accelerometers, their connection cables and the effect of the glass layers. To analyse the influence of the accelerometers and their connection cables, a laser measuring device was employed, considering three configurations: the sample without instrumentation, the sample with the accelerometers fixed, and the sample completely instrumented. When the sample is completely instrumented, the accelerometer readings are also analysed. To take into consideration the effect of the glass layers, tests were carried out both on laminated glass and on monolithic samples. This paper presents an in-depth analysis of the data from the different configurations and establishes criteria for data acquisition when testing laminated glass.
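As background on why damping identification is sensitive to test conditions, one textbook estimator is the logarithmic decrement applied to two peaks of a free-decay record; small changes in the measured amplitudes (for example, mass loading by accelerometers) shift the result directly. The sketch below is a generic illustration with invented amplitudes, not the identification procedure used in the paper.

```python
import math

# Two free-decay peak amplitudes, m cycles apart (made-up values).
x0, xm, m = 1.00, 0.40, 5

delta = math.log(x0 / xm) / m                          # log decrement per cycle
zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)  # damping ratio

print(round(zeta * 100, 2))                            # in percent
```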

Relevance:

30.00%

Publisher:

Abstract:

The relationship between different learning evaluation methods and academic success in an aeronautical engineering degree in Spain is analysed. The study is based on data on the evolution of academic achievement collected over the last ten years, during which the evaluation and learning methods have undergone major changes.

Relevance:

30.00%

Publisher:

Abstract:

The need to simulate spectrum-compatible earthquake time histories has existed since earthquake engineering for complicated structures began. Beyond the safety of the main structure, the behaviour of the equipment (piping, racks, etc.) can only be assessed on the basis of the time history of the floor on which it is supported. This paper presents several methods for calculating simulated spectrum-compatible earthquakes, as well as a comparison between them. As a result of this comparison, the use of the phase content of real earthquakes, as proposed by Ohsaki, appears to be an effective alternative to the classical methods. With this method, it is possible to establish an approach without the arbitrary modulation commonly used in other methods. The different procedures are described, as is the influence of the parameters which appear in the analysis. Several numerical examples are also presented, and the effectiveness of Ohsaki's method is confirmed.
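All such methods share a sum-of-sinusoids synthesis, a(t) = Σᵢ Aᵢ cos(ωᵢt + φᵢ), differing in where the amplitudes and phases come from; in Ohsaki's proposal the phases φᵢ are taken from a recorded earthquake. The sketch below uses random phases and an invented amplitude model purely for illustration, so it is not spectrum-matched.

```python
import math
import random

random.seed(0)
dt, npts = 0.01, 1000                               # 10 s record at 100 Hz
freqs = [0.5 + 0.5 * k for k in range(40)]          # component frequencies (Hz)
amps = [1.0 / (1.0 + f) for f in freqs]             # made-up amplitude model
phases = [random.uniform(0, 2 * math.pi) for _ in freqs]  # random stand-in
                                                    # for recorded phases
accel = [sum(a * math.cos(2 * math.pi * f * i * dt + p)
             for a, f, p in zip(amps, freqs, phases))
         for i in range(npts)]

print(len(accel))
```

In a real spectrum-matching procedure the amplitudes would then be iterated until the response spectrum of `accel` matches the target spectrum.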

Relevance:

30.00%

Publisher:

Abstract:

Non-linear physical systems of infinite extent are conveniently modelled using FE–BE coupling methods. By combining both methods, the advantages of each can be suitably exploited. Several possibilities for FEM–BEM coupling and their performance in some practical cases are discussed in this paper. Parallelizable coupling algorithms based on domain decomposition are developed and compared with the most traditional coupling methods.

Relevance:

30.00%

Publisher:

Abstract:

The numerical strategies employed in the evaluation of singular integrals existing in the Cauchy principal value (CPV) sense are, undoubtedly, one of the key aspects which remarkably affect the performance and accuracy of the boundary element method (BEM). Thus, a new procedure, based upon a bi-cubic co-ordinate transformation and oriented towards the numerical evaluation of both CPV integrals and some others which contain different types of singularity, is developed. Both the ideas and some details involved in the proposed formulae are presented, obtaining rather simple and attractive expressions for the numerical quadrature which are also easily embodied into existing BEM codes. Some illustrative examples which assess the stability and accuracy of the new formulae are included.
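For context, one standard alternative to co-ordinate transformations is singularity subtraction, sketched below (this is not the paper's bi-cubic scheme). For PV ∫₋₁¹ eˣ/x dx, note that PV ∫₋₁¹ 1/x dx = 0 over a symmetric interval, so the integral equals ∫₋₁¹ (eˣ − 1)/x dx, whose integrand is smooth (its limit at 0 is 1) and can be handled by ordinary quadrature.

```python
import math

def g(x):
    # Regularized integrand (e^x - 1)/x, continuous at x = 0.
    return (math.exp(x) - 1.0) / x if x != 0.0 else 1.0

def simpson(f, a, b, n):
    # Composite Simpson's rule; n must be even.
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += f(a + i * h) * (4 if i % 2 else 2)
    return s * h / 3.0

pv = simpson(g, -1.0, 1.0, 200)
print(round(pv, 6))   # analytic value is 2*Shi(1) ~ 2.114502
```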

Relevance:

30.00%

Publisher:

Abstract:

The use of seismic hysteretic dampers for passive control has grown rapidly in recent years, for both new and existing buildings. In order to utilize hysteretic dampers within a structural system, it is of paramount importance to have simplified design procedures based upon knowledge gained from theoretical studies and validated with experimental results. Non-linear Static Procedures (NSPs) are presented as an alternative to the force-based methods more common nowadays. The application of NSPs to conventional structures is well established; yet there is a lack of experimental information on how NSPs apply to systems with hysteretic dampers. In this research, several shaking table tests were conducted on two single-bay, single-story 1:2 scale structures with and without hysteretic dampers. The maximum response of the structure with dampers, in terms of lateral displacement and base shear obtained from the tests, was compared with the prediction provided by three well-known NSPs: (1) the improved version of the Capacity Spectrum Method (CSM) from FEMA 440; (2) the improved version of the Displacement Coefficient Method (DCM) from FEMA 440; and (3) the N2 Method implemented in Eurocode 8. In general, the improved versions of the DCM and the N2 method are found to provide acceptable accuracy in prediction, but the CSM tends to underestimate the response.
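To give a flavour of this family of procedures, the DCM estimates a target displacement of the form d_t = C0·C1·C2·Sa·(Te/2π)²·g, where the C coefficients account for MDOF-to-SDOF conversion, inelasticity and cyclic degradation. The numbers below are invented for illustration and are not the coefficients or spectra used in the tests described above.

```python
import math

C0, C1, C2 = 1.2, 1.1, 1.0   # shape, inelastic and degradation factors (made up)
Te = 0.45                    # effective period (s)
Sa = 0.8                     # spectral acceleration at Te, in units of g
g = 9.81                     # m/s^2

d_t = C0 * C1 * C2 * Sa * (Te / (2 * math.pi)) ** 2 * g
print(round(d_t * 1000, 1))  # target displacement in mm
```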

Relevance:

30.00%

Publisher:

Abstract:

An investigation was undertaken consisting of a state-of-the-art review and comparative analysis of currently available methods for calculating the structural stability of wave walls in sloping breakwaters. A total of six design schemes are addressed. The conditions under which the formulations apply, and the ranges of validity explicitly indicated by their authors, are given. The lack of definition of the parameters to be used, and the aspects not taken into account in the underlying investigations, are discussed, and the results of this analysis are summarized in a final table.

Relevance:

30.00%

Publisher:

Abstract:

At present, the University curricula of most countries do not include decision theory or the mathematical models that aid decision making, either in Master's or in Doctoral programs. At the Technical School of High Level Agronomic Engineers of the Technical University of Madrid (ETSIA-UPM), the need was felt to offer future engineers training in a subject that could help them make decisions in their profession. Throughout their lives they will have to make many decisions, some important and some not. At the personal level, they will face several very important decisions, such as the choice of a career, a professional position, or a partner; in the professional field, decision making is the main role of managers, politicians and leaders. They are expected to be decision makers and will be paid for it. It is therefore hard to accept that professionals called to exercise management responsibilities in companies should receive no training in such an important matter. For this reason, in 2000 the University Board was asked to introduce into the curriculum an optional second-cycle subject of 4.5 credits entitled "Mathematical Methods for Making Decisions". A program was drawn up, the teaching materials were prepared, and software such as Maple, Lingo and MathCad was installed in several computer classrooms where the course would be taught. In the 2000-2001 academic year the subject was offered with an acceptance that exceeded the capacity forecasts, and additional classrooms had to be prepared. The course was run by the Department of Applied Mathematics for Agronomic Engineering, as an extension of the credits devoted to Mathematics in the Engineering degree.

Relevance:

30.00%

Publisher:

Abstract:

"System identification deals with the problem of building mathematical models of dynamical systems based on observed data from the system" [1]. In the context of civil engineering, the system refers to a large-scale structure such as a building, a bridge, or an offshore structure, and identification mostly involves the determination of modal parameters (the natural frequencies, damping ratios, and mode shapes). This paper presents some modal identification results obtained using a state-of-the-art time domain system identification method (data-driven stochastic subspace algorithms [2]) applied to output-only data measured on a steel arch bridge. First, a three-dimensional finite element model was developed for the numerical analysis of the structure using ANSYS. Modal analysis was carried out and modal parameters were extracted in the frequency range of interest, 0-10 Hz. The results obtained from the finite element modal analysis were used to determine the location of the sensors. After that, ambient vibration tests were conducted during April 23-24, 2009. The response of the structure was measured using eight accelerometers. Two stations of three sensors each were formed (triaxial stations); these sensors were held stationary for reference during the test. The two remaining sensors were placed at different measurement points along the bridge deck, at which only vertical and transversal measurements were conducted (biaxial stations). Point and interval estimation have been carried out on the state space model using these ambient vibration measurements. In the case of parametric models (like state space models), the dynamic behaviour of a system is described using mathematical models, and mathematical relationships can then be established between modal parameters and estimated point parameters (thus, it is common to use experimental modal analysis as a synonym for system identification). Stable modal parameters are found using a stabilization diagram.
Furthermore, this paper proposes a method for assessing the precision of the estimates of the parameters of state-space models (confidence intervals). This approach employs the nonparametric bootstrap procedure [3] and is applied to the subspace parameter estimation algorithm. Using the bootstrap results, a plot similar to a stabilization diagram is developed. These graphics differentiate system modes from spurious noise modes for a given system order. Additionally, using the modal assurance criterion, the experimental modes obtained have been compared with those evaluated from a finite element analysis. Quite good agreement between the numerical and experimental results is observed.
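The nonparametric bootstrap referred to above can be sketched on a toy example: resample the identified values with replacement, recompute the estimate for each resample, and read a percentile confidence interval from the resulting distribution. The "identified frequencies" below are invented, and the statistic bootstrapped here is a simple mean rather than a subspace estimate.

```python
import random
import statistics

random.seed(42)
freqs = [2.41, 2.39, 2.44, 2.40, 2.43, 2.38, 2.42, 2.45, 2.40, 2.41]  # Hz

boots = []
for _ in range(2000):
    sample = [random.choice(freqs) for _ in freqs]  # resample with replacement
    boots.append(statistics.mean(sample))
boots.sort()

# 95% percentile confidence interval for the mean frequency.
lo, hi = boots[int(0.025 * len(boots))], boots[int(0.975 * len(boots))]
print(round(lo, 3), round(hi, 3))
```

In the paper the same resampling idea is applied to the subspace algorithm itself, and the spread of the bootstrap estimates across model orders is what separates physical modes from spurious noise modes.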

Relevance:

30.00%

Publisher:

Abstract:

Abstract. The ASSERT project defined new software engineering methods and tools for the development of critical embedded real-time systems in the space domain. The ASSERT model-driven engineering process was one of the achievements of the project and is based on the concept of property-preserving model transformations. The key element of this process is that non-functional properties of the software system must be preserved during model transformations. Property preservation is carried out through model transformations compliant with the Ravenscar Profile and provides a formal basis for the process. In this way, the so-called Ravenscar Computational Model is central to the whole ASSERT process. This paper describes the work done in the HWSWCO study, whose main objective has been to address the integration of the Hardware/Software co-design phase into the ASSERT process. In order to do that, non-functional properties of the software system must also be preserved during hardware synthesis. Keywords: Ada 2005, Ravenscar profile, Hardware/Software co-design, real-time systems, high-integrity systems, ORK

Relevance:

30.00%

Publisher:

Abstract:

There is a remarkable and growing concern about quality control at present, which has led to a search for methods capable of effectively addressing reliability analysis as part of Statistics. Managers, researchers and engineers must understand that 'statistical thinking' is not just a set of statistical tools. They should approach 'statistical thinking' from a 'systems' perspective, which means developing systems that combine specific statistical tools with other methodologies for an activity. The aim of this article is to encourage them (engineers, researchers and managers) to develop this new way of thinking.