930 results for Methods engineering
Abstract:
It is difficult, if not impossible, to find something that is not changing in computer technology: circuits, architectures, languages, methods, fields of application... The "central object" itself of this brand of engineering, software, represents such a diverse reality (many objects) that the fact that it has only one name gives rise to considerable confusion. This issue, among others, was taken up by Fox (1) and, at this point, I would like to underline that it is more of a pragmatic issue than an academic one. Thus, Software Engineering Education moves in an unstable, undefined world. This axiom governs and limits the validity of all educational proposals in the area of Software Engineering and, therefore, all the ideas presented in this paper.
Learning and Assessing Competencies: New challenges for Mathematics in Engineering Degrees in Spain.
Abstract:
The introduction of new degrees adapted to the European Area of Higher Education (EAHE) has involved a radically different approach to the curriculum. The new programs are structured around competencies that students should acquire. Based on these competencies, teachers must define and develop learning objectives, design teaching methods and establish appropriate evaluation systems. While most Spanish universities have incorporated methodological innovations and evaluation systems different from traditional exams, there is still considerable confusion about how to teach and assess competencies and learning outcomes, since teaching and assessment have traditionally focused on knowledge. In this paper we analyze the state of the art in the mathematical courses of the new engineering degrees in some Spanish universities.
Abstract:
Pushover methods are used as an everyday tool in engineering practice and some of them have been included in regulatory codes. Recently, several efforts have been made to examine them from a probabilistic viewpoint. In this paper the authors present a Level 2 approach based on a probabilistic definition of the characteristic points defining the response spectra, as well as a probabilistic definition of the elasto-plastic pushover curve representing the structural behavior. Comparisons with Monte Carlo simulations help to assess the accuracy of the proposed approach.
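The comparison described above can be illustrated with a minimal sketch: a second-moment (Level 2) estimate of the probability that demand exceeds capacity is checked against a Monte Carlo simulation of the same limit state. All distributions and numerical values below are assumptions for illustration, not data from the paper.

```python
import math
import numpy as np

# Illustrative sketch with assumed values (not from the paper): compare a
# second-moment (Level 2) failure-probability estimate with a Monte Carlo
# simulation for a normalized capacity C vs demand D check, both modeled
# as independent normal variables.
mu_C, sd_C = 1.20, 0.15      # assumed mean / std of pushover capacity
mu_D, sd_D = 1.00, 0.20      # assumed mean / std of spectral demand

# Level 2: reliability index for the margin M = C - D, then Pf = Phi(-beta)
beta = (mu_C - mu_D) / math.sqrt(sd_C**2 + sd_D**2)
pf_level2 = 0.5 * math.erfc(beta / math.sqrt(2.0))

# Monte Carlo check of the same limit state
rng = np.random.default_rng(1)
n = 1_000_000
pf_mc = np.mean(rng.normal(mu_C, sd_C, n) < rng.normal(mu_D, sd_D, n))

print(f"beta = {beta:.3f}, Pf (Level 2) = {pf_level2:.5f}, Pf (Monte Carlo) = {pf_mc:.5f}")
```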
Abstract:
The increasing number of works on surface texture characterization based on 3D information makes it worth rethinking traditional methods based on two-dimensional measurements from profiles. This work compares results between measurements obtained using two- and three-dimensional methods. It uses three kinds of data sources: reference surfaces, randomly generated surfaces and measured surfaces. Preliminary results are presented. These results must be extended to cover a wider range of possibilities according to the manufacturing process and the measurement instrumentation, since results can vary quite significantly between them.
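As a minimal sketch of the kind of 2D-versus-3D comparison described above, the snippet below evaluates a profile roughness parameter (Ra, the arithmetic mean deviation of a 2D profile) against its areal counterpart (Sa) on a randomly generated surface. The sampling scheme and amplitude values are assumptions for illustration, not the paper's data or instrumentation.

```python
import numpy as np

# Illustrative only: 2D profile parameter (Ra) vs 3D areal parameter (Sa)
# on a synthetic, randomly generated surface (heights in micrometres).
rng = np.random.default_rng(0)
surface = rng.normal(scale=0.8, size=(256, 256))

def ra(profile):
    """Arithmetic mean deviation of a single extracted 2D profile."""
    return np.mean(np.abs(profile - profile.mean()))

def sa(z):
    """Arithmetic mean height of the full 3D surface."""
    return np.mean(np.abs(z - z.mean()))

# 2D practice: average Ra over a handful of extracted profiles (rows of the map)
ra_mean = np.mean([ra(surface[i, :]) for i in range(0, 256, 32)])
print(f"mean Ra over sampled profiles: {ra_mean:.3f} um")
print(f"areal Sa over whole surface:   {sa(surface):.3f} um")
```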
Abstract:
Conceptual design of high-speed railway bridges. Railroad bridges in general, and those for high-speed railways in particular, demand very special conditions. The traffic loads are much higher than for road bridges. Loads due to braking and acceleration determine, due to their magnitude, the structural layout. Because of the speed of the vehicles, there are specific dynamic effects which need to be considered. In order to ensure passenger comfort compatible with speeds of up to 350 km/h, it is necessary to meet very demanding conditions with respect to stiffness, displacements and dynamic behavior. In this paper these conditions are briefly described, and different typological possibilities to satisfy them are presented, as well as the main construction methods applicable to this kind of bridge.
Abstract:
Airbus has designed and industrialized aircraft using Concurrent Engineering techniques for decades. The introduction of new PLM methods, procedures and tools, and the need to reduce time-to-market, led Airbus Military to pursue new working methods. Traditional Engineering works sequentially. Concurrent Engineering basically overlaps tasks between teams. Collaborative Engineering promotes teamwork to develop the product, processes and resources from the conceptual phase to the start of serial production. The CALIPSO-neo pilot project was launched to support the industrialization process of a medium-sized aerostructure. The aim is to implement the industrial Digital Mock-Up (iDMU) concept and to exploit it to create shop floor documentation. Within the framework of a collaborative engineering strategy, the project is part of the efforts to deploy Digital Manufacturing as a key technology for the industrialization of aircraft assembly lines. This paper presents the context, the conceptual approach and the methodology adopted.
Abstract:
Two different methods of analysis of plate bending, the FEM and the BM, are discussed in this paper. The plate behaviour is assumed to be represented by linear thin plate theory, where the Poisson-Kirchhoff assumption holds. The BM, based on a weighted mean square error technique, produced good results for the problem of plate bending. The computational effort demanded by the BM is smaller than that needed in an FEM analysis for the same level of accuracy. The general applicability of the FEM cannot be matched by the BM; in particular, plates of arbitrary geometry need a similar but not identical treatment in the BM. However, this loss of generality is counterbalanced by the computational efficiency gained with the BM in obtaining the solution.
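For reference, a minimal statement of the governing equation that the linear thin plate (Poisson-Kirchhoff) theory assumes for both methods is given below; the symbols (w deflection, q transverse load, E, t, ν material and thickness properties, D flexural rigidity) follow the usual textbook convention and are not notation taken from the paper.

```latex
% Kirchhoff thin-plate bending, homogeneous isotropic plate
\nabla^4 w(x,y)
  = \frac{\partial^4 w}{\partial x^4}
  + 2\,\frac{\partial^4 w}{\partial x^2\,\partial y^2}
  + \frac{\partial^4 w}{\partial y^4}
  = \frac{q(x,y)}{D},
\qquad
D = \frac{E\,t^{3}}{12\,(1-\nu^{2})}
```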
Abstract:
A major challenge in the engineering of complex and critical systems is the management of change, both in the system and in its operational environment. Due to the growing complexity of systems, new approaches to autonomy must be able to detect critical changes and prevent their progress towards undesirable states. We are searching for methods to build systems that can tune their adaptability protocols: new mechanisms that use system-wellness requirements to reduce the influence of the outer domain and transfer the control of uncertainty to the inner one. From the perspective of cognitive systems, biological emotions suggest a strategy for configuring value-based systems to use semantic self-representations of their state. We propose a method inspired by emotion theories to causally connect to the inner domain of the system and its wellness objectives, focusing on dynamically adapting the system to avoid the progress of critical states. This method shall endow the system with a transversal mechanism to monitor its inner processes, detecting critical states and managing its adaptivity in order to maintain the wellness goals. The paper describes the current vision produced by this work-in-progress.
Abstract:
This paper presents an extensive and useful comparison of existing formulas to estimate wave forces on crown walls. The paper also provides valuable insights into crown wall behaviour, suggesting the use of formulas for preliminary sizing and recommending, in any case, tests on a physical model in order to confirm the final design. The authors helpfully advise using more than one method to obtain results closer to reality, always taking into account the test conditions under which each formula was developed.
Abstract:
This dissertation employs and develops Bayesian methods to be used in typical geotechnical analyses, with a particular emphasis on (i) the assessment and selection of geotechnical models based on empirical correlations and (ii) the development of probabilistic predictions of outcomes expected for complex geotechnical models. Examples of application to geotechnical problems are developed, as follows: (1) For intact rocks, we present a Bayesian framework for model assessment to estimate the Young's moduli based on their UCS. Our approach provides uncertainty estimates of parameters and predictions, and can differentiate among the sources of error.
We develop ‘rock-specific’ models for common rock types, and illustrate that such ‘initial’ models can be ‘updated’ to incorporate new project-specific information as it becomes available, reducing model uncertainties and improving their predictive capabilities. (2) For rock masses, we present an approach, based on model selection criteria, to select the most appropriate model among a set of candidates for estimating the deformation modulus of a rock mass, given a set of observed data. Once the most appropriate model is selected, a Bayesian framework is employed to develop predictive distributions of the deformation moduli of rock masses, and to update them with new project-specific data. Such a Bayesian updating approach can significantly reduce the associated predictive uncertainty, and therefore affect our computed estimates of the probability of failure, which is of significant interest for reliability-based rock engineering design. (3) In the preliminary design stage of rock engineering, the information about geomechanical and geometrical parameters, in situ stresses or support parameters is often scarce or incomplete. This poses difficulties in applying traditional empirical correlations, which cannot deal with incomplete data to make predictions. We therefore propose the use of Bayesian Networks to deal with incomplete data and, in particular, develop a Naïve Bayes classifier to predict the probability of occurrence of tunnel squeezing based on five input parameters that are commonly available, at least partially, at the design stage.
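A minimal sketch of a Gaussian Naïve Bayes classifier of the kind mentioned in point (3) is shown below. The five feature values, class means and training set are hypothetical placeholders, not the dissertation's data, and the sketch omits the handling of incomplete inputs that motivates the Bayesian Network approach.

```python
import numpy as np

# Hand-rolled Gaussian Naive Bayes sketch for squeezing (1) vs no-squeezing (0).
# Training data are synthetic placeholders for five design-stage parameters.
rng = np.random.default_rng(0)
X_squeeze = rng.normal([2.0, 0.3, 25.0, 8.0, 0.1], 0.5, size=(40, 5))
X_no      = rng.normal([4.0, 0.8, 60.0, 4.0, 0.3], 0.5, size=(40, 5))
X = np.vstack([X_squeeze, X_no])
y = np.array([1] * 40 + [0] * 40)

def fit(X, y):
    """Estimate per-class feature means, variances and priors."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return params

def predict_proba(params, x):
    """Posterior class probabilities under the 'naive' independence assumption."""
    logp = {}
    for c, (mu, var, prior) in params.items():
        logp[c] = np.log(prior) - 0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    m = max(logp.values())
    z = {c: np.exp(v - m) for c, v in logp.items()}
    s = sum(z.values())
    return {c: v / s for c, v in z.items()}

model = fit(X, y)
print(predict_proba(model, np.array([2.5, 0.4, 30.0, 7.0, 0.15])))
```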
Abstract:
The Department of Structural Analysis of the University of Santander has long been involved in the solution of the country's practical engineering problems. Some of these have required the use of non-conventional methods of analysis in order to achieve adequate engineering answers. As an example of the increasing application of non-linear computer codes in present-day engineering practice, some cases will be briefly presented. In each case, only the main features of the problem involved and the solution used to solve it will be shown.
Abstract:
Incremental truncation for the creation of hybrid enzymes (ITCHY) is a novel tool for the generation of combinatorial libraries of hybrid proteins independent of DNA sequence homology. We herein report a fundamentally different methodology for creating incremental truncation libraries using nucleotide triphosphate analogs. Central to the method is the polymerase-catalyzed, low-frequency, random incorporation of α-phosphothioate dNTPs into the region of DNA targeted for truncation. The resulting phosphothioate internucleotide linkages are resistant to 3′→5′ exonuclease hydrolysis, rendering the target DNA resistant to degradation in a subsequent exonuclease III treatment. From an experimental perspective, the protocol reported here to create incremental truncation libraries is simpler and less time-consuming than previous approaches, combining the two gene fragments in a single vector and eliminating additional purification steps. As proof of principle, an incremental truncation library of fusions between the N-terminal fragment of Escherichia coli glycinamide ribonucleotide formyltransferase (PurN) and the C-terminal fragment of human glycinamide ribonucleotide formyltransferase (hGART) was prepared and successfully tested for functional hybrids in an auxotrophic E. coli host strain. Multiple active hybrid enzymes were identified, including ones fused in regions of low sequence homology.
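The effect of the low-frequency incorporation step can be illustrated with a toy simulation: each molecule receives exonuclease-resistant linkages at random positions, and 3′→5′ digestion stops at the first protected site, spreading truncation points over the target region. The sequence length, incorporation frequency and stopping model below are assumptions for illustration only, not the authors' protocol parameters.

```python
import numpy as np

# Toy model (assumed parameters, not the authors' protocol): low-frequency
# incorporation of exonuclease-resistant alpha-phosphothioate linkages,
# followed by 3'->5' exonuclease III digestion that stops at the first
# protected position, yields a spread of truncation lengths.
rng = np.random.default_rng(0)
target_len = 600          # nt of DNA targeted for truncation (assumed)
p_thio = 1 / 150          # assumed incorporation frequency of protected nucleotides

def truncation_lengths(n_molecules=10_000):
    lengths = []
    for _ in range(n_molecules):
        protected = rng.random(target_len) < p_thio       # protected positions in this molecule
        hits = np.nonzero(protected[::-1])[0]             # scan from the 3' end
        removed = hits[0] if hits.size else target_len    # digestion stops at first protected site
        lengths.append(target_len - removed)
    return np.array(lengths)

lengths = truncation_lengths()
print(f"mean remaining length: {lengths.mean():.0f} nt, "
      f"5th-95th percentiles: {np.percentile(lengths, [5, 95])}")
```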
Empirical study on the maintainability of Web applications: Model-driven Engineering vs Code-centric
Abstract:
Model-driven Engineering (MDE) approaches are often acknowledged to improve the maintainability of the resulting applications. However, there is a scarcity of empirical evidence that backs their claimed benefits and limitations with respect to code-centric approaches. The purpose of this paper is to compare the performance and satisfaction of junior software maintainers while executing maintainability tasks on Web applications with two different development approaches, one being OOH4RIA, a model-driven approach, and the other being a code-centric approach based on Visual Studio .NET and the Agile Unified Process. We have conducted a quasi-experiment with 27 graduated students from the University of Alicante. They were randomly divided into two groups, and each group was assigned to a different Web application on which they performed a set of maintainability tasks. The results show that maintaining Web applications with OOH4RIA clearly improves the performance of subjects. It also tips the satisfaction balance in favor of OOH4RIA, although not significantly. Model-driven development methods seem to improve both the developers’ objective performance and subjective opinions on ease of use of the method. This notwithstanding, further experimentation is needed to be able to generalize the results to different populations, methods, languages and tools, different domains and different application sizes.
Abstract:
Different kinds of algorithms can be chosen to compute elementary functions. Among them, it is worth mentioning shift-and-add algorithms, since they have been specifically designed to be very simple and to save computer resources. In fact, almost the only operations involved in these methods are additions and shifts, which can be easily and efficiently performed by a digital processor. Shift-and-add algorithms achieve fairly good precision with low-cost iterations. The most famous algorithm of this type is CORDIC, which can approximate a wide variety of functions with only a slight change in its iterations. In this paper, we will analyze the requirements of some engineering and industrial problems in terms of the type of operands and the functions to approximate. We will then propose the application of shift-and-add algorithms based on CORDIC to these problems, and compare the different methods in terms of the precision of the results and the number of iterations required.
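As a minimal sketch of the shift-and-add principle mentioned above, the snippet below implements the standard textbook CORDIC rotation mode to approximate sine and cosine; the iteration count and floating-point formulation are illustrative choices, not the fixed-point designs discussed in the paper.

```python
import math

# Minimal CORDIC sketch (rotation mode): sin and cos via shifts, adds and a
# small precomputed angle table. Floating point is used for clarity; real
# implementations typically use fixed-point arithmetic.
N = 24
ANGLES = [math.atan(2.0 ** -i) for i in range(N)]   # atan(2^-i) lookup table
K = 1.0
for i in range(N):
    K /= math.sqrt(1.0 + 2.0 ** (-2 * i))           # accumulated scaling factor

def cordic_sin_cos(theta):
    """Approximate (sin(theta), cos(theta)) for theta within the CORDIC convergence range."""
    x, y, z = K, 0.0, theta
    for i in range(N):
        d = 1.0 if z >= 0.0 else -1.0               # rotation direction
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i   # shift-and-add micro-rotation
        z -= d * ANGLES[i]
    return y, x

s, c = cordic_sin_cos(math.pi / 6)
print(f"sin(30 deg) ~ {s:.6f}, cos(30 deg) ~ {c:.6f}")   # expected ~0.5, ~0.866
```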