30 results for Computer Modelling
Abstract:
Integrated master's dissertation in Biomedical Engineering (Biomaterials, Biomechanics and Rehabilitation)
Abstract:
The research aimed to establish tyre-road noise models using a Data Mining approach that made it possible to build a predictive model and to assess the importance of the tested input variables. The data modelling considered three learning algorithms and three metrics to define the best predictive model. The variables tested included basic properties of pavement surfaces, macrotexture, megatexture and unevenness and, for the first time, damping. The importance of those variables was measured using a sensitivity analysis procedure. Two types of models were set up: one with basic variables and another with complex variables, such as megatexture and damping, all as a function of vehicle speed. More detailed models were additionally set up by speed level. As a result, several models with very good tyre-road noise predictive capacity were achieved. The most relevant variables were Speed, Temperature, Aggregate size, Mean Profile Depth and Damping, which had the highest importance, even though influenced by speed. Megatexture and IRI had the lowest importance. The models developed in this work are relevant for truck tyre-noise prediction, represented by the AVON V4 test tyre, at the early stage of road pavement use. The obtained models are therefore highly useful for pavement design and for noise prediction by road authorities and contractors.
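As a hedged illustration of the kind of workflow described above, the sketch below fits a regression model to a tabular tyre-road noise dataset and ranks input-variable importance. The file name, column names, random forest learner and permutation importance are assumptions made for illustration; they are not the paper's actual algorithms, metrics or sensitivity analysis procedure.

# Illustrative sketch only (hypothetical data file and column names), using scikit-learn.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

df = pd.read_csv("tyre_road_noise.csv")  # hypothetical measurement file
features = ["Speed", "Temperature", "AggregateSize", "MeanProfileDepth",
            "Damping", "Megatexture", "IRI"]
X, y = df[features], df["NoiseLevel"]    # hypothetical target column (dB)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit one candidate learner and evaluate its predictive capacity on held-out data.
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R2 on held-out data:", model.score(X_te, y_te))

# Stand-in for the sensitivity analysis: permutation importance of each input variable.
imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
for name, score in sorted(zip(features, imp.importances_mean), key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")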
Abstract:
This paper describes the concept, technical realisation and validation of a largely data-driven method to model events with Z→ττ decays. In Z→μμ events selected from proton-proton collision data recorded at √s = 8 TeV with the ATLAS experiment at the LHC in 2012, the Z decay muons are replaced by τ leptons from simulated Z→ττ decays at the level of reconstructed tracks and calorimeter cells. The τ lepton kinematics are derived from the kinematics of the original muons. Thus, only the well-understood decays of the Z boson and τ leptons as well as the detector response to the τ decay products are obtained from simulation. All other aspects of the event, such as the Z boson and jet kinematics as well as effects from multiple interactions, are given by the actual data. This so-called τ-embedding method is particularly relevant for Higgs boson searches and analyses in ττ final states, where Z→ττ decays constitute a large irreducible background that cannot be obtained directly from data control samples.
Abstract:
Doctoral thesis in Electronics and Computer Engineering
Abstract:
Doctoral thesis in Science and Engineering of Polymers and Composites
Abstract:
Doctoral thesis in Sciences (specialisation in Mathematics)
Abstract:
Integrated master's dissertation in Civil Engineering
Abstract:
Software product lines (SPL) are diverse systems that are developed using a dual engineering process: (a) family engineering defines the commonality and variability among all members of the SPL, and (b) application engineering derives specific products based on the common foundation combined with a variable selection of features. The number of derivable products in an SPL can thus be exponential in the number of features. This inherent complexity poses two main challenges when it comes to modelling. Firstly, the formalism used for modelling SPLs needs to be modular and scalable. Secondly, it should ensure that all products behave correctly by providing the ability to analyse and verify complex models efficiently. In this paper we propose to integrate an established modelling formalism (Petri nets) with the domain of software product line engineering. To this end we extend Petri nets to Feature Nets. While Petri nets provide a framework for formally modelling and verifying single software systems, Feature Nets offer the same sort of benefits for software product lines. We show how SPLs can be modelled in an incremental, modular fashion using Feature Nets, provide a Feature Nets variant that supports modelling dynamic SPLs, and propose an analysis method for SPLs modelled as Feature Nets. By facilitating the construction of a single model that includes the various behaviours exhibited by the products in an SPL, we make a significant step towards efficient and practical quality assurance methods for software product lines.
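The following minimal sketch (illustrative names and semantics, not the paper's full Feature Nets formalism) shows the basic idea: a Petri net whose transitions carry feature guards, so that a single net covers several products and a transition is enabled only when its guard is satisfied by the selected features and its input places hold enough tokens.

# Minimal sketch of a feature-guarded Petri net (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Transition:
    name: str
    pre: dict                        # place -> tokens consumed
    post: dict                       # place -> tokens produced
    guard: frozenset = frozenset()   # features that must be selected

@dataclass
class FeatureNet:
    marking: dict                    # place -> current token count
    transitions: list = field(default_factory=list)

    def enabled(self, features):
        # A transition fires only if its guard is covered by the product's features
        # and every input place holds enough tokens.
        return [t for t in self.transitions
                if t.guard <= features
                and all(self.marking.get(p, 0) >= n for p, n in t.pre.items())]

    def fire(self, t):
        for p, n in t.pre.items():
            self.marking[p] -= n
        for p, n in t.post.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Usage: select the "Logging" feature, fire "run", after which "log" becomes enabled.
net = FeatureNet(marking={"idle": 1},
                 transitions=[Transition("run", {"idle": 1}, {"busy": 1}),
                              Transition("log", {"busy": 1}, {"idle": 1},
                                         guard=frozenset({"Logging"}))])
product_features = frozenset({"Logging"})
net.fire(net.enabled(product_features)[0])              # fires "run"
print([t.name for t in net.enabled(product_features)])  # ['log']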
Abstract:
This book was produced within the scope of a research project entitled “Navigating with ‘Magalhães’: Study on the Impact of Digital Media in Schoolchildren”. The study was conducted between May 2010 and May 2013 at the Communication and Society Research Centre, University of Minho, Portugal, and was funded by the Portuguese Foundation for Science and Technology (PTDC/CCI-COM/101381/2008).
Abstract:
(Excerpt) In times past, learning to read, write and do arithmetic was to get on course to earn the “writ of emancipation” in society. These skills are still essential today, but they are not enough to live in society. Reading and critically understanding the world we live in, with all its complexity, difficulties and challenges, requires not only other skills (learning to search for and validate information, reading with new codes and grammars, etc.) but also, to a certain extent, metaskills, matrices and mechanisms that are transversal to the different and new literacies. They are needed not just to interpret but equally to communicate and participate in the little worlds that make up our everyday activities as well as, in a broader sense, in the world of the polis, which today is a global world.
Abstract:
This book was produced within the scope of a research project entitled “Navigating with ‘Magalhães’: Study on the Impact of Digital Media in Schoolchildren”. The study was conducted between May 2010 and May 2013 at the Communication and Society Research Centre, University of Minho, Portugal, and was funded by the Portuguese Foundation for Science and Technology (PTDC/CCI-COM/101381/2008). As we shall explain in more detail later in this book, the main objective of the research project was to analyse the impact of the Portuguese government programme named ‘e-escolinha’, launched in 2008 within the Technological Plan for Education. This Plan responds to the principles of the Lisbon Strategy, signed in 2000 and relaunched at the Spring European Council of 2005.
Abstract:
Integrated master's dissertation in Biomedical Engineering (specialisation in Medical Informatics)
Abstract:
Published in "Information control in manufacturing 1998: (INCOM'98): advances in industrial engineering: a proceedings volume from the 9th IFAC Symposium, Nancy-Metz, France, 24-26 June 1998. Vol. 2"
Abstract:
This paper discusses how object-oriented inheritance can be reinterpreted if statecharts are used for modelling the dynamic behaviour of an object. Support for inheritance of statecharts allows the improvement of systems development by easing the reuse of parts of already developed successful systems, and by promoting the iterative and continuous refinement of models advocated by the operational approach. The statechart is the formalism used within UML to specify reactive, state-based behaviours. This paper covers the use of statecharts in the modelling of embedded systems for industrial control applications, where performance and memory usage are main concerns.
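As a rough sketch of the idea (hypothetical class and event names, not the paper's notation or semantics), the fragment below encodes a statechart-like behaviour as a transition table so that a subclass can inherit the parent's transitions and refine them with additional states and events.

# Illustrative sketch: inheriting and refining a state-based behaviour.
class Controller:
    # Base behaviour: (current state, event) -> next state.
    transitions = {("Idle", "start"): "Running",
                   ("Running", "stop"): "Idle"}

    def __init__(self):
        self.state = "Idle"

    def handle(self, event):
        nxt = self.transitions.get((self.state, event))
        if nxt is not None:
            self.state = nxt

class SafeController(Controller):
    # Refinement: keep the parent's transitions and add an error-handling state.
    transitions = {**Controller.transitions,
                   ("Running", "fault"): "Error",
                   ("Error", "reset"): "Idle"}

c = SafeController()
c.handle("start")
c.handle("fault")
print(c.state)   # "Error"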
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scale. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber’s mechanical properties and their inference, accounting for prior information obtained at different scales of importance. These methods allow the analysis of distinct timber reference properties, such as density, bending stiffness and strength, and hierarchically consider information obtained through non-destructive, semi-destructive or destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fitting and to define the correlation structure between properties.
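For illustration, the sketch below applies the two families of methods named in the abstract to a simulated timber density sample: a maximum-likelihood fit of a lognormal distribution and a simple conjugate Bayesian update of the mean with a normal prior. The chosen distributions, prior values and simulated data are assumptions for this sketch, not the chapter's models.

# Illustrative sketch: ML and Bayesian estimation on simulated timber density data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
density = rng.lognormal(mean=6.1, sigma=0.08, size=50)   # kg/m3, simulated sample

# Maximum Likelihood: fit a lognormal distribution to the sample.
shape, loc, scale = stats.lognorm.fit(density, floc=0)
print("ML estimate of median density:", scale)

# Bayesian update of the mean density with a normal prior (known-variance case).
prior_mu, prior_var = 450.0, 30.0**2                      # hypothetical prior knowledge
like_var = density.var(ddof=1) / len(density)             # variance of the sample mean
post_var = 1.0 / (1.0 / prior_var + 1.0 / like_var)
post_mu = post_var * (prior_mu / prior_var + density.mean() / like_var)
print("Posterior mean density:", post_mu, "posterior std:", post_var**0.5)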