981 results for Predictive Modelling
Abstract:
This work proposes a constitutive model to simulate the nonlinear behaviour of cement-based materials subjected to different loading paths. The model incorporates a multidirectional fixed smeared crack approach to simulate crack initiation and propagation, whereas the inelastic behaviour of the material between cracks is treated by a numerical strategy that combines plasticity and damage theories. To capture the shear stress transfer between crack surfaces more realistically, a softening diagram is assumed for modelling the crack shear stress versus crack shear strain relationship. The plastic damage model is based on a yield function, a flow rule and an evolution law for the hardening variable, and includes an explicit isotropic damage law to simulate the stiffness degradation and the softening behaviour of cement-based materials in compression. The model was implemented in the FEMIX computer program, and experimental tests at the material scale were simulated to appraise its predictive performance. The applicability of the model to simulating the behaviour of reinforced concrete shear wall panels subjected to biaxial loading conditions, and of RC beams failing in shear, is also investigated.
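Although the paper's own formulation is not reproduced here, a generic sketch of how an explicit isotropic damage variable is typically coupled with plasticity (an illustrative form, not necessarily the model's actual law) is

\sigma = (1 - d)\, \mathbf{D}_e : (\boldsymbol{\varepsilon} - \boldsymbol{\varepsilon}_p), \qquad 0 \le d \le 1,

where d is the scalar damage variable governed by its evolution law and responsible for the stiffness degradation, \mathbf{D}_e is the elastic stiffness tensor, and \boldsymbol{\varepsilon}_p is the plastic strain returned by the yield function and flow rule.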
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation with the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
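A minimal sketch of how the relative weight of such heat-transfer contributions can be compared for a single deposited filament, using a lumped-capacitance cross-section; all property values, names and numbers below are illustrative assumptions, not taken from the paper:

import numpy as np

# Illustrative ABS-like properties and process conditions (assumed values)
rho, cp = 1050.0, 2000.0                   # density [kg/m^3], specific heat [J/(kg K)]
d = 0.4e-3                                 # filament diameter [m]
h_conv = 60.0                              # convection coefficient with the environment [W/(m^2 K)]
eps, sigma_sb = 0.9, 5.67e-8               # emissivity, Stefan-Boltzmann constant [W/(m^2 K^4)]
T_env, T0 = 70.0 + 273.15, 230.0 + 273.15  # envelope and extrusion temperatures [K]
A_per_V = 4.0 / d                          # exposed area per unit volume of a cylindrical filament [1/m]

def cool(t_end=10.0, dt=1e-3):
    """Explicit-Euler cooling of one filament by convection and radiation only."""
    T, t, history = T0, 0.0, [(0.0, T0)]
    while t < t_end:
        q = h_conv * (T - T_env) + eps * sigma_sb * (T**4 - T_env**4)  # heat flux [W/m^2]
        T -= q * A_per_V / (rho * cp) * dt
        t += dt
        history.append((t, T))
    return np.array(history)

temps = cool()
print(f"Temperature after 2 s: {temps[temps[:, 0] <= 2.0][-1, 1] - 273.15:.1f} degC")

Re-running the sketch with the radiation term removed, or with analogous terms added for conduction to the support and to adjacent filaments, gives a quick feel for which phenomena dominate at a given filament diameter.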
Abstract:
In this study, we concentrate on modelling gross primary productivity using two simple approaches to simulate canopy photosynthesis: "big leaf" and "sun/shade" models. Two approaches for calibration are used: scaling up canopy photosynthetic parameters from the leaf to the canopy level, and fitting canopy biochemistry to eddy covariance fluxes. The models are validated using eddy covariance data from the LBA site C14. Comparing the performance of both models, we conclude that, both numerically (in terms of goodness of fit) and qualitatively (in terms of residual response to different environmental variables), the sun/shade model does a better job. Compared to the sun/shade model, the big leaf model shows a lower goodness of fit and fails to respond to variations in the diffuse fraction, and it also has skewed responses to temperature and VPD. The separate treatment of sun and shade leaves, combined with the separation of incoming light into direct beam and diffuse components, makes sun/shade a strong modelling tool that captures more of the observed variability in canopy fluxes as measured by eddy covariance. In conclusion, the sun/shade approach is a relatively simple and effective tool for modelling photosynthetic carbon uptake that could easily be included in many terrestrial carbon models.
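For context, the core of the sun/shade separation is partitioning leaf area into sunlit and shaded fractions under Beer's-law extinction of the direct beam; a minimal sketch follows, in which the extinction coefficient and LAI value are assumptions for illustration only:

import math

def sunlit_lai_fraction(lai, k_beam=0.5):
    """Fraction of total leaf area index (LAI) that is sunlit, assuming
    Beer's-law extinction of the direct beam with coefficient k_beam."""
    if lai <= 0:
        return 0.0
    lai_sun = (1.0 - math.exp(-k_beam * lai)) / k_beam  # sunlit LAI
    return lai_sun / lai

# Example: in a dense canopy (LAI ~ 5) only about a third of the leaves are sunlit,
# which is why changes in the diffuse fraction matter for canopy photosynthesis.
print(f"{sunlit_lai_fraction(5.0):.2f}")  # ~0.37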
Abstract:
[Excerpt] A large number of constitutive equations have been developed for viscoelastic fluids, some empirical and others with strong physical foundations. The currently available macroscopic constitutive equations can be divided into two main types: differential and integral. Some constitutive equations, e.g. the Maxwell model, are available in both differential and integral forms. However, relevant integral models, such as K-BKZ, possess only the integral form. (...)
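For the linear Maxwell model, for instance, the two representations are explicitly equivalent (a standard textbook result, shown here only for context):

\tau + \lambda\,\frac{\mathrm{d}\tau}{\mathrm{d}t} = \eta\,\dot{\gamma}
\qquad\Longleftrightarrow\qquad
\tau(t) = \int_{-\infty}^{t} \frac{\eta}{\lambda}\, e^{-(t-t')/\lambda}\, \dot{\gamma}(t')\,\mathrm{d}t',

with relaxation time \lambda and viscosity \eta, whereas integral models such as K-BKZ admit only the second kind of representation.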
Abstract:
Invasive aspergillosis (IA) is a life-threatening fungal disease commonly diagnosed among individuals with immunological deficits, namely hematological patients undergoing chemotherapy or allogeneic hematopoietic stem cell transplantation. Vaccines are not available, and despite the improved diagnosis and antifungal therapy, the treatment of IA is associated with a poor outcome. Importantly, the risk of infection and its clinical outcome vary significantly even among patients with similar predisposing clinical factors and microbiological exposure. Recent insights into antifungal immunity have further highlighted the complexity of host-fungus interactions and the multiple pathogen-sensing systems activated to control infection. How to decode this information into clinical practice remains, however, a challenging issue in medical mycology. Here, we address recent advances in our understanding of the host-fungus interaction and discuss the application of this knowledge in potential strategies with the aim of moving toward personalized diagnostics and treatment (theranostics) in immunocompromised patients. Ultimately, the integration of individual traits into a clinically applicable process to predict the risk and progression of disease, and the efficacy of antifungal prophylaxis and therapy, holds the promise of a pioneering innovation benefiting patients at risk of IA.
Abstract:
Integrated master's dissertation in Biomedical Engineering (Biomaterials, Biomechanics and Rehabilitation)
Abstract:
This paper describes the concept, technical realisation and validation of a largely data-driven method to model events with Z→ττ decays. In Z→μμ events selected from proton-proton collision data recorded at √s = 8 TeV with the ATLAS experiment at the LHC in 2012, the Z decay muons are replaced by τ leptons from simulated Z→ττ decays at the level of reconstructed tracks and calorimeter cells. The τ lepton kinematics are derived from the kinematics of the original muons. Thus, only the well-understood decays of the Z boson and τ leptons as well as the detector response to the τ decay products are obtained from simulation. All other aspects of the event, such as the Z boson and jet kinematics as well as effects from multiple interactions, are given by the actual data. This so-called τ-embedding method is particularly relevant for Higgs boson searches and analyses in ττ final states, where Z→ττ decays constitute a large irreducible background that cannot be obtained directly from data control samples.
Abstract:
Doctoral thesis in Science and Engineering of Polymers and Composites
Abstract:
Doctoral thesis (Doctoral Programme in Biomedical Engineering)
Abstract:
Doctoral thesis in Sciences (Specialisation in Mathematics)
Abstract:
Architectural (bad) smells are design decisions found in software architectures that degrade the ability of systems to evolve. This paper presents an approach to verify that a software architecture is smell-free using the Archery architectural description language. The language provides a core for modelling software architectures and an extension for specifying constraints. The approach consists of precisely specifying architectural smells as constraints, and then verifying that software architectures do not satisfy any of them. The constraint language is based on a propositional modal logic with recursion that includes: a converse operator for relations among architectural concepts, graded modalities for describing the cardinality of such relations, and nominals referencing architectural elements. Four architectural smells illustrate the approach.
Abstract:
Software product lines (SPLs) are diverse systems that are developed using a dual engineering process: (a) family engineering defines the commonality and variability among all members of the SPL, and (b) application engineering derives specific products based on the common foundation combined with a variable selection of features. The number of derivable products in an SPL can thus be exponential in the number of features. This inherent complexity poses two main challenges when it comes to modelling: firstly, the formalism used for modelling SPLs needs to be modular and scalable; secondly, it should ensure that all products behave correctly by providing the ability to analyse and verify complex models efficiently. In this paper we propose to integrate an established modelling formalism (Petri nets) with the domain of software product line engineering. To this end we extend Petri nets to Feature Nets. While Petri nets provide a framework for formally modelling and verifying single software systems, Feature Nets offer the same sort of benefits for software product lines. We show how SPLs can be modelled in an incremental, modular fashion using Feature Nets, provide a Feature Nets variant that supports modelling dynamic SPLs, and propose an analysis method for SPLs modelled as Feature Nets. By facilitating the construction of a single model that includes the various behaviours exhibited by the products in an SPL, we make a significant step towards efficient and practical quality assurance methods for software product lines.
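A minimal sketch of the underlying idea, assuming (purely for illustration, not as the paper's actual definitions) that each transition carries a Boolean application condition over features that must hold for the selected product:

# Illustrative-only sketch: transitions fire as in an ordinary Petri net,
# but only when their feature guard holds for the chosen product configuration.

class Transition:
    def __init__(self, name, inputs, outputs, guard=lambda features: True):
        self.name, self.inputs, self.outputs, self.guard = name, inputs, outputs, guard

class FeatureNet:
    def __init__(self, marking, transitions):
        self.marking = dict(marking)      # place -> token count
        self.transitions = transitions

    def enabled(self, t, features):
        return t.guard(features) and all(self.marking.get(p, 0) > 0 for p in t.inputs)

    def fire(self, t, features):
        if not self.enabled(t, features):
            raise ValueError(f"{t.name} is not enabled")
        for p in t.inputs:
            self.marking[p] -= 1
        for p in t.outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Example: the 'pay_card' behaviour exists only in products with the CardPayment feature.
net = FeatureNet(
    {"idle": 1},
    [Transition("pay_cash", ["idle"], ["done"]),
     Transition("pay_card", ["idle"], ["done"], guard=lambda f: "CardPayment" in f)],
)
print(net.enabled(net.transitions[1], {"CardPayment"}))   # True
print(net.enabled(net.transitions[1], set()))             # False

The intuition is that fixing a feature selection reduces such a net to an ordinary Petri net for that product, which is why a single model can cover the behaviours of all derivable products.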
Abstract:
OBJECTIVE: Risk stratification of patients with nonsustained ventricular tachycardia (NSVT) and chronic chagasic cardiomyopathy (CCC). METHODS: Seventy-eight patients with CCC and NSVT were studied consecutively and prospectively. All patients underwent 24-hour Holter monitoring, radioisotopic ventriculography, left ventricular angiography, and an electrophysiologic study with programmed ventricular stimulation. RESULTS: Sustained monomorphic ventricular tachycardia (SMVT) was induced in 25 patients (32%), NSVT in 20 (25.6%) and ventricular fibrillation in 4 (5.1%). In 29 patients (37.2%) no arrhythmia was inducible. During a 55.7-month follow-up, 22 (28.2%) patients died, 16 due to sudden death, 2 due to nonsudden cardiac death and 4 due to noncardiac death. Logistic regression analysis showed that inducibility was the main independent variable predicting the occurrence of subsequent events and cardiac death (probability of 2.56 and 2.17, respectively). The Mantel-Haenszel chi-square test showed that survival probability was significantly lower in the inducible group than in the noninducible group. The percentage of patients free of events was significantly higher in the noninducible group. CONCLUSION: Induction of SMVT during programmed ventricular stimulation was a predictor of arrhythmia occurrence, cardiac death and general mortality in patients with CCC and NSVT.
Abstract:
OBJECTIVE: To determine, in arrhythmogenic right ventricular cardiomyopathy, the value of QT interval dispersion for identifying the induction of sustained ventricular tachycardia in the electrophysiological study or the risk of sudden cardiac death. METHODS: We assessed QT interval dispersion in the 12-lead electrocardiogram of 26 patients with arrhythmogenic right ventricular cardiomyopathy, analyzed its association with sustained ventricular tachycardia and sudden cardiac death, and compared it with that of 16 controls of similar age and sex. RESULTS (mean ± SD): QT interval dispersion: patients = 53.8±14.1ms; control group = 35.0±10.6ms, p=0.001. Patients with induction of ventricular tachycardia: 52.5±13.8ms; without induction of ventricular tachycardia: 57.5±12.8ms, p=0.420. In a mean follow-up period of 41±11 months, five sudden cardiac deaths occurred. QT interval dispersion in this group was 62.0±17.8ms, and in the others it was 51.9±12.8ms, p=0.852. Using a cutoff ≥60ms to define an increased degree of QT interval dispersion, we were able to identify patients at risk of sudden cardiac death with a sensitivity of 60%, a specificity of 57%, and positive and negative predictive values of 25% and 85%, respectively. CONCLUSION: Patients with arrhythmogenic right ventricular cardiomyopathy have a significant increase in the degree of QT interval dispersion compared with the healthy population. However, it did not identify patients with induction of ventricular tachycardia in the electrophysiological study, and it showed a very low predictive value for defining the risk of sudden cardiac death in the population studied.
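The reported figures follow directly from the 2×2 contingency table implied by the ≥60ms cutoff; a short generic sketch of how such sensitivity, specificity and predictive values are computed (the counts below are chosen only so the output reproduces the reported percentages and are not taken from the paper):

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 contingency table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts, consistent with 26 patients and 5 sudden cardiac deaths:
print(diagnostic_metrics(tp=3, fp=9, fn=2, tn=12))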
Abstract:
In recent decades, increased interest has been evident in research on multi-scale hierarchical modelling in the field of mechanics, and also in the field of wood products and timber engineering. One of the main motivations for hierarchical modelling is to understand how properties, composition and structure at lower scale levels may influence, and be used to predict, the material properties at the macroscopic and structural engineering scales. This chapter presents the applicability of statistical and probabilistic methods, such as the Maximum Likelihood method and Bayesian methods, to the representation of timber's mechanical properties and their inference, accounting for prior information obtained at different scales of importance. These methods make it possible to analyse distinct timber reference properties, such as density, bending stiffness and strength, and to hierarchically consider information obtained through different non-destructive, semi-destructive or destructive tests. The basis and fundamentals of the methods are described, and recommendations and limitations are discussed. The methods may be used in several contexts; however, they require expert knowledge to assess the correct statistical fit and to define the correlation structure between properties.
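As a rough illustration of the Bayesian side of such an approach (a conjugate sketch under assumed values, not the chapter's actual models), prior knowledge of a timber property obtained at one scale can be updated with a few test results from another:

import numpy as np

def bayes_update_normal_mean(prior_mean, prior_sd, data, data_sd):
    """Conjugate normal-normal update of the mean of a timber property,
    treating the test standard deviation as known."""
    prior_prec = 1.0 / prior_sd**2
    data_prec = len(data) / data_sd**2
    post_prec = prior_prec + data_prec
    post_mean = (prior_prec * prior_mean + data_prec * np.mean(data)) / post_prec
    return post_mean, np.sqrt(1.0 / post_prec)

# Hypothetical example: a prior on mean bending strength from non-destructive
# indicators, updated with a handful of destructive test results (values invented).
prior_mean, prior_sd = 38.0, 6.0                 # MPa
tests = np.array([31.5, 35.2, 29.8, 33.1])       # MPa
print(bayes_update_normal_mean(prior_mean, prior_sd, tests, data_sd=5.0))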