967 results for Dynamic Modelling
Abstract:
INTRODUCTION: Sylvatic yellow fever (SYF) is enzootic in Brazil, causing periodic outbreaks in humans living near forest borders or in rural areas. In this study, the cycling patterns of this arboviral disease were analyzed. METHODS: Spectral Fourier analysis was used to capture the periodicity patterns of SYF in time series. RESULTS: SYF outbreaks have not increased in frequency, only in the number of cases. There are two dominant cycles in SYF outbreaks: a seven-year cycle for the central-western region and a 14-year cycle for the northern region. Most of the variance was concentrated in the central-western region and dominated the entire endemic region. CONCLUSIONS: The seven-year cycle is predominant in the endemic region of the disease due to the greater contribution of variance from the central-western region; however, it was possible to identify a 14-year cycle that governs SYF outbreaks in the northern region. No periodicities were identified for the remaining geographical regions.
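The spectral approach described above can be sketched with a discrete Fourier transform: the dominant cycle length is the inverse of the frequency bin with the highest spectral power. A minimal illustration in Python (the series below is synthetic, not the SYF surveillance data):

```python
import numpy as np

def dominant_period(cases, dt=1.0):
    """Return the dominant cycle length (in years, for yearly data) of a
    case-count series, taken from the peak of the Fourier power spectrum."""
    x = np.asarray(cases, dtype=float)
    x = x - x.mean()                       # remove the zero-frequency term
    power = np.abs(np.fft.rfft(x)) ** 2    # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=dt)
    k = np.argmax(power[1:]) + 1           # skip the DC bin
    return 1.0 / freqs[k]

# synthetic example: a 7-year cycle plus noise over 56 years of records
years = np.arange(56)
series = (100 + 40 * np.sin(2 * np.pi * years / 7)
          + np.random.default_rng(0).normal(0, 5, 56))
print(round(dominant_period(series), 2))  # -> 7.0
```

A record several times longer than the candidate cycle is needed; detecting a 14-year cycle, for instance, requires a few decades of data.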
Abstract:
Fundação para a Ciência e a Tecnologia (FCT) - SFRH/BD/64337/2009 ; projects PTDC/ECM/70652/2006, PTDC/ECM/117660/2010 and RECI/ECM-HID/0371/2012
Abstract:
This work tests different delta hedging strategies for two products issued by Banco de Investimento Global in 2012. It studies the behaviour of the delta and gamma of autocallables and their impact on the results when delta hedging with different rebalancing periods. Given the products' discontinuous payoff and path dependency, it is suggested that the hedging portfolio be rebalanced on a daily basis to better follow market changes. Moreover, a mixed strategy is analysed in which time to maturity is used as a criterion for changing the rebalancing frequency.
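The effect of the rebalancing period can be illustrated with a standard Monte Carlo delta-hedging experiment. Since autocallable deltas require numerical methods, the sketch below hedges a plain European call under Black-Scholes assumptions instead (all parameter values are illustrative); the qualitative point carries over: less frequent rebalancing leaves a larger hedging error.

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_delta(S, K, r, sigma, tau):
    """Black-Scholes delta of a European call with time to maturity tau."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * sqrt(tau))
    return norm_cdf(d1)

def hedge_error_std(rebalance_every, n_paths=500, n_steps=252, seed=1):
    """Std of the terminal hedging error for a short call, delta-hedged
    every `rebalance_every` daily steps over one year (premium and
    interest are omitted: they only shift the error by a constant)."""
    rng = np.random.default_rng(seed)
    S0, K, r, sigma, T = 100.0, 100.0, 0.0, 0.2, 1.0
    dt = T / n_steps
    errors = []
    for _ in range(n_paths):
        S, delta = S0, bs_delta(S0, K, r, sigma, T)
        cash = -delta * S                        # buy the initial hedge
        for step in range(1, n_steps + 1):
            S *= exp((r - 0.5 * sigma ** 2) * dt
                     + sigma * sqrt(dt) * rng.standard_normal())
            tau = T - step * dt
            if step % rebalance_every == 0 and tau > dt / 2:
                new_delta = bs_delta(S, K, r, sigma, tau)
                cash -= (new_delta - delta) * S  # adjust the hedge
                delta = new_delta
        errors.append(cash + delta * S - max(S - K, 0.0))
    return float(np.std(errors))

daily, weekly = hedge_error_std(1), hedge_error_std(5)
print(daily < weekly)  # daily rebalancing leaves a smaller hedging error
```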
Abstract:
With the projection of an increasing world population, hand in hand with a growing number of developed countries, further demand for basic chemical building blocks such as ethylene and propylene has to be properly addressed in the coming decades. Methanol-to-olefins (MTO) is an interesting reaction for producing those alkenes from coal, gas or alternative sources such as biomass, with syngas as the intermediate for methanol production. This technology has been widely applied since 1985, and most processes make use of zeolites as catalysts, particularly ZSM-5. Although its selectivity is not especially biased towards light olefins, it resists quick deactivation by coke deposition, making it quite attractive in industrial environments. Nevertheless, MTO is a highly exothermic reaction, which makes it hard to control and to anticipate problems such as temperature runaways or hot spots inside the catalytic bed. The main focus of this project is to study those temperature effects on two fronts: an experimental one, where the catalytic performance and the temperature profiles are studied, and a modelling one, consisting of a five-step strategy to predict the weight fractions and activity. The mind-set of catalytic testing is present in all the developed assays. It was verified that the selectivity towards light olefins increases with temperature, although this also leads to much faster catalyst deactivation. To counter this effect, experiments were carried out using a diluted bed, which increased the catalyst lifetime by 32% to 47%. Additionally, experiments with three thermocouples placed inside the catalytic bed were performed, analysing the deactivation wave and the temperature peaks throughout the bed.
Regeneration was performed between consecutive runs, and it was concluded that this action can be a powerful means of increasing the catalyst lifetime while maintaining a constant selectivity towards light olefins, through a loss of acid strength in a steam-stabilised zeolitic structure. On the modelling front, the work led to the construction of a basic preliminary model, able to predict weight fractions, which should be tuned to become a tool for predicting deactivation and temperature profiles.
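The activity term in such models is often taken, as a first approximation, to be a simple first-order decay in time on stream. The sketch below uses that common starting point (the rate constant and the threshold definition of lifetime are illustrative assumptions, not the thesis's five-step model):

```python
import numpy as np

def catalyst_activity(t, kd=0.05):
    """First-order deactivation law a(t) = exp(-kd * t), a common starting
    point for coke-induced MTO deactivation. kd (1/h) is illustrative."""
    return np.exp(-kd * np.asarray(t, dtype=float))

def lifetime(kd, a_min=0.5):
    """Time on stream until activity first falls below a_min."""
    return np.log(1.0 / a_min) / kd

# if bed dilution lowers the effective deactivation rate by ~30%,
# the lifetime under this law lengthens by ~43%, of the same order
# as the 32%-47% gains reported above
print(round(lifetime(0.05), 2), round(lifetime(0.05 * 0.7), 2))
```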
Nestlé's Dynamic Forecasting Process: Anticipating Risks and Opportunities
Abstract:
This Work Project discusses Nestlé's Dynamic Forecasting Process, implemented within the organization as a way of reengineering its performance management concept and processes, so as to make them more flexible and capable of reacting to volatile business conditions. Stressing the importance of demand planning to reallocate resources and enhance performance, Nescafé Dolce Gusto serves as a case for improving forecast accuracy. Value is brought to the Project by providing a more accurate model of the capsules' sales, as well as by recommending adequate implementations that contribute positively to the Planning Process.
Abstract:
In the early nineties, Mark Weiser wrote a series of seminal papers that introduced the concept of Ubiquitous Computing. According to Weiser, computers require too much attention from the user, drawing their focus from the tasks at hand. Instead of being the centre of attention, computers should be so natural that they would vanish into the human environment, becoming not only truly pervasive but also effectively invisible and unobtrusive to the user. This calls not only for smaller, cheaper and lower-power computers, but also for equally convenient display solutions that can be harmoniously integrated into our surroundings. With the advent of Printed Electronics, new ways to link the physical and digital worlds became available. By combining common printing techniques, such as inkjet printing, with electro-optical functional inks, it is becoming possible not only to mass-produce extremely thin, flexible and cost-effective electronic circuits, but also to introduce electronic functionalities into products where they were previously unavailable. Indeed, Printed Electronics is enabling the creation of novel sensing and display elements for interactive devices, free of form-factor constraints. At the same time, the increasing availability and affordability of digital fabrication technologies, namely 3D printers, to the average consumer is fostering a new industrial (digital) revolution and the democratisation of innovation. Nowadays, end-users are already able to custom-design and manufacture their own physical products on demand, according to their own needs. In the future, they will be able to fabricate interactive digital devices with user-specific form and functionality from the comfort of their homes.
This thesis explores how task-specific, low-computation, interactive devices capable of presenting dynamic visual information can be created using Printed Electronics technologies, following an approach based on the ideals behind Personal Fabrication. Focus is given to the use of printed electrochromic displays as a medium for delivering dynamic digital information. Several approaches are highlighted and categorised according to the architecture of the displays. Furthermore, a pictorial computation model based on extended cellular automata principles is used to programme dynamic simulation models into matrix-based electrochromic displays. Envisaged applications include the modelling of physical, chemical, biological, and environmental phenomena.
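The extended cellular automata of the thesis are not reproduced here, but the principle of driving a matrix display from purely local rules can be sketched with a plain Conway-style binary automaton, where each cell maps to one electrochromic pixel (1 = coloured, 0 = bleached):

```python
import numpy as np

def ca_step(grid):
    """One synchronous update of a binary cellular automaton on a toroidal
    grid (Conway's Life rules): each pixel's next state depends only on its
    eight neighbours, so no global frame logic is needed beyond the rule."""
    neigh = sum(np.roll(np.roll(grid, i, axis=0), j, axis=1)
                for i in (-1, 0, 1) for j in (-1, 0, 1)) - grid
    return ((neigh == 3) | ((grid == 1) & (neigh == 2))).astype(int)

# a "blinker": three coloured pixels oscillating with period 2
frame = np.zeros((5, 5), dtype=int)
frame[1:4, 2] = 1
print(np.array_equal(ca_step(ca_step(frame)), frame))  # -> True
```

On an actual matrix display, each update would be written out as one refresh of the electrochromic pixels.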
Abstract:
This paper aims to provide a model that allows BPI to measure the credit risk, through its rating scale, of the subsidiaries included in the corporate groups that are its clients. This model should be simple enough to be applied in practice, accurate, and consistent with the ratings historically given by the bank. The proposed model includes operational, strategic, and financial factors and yields one of three results: no support, partial support, or full support from the holding to the subsidiary; each of these translates into adjustments to the subsidiary's credit rating. As would be expected, most subsidiaries should have the same credit rating as their parent company.
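The mapping from support level to rating adjustment can be pictured as notching on a rating scale. The scheme below is purely hypothetical (the paper's exact adjustments are not given, and `SCALE` is a generic letter-grade scale, not BPI's internal one):

```python
SCALE = ["AAA", "AA", "A", "BBB", "BB", "B", "CCC"]

def adjusted_rating(standalone, parent, support):
    """Hypothetical notching sketch: 'full' support aligns the subsidiary
    with its parent, 'partial' moves it one notch toward the parent's
    rating, and 'none' leaves the standalone rating unchanged."""
    s, p = SCALE.index(standalone), SCALE.index(parent)
    if support == "full":
        return parent
    if support == "partial" and p != s:
        return SCALE[s - 1] if p < s else SCALE[s + 1]
    return standalone

print(adjusted_rating("BB", "A", "partial"))  # -> BBB, one notch up
```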
Abstract:
Electric Vehicles (EVs) have limited energy storage capacity and the maximum autonomy range is strongly dependent of the driver's behaviour. Due to the fact of that batteries cannot be recharged quickly during a journey, it is essential that a precise range prediction is available to the driver of the EV. With this information, it is possible to check if the desirable destination is achievable without a stop to charge the batteries, or even, if to reach the destination it is necessary to perform an optimized driving (e.g., cutting the air-conditioning, among others EV parameters). The outcome of this research work is the development of an Electric Vehicle Assistant (EVA). This is an application for mobile devices that will help users to take efficient decisions about route planning, charging management and energy efficiency. Therefore, it will contribute to foster EVs adoption as a new paradigm in the transportation sector.
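A range predictor of this kind can start from a very simple energy balance: remaining battery energy divided by per-kilometre consumption, with auxiliary loads such as air-conditioning converted into an equivalent per-kilometre draw via the average speed. All names and parameter values below are illustrative assumptions, not EVA's actual model:

```python
def predicted_range_km(soc_kwh, base_wh_per_km, aux_kw=0.0, avg_speed_kmh=50.0):
    """Rough remaining-range estimate: battery energy over total per-km
    consumption. A constant auxiliary power draw (e.g. air-conditioning)
    costs aux_kw / speed extra energy per kilometre travelled."""
    aux_wh_per_km = 1000.0 * aux_kw / avg_speed_kmh
    return 1000.0 * soc_kwh / (base_wh_per_km + aux_wh_per_km)

# 20 kWh left, 150 Wh/km driving load, 1.5 kW of A/C at 50 km/h average
print(round(predicted_range_km(20, 150, aux_kw=1.5), 1))  # -> 111.1
```

Switching off the auxiliary load in this example recovers about 22 km of range, which is exactly the kind of trade-off such an assistant can surface to the driver.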
Numerical Assessment of the out-of-plane response of a brick masonry structure without box behaviour
Abstract:
This paper presents the assessment of the out-of-plane response to seismic loading of a masonry structure without a rigid diaphragm. The structure corresponds to a real-scale brick masonry specimen with a main façade connected to two return walls. Two modelling approaches were defined for this evaluation: the first consisted of macro modelling, the second of simplified micro modelling. As a first step, static nonlinear analyses were conducted on the macro model to evaluate the out-of-plane response and failure mechanism of the masonry structure. A sensitivity analysis was performed to assess mesh-size and material-model dependency. In addition, the macro models were subjected to dynamic nonlinear analyses with time integration in order to assess the collapse mechanism. Finally, these analyses were also applied to a simplified micro model of the structure, and the results were compared to the experimental response from shaking table tests. It was observed that these numerical techniques correctly simulate the in-plane behaviour of masonry structures. However, the
Abstract:
The Our Lady of Conception church is located in the village of Monforte (Portugal) and is currently not in use. The church presents structural damage and, consequently, a study was carried out, involving a damage survey, dynamic identification tests under ambient vibration and numerical analysis. The church consists of the central nave, the chancel, the sacristy and the corridor giving access to the pulpit. The masonry walls have different thicknesses, namely 0.65 m in the chancel, 0.70 m in the sacristy, 0.92 m in the central nave and 0.65 m in the corridor, and are reinforced by eight buttresses of different dimensions. The total longitudinal and transversal dimensions of the church are equal to 21.10 m and 14.26 m, respectively. The damage survey showed that, in general, the masonry walls are in good condition, with the exception of the transversal walls of the nave, which present severe cracks. The arches of the vault also present severe cracks along the central nave; as a consequence, water infiltration has increased the degradation of the vault and paintings. Furthermore, the foundations present settlements in the southwest direction. The dynamic identification tests were carried out under ambient wind excitation, using 12 high-sensitivity piezoelectric accelerometers, and allowed the estimation of the dynamic properties of the church, namely frequencies, mode shapes and damping ratios. A FEM numerical model was prepared and calibrated based on the first four experimental modes estimated in the dynamic identification tests; the average error between the experimental and numerical frequencies of these modes is equal to 5%. After calibration of the numerical model, pushover analyses with a load pattern proportional to the mass, in the transversal and longitudinal directions of the church, were performed.
The results of the numerical analyses allow the conclusion that the most vulnerable direction of the church is the transversal one, with a maximum load factor equal to 0.35.
Abstract:
The dearth of knowledge on the load resistance mechanisms of log houses and the need for numerical models capable of simulating the actual behaviour of these structures have pushed efforts to research the relatively unexplored aspects of log house construction. The aim of the research presented in this paper is to build a working model of a log house that will contribute toward understanding the behaviour of these structures under seismic loading. The paper presents the results of a series of shaking table tests conducted on a log house and goes on to develop a numerical model of the tested house. The finite element model has been created in SAP2000 and validated against the experimental results. The modelling assumptions and the difficulties involved in the process are described and, finally, a discussion on the effects of varying different physical and material parameters on the model's results is drawn up.
Abstract:
Forming suitable learning groups is one of the factors that determine the efficiency of collaborative learning activities. However, only a few studies have addressed this problem in mobile learning environments. In this paper, we propose a new approach for automatic, customized, and dynamic group formation in Mobile Computer Supported Collaborative Learning (MCSCL) contexts. The proposed solution is based on the combination of three types of grouping criteria: learners' personal characteristics, learners' behaviours, and context information. Instructors can freely select the type, number, and weight of the grouping criteria, together with other settings such as the number, size, and type of learning groups (homogeneous or heterogeneous). Apart from a grouping mechanism, the proposed approach provides a flexible tool to monitor each learner and to manage the learning process from the beginning to the end of collaborative learning activities. In order to evaluate the quality of the implemented group formation algorithm, we compare its Average Intra-cluster Distance (AID) with that of a random group formation method. The results show that the proposed algorithm is more effective than the random method in forming homogeneous and heterogeneous groups.
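The AID metric can be made concrete: average the pairwise distances between members of the same group, pooled over all groups; lower values mean more homogeneous groups. A minimal sketch on hand-made one-dimensional learner features (illustrative data, not the paper's algorithm or dataset):

```python
import numpy as np

def average_intra_cluster_distance(X, groups):
    """Mean pairwise Euclidean distance between members of the same group,
    pooled over all groups (lower = more homogeneous grouping)."""
    dists = []
    for members in groups:
        pts = X[list(members)]
        for i in range(len(pts)):
            for j in range(i + 1, len(pts)):
                dists.append(np.linalg.norm(pts[i] - pts[j]))
    return float(np.mean(dists))

# nine learners described by a single score, grouped two different ways
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2], [10.0], [10.1], [10.2]])
similar = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]   # homogeneous grouping
mixed = [[0, 3, 6], [1, 4, 7], [2, 5, 8]]     # spread-out grouping
print(average_intra_cluster_distance(X, similar)
      < average_intra_cluster_distance(X, mixed))  # -> True
```

Whether a low or a high AID is desirable depends on whether the instructor requested homogeneous or heterogeneous groups.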
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation to the environment, conduction to the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
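The relative weight of these phenomena can be bounded with a lumped-capacitance estimate of a single filament cooling purely by convection. All property values below are illustrative (roughly of the order of an amorphous polymer), and the conduction and radiation terms of the full balance are deliberately omitted:

```python
import numpy as np

def filament_temperature(t, T0=210.0, T_env=25.0, h=60.0,
                         rho=1050.0, cp=2000.0, d=0.4e-3):
    """Lumped-capacitance cooling of a just-deposited cylindrical filament:
    dT/dt = -(h A)/(rho cp V) (T - T_env), with A/V = 4/d for a cylinder
    exchanging heat over its whole perimeter. Convection only; conduction
    to the support and inter-filament radiation are neglected here."""
    tau = rho * cp * d / (4.0 * h)      # thermal time constant, seconds
    return T_env + (T0 - T_env) * np.exp(-np.asarray(t, dtype=float) / tau)

# with these values tau = 3.5 s: after one time constant the excess
# temperature has dropped to 1/e of its initial value
print(round(float(filament_temperature(3.5)), 1))  # -> 93.1
```

Such an estimate helps decide which phenomena matter on the time scale of inter-filament bonding before committing to a full numerical model.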
Abstract:
In this study, we concentrate on modelling gross primary productivity using two simple approaches to simulate canopy photosynthesis: "big leaf" and "sun/shade" models. Two calibration approaches are used: scaling up canopy photosynthetic parameters from the leaf to the canopy level, and fitting canopy biochemistry to eddy covariance fluxes. The models are validated using eddy covariance data from the LBA site C14. Comparing the performance of both models, we conclude that both numerically (in terms of goodness of fit) and qualitatively (in terms of residual response to different environmental variables) the sun/shade model does a better job. Compared to the sun/shade model, the big leaf model shows a lower goodness of fit, fails to respond to variations in the diffuse fraction, and has skewed responses to temperature and VPD. The separate treatment of sun and shade leaves, combined with the separation of incoming light into direct beam and diffuse components, makes sun/shade a strong modelling tool that captures more of the observed variability in canopy fluxes as measured by eddy covariance. In conclusion, the sun/shade approach is a relatively simple and effective tool for modelling photosynthetic carbon uptake that could easily be included in many terrestrial carbon models.
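The core of the sun/shade treatment is the partition of leaf area between sunlit leaves (which receive direct beam plus diffuse light) and shaded leaves (diffuse only). A minimal sketch of that partition using Beer's-law attenuation of the direct beam, in the style of de Pury and Farquhar's canopy model (the extinction coefficient `k_beam` is an illustrative value):

```python
import numpy as np

def sunlit_shaded_lai(lai, k_beam=0.5):
    """Split total leaf area index (LAI) into sunlit and shaded parts.
    The sunlit fraction of leaves at cumulative depth L is exp(-k_beam*L),
    so integrating over the canopy gives a sunlit LAI of
    (1 - exp(-k_beam * lai)) / k_beam."""
    lai_sun = (1.0 - np.exp(-k_beam * lai)) / k_beam
    return lai_sun, lai - lai_sun

sun, shade = sunlit_shaded_lai(6.0)
print(round(sun, 2), round(shade, 2))  # most leaves in a dense canopy are shaded
```

Photosynthesis is then computed separately for the two leaf classes and summed, which is what lets the model respond to changes in the diffuse fraction that a big-leaf model cannot see.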