989 results for Uncertainty Modelling


Relevance:

20.00%

Publisher:

Abstract:

With the projected growth of the world population, hand in hand with the development of an increasing number of countries, the growing demand for basic chemical building blocks, such as ethylene and propylene, will have to be properly addressed in the coming decades. Methanol-to-olefins (MTO) is an attractive route to produce these alkenes from coal, gas or alternative sources, such as biomass, via syngas as the feedstock for methanol production. The technology has been widely applied since 1985, and most processes use zeolites as catalysts, particularly ZSM-5. Although its selectivity is not especially biased towards light olefins, ZSM-5 resists rapid deactivation by coke deposition, making it quite attractive in industrial environments. Nevertheless, the reaction is highly exothermic, which makes it hard to control and to anticipate problems, such as temperature runaways or hot spots, inside the catalytic bed. The main focus of this project is to study these temperature effects on two fronts: an experimental front, where the catalytic performance and the temperature profiles are studied, and a modelling front, which consists of a five-step strategy to predict the weight fractions and the activity. The mind-set of catalytic testing is present in all the developed assays. It was verified that the selectivity towards light olefins increases with temperature, although this also leads to much faster catalyst deactivation. To counter this effect, experiments were carried out with a diluted bed, which increased the catalyst lifetime by 32% to 47%. Additionally, experiments with three thermocouples placed inside the catalytic bed were performed, analysing the deactivation wave and the temperature peaks throughout the bed.
Regeneration was performed between consecutive runs, and it was concluded that it can be a powerful means of increasing the catalyst lifetime while maintaining a constant selectivity towards light olefins, through the loss of acid strength in a steam-stabilised zeolitic structure. On the modelling front, the work led to the construction of a basic model able to predict weight fractions, which should be further tuned into a tool for predicting deactivation and temperature profiles.
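The reported dilution effect can be illustrated with a minimal first-order deactivation sketch. Everything numeric here is assumed for illustration (the deactivation constants `k_undiluted` and `k_diluted` are invented, not taken from the experiments):

```python
import math

def activity(t_hours, k_d):
    """First-order deactivation law: a(t) = exp(-k_d * t)."""
    return math.exp(-k_d * t_hours)

def lifetime(k_d, a_min=0.5):
    """Time (h) until activity falls below the threshold a_min."""
    return -math.log(a_min) / k_d

# Hypothetical deactivation constants for an undiluted vs. diluted bed.
k_undiluted = 0.050  # 1/h (illustrative)
k_diluted = 0.035    # 1/h (illustrative: dilution slows coking)

gain = lifetime(k_diluted) / lifetime(k_undiluted) - 1.0
print(f"Lifetime gain from dilution: {gain:.0%}")
```

With these illustrative constants the lifetime gain falls inside the 32-47% range reported above, but the real kinetics of coke deposition are more complex than a single exponential.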

Relevance:

20.00%

Publisher:

Abstract:

This paper aims to provide a model that allows BPI to measure the credit risk, on its rating scale, of the subsidiaries within the corporate groups that are its clients. The model should be simple enough to be applied in practice, accurate, and consistent with the ratings historically assigned by the bank. The proposed model includes operational, strategic, and financial factors, and yields one of three outcomes: no support, partial support, or full support from the holding company to the subsidiary, each of which translates into an adjustment of the subsidiary's credit rating. As would be expected, most subsidiaries should end up with the same credit rating as their parent company.
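A minimal sketch of how such a support classification could drive rating adjustments. The factor weights, thresholds and notch adjustments below are invented for illustration; the paper's actual model is not reproduced here:

```python
from dataclasses import dataclass

@dataclass
class SubsidiaryProfile:
    # Illustrative 0-1 scores; the study's actual factors and weights differ.
    operational: float
    strategic: float
    financial: float

def support_level(p: SubsidiaryProfile) -> str:
    """Classify holding support as 'none', 'partial' or 'full' from a weighted score."""
    score = 0.3 * p.operational + 0.4 * p.strategic + 0.3 * p.financial
    if score >= 0.75:
        return "full"
    if score >= 0.40:
        return "partial"
    return "none"

def adjusted_rating(parent_notch: int, level: str) -> int:
    """Notch the subsidiary rating relative to the parent (lower = better; scale hypothetical)."""
    adjustment = {"full": 0, "partial": 2, "none": 4}[level]
    return parent_notch + adjustment

# A strongly integrated subsidiary inherits the parent rating unchanged.
print(support_level(SubsidiaryProfile(0.9, 0.8, 0.9)))
```

The "full support" branch reproducing the parent rating matches the paper's observation that most subsidiaries should carry the same rating as their parent.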

Relevance:

20.00%

Publisher:

Abstract:

Relationships between the accuracy and speed of decision-making, or speed-accuracy tradeoffs (SAT), have been extensively studied. However, the range of SAT observed varies widely across studies for reasons that are unclear. Several explanations have been proposed, including motivation or incentive for speed vs. accuracy, species, and modality, but none of these hypotheses has been tested directly. An alternative explanation is that the different degrees of SAT are related to the nature of the task being performed. Here, we addressed this problem by comparing SAT in two odor-guided decision tasks that were identical except for the nature of the task uncertainty: an odor mixture categorization task, where the distinguishing information is reduced by making the stimuli more similar to each other, and an odor identification task, in which the information is reduced by lowering the intensity over a range of three log steps. (...)
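Speed-accuracy tradeoffs of this kind are often formalised with a drift-diffusion model, in which reducing stimulus information lowers the drift rate. The abstract does not specify this model; the sketch below merely illustrates the tradeoff using the standard closed-form expressions for a symmetric two-boundary diffusion process:

```python
import math

def ddm_accuracy(drift, bound, noise=1.0):
    """P(correct) for a drift-diffusion model with symmetric bounds at +/-bound."""
    return 1.0 / (1.0 + math.exp(-2.0 * bound * drift / noise**2))

def ddm_mean_dt(drift, bound, noise=1.0):
    """Mean decision time for the same model: (bound/drift) * tanh(bound*drift/noise^2)."""
    return (bound / drift) * math.tanh(bound * drift / noise**2)

# Lowering stimulus information (smaller drift) trades accuracy for time:
for drift in (2.0, 1.0, 0.5):
    print(drift, ddm_accuracy(drift, bound=1.0), ddm_mean_dt(drift, bound=1.0))
```

Whether the task manipulations in the two odor tasks map onto drift rate alone is exactly the kind of question the study probes; the sketch only fixes the vocabulary.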

Relevance:

20.00%

Publisher:

Abstract:

The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena that develop during the manufacturing sequence, including convection and radiation with the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments, and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
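As a rough illustration of the dominant cooling terms, here is a lumped-capacitance energy balance for one filament element, keeping only convection and radiation with the environment. All material properties and coefficients are assumed for illustration, not taken from the study:

```python
# Lumped-capacitance cooling of a deposited filament element (illustrative values).
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)
h = 30.0                  # convection coefficient, W/(m^2 K) (assumed)
eps = 0.9                 # emissivity (assumed)
T_env = 293.15            # ambient temperature, K
rho, cp = 1050.0, 2000.0  # polymer density (kg/m^3) and heat capacity (J/(kg K))
d = 0.3e-3                # filament diameter, m -> area/volume = 4/d for a cylinder

def cool(T0, dt=1e-3, t_end=2.0):
    """Explicit Euler on dT/dt = -(A/V) * [h*(T-Tenv) + eps*SIGMA*(T^4-Tenv^4)] / (rho*cp)."""
    T = T0
    av = 4.0 / d  # surface-to-volume ratio of a long cylinder
    for _ in range(int(t_end / dt)):
        q = h * (T - T_env) + eps * SIGMA * (T**4 - T_env**4)
        T -= dt * av * q / (rho * cp)
    return T

print(cool(T0=483.15))  # extrusion at ~210 C cools toward ambient
```

The full problem adds conduction to the support and between filaments, radiation between filaments and convection with entrapped air, which is precisely the bookkeeping the paper carries out.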

Relevance:

20.00%

Publisher:

Abstract:

In this study, we concentrate on modelling gross primary productivity using two simple approaches to simulating canopy photosynthesis: "big leaf" and "sun/shade" models. Two approaches to calibration are used: scaling up canopy photosynthetic parameters from the leaf to the canopy level, and fitting the canopy biochemistry to eddy covariance fluxes. The models are validated against eddy covariance data from the LBA site C14. Comparing the performance of the two models, we conclude that both numerically (in terms of goodness of fit) and qualitatively (in terms of the residual response to different environmental variables), the sun/shade model does a better job. Compared to the sun/shade model, the big leaf model shows a lower goodness of fit, fails to respond to variations in the diffuse fraction, and has skewed responses to temperature and VPD. The separate treatment of sun and shade leaves, combined with the separation of the incoming light into direct beam and diffuse components, makes sun/shade a strong modelling tool that captures more of the observed variability in canopy fluxes as measured by eddy covariance. In conclusion, the sun/shade approach is a relatively simple and effective tool for modelling photosynthetic carbon uptake that could easily be included in many terrestrial carbon models.
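The qualitative difference between the two approaches can be sketched with a saturating leaf light-response curve: splitting the canopy into sunlit and shaded fractions makes canopy GPP sensitive to the diffuse fraction, while a single big leaf driven by total PAR is not. All parameters below are illustrative, not the calibrated values from the study:

```python
def leaf_assimilation(par, a_max=20.0, k=400.0):
    """Rectangular-hyperbola light response (umol CO2 m-2 s-1); parameters assumed."""
    return a_max * par / (par + k)

def big_leaf_gpp(par_total, lai=5.0):
    """'Big leaf': the whole canopy sees the mean light, so only total PAR matters."""
    return lai * leaf_assimilation(par_total / lai)

def sun_shade_gpp(par_direct, par_diffuse, lai=5.0, f_sun=0.4):
    """'Sun/shade': sunlit leaves get direct beam plus diffuse, shaded leaves diffuse only."""
    lai_sun = f_sun * lai
    lai_shade = lai - lai_sun
    a_sun = leaf_assimilation(par_direct / lai_sun + par_diffuse / lai)
    a_shade = leaf_assimilation(par_diffuse / lai)
    return lai_sun * a_sun + lai_shade * a_shade

# Same total PAR, different diffuse fraction: only sun/shade responds.
print(big_leaf_gpp(1000.0))
print(sun_shade_gpp(800.0, 200.0), sun_shade_gpp(200.0, 800.0))
```

Because the light response saturates, concentrating light on few sunlit leaves wastes photons, so a higher diffuse fraction raises sun/shade GPP. This is the mechanism behind the big leaf model's failure to respond to diffuse-fraction variations noted above.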

Relevance:

20.00%

Publisher:

Abstract:

[Excerpt] A large number of constitutive equations have been developed for viscoelastic fluids, some empirical and others with strong physical foundations. The currently available macroscopic constitutive equations can be divided into two main types: differential and integral. Some constitutive equations, e.g. the Maxwell model, are available in both differential and integral forms. However, relevant integral models, like K-BKZ, possess only the integral form. (...)
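For the linear Maxwell model mentioned above, the equivalence of the two types can be made explicit. With relaxation time \(\lambda\) and viscosity \(\eta\), the differential form and the integral form obtained from it by an integrating factor are

```latex
\tau + \lambda \frac{\partial \tau}{\partial t} = \eta \, \dot{\gamma},
\qquad
\tau(t) = \int_{-\infty}^{t} \frac{\eta}{\lambda}\,
          e^{-(t - t')/\lambda}\, \dot{\gamma}(t')\, \mathrm{d}t' .
```

This is only the linear case: the upper-convected Maxwell model replaces the partial time derivative with the upper-convected derivative, and models like K-BKZ generalise the integral kernel in ways that admit no differential counterpart.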

Relevance:

20.00%

Publisher:

Abstract:

The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data at a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 4.7 fb⁻¹. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ p_T^jet < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty, of less than 1%, is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ p_T^jet < 500 GeV. For central jets at lower p_T, the uncertainty is about 3%. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for p_T^jet > 1 TeV. The calibration of forward jets is derived from dijet p_T balance measurements. The resulting uncertainty reaches its largest value of 6% for low-p_T jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5–3%.
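The core idea of an in situ pT-balance residual correction can be sketched in a few lines. The real ATLAS procedure combines several techniques with careful binning and uncertainty propagation; every number below is invented:

```python
# Toy residual JES correction from photon-jet pT balance (illustrative values only).

def response(jet_pts, ref_pts):
    """Mean balance <pT_jet / pT_ref> in a calibration sample."""
    ratios = [j / r for j, r in zip(jet_pts, ref_pts)]
    return sum(ratios) / len(ratios)

def residual_correction(data_response, mc_response):
    """Scale factor applied to data jets so the data balance matches the MC balance."""
    return mc_response / data_response

# Invented jet/reference pT pairs (GeV) for data; MC assumed perfectly balanced here.
data_r = response([98.0, 102.5, 95.0], [100.0, 105.0, 99.0])
mc_r = 1.00
print(residual_correction(data_r, mc_r))
```

In practice the correction is derived per pT and |η| bin, and its spread across methods and samples feeds the quoted systematic uncertainty.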

Relevance:

20.00%

Publisher:

Abstract:

Doctoral Programme in Mathematics and Applications.

Relevance:

20.00%

Publisher:

Abstract:

Project management involves one-time endeavors that demand getting it right the first time. Yet project scheduling, one of the most modeled stages of the project management process, still faces a wide gap between theory and practice. Demanding computational models, and their consequent call for simplification, divert the implementation of such models in project management tools away from the actual day-to-day project management process. Special focus is placed on the robustness of the generated project schedules in the face of the omnipresence of uncertainty. An "easy" way out is to add, more or less cleverly calculated, time buffers, which always increase the project duration and, correspondingly, its cost. A better approach to dealing with uncertainty seems to be to exploit the slack that may be present in a given project schedule, a fortiori when a non-optimal schedule is used. Combining this approach with recent advances in modeling resource allocation and scheduling techniques that cope with the increasing flexibility of resources, as expressed in "Flexible Resource Constraint Project Scheduling Problem" (FRCPSP) formulations, should be a promising line of research towards more adequate project management tools. In practice, this approach has frequently been used by project managers in an ad hoc way.
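The slack the text proposes to exploit is the classical total float of critical-path analysis. A minimal sketch on a toy four-activity network (the activities, durations and precedences are invented):

```python
# Toy precedence network: activity -> duration (days) and predecessors.
durations = {"A": 3, "B": 2, "C": 4, "D": 2}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

def total_float(durations, preds):
    """Forward/backward CPM pass; assumes the dict keys are topologically sorted."""
    order = list(durations)
    es = {}  # earliest start
    for a in order:
        es[a] = max((es[p] + durations[p] for p in preds[a]), default=0)
    project_end = max(es[a] + durations[a] for a in order)
    succs = {a: [b for b in order if a in preds[b]] for a in order}
    lf = {}  # latest finish
    for a in reversed(order):
        lf[a] = min((lf[s] - durations[s] for s in succs[a]), default=project_end)
    # Total float = latest finish minus earliest finish.
    return {a: lf[a] - (es[a] + durations[a]) for a in order}

print(total_float(durations, preds))  # B has 2 days of slack; A, C, D are critical
```

Activities with zero float form the critical path; the float on the others is the "free" robustness a scheduler can spend against uncertainty before resorting to explicit time buffers.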

Relevance:

20.00%

Publisher:

Abstract:

Integrated master's dissertation in Biomedical Engineering (Biomaterials, Biomechanics and Rehabilitation).

Relevance:

20.00%

Publisher:

Abstract:

The research aimed to establish tyre-road noise models using a Data Mining approach that made it possible to build a predictive model and to assess the importance of the tested input variables. The data modelling considered three learning algorithms and three metrics to define the best predictive model. The variables tested included basic properties of pavement surfaces, macrotexture, megatexture and unevenness and, for the first time, damping. The importance of those variables was also measured using a sensitivity analysis procedure. Two types of models were set up: one with basic variables and another with complex variables, such as megatexture and damping, all as a function of vehicle speed. More detailed models were additionally set up by speed level. As a result, several models with very good tyre-road noise predictive capacity were achieved. The most relevant variables were speed, temperature, aggregate size, Mean Profile Depth and damping, which had the highest importance, even though influenced by speed. Megatexture and IRI had the lowest importance. The models developed in this work are applicable to truck tyre-noise prediction, represented by the AVON V4 test tyre, at the early stage of road pavement use. The obtained models are therefore highly useful for pavement design and for noise prediction by road authorities and contractors.
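A model of the general shape such studies approximate can be sketched as a linear function of log speed, temperature and Mean Profile Depth. The functional form and all coefficients below are invented for illustration; they are not the fitted values from this study:

```python
import math

def noise_level(speed_kmh, temp_c, mpd_mm, a=40.0, b=33.0, c=-0.1, d=2.0):
    """Hypothetical tyre-road noise level in dB(A):
    L = a + b*log10(speed) + c*temperature + d*MPD (coefficients assumed)."""
    return a + b * math.log10(speed_kmh) + c * temp_c + d * mpd_mm

print(noise_level(80.0, 20.0, 1.0))  # e.g. a truck tyre at 80 km/h
```

The signs encode the qualitative findings above: noise rises steeply with speed, falls slightly with temperature, and grows with surface texture depth. The study's data-mining models capture such effects without committing to a fixed functional form.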

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the concept, technical realisation and validation of a largely data-driven method to model events with Z→ττ decays. In Z→μμ events selected from proton-proton collision data recorded at √s = 8 TeV with the ATLAS experiment at the LHC in 2012, the Z decay muons are replaced by τ leptons from simulated Z→ττ decays at the level of reconstructed tracks and calorimeter cells. The τ lepton kinematics are derived from the kinematics of the original muons. Thus, only the well-understood decays of the Z boson and τ leptons as well as the detector response to the τ decay products are obtained from simulation. All other aspects of the event, such as the Z boson and jet kinematics as well as effects from multiple interactions, are given by the actual data. This so-called τ-embedding method is particularly relevant for Higgs boson searches and analyses in ττ final states, where Z→ττ decays constitute a large irreducible background that cannot be obtained directly from data control samples.
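The kinematic replacement step can be caricatured as follows: keep each data muon's energy and momentum direction, and recompute the momentum magnitude for the τ mass. This is only a toy version of one step under assumed conventions; the full method operates on reconstructed tracks and calorimeter cells:

```python
import math

M_TAU = 1.77686  # tau mass, GeV

def embed_tau(muon_e, muon_px, muon_py, muon_pz):
    """Keep the muon's energy and momentum direction; rescale |p| to the tau mass."""
    p_mu = math.sqrt(muon_px**2 + muon_py**2 + muon_pz**2)
    p_tau = math.sqrt(max(muon_e**2 - M_TAU**2, 0.0))  # same energy, tau mass shell
    scale = p_tau / p_mu
    return (muon_e, muon_px * scale, muon_py * scale, muon_pz * scale)

# A muon four-vector (GeV) with the muon mass 0.10566 GeV built in:
pz_mu = math.sqrt(45.0**2 - 30.0**2 - 20.0**2 - 0.10566**2)
e, px, py, pz = embed_tau(45.0, 30.0, 20.0, pz_mu)
print(e, px, py, pz)  # same energy and direction, now on the tau mass shell
```

The embedded τ four-vectors then seed the simulated Z→ττ decay, while everything else in the event (jets, pile-up, underlying event) is taken unchanged from the data.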

Relevance:

20.00%

Publisher:

Abstract:

Doctoral thesis in Polymer and Composite Science and Engineering.