906 results for Probabilistic estimation
Abstract:
The adhesive bonding technique enables both weight and complexity reductions in structures that require a joining technique on account of fabrication or component-shape issues. For this reason, adhesive bonding is also one of the main repair methods for metal and composite structures, through the strap and scarf configurations. The availability of strength prediction techniques for adhesive joints is essential for their generalized application, and these can rely on different approaches, such as mechanics of materials, conventional fracture mechanics or damage mechanics. The latter two techniques depend on the measurement of the fracture toughness (GC) of materials. Within the framework of damage mechanics, a valid option is the use of Cohesive Zone Modelling (CZM) coupled with Finite Element (FE) analyses. In this work, CZM laws for adhesive joints were estimated for three adhesives with varying ductility. The End-Notched Flexure (ENF) test geometry was selected based on overall test simplicity and accuracy of results. The adhesives Araldite® AV138, Araldite® 2015 and Sikaforce® 7752 were studied between high-strength aluminium adherends. Estimation of the CZM laws was carried out by an inverse methodology based on a curve-fitting procedure, which enabled a precise estimation of the adhesive joints' behaviour. The work allowed the conclusion that a unique set of shear fracture toughness (GIIC) and shear cohesive strength (ts0) exists for each specimen that accurately reproduces the adhesive layer's behaviour. With this information, accurate strength prediction of adhesive joints in shear becomes possible by CZM.
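A minimal sketch of the inverse identification idea (not the authors' implementation): the pair (GIIC, ts0) is adjusted until a modelled ENF load-displacement curve matches the measured one in a least-squares sense. The function simulate_enf below is a hypothetical placeholder for the FE/CZM simulation, and all values are illustrative.

import numpy as np
from scipy.optimize import minimize

def simulate_enf(GIIc, ts0, delta):
    # Placeholder (assumption) for an FE/CZM simulation of the ENF specimen that
    # returns the predicted load P for each applied displacement delta.
    return ts0 * np.tanh(delta / 0.5) * np.exp(-delta**2 / (8.0 * GIIc))

def misfit(params, delta_exp, P_exp):
    GIIc, ts0 = params
    P_num = simulate_enf(GIIc, ts0, delta_exp)
    return np.sum((P_num - P_exp) ** 2)  # least-squares distance between the curves

# delta_exp, P_exp stand for the measured ENF load-displacement data.
rng = np.random.default_rng(0)
delta_exp = np.linspace(0.0, 5.0, 200)
P_exp = simulate_enf(0.6, 15.0, delta_exp) + rng.normal(0.0, 0.05, delta_exp.size)

result = minimize(misfit, x0=[1.0, 10.0], args=(delta_exp, P_exp),
                  bounds=[(0.1, 5.0), (1.0, 40.0)], method="L-BFGS-B")
GIIc_fit, ts0_fit = result.x
print(f"estimated GIIc ~ {GIIc_fit:.2f}, ts0 ~ {ts0_fit:.1f}")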
Abstract:
This work aims to shed some light on longshore sediment transport (LST) on the highly energetic northwest coast of Portugal. Data obtained through a sand-tracer experiment are compared with estimates from the original and the newly re-evaluated longshore sediment transport formulas (the USACE Waterways Experiment Station's Coastal Engineering and Research Center, Kamphuis, and Bayram bulk formulas) to assess their performance. The field experiment with dyed sand was held at Ofir Beach during one tidal cycle under medium wave-energy conditions. Local hydrodynamic conditions and beach topography were recorded. The tracer was driven southward in response to the local swell and wind- and wave-induced currents (Hsb = 0.75 m, Tp = 11.5 s, θb = 8–12°). The LST was estimated by using a linear sediment transport flux approach. The obtained value (2.3 × 10⁻³ m³·s⁻¹) approached the estimate provided by the original Bayram formula (2.5 × 10⁻³ m³·s⁻¹). The other formulas overestimated the transport, but the estimates resulting from the newly re-evaluated formulas also yielded approximate results. Therefore, the results of this work indicate that the Bayram formula may give satisfactory results for predicting longshore sediment transport on Ofir Beach.
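For reference, a sketch of the classical CERC-type bulk estimate in its textbook form (not the re-evaluated coefficients assessed in this work); the transport coefficient, sediment and water properties, and the mid-range breaking angle used below are illustrative assumptions.

import numpy as np

def cerc_lst(Hsb, theta_b_deg, K=0.39, rho=1025.0, rho_s=2650.0,
             n=0.4, gamma_b=0.78, g=9.81):
    # Bulk longshore transport rate Q [m^3/s] from breaking wave height and angle.
    theta = np.radians(theta_b_deg)
    E_b = rho * g * Hsb**2 / 8.0                  # wave energy at breaking [J/m^2]
    Cg_b = np.sqrt(g * Hsb / gamma_b)             # shallow-water group celerity [m/s]
    P_l = K * E_b * Cg_b * np.sin(theta) * np.cos(theta)  # longshore energy flux factor
    return P_l / ((rho_s - rho) * g * (1.0 - n))  # immersed weight to volume rate

# Reported conditions for the Ofir experiment; the angle is taken mid-range as an assumption.
print(f"Q_CERC ~ {cerc_lst(Hsb=0.75, theta_b_deg=10.0):.2e} m^3/s")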
Abstract:
In this work, an adaptive modeling and spectral estimation scheme based on dual Discrete Kalman Filtering (DKF) is proposed for speech enhancement. Both the speech and noise signals are modeled by an autoregressive structure, which provides an underlying time-frame dependency and improves time-frequency resolution. The model parameters are arranged to obtain a combined state-space model and are also used to calculate instantaneous power spectral density estimates. The speech enhancement is performed by a dual discrete Kalman filter that simultaneously gives estimates for the models and the signals. This approach is particularly useful as a pre-processing module for parametric-based speech recognition systems that rely on time-dependent spectral models. The system performance has been evaluated by a set of human listeners and by spectral distances. In both cases, the use of this pre-processing module has led to improved results.
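A simplified sketch of the filtering step for a single autoregressive source, assuming the AR coefficients are known; the dual scheme of the paper would additionally run a second Kalman filter to track the speech and noise model parameters. All signal values below are synthetic.

import numpy as np

def kalman_ar2(y, a1, a2, q, r):
    # Filter noisy observations y of an AR(2) signal with coefficients (a1, a2).
    F = np.array([[a1, a2], [1.0, 0.0]])   # AR(2) in companion (state-space) form
    H = np.array([[1.0, 0.0]])             # we observe the current sample plus noise
    Q = np.array([[q, 0.0], [0.0, 0.0]])   # driving-noise covariance
    x, P = np.zeros((2, 1)), np.eye(2)
    estimates = []
    for yk in y:
        x = F @ x                          # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                # update
        K = P @ H.T / S
        x = x + K * (yk - (H @ x).item())
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0, 0])
    return np.array(estimates)

# Synthetic example: an AR(2) "speech-like" signal buried in white noise.
rng = np.random.default_rng(0)
s = np.zeros(500)
for k in range(2, 500):
    s[k] = 1.5 * s[k-1] - 0.7 * s[k-2] + rng.normal(0, 0.1)
y = s + rng.normal(0, 0.3, s.size)
s_hat = kalman_ar2(y, 1.5, -0.7, q=0.01, r=0.09)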
Abstract:
The study of chemical diffusion in biological tissues is a research field of high importance, with applications in many clinical, research and industrial areas. The evaluation of the diffusion and viscosity properties of chemicals in tissues is necessary to characterize treatments or the inclusion of preservatives in tissues or organs for low-temperature conservation. Recently, we demonstrated experimentally that the diffusion properties and dynamic viscosity of sugars and alcohols can be evaluated from optical measurements. Our studies were performed in skeletal muscle, but our results have revealed that the same methodology can be used with other tissues and different chemicals. Considering the significant number of studies that can be made with this method, it becomes necessary to make data processing and calculation easier. With this objective, we have developed a software application that integrates all processing and calculations, making the researcher's work easier and faster. Using the same experimental data that was previously used to estimate the diffusion and viscosity of glucose in skeletal muscle, we repeated the calculations with the new application. Comparing the results obtained with the new application against those from the previous independent routines showed great similarity, thereby validating the application. This new tool is now available to be used in similar research to obtain the diffusion properties of other chemicals in different tissues or organs.
Abstract:
Dissertation presented to obtain the degree of Doctor in Informatics Engineering from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics
Abstract:
Research Project submitted as partial fulfilment for the Master's Degree in Statistics and Information Management
Abstract:
Dissertation to obtain the degree of Doctor in Statistics and Risk Management, specialization in Statistics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Economics from the NOVA – School of Business and Economics
Abstract:
Acute infections by the protozoan Toxoplasma gondii during pregnancy (gestational toxoplasmosis) are known to cause serious health problems in the fetus (congenital toxoplasmosis). In Brasília, there have been few studies on the incidence of toxoplasmosis. This report summarizes a retrospective study of 2,636 selected pregnant women attended by the public health system of Guará, a satellite city of Brasília. In this survey, 17 cases of gestational toxoplasmosis were detected; 15 were primary maternal infections and the remaining 2 were consistent with secondary maternal infection. These results suggest an annual seroconversion rate of 0.64 percent (90 percent confidence interval: 0.38, 0.90).
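A quick consistency check of the reported figures (the interval method used in the study is not stated; a normal approximation to the binomial proportion is assumed here):

import math

cases, n, z90 = 17, 2636, 1.645
p = cases / n
se = math.sqrt(p * (1 - p) / n)
print(f"rate = {100*p:.2f}%  90% CI = ({100*(p - z90*se):.2f}%, {100*(p + z90*se):.2f}%)")
# -> rate = 0.64%, 90% CI = (0.39%, 0.90%), close to the published (0.38, 0.90)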
Abstract:
Water is a limited resource for which demand is growing. Contaminated water from inadequate wastewater treatment poses one of the greatest health challenges, as it restricts development and increases poverty in emerging and developing countries. Therefore, the connection between wastewater and human health is linked to access to sanitation and to human waste disposal. Adequate sanitation is expected to create a barrier between disposed human excreta and sources of drinking water. Different approaches to wastewater management are required for different geographical regions and different stages of economic governance, depending on the capacity to manage wastewater. Effective wastewater management can contribute to overcoming the challenges of water scarcity. Separate collection of human urine at its source is one promising approach that strongly reduces the economic and load demands on wastewater treatment plants (WWTP). Treatment of source-separated urine appears to be a sanitation system that is affordable, produces a valuable fertiliser, reduces pollution of water resources and promotes health. However, the technical realisation of urine separation still faces challenges. Biological hydrolysis of urea causes a strong increase in ammonia and pH. Under these conditions, ammonia volatilises, which can cause odour problems and significant nitrogen losses. These problems can be avoided by urine stabilisation. Biological nitrification is a suitable process for the stabilisation of urine. Urine is a highly concentrated nutrient solution, which can lead to strong inhibition effects during bacterial nitrification and, in turn, to process instabilities. The major cause of instability is accumulation of the inhibitory intermediate compound nitrite, which can lead to process breakdown. Enhanced on-line nitrite monitoring can be applied in biological source-separated urine nitrification reactors as a sustainable and efficient way to improve reactor performance, avoiding reactor failures and eventual loss of biological activity. Spectrophotometry appears to be a promising candidate for the development and application of on-line nitrite monitoring. Spectroscopic methods together with chemometrics are presented in this work as a powerful tool for the estimation of nitrite concentrations. Principal component regression (PCR) is applied to the estimation of nitrite concentrations using an immersible UV sensor and off-line spectra acquisition. The effects of particles and of saturation on the UV absorbance spectra are investigated. The analysis allows the conclusion that (i) saturation has a substantial effect on nitrite estimation, whereas (ii) particles appear to have less impact on nitrite estimation. In addition, improper mixing, together with instabilities in the urine nitrification process, appears to significantly reduce the performance of the estimation model.
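A minimal sketch of the PCR calibration step on synthetic absorbance spectra (the wavelength grid, absorption band and concentration range below are stand-in assumptions, not the study's sensor data):

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
wavelengths = np.linspace(200, 400, 101)            # nm
nitrite = rng.uniform(0, 50, 80)                    # mg N/L, synthetic calibration set
band = np.exp(-((wavelengths - 355) / 20) ** 2)     # crude stand-in absorption band
spectra = np.outer(nitrite, band) + rng.normal(0, 0.05, (80, wavelengths.size))

pcr = make_pipeline(PCA(n_components=3), LinearRegression())
pcr.fit(spectra, nitrite)                           # calibrate on reference measurements

new_spectrum = 20.0 * band + rng.normal(0, 0.05, wavelengths.size)
print(f"estimated nitrite ~ {pcr.predict(new_spectrum[None, :])[0]:.1f} mg N/L")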
Abstract:
This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming there are variables ranging over continuous domains (represented as intervals) together with constraints over them (relations between variables), and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming “branch-and-prune” algorithms produce safe enclosures of all consistent scenarios. Dedicated algorithms for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies the calculation of an integral over these sets (quadrature). In this work we propose to extend the “branch-and-prune” algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach in order to cope with this problem and provide functionality in real time.
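A toy sketch of the hybrid scheme on a two-variable example: a tiny interval branch-and-prune pass encloses the consistent region of an illustrative constraint, and Monte Carlo sampling inside the resulting boxes estimates its probability under an assumed uniform prior (the constraint, domain and parameters are invented for illustration).

import numpy as np

rng = np.random.default_rng(2)
LO, HI = 0.8, 1.0   # illustrative constraint: LO <= x^2 + y^2 <= HI on [0,1]^2

def prune(box):
    # Keep a box only if the interval evaluation of x^2 + y^2 can intersect [LO, HI];
    # the function is monotone on [0,1]^2, so the interval bounds are the corner values.
    (xlo, xhi), (ylo, yhi) = box
    return xhi**2 + yhi**2 >= LO and xlo**2 + ylo**2 <= HI

def branch_and_prune(box, depth):
    if not prune(box):
        return []
    if depth == 0:
        return [box]
    (xlo, xhi), (ylo, yhi) = box
    xm, ym = (xlo + xhi) / 2, (ylo + yhi) / 2
    children = [((xlo, xm), (ylo, ym)), ((xm, xhi), (ylo, ym)),
                ((xlo, xm), (ym, yhi)), ((xm, xhi), (ym, yhi))]
    return [b for child in children for b in branch_and_prune(child, depth - 1)]

boxes = branch_and_prune(((0.0, 1.0), (0.0, 1.0)), depth=4)   # safe enclosure

# Monte Carlo quadrature inside the enclosure, assuming a uniform prior on [0,1]^2.
prob, n = 0.0, 5_000
for (xlo, xhi), (ylo, yhi) in boxes:
    xs, ys = rng.uniform(xlo, xhi, n), rng.uniform(ylo, yhi, n)
    sat = (xs**2 + ys**2 >= LO) & (xs**2 + ys**2 <= HI)
    prob += (xhi - xlo) * (yhi - ylo) * sat.mean()

print(f"P(consistent) ~ {prob:.3f}   (exact: {np.pi * (HI - LO) / 4:.3f})")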
Abstract:
The aim of this work project is to analyze the current algorithm used by EDP to estimate their clients' electrical energy consumption, create a new algorithm, and compare the advantages and disadvantages of both. The new algorithm differs from the current one in that it incorporates some effects of temperature variations. The comparison shows that the new algorithm with temperature variables performed better than the same algorithm without temperature variables, although there is still potential for further improvement of the current algorithm if the prediction model is estimated using a sample of daily data, as is the case for the current EDP algorithm.
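An illustrative sketch of the kind of comparison described, using synthetic daily data and a simple degree-day regression (this is not EDP's algorithm or data):

import numpy as np

rng = np.random.default_rng(3)
days = 730
temp = 16 + 8 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)
hdd = np.maximum(18 - temp, 0)                    # heating degree-days
cdd = np.maximum(temp - 22, 0)                    # cooling degree-days
load = 10 + 0.8 * hdd + 0.5 * cdd + rng.normal(0, 1, days)   # synthetic consumption

train, test = slice(0, 365), slice(365, days)

def fit_predict(X):
    beta, *_ = np.linalg.lstsq(X[train], load[train], rcond=None)
    return X[test] @ beta

X_base = np.ones((days, 1))                       # baseline: constant profile only
X_temp = np.column_stack([np.ones(days), hdd, cdd])

for name, X in [("without temperature", X_base), ("with temperature", X_temp)]:
    rmse = np.sqrt(np.mean((load[test] - fit_predict(X)) ** 2))
    print(f"{name}: out-of-sample RMSE = {rmse:.2f}")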
Abstract:
Ship tracking systems allow Maritime Organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, in recent years the geographical coverage of ship tracking platforms has increased significantly, from radar-based near-shore traffic monitoring towards a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master's project is a software prototype for the estimation of the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach makes use of a Genetic Algorithm applied to a training set of relevant ship positions extracted from the long-term storage tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a Maritime Safety expert.
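A toy sketch of the genetic-algorithm idea on synthetic data (planar coordinates instead of latitude/longitude, and invented positions and parameters): candidate routes are fixed-length waypoint lists scored by route length plus distance to the historical positions, evolved through tournament selection, arithmetic crossover and Gaussian mutation with elitism.

import numpy as np

rng = np.random.default_rng(4)
start, end = np.array([0.0, 0.0]), np.array([10.0, 2.0])
# Synthetic "historical positions" clustered along a curved lane between the endpoints.
t = rng.uniform(0.0, 1.0, 400)
lane = np.column_stack([10.0 * t, 2.0 * t + 1.5 * np.sin(np.pi * t)])
history = lane + rng.normal(0.0, 0.15, lane.shape)

N_WAY, POP, GEN = 5, 60, 150

def fitness(flat):
    route = np.vstack([start, flat.reshape(N_WAY, 2), end])
    length = np.sum(np.linalg.norm(np.diff(route, axis=0), axis=1))
    # distance from each waypoint to its nearest historical position
    near = np.min(np.linalg.norm(history[None, :, :] - route[1:-1, None, :], axis=2), axis=1)
    return length + 10.0 * near.sum()   # lower is better

init = np.linspace(start, end, N_WAY + 2)[1:-1].ravel()   # seed along the straight line
pop = init + rng.normal(0.0, 1.0, (POP, 2 * N_WAY))
for _ in range(GEN):
    scores = np.array([fitness(ind) for ind in pop])
    picks = np.array([min(rng.choice(POP, 2, replace=False), key=lambda i: scores[i])
                      for _ in range(POP)])                  # binary tournament selection
    parents = pop[picks]
    alpha = rng.uniform(size=(POP, 1))
    children = alpha * parents + (1.0 - alpha) * parents[::-1]   # arithmetic crossover
    children += rng.normal(0.0, 0.1, children.shape)             # Gaussian mutation
    children[0] = pop[np.argmin(scores)]                         # elitism: keep the best
    pop = children

best = pop[np.argmin([fitness(ind) for ind in pop])].reshape(N_WAY, 2)
print("estimated route waypoints:\n", best)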
Abstract:
The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. Therefore, the available data provide only an indirect measurement of the reference mechanical properties of the timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests, so as to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each typology of testing, there is still a need for a global methodology that combines information from different sources and allows inference upon that information in a decision process. In this scope, the present work presents the implementation of a probability-based framework for the safety assessment of existing timber elements. This methodology combines information gathered at different scales and follows a probabilistic framework that allows for the structural assessment of existing timber elements with the possibility of inference on, and updating of, their mechanical properties through Bayesian methods. The framework comprises four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is implemented in a case study. Data were obtained through a multi-scale experimental campaign carried out on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
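A minimal sketch of the Bayesian updating step under a conjugate normal model with known variance (the prior, variability and test values below are illustrative assumptions, not the case-study data):

import numpy as np

mu0, tau0 = 11.0, 1.5        # prior mean and std of the mean MOE [GPa]
sigma = 1.8                  # assumed known within-member variability [GPa]
tests = np.array([9.8, 10.5, 12.1, 10.9])   # hypothetical reference measurements [GPa]

n = tests.size
precision_post = 1 / tau0**2 + n / sigma**2
mu_post = (mu0 / tau0**2 + tests.sum() / sigma**2) / precision_post
tau_post = np.sqrt(1 / precision_post)

print(f"posterior mean MOE = {mu_post:.2f} GPa (std of the mean = {tau_post:.2f} GPa)")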