963 results for calibration estimation


Relevance: 20.00%
Publisher:
Abstract:

Dissertation submitted for the degree of Doctor in Statistics and Risk Management, speciality in Statistics

Relevance: 20.00%
Publisher:
Abstract:

A Work Project, presented as part of the requirements for the award of a Master's Degree in Economics from the NOVA – School of Business and Economics

Relevance: 20.00%
Publisher:
Abstract:

Acute infections by the protozoan Toxoplasma gondii during pregnancy (gestational toxoplasmosis) are known to cause serious health problems in the fetus (congenital toxoplasmosis). In Brasília, there have been few studies on the incidence of toxoplasmosis. This report summarizes a retrospective study of 2,636 selected pregnant women attended by the public health system of Guará, a satellite city of Brasília. In this survey, 17 cases of gestational toxoplasmosis were detected: 15 were primary maternal infections and the remaining 2 were consistent with secondary maternal infection. These results suggest an annual seroconversion rate of 0.64% (90% confidence interval: 0.38, 0.90).
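
The reported interval is consistent with a normal-approximation (Wald) confidence interval for the proportion 17/2,636. A minimal Python sketch of that calculation, assuming this is indeed the interval used, reproduces the reported figures to within rounding:

import math

cases, n = 17, 2636
p_hat = cases / n                         # observed seroconversion proportion
z = 1.645                                 # two-sided 90% normal quantile
half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)
low, high = p_hat - half_width, p_hat + half_width
print(f"{100 * p_hat:.2f}% (90% CI: {100 * low:.2f}, {100 * high:.2f})")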

Relevance: 20.00%
Publisher:
Abstract:

Water is a limited resource for which demand is growing. Contaminated water from inadequate wastewater treatment poses one of the greatest health challenges, as it restricts development and increases poverty in emerging and developing countries. The connection between wastewater and human health is therefore linked to access to sanitation and to human waste disposal. Adequate sanitation is expected to create a barrier between disposed human excreta and sources of drinking water. Different approaches to wastewater management are required for different geographical regions and different stages of economic governance, depending on the capacity to manage wastewater. Effective wastewater management can help overcome the challenges of water scarcity. Separate collection of human urine at its source is one promising approach that strongly reduces the economic and load demands on wastewater treatment plants (WWTPs). Treatment of source-separated urine appears to be a sanitation system that is affordable, produces a valuable fertiliser, reduces pollution of water resources and promotes health. However, the technical realisation of urine separation still faces challenges. Biological hydrolysis of urea causes a strong increase in ammonia and pH; under these conditions ammonia volatilises, which can cause odour problems and significant nitrogen losses. These problems can be avoided by urine stabilisation, and biological nitrification is a suitable process for stabilising urine. Urine is a highly concentrated nutrient solution, which can lead to strong inhibition effects during bacterial nitrification and, in turn, to process instabilities. The major cause of instability is accumulation of the inhibitory intermediate compound nitrite, which can lead to process breakdown. Enhanced on-line nitrite monitoring can be applied in biological source-separated urine nitrification reactors as a sustainable and efficient way to improve reactor performance, avoiding reactor failures and eventual loss of biological activity. Spectrophotometry is a promising candidate for the development and application of on-line nitrite monitoring. Spectroscopic methods together with chemometrics are presented in this work as a powerful tool for the estimation of nitrite concentrations. Principal component regression (PCR) is applied to the estimation of nitrite concentrations using an immersible UV sensor and off-line spectra acquisition. The effects of particles and of saturation on the UV absorbance spectra are investigated. The analysis leads to the conclusion that (i) saturation has a substantial effect on nitrite estimation, while (ii) particles appear to have less impact. In addition, improper mixing together with instabilities in the urine nitrification process appears to significantly reduce the performance of the estimation model.
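
As a concrete illustration of principal component regression on spectra, here is a minimal Python sketch using synthetic stand-in data; the number of spectra, the wavelength grid and the number of components are assumptions for illustration, not values from the study.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in data: 60 spectra sampled at 200 wavelengths, with
# nitrite concentrations loosely tied to a few wavelengths.
X = rng.normal(size=(60, 200))                              # UV absorbance spectra (rows = samples)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)   # nitrite concentration (arbitrary units)

# PCR = dimensionality reduction by PCA followed by ordinary least squares
pcr = make_pipeline(PCA(n_components=5), LinearRegression())
pcr.fit(X, y)

y_hat = pcr.predict(X)
print("Training RMSE:", round(float(np.sqrt(np.mean((y - y_hat) ** 2))), 4))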

Relevance: 20.00%
Publisher:
Abstract:

Since the invention of photography humans have been using images to capture, store and analyse the act that they are interested in. With the developments in this field, assisted by better computers, it is possible to use image processing technology as an accurate method of analysis and measurement. Image processing's principal qualities are flexibility, adaptability and the ability to easily and quickly process a large amount of information. Successful examples of applications can be seen in several areas of human life, such as biomedical, industry, surveillance, military and mapping. This is so true that there are several Nobel prizes related to imaging. The accurate measurement of deformations, displacements, strain fields and surface defects are challenging in many material tests in Civil Engineering because traditionally these measurements require complex and expensive equipment, plus time consuming calibration. Image processing can be an inexpensive and effective tool for load displacement measurements. Using an adequate image acquisition system and taking advantage of the computation power of modern computers it is possible to accurately measure very small displacements with high precision. On the market there are already several commercial software packages. However they are commercialized at high cost. In this work block-matching algorithms will be used in order to compare the results from image processing with the data obtained with physical transducers during laboratory load tests. In order to test the proposed solutions several load tests were carried out in partnership with researchers from the Civil Engineering Department at Universidade Nova de Lisboa (UNL).
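
For orientation, here is a minimal Python sketch of exhaustive block matching by sum of squared differences (SSD); the block size, search range and synthetic frames are illustrative assumptions, not the parameters used in the laboratory tests.

import numpy as np

def match_block(ref, cur, top, left, block=16, search=8):
    """Integer-pixel displacement of the block at (top, left) in `ref` that
    best matches a block in `cur`, using SSD within +/- `search` pixels."""
    template = ref[top:top + block, left:left + block].astype(float)
    best, best_dv, best_dh = np.inf, 0, 0
    for dv in range(-search, search + 1):
        for dh in range(-search, search + 1):
            r, c = top + dv, left + dh
            if r < 0 or c < 0 or r + block > cur.shape[0] or c + block > cur.shape[1]:
                continue
            candidate = cur[r:r + block, c:c + block].astype(float)
            ssd = np.sum((template - candidate) ** 2)
            if ssd < best:
                best, best_dv, best_dh = ssd, dv, dh
    return best_dv, best_dh  # vertical, horizontal displacement in pixels

# Example with synthetic frames: the second frame is the first shifted by (2, 3) pixels.
rng = np.random.default_rng(1)
frame0 = rng.random((120, 120))
frame1 = np.roll(frame0, shift=(2, 3), axis=(0, 1))
print(match_block(frame0, frame1, top=50, left=50))  # expected (2, 3)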

Relevance: 20.00%
Publisher:
Abstract:

The aim of this work project is to analyze the current algorithm used by EDP to estimate their clients' electrical energy consumption, to create a new algorithm, and to compare the advantages and disadvantages of both. The new algorithm differs from the current one in that it incorporates effects of temperature variations. The comparison shows that the algorithm with temperature variables performs better than the same algorithm without them, although there is still potential for further improvement of the current algorithm if the prediction model is estimated using a sample of daily data, as is the case for the current EDP algorithm.
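
As an illustration of how temperature terms can enter such a consumption model, here is a minimal Python sketch comparing ordinary least-squares fits with and without a heating-degree regressor; the variables, functional form and synthetic data are assumptions for illustration, not EDP's actual algorithm.

import numpy as np

rng = np.random.default_rng(2)
n = 365
temp = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, n)) + rng.normal(0, 2, n)  # daily mean temperature (degC)
heating = np.maximum(16 - temp, 0)                       # heating-degree term
consumption = 30 + 1.5 * heating + rng.normal(0, 1, n)   # synthetic daily consumption (kWh)

# Model A: intercept only (no temperature information)
X_a = np.ones((n, 1))
# Model B: intercept + heating-degree regressor
X_b = np.column_stack([np.ones(n), heating])

for name, X in [("without temperature", X_a), ("with temperature", X_b)]:
    beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)
    resid = consumption - X @ beta
    print(f"{name}: RMSE = {np.sqrt(np.mean(resid ** 2)):.2f} kWh")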

Relevance: 20.00%
Publisher:
Abstract:

Ship tracking systems allow maritime organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, in recent years the geographical coverage of ship tracking platforms has increased significantly, from radar-based near-shore traffic monitoring towards a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master's project is a software prototype for estimating the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach applies a genetic algorithm to a training set of relevant ship positions extracted from the long-term tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a maritime safety expert.
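
To make the genetic-algorithm idea concrete, here is a minimal Python skeleton; the route encoding (a fixed-length sequence of waypoint indices), the fitness function and all data are toy stand-ins, not the prototype's actual design.

import numpy as np

rng = np.random.default_rng(3)

waypoints = rng.random((50, 2)) * 10.0       # candidate waypoints (lon, lat), illustrative
historical = rng.random((500, 2)) * 10.0     # historical ship positions, illustrative
route_len, pop_size, generations = 8, 40, 100

def fitness(route):
    # Toy objective: routes whose waypoints lie close to many historical
    # positions score higher (smaller mean nearest-distance is better).
    pts = waypoints[route]
    d = np.linalg.norm(historical[:, None, :] - pts[None, :, :], axis=2)
    return -d.min(axis=1).mean()

pop = [rng.integers(0, len(waypoints), route_len) for _ in range(pop_size)]
for _ in range(generations):
    scores = np.array([fitness(ind) for ind in pop])
    new_pop = []
    for _ in range(pop_size):
        # Tournament selection of two parents
        a, b = rng.integers(0, pop_size, 2), rng.integers(0, pop_size, 2)
        p1 = pop[a[np.argmax(scores[a])]]
        p2 = pop[b[np.argmax(scores[b])]]
        cut = rng.integers(1, route_len)              # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        if rng.random() < 0.2:                        # mutation: replace one waypoint
            child[rng.integers(route_len)] = rng.integers(len(waypoints))
        new_pop.append(child)
    pop = new_pop

best = max(pop, key=fitness)
print("Best route (waypoint indices):", best)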

Relevance: 20.00%
Publisher:
Abstract:

The moisture content of concrete structures has an important influence on their behavior and performance. Several validated numerical approaches adopt the governing equation for relative humidity fields proposed in Model Code 1990/2010. Nevertheless, there is no integrative study that addresses the choice of parameters for simulating the humidity diffusion phenomenon, particularly with regard to the range of parameters put forward by Model Code 1990/2010. A program based on a finite difference method algorithm (1D and axisymmetric cases) is used to perform sensitivity analyses on the main parameters for a normal-strength concrete. Then, based on the conclusions of the sensitivity analyses, experimental results from nine different concrete compositions are analyzed. The program is used to identify the material parameters that best fit the experimental data. In general, the model was able to fit the experimental results satisfactorily, and new correlations were proposed, particularly focusing on the boundary transfer coefficient.
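
For reference, here is a minimal explicit finite-difference sketch of 1D humidity diffusion with a humidity-dependent diffusivity of the Model Code form; the slab thickness, material parameters, drying conditions and the simple Dirichlet boundary (instead of a surface transfer coefficient) are illustrative assumptions, not values from the study.

import numpy as np

# D(h) = D1 * (alpha + (1 - alpha) / (1 + ((1 - h) / (1 - hc)) ** n_exp)),
# the Model Code-style humidity-dependent diffusivity; parameters are illustrative.
D1, alpha, hc, n_exp = 1e-10, 0.05, 0.80, 15.0
L, nx = 0.10, 101                     # 0.10 m thick slab, 101 grid points
dx = L / (nx - 1)
dt = 0.2 * dx**2 / D1                 # conservative explicit time step

def D(h):
    return D1 * (alpha + (1 - alpha) / (1 + ((1 - h) / (1 - hc)) ** n_exp))

h = np.full(nx, 0.95)                 # initial internal relative humidity
h_env = 0.50                          # drying environment at both faces
days = 30
steps = int(days * 86400 / dt)

for _ in range(steps):
    h[0] = h[-1] = h_env              # Dirichlet boundary (perfect surface transfer)
    flux = D(0.5 * (h[1:] + h[:-1])) * (h[1:] - h[:-1]) / dx
    h[1:-1] += dt * (flux[1:] - flux[:-1]) / dx

print("Mid-depth relative humidity after", days, "days:", round(float(h[nx // 2]), 3))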

Relevance: 20.00%
Publisher:
Abstract:

Many texture measures have been developed and used for improving land-cover classification accuracy, but rarely has research examined the role of textures in improving the performance of aboveground biomass estimation. The relationship between texture and biomass is poorly understood. This paper used Landsat Thematic Mapper (TM) data to explore relationships between TM image textures and aboveground biomass in Rondônia, Brazilian Amazon. Eight grey-level co-occurrence matrix (GLCM) based texture measures (i.e., mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment, and correlation), associated with seven different window sizes (5×5, 7×7, 9×9, 11×11, 15×15, 19×19, and 25×25), and five TM bands (TM 2, 3, 4, 5, and 7) were analyzed. Pearson's correlation coefficient was used to analyze the relationships between texture and biomass. This research indicates that most textures are weakly correlated with successional vegetation biomass, but some textures are significantly correlated with mature forest biomass. In contrast, TM spectral signatures are significantly correlated with successional vegetation biomass, but weakly correlated with mature forest biomass. Our findings imply that textures may be critical in improving mature forest biomass estimation, but relatively less important for successional vegetation biomass estimation.
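
As an aside on how such texture measures are computed, here is a minimal numpy sketch of a grey-level co-occurrence matrix and two of the measures named above (contrast and homogeneity) for a single displacement of one pixel to the right; the quantization, window handling and synthetic band are simplified assumptions rather than the study's exact settings.

import numpy as np

def glcm(window, levels=16):
    """Symmetric, normalized GLCM for horizontal pixel pairs in `window`."""
    q = np.floor(window.astype(float) / window.max() * (levels - 1)).astype(int)
    m = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        m[i, j] += 1
        m[j, i] += 1          # make the matrix symmetric
    return m / m.sum()

def contrast(p):
    i, j = np.indices(p.shape)
    return np.sum(p * (i - j) ** 2)

def homogeneity(p):
    i, j = np.indices(p.shape)
    return np.sum(p / (1.0 + (i - j) ** 2))

# Example: texture of a 9x9 window taken from a synthetic single-band image.
rng = np.random.default_rng(4)
band = rng.integers(0, 255, size=(100, 100))
window = band[40:49, 40:49]
p = glcm(window)
print("contrast:", round(float(contrast(p)), 3), "homogeneity:", round(float(homogeneity(p)), 3))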

Relevance: 20.00%
Publisher:
Abstract:

This paper describes the trigger and offline reconstruction, identification and energy calibration algorithms for hadronic decays of tau leptons employed for the data collected from pp collisions in 2012 with the ATLAS detector at the LHC at a center-of-mass energy of √s = 8 TeV. The performance of these algorithms is measured in most cases with Z decays to tau leptons using the full 2012 dataset, corresponding to an integrated luminosity of 20.3 fb⁻¹. An uncertainty on the offline reconstructed tau energy scale of 2% to 4%, depending on transverse energy and pseudorapidity, is achieved using two independent methods. The offline tau identification efficiency is measured with a precision of 2.5% for hadronically decaying tau leptons with one associated track, and of 4% for the case of three associated tracks, inclusive in pseudorapidity and for a visible transverse energy greater than 20 GeV. For hadronic tau lepton decays selected by offline algorithms, the tau trigger identification efficiency is measured with a precision of 2% to 8%, depending on the transverse energy. The performance of the tau algorithms, both offline and at the trigger level, is found to be stable with respect to the number of concurrent proton-proton interactions and has supported a variety of physics results using hadronically decaying tau leptons at ATLAS.

Relevance: 20.00%
Publisher:
Abstract:

The receiver operating characteristic (ROC) curve is the most widely used measure for evaluating the performance of a diagnostic biomarker when predicting a binary disease outcome. The ROC curve displays the true positive rate (or sensitivity) and the false positive rate (or 1 − specificity) for the different cut-off values used to classify an individual as healthy or diseased. In time-to-event studies, however, the disease status of an individual (e.g. alive or dead) is not a fixed characteristic; it varies over the course of the study. In such cases, when evaluating the performance of the biomarker, several issues should be taken into account: first, the time-dependent nature of the disease status, and second, the presence of incomplete data (e.g. the censored data typically present in survival studies). Accordingly, to assess the discrimination power of continuous biomarkers for time-dependent disease outcomes, time-dependent extensions of the true positive rate, the false positive rate, and the ROC curve have recently been proposed. In this work, we present new nonparametric estimators of the cumulative/dynamic time-dependent ROC curve that allow accounting for the possible modifying effect of current or past covariate measures on the discriminatory power of the biomarker. The proposed estimators can accommodate right-censored data, as well as covariate-dependent censoring. The behavior of the proposed estimators is explored through simulations and illustrated using data from a cohort of patients who suffered acute coronary syndrome.
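
To fix ideas, here is a minimal Python sketch of the cumulative/dynamic ROC curve at a fixed horizon t for fully observed event times; the estimators proposed in the work additionally handle right-censoring, covariate-dependent censoring and covariate effects, which this toy version does not.

import numpy as np

def cumulative_dynamic_roc(marker, event_time, t):
    """TPR/FPR over all cut-offs: cases have event_time <= t, controls have event_time > t."""
    cases = event_time <= t
    controls = ~cases
    cutoffs = np.sort(np.unique(marker))[::-1]
    tpr = np.array([(marker[cases] > c).mean() for c in cutoffs])
    fpr = np.array([(marker[controls] > c).mean() for c in cutoffs])
    return fpr, tpr

# Synthetic illustration: higher marker values are associated with earlier events.
rng = np.random.default_rng(5)
marker = rng.normal(size=400)
event_time = rng.exponential(scale=np.exp(-marker))     # hazard increases with the marker
fpr, tpr = cumulative_dynamic_roc(marker, event_time, t=1.0)
auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2)   # trapezoidal area under the curve
print("Estimated AUC(t = 1):", round(float(auc), 3))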

Relevance: 20.00%
Publisher:
Abstract:

In longitudinal studies of disease, patients may experience several events during follow-up. In these studies, the sequentially ordered events are often of interest and lead to problems that have received much attention recently. Issues of interest include the estimation of the bivariate survival function, marginal distributions and the conditional distribution of gap times. In this work we consider the estimation of the survival function conditional on a previous event. Different nonparametric approaches are considered for estimating these quantities, all based on the Kaplan-Meier estimator of the survival function. We explore the finite-sample behavior of the estimators through simulations. The methods proposed in this article are applied to a data set from a German Breast Cancer Study and are used to obtain predictors of the conditional survival probabilities as well as to study the influence of recurrence on overall survival.
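
As a building block, here is a minimal Python sketch of the Kaplan-Meier estimator together with the simple conditional survival estimate S(t | T > s) = S(t)/S(s); the data are synthetic, and the study's estimators for ordered gap times are more elaborate than this.

import numpy as np

def kaplan_meier(time, event):
    """Return event times and the Kaplan-Meier survival estimate at those times."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    uniq = np.unique(time[event == 1])
    surv, s = [], 1.0
    for u in uniq:
        at_risk = np.sum(time >= u)
        deaths = np.sum((time == u) & (event == 1))
        s *= 1.0 - deaths / at_risk
        surv.append(s)
    return uniq, np.array(surv)

def surv_at(times, surv, t):
    """Step-function evaluation of the Kaplan-Meier curve at time t."""
    idx = np.searchsorted(times, t, side="right") - 1
    return 1.0 if idx < 0 else surv[idx]

# Synthetic right-censored sample.
rng = np.random.default_rng(6)
true_t = rng.exponential(scale=5.0, size=300)
cens = rng.exponential(scale=8.0, size=300)
time = np.minimum(true_t, cens)
event = (true_t <= cens).astype(int)

times, surv = kaplan_meier(time, event)
s, t = 2.0, 6.0
print("S(6 | T > 2) ~", round(surv_at(times, surv, t) / surv_at(times, surv, s), 3))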

Relevance: 20.00%
Publisher:
Abstract:

Mycotoxins are toxic secondary metabolites produced by filamentous fungi that occur naturally in agricultural commodities worldwide. Aflatoxins, ochratoxin A, patulin, fumonisins, zearalenone, trichothecenes and ergot alkaloids are presently the most important for food and feed safety. These compounds are produced by several species that belong to the Aspergillus, Penicillium, Fusarium and Claviceps genera and can be carcinogenic, mutagenic, teratogenic, cytotoxic, neurotoxic, nephrotoxic, estrogenic and immunosuppressive. Human and animal exposure to mycotoxins is generally assessed by taking into account data on the occurrence of mycotoxins in food and feed as well as data on the consumption patterns of the population concerned. This evaluation is crucial to support measures to reduce consumer exposure to mycotoxins. This work reviews the occurrence and levels of mycotoxins in Portuguese food and feed to provide a global overview of this issue in Portugal. With the information collected, the exposure of the Portuguese population to those mycotoxins is assessed, and the estimated dietary intakes are presented.
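
As a reminder of the arithmetic behind such exposure assessments, here is a toy estimated-daily-intake (EDI) calculation in Python, combining occurrence levels with consumption data; all food items and figures are invented placeholders, not Portuguese occurrence or consumption data.

# EDI = sum over foods of (occurrence level x daily consumption) / body weight
occurrence_ug_per_kg = {"maize flour": 2.0, "breakfast cereal": 0.5}     # mycotoxin level
consumption_g_per_day = {"maize flour": 30.0, "breakfast cereal": 50.0}  # food consumption
body_weight_kg = 70.0

edi = sum(occurrence_ug_per_kg[f] * consumption_g_per_day[f] / 1000.0
          for f in occurrence_ug_per_kg) / body_weight_kg
print(f"EDI = {edi:.4f} ug/kg b.w./day")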

Relevance: 20.00%
Publisher:
Abstract:

A precise estimation of the postmortem interval (PMI) is one of the most important topics in forensic pathology. However, PMI estimation is based mainly on the visual observation of cadaveric phenomena (e.g. algor, livor and rigor mortis) and on alternative methods, such as thanatochemistry, that remain relatively imprecise. The aim of this in vitro study was to evaluate the kinetic alterations of several biochemical parameters (i.e. proteins, enzymes, substrates, electrolytes and lipids) during the putrefaction of human blood. For this purpose, we performed kinetic biochemical analyses over a 264-hour period. The results showed a significant linear correlation of total and direct bilirubin, urea, uric acid, transferrin, immunoglobulin M (IgM), creatine kinase (CK), aspartate transaminase (AST), calcium and iron with the time of blood putrefaction. These parameters allowed us to develop two mathematical models that may have predictive value and become important complementary tools to traditional methods for achieving a more accurate PMI estimation.
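
As an illustration of the kind of predictive fit involved, here is a minimal Python least-squares sketch relating one such parameter to putrefaction time and inverting it to predict the PMI; the data and the choice of urea as the single predictor are placeholders, not the study's fitted models.

import numpy as np

rng = np.random.default_rng(7)
hours = np.arange(0, 265, 24, dtype=float)                # sampling times over 264 h
urea = 30 + 0.2 * hours + rng.normal(0, 2, hours.size)    # hypothetical urea levels (mg/dL)

# Fit PMI (hours) as a linear function of the parameter, then predict the PMI
# from a new measurement of that parameter.
slope, intercept = np.polyfit(urea, hours, deg=1)
new_measurement = 60.0
print("Predicted PMI (h):", round(intercept + slope * new_measurement, 1))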

Relevance: 20.00%
Publisher:
Abstract:

Master's dissertation in Statistics