948 results for Non-parametric methods


Relevance: 90.00%

Abstract:

This thesis presents the Kou model, a jump-diffusion with double-exponential jumps, for the valuation of European call options on oil prices as the underlying asset. The numerical calculations behind the analytical expressions are presented and solved through efficient numerical algorithms that yield the theoretical prices of the options under study. The advantages of methods such as the Fourier transform are then discussed, given the relative simplicity of their implementation compared with other numerical techniques. This method is used together with a non-parametric calibration exercise with regularization, in which squared errors are minimized subject to a penalty based on the concept of relative entropy, producing prices for call options on oil and improving the model's ability to assign fair prices relative to those traded in the market.
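The thesis obtains its option prices through Fourier-transform methods; purely as an illustration of the dynamics being priced, the following sketch values a European call under Kou's double-exponential jump-diffusion by plain Monte Carlo. All parameter values (S0, K, sigma, lam, p, eta1, eta2) are hypothetical placeholders, not calibrated values from the thesis.

```python
# Minimal Monte Carlo sketch of Kou's double-exponential jump-diffusion for a
# European call on oil. Parameter values are illustrative, not calibrated.
import numpy as np

def kou_call_mc(S0, K, T, r, sigma, lam, p, eta1, eta2, n_paths=50_000, seed=0):
    """Price a European call under Kou's model by simulating the terminal log-price."""
    rng = np.random.default_rng(seed)
    # Mean jump size E[e^Y] - 1, needed to keep the discounted price a martingale.
    kappa = p * eta1 / (eta1 - 1.0) + (1.0 - p) * eta2 / (eta2 + 1.0) - 1.0
    drift = (r - 0.5 * sigma**2 - lam * kappa) * T
    # Diffusion part of log S_T.
    z = rng.standard_normal(n_paths)
    log_st = np.log(S0) + drift + sigma * np.sqrt(T) * z
    # Jump part: Poisson number of jumps, each with a double-exponential log size.
    n_jumps = rng.poisson(lam * T, n_paths)
    for i in np.nonzero(n_jumps)[0]:
        up = rng.random(n_jumps[i]) < p
        sizes = np.where(up,
                         rng.exponential(1.0 / eta1, n_jumps[i]),
                         -rng.exponential(1.0 / eta2, n_jumps[i]))
        log_st[i] += sizes.sum()
    payoff = np.maximum(np.exp(log_st) - K, 0.0)
    return np.exp(-r * T) * payoff.mean()

# Illustrative parameters (eta1 > 1 is required for a finite mean upward jump).
print(kou_call_mc(S0=60.0, K=65.0, T=0.5, r=0.02, sigma=0.3,
                  lam=1.0, p=0.4, eta1=10.0, eta2=5.0))
```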

Relevance: 90.00%

Abstract:

Objective: To prospectively evaluate the degree of sexual dysfunction in patients with stress urinary incontinence before and after placement of a tension-free suburethral sling, using the PISQ-12 questionnaire validated in Spanish. Materials and Methods: Longitudinal before-and-after observational study. Sixty sexually active women with some degree of sexual dysfunction, diagnosed with urinary incontinence and scheduled for placement of a transobturator suburethral sling at Hospital Universitario Mayor Méderi, were enrolled between April 2014 and March 2015. Most patients presented some degree of genital prolapse and required surgical correction in addition to the sling. All patients completed the PISQ-12 questionnaire before and 6 months after the procedure. Results: Mean age was 48 ± 4.58 years. The most frequent degree of prolapse was POP-Q stage II, 55% (n=33). In addition to sling placement, 96.7% (n=50) of patients required surgical correction of genital prolapse. At the preoperative evaluation, sexual dysfunction was distributed as follows: severe 70%, moderate 18.3%, and mild 11.7%. Six months after surgery a statistically significant change in the degree of sexual dysfunction was found: moderate 41.5% and mild 58.2%, with no patient classified as having severe dysfunction. Discussion and Conclusion: Patients who presented severe sexual dysfunction showed the greatest change in degree of dysfunction after placement of the suburethral sling.

Relevance: 90.00%

Abstract:

This paper provides recent evidence about the benefits of attending preschool on future performance. A non-parametric matching procedure is applied to two outcomes: math and verbal scores on a national mandatory test (Saber11) in Colombia. It is found that students who had the chance of attending preschool obtain higher scores in math (6.7%) and verbal (5.4%) than those who did not. A considerable fraction of these gaps comes from the upper quintiles of students' performance, suggesting that preschool matters when it is done at high-quality institutions. When the number of years of preschool is included, the gap rises to 12% in verbal and 17% in math.
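The abstract does not specify the exact matching estimator; the sketch below shows one common non-parametric choice, nearest-neighbour covariate matching for the average effect on the treated, with made-up covariates and outcomes.

```python
# Minimal nearest-neighbour matching sketch for a binary treatment (attended
# preschool vs. not). Covariates and simulated data are hypothetical placeholders.
import numpy as np

def att_nearest_neighbour(y, treated, X):
    """Average treatment effect on the treated via 1-NN covariate matching."""
    X = np.asarray(X, dtype=float)
    # Standardize covariates so Euclidean distance is scale-free.
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    y, treated = np.asarray(y, dtype=float), np.asarray(treated, dtype=bool)
    Xt, yt = X[treated], y[treated]
    Xc, yc = X[~treated], y[~treated]
    effects = []
    for xi, yi in zip(Xt, yt):
        # The closest untreated unit serves as the counterfactual.
        j = np.argmin(((Xc - xi) ** 2).sum(axis=1))
        effects.append(yi - yc[j])
    return float(np.mean(effects))

# Toy usage with simulated data.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))            # e.g. family income, parental education, age
treated = rng.random(500) < 0.4          # attended preschool
y = 50 + 2 * treated + X @ [1.0, 0.5, 0.2] + rng.normal(size=500)  # test score
print(att_nearest_neighbour(y, treated, X))
```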

Relevance: 90.00%

Abstract:

This thesis focuses on Computer Vision and, more specifically, on image segmentation, one of the basic stages of image analysis, which consists of dividing the image into a set of visually distinct and uniform regions in terms of intensity, colour or texture. A strategy based on the complementary use of region and boundary information during the segmentation process is proposed; this integration alleviates some of the basic problems of traditional segmentation. The boundary information is first used to identify the number of regions present in the image and to place a seed inside each of them, in order to model the statistical characteristics of the regions and thereby define the region information. This information, together with the boundary information, is used to define an energy function that expresses the properties required of the desired segmentation: uniformity inside the regions and contrast with neighbouring regions at the boundaries. A set of active regions then begins to grow, competing for the pixels of the image, with the aim of optimizing the energy function or, in other words, finding the segmentation that best fits the requirements expressed in that function. Finally, the whole process is embedded in a pyramidal structure, which allows the segmentation result to be progressively refined and its computational cost reduced. The strategy has been extended to the problem of texture segmentation, which involves some basic considerations such as modelling the regions from a set of texture features and extracting the boundary information when texture is present in the image. Finally, the approach has been extended to image segmentation that takes colour and texture properties into account. In this respect, the joint use of non-parametric density-estimation techniques to describe colour and of texture features based on the co-occurrence matrix is proposed in order to model the image regions adequately and completely. The proposal has been evaluated objectively and compared with different integration techniques using synthetic images. In addition, experiments with real images have been included, with very positive results.
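As a rough illustration of two ingredients named in the abstract, a non-parametric (histogram-based) colour density per region and co-occurrence-based texture features, the following sketch is offered; it is not the thesis's actual implementation, and all sizes and parameters are arbitrary.

```python
# Simplified sketch: a histogram-based (non-parametric) colour density for a region
# and a grey-level co-occurrence matrix for texture. Illustrative only.
import numpy as np

def colour_density(pixels, bins=16):
    """Non-parametric colour model of a region: a normalized 3-D RGB histogram."""
    hist, _ = np.histogramdd(pixels, bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist / hist.sum()

def glcm(gray, levels=8, offset=(0, 1)):
    """Grey-level co-occurrence matrix for one pixel offset (here: right neighbour)."""
    q = (gray.astype(float) / 256 * levels).astype(int)      # quantize grey levels
    dy, dx = offset
    a = q[:q.shape[0] - dy, :q.shape[1] - dx].ravel()
    b = q[dy:, dx:].ravel()
    m = np.zeros((levels, levels))
    np.add.at(m, (a, b), 1)                                   # count co-occurrences
    return m / m.sum()

# Toy usage on a random image patch.
rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(32, 32, 3))
print(colour_density(patch.reshape(-1, 3)).shape)             # (16, 16, 16)
print(glcm(patch.mean(axis=2)).round(3))
```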

Relevance: 90.00%

Abstract:

The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications to make the cumulative distribution function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also at the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geo-statistics this transformation is better known as the Normal-Score Transform. In this paper some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems are discussed, and a novel way to solve them, combining extreme value analysis and non-parametric regression methods, is outlined. The method is illustrated with examples of hydrological stream-flow forecasts.
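A minimal sketch of the NQT and its inverse is given below, assuming the Weibull plotting position i/(n+1); it also illustrates the small-sample tail problem the paper addresses, since back-transformed values cannot exceed the range of the reference sample.

```python
# Minimal sketch of the Normal Quantile Transform (normal-score transform): map each
# observation to the standard-normal quantile of its empirical plotting position.
import numpy as np
from scipy.stats import norm

def nqt(x):
    """Transform a sample so that its empirical CDF becomes standard normal."""
    x = np.asarray(x, dtype=float)
    ranks = x.argsort().argsort() + 1          # 1..n ranks
    p = ranks / (len(x) + 1.0)                 # Weibull plotting positions in (0, 1)
    return norm.ppf(p)

def inverse_nqt(z, x_ref):
    """Map normal-space values back to data space by interpolating the empirical CDF
    of a reference sample; values beyond the sample range are clipped (the small-sample
    tail problem the paper addresses with extreme value analysis)."""
    x_sorted = np.sort(np.asarray(x_ref, dtype=float))
    p_ref = np.arange(1, len(x_sorted) + 1) / (len(x_sorted) + 1.0)
    return np.interp(norm.cdf(z), p_ref, x_sorted)

# Toy usage with synthetic discharge data.
q = np.random.default_rng(0).gamma(shape=2.0, scale=50.0, size=200)
z = nqt(q)
print(z.mean().round(3), z.std().round(3))     # roughly 0 and 1
print(inverse_nqt(np.array([0.0, 2.0]), q))
```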

Relevance: 90.00%

Abstract:

The use of Bayesian inference in the estimation of time-frequency representations has, thus far, been limited to offline analysis of signals, using a smoothing-spline-based model of the time-frequency plane. In this paper we introduce a new framework that allows the routine use of Bayesian inference for online estimation of the time-varying spectral density of a locally stationary Gaussian process. The core of our approach is the use of a likelihood inspired by a local Whittle approximation. This choice, along with the use of a recursive algorithm for non-parametric estimation of the local spectral density, permits the use of a particle filter for estimating the time-varying spectral density online. We demonstrate the algorithm by tracking chirps and analysing musical data.
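The paper's estimator combines a local Whittle likelihood with a particle filter; as a much simpler stand-in for online spectral tracking, the sketch below exponentially smooths short-time periodograms. Block length and forgetting factor are arbitrary illustrative choices.

```python
# Recursive (exponentially weighted) estimate of a time-varying spectrum, as a
# simplified stand-in for the particle-filter / local-Whittle approach in the paper.
import numpy as np

def online_spectrogram(x, block=128, forget=0.9):
    """Track the local spectral density by smoothing block periodograms online."""
    x = np.asarray(x, dtype=float)
    n_blocks = len(x) // block
    est = None
    history = []
    for b in range(n_blocks):
        seg = x[b * block:(b + 1) * block]
        # Periodogram of the current block (one-sided, unnormalized for simplicity).
        pgram = np.abs(np.fft.rfft(seg * np.hanning(block))) ** 2 / block
        est = pgram if est is None else forget * est + (1 - forget) * pgram
        history.append(est.copy())
    return np.array(history)            # shape: (n_blocks, block // 2 + 1)

# Toy usage: a chirp whose frequency rises over time.
t = np.arange(0, 20, 1 / 500.0)
chirp = np.sin(2 * np.pi * (5 + 2 * t) * t)
S = online_spectrogram(chirp)
print(S.shape, S[0].argmax(), S[-1].argmax())   # peak frequency bin drifts upward
```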

Relevance: 90.00%

Abstract:

This paper models the transmission of shocks between the US, Japanese and Australian equity markets. Tests for the existence of linear and non-linear transmission of volatility across the markets are performed using parametric and non-parametric techniques. In particular, the size and sign of return innovations are important factors in determining the degree of spillovers in volatility. It is found that a multivariate asymmetric GARCH formulation can explain almost all of the non-linear causality between markets. These results have important implications for the construction of models and forecasts of international equity returns.
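The study fits a multivariate asymmetric GARCH model; as a hedged, univariate simplification of the asymmetry it refers to, the sketch below fits a GJR-GARCH(1,1) by Gaussian maximum likelihood, where negative return shocks are allowed to raise volatility more than positive ones.

```python
# Univariate GJR-GARCH(1,1) sketch: the gamma term captures the sign asymmetry
# of return innovations mentioned in the abstract. Simplified stand-in only.
import numpy as np
from scipy.optimize import minimize

def gjr_garch_neg_loglik(params, r):
    omega, alpha, gamma, beta = params
    n = len(r)
    sigma2 = np.empty(n)
    sigma2[0] = r.var()
    for t in range(1, n):
        neg = 1.0 if r[t - 1] < 0 else 0.0
        sigma2[t] = omega + (alpha + gamma * neg) * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(sigma2) + r ** 2 / sigma2)

def fit_gjr_garch(r):
    x0 = np.array([0.05 * r.var(), 0.05, 0.05, 0.85])
    bounds = [(1e-8, None), (0.0, 1.0), (0.0, 1.0), (0.0, 0.999)]
    res = minimize(gjr_garch_neg_loglik, x0, args=(r,), bounds=bounds, method="L-BFGS-B")
    return res.x   # omega, alpha, gamma, beta

# Toy usage on simulated i.i.d. returns (real data would show volatility clustering).
rng = np.random.default_rng(0)
r = rng.standard_normal(2000) * 0.01
print(fit_gjr_garch(r).round(4))
```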

Relevance: 90.00%

Abstract:

Forecasting wind power is an important part of a successful integration of wind power into the power grid. Forecasts with lead times longer than 6 h are generally made by using statistical methods to post-process forecasts from numerical weather prediction systems. Two major problems that complicate this approach are the non-linear relationship between wind speed and power production and the limited range of power production between zero and nominal power of the turbine. In practice, these problems are often tackled by using non-linear non-parametric regression models. However, such an approach ignores valuable and readily available information: the power curve of the turbine's manufacturer. Much of the non-linearity can be directly accounted for by transforming the observed power production into wind speed via the inverse power curve so that simpler linear regression models can be used. Furthermore, the fact that the transformed power production has a limited range can be taken care of by employing censored regression models. In this study, we evaluate quantile forecasts from a range of methods: (i) using parametric and non-parametric models, (ii) with and without the proposed inverse power curve transformation and (iii) with and without censoring. The results show that with our inverse (power-to-wind) transformation, simpler linear regression models with censoring perform equally or better than non-linear models with or without the frequently used wind-to-power transformation.
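A minimal sketch of the power-to-wind idea described above is shown next: invert the monotone part of a manufacturer's power curve by interpolation so that power forecasts can be modelled on the wind-speed scale, with censoring at zero and at nominal power. The power-curve points are made-up placeholders, not any real turbine's curve.

```python
# Sketch of the inverse power-curve transformation: observed power -> equivalent wind
# speed, using interpolation on the monotone section of a (hypothetical) power curve.
import numpy as np

# Hypothetical power curve: wind speed (m/s) -> power (fraction of nominal).
curve_ws = np.array([0, 3, 5, 7, 9, 11, 13, 25], dtype=float)
curve_pw = np.array([0, 0.0, 0.10, 0.35, 0.70, 0.95, 1.0, 1.0], dtype=float)

def power_to_wind(p):
    """Invert the strictly increasing part of the power curve by interpolation.
    Power at the censoring bounds (0 or nominal) maps to the cut-in / rated speed."""
    p = np.clip(np.asarray(p, dtype=float), 0.0, 1.0)
    ws_inc, pw_inc = curve_ws[1:7], curve_pw[1:7]   # between cut-in and rated speed
    return np.interp(p, pw_inc, ws_inc)

def wind_to_power(ws):
    return np.interp(ws, curve_ws, curve_pw)

# Toy usage: transform observed power to "equivalent wind speed" and back.
observed_power = np.array([0.0, 0.2, 0.8, 1.0])
ws = power_to_wind(observed_power)
print(ws.round(2), wind_to_power(ws).round(2))
```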

Relevance: 90.00%

Abstract:

Background: The aim of this study was to evaluate root coverage of gingival recessions and to compare graft vascularization in smokers and non-smokers. Methods: Thirty subjects, 15 smokers and 15 non-smokers, were selected. Each subject had one Miller Class I or II recession in a non-molar tooth. Clinical measurements of probing depth (PD), relative clinical attachment level (CAL), gingival recession (GR), and width of keratinized tissue (KT) were determined at baseline and 3 and 6 months after surgery. The recessions were treated surgically with a coronally positioned flap associated with a subepithelial connective tissue graft. A small portion of this graft was prepared for immunohistochemistry. Blood vessels were identified and counted by the expression of factor VIII-related antigen in stained endothelial cells. Results: Intragroup analysis showed that after 6 months there was a gain in CAL, a decrease in GR, and an increase in KT for both groups (P<0.05), whereas changes in PD were not statistically significant. Smokers had less root coverage than non-smokers (58.02% ± 19.75% versus 83.35% ± 18.53%; P<0.05). Furthermore, the smokers had more GR (1.48 ± 0.79 mm versus 0.52 ± 0.60 mm) than the non-smokers (P<0.05). Histomorphometry of the donor tissue revealed a blood vessel density of 49.01 ± 11.91 vessels/200x field for non-smokers and 36.53 ± 10.23 vessels/200x field for smokers (P<0.05). Conclusion: Root coverage with subepithelial connective tissue graft was negatively affected by smoking, which limited and jeopardized treatment results.

Relevance: 90.00%

Abstract:

Cobalt is one of the main components of cast metal alloys broadly used in dentistry. It is a constituent of 45 to 70% of numerous prosthetic works. There is evidence that metal elements cause systemic and local toxicity. The purpose of the present study was to evaluate the effects of cobalt on the junctional epithelium and reduced enamel epithelium of the first superior molar in rats during lactation. To do this, 1-day-old rats were used, whose mothers received 300 mg of cobalt chloride per liter of distilled water in the drinker during lactation. After 21 days, the rat pups were killed with an anesthetic overdose. The heads were separated, fixed in "alfac", decalcified and embedded in paraffin. Frontal sections stained with hematoxylin and eosin were employed. Karyometric methods allowed estimation of the following parameters: largest, smallest and mean diameters, D/d ratio, perimeter, area, volume, volume/area ratio, eccentricity, form coefficient and contour index. Stereologic methods allowed evaluation of: cytoplasm/nucleus ratio, cell and cytoplasm volume, cell number density, external surface/basal membrane ratio, thickness of the epithelial layers and surface density. All the collected data were subjected to statistical analysis by the non-parametric Wilcoxon-Mann-Whitney test. Karyometry showed smaller values in the nuclei of the studied tissues for: diameters, perimeter, area, volume and volume/area ratio. Stereologically, smaller cells with scarce cytoplasm were observed in the junctional epithelium and in the reduced enamel epithelium, reflected in a greater number of cells per mm3 of tissue. In this study, cobalt caused epithelial atrophy, indicating a direct action on the junctional and enamel epithelium.
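The between-group comparison uses the non-parametric Wilcoxon-Mann-Whitney test; a minimal sketch with scipy is shown below, on made-up placeholder measurements rather than the study's data.

```python
# Wilcoxon-Mann-Whitney comparison of a karyometric parameter between control and
# cobalt-treated groups. The two arrays are invented placeholder values.
import numpy as np
from scipy.stats import mannwhitneyu

control = np.array([310.2, 295.4, 305.8, 322.1, 300.5, 315.0])   # e.g. nuclear volume
treated = np.array([270.3, 288.1, 265.9, 280.4, 275.2, 268.7])

stat, p_value = mannwhitneyu(control, treated, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
```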

Relevance: 90.00%

Abstract:

Background: The oral health conditions of indigenous peoples in Amazonia are closely associated with ecological and dietary changes related to interaction with non-Indians. Aim: The study investigated the incidence of caries in an indigenous community from Central Brazil, focusing on gender differences. Subjects and methods: The research was conducted among the Xavante Indians and was based on longitudinal data collected in two surveys (1999 and 2004). The study included 128 individuals, 63 (49.2%) males and 65 (50.8%) females, divided into four age brackets (6-12, 13-19, 20-34, 35-60 years of age). The DMFT (decayed, missing and filled teeth) index and incidences (difference between 1999 and 2004) were calculated for each individual. The proportion of incidence was also calculated. Differences in caries risk between genders and age brackets were compared by parametric and non-parametric tests. Results: There were statistically significant differences in caries incidence between age brackets and genders. The greatest incidence was observed in the 20-34 age bracket, which presented 3.30 new decayed teeth, twice the risk of the 6-12 age bracket (p < 0.01), chosen as reference. While females in most age groups did not show a higher risk for caries when compared to males, there was a 4.04-fold risk in the 20-34 age bracket (p < 0.01). Conclusion: It is concluded that factors related to the social functions of each sex (gender issues) and differential access to information, health services, and education may help explain the differences observed in the incidence of caries.

Relevance: 90.00%

Abstract:

This work is an assessment of the frequency of extreme values (EVs) of daily rainfall in the city of Sao Paulo, Brazil, over the period 1933-2005, based on the peaks-over-threshold (POT) and Generalized Pareto Distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p quantiles (for example p of 0.97) of daily rainfall amounts computed from all available data. Samples of POT values were extracted with several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3, and GPD-4) were fitted to each one of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3, and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both annual cycle and linear trend in GPD-4. GPD-1, with constant scale and shape parameters, is the simplest model. For identification of the best model among the four we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant compared to the null hypothesis of no trend, at about the 98% confidence level. The non-parametric Mann-Kendall test also showed the presence of a positive trend in the annual frequency of excesses over high thresholds, with the p-value being virtually zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of Sao Paulo have been increasing in magnitude and frequency over time. For example, 0.99 quantiles of daily rainfall amount have increased by about 40 mm between 1933 and 2005. Copyright (C) 2008 Royal Meteorological Society
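A hedged sketch of the kind of model selected here (GPD-3) follows: a Generalized Pareto Distribution fitted to threshold excesses with a log-linear time trend in the scale parameter and a constant shape. It is a simplified stand-in, not the authors' code, and the synthetic excesses are placeholders.

```python
# GPD for threshold excesses with a time-varying scale sigma_t = exp(a + b*t) and
# constant shape xi, fitted by minimizing the negative log-likelihood.
import numpy as np
from scipy.optimize import minimize

def gpd_trend_negloglik(params, excess, t):
    a, b, xi = params
    scale = np.exp(a + b * t)                    # time-varying scale sigma_t
    z = 1.0 + xi * excess / scale
    if np.any(z <= 0):                           # outside the GPD support
        return np.inf
    return np.sum(np.log(scale) + (1.0 + 1.0 / xi) * np.log(z))

def fit_gpd_trend(excess, t):
    t = (t - t.mean()) / t.std()                 # centre/scale time for stability
    x0 = np.array([np.log(excess.mean()), 0.0, 0.1])
    res = minimize(gpd_trend_negloglik, x0, args=(excess, t), method="Nelder-Mead")
    return res.x                                 # a, b (trend in log-scale), xi

# Toy usage: synthetic excesses whose scale grows slowly over the years.
rng = np.random.default_rng(0)
years = np.repeat(np.arange(1933, 2006), 5).astype(float)
true_scale = 10.0 * np.exp(0.003 * (years - years.mean()))
excess = rng.exponential(true_scale)             # exponential = GPD with shape -> 0
print(fit_gpd_trend(excess, years).round(3))
```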

Relevance: 90.00%

Abstract:

The thermal decomposition of salbutamol (a beta(2)-selective adrenoreceptor agonist) was studied using differential scanning calorimetry (DSC) and thermogravimetry/derivative thermogravimetry (TG/DTG). It was observed that the commercial sample showed a different thermal profile from the standard sample, caused by the presence of excipients. These compounds increase the thermal stability of the drug. Moreover, a higher activation energy was calculated for the pharmaceutical sample; it was estimated by isothermal and non-isothermal methods for the first stage of the thermal decomposition process. For the isothermal experiments the average values were E(act) = 130 kJ mol(-1) (standard sample) and E(act) = 252 kJ mol(-1) (pharmaceutical sample) in a dynamic nitrogen atmosphere (50 mL min(-1)). For the non-isothermal method, the activation energy was obtained from the plot of log heating rates vs. 1/T in a dynamic air atmosphere (50 mL min(-1)). The calculated values were E(act) = 134 kJ mol(-1) (standard sample) and E(act) = 139 kJ mol(-1) (pharmaceutical sample).
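The non-isothermal estimate above comes from the slope of log heating rate versus 1/T; a minimal sketch of that calculation is given below, assuming the Ozawa-Flynn-Wall relation log10(beta) ≈ const − 0.4567·Ea/(R·T) (Doyle approximation). The heating rates and temperatures are invented placeholders, not the paper's data.

```python
# Activation energy from the slope of log10(heating rate) vs. 1/T, assuming the
# Ozawa-Flynn-Wall / Doyle approximation. All numbers below are illustrative.
import numpy as np

R = 8.314  # J mol^-1 K^-1

def ofw_activation_energy(heating_rates, temperatures_K):
    """Activation energy (kJ/mol) from a least-squares fit of log10(beta) vs 1/T."""
    x = 1.0 / np.asarray(temperatures_K, dtype=float)
    y = np.log10(np.asarray(heating_rates, dtype=float))
    slope, _ = np.polyfit(x, y, 1)
    return -slope * R / 0.4567 / 1000.0

# Hypothetical data: heating rates (K/min) and the temperature (K) reached at a fixed
# conversion level for each rate.
beta = [2.5, 5.0, 10.0, 20.0]
T = [540.0, 552.0, 565.0, 579.0]
print(round(ofw_activation_energy(beta, T), 1), "kJ/mol")
```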

Relevance: 90.00%

Abstract:

Background and aims: Evaluating status in patients with motor fluctuations is complex, and occasional observations/measurements do not give an adequate picture of the time spent in different states. We developed a test battery to assess the status of patients with advanced Parkinson's disease, consisting of diary assessments and motor tests. This battery was constructed and implemented on a handheld computer with built-in mobile communication. In fluctuating patients, it should typically be used several times daily in the home environment, over periods of about one week. The aim of this battery is to provide status information in order to evaluate treatment effects in clinical practice and research, follow up treatments and disease progression, and predict outcome to optimize treatment strategy. Methods: Selection of diary questions was based on a previous study with Duodopa® (DIREQT). Tapping tests (with and without visual cueing) and a spiral drawing test were added. Rapid prototyping was used in development of the user interface. An evaluation with two pilot patients was performed before and after receiving new treatments for advanced disease (one received Duodopa® and one received DBS). Speed and the proportion of missed taps were calculated for the tapping tests, and the entropy of the radial drawing velocity was calculated for the spiral tests. Test variables were evaluated using non-parametric statistics. Results: Post-treatment improvement was detected in both patients in many of the test variables. Conclusions: Although validation work remains, preliminary results are promising and the test battery is currently being evaluated in a long-term health economics study with Duodopa® (DAPHNE).
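The exact entropy definition used for the spiral test is not given in this summary; the sketch below uses a generic histogram-based Shannon entropy of the radial drawing velocity on simulated spiral data.

```python
# Histogram-based Shannon entropy of radial drawing velocity for a spiral test,
# shown on simulated data (a smooth spiral vs. one with tremor-like jitter).
import numpy as np

def radial_velocity_entropy(x, y, dt, bins=16):
    """Shannon entropy (bits) of the histogram of radial speed |dr/dt|."""
    r = np.hypot(np.asarray(x, float), np.asarray(y, float))   # radius from centre
    v = np.abs(np.diff(r)) / dt                                 # radial speed
    hist, _ = np.histogram(v, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Toy usage: a smooth Archimedean spiral vs. a jittery one.
t = np.linspace(0, 6 * np.pi, 600)
smooth = (t * np.cos(t), t * np.sin(t))
jitter = (t * np.cos(t) + np.random.default_rng(0).normal(0, 0.8, t.size),
          t * np.sin(t) + np.random.default_rng(1).normal(0, 0.8, t.size))
dt = t[1] - t[0]
print(radial_velocity_entropy(*smooth, dt), radial_velocity_entropy(*jitter, dt))
```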

Relevance: 90.00%

Abstract:

Climate change has resulted in substantial variations in annual extreme rainfall quantiles for different durations and return periods. Predicting future changes in extreme rainfall quantiles is essential for various water resources design, assessment, and decision-making purposes. Current predictions of future rainfall extremes, however, exhibit large uncertainties. According to extreme value theory, rainfall extremes are inherently random variables, with distributions that change across return periods; therefore there are uncertainties even under current climate conditions. Regarding future conditions, our large-scale knowledge is obtained using global climate models forced with certain emission scenarios. There are widely known deficiencies in climate models, particularly with respect to precipitation projections. There is also recognition of the limitations of emission scenarios in representing future global change. Apart from these large-scale uncertainties, the downscaling methods also add uncertainty to estimates of future extreme rainfall when they convert the larger-scale projections to the local scale. The aim of this research is to address these uncertainties in future projections of extreme rainfall of different durations and return periods. We combined 3 emission scenarios with 2 global climate models and used LARS-WG, a well-known weather generator, to stochastically downscale daily climate-model projections for the city of Saskatoon, Canada, by 2100. The downscaled projections were further disaggregated into hourly resolution using our new stochastic and non-parametric rainfall disaggregator. The extreme rainfall quantiles can consequently be identified for different durations (1-hour, 2-hour, 4-hour, 6-hour, 12-hour, 18-hour and 24-hour) and return periods (2-year, 10-year, 25-year, 50-year, 100-year) using the Generalized Extreme Value (GEV) distribution. By providing multiple realizations of future rainfall, we attempt to measure the extent of the total predictive uncertainty contributed by climate models, emission scenarios, and downscaling/disaggregation procedures. The results show different proportions of these contributors for different durations and return periods.
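A hedged sketch of the final step, fitting a GEV to annual maxima of a given duration and reading off return-period quantiles, is shown below using scipy's genextreme (whose shape-parameter sign convention differs from the usual ξ); the synthetic annual maxima are placeholders, not Saskatoon data.

```python
# Fit a Generalized Extreme Value distribution to annual maximum rainfall and compute
# T-year return levels as the (1 - 1/T) quantiles of the fitted distribution.
import numpy as np
from scipy.stats import genextreme

def gev_return_levels(annual_maxima, return_periods=(2, 10, 25, 50, 100)):
    """Fit a GEV by maximum likelihood and return the T-year return levels."""
    c, loc, scale = genextreme.fit(annual_maxima)
    probs = 1.0 - 1.0 / np.asarray(return_periods, dtype=float)
    return dict(zip(return_periods, genextreme.ppf(probs, c, loc=loc, scale=scale)))

# Toy usage: synthetic 1-hour annual maxima (mm) for a 70-year record.
rng = np.random.default_rng(0)
annual_max = genextreme.rvs(c=-0.1, loc=20.0, scale=6.0, size=70, random_state=rng)
for T, level in gev_return_levels(annual_max).items():
    print(f"{T:>3}-year: {level:.1f} mm")
```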