902 results for two stage quantile regression
Abstract:
The role played by drama at the primary level of education is assessed, not only as an art form in its own right, but also as a discipline harnessed as a means of learning other areas of the curriculum (language, personal and social development). This teacher's book is divided into several sections and is accompanied by pupil activities, assessment activities, and photocopiable worksheets.
Abstract:
Children at the primary level of education are offered practical activities so that they can explore and investigate the properties, composition, and changes undergone by the different types of materials provided (metal, paper, plastic, etc.). They are also encouraged to carry out experiments with these materials, following the steps of a scientific investigation. The book is accompanied by photocopiable sheets and assessment activities for the teacher.
Abstract:
We propose a unified data modeling approach that is equally applicable to supervised regression and classification applications, as well as to unsupervised probability density function estimation. A particle swarm optimization (PSO) aided orthogonal forward regression (OFR) algorithm based on leave-one-out (LOO) criteria is developed to construct parsimonious radial basis function (RBF) networks with tunable nodes. Each stage of the construction process determines the center vector and diagonal covariance matrix of one RBF node by minimizing the LOO statistics. For regression applications, the LOO criterion is chosen to be the LOO mean square error, while the LOO misclassification rate is adopted in two-class classification applications. By adopting the Parzen window estimate as the desired response, the unsupervised density estimation problem is transformed into a constrained regression problem. This PSO aided OFR algorithm for tunable-node RBF networks is capable of constructing very parsimonious RBF models that generalize well, and our analysis and experimental results demonstrate that the algorithm is computationally even simpler than the efficient regularization assisted orthogonal least square algorithm based on LOO criteria for selecting fixed-node RBF models. Another significant advantage of the proposed learning procedure is that it does not have learning hyperparameters that have to be tuned using costly cross validation. The effectiveness of the proposed PSO aided OFR construction procedure is illustrated using several examples taken from regression and classification, as well as density estimation applications.
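The LOO-guided construction described in this abstract can be illustrated with a toy sketch. In the code below, which is a hypothetical simplification and not the paper's algorithm, the PSO search over each node's center and covariance is replaced by a small grid of candidate centers, and each candidate is scored by the leave-one-out mean square error of a one-node RBF model; all data and helper names are invented for illustration.

```python
import math

def rbf(x, c, w):
    """Gaussian RBF with center c and width w."""
    return math.exp(-((x - c) ** 2) / (2 * w * w))

def loo_mse(xs, ys, c, w):
    """Leave-one-out MSE of a one-node model y ~ a * rbf(x; c, w);
    the single weight a is refit with each point held out."""
    n = len(xs)
    total = 0.0
    for i in range(n):
        num = sum(ys[j] * rbf(xs[j], c, w) for j in range(n) if j != i)
        den = sum(rbf(xs[j], c, w) ** 2 for j in range(n) if j != i)
        a = num / den if den else 0.0
        total += (ys[i] - a * rbf(xs[i], c, w)) ** 2
    return total / n

# toy data generated by an RBF centered at 1.0 (width 0.5)
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [rbf(x, 1.0, 0.5) for x in xs]

# PSO replaced by a candidate grid: pick the center with the lowest LOO MSE
candidates = [0.0, 0.5, 1.0, 1.5, 2.0]
best = min(candidates, key=lambda c: loo_mse(xs, ys, c, 0.5))
print(best)  # → 1.0, the LOO criterion recovers the true center
```

In the full algorithm this scoring is repeated stage by stage, with PSO searching the continuous space of center vectors and diagonal covariances rather than a fixed grid.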
Abstract:
Forecasting wind power is an important part of a successful integration of wind power into the power grid. Forecasts with lead times longer than 6 h are generally made by using statistical methods to post-process forecasts from numerical weather prediction systems. Two major problems that complicate this approach are the non-linear relationship between wind speed and power production and the limited range of power production between zero and nominal power of the turbine. In practice, these problems are often tackled by using non-linear non-parametric regression models. However, such an approach ignores valuable and readily available information: the power curve of the turbine's manufacturer. Much of the non-linearity can be directly accounted for by transforming the observed power production into wind speed via the inverse power curve so that simpler linear regression models can be used. Furthermore, the fact that the transformed power production has a limited range can be taken care of by employing censored regression models. In this study, we evaluate quantile forecasts from a range of methods: (i) using parametric and non-parametric models, (ii) with and without the proposed inverse power curve transformation and (iii) with and without censoring. The results show that with our inverse (power-to-wind) transformation, simpler linear regression models with censoring perform equally or better than non-linear models with or without the frequently used wind-to-power transformation.
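The power-to-wind transformation and censoring idea can be sketched briefly. The piecewise-linear power curve, the nominal power, and the cut-in/rated speeds below are assumptions made for illustration; real curves come from the turbine manufacturer.

```python
NOMINAL = 2000.0           # kW; assumed nominal power (illustrative)
CUT_IN, RATED = 3.0, 13.0  # m/s; assumed cut-in and rated speeds

def power_curve(v):
    """Toy monotone manufacturer curve: linear ramp from cut-in to rated."""
    if v <= CUT_IN:
        return 0.0
    if v >= RATED:
        return NOMINAL
    return NOMINAL * (v - CUT_IN) / (RATED - CUT_IN)

def inverse_power(p):
    """Map observed power back to an equivalent wind speed; values at the
    bounds are flagged as censored because the inverse is not unique there."""
    if p <= 0.0:
        return CUT_IN, "left-censored"
    if p >= NOMINAL:
        return RATED, "right-censored"
    return CUT_IN + (RATED - CUT_IN) * p / NOMINAL, "observed"

print(inverse_power(power_curve(8.0)))   # → (8.0, 'observed')
print(inverse_power(power_curve(15.0)))  # → (13.0, 'right-censored')
```

The censoring flags are what a censored (Tobit-style) regression model would consume: observations at the bounds contribute only the information that the latent wind speed lies beyond the cut-in or rated value.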
Abstract:
We exploit a discontinuity in Brazilian municipal election rules to investigate whether political competition has a causal impact on policy choices. In municipalities with fewer than 200,000 voters, mayors are elected with a plurality of the vote. In municipalities with more than 200,000 voters, a runoff election takes place among the top two candidates if neither achieves a majority of the votes. In a first stage, we show that the possibility of a runoff increases political competition. In a second stage, we use the discontinuity as a source of exogenous variation to infer causality from political competition to fiscal policy. Our second-stage results suggest that political competition induces more investment and less current spending, particularly personnel expenses. Furthermore, the impact of political competition is larger when incumbents can run for reelection, suggesting that incentives matter insofar as incumbents can themselves remain in office.
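The two-stage logic can be sketched with a toy instrumental-variables calculation, where the runoff rule instruments for political competition. The tiny data set and the simple Wald/IV estimator below are illustrative stand-ins for the paper's full regression-discontinuity specification.

```python
def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    """Covariance with population normalization (it cancels in the ratios)."""
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

# toy data: z = runoff rule applies, d = competition index, y = investment
runoff      = [0, 0, 0, 0, 1, 1, 1, 1]
competition = [1.0, 1.2, 0.9, 1.1, 1.8, 2.1, 1.9, 2.2]
investment  = [3.0, 3.2, 2.9, 3.1, 4.1, 4.4, 4.0, 4.3]

# first stage: effect of the runoff rule on competition
first_stage = cov(runoff, competition) / cov(runoff, runoff)
# second stage (Wald/IV estimator): effect of competition on investment
iv_effect = cov(runoff, investment) / cov(runoff, competition)
print(round(first_stage, 2), round(iv_effect, 2))
```

With a binary instrument the Wald ratio is numerically the two-stage least squares estimate; the paper's design additionally restricts attention to municipalities near the 200,000-voter threshold.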
Abstract:
This paper considers two-sided tests for the parameter of an endogenous variable in an instrumental variable (IV) model with heteroskedastic and autocorrelated errors. We develop the finite-sample theory of weighted-average power (WAP) tests with normal errors and a known long-run variance. We introduce two weights which are invariant to orthogonal transformations of the instruments; e.g., changing the order in which the instruments appear. While tests using the MM1 weight can be severely biased, optimal tests based on the MM2 weight are naturally two-sided when errors are homoskedastic. We propose two boundary conditions that yield two-sided tests whether errors are homoskedastic or not. The locally unbiased (LU) condition is related to the power around the null hypothesis and is a weaker requirement than unbiasedness. The strongly unbiased (SU) condition is more restrictive than LU, but the associated WAP tests are easier to implement. Several tests are SU in finite samples or asymptotically, including tests robust to weak IV (such as the Anderson-Rubin, score, conditional quasi-likelihood ratio, and I. Andrews' (2015) PI-CLC tests) and two-sided tests which are optimal when the sample size is large and instruments are strong. We refer to the WAP-SU tests based on our weights as MM1-SU and MM2-SU tests. Dropping the restrictive assumptions of normality and known variance, the theory is shown to remain valid at the cost of asymptotic approximations. The MM2-SU test is optimal under strong IV asymptotics, and outperforms other existing tests under weak IV asymptotics.
Abstract:
The first zoeal stages of the grapsinid Goniopsis cruentata (Latreille, 1803) and the sesarminid Aratus pisonii (H. Milne Edwards, 1837) are described and illustrated. Grapsinae zoeae can be distinguished from other grapsid larvae by the absence of lateral spines on the carapace and the reduction of the antennal exopod to a small seta. A key to the first zoeal stages of the Grapsidae of the Brazilian coast is provided.
Abstract:
Background: Changes in heart rate during the rest-exercise transition can be characterized by applying mathematical calculations, such as deltas over 0-10 and 0-30 seconds to draw inferences about the parasympathetic nervous system, and linear regression and deltas applied to data in the 60-240 second range to draw inferences about the sympathetic nervous system. The objective of this study was to test the hypothesis that young and middle-aged subjects have different heart rate responses to exercise of moderate and high intensity, as assessed with different mathematical calculations. Methods: Seven apparently healthy middle-aged men and ten young men underwent constant-load tests (intense and moderate) on a cycle ergometer. The heart rate data were submitted to analysis of deltas (0-10, 0-30 and 60-240 seconds) and simple linear regression (60-240 seconds). The parameters obtained from the simple linear regression analysis were the intercept and the slope. We used the Shapiro-Wilk test to check the distribution of the data and the unpaired "t" test for comparisons between groups. The level of statistical significance was 5%. Results: The intercept and the 0-10 second delta were lower in the middle-aged group at both loads tested, and the slope was lower in the middle-aged group during moderate exercise. Conclusion: The young subjects showed a greater magnitude of vagal withdrawal in the initial stage of the HR response during constant-load exercise and a faster adjustment of the sympathetic response during moderate exercise.
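The deltas and the 60-240 second linear regression described above can be sketched as follows; the second-by-second heart rate series is synthetic and purely illustrative.

```python
def linreg(ts, hrs):
    """Ordinary least squares of HR on time; returns (intercept, slope)."""
    n = len(ts)
    mt, mh = sum(ts) / n, sum(hrs) / n
    slope = (sum((t - mt) * (h - mh) for t, h in zip(ts, hrs))
             / sum((t - mt) ** 2 for t in ts))
    return mh - slope * mt, slope

# synthetic HR: 70 bpm at exercise onset, rising 0.1 bpm per second
hr = [70.0 + 0.1 * t for t in range(241)]

delta_0_10 = hr[10] - hr[0]   # window tied to vagal withdrawal
delta_0_30 = hr[30] - hr[0]
ts = list(range(60, 241))     # 60-240 s window for the sympathetic phase
intercept, slope = linreg(ts, hr[60:241])
print(delta_0_10, delta_0_30, round(intercept, 1), round(slope, 2))
```

On real data the slope and intercept of the 60-240 s fit are the group-comparison parameters, while the short-window deltas capture the fast initial response.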
Abstract:
Spatial prediction of hourly rainfall via radar calibration is addressed. The change of support problem (COSP), arising when the spatial supports of different data sources do not coincide, is faced in a non-Gaussian setting; in fact, hourly rainfall in the Emilia-Romagna region, in Italy, is characterized by an abundance of zero values and right-skewness of the distribution of positive amounts. Rain gauge direct measurements on sparsely distributed locations and hourly cumulated radar grids are provided by ARPA-SIMC Emilia-Romagna. We propose a three-stage Bayesian hierarchical model for radar calibration, exploiting rain gauges as the reference measure. Rain probability and amounts are modeled via linear relationships with radar in the log scale; spatially correlated Gaussian effects capture the residual information. We employ a probit link for rainfall probability and a Gamma distribution for rainfall positive amounts; the two steps are joined via a two-part semicontinuous model. Three model specifications differently addressing the COSP are presented; in particular, a stochastic weighting of all radar pixels, driven by a latent Gaussian process defined on the grid, is employed. Estimation is performed via MCMC procedures implemented in C, linked to the R software. Communication and evaluation of probabilistic, point and interval predictions is investigated. A non-randomized PIT histogram is proposed for correctly assessing calibration and coverage of two-part semicontinuous models. Predictions obtained with the different model specifications are evaluated via graphical tools (Reliability Plot, Sharpness Histogram, PIT Histogram, Brier Score Plot and Quantile Decomposition Plot), proper scoring rules (Brier Score, Continuous Rank Probability Score) and consistent scoring functions (Root Mean Square Error and Mean Absolute Error, addressing the predictive mean and median, respectively).
Calibration is reached and the inclusion of neighbouring information slightly improves predictions. All specifications outperform a benchmark model with uncorrelated effects, confirming the relevance of spatial correlation for modeling rainfall probability and accumulation.
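One of the proper scoring rules named above, the Brier score, is easy to sketch; the forecast probabilities and rain/no-rain outcomes below are invented for illustration.

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and the
    observed 0/1 rain occurrences; lower is better, 0 is perfect."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

probs    = [0.9, 0.1, 0.8, 0.3, 0.5]  # forecast rain probabilities
outcomes = [1,   0,   1,   0,   1]    # observed rain (1) / no rain (0)
print(round(brier_score(probs, outcomes), 3))  # → 0.08
```

The Brier score addresses only the occurrence part of the two-part model; the positive-amount part is evaluated with the continuous rank probability score and the quantile-based tools listed above.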
Abstract:
Outcome-dependent, two-phase sampling designs can dramatically reduce the costs of observational studies by judicious selection of the most informative subjects for purposes of detailed covariate measurement. Here we derive asymptotic information bounds and the form of the efficient score and influence functions for the semiparametric regression models studied by Lawless, Kalbfleisch, and Wild (1999) under two-phase sampling designs. We show that the maximum likelihood estimators for both the parametric and nonparametric parts of the model are asymptotically normal and efficient. The efficient influence function for the parametric part agrees with the more general information bound calculations of Robins, Hsieh, and Newey (1995). By verifying the conditions of Murphy and Van der Vaart (2000) for a least favorable parametric submodel, we provide asymptotic justification for statistical inference based on profile likelihood.
Abstract:
Background mortality is an essential component of any forest growth and yield model. Forecasts of mortality contribute largely to the variability and accuracy of model predictions at the tree, stand and forest level. In the present study, I implement and evaluate state-of-the-art techniques to increase the accuracy of individual tree mortality models, similar to those used in many of the current variants of the Forest Vegetation Simulator, using data from North Idaho and Montana. The first technique addresses methods to correct for bias induced by measurement error typically present in competition variables. The second implements survival regression and evaluates its performance against the traditional logistic regression approach. I selected the regression calibration (RC) algorithm as a good candidate for addressing the measurement error problem. Two logistic regression models for each species were fitted, one ignoring the measurement error, which is the “naïve” approach, and the other applying RC. The models fitted with RC outperformed the naïve models in terms of discrimination when the competition variable was found to be statistically significant. The effect of RC was more obvious where the measurement error variance was large and for more shade-intolerant species. The process of model fitting and variable selection revealed that past emphasis on DBH as a predictor variable for mortality, while producing models with strong metrics of fit, may make models less generalizable. The evaluation of the error variance estimator developed by Stage and Wykoff (1998), which is core to the implementation of RC, under different spatial patterns and diameter distributions revealed that the Stage and Wykoff estimate notably overestimated the true variance in all simulated stands except those that are clustered. Results show a systematic bias even when all the assumptions made by the authors are met.
I argue that this is the result of the Poisson-based estimate ignoring the overlapping area of potential plots around a tree. The effects of the variance estimate, especially in the application phase, justify future efforts to improve its accuracy. The second technique implemented and evaluated is a survival regression model that accounts for the time-dependent nature of variables, such as diameter and competition variables, and for the interval-censored nature of data collected from remeasured plots. The performance of the model is compared with that of the traditional logistic regression model as a tool to predict individual tree mortality. Validation of both approaches shows that the survival regression approach discriminates better between dead and live trees for all species. In conclusion, I showed that the proposed techniques do increase the accuracy of individual tree mortality models and are a promising first step towards the next generation of background mortality models. I have also identified the next steps to undertake in order to advance mortality models further.
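The regression calibration (RC) idea at the heart of the first technique can be sketched as follows: under classical additive measurement error, the error-prone competition covariate W = X + U is replaced by its best linear predictor E[X | W] before the mortality model is fitted. The moments below are assumed known for illustration; in practice the error variance would come from an estimator such as Stage and Wykoff's.

```python
def regression_calibration(w, mu_x, var_x, var_u):
    """Best linear predictor E[X | W] of the true covariate X given the
    noisy measurement W = X + U, with U independent of X."""
    reliability = var_x / (var_x + var_u)   # attenuation factor in (0, 1]
    return mu_x + reliability * (w - mu_x)

# hypothetical moments: true competition index has mean 10 and variance 4;
# the measurement-error variance is assumed to be 1
x_hat = regression_calibration(12.0, mu_x=10.0, var_x=4.0, var_u=1.0)
print(x_hat)  # shrinks W = 12 toward the mean: 10 + 0.8 * 2 = 11.6
```

Fitting the logistic model on the calibrated values rather than the raw W is what counters the attenuation bias that the naïve approach suffers from.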
Abstract:
BACKGROUND Several treatment strategies are available for adults with advanced-stage Hodgkin's lymphoma, but studies assessing two alternative standards of care, increased-dose bleomycin, etoposide, doxorubicin, cyclophosphamide, vincristine, procarbazine, and prednisone (BEACOPPescalated) and doxorubicin, bleomycin, vinblastine, and dacarbazine (ABVD), were not powered to test differences in overall survival. To guide treatment decisions in this population of patients, we did a systematic review and network meta-analysis to identify the best initial treatment strategy. METHODS We searched the Cochrane Library, Medline, and conference proceedings for randomised controlled trials published between January, 1980, and June, 2013, that assessed overall survival in patients with advanced-stage Hodgkin's lymphoma given BEACOPPbaseline, BEACOPPescalated, BEACOPP variants, ABVD, cyclophosphamide (mechlorethamine), vincristine, procarbazine, and prednisone (C[M]OPP), hybrid or alternating chemotherapy regimens with ABVD as the backbone (eg, COPP/ABVD, MOPP/ABVD), or doxorubicin, vinblastine, mechlorethamine, vincristine, bleomycin, etoposide, and prednisone combined with radiation therapy (the Stanford V regimen). We assessed studies for eligibility, extracted data, and assessed their quality. We then pooled the data and used a Bayesian random-effects model to combine direct comparisons with indirect evidence. We also reconstructed individual patient survival data from published Kaplan-Meier curves and did standard random-effects Poisson regression. Results are reported relative to ABVD. The primary outcome was overall survival. FINDINGS We screened 2055 records and identified 75 papers covering 14 eligible trials that assessed 11 different regimens in 9993 patients, providing 59 651 patient-years of follow-up. 1189 patients died, and the median follow-up was 5·9 years (IQR 4·9-6·7).
Included studies were of high methodological quality, and between-trial heterogeneity was negligible (τ(2)=0·01). Overall survival was highest in patients who received six cycles of BEACOPPescalated (HR 0·38, 95% credibility interval [CrI] 0·20-0·75). Compared with a 5-year survival of 88% for ABVD, the survival benefit for six cycles of BEACOPPescalated is 7% (95% CrI 3-10), that is, a 5-year survival of 95%. Reconstructed individual survival data showed that, at 5 years, BEACOPPescalated has a 10% (95% CI 3-15) advantage over ABVD in overall survival. INTERPRETATION Six cycles of BEACOPPescalated significantly improves overall survival compared with ABVD and other regimens, and thus we recommend this treatment strategy as the standard of care for patients with access to appropriate supportive care.