904 results for Statistic nonparametric


Relevance: 10.00%

Abstract:

INTRODUCTION: Enhanced inflammatory-oxidative status is well established in chronic kidney disease. OBJECTIVE: The objective of this study was to evaluate the oxidative-inflammatory status and iron indices in patients undergoing maintenance hemodialysis (HD) with serum ferritin lower than 500 ng/mL, and to correlate them with nutritional status. METHOD: In a cross-sectional survey, 35 HD patients (23 with normal nutritional status, 12 with Protein-Energy Wasting syndrome, PEW) and 35 healthy volunteers were studied. Serum concentrations of iron, ferritin, transferrin saturation, malondialdehyde (MDA), protein carbonyl (PC), and high-sensitivity serum C-reactive protein (hs-CRP), as well as blood counts, were determined. Nutritional status was determined by anthropometric and biochemical criteria. RESULTS: HD patients showed lower values of hemoglobin and higher values of ferritin, MDA, and PC when compared with healthy volunteers. HD subjects with PEW had higher values of PC and hs-CRP compared to HD patients with normal nutritional status. A multiple logistic regression analysis showed that the independent variables PC (Wald statistic 4.25, p = 0.039) and hs-CRP (Wald statistic 4.83, p = 0.028) were related to the patients' nutritional condition. CONCLUSION: In HD patients with serum ferritin below 500 ng/mL, markers of oxidative stress and inflammation were associated with poor nutritional status independently of serum ferritin, gender, and age.
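A minimal sketch of the kind of logistic regression analysis described, showing how a per-coefficient Wald statistic is computed. All data, coefficient values, and variable names below are simulated stand-ins, not the study's data.

```python
# Sketch: logistic regression of nutritional status (1 = PEW) on protein
# carbonyl (PC) and hs-CRP, reporting per-coefficient Wald statistics.
# Data are simulated; only the variable names mirror the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 35
pc = rng.normal(1.0, 0.3, n)         # protein carbonyl, arbitrary units
crp = rng.lognormal(0.5, 0.4, n)     # hs-CRP, arbitrary units
logit_p = -4 + 2.0 * pc + 0.8 * crp  # invented true model
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([pc, crp]))
fit = sm.Logit(y, X).fit(disp=0)

# Wald statistic for each coefficient: (estimate / std. error)^2,
# asymptotically chi-squared with 1 df under H0: beta = 0.
wald = (fit.params / fit.bse) ** 2
for name, w, p in zip(["const", "PC", "hs-CRP"], wald, fit.pvalues):
    print(f"{name}: Wald = {w:.2f}, p = {p:.3f}")
```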

Relevance: 10.00%

Abstract:

In a steel foundry's delivery process, delivery reliability is a key production indicator. Because of the numerous variables in foundry production, production planning and throughput management are challenging. The large number of products, cast materials, and combinations of the two simultaneously in production makes production hard to predict and affects throughput and delivery reliability. In addition, bottlenecks appearing at different production stages limit throughput and increase lead times. Bottlenecks can be exploited in production control if they are clearly identifiable. Shifting capacity from a non-bottleneck to a bottleneck increases production throughput. Managing bottlenecks alone does not improve delivery reliability if the amount of work in progress is large and its sequencing is wrong. Improving the visibility of production workloads at all work stages makes it possible to control production more effectively. This bachelor's thesis examines a steel foundry's manufacturing process and identifies the bottlenecks that appear at different stages of production. The investigation made use of TOC (Theory of Constraints) analysis. The amount of work in progress was measured with several samples at different work stages. By analyzing the results, the problem areas of production and the development measures they require were identified.

Relevance: 10.00%

Abstract:

This bachelor's thesis studied the profitability of electricity demand response in an industrial property. The study was carried out by examining electricity consumption reports from 2014 and 2015 together with realized Elspot prices. In addition, measurements were made on air-source heat pumps to assess their suitability for demand response. The thesis reviews the effects of demand response and the different load-control options from the customer's perspective. It also presents how the Elspot price is formed and the factors that affect it. The thesis compares the example property's electricity consumption with realized Elspot prices, and investigates the control potential of heating loads from a demand-response perspective. To map this potential, measurements were made and example calculations of the profitability of demand response were prepared. The results show that the savings achieved at an annual level remain small. Considering the risks in Elspot price fluctuations, switching to hourly-priced electricity is not profitable for the example property. Since the additional savings from controlling heating loads according to hourly data remain minor, it is not sensible to invest in separate load-control systems for the property.

Relevance: 10.00%

Abstract:

The effects of two types of small-group communication, synchronous computer-mediated and face-to-face, on the quantity and quality of verbal output were compared. Quantity was defined as the number of turns taken per minute, the number of Analysis-of-Speech units (AS-units) produced per minute, and the number of words produced per minute. Quality was defined as the number of words produced per AS-unit. In addition, the interaction of gender and type of communication was explored for any differences that existed in the output produced. Questionnaires were also given to participants to determine attitudes toward computer-mediated and face-to-face communication. Thirty intermediate-level students from the Intensive English Language Program (IELP) at Brock University participated in the study, including 15 females and 15 males. Nonparametric tests, including the Wilcoxon matched-pairs test, Mann-Whitney U test, and Friedman test, were used to test for significance at the p < .05 level. No significant differences were found in the effects of computer-mediated and face-to-face communication on the output produced during follow-up speaking sessions. However, the quantity and quality of interaction were significantly higher during face-to-face sessions than computer-mediated sessions. No significant differences were found in the output produced by males and females in these two conditions. While participants felt that the use of computer-mediated communication may aid in the development of certain language skills, they generally preferred face-to-face communication. These results differed from previous studies that found a greater quantity and quality of output, in addition to a greater equality of interaction, during computer-mediated sessions in comparison to face-to-face sessions (Kern, 1995; Warschauer, 1996).
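A minimal sketch of the three nonparametric tests named in the abstract, applied to simulated per-participant output rates; the data, effect sizes, and grouping are invented for illustration only.

```python
# Sketch: the Wilcoxon matched-pairs, Mann-Whitney U, and Friedman tests
# on simulated output rates (e.g., words per minute).
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu, friedmanchisquare

rng = np.random.default_rng(1)
n = 30
face_to_face = rng.normal(12.0, 3.0, n)                 # FTF sessions
computer_med = face_to_face - rng.normal(2.0, 1.5, n)   # same participants, CMC

# Wilcoxon matched-pairs test: paired comparison within participants.
print(wilcoxon(face_to_face, computer_med))

# Mann-Whitney U test: independent comparison, e.g. males vs. females.
males, females = computer_med[:15], computer_med[15:]
print(mannwhitneyu(males, females))

# Friedman test: three or more related measurements, e.g. repeated sessions.
s1, s2, s3 = rng.normal(10, 2, n), rng.normal(11, 2, n), rng.normal(12, 2, n)
print(friedmanchisquare(s1, s2, s3))
```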

Relevance: 10.00%

Abstract:

This project examines students in a private school in southwestern Ontario on a 17-day Costa Rica Outward Bound Rainforest multielement course. The study attempted to discover whether voluntary teenage participants could increase their self-perceptions of life effectiveness by participating in a 17-day expedition. A total of 9 students participated in the study. The experimental design that was implemented was a mixed methods design. Participants filled in a Life Effectiveness Questionnaire (LEQ) at four predesignated times during the study. These time intervals occurred (a) before the trip commenced, (b) the first day of the trip, (c) the last day of the trip, and (d) 1 month after the trip ended. Fieldnotes and recordings from informal group debriefing sessions were also used to gather information. Data collected in this study were analyzed in a variety of ways by the researcher. Analyses that were run on the data included the Friedman test, means, medians, and the Wilcoxon matched-pairs test. The questionnaires were analyzed quantitatively, and the fieldnotes were analyzed qualitatively. Nonparametric statistical analysis was implemented as a result of the small group size of participants. Both sets of data were grouped and discussed according to similarities and differences. The data indicate that voluntary teenage participants experience significant changes over time in the areas of time management, social competency, emotional control, active initiative, and self-confidence. The types of outcomes from this study illustrate that Outward Bound-type opportunities should be offered to teenagers in Ontario schools as a means to bring about self-development.
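A minimal sketch of the repeated-measures analysis described: a Friedman test across the four LEQ administrations, with a Wilcoxon follow-up on one pair. The scores below are simulated; only the sample size (n = 9) and the four time points come from the abstract.

```python
# Sketch: Friedman omnibus test over four related measurements,
# followed by a paired Wilcoxon comparison. Simulated LEQ scores.
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(2)
n = 9
pre      = rng.normal(5.5, 0.8, n)            # before the trip
day1     = pre + rng.normal(0.1, 0.3, n)      # first day of the trip
last_day = pre + rng.normal(0.8, 0.4, n)      # last day of the trip
month1   = pre + rng.normal(0.6, 0.4, n)      # 1 month after

# Omnibus test across the four related measurements.
stat, p = friedmanchisquare(pre, day1, last_day, month1)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")

# Follow-up paired comparison, e.g. pre-trip vs. last day.
print(wilcoxon(pre, last_day))
```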

Relevance: 10.00%

Abstract:

The purpose of this study is to examine the impact of the choice of cut-off points, sampling procedures, and the business cycle on the accuracy of bankruptcy prediction models. Misclassification can result in erroneous predictions leading to prohibitive costs to firms, investors, and the economy. To test the impact of the choice of cut-off points and sampling procedures, three bankruptcy prediction models are assessed: Bayesian, Hazard, and Mixed Logit. A salient feature of the study is that the analysis includes both parametric and nonparametric bankruptcy prediction models. A sample of firms from the Lynn M. LoPucki Bankruptcy Research Database in the U.S. was used to evaluate the relative performance of the three models. The choice of cut-off point and sampling procedures were found to affect the rankings of the various models. In general, the results indicate that the empirical cut-off point estimated from the training sample resulted in the lowest misclassification costs for all three models. Although the Hazard and Mixed Logit models resulted in lower costs of misclassification in the randomly selected samples, the Mixed Logit model did not perform as well across varying business cycles. In general, the Hazard model has the highest predictive power. However, the higher predictive power of the Bayesian model, when the ratio of the cost of Type I errors to the cost of Type II errors is high, is relatively consistent across all sampling methods. Such an advantage of the Bayesian model may make it more attractive in the current economic environment. This study extends recent research comparing the performance of bankruptcy prediction models by identifying the conditions under which a model performs better. It also addresses the concerns of a range of user groups, including auditors, shareholders, employees, suppliers, rating agencies, and creditors, with respect to assessing failure risk.
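A minimal sketch of estimating an empirical cut-off point from a training sample by minimizing misclassification cost, the selection rule the abstract reports as best. The scores, prevalence, and cost ratio are illustrative assumptions, not values from the study.

```python
# Sketch: choose the cut-off on predicted bankruptcy probabilities that
# minimizes total misclassification cost on a training sample.
import numpy as np

rng = np.random.default_rng(3)
y = rng.binomial(1, 0.1, 2000)                       # 1 = bankrupt
scores = np.clip(0.1 + 0.5 * y + rng.normal(0, 0.2, 2000), 0, 1)

cost_type1 = 10.0   # cost of classifying a bankrupt firm as healthy
cost_type2 = 1.0    # cost of classifying a healthy firm as bankrupt

def total_cost(cutoff):
    pred = scores >= cutoff
    type1 = np.sum((y == 1) & ~pred)   # missed bankruptcies
    type2 = np.sum((y == 0) & pred)    # false alarms
    return cost_type1 * type1 + cost_type2 * type2

grid = np.linspace(0.01, 0.99, 99)
best = grid[np.argmin([total_cost(c) for c in grid])]
print(f"empirical cut-off = {best:.2f}, cost = {total_cost(best):.0f}")
```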

Relevance: 10.00%

Abstract:

This research evaluated (a) the correlations between math anxiety, math attitudes, and achievement in math and (b) comparisons among these variables by gender among grade 9 students in a high school located in southern Ontario. Data were compiled from participant responses to the Attitudes Toward Math Inventory (ATMI) and the Math Anxiety Rating Scale for Adolescents (MARS-A), and achievement data were gathered from participants’ grade 9 academic math course marks and the EQAO Grade 9 Assessment of Mathematics. Nonparametric tests were conducted to determine whether there were relationships between the variables and to explore whether gender differences in anxiety, attitudes, and achievement existed for this sample. Results indicated that math anxiety was not related to math achievement but was a strong correlate of attitudes toward math. A strong positive relationship was found between math attitudes and achievement in math. Specifically, self-confidence in math, enjoyment of math, value of math, and motivation were all positive correlates of achievement in math. Also, results for gender comparisons were nonsignificant, indicating that gender differences in math anxiety, math attitudes, and math achievement scores were not prevalent in this group of grade 9 students. Therefore, attitudes toward math were considered to be a stronger predictor of performance than math anxiety or gender for this group.
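A minimal sketch of the correlational analysis. The abstract does not name the specific nonparametric test, so Spearman rank correlation is used here as a common choice for this design; the data and effect sizes are simulated.

```python
# Sketch: Spearman rank correlations between simulated anxiety,
# attitude, and achievement scores (an assumed choice of test).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
n = 80
attitudes = rng.normal(0, 1, n)
achievement = 0.6 * attitudes + rng.normal(0, 0.8, n)   # positive link
anxiety = -0.7 * attitudes + rng.normal(0, 0.7, n)      # tied to attitudes

for label, (a, b) in [("anxiety vs. achievement", (anxiety, achievement)),
                      ("attitudes vs. achievement", (attitudes, achievement)),
                      ("anxiety vs. attitudes", (anxiety, attitudes))]:
    rho, p = spearmanr(a, b)
    print(f"{label}: rho = {rho:.2f}, p = {p:.4f}")
```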

Relevance: 10.00%

Abstract:

Despite being considered a disease of smokers, approximately 10-15% of lung cancer cases occur in never-smokers. Lung cancer risk prediction models have demonstrated excellent ability to discriminate cases from non-cases, and have been shown to be more efficient at selecting individuals for future screening than current criteria. Existing models have primarily been developed in populations of smokers, so there was a need to develop an accurate model for never-smokers. This study focused on developing and validating a model using never-smokers from the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial. Cox regression analysis, with six-year follow-up, was used for model building. Predictors included age, body mass index, education level, personal history of cancer, family history of lung cancer, previous chest X-ray, and secondhand smoke exposure. This model achieved fair discrimination (optimism-corrected c-statistic = 0.6645) and good calibration. This represents an improvement on existing never-smoker models, but is not suitable for individual-level risk prediction.
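A minimal sketch of fitting a Cox model and reading off its apparent c-statistic, using the third-party lifelines library. The data, coefficients, and follow-up mechanics are simulated; the bootstrap optimism correction the abstract refers to is only indicated in a comment, not run.

```python
# Sketch: Cox regression with a concordance (c-) statistic on simulated
# data loosely shaped like the abstract's setting (six-year follow-up).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "age": rng.uniform(55, 75, n),
    "bmi": rng.normal(27, 4, n),
    "shs_exposure": rng.binomial(1, 0.4, n),   # secondhand smoke (assumed coding)
})
hazard = np.exp(0.05 * (df["age"] - 65) + 0.3 * df["shs_exposure"])
time = rng.exponential(20 / hazard)            # latent event times
df["T"] = np.minimum(time, 6.0)                # censor at six years
df["E"] = (time <= 6.0).astype(int)            # 1 = event observed

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(f"apparent c-statistic = {cph.concordance_index_:.3f}")
# Optimism correction: refit on bootstrap resamples, average the gap
# between bootstrap and original-sample c, then subtract it from the above.
```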

Relevance: 10.00%

Abstract:

This paper studies seemingly unrelated linear models with integrated regressors and stationary errors. By adding leads and lags of the first differences of the regressors and estimating this augmented dynamic regression model by feasible generalized least squares using the long-run covariance matrix, we obtain an efficient estimator of the cointegrating vector that has a limiting mixed normal distribution. Simulation results suggest that this new estimator compares favorably with others already proposed in the literature. We apply these new estimators to the testing of purchasing power parity (PPP) among the G-7 countries. The test based on the efficient estimates rejects the PPP hypothesis for most countries.
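A minimal sketch of the augmented ("leads and lags") regression at the core of the estimator described: regress y on the integrated regressor plus leads and lags of its first differences, then read off the cointegrating coefficient. This shows only the OLS step on simulated data; the paper's feasible GLS step using the long-run covariance matrix is omitted for brevity.

```python
# Sketch: dynamic augmented regression for a cointegrating coefficient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
T, k = 400, 2                               # k leads and k lags
x = np.cumsum(rng.normal(size=T))           # integrated regressor
y = 1.5 * x + rng.normal(size=T)            # true cointegrating coef = 1.5

dx = np.diff(x, prepend=x[0])               # first differences of x
# Columns: x_t, then Delta x_{t+k}, ..., Delta x_t, ..., Delta x_{t-k}.
cols = [x] + [np.roll(dx, j) for j in range(-k, k + 1)]
Z = np.column_stack(cols)[k:T - k]          # trim rows contaminated by np.roll
fit = sm.OLS(y[k:T - k], sm.add_constant(Z)).fit()
print(f"estimated cointegrating coefficient: {fit.params[1]:.3f}")
```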

Relevance: 10.00%

Abstract:

In the context of multivariate linear regression (MLR) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose a general method for constructing exact tests of possibly nonlinear hypotheses on the coefficients of MLR systems. For the case of uniform linear hypotheses, we present exact distributional invariance results concerning several standard test criteria. These include Wilks' likelihood ratio (LR) criterion as well as trace and maximum root criteria. The normality assumption is not necessary for most of the results to hold. Implications for inference are two-fold. First, invariance to nuisance parameters entails that the technique of Monte Carlo tests can be applied on all these statistics to obtain exact tests of uniform linear hypotheses. Second, the invariance property of the latter statistic is exploited to derive general nuisance-parameter-free bounds on the distribution of the LR statistic for arbitrary hypotheses. Even though it may be difficult to compute these bounds analytically, they can easily be simulated, hence yielding exact bounds Monte Carlo tests. Illustrative simulation experiments show that the bounds are sufficiently tight to provide conclusive results with a high probability. Our findings illustrate the value of the bounds as a tool to be used in conjunction with more traditional simulation-based test methods (e.g., the parametric bootstrap) which may be applied when the bounds are not conclusive.
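A minimal sketch of the Monte Carlo test technique the paper relies on: when a statistic is pivotal under the null, its null distribution can be simulated and an exact p-value obtained. The illustration uses an LR-type statistic in a toy univariate Gaussian regression; the paper's MLR setting is more general.

```python
# Sketch: exact Monte Carlo p-value for a pivotal LR-type statistic.
import numpy as np

rng = np.random.default_rng(7)

def lr_stat(y, X):
    # LR statistic for H0: all slope coefficients are zero (Gaussian errors).
    n = len(y)
    resid_full = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    resid_null = y - y.mean()
    return n * np.log(np.sum(resid_null**2) / np.sum(resid_full**2))

n, N = 50, 999
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = rng.normal(size=n)                      # data generated under H0 here

obs = lr_stat(y, X)
# The statistic is location-scale invariant, so standard normal draws
# reproduce its exact null distribution given X.
sims = np.array([lr_stat(rng.normal(size=n), X) for _ in range(N)])
p_mc = (1 + np.sum(sims >= obs)) / (N + 1)  # exact MC p-value
print(f"LR = {obs:.2f}, Monte Carlo p = {p_mc:.3f}")
```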

Relevance: 10.00%

Abstract:

In this paper, we review some recent developments in econometrics that may be of interest to researchers in fields other than economics, and we highlight the particular light that econometrics can shed on general themes in methodology and the philosophy of science, such as falsifiability as a criterion of the scientific character of a theory (Popper), the underdetermination of theories by data (Quine), and instrumentalism. In particular, we emphasize the contrast between two styles of modeling, the parsimonious approach and the statistical-descriptive approach, and we discuss the links between the theory of statistical testing and the philosophy of science.

Relevance: 10.00%

Abstract:

A wide range of tests for heteroskedasticity has been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. There have been a number of recent studies that seek to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods, yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves intractable null distribution problems, in particular those raised by the sup-type and combined test statistics, as well as (when relevant) unidentified nuisance parameters under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation. The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable and (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; and (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
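A minimal sketch of a Monte Carlo homoskedasticity test in the spirit of the paper, using a simplified Goldfeld-Quandt-style variance-ratio statistic computed from full-sample OLS residuals (the classical test refits on subsamples). The design matrix, split point, and variance break are illustrative choices.

```python
# Sketch: MC p-value for a variance-ratio heteroskedasticity statistic.
import numpy as np

rng = np.random.default_rng(8)

def ols_resid(y, X):
    return y - X @ np.linalg.lstsq(X, y, rcond=None)[0]

def gq_stat(resid, split):
    # Ratio of residual variances in the two subsamples.
    return np.var(resid[split:]) / np.var(resid[:split])

n, split, N = 60, 30, 999
X = np.column_stack([np.ones(n), rng.normal(size=n)])
sigma = np.where(np.arange(n) < split, 1.0, 2.0)   # variance break (alternative)
y = X @ np.array([1.0, 0.5]) + sigma * rng.normal(size=n)

obs = gq_stat(ols_resid(y, X), split)
# Under homoskedastic Gaussian errors the statistic is scale-invariant,
# hence pivotal given X: simulate its exact null distribution.
sims = np.array([gq_stat(ols_resid(rng.normal(size=n), X), split)
                 for _ in range(N)])
p_mc = (1 + np.sum(sims >= obs)) / (N + 1)
print(f"variance ratio = {obs:.2f}, Monte Carlo p = {p_mc:.3f}")
```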

Relevance: 10.00%

Abstract:

In this paper, we analyze recent developments in econometrics in light of the theory of statistical testing. We first review some basic principles of the philosophy of science and of statistical theory, emphasizing parsimony and falsifiability as criteria for evaluating models, the role of testing theory as a formalization of the falsification principle for probabilistic models, and the logical justification of the basic notions of testing theory (such as the level of a test). We then show that some of the most widely used statistical and econometric methods are fundamentally inappropriate for the problems and models considered, while many hypotheses for which testing procedures are commonly proposed are in fact not testable at all. Such situations lead to ill-posed statistical problems. We analyze several particular cases of such problems: (1) the construction of confidence intervals in structural models that raise identification problems; (2) the construction of tests for nonparametric hypotheses, including procedures robust to heteroskedasticity, non-normality, or dynamic misspecification. We point out that these difficulties often stem from the ambition of weakening the regularity conditions required for any statistical analysis, as well as from an inappropriate use of asymptotic distributional results. Finally, we underscore the importance of formulating testable hypotheses and models, and of proposing econometric techniques whose properties are demonstrable in finite samples.

Relevance: 10.00%

Abstract:

We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first one uses a standard Wald-type statistic. The second approach assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets are derived by a projection technique. The latter has two advantages: first, confidence set validity is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases of transfers from Moroccan expatriates.
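A minimal sketch of the projection approach: given draws from a confidence set for the free parameters, push each draw through the model and take the componentwise range of each endogenous variable. The "model" below is a two-parameter toy stand-in, not a CGE model; the point is that nonlinearity does not affect validity and that the intervals are simultaneous.

```python
# Sketch: projection-based simultaneous confidence intervals for
# model endogenous variables under parameter uncertainty.
import numpy as np

rng = np.random.default_rng(9)

def model(theta):
    # Toy stand-in for solving the model: parameters -> two
    # endogenous outcomes (deliberately nonlinear).
    a, b = theta
    return np.array([np.exp(0.1 * a) * b, a**2 + b])

# Suppose a joint confidence set for (a, b) is available as draws
# (sampling-based or Bayesian); these values are invented.
draws = rng.multivariate_normal([1.0, 2.0], 0.01 * np.eye(2), size=5000)
outcomes = np.array([model(th) for th in draws])

# Projection: componentwise min and max give simultaneous intervals
# with at least the coverage of the underlying parameter set.
lo, hi = outcomes.min(axis=0), outcomes.max(axis=0)
for i, (l, h) in enumerate(zip(lo, hi)):
    print(f"endogenous variable {i}: [{l:.3f}, {h:.3f}]")
```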

Relevance: 10.00%

Abstract:

The GARCH and Stochastic Volatility paradigms are often brought into conflict as two competing views of the appropriate conditional variance concept: conditional variance given past values of the same series, or conditional variance given a larger past information set (possibly including unobservable state variables). The main thesis of this paper is that, since in general the econometrician has no idea about anything like a structural level of disaggregation, a well-written volatility model should be specified in such a way that one is always allowed to reduce the information set without invalidating the model. In this respect, the debate between observable past information (in the GARCH spirit) and unobservable conditioning information (in the state-space spirit) is irrelevant. In this paper, we stress a square-root autoregressive stochastic volatility (SR-SARV) model which remains true to the GARCH paradigm of ARMA dynamics for squared innovations but weakens the GARCH structure in order to obtain the required robustness properties with respect to various kinds of aggregation. It is shown that the lack of robustness of the usual GARCH setting is due to two very restrictive assumptions: perfect linear correlation between squared innovations and the conditional variance on the one hand, and a linear relationship between the conditional variance of the future conditional variance and the squared conditional variance on the other. By relaxing these assumptions, thanks to a state-space setting, we obtain aggregation results without renouncing the conditional variance concept (and related leverage effects), as is the case for the recently suggested weak GARCH model, which obtains aggregation results by replacing conditional expectations with linear projections on symmetric past innovations. Moreover, unlike the weak GARCH literature, we are able to define multivariate models, including higher-order dynamics and risk premiums (in the spirit of GARCH(p,p) and GARCH-in-mean), and to derive conditional moment restrictions well suited for statistical inference. Finally, we are able to characterize the exact relationships between our SR-SARV models (including higher-order dynamics, leverage effects, and in-mean effects), usual GARCH models, and continuous-time stochastic volatility models, so that previous results about aggregation of weak GARCH and continuous-time GARCH modeling can be recovered in our framework.
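A minimal sketch of the GARCH(1,1) recursion that the paper takes as its starting point, in which the conditional variance follows ARMA-type dynamics in squared innovations. The parameter values are arbitrary illustrative choices.

```python
# Sketch: simulate a GARCH(1,1) process and check its unconditional variance.
import numpy as np

rng = np.random.default_rng(10)
omega, alpha, beta = 0.05, 0.08, 0.90     # persistence alpha + beta < 1
T = 1000
eps = np.empty(T)                          # innovations
h = np.empty(T)                            # conditional variances
h[0] = omega / (1 - alpha - beta)          # start at unconditional variance

for t in range(T):
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()
    if t + 1 < T:
        # GARCH(1,1): h_{t+1} = omega + alpha * eps_t^2 + beta * h_t
        h[t + 1] = omega + alpha * eps[t] ** 2 + beta * h[t]

print(f"sample variance = {eps.var():.3f}, "
      f"theoretical = {omega / (1 - alpha - beta):.3f}")
```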