937 results for Process control -- Statistical methods


Relevance:

100.00%

Publisher:

Abstract:

The aim of this three-phase study was to develop the quality of radiotherapy care through the e-Feedback knowledge of radiotherapy intervention (e-Re-Know). In Phase I, the purpose was to describe the quality of radiotherapy care and its deficits as experienced by cancer patients. Based on the deficits in patient education identified in Phase I, the purpose of Phase II was to describe cancer patients' e-knowledge expectations in radiotherapy. In Phase III, the purpose was to develop the e-Re-Know and evaluate its outcomes among breast cancer patients. The ultimate aim was to develop radiotherapy care that supports patients' empowerment through patient e-education. In Phase I (2004-2005), a descriptive design was used: 134 radiotherapy patients evaluated their experiences with the Good Nursing Care Scale for Patients (GNCS-P) in the middle of the RT period. In Phase II (2006-2008), a descriptive longitudinal design was used: 100 radiotherapy patients' e-knowledge expectations of RT were evaluated with an open-ended questionnaire developed for this study before the first RT, in the middle of treatment, and at the end of the RT period. In Phase III, first (2009-2010), the e-Re-Know intervention, i.e. a knowledge test with feedback, was developed in terms of empowering knowledge and implemented as an e-feedback approach based on the literature and expert reviews. Second (2011-2014), a randomized controlled study was used to evaluate the e-Re-Know. Breast cancer patients were randomized to either an intervention group (n=65) receiving the e-Re-Know by e-mail before the first RT plus standard education, or a control group (n=63) receiving standard education only. Data were collected before the first RT, at the last RT and 3 months after the last RT using the RT Knowledge Test, Spielberger's State-Trait Anxiety Inventory (STAI) and the Functional Assessment of Cancer Therapy - Breast (FACT-B) instruments. Data were analyzed using statistical methods and content analysis. 
The study showed that radiotherapy patients experienced the quality of care as high. However, there were deficits in patient education. Furthermore, radiotherapy patients' multidimensional e-knowledge expectations via the Internet covered mainly bio-physiological and functional knowledge. Thus, the e-Re-Know was developed and evaluated. The study showed that when breast cancer patients carried out the e-Re-Know, their knowledge of side-effect self-care increased significantly and quality of life (QOL) improved significantly, in line with a decrease in anxiety from the time before the radiotherapy period to three months after it. In addition, the e-Re-Know has the potential to have positive effects on anxiety and QOL regardless of patient characteristics or knowledge level. The results support the theory of empowering patient education, suggesting that empowerment can be supported by confirming patients' understanding of their own knowledge level. In summary, the e-Feedback knowledge of radiotherapy (e-Re-Know) intervention can be recommended for developing the quality of radiotherapy care experienced by breast cancer patients. Further research is needed to assess and develop patient-centred quality of care through patient education among cancer patients.

Relevance:

100.00%

Publisher:

Abstract:

This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The code's capability to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator. The major challenge here lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 specialty, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This method essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.

Relevance:

100.00%

Publisher:

Abstract:

This study examines lower secondary school choices in Turku, Finland. The focus is on how the parents of Turku children born in 1997 talk about school choice, both in general and with regard to their own child, on their agency within the local institutional school-choice space, and on the justifications, meanings, values and appreciations the parents attach to their child's education and school choice. In addition, the study examines the school-choice strategies that families construct through talk and action, mirroring these against the mothers' educational and social resources and the local policy. The study speaks not only of school choices made in a local context but, more broadly, of the hierarchies and values prevailing in society and of the normative ways of acting linked to education and socioeconomic position. The study uses interview and survey data. The data were collected as part of two larger research projects funded by the Academy of Finland and conducted in collaboration between the Universities of Helsinki and Turku: Vanhemmat ja kouluvalinta – Perheiden koulutusstrategiat, eriarvoistuminen ja paikalliset koulupolitiikat suomalaisessa peruskoulussa (VAKOVA) 2009–2012 and Parents and School Choice. Family Strategies, Segregation and School Policies in Chilean and Finnish Basic Schooling (PASC) 2010–2013. The research data consist of interviews with 87 Turku mothers and of survey data. Descriptive statistical methods were used to analyze the survey data, which serve primarily as background for the interview data. The analysis of the interview data is mainly based on thematic analysis, complemented by a discursive approach in the analysis of actor positions. 
Based on the descriptions of children's education and school choices that emerged from the interview data, the families' lower secondary school choices were divided into three types of choice strategy: a traditional neighbourhood-school strategy (n=41), an ambivalent school-choice strategy (n=23) and a goal-oriented school-choice strategy (n=23). Each of the three strategies contained two kinds of agency in the school-choice field. The grouping by strategy and actor position was based on the mothers' way of talking about school choices and, more generally, about the meanings and values attached to education, as well as on their concrete actions regarding school choice. The children of families favouring the neighbourhood-school strategy moved on to the general class of their own school. These families acted in the choice field guided by the city's restrictive policy, so their school choice appeared passive. Moving to the school assigned by home address was justified on practical grounds: the length of the school journey, transport connections and the child's friendships. Education guaranteed equally to all was seen as a precondition of the welfare state, and the equality of opportunity that defines the traditional comprehensive school was still trusted. Raising the child to be a well-balanced and happy person was seen as one important task of education. These parents' actions followed the traditional school-choice strategy. Among families using the ambivalent strategy, action in the school-choice field took two forms. The mothers either considered school choices or compared schools and the chances of admission to them realistically, balancing between the steering and the enabling policy. What mattered most was being aware of the city's school-choice policy and of the fact that choices can matter for the child's school path. After weighing the alternatives, the most common outcome was a class with a special emphasis at the neighbourhood school. 
The parents wanted to enrich the child's basic education with emphasised teaching and hoped the child would be placed in a motivated class with a positive attitude to learning. Choices were made within the local policy, hoping for the best for the child. The task of education was seen as the child's intellectual growth, intertwined with the well-being and happiness that education produces. The families' strategy thus became an ambivalent one aimed at finding a motivating learning environment. Parents using the goal-oriented strategy actively exploited different routes into particular lower secondary schools. The children of anticipating families had attended a primary school that did not belong to the catchment area of their lower secondary school but guaranteed the child a route to a popular one. Determined families, in turn, woke up to the choices at the transition to lower secondary school, seeking a place outside the neighbourhood school on the basis of the most suitable emphasised teaching and the school's reputation. The neighbourhood-school principle was felt to be unjust: the child should have the right to realise his or her abilities and talent in a selected group of pupils, and families should have the opportunity to choose the child's school. The local policy did not appear to restrict these parents' school choices. The purpose of education was seen as intellectual growth and an academically cultivating mission. The aim of the goal-oriented strategy was to secure a habitus suitable for the family. The local policy enabled the construction of the parents' different school-choice strategies, steering primarily towards the neighbourhood lower secondary school while at the same time making school choice possible through the criteria of secondary application. The parents' educational values, their cultural and social resources, and the ways these were used were intertwined with the school-choice strategies and ways of acting in the school-choice field.

Relevance:

100.00%

Publisher:

Abstract:

This research concerns different statistical methods that help increase the demand forecasting accuracy of company X's forecasting model. The current forecasting process was analyzed in detail and, as a result, a graphical scheme of its logical algorithm was developed. Based on the analysis of the algorithm and of the forecasting errors, all potential directions for future improvement of the model's accuracy were gathered into a complete list. Three improvement directions were chosen for further practical research; on their basis, three test models were created and verified. The novelty of this work lies in the methodological approach of the original analysis of the model, which identified its critical points, as well as in the uniqueness of the developed test models. The results of the study formed the basis of a grant of the Government of St. Petersburg.

Relevance:

100.00%

Publisher:

Abstract:

In vivo proton magnetic resonance spectroscopy (¹H-MRS) is a technique capable of assessing biochemical content and pathways in normal and pathological tissue. In the brain, ¹H-MRS complements the information given by magnetic resonance images. The main goal of the present study was to assess the accuracy of ¹H-MRS for the classification of brain tumors in a pilot study comparing results obtained by manual and semi-automatic quantification of metabolites. In vivo single-voxel ¹H-MRS was performed in 24 control subjects and 26 patients with brain neoplasms that included meningiomas, high-grade neuroglial tumors and pilocytic astrocytomas. Seven metabolite groups (lactate, lipids, N-acetyl-aspartate, glutamate and glutamine group, total creatine, total choline, myo-inositol) were evaluated in all spectra by two methods: a manual one consisting of integration of manually defined peak areas, and the advanced method for accurate, robust and efficient spectral fitting (AMARES), a semi-automatic quantification method implemented in the jMRUI software. Statistical methods included discriminant analysis and the leave-one-out cross-validation method. Both manual and semi-automatic analyses detected differences in metabolite content between tumor groups and controls (P < 0.005). The classification accuracy obtained with the manual method was 75% for high-grade neuroglial tumors, 55% for meningiomas and 56% for pilocytic astrocytomas, while for the semi-automatic method it was 78, 70, and 98%, respectively. Both methods classified all control subjects correctly. The study demonstrated that ¹H-MRS accurately differentiated normal from tumoral brain tissue and confirmed the superiority of the semi-automatic quantification method.
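The leave-one-out cross-validation scheme used in this study can be sketched in a few lines. The sketch below substitutes a simple nearest-centroid rule for the full discriminant analysis, and the metabolite vectors in the usage example are invented placeholders, not data from the study:

```python
def loo_accuracy(samples, labels):
    """Leave-one-out cross-validation with a nearest-centroid classifier:
    each sample is classified using class centroids computed from all the
    other samples, and the fraction of correct predictions is returned."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    correct = 0
    for i in range(len(samples)):
        # group every sample except the held-out one by its class label
        groups = {}
        for j, (s, lab) in enumerate(zip(samples, labels)):
            if j != i:
                groups.setdefault(lab, []).append(s)
        # per-class centroids (component-wise means)
        centroids = {lab: [sum(col) / len(col) for col in zip(*rows)]
                     for lab, rows in groups.items()}
        # classify the held-out sample by nearest centroid
        pred = min(centroids, key=lambda lab: sqdist(samples[i], centroids[lab]))
        correct += pred == labels[i]
    return correct / len(samples)

# Two well-separated toy "metabolite profiles" (placeholder values)
samples = [[0, 0], [0.1, 0], [0, 0.1], [5, 5], [5.1, 5], [5, 5.1]]
labels = ["control", "control", "control", "tumor", "tumor", "tumor"]
print(loo_accuracy(samples, labels))
```

Because each prediction is made without the held-out sample, the resulting accuracy is an honest estimate even for small samples such as the cohorts described here.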

Relevance:

100.00%

Publisher:

Abstract:

Beef can be contaminated during the slaughter process, so methods besides the traditional water washing must be adopted to preserve meat safety. The objective of this study was to evaluate the effect of 2% acetic acid interventions on the reduction of indicator bacteria on beef carcasses at a commercial slaughterhouse in Mexico. Reduction was measured by the counts of mesophilic aerobic bacteria (TPC), total coliforms (TC), and fecal coliforms (FC) (log CFU/cm²). Among the different interventions tested, treatments combining an acetic acid spray with carcass water washing achieved the greatest microbial reduction. Acetic acid solution sprayed at low pressure for a longer time (10-30 psi/60 s) reached higher TPC, TC, and FC reductions than those obtained under high pressure for a shorter time (1,700 psi/15 s; P<0.05). Exposure time significantly affected microbial reduction on carcasses. Acetic acid solution sprayed after carcass washing can be successfully used to control sources of indicator bacteria on beef carcasses under commercial conditions.

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes finite-sample procedures for testing the SURE specification in multi-equation regression models, i.e. whether the disturbances in different equations are contemporaneously uncorrelated or not. We apply the technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] to obtain exact tests based on standard LR and LM zero correlation tests. We also suggest a MC quasi-LR (QLR) test based on feasible generalized least squares (FGLS). We show that the latter statistics are pivotal under the null, which provides the justification for applying MC tests. Furthermore, we extend the exact independence test proposed by Harvey and Phillips (1982) to the multi-equation framework. Specifically, we introduce several induced tests based on a set of simultaneous Harvey/Phillips-type tests and suggest a simulation-based solution to the associated combination problem. The properties of the proposed tests are studied in a Monte Carlo experiment which shows that standard asymptotic tests exhibit important size distortions, while MC tests achieve complete size control and display good power. Moreover, MC-QLR tests performed best in terms of power, a result of interest from the point of view of simulation-based tests. The power of the MC induced tests improves appreciably in comparison to standard Bonferroni tests and, in certain cases, outperforms the likelihood-based MC tests. The tests are applied to data used by Fischer (1993) to analyze the macroeconomic determinants of growth.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we develop finite-sample inference procedures for stationary and nonstationary autoregressive (AR) models. The method is based on special properties of Markov processes and a split-sample technique. The results on Markovian processes (intercalary independence and truncation) only require the existence of conditional densities. They are proved for possibly nonstationary and/or non-Gaussian multivariate Markov processes. In the context of a linear regression model with AR(1) errors, we show how these results can be used to simplify the distributional properties of the model by conditioning a subset of the data on the remaining observations. This transformation leads to a new model which has the form of a two-sided autoregression to which standard classical linear regression inference techniques can be applied. We show how to derive tests and confidence sets for the mean and/or autoregressive parameters of the model. We also develop a test on the order of an autoregression. We show that a combination of subsample-based inferences can improve the performance of the procedure. An application to U.S. domestic investment data illustrates the method.
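The conditioning idea can be illustrated on a zero-mean Gaussian AR(1): given the even-indexed observations, the odd-indexed ones are independent (intercalary independence), and each depends on its two neighbours only through their sum, with coefficient phi/(1 + phi^2), which is the two-sided autoregression form. The toy simulation settings below are invented for illustration:

```python
import random

def simulate_ar1(n, phi, rng):
    """Zero-mean Gaussian AR(1): y_t = phi*y_{t-1} + e_t, e_t ~ N(0, 1)."""
    y, prev = [], 0.0
    for _ in range(n):
        prev = phi * prev + rng.gauss(0, 1)
        y.append(prev)
    return y

def two_sided_slope(y):
    """No-intercept OLS slope of y_t on (y_{t-1} + y_{t+1}) over odd t.
    For a Gaussian AR(1), E[y_t | y_{t-1}, y_{t+1}] = c * (y_{t-1} + y_{t+1})
    with c = phi / (1 + phi**2), and the odd-indexed observations are
    conditionally independent given the even-indexed ones."""
    num = den = 0.0
    for t in range(1, len(y) - 1, 2):
        s = y[t - 1] + y[t + 1]
        num += s * y[t]
        den += s * s
    return num / den

def phi_from_slope(c):
    """Invert c = phi / (1 + phi**2) on the stationary region |phi| < 1."""
    return (1 - (1 - 4 * c * c) ** 0.5) / (2 * c)

rng = random.Random(7)
y = simulate_ar1(20001, 0.5, rng)
print(phi_from_slope(two_sided_slope(y)))  # close to the true phi = 0.5
```

The point of the transformation is exactly what the abstract states: after conditioning, standard classical regression inference applies to the two-sided model, and the AR parameter can be recovered from the two-sided coefficient.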

Relevance:

100.00%

Publisher:

Abstract:

A wide range of tests for heteroskedasticity has been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. A number of recent studies have sought to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods. Yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values, for both the standard and the newly suggested tests. We show that the MC test procedure conveniently solves the intractable null distribution problem, in particular the problems raised by the sup-type and combined test statistics as well as (when relevant) unidentified nuisance parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation. 
The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with (i) one exogenous variable or (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
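One of the listed criteria, the Goldfeld-Quandt variance ratio, lends itself to a compact sketch of the Monte Carlo procedure: because OLS residuals do not depend on the regression coefficients, the statistic is pivotal under Gaussian i.i.d. errors and its null distribution can be simulated from pure noise. The simple-regression setup, sample size and replication count below are illustrative choices, not the paper's:

```python
import random

def ols_ssr(x, y):
    """Residual sum of squares from simple OLS of y on x with an intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    return sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y))

def gq_stat(x, y):
    """Goldfeld-Quandt ratio: residual variance of the upper half of the
    sample over that of the lower half, observations ordered by x."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    half = len(x) // 2
    lo, hi = order[:half], order[half:]
    s_lo = ols_ssr([x[i] for i in lo], [y[i] for i in lo])
    s_hi = ols_ssr([x[i] for i in hi], [y[i] for i in hi])
    return (s_hi / (len(hi) - 2)) / (s_lo / (len(lo) - 2))

def mc_gq_pvalue(x, y, n_rep=99, seed=1):
    """Monte Carlo p-value: the statistic is invariant to the regression
    coefficients and error scale, so the null can be simulated as N(0,1) noise."""
    rng = random.Random(seed)
    obs = gq_stat(x, y)
    exceed = sum(gq_stat(x, [rng.gauss(0, 1) for _ in x]) >= obs
                 for _ in range(n_rep))
    return (1 + exceed) / (n_rep + 1)
```

With 99 replications, rejecting when the p-value is at most 0.05 gives an exactly 5% test, no matter how small the sample.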

Relevance:

100.00%

Publisher:

Abstract:

We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin's q and to a model of academic performance.

Relevance:

100.00%

Publisher:

Abstract:

In the context of multivariate linear regression (MLR) and seemingly unrelated regressions (SURE) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose finite- and large-sample likelihood-based test procedures for possibly non-linear hypotheses on the coefficients of MLR and SURE systems.

Relevance:

100.00%

Publisher:

Abstract:

We provide a theoretical framework to explain the empirical finding that the estimated betas are sensitive to the sampling interval even when using continuously compounded returns. We suppose that stock prices have both permanent and transitory components. The permanent component is a standard geometric Brownian motion while the transitory component is a stationary Ornstein-Uhlenbeck process. The discrete time representation of the beta depends on the sampling interval and two components labelled "permanent and transitory betas". We show that if no transitory component is present in stock prices, then no sampling interval effect occurs. However, the presence of a transitory component implies that the beta is an increasing (decreasing) function of the sampling interval for more (less) risky assets. In our framework, assets are labelled risky if their "permanent beta" is greater than their "transitory beta" and vice versa for less risky assets. Simulations show that our theoretical results provide good approximations for the means and standard deviations of estimated betas in small samples. Our results can be perceived as indirect evidence for the presence of a transitory component in stock prices, as proposed by Fama and French (1988) and Poterba and Summers (1988).
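A stylized, back-of-the-envelope version of this sampling-interval effect can be written in closed form if one assumes the market factor is the sum of a random walk (the permanent part) and a stationary AR(1) (a discrete-time stand-in for the Ornstein-Uhlenbeck transitory part), with the asset loading on the two parts through separate permanent and transitory betas. The function and parameter values below illustrate those assumptions; they are not the paper's exact continuous-time model:

```python
def beta_h(h, b_perm, b_trans, sigma2_perm, var_trans, rho):
    """Beta computed from h-period returns when the market factor is a random
    walk (per-period variance sigma2_perm) plus a stationary AR(1) transitory
    component (stationary variance var_trans, persistence rho).  The h-period
    permanent return has variance sigma2_perm*h; the h-period transitory
    return has variance 2*var_trans*(1 - rho**h), which stays bounded in h."""
    vp = sigma2_perm * h
    vt = 2 * var_trans * (1 - rho ** h)
    return (b_perm * vp + b_trans * vt) / (vp + vt)

# A "risky" asset (permanent beta 1.5 above transitory beta 0.5): the computed
# beta rises with the sampling interval toward the permanent beta.
print(beta_h(1, 1.5, 0.5, 0.01, 0.05, 0.9))    # short sampling interval
print(beta_h(250, 1.5, 0.5, 0.01, 0.05, 0.9))  # long sampling interval
```

Because the permanent return variance grows linearly in h while the transitory part is bounded, the weight shifts toward the permanent beta as the interval lengthens; swapping the two betas (a "less risky" asset) reverses the direction, matching the abstract's characterization.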

Relevance:

100.00%

Publisher:

Abstract:

This note investigates the adequacy of the finite-sample approximation provided by the Functional Central Limit Theorem (FCLT) when the errors are allowed to be dependent. We compare the distribution of the scaled partial sums of some data with the distribution of the Wiener process to which it converges. Our setup is purposely very simple in that it considers data generated from an ARMA(1,1) process. Yet, this is sufficient to bring out interesting conclusions about the particular elements which cause the approximations to be inadequate in even quite large sample sizes.
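The comparison the note makes can be reproduced in miniature: scale the partial sum of an ARMA(1,1) sample by the square root of the sample size and the long-run standard deviation, and check how close its distribution is to the standard normal limit implied by the FCLT. The parameter values below are illustrative; the note's point is precisely that pushing the AR parameter toward one degrades the match even in large samples:

```python
import random

def arma11(n, phi, theta, rng):
    """Simulate x_t = phi*x_{t-1} + e_t + theta*e_{t-1} with e_t ~ N(0, 1),
    started at x_0 = 0 (the start-up transient is negligible for small phi)."""
    x, x_prev, e_prev = [], 0.0, 0.0
    for _ in range(n):
        e = rng.gauss(0, 1)
        x_prev = phi * x_prev + e + theta * e_prev
        e_prev = e
        x.append(x_prev)
    return x

def scaled_partial_sum(n, phi, theta, rng):
    """T^{-1/2} * S_T / sigma_lr, which the FCLT sends to N(0, 1);
    sigma_lr = (1 + theta) / (1 - phi) for unit innovation variance."""
    lr_sd = (1 + theta) / (1 - phi)
    return sum(arma11(n, phi, theta, rng)) / (n ** 0.5 * lr_sd)

rng = random.Random(0)
draws = [scaled_partial_sum(200, 0.5, 0.3, rng) for _ in range(2000)]
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
print(mean, var)  # both should be close to the N(0, 1) values 0 and 1
```

For these mildly dependent parameters the simulated mean and variance already sit close to their limiting values; replacing phi = 0.5 with a near-unit value shows the kind of inadequacy the note documents.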

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we propose several finite-sample specification tests for multivariate linear regressions (MLR) with applications to asset pricing models. We focus on departures from the assumption of i.i.d. errors, at univariate and multivariate levels, with Gaussian and non-Gaussian (including Student t) errors. The univariate tests studied extend existing exact procedures by allowing for unspecified parameters in the error distributions (e.g., the degrees of freedom in the case of the Student t distribution). The multivariate tests are based on properly standardized multivariate residuals to ensure invariance to MLR coefficients and error covariances. We consider tests for serial correlation, tests for multivariate GARCH and sign-type tests against general dependencies and asymmetries. The procedures proposed provide exact versions of those applied in Shanken (1990), which consist in combining univariate specification tests. Specifically, we combine tests across equations using the MC test procedure to avoid Bonferroni-type bounds. Since non-Gaussian based tests are not pivotal, we apply the “maximized MC” (MMC) test method [Dufour (2002)], where the MC p-value for the tested hypothesis (which depends on nuisance parameters) is maximized (with respect to these nuisance parameters) to control the test’s significance level. The tests proposed are applied to an asset pricing model with observable risk-free rates, using monthly returns on New York Stock Exchange (NYSE) portfolios over five-year subperiods from 1926 to 1995. Our empirical results reveal the following. Whereas univariate exact tests indicate significant serial correlation, asymmetries and GARCH in some equations, such effects are much less prevalent once error cross-equation covariances are accounted for. In addition, significant departures from the i.i.d. hypothesis are less evident once we allow for non-Gaussian errors.

Relevance:

100.00%

Publisher:

Abstract:

We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses, for which test procedures are commonly proposed, are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions —a condition not satisfied by standard Wald-type methods based on standard errors — and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.