843 results for Residual-Based Cointegration Test
Abstract:
Coffee seeds germinate slowly and irregularly and rapidly lose viability during storage, and the standard germination test for these seeds requires at least 30 days. Moreover, its results may not reflect the actual physiological quality of the seeds. The objective of this work was to develop a fast and practical test for evaluating the viability of coffee seeds, based on the interpretation of different color hues of seed exudates. Coffee seeds of the cultivar Catuai 44 from six lots were submitted to germination, accelerated aging, and electrical conductivity tests. In the exudate color hue test, coffee seeds without the parchment and the silvery pellicle (four replications of 10 seeds each) were distributed on moistened paper towels and kept in a germinator at 25 ºC for 24, 48, 72, 96, and 120 h. Three classes of color hue were established: colorless, light hue, and dark hue, with values of 0, 1, and 3 assigned to each class, respectively. The proposed exudate color hue test can be recommended for the fast assessment of coffee seed viability. The most promising results were obtained for seeds with 12% moisture content after imbibition periods of 72, 96, and 120 h, and with 30% moisture content after imbibition periods of 72 and 120 h.
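The 0/1/3 scoring scheme described above can be sketched in a few lines. This is only an illustration of the arithmetic; the class labels and the replication data below are invented, not from the study.

```python
# Hypothetical sketch of the 0/1/3 exudate scoring: each seed is classified
# as colorless (0), light hue (1), or dark hue (3), and a replication's
# score is the sum over its seeds. Data here are invented for illustration.

SCORES = {"colorless": 0, "light": 1, "dark": 3}

def replication_score(classified_seeds):
    """Sum the hue scores for one replication of classified seeds."""
    return sum(SCORES[hue] for hue in classified_seeds)

replication = ["colorless"] * 6 + ["light"] * 3 + ["dark"] * 1
print(replication_score(replication))  # 6*0 + 3*1 + 1*3 = 6
```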
Abstract:
Since different stock markets have become more integrated during the 2000s, investors need new asset classes in order to gain diversification benefits. Commodities have become popular investments, so it is important to examine whether investors should use commodities as part of portfolio diversification. This master's thesis examines the dynamic relationship between the Finnish stock market and commodities. The methodology is based on Vector Autoregressive (VAR) models. The long-run relationship between the Finnish stock market and commodities is examined with the Johansen cointegration test, while the short-run relationship is examined with VAR models and the Granger causality test. In addition, impulse response analysis and forecast error variance decomposition are employed to strengthen the short-run results. Because dynamic relationships may change under different market conditions, the sample period is divided into two sub-samples in order to reveal whether the relationships vary across conditions. The results show that the Finnish stock market has a stable long-run relationship with industrial metals, indicating that there would be no diversification benefits among the industrial metals. The long-run relationship between the Finnish stock market and energy commodities is less stable: a long-run relationship was found in the full sample period and the first sub-sample, indicating less room for diversification, but it disappeared in the second sub-sample, which indicates diversification benefits. No long-run relationship between the Finnish stock market and agricultural commodities was found in the full sample period, which indicates diversification benefits between the variables; however, a long-run relationship was found in both sub-samples. The best diversification benefits would be achieved by investing in precious metals.
No long-run relationship was found in either sample. In the full sample period, OMX Helsinki had a short-run relationship with most of the energy commodities and industrial metals, and the causality ran mostly from equities to commodities. During the first sub-period the number of short-run relationships and causalities shrank, but during the crisis period it increased. The most notable finding was unidirectional causality from gold to OMX Helsinki during the crisis period.
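The Granger-causality machinery this abstract relies on can be illustrated with a minimal sketch (not the thesis code): regress y on its own lag (restricted model), then add a lag of x (unrestricted model), and test the improvement with an F statistic. The data-generating process below is an assumption chosen so that x genuinely drives y.

```python
import numpy as np

# Minimal Granger-causality sketch: compare a restricted AR model for y
# with an unrestricted model that adds lagged x, via an F-test on the
# residual sums of squares. Simulated data; x truly causes y here.

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.4 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

def rss(X, z):
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    return float(resid @ resid)

ones = np.ones(n - 1)
X_r = np.column_stack([ones, y[:-1]])           # restricted: lag of y only
X_u = np.column_stack([ones, y[:-1], x[:-1]])   # unrestricted: + lag of x

rss_r, rss_u = rss(X_r, y[1:]), rss(X_u, y[1:])
q, k = 1, 3                                     # restrictions; params in full model
F = ((rss_r - rss_u) / q) / (rss_u / (n - 1 - k))
print(F > 10.0)  # x should strongly "Granger-cause" y in this setup
```

A large F rejects the null that lagged x adds no predictive content, i.e., x Granger-causes y.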
Abstract:
This document studies the privacy perception and personality traits of users in the context of smartphone application privacy. It is divided into two parts. The first part presents an in-depth systematic literature review of the existing academic writing on the relation between privacy perception and personality traits. Demographics, methodologies and other useful insights are extracted, and the available literature is divided into broader groups of topics, bringing the five main areas of research to light, highlighting current research trends in the field, and pinpointing the research gap of interest to the author. The second part of the thesis uses the results of the literature review to administer an empirical study of users' current privacy perception and the correlation between personality traits and privacy perception in smartphone applications. The Big Five personality test is used as the measure of personality traits, whereas three sub-variables are used to measure privacy perception: perceived privacy awareness, perceived threat to privacy, and willingness to trade privacy. According to the study, openness to experience is the most dominant trait, showing a strong correlation with two privacy sub-variables, whereas emotional stability shows no correlation with privacy perception. The empirical study also reports other findings, such as preferred privacy sources and application installation preferences, that provide further insight about users and might be useful in the future.
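The correlations this study reports between trait scores and privacy sub-variables are typically Pearson correlations. A minimal sketch, using invented scale scores (the study's actual data and instruments are not reproduced here):

```python
import math

# Illustrative Pearson correlation between a trait score (e.g., openness)
# and a privacy sub-variable (e.g., perceived privacy awareness).
# All numbers below are made up for demonstration.

def pearson_r(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    sa = math.sqrt(sum((ai - ma) ** 2 for ai in a))
    sb = math.sqrt(sum((bi - mb) ** 2 for bi in b))
    return cov / (sa * sb)

openness = [3.2, 4.1, 2.8, 4.5, 3.9, 2.5]
awareness = [2.9, 4.3, 2.6, 4.8, 3.5, 2.2]
r = pearson_r(openness, awareness)
print(round(r, 2))  # close to +1: traits and awareness move together here
```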
Abstract:
In this work an agent-based model (ABM) was proposed, using the main idea of the Jabłonska-Capasso-Morale (JCM) model together with a maximized-greediness concept. Using a multi-agent simulator, the power of the ABM was assessed against the historical prices of silver from 01.03.2000 to 01.03.2013. The model results, analysed in two situations, with and without maximized greediness, show that the ABM is capable of explaining silver price dynamics even in extreme events. The ABM without maximized greediness explained the prices with more irrationality, whereas the ABM with maximized greediness tracked the price movements with more rational decisions. In the comparison test, the model without maximized greediness proved best at capturing the silver market dynamics. The proposed ABM therefore supports the suggested explanations of financial crises and market failures: an economic or financial collapse may be driven by both irrational and rational decisions, yet irrationality may dominate the market.
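A toy sketch of the agent-based idea (this is an assumption-laden illustration, not the JCM-based model from the thesis): agents buy or sell according to recent price momentum scaled by a "greediness" parameter, and aggregate excess demand moves the price.

```python
import random

# Toy agent-based price model: trend-following agents with a greediness
# parameter; excess demand nudges the price each step. Everything here
# (agent rule, scaling constants) is an illustrative assumption.

random.seed(42)

def simulate(n_agents=100, steps=200, greediness=1.0):
    price = 1.0
    prices = [price]
    for _ in range(steps):
        momentum = prices[-1] - prices[-2] if len(prices) > 1 else 0.0
        demand = 0
        for _ in range(n_agents):
            # each agent follows the trend, perturbed by idiosyncratic noise
            signal = greediness * momentum + random.gauss(0, 0.1)
            demand += 1 if signal > 0 else -1
        price *= 1 + 0.001 * demand / n_agents   # excess demand moves price
        prices.append(price)
    return prices

path = simulate()
print(len(path), all(p > 0 for p in path))
```

Varying `greediness` changes how strongly agents chase trends, which is the kind of knob the thesis turns on and off when comparing the two model variants.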
Abstract:
Today, user experience and usability are becoming major design issues in software applications as many processes are adapted to new technologies. The study of user experience and usability should therefore be included in every software development project, and both should be tested to obtain traceable results. Given the variety of testing methods available, a non-expert may have doubts about which option to choose and how to interpret the outcomes of the process. This work aims to create a process, based on the one proposed by Seffah et al., that eases the whole testing methodology, together with a supporting software tool for following the procedure of these testing methods for user experience and usability.
Abstract:
Although alcohol problems and alcohol consumption are related, consumption does not fully account for differences in vulnerability to alcohol problems. Therefore, other factors should account for these differences. Based on previous research, it was hypothesized that risky drinking behaviours, illicit and prescription drug use, affect, and sex differences would account for differences in vulnerability to alcohol problems while statistically controlling for overall alcohol consumption. Four models were developed to test the predictive ability of these factors: three tested the predictor sets separately and a fourth tested them in a combined model. In addition, two distinct criterion variables were regressed on the predictors. One was a measure of the frequency with which participants experienced negative consequences that they attributed to their drinking, and the other was a measure of the extent to which participants perceived themselves to be problem drinkers. Each of the models was tested on four samples from different populations: first-year university students, university students in their graduating year, a clinical sample of people in treatment for addiction, and a community sample of young adults randomly selected from the general population. Overall, support was found for each of the models and each of the predictors in accounting for differences in vulnerability to alcohol problems. In particular, the frequency with which people become intoxicated, frequency of illicit drug use, and high levels of negative affect were strong and consistent predictors of vulnerability to alcohol problems across samples and criterion variables. With the exception of the clinical sample, the combined models predicted vulnerability to negative consequences better than vulnerability to problem drinker status. Among the clinical and community samples the combined model predicted problem drinker status better than in the student samples.
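The logic of "predicting problems while controlling for consumption" can be sketched as a hierarchical regression: fit a baseline model with consumption alone, then add the hypothesized predictors and examine the increment in R². The sketch below uses simulated data and invented variable names, not the study's samples.

```python
import numpy as np

# Hierarchical regression sketch: enter overall consumption first, then add
# further predictors (intoxication frequency, negative affect) and compare
# R^2. All data are simulated for illustration.

rng = np.random.default_rng(1)
n = 300
consumption = rng.normal(size=n)
intox_freq = rng.normal(size=n)
neg_affect = rng.normal(size=n)
problems = (0.3 * consumption + 0.5 * intox_freq
            + 0.4 * neg_affect + rng.normal(size=n))

def r_squared(X, y):
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_base = r_squared(consumption.reshape(-1, 1), problems)
r2_full = r_squared(np.column_stack([consumption, intox_freq, neg_affect]),
                    problems)
print(r2_full > r2_base)  # added predictors explain variance beyond consumption
```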
Abstract:
This study assessed the effectiveness of a reciprocal teaching program as a method of teaching reading comprehension, using narrative text material in a typical grade seven classroom. In order to determine the effectiveness of the reciprocal teaching program, this method was compared to two other reading instruction approaches that, unlike reciprocal teaching, did not include social interaction components. Two intact grade seven classes, and a grade seven teacher, participated in this study. Students were assigned to three treatment groups by reading achievement level as determined from a norm-referenced test. Training proceeded for a five-week intervention period during regularly scheduled English periods. Throughout the program curriculum-based tests were administered. These tests were designed to assess comprehension in two distinct ways: character analysis components as they relate to narrative text, and strategy use components as they contribute to student understanding of narrative and expository text. Pre-, post-, and maintenance tests were administered to measure overall training effects. Moreover, during intervention, training probes were administered in the last period of each week to evaluate treatment group performance. All curriculum-based tests were coded and comparisons of pre-, post-, and maintenance tests and training probes were presented in graph form. Results showed that the reciprocal group achieved some improvement in reading comprehension scores in the strategy use component of the tests. No improvements were observed for the character analysis components of the curriculum-based tests and the norm-referenced tests. At pre- and post-intervention, interviews requiring students to respond to questions that addressed metacomprehension awareness of study strategies were administered. The interviews were coded and comparisons were made between the two interviews.
No significant improvements were observed regarding student awareness of ten identified study strategies. This study indicated that reciprocal teaching is a viable approach that can be utilized to help students acquire more effective comprehension strategies. However, the maximum utility of the technique when administered to a population of grade seven students performing at average to above-average levels of reading achievement has yet to be determined. In order to explore this issue, the refinement of training materials and curriculum-based measurements needs to be explored. As well, this study revealed that reciprocal teaching placed heavier demands on the classroom teacher when compared to other reading instruction methods. This may suggest that innovative and intensive teacher training techniques are required before it is feasible to use this method in the classroom.
Abstract:
A class of twenty-two grade one children was tested to determine their reading levels using the Stanford Diagnostic Reading Achievement Test. Based on these results and teacher input, the students were paired according to reading ability. The students' ages ranged from six years four months to seven years four months at the commencement of the study. Eleven children were assigned to the language experience group and their partners became the text group. Each member of the language experience group generated a list of eight to-be-learned words. The treatment consisted of exposing the student to a given word three times per session for ten sessions, over a period of five days. The dependent variables consisted of word identification speed, word identification accuracy, and word recognition accuracy. Each member of the text group followed the same procedure using his/her partner's list of words. Upon completion of this training, the entire process was repeated with members of the text group from the first part becoming members of the language experience group and vice versa. The results suggest that, generally speaking, language experience words are identified faster than text words but that there is no difference in the rate at which these words are learned. Language experience words may be identified faster because the auditory-semantic information is more readily available in them than in text words. The rate of learning for both types of words, however, may be dictated by the orthography of the to-be-learned word.
Abstract:
Obsessive Compulsive Disorder (OCD) involves excessive worry coupled with engaging in rituals that are believed to help alleviate the worry. Pervasive Developmental Disorders (PDDs) are characterized by impairments in social interaction, communication, and the presence of repetitive and/or restrictive behaviours (American Psychiatric Association, 2000). Research suggests that as many as 81% of children with a PDD also meet criteria for a diagnosis of OCD. Currently, only a handful of studies have investigated the use of Cognitive Behavioural Therapy (CBT) in treating OCD in children with autism (Reaven & Hepburn, 2003; Sze & Wood, 2007; Lehmkuhl, Storch, Bodfish & Geffken, 2008). In these case studies, the use of a multi-modal CBT treatment package was successful in alleviating OCD behaviours. The current study used function-based CBT with parent involvement and behavioural supplements to treat 2 children with PDD and OCD. Using a multiple baseline design across behaviours and participants, parents reported that their child's anxiety was alleviated and these gains were maintained at 6-month follow-up. According to results of the Children's Yale-Brown Obsessive Compulsive Scale (Goodman, Price, Rasmussen, Riddle, & Rapoport, 1986) from pre- to post-test, OCD behaviours of the children decreased from the severe to the mild range. In addition, the parents rated the family's level of interference related to their child's OCD as substantially lower. Last, the CBT treatment received high ratings of consumer satisfaction.
Abstract:
The present set of experiments was designed to investigate the organization and refinement of young children's face space. Past research has demonstrated that adults encode individual faces in reference to a distinct face prototype that represents the average of all faces ever encountered. The prototype is not a static abstracted norm but rather a malleable face average that is continuously updated by experience (Valentine, 1991); for example, following prolonged viewing of faces with compressed features (a technique referred to as adaptation), adults rate similarly distorted faces as more normal and more attractive (simple attractiveness aftereffects). Recent studies have shown that adults possess category-specific face prototypes (e.g., based on race, sex). After viewing faces from two categories (e.g., Caucasian/Chinese) that are distorted in opposite directions, adults' attractiveness ratings simultaneously shift in opposite directions (opposing aftereffects). The current series of studies used a child-friendly method to examine whether, like adults, 5- and 8-year-old children show evidence for category-contingent opposing aftereffects. Participants were shown a computerized storybook in which Caucasian and Chinese children's faces were distorted in opposite directions (expanded and compressed). Both before and after adaptation (i.e., reading the storybook), participants judged the normality/attractiveness of a small number of expanded, compressed, and undistorted Caucasian and Chinese faces. The method was first validated by testing adults (Experiment 1) and was then refined in order to test 8-year-old (Experiment 2) and 5-year-old (Experiment 4a) children. Five-year-olds (our youngest age group) were also tested in a simple aftereffects paradigm (Experiment 3) and with male and female faces distorted in opposite directions (Experiment 4b).
The current research is the first to demonstrate evidence for simple attractiveness aftereffects in children as young as 5, thereby indicating that, similar to adults, 5-year-olds utilize norm-based coding. Furthermore, this research provides evidence for race-contingent opposing aftereffects in both 5- and 8-year-olds; however, the opposing aftereffects demonstrated by 5-year-olds were driven largely by simple aftereffects for Caucasian faces. The lack of simple aftereffects for Chinese faces in 5-year-olds may reflect young children's limited experience with other-race faces and suggests that children's face space undergoes a period of increasing differentiation over time with respect to race. Lastly, we found no evidence for sex-contingent opposing aftereffects in 5-year-olds, which suggests that young children do not rely on a fully adult-like face space even for highly salient face categories (i.e., male/female) with which they have comparable levels of experience.
Abstract:
This thesis introduces the Salmon Algorithm, a search meta-heuristic which can be used for a variety of combinatorial optimization problems. This algorithm is loosely based on the path-finding behaviour of salmon swimming upstream to spawn. There are a number of tunable parameters in the algorithm, so experiments were conducted to find the optimum parameter settings for different search spaces. The algorithm was tested on one instance of the Traveling Salesman Problem and found to have superior performance to an Ant Colony Algorithm and a Genetic Algorithm. It was then tested on three coding theory problems: optimal edit codes, optimal Hamming distance codes, and optimal covering codes. The algorithm produced improvements on the best known values for five of the six edit code test cases. It matched the best known results on four of the seven Hamming codes as well as all three covering codes. The results suggest the Salmon Algorithm is competitive with established guided random search techniques, and may be superior in some search spaces.
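The abstract does not specify the Salmon Algorithm's update rules, so the sketch below shows a generic guided random search from the same family on the same problem class: random-restart 2-opt local search on a tiny, made-up TSP instance. It illustrates the kind of baseline such a meta-heuristic competes against, not the authors' method.

```python
import itertools
import math
import random

# Generic guided-random-search sketch for a toy TSP (NOT the Salmon
# Algorithm): repeatedly start from a random tour and apply 2-opt segment
# reversals until no reversal shortens the tour.

random.seed(0)
cities = [(0, 0), (0, 1), (1, 1), (1, 0), (0.5, 2)]  # invented instance

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour):
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(tour)), 2):
            new = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
            if tour_length(new) < tour_length(tour):
                tour, improved = new, True
    return tour

best = min((two_opt(random.sample(range(5), 5)) for _ in range(10)),
           key=tour_length)
print(round(tour_length(best), 3))
```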
Abstract:
This study investigated improvements in parent knowledge of effective intervention strategies following participation in a group function-based CBT treatment (GFbCBT) package for children with comorbid OCD and ASD. Nineteen parents of children ages 7-12 years with High Functioning Autism (HFA) participated in the 9-week treatment program. Key components of treatment included psychoeducation and mapping, cognitive-behavioural skills training, function-based interventions, and exposure and response prevention (ERP). Treatment sessions also included direct parent education, which followed a behavioural skills training model (Miltenberger, 2008). Parent knowledge (N = 19) was measured pre- and post-treatment using a vignette about a child demonstrating obsessive-compulsive behaviour. Results of a one-tailed paired t-test indicated statistically significant changes (p = .036) in overall parent knowledge following participation in treatment. Statistically significant changes were also found in parents' ability to generate ERP and function-based intervention strategies. These results provide preliminary evidence that parents benefit from active involvement in the GFbCBT treatment package.
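The one-tailed paired t-test used here compares each parent's pre- and post-treatment scores on the same measure. A minimal sketch with invented scores (the study's actual data are not reproduced):

```python
import math

# Paired t statistic for pre/post scores on the same participants.
# Scores below are invented for illustration; compare t to the one-tailed
# critical value for df = n - 1 (about 1.895 at alpha = .05 for df = 7).

def paired_t(pre, post):
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)                     # df = n - 1

pre = [4, 5, 3, 6, 4, 5, 2, 4]
post = [6, 7, 4, 7, 6, 5, 4, 6]
t = paired_t(pre, post)
print(t > 1.895)  # knowledge increased significantly in this toy example
```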
Abstract:
Feature selection plays an important role in knowledge discovery and data mining. In traditional rough set theory, feature selection using a reduct - the minimal discerning set of attributes - is an important area. Nevertheless, the original definition of a reduct is restrictive, so previous research proposed taking into account not only the horizontal reduction of information by feature selection, but also a vertical reduction considering suitable subsets of the original set of objects. Following the work mentioned above, a new approach to generating bireducts using a multi-objective genetic algorithm was proposed. Although genetic algorithms have been used to calculate reducts in previous works, we did not find any work in which genetic algorithms were adopted to calculate bireducts. Compared to earlier work in this area, the proposed method involves less randomness in generating bireducts. The genetic algorithm estimated the quality of each bireduct by the values of two objective functions as evolution progressed, so a set of bireducts with optimized values of these objectives was obtained. Different fitness evaluation methods and genetic operators, such as crossover and mutation, were applied and the resulting prediction accuracies were compared. Five datasets were used to test the proposed method and two datasets were used for a comparison study. Statistical analysis using the one-way ANOVA test was performed to determine whether the differences between the results were significant. The experiments showed that the proposed method was able to reduce the number of bireducts necessary to achieve good prediction accuracy. The influence of different genetic operators and fitness evaluation strategies on the prediction accuracy was also analyzed. The prediction accuracies of the proposed method are comparable with the best results in the machine learning literature, and some of them outperform those results.
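The one-way ANOVA used to compare configurations reduces to an F statistic: between-group variance over within-group variance. A minimal sketch with invented accuracy values (the study's actual results are not reproduced):

```python
# One-way ANOVA F statistic across groups of prediction accuracies.
# Group values below are invented; group labels are hypothetical examples
# of configurations one might compare (e.g., different genetic operators).

def anova_f(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

accuracies = [
    [0.81, 0.84, 0.79, 0.82],   # configuration A
    [0.74, 0.71, 0.76, 0.73],   # configuration B
    [0.88, 0.86, 0.90, 0.87],   # configuration C
]
F = anova_f(accuracies)
print(F > 4.26)  # exceeds the F(2, 9) critical value at alpha = .05
```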
Abstract:
In the context of multivariate linear regression (MLR) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose a general method for constructing exact tests of possibly nonlinear hypotheses on the coefficients of MLR systems. For the case of uniform linear hypotheses, we present exact distributional invariance results concerning several standard test criteria. These include Wilks' likelihood ratio (LR) criterion as well as trace and maximum root criteria. The normality assumption is not necessary for most of the results to hold. Implications for inference are two-fold. First, invariance to nuisance parameters entails that the technique of Monte Carlo tests can be applied on all these statistics to obtain exact tests of uniform linear hypotheses. Second, the invariance property of the latter statistic is exploited to derive general nuisance-parameter-free bounds on the distribution of the LR statistic for arbitrary hypotheses. Even though it may be difficult to compute these bounds analytically, they can easily be simulated, hence yielding exact bounds Monte Carlo tests. Illustrative simulation experiments show that the bounds are sufficiently tight to provide conclusive results with a high probability. Our findings illustrate the value of the bounds as a tool to be used in conjunction with more traditional simulation-based test methods (e.g., the parametric bootstrap) which may be applied when the bounds are not conclusive.
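The Monte Carlo test technique central to this paper can be illustrated generically: when a statistic is pivotal under the null (its distribution is free of nuisance parameters), simulating it N times and ranking the observed value yields an exact p-value. The statistic and model below are illustrative assumptions, not the paper's MLR criteria.

```python
import random

# Generic Monte Carlo test sketch: for a pivotal statistic, the p-value
# (rank of the observed statistic among simulated null replicates) is exact
# for finite N. Statistic and null model here are illustrative only.

random.seed(7)

def statistic(sample):
    n = len(sample)
    mean = sum(sample) / n
    return abs(mean) * n ** 0.5          # |sqrt(n) * sample mean|

def mc_pvalue(observed, n, n_rep=199):
    s0 = statistic(observed)
    # simulate the statistic under H0: i.i.d. standard normal observations
    exceed = sum(
        statistic([random.gauss(0, 1) for _ in range(n)]) >= s0
        for _ in range(n_rep)
    )
    return (exceed + 1) / (n_rep + 1)    # exact by exchangeability under H0

data = [random.gauss(0, 1) for _ in range(30)]   # generated under the null
p = mc_pvalue(data, 30)
print(0 < p <= 1)
```

With N = 199 replicates, rejecting when p <= .05 gives a test with exactly 5% size, which is the point exploited throughout the paper.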
Abstract:
A wide range of tests for heteroskedasticity have been proposed in the econometric and statistics literature. Although a few exact homoskedasticity tests are available, the commonly employed procedures are quite generally based on asymptotic approximations which may not provide good size control in finite samples. There have been a number of recent studies that seek to improve the reliability of common heteroskedasticity tests using Edgeworth, Bartlett, jackknife and bootstrap methods. Yet the latter remain approximate. In this paper, we describe a solution to the problem of controlling the size of homoskedasticity tests in linear regression contexts. We study procedures based on the standard test statistics [e.g., the Goldfeld-Quandt, Glejser, Bartlett, Cochran, Hartley, Breusch-Pagan-Godfrey, White and Szroeter criteria] as well as tests for autoregressive conditional heteroskedasticity (ARCH-type models). We also suggest several extensions of the existing procedures (sup-type and combined test statistics) to allow for unknown breakpoints in the error variance. We exploit the technique of Monte Carlo tests to obtain provably exact p-values, for both the standard and the new tests suggested. We show that the MC test procedure conveniently solves the intractable null distribution problem, in particular the problems raised by the sup-type and combined test statistics as well as (when relevant) unidentified nuisance parameter problems under the null hypothesis. The method proposed works in exactly the same way with both Gaussian and non-Gaussian disturbance distributions [such as heavy-tailed or stable distributions]. The performance of the procedures is examined by simulation.
The Monte Carlo experiments conducted focus on: (1) ARCH, GARCH, and ARCH-in-mean alternatives; (2) the case where the variance increases monotonically with: (i) one exogenous variable, and (ii) the mean of the dependent variable; (3) grouped heteroskedasticity; (4) breaks in variance at unknown points. We find that the proposed tests achieve perfect size control and have good power.
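One of the standard criteria named above, the Goldfeld-Quandt statistic, can be sketched in its simplest form: under grouped heteroskedasticity, the ratio of residual variances from two subsamples departs from 1. The data-generating process below is an illustrative assumption (a pure variance break in otherwise mean-zero errors), not one of the paper's designs.

```python
import random

# Goldfeld-Quandt-style sketch: ratio of sample variances from two halves
# of the sample. Under homoskedasticity the ratio follows an F distribution;
# here we induce a variance break so the ratio departs from 1.

random.seed(3)
n = 200
# error standard deviation doubles in the second half of the sample
errors = ([random.gauss(0, 1) for _ in range(n // 2)]
          + [random.gauss(0, 2) for _ in range(n // 2)])

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

gq = var(errors[n // 2:]) / var(errors[: n // 2])  # F(99, 99) under H0
print(gq > 1.5)  # the induced break pushes the ratio well above 1
```

In the paper's setting, the point is that the exact null distribution of such statistics (and of the harder sup-type variants) can be handled by the Monte Carlo test technique rather than asymptotic critical values.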