804 results for Null Hypothesis


Relevance: 60.00%

Abstract:

Bootstrap likelihood ratio tests of cointegration rank are commonly used because they tend to have rejection probabilities that are closer to the nominal level than the rejection probabilities of the corresponding asymptotic tests. The effect of bootstrapping the test on its power is largely unknown. We show that a new computationally inexpensive procedure can be applied to the estimation of the power function of the bootstrap test of cointegration rank. The bootstrap test is found to have a power function close to that of the level-adjusted asymptotic test. The bootstrap test estimates the level-adjusted power of the asymptotic test highly accurately. The bootstrap test may have low power to reject the null hypothesis of cointegration rank zero, or underestimate the cointegration rank. An empirical application to Euribor interest rates is provided as an illustration of the findings.
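The general bootstrap-testing principle behind such tests (impose the null, resample, and compare the observed statistic to its bootstrap distribution) can be illustrated with a much simpler statistic. The sketch below tests a population mean rather than cointegration rank; the function and its recentring step are illustrative assumptions, not the paper's procedure:

```python
import random
from statistics import mean

def bootstrap_p_value(sample, null_value, n_boot=2000, seed=0):
    """Two-sided bootstrap test of H0: population mean == null_value.

    A minimal sketch of the bootstrap-testing idea only; the
    cointegration-rank likelihood ratio test in the abstract is far
    more involved.
    """
    rng = random.Random(seed)
    observed = mean(sample)
    # Impose the null hypothesis by recentring the sample at null_value.
    shift = null_value - observed
    shifted = [x + shift for x in sample]
    extreme = 0
    for _ in range(n_boot):
        resample = rng.choices(shifted, k=len(shifted))
        # Count resamples at least as extreme as the observed statistic.
        if abs(mean(resample) - null_value) >= abs(observed - null_value):
            extreme += 1
    return extreme / n_boot
```

The returned fraction is the bootstrap p-value; a small value leads to rejection of the null, mirroring the level-adjusted comparison discussed in the abstract.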

Relevance: 60.00%

Abstract:

Reorganizing a dataset so that its hidden structure can be observed is useful in any data analysis task. For example, detecting a regularity in a dataset helps us to interpret the data, compress the data, and explain the processes behind the data. We study datasets that come in the form of binary matrices (tables with 0s and 1s). Our goal is to develop automatic methods that bring out certain patterns by permuting the rows and columns. We concentrate on the following patterns in binary matrices: consecutive-ones (C1P), simultaneous consecutive-ones (SC1P), nestedness, k-nestedness, and bandedness. These patterns reflect specific types of interplay and variation between the rows and columns, such as continuity and hierarchies. Furthermore, their combinatorial properties are interlinked, which helps us to develop the theory of binary matrices and efficient algorithms. Indeed, we can detect all these patterns in a binary matrix efficiently, that is, in polynomial time in the size of the matrix. Since real-world datasets often contain noise and errors, we rarely witness perfect patterns. Therefore we also need to assess how far an input matrix is from a pattern: we count the number of flips (from 0s to 1s or vice versa) needed to bring out the perfect pattern in the matrix. Unfortunately, for most patterns it is an NP-complete problem to find the minimum distance to a matrix that has the perfect pattern, which means that the existence of a polynomial-time algorithm is unlikely. To find patterns in datasets with noise, we need methods that are noise-tolerant and work in practical time with large datasets. The theory of binary matrices gives rise to robust heuristics that have good performance with synthetic data and discover easily interpretable structures in real-world datasets: dialectical variation in the spoken Finnish language, division of European locations by the hierarchies found in mammal occurrences, and co-occurring groups in network data.
In addition to determining the distance from a dataset to a pattern, we need to determine whether the pattern is significant or merely the product of random chance. To this end, we use significance testing: we deem a dataset significant if it appears exceptional when compared to datasets generated from a certain null hypothesis. After detecting a significant pattern in a dataset, it is up to domain experts to interpret the results in terms of the application.
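The significance-testing step described above can be sketched as an empirical p-value computation. In this minimal sketch the null model is a Bernoulli matrix matched only in density (published null models often also fix row and column sums), and the pattern statistic is an illustrative per-row consecutive-ones count, not one of the thesis's exact measures:

```python
import random

def c1_rows(matrix):
    """Number of rows whose 1s form a single consecutive run
    (a per-row relaxation of the consecutive-ones property)."""
    count = 0
    for row in matrix:
        bits = "".join(str(v) for v in row).strip("0")
        if "0" not in bits:
            count += 1
    return count

def significance_test(matrix, statistic, n_null=1000, seed=0):
    """Empirical p-value of a pattern statistic against random binary
    matrices of the same density (a simplified Bernoulli null model)."""
    rng = random.Random(seed)
    rows, cols = len(matrix), len(matrix[0])
    density = sum(map(sum, matrix)) / (rows * cols)
    observed = statistic(matrix)
    hits = 0
    for _ in range(n_null):
        null = [[1 if rng.random() < density else 0 for _ in range(cols)]
                for _ in range(rows)]
        # Count null matrices whose statistic is at least as extreme.
        if statistic(null) >= observed:
            hits += 1
    return hits / n_null
```

A banded matrix, where every row's 1s are consecutive, gets a very small p-value under this null, i.e. the pattern is deemed significant.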

Relevance: 60.00%

Abstract:

We consider nonparametric or universal sequential hypothesis testing when the distribution under the null hypothesis is fully known but the alternate hypothesis corresponds to some other unknown distribution. These algorithms are primarily motivated by spectrum sensing in Cognitive Radios and intruder detection in wireless sensor networks. We use easily implementable universal lossless source codes to propose simple algorithms for such a setup. The algorithms are first proposed for discrete alphabets. Their performance and asymptotic properties are studied theoretically. Later these are extended to continuous alphabets. Their performance with two well-known universal source codes, the Lempel-Ziv code and the Krichevsky-Trofimov (KT) estimator with arithmetic encoding, is compared. These algorithms are also compared with tests using various other nonparametric estimators. Finally, a decentralized version utilizing spatial diversity is also proposed and analysed.
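The core idea, replacing the unknown alternative with a universal code, can be sketched with Python's stdlib `zlib` (DEFLATE, a Lempel-Ziv variant) standing in for the universal source code. The byte-symbol alphabet and the threshold are assumptions made for illustration, not the paper's exact statistic:

```python
import math
import zlib

def universal_test(symbols, p0, threshold_bits=32.0):
    """Universal test sketch: H0 is a fully known distribution p0 over
    byte symbols; H1 is 'anything else'.  Compare the ideal codelength
    of the sample under p0 with the length a universal compressor
    achieves.  If the compressor beats the null codelength by more than
    the threshold, the sample is too regular for p0: declare H1."""
    # Ideal (Shannon) codelength of the sample under the null, in bits.
    null_bits = -sum(math.log2(p0[s]) for s in symbols)
    # Codelength achieved by the universal code (zlib output, in bits).
    universal_bits = 8 * len(zlib.compress(bytes(symbols), 9))
    return "H1" if null_bits - universal_bits > threshold_bits else "H0"
```

A fair-coin sample cannot be compressed below its null codelength, so the test accepts H0; a constant sequence compresses far below it, so the test declares H1.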

Relevance: 60.00%

Abstract:

We consider nonparametric sequential hypothesis testing when the distribution under the null hypothesis is fully known and the alternate hypothesis corresponds to some other unknown distribution. We use easily implementable universal lossless source codes to propose simple algorithms for such a setup. These algorithms are motivated by the spectrum sensing application in Cognitive Radios. Universal sequential hypothesis testing using Lempel-Ziv codes and the Krichevsky-Trofimov estimator with arithmetic encoding is considered and compared for different distributions. Cooperative spectrum sensing with multiple Cognitive Radios using universal codes is also considered.

Relevance: 60.00%

Abstract:

Visual search in real life involves complex displays with a target among multiple types of distracters, but in the laboratory, it is often tested using simple displays with identical distracters. Can complex search be understood in terms of simple searches? This link may not be straightforward if complex search has emergent properties. One such property is linear separability, whereby search is hard when a target cannot be separated from its distracters using a single linear boundary. However, evidence in favor of linear separability is based on testing stimulus configurations in an external parametric space that need not be related to their true perceptual representation. We therefore set out to assess whether linear separability influences complex search at all. Our null hypothesis was that complex search performance depends only on classical factors such as target-distracter similarity and distracter homogeneity, which we measured using simple searches. Across three experiments involving a variety of artificial and natural objects, differences between linearly separable and nonseparable searches were explained using target-distracter similarity and distracter heterogeneity. Further, simple searches accurately predicted complex search regardless of linear separability (r = 0.91). Our results show that complex search is explained by simple search, refuting the widely held belief that linear separability influences visual search.
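The linear-separability notion at the centre of this abstract, whether a single linear boundary can place the target on one side and every distracter on the other, can be checked with a classic perceptron sweep over 2-D feature vectors. This is a sketch of the concept only, not the authors' experimental procedure:

```python
def linearly_separable(target, distracters, epochs=200):
    """Perceptron check: returns True if a single linear boundary
    separating the target (+1) from the distracters (-1) is found
    within the epoch budget, False otherwise.  For separable inputs
    the perceptron is guaranteed to converge; for non-separable ones
    the budget is exhausted."""
    labelled = [(target[0], target[1], +1)]
    labelled += [(x, y, -1) for x, y in distracters]
    w0 = w1 = b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for x, y, label in labelled:
            if label * (w0 * x + w1 * y + b) <= 0:
                # Misclassified: nudge the boundary toward this point.
                w0 += label * x
                w1 += label * y
                b += label
                mistakes += 1
        if mistakes == 0:
            return True
    return False
```

A target lying at the centre of its distracters is the canonical non-separable ("hard") configuration in this literature; an outlying target is separable.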

Relevance: 60.00%

Abstract:

We consider the nonparametric sequential hypothesis testing problem when the distribution under the null hypothesis is fully known but the alternate hypothesis corresponds to a general family of distributions. We propose a simple algorithm to address the problem. Its performance is analysed and asymptotic properties are proved. The simulated and analysed performance of the algorithm is compared with that of an earlier algorithm addressing the same problem under similar assumptions. Finally, we provide a justification for our model motivated by a Cognitive Radio scenario and modify the algorithm to optimize performance when information about the prior probabilities of the two hypotheses is available.
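As background, the fully parametric baseline that such nonparametric sequential schemes generalize is Wald's sequential probability ratio test (SPRT): accumulate the log-likelihood ratio sample by sample and stop at the first boundary crossing. The Bernoulli setting below is an illustrative assumption:

```python
import math

def sprt(stream, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT for a Bernoulli stream, H0: P(X=1)=p0 vs
    H1: P(X=1)=p1, with target error probabilities alpha and beta.
    Returns the decision and the number of samples consumed."""
    upper = math.log((1 - beta) / alpha)   # cross upward -> accept H1
    lower = math.log(beta / (1 - alpha))   # cross downward -> accept H0
    llr = 0.0
    n = 0
    for x in stream:
        n += 1
        # Add this sample's log-likelihood ratio.
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n
```

The appeal of the sequential formulation, shared by the universal tests above, is that clear-cut streams are decided after very few samples.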

Relevance: 60.00%

Abstract:

Correction of skeletal deformities of the face through combined orthodontic-surgical treatment has become a safe and predictable option. Bone movements are calculated to the millimeter and executed surgically, just as the occlusion is meticulously interdigitated through orthodontic movements. The effects of orthognathic surgery on the soft tissues are, however, less predictable, and although the main goal of orthognathic surgery is functional improvement, the esthetic component is undoubtedly of great importance. The alar base region in particular shows highly variable results, despite the good skeletal results achieved. The aim of this study was to compare two different types of suture used in the nasal base region and to determine which type yields a result that best follows the movements performed by the skeletal tissue. Thirty-five patients were randomly distributed into two groups. Group 1 served as the control, and its patients received the intraoral alar cinch suture, the type most often described in the literature. Patients in group 2 received an extraoral alar cinch suture. For statistical analysis, group means and standard deviations were calculated, and the null hypothesis that there was no difference between the two groups was tested with Student's t test. Widening of the nasal base occurred in both groups, but the mean widening in group 1 was 2.50 millimeters (mm), whereas the mean widening in group 2 was 1.26 mm. Moreover, the standard deviation was smaller in group 2, and the null hypothesis was rejected (p<0.05), showing that the difference between the groups was statistically significant. It can be concluded that when more predictable and rigorous control of the nasal base is desired, the extraoral alar cinch suture fulfills this function better.
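The hypothesis test reported above (a null hypothesis of no difference between the two suture groups, assessed with Student's t) can be sketched distribution-free as a permutation test on the group means. The individual measurements below are synthetic; only the reported group means (2.50 mm vs 1.26 mm) come from the abstract:

```python
import random
from statistics import mean

def permutation_test(group1, group2, n_perm=5000, seed=0):
    """Two-sided permutation test of the null hypothesis of no
    difference in group means -- a distribution-free analogue of the
    two-sample Student's t test.  Returns an empirical p-value."""
    rng = random.Random(seed)
    observed = abs(mean(group1) - mean(group2))
    pooled = list(group1) + list(group2)
    n1 = len(group1)
    extreme = 0
    for _ in range(n_perm):
        # Reassign group labels at random and recompute the difference.
        rng.shuffle(pooled)
        if abs(mean(pooled[:n1]) - mean(pooled[n1:])) >= observed:
            extreme += 1
    return extreme / n_perm
```

With synthetic groups centred on the reported means, the null hypothesis is rejected at p<0.05, as in the study.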

Relevance: 60.00%

Abstract:

Fluvial systems form landscapes and sedimentary deposits with a rich hierarchy of structures that extend from grain- to valley scale. Large-scale pattern formation in fluvial systems is commonly attributed to forcing by external factors, including climate change, tectonic uplift, and sea-level change. Yet over geologic timescales, rivers may also develop large-scale erosional and depositional patterns that do not bear on environmental history. This dissertation uses a combination of numerical modeling and topographic analysis to identify and quantify patterns in river valleys that form as a consequence of river meandering alone, under constant external forcing. Chapter 2 identifies a numerical artifact in existing, grid-based models that represent the co-evolution of river channel migration and bank strength over geologic timescales. A new, vector-based technique for bank-material tracking is shown to improve predictions for the evolution of meander belts, floodplains, sedimentary deposits formed by aggrading channels, and bedrock river valleys, particularly when spatial contrasts in bank strength are strong. Chapters 3 and 4 apply this numerical technique to establishing valley topography formed by a vertically incising, meandering river subject to constant external forcing—which should serve as the null hypothesis for valley evolution. In Chapter 3, this scenario is shown to explain a variety of common bedrock river valley types and smaller-scale features within them—including entrenched channels, long-wavelength, arcuate scars in valley walls, and bedrock-cored river terraces. Chapter 4 describes the age and geometric statistics of river terraces formed by meandering with constant external forcing, and compares them to terraces in natural river valleys. 
The frequency of intrinsic terrace formation by meandering is shown to reflect a characteristic relief-generation timescale, and terrace length is identified as a key criterion for distinguishing these terraces from terraces formed by externally forced pulses of vertical incision. In a separate study, Chapter 5 utilizes image and topographic data from the Mars Reconnaissance Orbiter to quantitatively identify spatial structures in the polar layered deposits of Mars, and identifies sequences of beds, consistently 1-2 meters thick, that have accumulated hundreds of kilometers apart in the north polar layered deposits.

Relevance: 60.00%

Abstract:

L-arginine is recognized as a nutrient of fundamental importance in the immune response, although its effects are sometimes considered inconsistent. Splenic autotransplantation has been proposed as an alternative to total splenectomy alone, but there are concerns about how effectively it restores the immune response, since the patient may remain at increased risk of overwhelming post-splenectomy infection even after morphological regeneration of the organ. The aim of this study was to evaluate the effect of dietary supplementation with L-arginine on lymphocyte subpopulations in the blood, spleen, and splenic autotransplants of rats subjected to splenectomy alone or combined with splenic autotransplantation. Forty-two male Sprague-Dawley rats were randomly distributed into six groups: (1) sham-operated control; (2) total splenectomy; (3) total splenectomy combined with splenic autotransplantation; (4) sham-operated control with L-arginine supplementation; (5) total splenectomy with L-arginine supplementation; and (6) total splenectomy combined with splenic autotransplantation, with L-arginine supplementation. The animals in groups 4, 5, and 6 received L-arginine supplementation once a day for the 15 days preceding the blood collection performed immediately before the surgical procedures (weeks 0 and 12). The dose used was 1.0 g/kg/day, administered intragastrically as a bolus. Evaluations were performed by complete blood count and flow cytometry. Statistical analysis used parametric and nonparametric tests, with p<0.05 adopted for rejection of the null hypothesis. L-arginine supplementation increased the relative and absolute counts of peripheral neutrophils 12 weeks after total splenectomy combined with splenic autotransplantation.
Total splenectomy decreased the relative counts of total T, CD4+ T, and CD8β T lymphocytes in the blood, but dietary supplementation with L-arginine prevented the decrease in the percentage of total T and CD8β T cells in the blood of animals subjected to splenic autotransplantation. Both splenic autotransplantation and L-arginine supplementation prevent the decrease of the CD4+ T lymphocyte subpopulation in peripheral blood that usually occurs after total splenectomy. There was greater proliferation of white cells per gram of tissue in the splenic autotransplants of supplemented animals, but supplementation did not influence the counts of T, CD4+ T, and marginal-zone B lymphocytes of the spleen. Supplementation with the amino acid L-arginine after total splenectomy combined with splenic autotransplantation in rats was able to reverse changes observed in some of the lymphocyte subpopulations caused by splenectomy.

Relevance: 60.00%

Abstract:

One of the most studied topics in corporate finance is the identification of factors that influence companies' market value. Another equally common topic is the relationship between owners and agents. The two questions intertwine because, in theory, agents may steer their decisions toward their own benefit, against the owners' interests, which would ultimately have a negative impact on firm value. In Brazil, given the particular ownership and control characteristics of companies, the conflict of interest also encompasses the relations between controlling and minority shareholders. A quick review of the existing literature shows that many variables have been considered. Attempts to treat them in a single model run up against the very high correlations among those variables. This study sought to take a fresh look at the problem, using factor analysis to get around these difficulties. Starting from a sample of 114 companies, it was possible to identify two factors to which the nine variables considered in the study are related. The factors, named Tradability and Corporate Governance, account for more than two-thirds of the variability in the data. The factor scores, split into high and low and combined in pairs, allowed four quadrants to be characterized, across which the companies are distributed. Using a nonparametric test of differences in means, it was possible to reject the null hypothesis that the mean market value of the companies is equal across the four quadrants. The results are explained by the values the variables assume for the companies in each quadrant.

Relevance: 60.00%

Abstract:

Plant community ecologists use the null model approach to infer assembly processes from observed patterns of species co-occurrence. In about a third of published studies, the null hypothesis of random assembly cannot be rejected. When this occurs, plant ecologists interpret the observed random pattern as not environmentally constrained, but probably generated by stochastic processes. The null model approach (using the C-score and the discrepancy index) was used to test for random assembly under two simulation algorithms. Logistic regression, distance-based redundancy analysis, and constrained ordination were used to test for environmental determinism (species segregation along environmental gradients, or turnover, and species aggregation). This article introduces an environmentally determined community of alpine hydrophytes that presents itself as randomly assembled. The pathway through which the random pattern arises in this community is suggested to be as follows: two simultaneous environmental processes, one leading to species aggregation and the other leading to species segregation, concurrently generate the observed pattern, which turns out to be neither aggregated nor segregated, but random. A simulation study supports this suggestion. Although apparently simple, the null model approach seems to assume that a single ecological factor prevails, or that if several factors decisively influence the community, they all exert their influence in the same direction, generating either aggregation or segregation. As these assumptions are unlikely to hold in most cases and assembly processes cannot be inferred from random patterns, we propose that plant ecologists investigate specifically the ecological processes responsible for observed random patterns, instead of trying to infer processes from patterns.
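The C-score mentioned above is the average number of "checkerboard units" over all species pairs in a presence/absence matrix (rows are species, columns are sites); the null model approach compares it to its distribution over randomized matrices. A minimal sketch of the statistic itself:

```python
from itertools import combinations

def c_score(matrix):
    """Stone and Roberts' C-score for a presence/absence matrix.
    For each pair of rows (species) with row totals r_i, r_j and S
    shared sites, the checkerboard count is (r_i - S) * (r_j - S);
    the C-score is the mean over all pairs.  High values indicate
    segregation, low values aggregation."""
    pairs = list(combinations(matrix, 2))
    total = 0
    for a, b in pairs:
        shared = sum(1 for x, y in zip(a, b) if x and y)
        total += (sum(a) - shared) * (sum(b) - shared)
    return total / len(pairs)
```

Two perfectly segregated species maximize the score, two perfectly aggregated species zero it out, which is why, as the abstract argues, opposing processes acting at once can average out to a "random-looking" intermediate value.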

Relevance: 60.00%

Abstract:

Ninety-six bigeye tuna (88–134 cm fork length) were caught and released with implanted archival (electronic data storage) tags near fish-aggregating devices (FADs) in the equatorial eastern Pacific Ocean (EPO) during April 2000. Twenty-nine fish were recaptured, and the data from twenty-seven tags were successfully downloaded and processed. Time at liberty ranged from 8 to 446 days, and data for 23 fish at liberty for 30 days or more are presented. The accuracy of geolocation estimates, derived from the light-level data, is about 2 degrees in latitude and 0.5 degrees in longitude in this region. The movement paths derived from the filtered geolocation estimates indicated that none of the fish traveled west of 110°W during the period between release and recapture. The null hypothesis that the movement path is random was rejected in 17 of the 22 statistical tests of the observed movement paths. The estimated mean velocity was 117 km/d. The fish exhibited occasional deep-diving behavior, and some dives exceeded 1000 m, where temperatures were less than 3°C. Evaluation of timed depth records resulted in the discrimination of three distinct behaviors: 54.3% of all days were classified as unassociated (with a floating object) type-1 behavior, 27.7% as unassociated type-2 behavior, and 18.7% as behavior associated with a floating object. The mean residence time at floating objects was 3.1 d. Data sets separated into day and night were used to evaluate diel differences in behavior and habitat selection. When the fish were exhibiting unassociated type-1 behavior (diel vertical migrations), they were mostly at depths of less than 50 m (within the mixed layer) throughout the night, and during the day between 200 and 300 m and between 13°C and 14°C. They shifted their average depths in conjunction with dawn and dusk events, presumably tracking the deep-scattering layer as a foraging strategy.
There were also observed changes in the average nighttime depth distributions of the fish in relation to moon phase.

Relevance: 60.00%

Abstract:

This study sets out to investigate the psychology of immersion and the immersive response of individuals in relation to video and computer games. Initially, an exhaustive review of literature is presented, including research into games, player demographics, personality and identity. Play in traditional psychology is also reviewed, as well as previous research into immersion and attempts to define and measure this construct. An online qualitative study was carried out (N=38), and data was analysed using content analysis. A definition of immersion emerged, as well as a classification of two separate types of immersion, namely, vicarious immersion and visceral immersion. A survey study (N=217) verified the discrete nature of these categories and rejected the null hypothesis that there was no difference between individuals' interpretations of vicarious and visceral immersion. The primary aim of this research was to create a quantitative instrument which measures the immersive response as experienced by the player in a single game session. The IMX Questionnaire was developed using data from the initial qualitative study and quantitative survey. Exploratory Factor Analysis was carried out on data from 300 participants for the IMX Version 1, and Confirmatory Factor Analysis was conducted on data from 380 participants on the IMX Version 2. IMX Version 3 was developed from the results of these analyses. This questionnaire was found to have high internal consistency reliability and validity.
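The internal-consistency reliability reported for the IMX Questionnaire is conventionally measured with Cronbach's alpha, which is straightforward to compute from per-item scores; the example data in the usage below are invented for illustration, as the actual IMX items are not given in the abstract:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for internal-consistency reliability.
    `items` is a list of per-item score lists of equal length (every
    respondent answered every item).  alpha = k/(k-1) * (1 - sum of
    item variances / variance of respondents' total scores)."""
    k = len(items)
    sum_item_var = sum(variance(item) for item in items)
    # Each respondent's total score across all items.
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum_item_var / variance(totals))
```

Perfectly co-varying items give alpha = 1; values near 1 are what "high internal consistency reliability" refers to in the abstract.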

Relevance: 60.00%

Abstract:

Fatty acids in milk reflect the interplay between species-specific physiological mechanisms and maternal diet. Anthropoid primates (apes, Old and New World monkeys) vary in patterns of growth and development and dietary strategies. Milk fatty acid profiles also are predicted to vary widely. This study investigates milk fatty acid composition of five wild anthropoids (Alouatta palliata, Callithrix jacchus, Gorilla beringei beringei, Leontopithecus rosalia, Macaca sinica) to test the null hypothesis of a generalized anthropoid milk fatty acid composition. Milk from New and Old World monkeys had significantly more 8:0 and 10:0 than milk from apes. The leaf-eating species G. b. beringei and A. palliata had a significantly higher proportion of milk 18:3n-3, a fatty acid found primarily in plant lipids. Mean percent composition of 22:6n-3 was significantly different among monkeys and apes, but was similar to the lowest reported values for human milk. Mountain gorillas were unique among anthropoids in the high proportion of milk 20:4n-6. This seems to be unrelated to the requirements of a larger brain and may instead reflect species-specific metabolic processes or an unknown source of this fatty acid in the mountain gorilla diet.

Relevance: 60.00%

Abstract:

Objectives: To evaluate the empirical evidence linking nursing resources to patient outcomes in intensive care settings as a framework for future research in this area. Background: Concerns about patient safety and the quality of care are driving research on the clinical and cost-effectiveness of health care interventions, including the deployment of human resources. This is particularly important in intensive care, where a large proportion of the health care budget is consumed and where nursing staff is the main item of expenditure. Recommendations about staffing levels have been made but may not be evidence based and may not always be achieved in practice. Methods: We searched systematically for studies of the impact of nursing resources (e.g. nurse-patient ratios, nurses' level of education, training and experience) on patient outcomes, including mortality and adverse events, in adult intensive care. Abstracts of articles were reviewed and retrieved if they investigated the relationship between nursing resources and patient outcomes. Characteristics of the studies were tabulated and the quality of the studies assessed. Results: Of the 15 studies included in this review, two reported a statistical relationship between nursing resources and both mortality and adverse events, one reported an association with mortality only, seven studies reported that they could not reject the null hypothesis of no relationship to mortality, and 10 studies (out of 10 that tested the hypothesis) reported a relationship to adverse events. The main explanatory mechanisms were the lack of time for nurses to perform preventative measures or patient surveillance. The nurses' role in pain control was noted by one author. Studies were mainly observational and retrospective and varied in scope from 1 to 52 units.
Recommendations for future research include developing the mechanisms linking nursing resources to patient outcomes, and designing large multi-centre prospective studies that link patients' exposure to nursing care on a shift-by-shift basis over time. (C) 2007 Elsevier Ltd. All rights reserved.