804 results for Null hypothesis
Abstract:
The purpose of this quantitative study was to explore the previously unexamined phenomenon of middle school parental engagement in a large urban/suburban/rural school district of 209 schools in the mid-Atlantic region of the United States. Across 22 middle schools serving grades six through eight, this study collected and examined the perceptions of the three key adult stakeholder groups (administrators, teachers, and parents) most actively involved in middle school parental engagement, as described within the theoretical framework of academic socialization. Their reports of observable parental engagement activities were used to document how district stakeholders operationalize behaviors representing the five actionable constructs and three themes of academic socialization, and to determine how the district “fares” in employing academic socialization as a middle school parental engagement strategy. The study also applied a one-way ANOVA to determine the significance of observable variations in actionable constructs among the perspectives of the three stakeholder groups. Finally, through regression modeling, the study illuminated when confounding factors/independent variables such as race, income, school size, administrator and teacher experience, and parents’ educational background impacted the operationalization of academic socialization behaviors for middle school parent and family engagement. Rejecting the null hypothesis, the study found that the three stakeholder groups had statistically significant differences in perceptions of their implementation of activities aligned to academic socialization. The study ultimately illuminated ways in which these adult stakeholder groups share similar and varied perceptions about engagement actions that support the achievement and maturation of middle school students. Significantly, it provided key findings identifying areas that can be systemically addressed to transform middle school parental engagement practices, through applied academic socialization theory, into consistent and effective collaboration between home and school. The process of operationalizing academic socialization is outlined in terms that any school or district can follow to improve middle school parental engagement programs and practices in the best interests of students during this period of great transition, both for child/adolescent growth and development and for adults navigating systems to support students at this unique stage of growth and maturation.
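A minimal sketch of the kind of three-group comparison described above, assuming hypothetical perception scores and group sizes (the study's actual survey data and variables are not reproduced here):

```python
# Illustrative one-way ANOVA across three stakeholder groups (assumed data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical mean "academic socialization" perception scores per group.
administrators = rng.normal(4.2, 0.5, 60)
teachers = rng.normal(3.8, 0.6, 200)
parents = rng.normal(3.5, 0.7, 400)

f_stat, p_value = stats.f_oneway(administrators, teachers, parents)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A p-value below the chosen alpha (e.g. 0.05) rejects the null hypothesis of
# equal group means, mirroring the study's finding of statistically
# significant differences between stakeholder perceptions.
```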
Abstract:
Brazil is home to one of the richest avifaunas in the world, which is subject to high levels of environmental degradation, in particular forest fragmentation. The Atlantic Forest biome reflects this history of devastation and today persists as small isolated fragments in highly degraded landscapes. This project aimed to evaluate the effects of forest fragmentation on the distribution and organization of the forest bird assemblage in an area of Atlantic Forest remnants in northern Paraná (Brazil), and tested the hypothesis that the structure of the assemblage in the fragments differs from that expected by chance. We carried out four qualitative bird surveys in three sets of forest fragments in the landscape, each set containing three fragments: large, medium and small. Sampling used point counts along transects, traversed at random for four hours in each fragment, in two periods: September to November 2013 and March to May 2014. The structure of the assemblage was assessed using species co-occurrence indices (checkerboard and C-score) and patterns of α diversity (richness) and β diversity (species turnover), while landscape structure was analyzed using the following parameters: area, distance between fragments, fractal dimension, edge density, fragment shape index and core area index. The null hypothesis of no structure in the bird assemblage across the landscape was tested with null models based on the co-occurrence indices. The effects of landscape structure on assemblage structure were analyzed with the Mantel test and principal component analysis (PCA). The assemblage structure in the landscape showed a pattern of spatiotemporal organization significantly different from that expected by chance, revealing a structure influenced mostly by species segregation. The fragments showed significant differences in richness, unlike the sets of fragments, indicating relative homogeneity in landscape structure. Differences in fragment size and in the distance between fragments significantly influenced the organizational patterns of the forest bird assemblage in the landscape and the patterns of α and β diversity, indicating that the larger the fragments and the smaller the distances between them, the more the pattern of species co-occurrence departs from that expected by chance. Thus, the fragmented landscape of Atlantic Forest remnants in northern Paraná still offers environmental resources and physical characteristics that allow a persistent organizational structure of the forest bird assemblage in space over time.
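As an illustration of the null-model approach described above, the following sketch computes a C-score on an assumed presence/absence matrix and compares it against matrices randomized so that each species keeps its number of occurrences; the thesis' actual data and null-model algorithm are not reproduced here.

```python
# C-score co-occurrence test against a simple null model (synthetic data).
import numpy as np

def c_score(m):
    """Mean checkerboard units over all species pairs of a presence/absence matrix."""
    n_species = m.shape[0]
    scores = []
    for i in range(n_species):
        for j in range(i + 1, n_species):
            shared = np.sum(m[i] & m[j])
            ri, rj = m[i].sum(), m[j].sum()
            scores.append((ri - shared) * (rj - shared))
    return np.mean(scores)

rng = np.random.default_rng(42)
# Hypothetical presence/absence matrix: 20 bird species x 9 forest fragments.
obs = (rng.random((20, 9)) < 0.4).astype(int)
obs_score = c_score(obs)

null_scores = []
for _ in range(1000):
    sim = np.array([rng.permutation(row) for row in obs])  # shuffle sites within each species
    null_scores.append(c_score(sim))

p = np.mean(np.array(null_scores) >= obs_score)
print(f"observed C-score = {obs_score:.2f}, null p = {p:.3f}")
# A small p indicates more species segregation than expected by chance.
```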
Abstract:
This dissertation is composed of three essays covering two areas of interest. The first topic is personal transportation demand, with a focus on the price and fuel efficiency elasticities of mileage demand, challenging assumptions common in the rebound effect literature. The second topic is consumer finance, with a focus on small loans. The first chapter creates separate variables for fuel prices during periods of increasing and decreasing prices, as well as an observed fuel economy measure, to empirically test the equivalence of these elasticities. Using a panel from Germany from 1997 to 2009, I find a fuel economy elasticity of mileage of 53.3%, which is significantly different from the gas price elasticity of mileage during periods of decreasing gas prices, 4.8%. I reject the null hypothesis of price symmetry, with the elasticity of mileage during periods of increasing gas prices ranging from 26.2% to 28.9%. The second chapter explores the potential for the rebound effect to vary with income. Panel data from U.S. households from 1997 to 2003 are used to estimate the rebound effect in a median regression. The estimated rebound effect independent of income ranges from 17.8% to 23.6%. An interaction of income and fuel economy is negative and significant, indicating that the rebound effect may be much higher for low-income individuals and decreases with income; the rebound effect for low-income households ranged from 80.3% to 105.0%, indicating that such households may increase gasoline consumption given an improvement in fuel economy. The final chapter documents the costs of credit instruments found in major mail order catalogs throughout the 20th century. This study constructs a new dataset and finds that the cost of credit increased and became stickier as mail order retailers switched from an installment-style closed-end loan to a revolving-style credit card. This study argues that revolving credit's ability to decrease the salience of credit costs in the price of goods is the best explanation for rate stickiness in the mail order industry, as well as for retailers' preference for revolving credit.
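The test of price symmetry described in the first chapter can be sketched on synthetic data: a mileage regression with separate price terms for rising and falling prices, followed by a Wald-type test that the two elasticities are equal. All variable names and values below are invented for illustration and do not come from the dissertation's panel.

```python
# Sketch: testing equality of asymmetric price elasticities of mileage.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "log_price_up": rng.normal(0, 0.2, n),    # log fuel price, rising-price periods
    "log_price_down": rng.normal(0, 0.2, n),  # log fuel price, falling-price periods
    "log_fuel_econ": rng.normal(0, 0.3, n),   # observed fuel economy (logged)
})
df["log_miles"] = (-0.28 * df.log_price_up - 0.05 * df.log_price_down
                   + 0.5 * df.log_fuel_econ + rng.normal(0, 0.1, n))

model = smf.ols("log_miles ~ log_price_up + log_price_down + log_fuel_econ", df).fit()
# Null hypothesis of price symmetry: the two price elasticities are equal.
print(model.t_test("log_price_up = log_price_down"))
```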
Abstract:
Research in health care and the use of its results have served as a basis for improving the quality of care, requiring health professionals to have knowledge of the specific area in which they work and of research methodology, including observation techniques and techniques for data collection and analysis, so that they can more easily be competent readers of research results. Health professionals are privileged observers of human responses to health and illness, and can contribute to the development and well-being of individuals, often in situations of great vulnerability. In child health and paediatrics the focus is on family-centred care, privileging the harmonious development of the child and adolescent and valuing measurable health outcomes that make it possible to determine the effectiveness of interventions and the quality of health and life. In the paediatric context we highlight evidence-based practice, the importance attributed to research and to the application of research results in clinical practice, and the development of standardized measurement instruments, namely assessment scales in wide clinical use, which facilitate the appraisal and evaluation of the development and health of children and adolescents and result in health gains. Systematic observation of neonatal and paediatric populations with assessment scales has been increasing, allowing a more balanced assessment of children and an observation grounded in theory and in research results. Some of these aspects served as the basis for this work, which addresses three fundamental objectives. To address the first objective, "To identify in the scientific literature the statistical tests most frequently used by researchers in child health and paediatrics when using assessment scales", a systematic literature review was carried out, whose aim was to analyse scientific articles whose data collection instruments were assessment scales, in the area of child and adolescent health, built with ordinal variables, and to identify the statistical tests applied to these variables. The exploratory analysis of the articles showed that researchers use different instruments with different ordinal measurement formats (with 3, 4, 5, 7 or 10 points) and apply parametric tests, non-parametric tests, or both simultaneously with this type of variable, whatever the sample size. The description of the methodology does not always make explicit whether the assumptions of the tests are met. The articles consulted do not always mention the frequency distribution of the variables (symmetry/asymmetry) or the magnitude of the correlations between items. This body of literature supported the writing of two articles, one a systematic literature review and the other a theoretical reflection. Although some answers were found to the doubts faced by researchers and professionals who work with these instruments, there is a need for simulation studies that confirm some real situations and some existing theory, and that address other aspects into which real scenarios can be framed, so as to facilitate decision-making by researchers and clinicians who use assessment scales.
To address the second objective, "To compare the performance, in terms of power and type I error probability, of the 4 parametric MANOVA statistics with 2 non-parametric MANOVA statistics when using randomly generated correlated ordinal variables", we developed a simulation study using the Monte Carlo method, carried out in the R software. The design of the simulation study included a vector with 3 dependent variables, one independent variable (a factor with three groups), assessment scales with measurement formats of 3, 4, 5 and 7 points, different marginal probabilities (p1 for a symmetric distribution, p2 for a positively skewed distribution, p3 for a negatively skewed distribution and p4 for a uniform distribution) in each of the three groups, correlations of low, medium and high magnitude (r=0.10, r=0.40, r=0.70, respectively), and six sample sizes (n=30, 60, 90, 120, 240, 300). The analysis of the results showed that Roy's largest root was the statistic with the highest estimates of type I error probability and test power. Test power behaves differently depending on the frequency distribution of the item responses, the magnitude of the correlations between items, the sample size and the measurement format of the scale. Based on the frequency distribution, we considered three distinct situations: the first (with marginal probabilities p1,p1,p4 and p4,p4,p1), in which power estimates were very low across the different scenarios; the second (with marginal probabilities p2,p3,p4; p1,p2,p3 and p2,p2,p3), in which power is high in samples of 60 or more observations for scales with 3, 4 and 5 points and lower for scales with 7 points, though of the same magnitude for samples of 120 observations, whatever the scenario; and the third (with marginal probabilities p1,p1,p2; p1,p2,p4; p2,p2,p1; p4,p4,p2 and p2,p2,p4), in which the stronger the correlations between items and the larger the number of scale points, and the smaller the sample size, the lower the power of the tests, with Wilks' lambda applied to the ranks being more powerful than all the other MANOVA statistics, with values immediately below Roy's largest root. However, the power of the parametric and non-parametric tests is similar for samples larger than 90 observations (with correlations of low and medium magnitude between the dependent variables) for scales with 3, 4 and 5 points, and for samples larger than 240 observations, with correlations of low intensity, for scales with 7 points. In the simulation study, and based on the frequency distribution, we concluded that in the first simulation situation, across the different scenarios, power is low because MANOVA does not detect differences between groups given their similarity. In the second simulation situation, across the different scenarios, power is high in all scenarios with sample sizes above 60 observations, so parametric tests can be applied.
In the third simulation situation, across the different scenarios, the smaller the sample size and the stronger the correlations and the larger the number of scale points, the lower the power of the tests, with the highest power obtained by Wilks' test applied to the ranks, followed by Pillai's trace applied to the ranks. However, the power of the parametric and non-parametric tests is similar for larger samples with correlations of low and medium magnitude. To address the third objective, "To frame the results of applying parametric MANOVA and non-parametric MANOVA to real data from assessment scales with measurement formats of 3, 4, 5 and 7 points within the results of the statistical simulation study", we used real data from the observation of newborns with the Early Feeding Skills (EFS) oral feeding assessment scale, the assessment of skin injury risk with the Neonatal Skin Risk Assessment Scale (NSRAS), and the assessment of functional independence in children and adolescents with spina bifida with the Functional Independence Measure (FIM). To analyse these scales, 4 practical applications were carried out that fitted the scenarios of the simulation study. Age, weight and level of spinal lesion were the independent variables chosen to define the groups, with newborns grouped by "gestational age classes" and "weight classes", and children and adolescents with spina bifida grouped by "age classes" and "levels of spinal lesion". The results with real data fitted well within the simulation study.
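A much-reduced sketch of the kind of Monte Carlo experiment described above, written in Python rather than the R implementation used in the thesis, with assumed settings (5-point scale, three equal groups, medium item correlation) and a simple rank transform standing in for the non-parametric statistics:

```python
# Sketch: parametric MANOVA vs. MANOVA on rank-transformed ordinal scores.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(2024)
n_per_group, n_items, r = 60, 3, 0.40           # sample size, DVs, item correlation
cov = np.full((n_items, n_items), r) + np.eye(n_items) * (1 - r)
cutpoints = [-0.84, -0.25, 0.25, 0.84]          # map latent normals to a 5-point scale

def one_sample(shift):
    latent = rng.multivariate_normal(np.full(n_items, shift), cov, n_per_group)
    return np.digitize(latent, cutpoints) + 1   # ordinal scores 1..5

def manova_table(data):
    df = pd.DataFrame(data, columns=["y1", "y2", "y3"])
    df["group"] = np.repeat(["g1", "g2", "g3"], n_per_group)
    # mv_test() reports Wilks' lambda, Pillai's trace, Hotelling-Lawley trace
    # and Roy's greatest root with their approximate F tests and p-values.
    return MANOVA.from_formula("y1 + y2 + y3 ~ group", data=df).mv_test()

# Under H0 all three groups share the same distribution (shift = 0 everywhere);
# changing one shift to e.g. 0.5 would give a power scenario instead.
scores = np.vstack([one_sample(0.0) for _ in range(3)])
print("parametric MANOVA on ordinal scores:\n", manova_table(scores))
print("MANOVA on rank-transformed scores:\n", manova_table(stats.rankdata(scores, axis=0)))
# Repeating this a few thousand times and counting p < 0.05 yields the type I
# error and power estimates that the simulation study tabulates per scenario.
```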
Abstract:
This work aims to study the fluctuation structure of physical properties in oil well logs, using Detrended Fluctuation Analysis (DFA) as the analysis technique. The study covered 54 oil wells in the Namorado Field (Campo de Namorado), located in the Campos Basin, Rio de Janeiro. We studied five profiles, namely: sonic, density, porosity, resistivity and gamma ray. For most of the profiles, DFA values were available in the literature; for the sonic profile they were estimated with the aid of a standard algorithm. The DFA exponents of the five profiles were compared using linear correlation, yielding 10 pairwise comparisons of profiles. Our null hypothesis is that the DFA values for the various physical properties are independent. The main result indicates no refutation of the null hypothesis; that is, the fluctuations observed by DFA in the profiles do not have a universal character, and in general each quantity displays a fluctuation structure of its own. Of the ten correlations studied, only the density and sonic profiles showed a significant correlation (p > 0.05). Finally, these results indicate that DFA data should be used with caution because, in general, geological analyses based on DFA of different profiles can lead to disparate conclusions.
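The DFA exponent at the core of this analysis can be sketched as follows; the routine below is a generic, illustrative implementation applied to synthetic series, not the study's algorithm or well-log data.

```python
# Compact illustrative DFA (detrended fluctuation analysis) routine.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """Estimate the DFA scaling exponent of a 1-D series."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    # Slope of log F(s) versus log s is the DFA exponent.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(7)
white_noise = rng.normal(size=4096)
random_walk = np.cumsum(rng.normal(size=4096))
print(f"white noise  alpha ~ {dfa_exponent(white_noise):.2f}  (expected ~0.5)")
print(f"random walk  alpha ~ {dfa_exponent(random_walk):.2f}  (expected ~1.5)")
# In the study, one such exponent per physical log (sonic, density, porosity,
# resistivity, gamma ray) is computed per well, and the 10 pairwise linear
# correlations between exponents are then tested for significance.
```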
Abstract:
This thesis is concerned with change point analysis for time series, i.e. with detection of structural breaks in time-ordered, random data. This long-standing research field regained popularity over the last few years and is still undergoing, as is statistical analysis in general, a transformation to high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or more recently developed m-approximability conditions which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases. We derive sharp conditions on »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. Particularly, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series, close to the non-stationary case, the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
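A univariate sketch of the classical CUSUM »change in the mean« test that the thesis generalizes to Hilbert space valued settings may help fix ideas; the data and variance estimator below are illustrative (i.i.d. case) and do not reflect the long-run covariance corrections or Darling-Erdős weighting the thesis develops.

```python
# Classical CUSUM test for a single change in the mean (illustrative).
import numpy as np

def cusum_change_test(x):
    """Return the normalized sup-CUSUM statistic and the most likely change point."""
    n = len(x)
    k = np.arange(1, n + 1)
    partial = np.cumsum(x)
    sigma = np.std(x, ddof=1)   # i.i.d. case; use a long-run variance under dependence
    # Brownian-bridge-type process: |S_k - (k/n) S_n| / (sigma * sqrt(n))
    bridge = np.abs(partial - k / n * partial[-1]) / (sigma * np.sqrt(n))
    return bridge.max(), int(bridge.argmax()) + 1

rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(0.0, 1, 150), rng.normal(0.8, 1, 150)])  # break at t=150
stat, khat = cusum_change_test(x)
print(f"sup-CUSUM = {stat:.2f}, estimated change point = {khat}")
# Under the null of no change the statistic converges to the supremum of a
# Brownian bridge, whose 95% quantile is roughly 1.36; larger values reject.
# Darling-Erdős versions rescale the statistic so the limit is Gumbel instead.
```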
Abstract:
An optimal multiple testing procedure is identified for linear hypotheses under the general linear model, maximizing the expected number of false null hypotheses rejected at any significance level. The optimal procedure depends on the unknown data-generating distribution, but can be consistently estimated. Drawing information together across many hypotheses, the estimated optimal procedure provides an empirical alternative hypothesis by adapting to underlying patterns of departure from the null. Proposed multiple testing procedures based on the empirical alternative are evaluated through simulations and an application to gene expression microarray data. Compared to a standard multiple testing procedure, it is not unusual for use of an empirical alternative hypothesis to increase by 50% or more the number of true positives identified at a given significance level.
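For context, the standard pipeline that such procedures are compared against can be sketched as follows: many per-hypothesis tests followed by Benjamini-Hochberg FDR control on simulated microarray-style data. The paper's estimated-optimal procedure with an empirical alternative is not reproduced here.

```python
# Baseline multiple testing pipeline on synthetic "gene expression" data.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(5)
n_genes, n_per_group = 2000, 10
effects = np.where(rng.random(n_genes) < 0.1, 1.0, 0.0)   # 10% truly non-null
group_a = rng.normal(0, 1, (n_genes, n_per_group))
group_b = rng.normal(effects[:, None], 1, (n_genes, n_per_group))

pvals = stats.ttest_ind(group_a, group_b, axis=1).pvalue
reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"rejections: {reject.sum()}, true positives: {reject[effects > 0].sum()}")
# A procedure that is optimal in the paper's sense adapts the test statistic to
# the estimated pattern of departures from the null, aiming to maximize the
# expected number of false null hypotheses rejected at this level.
```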
Abstract:
The aim of this contribution is to analyze the use of empirical tests in German-language sport psychology. The results of comparable analyses, for example in psychology, show that there are discrepancies between the requirements of statistical testing frameworks and empirical practice, discrepancies that have not yet been described and evaluated for sport psychology. The 1994–2007 volumes of the Zeitschrift für Sportpsychologie (formerly psychologie und sport) were examined with respect to whether research questions were formulated, which type of sample was chosen, which testing framework was used, which significance level was applied, and whether statistical problems were discussed. 83 articles were categorized along these criteria by two independent raters. The main result is that sport psychology research predominantly employs a mixture of Fisher's significance testing and Neyman-Pearson hypothesis testing, the so-called "hybrid model" or "null ritual". Reporting of statistical power is scarcely observed. A temporal analysis of the articles shows that, above all, the use of effect sizes has increased in recent years. Finally, approaches for improving and standardizing the use of empirical tests are proposed and discussed.
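The power analysis and effect-size reporting the article finds largely missing in practice can be illustrated with a brief sketch, using an assumed medium effect size and standard two-sample t-test power formulas (not data from the reviewed articles).

```python
# Illustrative a priori power analysis in the Neyman-Pearson spirit.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
# Fix alpha, desired power and an assumed standardized effect size d = 0.5.
n_required = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
achieved = analysis.power(effect_size=0.5, nobs1=30, alpha=0.05, ratio=1.0)
print(f"n per group for d=0.5, power=0.80: {n_required:.1f}")
print(f"power with n=30 per group:         {achieved:.2f}")
```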
Abstract:
Stimulation of inhibitory neurotransmitter receptors, such as γ-aminobutyric acid type B (GABAB) receptors, activates G protein-gated inwardly rectifying K+ channels (GIRK) which, in turn, influence membrane excitability. Seizure activity has been reported in a Girk2 null mutant mouse lacking GIRK2 channels but showing normal cerebellar development as well as in the weaver mouse, which has mutated GIRK2 channels and shows abnormal development. To understand how the function of GIRK2 channels differs in these two mutant mice, we compared the G protein-activated inwardly rectifying K+ currents in cerebellar granule cells isolated from Girk2 null mutant and weaver mutant mice with those from wild-type mice. Activation of GABAB receptors in wild-type granule cells induced an inwardly rectifying K+ current, which was sensitive to pertussis toxin and inhibited by external Ba2+ ions. The amplitude of the GABAB receptor-activated current was severely attenuated in granule cells isolated from both weaver and Girk2 null mutant mice. By contrast, the G protein-gated inwardly rectifying current and possibly the agonist-independent basal current appeared to be less selective for K+ ions in weaver but not Girk2 null mutant granule cells. Our results support the hypothesis that a nonselective current leads to the weaver phenotype. The loss of GABAB receptor-activated GIRK current appears coincident with the absence of GIRK2 channel protein and the reduction of GIRK1 channel protein in the Girk2 null mutant mouse, suggesting that GABAB receptors couple to heteromultimers composed of GIRK1 and GIRK2 channel subunits.
Abstract:
This dissertation investigates the acquisition of oblique relative clauses in L2 Spanish by English and Moroccan Arabic speakers in order to understand the role of previous linguistic knowledge and its interaction with Universal Grammar, on the one hand, and the relationship between grammatical knowledge and its use in real time, on the other. Three types of tasks were employed: an oral production task, an on-line self-paced grammaticality judgment task, and an on-line self-paced reading comprehension task. Results indicated that the acquisition of oblique relative clauses in Spanish is a problematic area for second language learners of intermediate proficiency in the language, regardless of their native language. In particular, this study has shown that, even when the learners' native language shares the main properties of the L2, i.e., fronting of the obligatory preposition (pied-piping), there is still room for divergence, especially in production and timed grammatical intuitions. On the other hand, reaction time data have shown that L2 learners can and do converge at the level of sentence processing, showing exactly the same real-time effects for oblique relative clauses that native speakers had. Processing results demonstrated that native and non-native speakers alike are able to apply universal processing principles such as the Minimal Chain Principle (De Vincenzi, 1991) even when the L2 learners still have incomplete grammatical representations, a result that contradicts some of the predictions of the Shallow Structure Hypothesis (Clahsen & Felser, 2006). Results further suggest that the L2 processing and comprehension domains may be able to access some type of information that is not yet available to other grammatical modules, probably because transfer of certain L1 properties occurs asymmetrically across linguistic domains. In addition, this study also explored the Null-Prep phenomenon in L2 Spanish, and proposed that Null-Prep is an interlanguage stage, fully available and accounted for within UG, which intermediate L2 learners as well as first language learners go through in the development of pied-piping oblique relative clauses. It is hypothesized that this intermediate stage is the result of optionality of the obligatory preposition in the derivation, when it is not crucial for the meaning of the sentence and when the DP is going to be in an A-bar position, so it can get default case. This optionality can be predicted by the Bottleneck Hypothesis (Slabakova, 2009c) if we consider that these prepositions are a kind of functional morphology. This study contributes to the field of SLA and L2 processing in various ways. First, it demonstrates that grammatical representations may be dissociated from grammatical processing, in the sense that L2 learners, unlike native speakers, can present unexpected asymmetries such as convergent processing but divergent grammatical intuitions or production. This conclusion is only possible under the assumption of a modular language system. Finally, it contributes to the general debate in generative SLA since it argues for a fully UG-constrained interlanguage grammar.
Abstract:
We present a method for topological SLAM that specifically targets loop closing for edge-ordered graphs. Instead of using a heuristic approach to accept or reject loop closing, we propose a probabilistically grounded multi-hypothesis technique that relies on the incremental construction of a map/state hypothesis tree. Loop closing is introduced automatically within the tree expansion, and likely hypotheses are chosen based on their posterior probability after a sequence of sensor measurements. Careful pruning of the hypothesis tree keeps the growing number of hypotheses under control and a recursive formulation reduces storage and computational costs. Experiments are used to validate the approach.
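A minimal sketch of a hypothesis-tree data structure of the sort described above, with invented likelihood values and a simple posterior-based pruning rule; the paper's probabilistic model, loop-closing mechanism and pruning criteria are not reproduced here.

```python
# Toy map/state hypothesis tree with posterior-based pruning (illustrative).
import math
from dataclasses import dataclass, field

@dataclass
class Hypothesis:
    label: str
    log_posterior: float
    children: list = field(default_factory=list)

def expand(leaves, likelihoods):
    """likelihoods maps each branching decision to P(measurement | decision)."""
    new_leaves = []
    for leaf in leaves:
        for decision, lik in likelihoods.items():
            child = Hypothesis(f"{leaf.label}/{decision}",
                               leaf.log_posterior + math.log(lik))
            leaf.children.append(child)
            new_leaves.append(child)
    return new_leaves

def prune(leaves, keep=4):
    """Keep only the `keep` most probable leaves to bound tree growth."""
    return sorted(leaves, key=lambda h: h.log_posterior, reverse=True)[:keep]

# Two hypothetical sensor measurements with different loop-closure evidence.
leaves = [Hypothesis("root", 0.0)]
for measurement in ({"close": 0.7, "no_close": 0.3}, {"close": 0.2, "no_close": 0.8}):
    leaves = prune(expand(leaves, measurement))

for h in leaves:
    print(h.label, round(h.log_posterior, 2))
```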
Abstract:
The enemy release hypothesis predicts that native herbivores will either prefer or cause more damage to native than introduced plant species. We tested this using preference and performance experiments in the laboratory and surveys of leaf damage caused by the magpie moth Nyctemera amica on a co-occurring native and introduced species of fireweed (Senecio) in eastern Australia. In the laboratory, ovipositing females and feeding larvae preferred the native S. pinnatifolius over the introduced S. madagascariensis. Larvae performed equally well on foliage of S. pinnatifolius and S. madagascariensis: pupal weights did not differ between insects reared on the two species, but growth rates were significantly faster on S. pinnatifolius. In the field, foliage damage was significantly greater on native S. pinnatifolius than introduced S. madagascariensis. These results support the enemy release hypothesis, and suggest that the failure of native consumers to switch to introduced species contributes to their invasive success. Both plant species experienced reduced, rather than increased, levels of herbivory when growing in mixed populations, as opposed to pure stands in the field; thus, there was no evidence that apparent competition occurred.
Abstract:
Age-related maculopathy (ARM) has remained a challenging topic with respect to its aetiology, pathomechanisms, early detection and treatment since the late 19th century, when it was first described as an entity in its own right. ARM was previously considered an inflammatory disease, a degenerative disease, a tumor, and the result of choroidal hemodynamic disturbances and ischaemia. The latter processes have been repeatedly suggested to have a key role in its development and progression. In vivo experiments under hypoxic conditions could serve as models for the ischaemic deficits in ARM. Recent research has also linked ARM with gene polymorphisms; it is however unclear what triggers a person's genetic susceptibility. In this manuscript, a hypothesis linking aetiological factors, including ischaemia and genetics, with the development of early clinicopathological changes in ARM is proposed. New clinical psychophysical and electrophysiological tests are introduced that can detect ARM at an early stage. Models of early ARM based upon hemodynamic, photoreceptor and post-receptoral deficits are described, and the mechanisms by which ischaemia may be involved as a final common pathway are considered. In neovascular age-related macular degeneration (neovascular AMD), ischaemia is thought to promote release of vascular endothelial growth factor (VEGF), which induces chorioretinal neovascularisation. VEGF is critical in the maintenance of the healthy choriocapillaris. In the final section of the manuscript, the documentation of the effect of new anti-VEGF treatments on retinal function in neovascular AMD is critically reviewed.