941 results for Multiple Hypothesis Testing


Relevance: 30.00%

Abstract:

Health disparities between groups remain even after accounting for established causes such as structural and economic factors. The present research tested, for the first time, whether multiple social categorization processes can explain enhanced support for immigrant health (measured by respondents' behavioral intention to support immigrants' vaccination against A/H1N1 influenza by cutting regional public funds). Moreover, the mediating role of individualization and the moderating role of social identity complexity were tested. Findings showed that multiple, versus single, categorization of immigrants led to greater support for their right to health and confirmed the moderated mediation hypothesis. The potential of this sort of social-cognitive intervention for addressing health disparities is discussed.

Relevance: 30.00%

Abstract:

A high-performance fuel gauging sensor is described that uses five diaphragm-based pressure sensors, which are monitored using a linear array of polymer optical fiber Bragg gratings. The sensors were initially characterized using water, revealing a sensitivity of 98 pm/cm for four of the sensors and 86 pm/cm for the fifth. The discrepancy in the sensitivity of the fifth sensor has been explained as a result of the annealing of the other four sensors. Initial testing in JET A-1 aviation fuel revealed the unsuitability of silicone rubber diaphragms for prolonged usage in fuel. A second set of sensors manufactured with a polyurethane-based diaphragm showed no measurable deterioration over a three-month period immersed in fuel. These sensors exhibited a sensitivity of 39 pm/cm, lower than that of the silicone rubber devices owing to the stiffer polyurethane material used.
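The level reading follows directly from the reported sensitivities: the Bragg wavelength shift divided by the sensitivity gives the liquid column height. A minimal sketch, using the figures quoted above; the function name and the example shift values are illustrative assumptions.

```python
# Convert a fiber-Bragg-grating wavelength shift to a liquid level.
# Sensitivities come from the abstract (98 pm/cm for the annealed sensors
# in water, 39 pm/cm for the polyurethane diaphragms in JET A-1 fuel);
# the 980 pm example shift is an illustrative assumption.

def level_cm(shift_pm: float, sensitivity_pm_per_cm: float) -> float:
    """Liquid level inferred from a Bragg wavelength shift."""
    return shift_pm / sensitivity_pm_per_cm

# A 980 pm shift on a 98 pm/cm water-calibrated sensor -> 10 cm of water.
print(level_cm(980.0, 98.0))                 # 10.0
# The same shift on the stiffer 39 pm/cm fuel sensor implies ~25.1 cm.
print(round(level_cm(980.0, 39.0), 1))       # 25.1
```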

Relevance: 30.00%

Abstract:

Consumer satisfaction is an essential precondition of competitiveness and profitable operation, and one of its defining elements is the relationship between perceived and expected quality. Quality expectations have also been formulated for the internet, one of the defining channels of our time, which is why measuring online service quality, and with it online customer satisfaction, has gained such importance. The aim of this study is to review the literature on the topic, to examine the E-S-QUAL and E-RecS-QUAL scales for measuring online customer satisfaction, to test their validity under Hungarian conditions, and, with the modifications that prove necessary, to produce a scale usable in Hungary. As the foundation of online customer satisfaction measurement, the study surveys theories of how consumers perceive and evaluate online service quality, and then presents the various measurement methods, devoting particular attention to the E-S-QUAL and E-RecS-QUAL scales, which are among the most widely applied. The review focuses on websites where purchases can be made, and the research was carried out among the customers of a major Hungarian online bookstore. ______ Over the last decade the business-to-consumer online market has been growing very fast. Many studies in the marketing literature have focused on understanding and measuring e-service quality (e-sq) and online customer satisfaction. The aim of the study is to summarize these concepts, to analyse the relationship between e-sq and customer loyalty, which increases the competitiveness of companies, and to create a valid and reliable scale for measuring online customer satisfaction on the Hungarian market.
The empirical study builds on the E-S-QUAL scale and its companion, the E-RecS-QUAL, widely used multi-item scales that measure e-sq along seven dimensions: efficiency, system availability, fulfilment, privacy, responsiveness, compensation, and contact. The study focuses on the websites customers use to shop online.
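Scales of this kind are typically scored by averaging Likert-scale item responses within each dimension. A minimal sketch of that scoring step for the seven dimensions named above; the item names and the item-to-dimension mapping are illustrative placeholders, not the published E-S-QUAL item set.

```python
# Hypothetical scoring sketch for an E-S-QUAL / E-RecS-QUAL style survey:
# average the Likert item responses within each of the seven dimensions.
# Item labels and the mapping below are illustrative assumptions.

from statistics import mean

DIMENSIONS = {
    "efficiency":          ["eff1", "eff2"],
    "system availability": ["sys1", "sys2"],
    "fulfilment":          ["ful1", "ful2"],
    "privacy":             ["pri1"],
    "responsiveness":      ["res1"],
    "compensation":        ["com1"],
    "contact":             ["con1"],
}

def dimension_scores(responses: dict) -> dict:
    """Mean item score per dimension for one respondent."""
    return {dim: mean(responses[item] for item in items)
            for dim, items in DIMENSIONS.items()}

answers = {"eff1": 5, "eff2": 4, "sys1": 3, "sys2": 4, "ful1": 5,
           "ful2": 5, "pri1": 2, "res1": 4, "com1": 3, "con1": 4}
scores = dimension_scores(answers)
print(scores["efficiency"])   # 4.5
```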

Relevance: 30.00%

Abstract:

This dissertation examines the consequences of Electronic Data Interchange (EDI) use on interorganizational relations (IR) in the retail industry. EDI is a type of interorganizational information system that facilitates the exchange of business documents in structured, machine-processable form. The research model links EDI use and three IR dimensions: structural, behavioral, and outcome. Based on relevant literature from organizational theory and marketing channels, fourteen hypotheses were proposed for the relationships among EDI use and the three IR dimensions. Data were collected through self-administered questionnaires from key informants in 97 retail companies (19% response rate). The hypotheses were tested using multiple regression analysis. The analysis supports the following hypotheses: (a) EDI use is positively related to information intensity and formalization, (b) formalization is positively related to cooperation, (c) information intensity is positively related to cooperation, (d) conflict is negatively related to performance and satisfaction, (e) cooperation is positively related to performance, and (f) performance is positively related to satisfaction. The results support the general premise of the model that the relationship between EDI use and satisfaction among channel members has to be viewed within an interorganizational context. Research on EDI is still in a nascent stage. By identifying and testing relevant interorganizational variables, this study offers insights for practitioners managing boundary-spanning activities in organizations using or planning to use EDI. Further, the thesis provides avenues for future research aimed at understanding the consequences of this interorganizational information technology.
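Each hypothesis of this kind reduces to testing the sign of a regression coefficient. A minimal sketch of how one of them, "EDI use is positively related to information intensity," could be checked with ordinary least squares; the data here are synthetic stand-ins (the dissertation used survey responses from the 97 retail firms), and the variable names are illustrative.

```python
# Sketch: test one directional hypothesis with OLS on synthetic data.
# A positive estimated coefficient on edi_use is consistent with
# hypothesis (a) above. True slope (0.6) and noise level are assumptions.

import numpy as np

rng = np.random.default_rng(0)
n = 97                                   # matches the sample size above
edi_use = rng.normal(size=n)
info_intensity = 0.6 * edi_use + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), edi_use])          # intercept + predictor
beta, *_ = np.linalg.lstsq(X, info_intensity, rcond=None)
print(beta[1] > 0)   # True: positive coefficient supports the hypothesis
```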

Relevance: 30.00%

Abstract:

The purpose of this study was to determine fifth-grade students' perceptions of the Fitnessgram physical fitness testing program. The study examined whether the Fitnessgram testing experience promotes an understanding of the health-related fitness components, and it examined the relationship between individual fitness test scores and time spent participating in out-of-school physical activity. Lastly, students' thoughts and feelings concerning the Fitnessgram experience were examined. The primary participant population was 110 fifth-grade students at Redland Elementary School, a Miami-Dade County Public School (M-DCPS). Data were collected over the course of 5 months. Multiple sources of data allowed for triangulation; data sources included Fitnessgram test scores, questionnaires, document analysis, and in-depth interviews. Interview data were analyzed qualitatively for common broad themes, which were identified and defined. Document analysis included analyzing student fitness test scores and student questionnaire data; this information was analyzed to determine whether Fitnessgram test scores have an impact on student views about the school fitness-testing program. Data were statistically analyzed using analysis of frequency, crosstabulations (Bryman & Duncan, 1997), and Somers' d correlation (Bryman & Duncan, 1997). The analysis of student knowledge of the physical fitness components tested by each Fitnessgram test revealed that students do not understand the health-related fitness components.
The analysis of the relationship between individual fitness test scores and time spent in out-of-school physical activity revealed a significant positive relationship for 2 of the 6 Fitnessgram tests. Students' thoughts and feelings about each Fitnessgram test centered on two broad themes: (a) these children do not mind the physical fitness testing, and (b) how they felt about the experience was directly related to how they thought they had performed. If the goal of physical fitness testing were only to get children fit, this test might be appropriate. However, the ultimate goal is to encourage students to live active and healthy lifestyles. Findings suggest that the Fitnessgram, as implemented by M-DCPS, may not be the most suitable measurement instrument for assessing attitudinal changes that affect a healthy lifelong lifestyle.
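Somers' d, the ordinal association measure used above, counts concordant minus discordant pairs, normalized by all pairs not tied on the predictor. A small self-contained sketch (the O(n²) pair loop is fine for classroom-sized samples like this study's):

```python
# Somers' d of y with respect to x (d_yx): pairs tied on x are excluded
# from the denominator; pairs tied on y only count toward it.

def somers_d(x, y):
    concordant = discordant = tied_y_only = 0
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            dx, dy = x[i] - x[j], y[i] - y[j]
            if dx == 0:
                continue                    # tied on x: excluded
            if dy == 0:
                tied_y_only += 1
            elif dx * dy > 0:
                concordant += 1
            else:
                discordant += 1
    return (concordant - discordant) / (concordant + discordant + tied_y_only)

# Perfectly monotone data (e.g. activity hours vs. test score) gives 1.0.
print(somers_d([1, 2, 3, 4], [10, 20, 30, 40]))   # 1.0
```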

Relevance: 30.00%

Abstract:

Why do Argentines continue to support democracy despite distrusting political institutions and politicians? Support for democracy is high even though the performance of the regime is poor. Given Argentina's history, one would expect poor economic and political performance to open the door for military intervention. What changed? What explains variance across the multiple dimensions of political trust, such as trust in the regime, trust in political institutions, and trust in politicians? This dissertation is a case study of political culture through public opinion, exploring the multiple dimensions of political trust in Argentina during the 1990s. Variance across the different dimensions of political trust may be an indicator of the rise of a new type of citizen: the "critical citizen." Critical citizens criticize the regime to obtain democratic reforms but support the ideals of democracy. In established democracies, the rise of critical citizens is explained by a shift in individuals' value priorities toward postmaterialism. Postmaterialism is a cultural change in the direction of values that emphasize self-realization and individual well-being, and it influences various social and political attitudes. Because Argentina is experiencing a cultural change and a rise of critical citizens similar to more advanced societies, the theory of postmaterialism generated the main hypothesis to explain the multiple dimensions of political trust. This dissertation also tested an alternative explanation: that the multiple dimensions of political trust respond instead to citizens' evaluations of performance. Ultimately, postmaterialism explained trust in the political regime and trust in political institutions. Contrary to expectations, postmaterialism did not explain trust in political elites or politicians; trust in politicians was better explained by the alternative hypothesis, performance.
The main method of research was statistical, supplemented with the comparative method when data were available. Two main databases were used: the World Values Surveys and the Latinobarometer.

Relevance: 30.00%

Abstract:

The primary aim of this dissertation is to develop data mining tools for knowledge discovery in biomedical data when multiple (homogeneous or heterogeneous) sources of data are available. The central hypothesis is that, when information from multiple data sources is used appropriately and effectively, knowledge discovery can be achieved better than is possible from a single source alone. Recent advances in high-throughput technology have enabled biomedical researchers to generate large volumes of diverse types of data on a genome-wide scale. These data include DNA sequences, gene expression measurements, and much more; they provide the motivation for building analysis tools to elucidate the modular organization of the cell. The challenges include efficiently and accurately extracting information from the multiple data sources, representing the information effectively, developing analytical tools, and interpreting the results in the context of the domain. The first part considers the application of feature-level integration to design classifiers that discriminate between soil types. The machine learning tools SVM and KNN were used to successfully distinguish between several soil samples. The second part considers clustering using multiple heterogeneous data sources. The resulting Multi-Source Clustering (MSC) algorithm was shown to perform better than clustering methods that use only a single data source or a simple feature-level integration of heterogeneous data sources. The third part proposes a new approach to effectively incorporate incomplete data into clustering analysis. Adapted from the K-means algorithm, the Generalized Constrained Clustering (GCC) algorithm makes use of incomplete data in the form of constraints to perform exploratory analysis. Novel approaches for extracting constraints were proposed, and for sufficiently large constraint sets the GCC algorithm outperformed the MSC algorithm.
The last part considers the problem of providing a theme-specific environment for mining multi-source biomedical data. The database PlasmoTFBM, focusing on gene regulation in Plasmodium falciparum, contains diverse information and has a simple interface that allows biologists to explore the data. It provided a framework for comparing different analytical tools for predicting regulatory elements and for designing useful data mining tools. The conclusion is that the experiments reported in this dissertation strongly support the central hypothesis.
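Feature-level integration, the simplest of the strategies above, just concatenates the feature vectors of each sample across sources before classification. A minimal sketch with a 1-nearest-neighbour classifier; the two "sources" and their values are synthetic stand-ins for the soil data.

```python
# Feature-level integration: concatenate per-sample features from two
# sources, then classify with 1-NN. All data below are illustrative.

import numpy as np

def integrate(source_a: np.ndarray, source_b: np.ndarray) -> np.ndarray:
    """Concatenate features from two sources sample-by-sample."""
    return np.hstack([source_a, source_b])

def predict_1nn(train_X, train_y, query):
    """Label of the closest training sample (Euclidean distance)."""
    dists = np.linalg.norm(train_X - query, axis=1)
    return train_y[int(np.argmin(dists))]

# Two sources measured on the same four samples, two classes.
src_a = np.array([[0.0], [0.1], [5.0], [5.1]])
src_b = np.array([[1.0], [1.1], [9.0], [9.2]])
X = integrate(src_a, src_b)
y = np.array([0, 0, 1, 1])

query0 = integrate(np.array([[0.05]]), np.array([[1.05]]))[0]
query1 = integrate(np.array([[5.05]]), np.array([[9.1]]))[0]
print(predict_1nn(X, y, query0), predict_1nn(X, y, query1))   # 0 1
```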

Relevance: 30.00%

Abstract:

We develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility in essay 1. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant, and specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise factor component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying; furthermore, the covariance of the informational and noisy components is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resulting seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
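The essay-2 informativeness measure has a simple mechanical core: with returns written as information plus noise, Var(r) = Var(i) + Var(n) + 2 Cov(i, n), and informativeness is Var(i) / Var(r). A toy illustration with simulated components (the simulation, with independent components and fixed variances, is an assumption for exposition; the essay allows both variances and the covariance to be time-varying):

```python
# Toy version of the variance decomposition: returns = information + noise,
# informativeness = informational variance / total return variance.
# Component distributions are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
n = 100_000
info = rng.normal(scale=2.0, size=n)    # information-arrival component
noise = rng.normal(scale=1.0, size=n)   # noise component (independent here)
returns = info + noise

# Var(r) = Var(i) + Var(n) + 2 Cov(i, n); with independence Cov ~ 0,
# so the ratio is close to 4 / (4 + 1) = 0.8.
informativeness = info.var() / returns.var()
print(round(informativeness, 2))        # ~ 0.8
```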

Relevance: 30.00%

Abstract:

Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, the Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework, which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality as argued by feminist theories of aging. Based on the results of the study, it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age.
Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprising eleven unweighted indicators that were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
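The construction just described (z-score normalization, unweighted indicators grouped into equally weighted domains, linear aggregation) can be sketched in a few lines. The indicator count and the grouping below are illustrative placeholders; the study itself used eleven indicators in three domains.

```python
# Sketch of an ESVI-style composite: z-score each indicator column,
# average within equally weighted domains, then average the domains.
# Data and domain grouping are illustrative assumptions.

import numpy as np

def zscore(col: np.ndarray) -> np.ndarray:
    return (col - col.mean()) / col.std()

def composite_index(indicators: np.ndarray, domain_slices) -> np.ndarray:
    """Mean of equally weighted domain means of z-scored indicators."""
    z = np.apply_along_axis(zscore, 0, indicators)
    domains = [z[:, s].mean(axis=1) for s in domain_slices]
    return np.mean(domains, axis=0)

# 5 persons x 6 indicators, grouped into 3 domains of 2 indicators each.
rng = np.random.default_rng(0)
data = rng.normal(size=(5, 6))
index = composite_index(data, [slice(0, 2), slice(2, 4), slice(4, 6)])
print(index.round(3))   # per-person vulnerability scores (mean ~ 0)
```

Because every indicator is z-scored before aggregation, the index is centred near zero by construction; higher values mark persons who are more vulnerable relative to the sample.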

Relevance: 30.00%

Abstract:

Convergence among treatment, prevention, and developmental intervention approaches has led to the recognition of the need for evaluation models and research designs that employ a full range of evaluation information to provide an empirical basis for enhancing the efficiency, efficacy, and effectiveness of prevention and positive development interventions. This study reports an investigation of a positive youth development program using an Outcome Mediation Cascade (OMC) evaluation model, an integrated model for evaluating the empirical intersection between intervention and developmental processes. The Changing Lives Program (CLP) is a community-supported positive youth development intervention implemented in a practice setting as a selective/indicated program for multi-ethnic, multi-problem, at-risk youth in urban alternative high schools. This study used a Relational Data Analysis integration of quantitative and qualitative data analysis strategies, including the use of both fixed and free response measures and a structural equation modeling approach, to construct and evaluate the hypothesized OMC model. Findings indicated that the hypothesized model fit the data (χ²(7) = 6.991, p = .43; RMSEA = .00; CFI = 1.00; WRMR = .459). Findings also provided preliminary evidence consistent with the hypothesis that, in addition to having effects on targeted positive outcomes, PYD interventions are likely to have progressive cascading effects on untargeted problem outcomes that operate through effects on positive outcomes. Furthermore, the general pattern of findings suggested the need to use methods capable of capturing both quantitative and qualitative change in order to increase the likelihood of identifying more complete, theory-informed, empirically supported models of developmental intervention change processes.

Relevance: 30.00%

Abstract:

The Intersensory Redundancy Hypothesis (IRH; Bahrick & Lickliter, 2000, 2002, 2012) predicts that early in development information presented to a single sense modality will selectively recruit attention to modality-specific properties of stimulation and facilitate learning of those properties at the expense of amodal properties (unimodal facilitation). Vaillant (2010) demonstrated that bobwhite quail chicks prenatally exposed to a maternal call alone (unimodal stimulation) are able to detect a pitch change, a modality-specific property, in subsequent postnatal testing between the familiarized call and the same call with altered pitch. In contrast, chicks prenatally exposed to a maternal call paired with a temporally synchronous light (redundant audiovisual stimulation) were unable to detect a pitch change. According to the IRH (Bahrick & Lickliter, 2012), as development proceeds and the individual's perceptual abilities increase, the individual should detect modality-specific properties in both nonredundant, unimodal and redundant, bimodal conditions. However, when the perceiver is presented with a difficult task, relative to their level of expertise, unimodal facilitation should become evident. The first experiment of the present study exposed bobwhite quail chicks 24 hr after hatching to unimodal auditory, nonredundant audiovisual, or redundant audiovisual presentations of a maternal call for 10 min/hr for 24 hr. All chicks were subsequently tested 24 hr after the completion of the stimulation (72 hr following hatching) between the familiarized maternal call and the same call with altered pitch. Chicks from all experimental groups (unimodal, nonredundant audiovisual, and redundant audiovisual exposure) significantly preferred the familiarized call over the pitch-modified call.
The second experiment exposed chicks to the same exposure conditions, but created a more difficult task by narrowing the pitch range between the two maternal calls with which they were tested. Chicks in the unimodal and nonredundant audiovisual conditions demonstrated detection of the pitch change, whereas the redundant audiovisual exposure group did not show detection of the pitch change, providing evidence of unimodal facilitation. These results are consistent with predictions of the IRH and provide further support for the effects of unimodal facilitation and the role of task difficulty across early development.

Relevance: 30.00%

Abstract:

The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4,000,000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance, with considerably more data archived on calcification and primary production than on other processes, has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.

Relevance: 30.00%

Abstract:

Omnibus tests of significance in contingency tables use statistics of the chi-square type. When the null is rejected, residual analyses are conducted to identify cells in which observed frequencies differ significantly from expected frequencies. Residual analyses are thus conditioned on a significant omnibus test. Conditional approaches have been shown to substantially alter type I error rates in cases involving t tests conditional on the results of a test of equality of variances, or tests of regression coefficients conditional on the results of tests of heteroscedasticity. We show that residual analyses conditional on a significant omnibus test are also affected by this problem, yielding type I error rates that can be up to 6 times larger than nominal rates, depending on the size of the table and the form of the marginal distributions. We explored several unconditional approaches in search of a method that maintains the nominal type I error rate and found that a bootstrap correction for multiple testing achieves this goal. The validity of this approach is documented for two-way contingency tables in the contexts of tests of independence, tests of homogeneity, and fitting psychometric functions. Computer code in MATLAB and R to conduct these analyses is provided as Supplementary Material.
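The supplementary code is in MATLAB and R; the sketch below illustrates the underlying idea in Python. Tables are resampled under the independence null, the maximum absolute standardized residual is recorded per table, and its 95th percentile serves as a familywise critical value for all cells jointly. The example table, replicate count, and resampling scheme are illustrative assumptions, not the paper's exact procedure.

```python
# Bootstrap a familywise critical value for standardized residuals of a
# two-way table under the independence null. Illustrative sketch only.

import numpy as np

def std_residuals(table):
    """Adjusted (standardized) Pearson residuals of a two-way table."""
    table = np.asarray(table, float)
    n = table.sum()
    r = table.sum(axis=1, keepdims=True) / n     # row proportions
    c = table.sum(axis=0, keepdims=True) / n     # column proportions
    expected = n * r * c
    return (table - expected) / np.sqrt(expected * (1 - r) * (1 - c))

def bootstrap_critical(table, reps=2000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    table = np.asarray(table, float)
    n = int(table.sum())
    p_null = np.outer(table.sum(axis=1) / n, table.sum(axis=0) / n).ravel()
    maxima = np.empty(reps)
    for b in range(reps):
        boot = rng.multinomial(n, p_null).reshape(table.shape)
        maxima[b] = np.abs(std_residuals(boot)).max()
    return np.quantile(maxima, 1 - alpha)

obs = [[30, 20, 10], [15, 25, 20], [10, 15, 35]]
crit = bootstrap_critical(obs)
print(crit > 1.96)   # True
```

The familywise critical value exceeds the per-cell 1.96 cutoff, which is exactly how an unconditional correction of this kind keeps the type I error rate near its nominal level across all cells.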

Relevance: 30.00%

Abstract:

The unprecedented and relentless growth in the electronics industry is feeding the demand for integrated circuits (ICs) with increasing functionality and performance at minimum cost and power consumption. As predicted by Moore's law, ICs are being aggressively scaled to meet this demand. While the continuous scaling of process technology is reducing gate delays, the performance of ICs is being increasingly dominated by interconnect delays. In an effort to improve submicrometer interconnect performance, to increase packing density, and to reduce chip area and power consumption, the semiconductor industry is focusing on three-dimensional (3D) integration. However, volume production and commercial exploitation of 3D integration are not feasible yet due to significant technical hurdles.

At present, interposer-based 2.5D integration is emerging as a precursor to stacked 3D integration. All the dies and the interposer in a 2.5D IC must be adequately tested for product qualification. However, since the structure of 2.5D ICs differs from that of traditional 2D ICs, new challenges have emerged: (1) pre-bond interposer testing, (2) lack of test access, (3) limited ability for at-speed testing, (4) high-density I/O ports and interconnects, (5) a reduced number of test pins, and (6) high power consumption. This research targets these challenges, and effective solutions have been developed to test both the dies and the interposer.

The dissertation first introduces the basic concepts of 3D ICs and 2.5D ICs. Prior work on testing of 2.5D ICs is studied. An efficient method is presented to locate defects in a passive interposer before stacking. The proposed test architecture uses e-fuses that can be programmed to connect or disconnect functional paths inside the interposer. The concept of a die footprint is utilized for interconnect testing, and the overall assembly and test flow is described. Moreover, the concept of weighted critical area is defined and utilized to reduce test time. In order to fully determine the location of each e-fuse and the order of functional interconnects in a test path, we also present a test-path design algorithm. The proposed algorithm can generate all test paths for interconnect testing.

In order to test for opens, shorts, and interconnect delay defects in the interposer, a test architecture is proposed that is fully compatible with the IEEE 1149.1 standard and relies on an enhancement of the standard test access port (TAP) controller. To reduce test cost, a test-path design and scheduling technique is also presented that minimizes a composite cost function based on test time and the design-for-test (DfT) overhead in terms of additional through silicon vias (TSVs) and micro-bumps needed for test access. The locations of the dies on the interposer are taken into consideration in order to determine the order of dies in a test path.

To address the scenario of high density of I/O ports and interconnects, an efficient built-in self-test (BIST) technique is presented that targets the dies and the interposer interconnects. The proposed BIST architecture can be enabled by the standard TAP controller in the IEEE 1149.1 standard. The area overhead introduced by this BIST architecture is negligible; it includes two simple BIST controllers, a linear-feedback shift register (LFSR), a multiple-input signature register (MISR), and some extensions to the boundary-scan cells in the dies on the interposer. With these extensions, all boundary-scan cells can be used for self-configuration and self-diagnosis during interconnect testing. To reduce the overall test cost, a test scheduling and optimization technique under power constraints is described.
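The two BIST building blocks named above have a compact bit-level description: an LFSR shifts with a feedback bit XORed from tap positions to generate pseudo-random patterns, and a MISR performs the same shift while XORing in the parallel response bits to compact them into a signature. A behavioural sketch in Python; the 4-bit width, tap positions, and response values are illustrative assumptions (real BIST registers are wider).

```python
# Behavioural models of a Fibonacci LFSR (pattern generator) and a MISR
# (response compactor). Width and taps are illustrative; the tap choice
# below is maximal-length for a 4-bit register.

def lfsr_step(state: int, taps=(3, 2), width: int = 4) -> int:
    """One shift of a Fibonacci LFSR; feedback is the XOR of the taps."""
    fb = 0
    for t in taps:
        fb ^= (state >> t) & 1
    return ((state << 1) | fb) & ((1 << width) - 1)

def misr_step(state: int, response: int, taps=(3, 2), width: int = 4) -> int:
    """MISR update: LFSR shift XORed with the parallel response bits."""
    return lfsr_step(state, taps, width) ^ (response & ((1 << width) - 1))

# A maximal-length 4-bit LFSR cycles through all 15 non-zero states.
state, seen = 0b0001, set()
while state not in seen:
    seen.add(state)
    state = lfsr_step(state)
print(len(seen))   # 15

# Compacting three (illustrative) interconnect responses into a signature.
sig = 0
for resp in (0b1010, 0b0110, 0b1111):
    sig = misr_step(sig, resp)
print(sig)   # 9
```

A faulty interconnect would flip bits in one or more responses, changing the final signature; comparing against the fault-free signature is what makes the compaction usable for self-diagnosis.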

To accomplish testing with a small number of test pins, the dissertation presents two efficient ExTest scheduling strategies that implement interconnect testing between tiles inside a system-on-chip (SoC) die on the interposer while satisfying the practical constraint that the number of required test pins cannot exceed the number of available pins at the chip level. The tiles in the SoC are divided into groups based on the manner in which they are interconnected. To minimize the test time, two optimization solutions are introduced: the first minimizes the number of input test pins, and the second minimizes the number of output test pins. In addition, two subgroup configuration methods are proposed to generate subgroups inside each test group.

Finally, the dissertation presents a programmable method for shift-clock stagger assignment to reduce power supply noise during SoC die testing in 2.5D ICs. An SoC die in a 2.5D IC is typically composed of several blocks, and two neighboring blocks that share the same power rails should not be toggled at the same time during shift. The proposed programmable method therefore does not assign the same stagger value to neighboring blocks. The positions of all blocks are first analyzed, and the shared boundary length between blocks is calculated. Based on the positional relationships between the blocks, a mathematical model is presented that derives optimal results for small-to-medium-sized problems; for larger designs, a heuristic algorithm is proposed and evaluated.
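The core constraint, that blocks sharing a power-rail boundary must receive different stagger values, is structurally a graph-colouring problem on the block-adjacency graph. A minimal greedy sketch of that idea (not the dissertation's optimal model or heuristic; the block layout is an illustrative assumption):

```python
# Greedy stagger assignment: each block gets the smallest stagger value
# not already used by a neighbouring block. Layout is illustrative.

def assign_staggers(adjacency: dict) -> dict:
    stagger = {}
    for block in sorted(adjacency):        # deterministic processing order
        used = {stagger[nb] for nb in adjacency[block] if nb in stagger}
        value = 0
        while value in used:
            value += 1
        stagger[block] = value
    return stagger

# Four blocks; an edge marks a shared power-rail boundary.
layout = {
    "A": ["B", "C"],
    "B": ["A", "C"],
    "C": ["A", "B", "D"],
    "D": ["C"],
}
s = assign_staggers(layout)
# No two neighbouring blocks share a stagger value.
print(all(s[u] != s[v] for u in layout for v in layout[u]))   # True
```

A greedy pass like this gives a feasible assignment quickly; the dissertation's mathematical model additionally weighs shared boundary lengths to choose among feasible assignments.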

In summary, the dissertation targets important design and optimization problems related to testing of interposer-based 2.5D ICs. The proposed research has led to theoretical insights, experiment results, and a set of test and design-for-test methods to make testing effective and feasible from a cost perspective.