941 results for Multiple Hypothesis Testing


Relevance: 30.00%

Abstract:

Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, The Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality as argued by feminist theories of aging. Based on the results of the study it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age. 
Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprised of eleven unweighted indicators which were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
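The construction described above (z-score normalization of unweighted indicators, equally weighted domains, linear aggregation) can be sketched as follows. The domain names are taken from the abstract, but the indicator counts and data are illustrative, not the study's.

```python
import numpy as np

def z_scores(x):
    """Normalize an indicator to mean 0 and standard deviation 1."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def composite_index(domains):
    """Linear aggregation of equally weighted domains.

    `domains` maps each domain name to a list of indicator arrays
    (one value per person). Each domain score is the mean of its
    z-normalized, unweighted indicators; the index is the mean of
    the domain scores.
    """
    domain_scores = [
        np.mean([z_scores(ind) for ind in indicators], axis=0)
        for indicators in domains.values()
    ]
    return np.mean(domain_scores, axis=0)

# Illustrative data for 5 persons; indicator counts per domain are hypothetical.
rng = np.random.default_rng(0)
domains = {
    "human":    [rng.normal(size=5) for _ in range(4)],
    "material": [rng.normal(size=5) for _ in range(4)],
    "social":   [rng.normal(size=5) for _ in range(3)],
}
esvi = composite_index(domains)
```

By construction the composite has mean zero over the sample, so individual scores are read relative to the population average.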

Relevance: 30.00%

Abstract:

This thesis explores whether a specific group of large EU law firms exhibited multiple common behaviours regarding their EU geographies between 1998 and 2009. These potentially common behaviours included their preferences for trading in certain EU locations, their usage of law firm alliances, and the specific reasons why they opened or closed EU branch offices. If my hypothesis is confirmed, this may indicate that certain aspects of large law firm geography are predictable, a finding potentially of interest to various stakeholders globally, including legal regulators, academics and law firms. In testing my hypothesis, I have drawn on research conducted by the Globalization and World Cities (GaWC) Research Network. Between 1999 and 2010, the GaWC published seven research papers exploring the geographies of large US and UK law firms. Several of the GaWC's observations arising from these studies were evidence-based; others were speculative, including a novel approach for explaining legal practice branch office change not adopted in research conducted previously or subsequently. By distilling the GaWC's key observations from these papers into a series of "sub-hypotheses", I have been able to test whether the geographical behaviours of my novel cohort of large EU law firms reflect those suggested by the GaWC. The more the GaWC's suggested behaviours are observed among my cohort, the more my hypothesis is supported. In conducting this exercise, I additionally evaluate the extent to which the GaWC's research has aided our understanding of large EU law firm geography. Ultimately, my findings broadly support most of the GaWC's observations, notwithstanding the cohort differences and the speculative nature of several of the GaWC's propositions.
My investigation has also allowed me to refine several of the GaWC’s observations regarding commonly-observable large law firm geographical behaviours, while also addressing a key omission from the group’s research output.

Relevance: 30.00%

Abstract:

Although persuasion often occurs via oral communication, it remains a comparatively understudied area. This research tested the hypothesis that changes in three properties of voice influence perceptions of speaker confidence, which in turn differentially affect attitudes according to the different underlying psychological processes that the Elaboration Likelihood Model (ELM; Petty & Cacioppo, 1984) suggests should emerge under different levels of thought. Experiment 1 was a 2 (Elaboration: high vs. low) x 2 (Vocal speed: increased vs. decreased) x 2 (Vocal intonation: falling vs. rising) between-participants factorial design. Vocal speed and vocal intonation influenced perceptions of speaker confidence as predicted. In line with the ELM, under high elaboration, confidence biased thought favorability, which in turn influenced attitudes. Under low elaboration, confidence did not bias thoughts but rather directly influenced attitudes as a peripheral cue. Experiment 2 used a similar design but focused on vocal pitch. Results confirmed that pitch influenced perceptions of confidence as predicted. Importantly, we also replicated the bias and cue processes found in Experiment 1. Experiment 3 investigated the process by which a broader spectrum of speech rates influences persuasion under moderate elaboration. In a 2 (Argument quality: strong vs. weak) x 4 (Vocal speed: extremely slow vs. moderately slow vs. moderately fast vs. extremely fast) between-participants factorial design, results confirmed the hypothesized non-linear relationship between speech rate and perceptions of confidence. In line with the ELM, speech rate influenced persuasion through the amount of processing. Experiment 4 investigated the effects of a broader spectrum of vocal intonation on persuasion under moderate elaboration, using a design similar to Experiment 3. Results indicated a partial success of our vocal intonation manipulation.
No evidence was found to support the hypothesized mechanism. These studies show that changes in several different properties of voice can influence the extent to which listeners perceive a speaker as confident. Importantly, evidence suggests that different vocal properties influence persuasion through the same bias and cue processes under high and low thought. Evidence also suggests that under moderate thought, speech rate influences persuasion through the amount of processing.

Relevance: 30.00%

Abstract:

Lung cancer diagnostics have progressed greatly over the past decade. Developing molecular testing to identify an increasing number of potentially clinically actionable genetic variants, using smaller samples obtained via minimally invasive techniques, is a huge challenge. Tumour heterogeneity and cancer evolution in response to therapy mean that repeat biopsies or circulating biomarkers are likely to become increasingly useful for adapting treatment as resistance develops. We highlight some of the current challenges faced in clinical practice for molecular testing of EGFR, ALK, and newer biomarkers such as PD-L1. Implementation of next-generation sequencing platforms for molecular diagnostics in non-small-cell lung cancer is increasingly common, allowing testing of multiple genetic variants from a single sample. The use of next-generation sequencing to recruit for molecularly stratified clinical trials is discussed in the context of the UK Stratified Medicine Programme and the UK National Lung Matrix Trial.

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Abstract:

Multiple endocrine neoplasia syndromes have been classified as types 1 and 2, each with specific phenotypic patterns. MEN1 is usually associated with pituitary, parathyroid and pancreatic neuroendocrine tumours. The hallmark of MEN2 is a very high lifetime risk of developing medullary thyroid carcinoma (MTC): more than 95% in untreated patients. Three clinical subtypes (MEN2A, MEN2B, and familial MTC (FMTC)) have been defined based on the risk of pheochromocytoma and hyperparathyroidism and the presence or absence of characteristic physical features. MEN2 occurs as a result of germline activating missense mutations of the RET (REarranged during Transfection) proto-oncogene. MEN2-associated mutations are almost always located in exons 10, 11, or 13 through 16. Strong genotype-phenotype correlations exist with respect to clinical subtype, age at onset, and aggressiveness of MTC in MEN2. These are used to determine the age at which prophylactic thyroidectomy should occur and whether screening for pheochromocytoma or hyperparathyroidism is necessary. Specific RET mutations can also affect management in patients presenting with apparently sporadic MTC; therefore, genetic testing should be performed before surgical intervention in all patients diagnosed with MTC. Recently, Pellegata et al. have reported that germline mutations in CDKN1B can predispose to the development of multiple endocrine tumours in both rats and humans; this new MEN syndrome is named MENX in rats and MEN4 in humans. A recent report showed that in sporadic MTC the CDKN1B V109G polymorphism correlates with a more favorable disease progression than the wild-type allele and might be considered a promising new prognostic marker. New insights into the pathogenesis of MEN syndromes and related inherited endocrine disorders are of particular interest for an adequate surgical and therapeutic approach.

Relevance: 30.00%

Abstract:

Large component-based systems are often built from many of the same components. As individual component-based software systems are developed, tested and maintained, these shared components are repeatedly manipulated. As a result there are often significant overlaps and synergies across and among the different test efforts of different component-based systems. However, in practice, testers of different systems rarely collaborate, taking a test-all-by-yourself approach. As a result, redundant effort is spent testing common components, and important information that could be used to improve testing quality is lost. The goal of this research is to demonstrate that, if done properly, testers of shared software components can save effort by avoiding redundant work, and can improve the test effectiveness for each component as well as for each component-based software system by using information obtained when testing across multiple components. To achieve this goal I have developed collaborative testing techniques and tools for developers and testers of component-based systems with shared components, applied the techniques to subject systems, and evaluated the cost and effectiveness of applying the techniques. The dissertation research is organized in three parts. First, I investigated current testing practices for component-based software systems to find the testing overlap and synergy we conjectured exists. Second, I designed and implemented infrastructure and related tools to facilitate communication and data sharing between testers. Third, I designed two testing processes to implement different collaborative testing algorithms and applied them to large actively developed software systems. This dissertation has shown the benefits of collaborative testing across component developers who share their components. 
With collaborative testing, researchers can design algorithms and tools to support collaboration processes, achieve better efficiency in testing configurations, and discover inter-component compatibility faults within a minimal time window after they are introduced.

Relevance: 30.00%

Abstract:

With the continued miniaturization and increasing performance of electronic devices, new technical challenges have arisen. One such issue is delamination occurring at critical interfaces inside the device. This major reliability issue can occur during the manufacturing process or during normal use of the device. Proper evaluation of the adhesion strength of critical interfaces early in the product development cycle can help reduce reliability issues and the time-to-market of the product. However, conventional adhesion strength testing is inherently limited in the face of package miniaturization, which brings further technical challenges for quantifying design integrity and reliability. Although there are many different interfaces in today's advanced electronic packages, they can be generalized into two main categories: 1) rigid-to-rigid connections with a thin flexible polymeric layer in between, or 2) a thin film membrane on a rigid structure. Since every technique has its own advantages and disadvantages, multiple testing methods must be enhanced and developed to accommodate all the interfaces encountered in emerging electronic packaging technologies. For evaluating high-strength interfaces in thin multilayer structures, a novel adhesion test configuration called the "single cantilever adhesion test" (SCAT) is proposed and implemented for an epoxy molding compound (EMC) and photo solder resist (PSR) interface. The test method is then shown to be capable of comparing and selecting the stronger of two potential EMC/PSR material sets. Additionally, a theoretical approach for establishing the applicable testing domain of a four-point bending test method is presented. For evaluating polymeric films on rigid substrates, the major testing challenges are reducing testing scatter and factoring in the potentially degrading effect of environmental conditioning on the material properties of the film.
An advanced blister test with a predefined test area, based on an elasto-plastic analytical solution, was developed and implemented for a conformal coating used to prevent tin whisker growth. This method was then extended with a numerical approach for evaluating the adhesion strength when the polymer film's properties are unknown.

Relevance: 30.00%

Abstract:

In this thesis, tool support is addressed for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) uses abstract behavioral models to automate test generation, decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, by contrast, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) is one such tool: it uses probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a System Under Test (SUT) and in turn carries out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, the MBPeT tool currently comprises a downloadable command-line tool as well as a graphical user interface. The goal of this thesis project is two-fold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, thereby removing the requirement of local installation, and 2) to design a user interface for this web application which adds new user-interaction paradigms to the tool's existing feature set. All phases of the MBPeT process are realized via this single web deployment location, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. This web application (MBPeT Dashboard) is implemented in the Java programming language on top of the Vaadin framework for rich internet application development. The Vaadin framework handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java.
A number of experiments are run in a case study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the solution implemented in handling multiple concurrent users. The results support a successful solution with regards to the functional and performance criteria defined, while improvements and optimizations are suggested to increase both of these factors.
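The probabilistic user-behavior models referred to above can be illustrated with a minimal sketch: states are user actions and edges carry transition probabilities, and a workload is generated by walking the model. MBPeT's actual model format is not specified here, so the actions and probabilities below are hypothetical.

```python
import random

# Hypothetical user model: each action maps to weighted next actions.
MODEL = {
    "browse": [("browse", 0.5), ("search", 0.3), ("exit", 0.2)],
    "search": [("browse", 0.4), ("buy", 0.3), ("exit", 0.3)],
    "buy":    [("exit", 1.0)],
}

def generate_session(model, start="browse", max_steps=50, seed=42):
    """Walk the model from `start` until 'exit', returning the action trace."""
    rng = random.Random(seed)
    trace, state = [start], start
    for _ in range(max_steps):
        actions, weights = zip(*model[state])
        state = rng.choices(actions, weights=weights)[0]
        if state == "exit":
            break
        trace.append(state)
    return trace

session = generate_session(MODEL)
```

A load generator would replay many such traces concurrently against the SUT, mapping each abstract action to a concrete request.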


Relevance: 30.00%

Abstract:

Thin film adhesion often determines microelectronic device reliability, so it is essential to have experimental techniques that accurately and efficiently characterize it. Laser-induced delamination is a novel technique that uses laser-generated stress waves to load thin films at high strain rates and extract the fracture toughness of the film/substrate interface. The effectiveness of the technique in measuring the interface properties of metallic films has been documented in previous studies. The objective of the current effort is to model the effect of residual stresses on the dynamic delamination of thin films. Residual stresses can be high enough to affect the crack advance and the mode mixity of the delamination event, and must therefore be adequately modeled to make accurate and repeatable predictions of fracture toughness. The equivalent axial force and bending moment generated by the residual stresses are included in a dynamic, nonlinear finite element model of the delaminating film, and the impact of residual stresses on the final extent of the interfacial crack, the relative contribution of shear failure, and the deformed shape of the delaminated film is studied in detail. Another objective of the study is to develop techniques to address issues related to the testing of polymeric films. These types of films adhere well to silicon, and the resulting crack advance is often much smaller than for metallic films, making the extraction of the interface fracture toughness more difficult. The use of an inertial layer, which enhances the amount of kinetic energy trapped in the film and thus the crack advance, is examined. It is determined that the inertial layer does improve the crack advance, although in a relatively limited fashion. The high interface toughness of polymer films often causes the film to fail cohesively when the crack front leaves the weakly bonded region and enters the strong interface.
The use of a tapered pre-crack region that provides a more gradual transition to the strong interface is examined. The tapered triangular pre-crack geometry is found to be effective in reducing the induced stresses, thereby making it an attractive option. We conclude by studying the impact of modifying the pre-crack geometry to enable the testing of multiple polymer films.

Relevance: 30.00%

Abstract:

This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field regained popularity over the last few years and, like statistical analysis in general, is still undergoing a transformation to high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or the more recently developed m-approximability conditions, which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change, and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases.
We derive sharp conditions on »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. Particularly, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series, close to the non-stationary case, the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
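The classical »change in the mean« CUSUM statistic that this theory extends can be sketched in the simplest univariate case. This is a toy illustration only; the thesis's Hilbert space and panel settings are far more general.

```python
import numpy as np

def cusum_change_point(x):
    """Return (max CUSUM statistic, estimated change point) for a mean shift.

    T_k = |sum_{i<=k} (x_i - mean(x))| / (sigma * sqrt(n)); the change point
    is estimated by the argmax over k. Simplified univariate illustration.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    sigma = x.std(ddof=1)
    partial = np.cumsum(x - x.mean())          # centered partial sums
    stats = np.abs(partial) / (sigma * np.sqrt(n))
    k = int(np.argmax(stats[:-1]))             # last partial sum is exactly 0
    return float(stats[k]), k + 1              # break after observation k+1

# Simulated series with a mean shift from 0 to 2 after observation 100.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
stat, tau = cusum_change_point(x)
```

Under the null of no change, the properly rescaled statistic converges to the supremum of a Brownian bridge, which is what the Gumbel and Brownian bridge approximations mentioned above generalize.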

Relevance: 30.00%

Abstract:

This study tests two general and independent hypotheses with the basic assumption that phytoactive secondary compounds produced by plants evolved primarily as plant defences against competitor plant species. The first hypothesis is that the production and main way of release of phytoactive compounds reflect an adaptive response to climatic conditions. Thus, higher phytoactivity by volatile compounds prevails in plants of hot, dry environments, whereas higher phytoactivity by water-soluble compounds is preponderant in plants from wetter environments. The second hypothesis is that synergy between plant phytoactive compounds is widespread, due to the resulting higher energy efficiency and economy of resources. The first hypothesis was tested on germination and early growth of cucumber treated with either water extracts or volatiles from leaves or vegetative shoot tops of four Mediterranean-type shrubs. The second hypothesis was tested on germination of subterranean clover treated with either water extracts of leaves or vegetative shoot tops of one tree and of three Mediterranean-type shrubs or with each of the three fractions obtained from water extracts. Our data do not support either hypothesis. We found no evidence for higher phytoactivity in volatile compounds released by plants that thrive in hot, dry Mediterranean-type environments. We also found no evidence for the predominance of synergy among the constituents of fractions. On the contrary, we found either antagonism or no interaction of effects among allelopathic compounds.

Relevance: 30.00%

Abstract:

The issues influencing student engagement with high-stakes computer-based exams were investigated, drawing on feedback from two cohorts of international MA Education students encountering this assessment method for the first time. Qualitative data from surveys and focus groups on the students' examination experience were analysed, leading to the identification of engagement issues in the delivery of high-stakes computer-based assessments. The exam combined short-answer open-response questions with multiple-choice-style items to assess knowledge and understanding of research methods. The findings suggest that engagement with computer-based testing depends, to a lesser extent, on students' general levels of digital literacy and, to a greater extent, on their information technology (IT) proficiency for assessment and their ability to adapt their test-taking strategies, including organisational and cognitive strategies, to the online assessment environment. The socialisation and preparation of students for computer-based testing therefore emerge as key responsibilities for instructors to address, with students requesting increased opportunities for practice and training to develop the IT skills and test-taking strategies necessary to succeed in computer-based examinations. These findings and their implications in terms of instructional responsibilities form the basis of a proposal for a framework for Learner Engagement with e-Assessment Practices.

Relevance: 30.00%

Abstract:

In the field of educational and psychological measurement, the shift from paper-based to computerized tests has become a prominent trend in recent years. Computerized tests allow for more complex and personalized test administration procedures, such as Computerized Adaptive Testing (CAT). Following Item Response Theory (IRT) models, CAT dynamically assembles tests from test-taker responses, driven by statistical algorithms. Although CAT structures are complex, they are flexible and convenient; however, concerns about test security must be addressed. Frequent item administration can lead to item exposure and cheating, necessitating preventive and diagnostic measures. In this thesis a method called "CHeater identification using Interim Person fit Statistic" (CHIPS) is developed, designed to identify and limit cheaters in real time during test administration. CHIPS uses response times (RTs) to calculate an Interim Person fit Statistic (IPS), allowing on-the-fly intervention using a more secret item bank. A slight modification, Modified-CHIPS (M-CHIPS), is also proposed for situations with constant speed. A simulation study assesses CHIPS, highlighting its effectiveness in identifying and controlling cheaters, but it reveals a limitation when cheaters possess all the correct answers; M-CHIPS overcomes this limitation. Furthermore, the method has been shown not to be influenced by the cheaters' ability distribution or by the correlation between test-takers' ability and speed. Finally, the method allows a free choice of significance level and carries over from fixed-length tests to variable-length ones. The thesis discusses potential applications, including the method's suitability for multiple-choice tests, and its assumptions about the RT distribution and the level of item pre-knowledge.
Limitations are also discussed, pointing to future developments such as different RT distributions, unusual honest-respondent behaviors, and field testing in real-world scenarios. In summary, CHIPS and M-CHIPS offer real-time cheating detection in CAT, enhancing test security and ability estimation without penalizing honest test-takers.
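The idea of an interim, response-time-based person-fit check can be illustrated with a toy statistic: a test-taker whose cumulative standardized log response time is suspiciously low (i.e., who answers much faster than calibrated honest behavior) is flagged mid-test. This sketch is not the thesis's actual CHIPS/IPS formula; the calibration parameters, threshold, and data below are hypothetical.

```python
import math

def interim_rt_flag(log_rts, mu, sigma, z_crit=-2.33):
    """Flag a test-taker whose cumulative standardized log response time
    is suspiciously low, checked after every item.

    mu, sigma: mean and sd of the log-RT per item under honest responding
    (assumed known from calibration). Returns the first item index at which
    the cumulative z-statistic crosses z_crit (one-sided), else None.
    """
    total = 0.0
    for k, lrt in enumerate(log_rts, start=1):
        total += (lrt - mu) / sigma
        z = total / math.sqrt(k)  # standardized sum of k deviations
        if z < z_crit:
            return k              # intervene here, e.g. switch item bank
    return None

# Hypothetical log-RT traces: honest responding around mu, vs. uniformly fast.
honest  = [1.0, 1.2, 0.9, 1.1, 1.0, 0.8, 1.1]
cheater = [0.2, 0.1, 0.3, 0.2, 0.1, 0.2, 0.1]
flag_h = interim_rt_flag(honest,  mu=1.0, sigma=0.3)
flag_c = interim_rt_flag(cheater, mu=1.0, sigma=0.3)
```

On a flag, a CHIPS-style procedure would switch the session to a more secret item bank rather than terminate the test, limiting the benefit of item pre-knowledge.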