973 results for multiple testing
Abstract:
Consumer satisfaction is an indispensable condition of competitiveness and of economical operation, and one of its defining elements is the relationship between perceived and expected quality. Quality expectations have also been formulated for the internet, one of today's dominant channels, which is why defining online service quality, and with it measuring online customer satisfaction, has taken on a significant role. The aim of the study is to provide a literature review on the topic, to examine the E-S-QUAL and E-RecS-QUAL scales known from the literature for measuring online customer satisfaction, to test their validity under Hungarian conditions, and, by making the modifications that prove necessary, to create a scale that can be used in Hungary. As the foundation for measuring online customer satisfaction, the study reviews theories on how consumers perceive and evaluate online service quality, and then presents the various measurement methods, with particular emphasis on the E-S-QUAL and E-RecS-QUAL scales, which are among the most widely applied methods. The review focuses on websites where purchases can be made, and the research was carried out among the customers of a major Hungarian online bookstore. ______ Over the last decade the business-to-consumer online market has grown very fast. Many studies in the marketing literature have focused on understanding and measuring e-service quality (e-sq) and online customer satisfaction. The aim of the study is to summarize these concepts, analyse the relationship between e-sq and customer loyalty, which increases the competitiveness of companies, and create a valid and reliable scale for measuring online customer satisfaction in the Hungarian market. The empirical study is based on the E-S-QUAL and its companion scale, the E-RecS-QUAL, widely used multiple-item scales that measure e-sq along seven dimensions: efficiency, system availability, fulfilment, privacy, responsiveness, compensation, and contact. The study focuses on the websites customers use to shop online.
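For illustration (not taken from the study itself), scoring such a multi-item scale typically amounts to averaging Likert-scale item responses within each dimension and then across dimensions; the item counts and responses below are hypothetical, while the seven dimension names follow the abstract:

```python
# Minimal sketch of scoring a multi-item e-service-quality scale.
# Dimension names follow the abstract (E-S-QUAL: efficiency, system
# availability, fulfilment, privacy; E-RecS-QUAL: responsiveness,
# compensation, contact); item counts and responses are illustrative.
from statistics import mean

# Hypothetical 5-point Likert responses of one respondent, grouped by dimension.
responses = {
    "efficiency":          [4, 5, 4, 4],
    "system availability": [3, 4, 4],
    "fulfilment":          [5, 5, 4],
    "privacy":             [4, 4],
    "responsiveness":      [3, 3, 4],
    "compensation":        [2, 3],
    "contact":             [4, 3],
}

# Each dimension score is the mean of its items; an overall e-sq score is
# often reported as the mean of the dimension scores.
dimension_scores = {dim: mean(items) for dim, items in responses.items()}
overall = mean(dimension_scores.values())

for dim, score in dimension_scores.items():
    print(f"{dim:20s} {score:.2f}")
print(f"{'overall e-sq':20s} {overall:.2f}")
```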
Abstract:
The purpose of this study was to determine fifth grade students' perceptions of the Fitnessgram physical fitness testing program. This study examined whether the Fitnessgram physical fitness testing experience promotes an understanding of the health-related fitness components and examined the relationship between individual fitness test scores and time spent participating in out-of-school physical activity. Lastly, students' thoughts and feelings concerning the Fitnessgram experience were examined. The primary participant population for the study was 110 fifth grade students at Redland Elementary School, a Miami-Dade County Public School (M-DCPS). Data were collected over the course of 5 months. Multiple sources of data allowed for triangulation. Data sources included Fitnessgram test scores, questionnaires, document analysis, and in-depth interviews. Interview data were analyzed qualitatively for common broad themes, which were identified and defined. Document analysis included analyzing student fitness test scores and student questionnaire data. This information was analyzed to determine whether the Fitnessgram test scores have an impact on student views about the school fitness-testing program. Data were statistically analyzed using analysis of frequency, crosstabulations (Bryman & Duncan, 1997), and Somers' d correlation (Bryman & Duncan, 1997). The analysis of data on student knowledge of the physical fitness components tested by each Fitnessgram test revealed that students do not understand the health-related fitness components. The analysis of the relationship between individuals' fitness test scores and time spent in out-of-school physical activity revealed a significant positive relationship for 2 of the 6 Fitnessgram tests. Students' thoughts and feelings about each Fitnessgram test centered on 2 broad themes: (a) these children do not mind the physical fitness testing, and (b) how they felt about the experience was directly related to how they thought they had performed. If the goal of physical fitness were only to get children fit, this test may be appropriate. However, the ultimate goal of physical fitness is to encourage students to live active and healthy lifestyles. Findings suggest that the Fitnessgram as implemented by M-DCPS may not be the most suitable measurement instrument when assessing attitudinal changes that affect a healthy lifelong lifestyle.
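For reference, Somers' d, the ordinal measure named above, is conventionally defined from the numbers of concordant pairs (N_c), discordant pairs (N_d), and pairs tied only on the dependent variable (T_Y); this is the textbook definition, not a formula quoted from the thesis:

```latex
% Somers' d with Y as the dependent variable and X as the independent variable:
% N_c = concordant pairs, N_d = discordant pairs,
% T_Y = pairs tied on Y but not on X.
d_{YX} \;=\; \frac{N_c - N_d}{\,N_c + N_d + T_Y\,}
```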
Abstract:
Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, The Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality as argued by feminist theories of aging. Based on the results of the study it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age. Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprised of eleven unweighted indicators which were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
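A sketch of the aggregation described in the last sentences, under the assumption (not stated in the abstract) that the unweighted indicators within a domain are simply averaged: each indicator is z-score normalized, domain scores are formed from the indicators belonging to that domain, and the three domains are combined with equal weights:

```latex
% z-score normalization of indicator j for individual i, followed by
% within-domain averaging (J_k = indicator set of domain k, assumed) and
% equal weighting of the three domains, as stated in the abstract.
z_{ij} = \frac{x_{ij} - \bar{x}_j}{s_j}, \qquad
D_{ik} = \frac{1}{|J_k|} \sum_{j \in J_k} z_{ij}, \qquad
\mathrm{ESVI}_i = \frac{1}{3} \sum_{k=1}^{3} D_{ik}
```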
Abstract:
The unprecedented and relentless growth in the electronics industry is feeding the demand for integrated circuits (ICs) with increasing functionality and performance at minimum cost and power consumption. As predicted by Moore's law, ICs are being aggressively scaled to meet this demand. While the continuous scaling of process technology is reducing gate delays, the performance of ICs is being increasingly dominated by interconnect delays. In an effort to improve submicrometer interconnect performance, to increase packing density, and to reduce chip area and power consumption, the semiconductor industry is focusing on three-dimensional (3D) integration. However, volume production and commercial exploitation of 3D integration are not feasible yet due to significant technical hurdles.
At the present time, interposer-based 2.5D integration is emerging as a precursor to stacked 3D integration. All the dies and the interposer in a 2.5D IC must be adequately tested for product qualification. However, since the structure of 2.5D ICs differs from that of traditional 2D ICs, new challenges have emerged: (1) pre-bond interposer testing, (2) lack of test access, (3) limited ability for at-speed testing, (4) high density of I/O ports and interconnects, (5) reduced number of test pins, and (6) high power consumption. This research targets the above challenges, and effective solutions have been developed to test both the dies and the interposer.
The dissertation first introduces the basic concepts of 3D ICs and 2.5D ICs. Prior work on testing of 2.5D ICs is studied. An efficient method is presented to locate defects in a passive interposer before stacking. The proposed test architecture uses e-fuses that can be programmed to connect or disconnect functional paths inside the interposer. The concept of a die footprint is utilized for interconnect testing, and the overall assembly and test flow is described. Moreover, the concept of weighted critical area is defined and utilized to reduce test time. In order to fully determine the location of each e-fuse and the order of functional interconnects in a test path, we also present a test-path design algorithm. The proposed algorithm can generate all test paths for interconnect testing.
In order to test for opens, shorts, and interconnect delay defects in the interposer, a test architecture is proposed that is fully compatible with the IEEE 1149.1 standard and relies on an enhancement of the standard test access port (TAP) controller. To reduce test cost, a test-path design and scheduling technique is also presented that minimizes a composite cost function based on test time and the design-for-test (DfT) overhead in terms of additional through silicon vias (TSVs) and micro-bumps needed for test access. The locations of the dies on the interposer are taken into consideration in order to determine the order of dies in a test path.
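One plausible form of such a composite cost function is a weighted sum of test time and DfT overhead; the weights and the exact formulation used in the dissertation are not given in the abstract, so the expression below is only an assumed illustration:

```latex
% Assumed weighted-sum form: alpha, beta, gamma trade off test time against
% the additional TSVs and micro-bumps needed for test access.
C_{\text{test}} \;=\; \alpha\, T_{\text{test}} \;+\; \beta\, N_{\text{TSV}} \;+\; \gamma\, N_{\text{micro-bump}}
```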
To address the scenario of high density of I/O ports and interconnects, an efficient built-in self-test (BIST) technique is presented that targets the dies and the interposer interconnects. The proposed BIST architecture can be enabled by the standard TAP controller in the IEEE 1149.1 standard. The area overhead introduced by this BIST architecture is negligible; it includes two simple BIST controllers, a linear feedback shift register (LFSR), a multiple-input signature register (MISR), and some extensions to the boundary-scan cells in the dies on the interposer. With these extensions, all boundary-scan cells can be used for self-configuration and self-diagnosis during interconnect testing. To reduce the overall test cost, a test scheduling and optimization technique under power constraints is described.
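To make the roles of the LFSR and MISR concrete, here is a minimal sketch of pseudo-random pattern generation and response compaction; the 8-bit width, feedback taps, and the toy "circuit" are illustrative and not taken from the proposed architecture:

```python
# Minimal sketch of the BIST building blocks named in the abstract: an LFSR
# produces pseudo-random test patterns and a MISR compacts the responses into
# a signature. Width (8 bits) and feedback taps are illustrative only.

WIDTH = 8
LFSR_TAPS = [7, 5, 4, 3]   # feedback taps (bit indices) of the pattern generator
MISR_TAPS = [7, 5, 4, 3]   # feedback taps of the signature register

def lfsr_step(state: int) -> int:
    """One shift of a Fibonacci-style LFSR: XOR the tap bits into the new LSB."""
    fb = 0
    for t in LFSR_TAPS:
        fb ^= (state >> t) & 1
    return ((state << 1) | fb) & ((1 << WIDTH) - 1)

def misr_step(state: int, response: int) -> int:
    """One MISR cycle: shift with feedback, then XOR in the parallel response bits."""
    fb = 0
    for t in MISR_TAPS:
        fb ^= (state >> t) & 1
    shifted = ((state << 1) | fb) & ((1 << WIDTH) - 1)
    return shifted ^ (response & ((1 << WIDTH) - 1))

def run_bist(circuit, n_patterns: int, seed: int = 0x5A) -> int:
    """Apply n_patterns LFSR patterns to `circuit` and return the MISR signature."""
    lfsr, misr = seed, 0
    for _ in range(n_patterns):
        lfsr = lfsr_step(lfsr)
        misr = misr_step(misr, circuit(lfsr))
    return misr

if __name__ == "__main__":
    # Toy "interconnect": a fault-free path returns the pattern unchanged;
    # a stuck-at-0 fault on bit 3 changes the compacted signature.
    good = lambda p: p
    faulty = lambda p: p & ~(1 << 3)
    print(hex(run_bist(good, 64)), hex(run_bist(faulty, 64)))
```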
In order to accomplish testing with a small number of test pins, the dissertation presents two efficient ExTest scheduling strategies that implement interconnect testing between tiles inside a system-on-chip (SoC) die on the interposer while satisfying the practical constraint that the number of required test pins cannot exceed the number of pins available at the chip level. The tiles in the SoC are divided into groups based on the manner in which they are interconnected. In order to minimize the test time, two optimization solutions are introduced: the first minimizes the number of input test pins, and the second minimizes the number of output test pins. In addition, two subgroup configuration methods are proposed to generate subgroups inside each test group.
Finally, the dissertation presents a programmable method for shift-clock stagger assignment to reduce power supply noise during SoC die testing in 2.5D ICs. An SoC die in a 2.5D IC is typically composed of several blocks, and two neighboring blocks that share the same power rails should not be toggled at the same time during shift. Therefore, the proposed programmable method does not assign the same stagger value to neighboring blocks. The positions of all blocks are first analyzed, and the shared boundary length between blocks is then calculated. Based on the positional relationships between the blocks, a mathematical model is presented to derive optimal results for small-to-medium sized problems. For larger designs, a heuristic algorithm is proposed and evaluated.
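A simplified sketch of the constraint just described: blocks that share a power-rail boundary must not receive the same stagger value, which can be treated as a greedy graph-coloring pass over the block adjacency. The block names and adjacencies below are hypothetical; the dissertation additionally weights the shared boundary length and uses an exact model for small designs:

```python
# Greedy sketch of stagger assignment: neighbouring blocks (those sharing a
# power-rail boundary) must not get the same stagger value.

from typing import Dict, List

def assign_staggers(neighbours: Dict[str, List[str]], n_staggers: int) -> Dict[str, int]:
    assignment: Dict[str, int] = {}
    # Visit blocks with many neighbours first, as in typical greedy coloring.
    for block in sorted(neighbours, key=lambda b: -len(neighbours[b])):
        used = {assignment[n] for n in neighbours[block] if n in assignment}
        for stagger in range(n_staggers):
            if stagger not in used:
                assignment[block] = stagger
                break
        else:
            raise ValueError(f"no feasible stagger for block {block}")
    return assignment

if __name__ == "__main__":
    adjacency = {
        "B0": ["B1", "B2"],
        "B1": ["B0", "B2"],
        "B2": ["B0", "B1", "B3"],
        "B3": ["B2"],
    }
    print(assign_staggers(adjacency, n_staggers=3))
```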
In summary, the dissertation targets important design and optimization problems related to the testing of interposer-based 2.5D ICs. The proposed research has led to theoretical insights, experimental results, and a set of test and design-for-test methods that make testing effective and feasible from a cost perspective.
Abstract:
Lung cancer diagnostics have progressed greatly over the past decade. Developing molecular tests that identify an increasing number of potentially clinically actionable genetic variants, using smaller samples obtained via minimally invasive techniques, is a major challenge. Tumour heterogeneity and cancer evolution in response to therapy mean that repeat biopsies or circulating biomarkers are likely to be increasingly useful for adapting treatment as resistance develops. We highlight some of the current challenges faced in clinical practice for molecular testing of EGFR, ALK, and newer biomarkers such as PD-L1. Implementation of next-generation sequencing platforms for molecular diagnostics in non-small-cell lung cancer is increasingly common, allowing testing of multiple genetic variants from a single sample. The use of next-generation sequencing to recruit for molecularly stratified clinical trials is discussed in the context of the UK Stratified Medicine Programme and the UK National Lung Matrix Trial.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Multiple endocrine neoplasia syndromes have been classified as types 1 and 2, each with specific phenotypic patterns. MEN1 is usually associated with pituitary, parathyroid, and pancreatic neuroendocrine tumours. The hallmark of MEN2 is a very high lifetime risk of developing medullary thyroid carcinoma (MTC), more than 95% in untreated patients. Three clinical subtypes, MEN2A, MEN2B, and familial MTC (FMTC), have been defined based on the risk of pheochromocytoma and hyperparathyroidism and the presence or absence of characteristic physical features. MEN2 occurs as a result of germline activating missense mutations of the RET (REarranged during Transfection) proto-oncogene. MEN2-associated mutations are almost always located in exons 10, 11, or 13 through 16. Strong genotype-phenotype correlations exist with respect to clinical subtype, age at onset, and aggressiveness of MTC in MEN2. These are used to determine the age at which prophylactic thyroidectomy should occur and whether screening for pheochromocytoma or hyperparathyroidism is necessary. Specific RET mutations can also affect management in patients presenting with apparently sporadic MTC. Therefore, genetic testing should be performed before surgical intervention in all patients diagnosed with MTC. Recently, Pellegata et al. have reported that germline mutations in CDKN1B can predispose to the development of multiple endocrine tumours in both rats and humans; this new MEN syndrome is named MENX in rats and MEN4 in humans, respectively. A recent report showed that in sporadic MTC the CDKN1B V109G polymorphism correlates with a more favourable disease progression than the wild-type allele and might be considered a new promising prognostic marker. New insights into MEN syndrome pathogenesis and related inherited endocrine disorders are of particular interest for an adequate surgical and therapeutic approach.
Abstract:
Large component-based systems are often built from many of the same components. As individual component-based software systems are developed, tested and maintained, these shared components are repeatedly manipulated. As a result there are often significant overlaps and synergies across and among the different test efforts of different component-based systems. However, in practice, testers of different systems rarely collaborate, taking a test-all-by-yourself approach. As a result, redundant effort is spent testing common components, and important information that could be used to improve testing quality is lost. The goal of this research is to demonstrate that, if done properly, testers of shared software components can save effort by avoiding redundant work, and can improve the test effectiveness for each component as well as for each component-based software system by using information obtained when testing across multiple components. To achieve this goal I have developed collaborative testing techniques and tools for developers and testers of component-based systems with shared components, applied the techniques to subject systems, and evaluated the cost and effectiveness of applying the techniques. The dissertation research is organized in three parts. First, I investigated current testing practices for component-based software systems to find the testing overlap and synergy we conjectured exists. Second, I designed and implemented infrastructure and related tools to facilitate communication and data sharing between testers. Third, I designed two testing processes to implement different collaborative testing algorithms and applied them to large actively developed software systems. This dissertation has shown the benefits of collaborative testing across component developers who share their components. With collaborative testing, researchers can design algorithms and tools to support collaboration processes, achieve better efficiency in testing configurations, and discover inter-component compatibility faults within a minimal time window after they are introduced.
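As a purely illustrative sketch (not the dissertation's infrastructure), the redundancy described above can be avoided by keying test outcomes on the shared component and its version, so that collaborating test efforts skip work already done by others:

```python
# Illustrative sketch of test-result sharing across collaborating test efforts:
# a shared store keyed by (component, version, test id) lets a second system
# reuse an outcome instead of re-running the same test on the same component.

from typing import Callable, Dict, Tuple

Key = Tuple[str, str, str]             # (component, version, test case id)
shared_results: Dict[Key, bool] = {}   # shared across collaborating testers

def run_once(component: str, version: str, test_id: str,
             test_fn: Callable[[], bool]) -> bool:
    """Run a component test only if no collaborator has run it already."""
    key = (component, version, test_id)
    if key not in shared_results:
        shared_results[key] = test_fn()
    return shared_results[key]

if __name__ == "__main__":
    # Two "systems" exercising the same shared component: the second call
    # reuses the stored outcome instead of repeating the work.
    print(run_once("parser-lib", "1.4.2", "roundtrip", lambda: True))
    print(run_once("parser-lib", "1.4.2", "roundtrip", lambda: True))
```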
Abstract:
In this thesis, tool support is addressed for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) utilizes abstract behavioral models to automate test generation, thus decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, by contrast, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) is one such tool, which utilizes probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a system under test (SUT) and in turn carry out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, the MBPeT tool currently comprises a downloadable command-line tool as well as a graphical user interface. The goal of this thesis project is twofold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, thereby removing the requirement of local installation, and 2) to design a user interface for this web application that adds new user interaction paradigms to the existing feature set of the tool. All phases of the MBPeT process will be realized via this single web deployment location, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. This web application (MBPeT Dashboard) is implemented in the Java programming language on top of the Vaadin framework for rich internet application development. The Vaadin framework handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java. A number of experiments are run in a case study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the solution in handling multiple concurrent users. The results support a successful solution with regard to the functional and performance criteria defined, while improvements and optimizations are suggested to further increase both of these factors.
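As a rough illustration of the kind of probabilistic user-behavior model the abstract refers to, the sketch below walks a small Markov-style model of user actions to produce synthetic load; the states, transition probabilities, and think times are hypothetical and not MBPeT's actual model format:

```python
# Hypothetical probabilistic user-behavior model: states are user actions,
# edges carry transition probabilities, and a virtual user walks the model,
# issuing one request per state, to generate synthetic workload against a SUT.
import random
import time

MODEL = {
    "browse":      [("search", 0.5), ("view_item", 0.3), ("exit", 0.2)],
    "search":      [("view_item", 0.7), ("browse", 0.3)],
    "view_item":   [("add_to_cart", 0.4), ("browse", 0.6)],
    "add_to_cart": [("checkout", 0.5), ("browse", 0.5)],
    "checkout":    [("exit", 1.0)],
}
THINK_TIME = (0.1, 0.5)  # seconds between actions (uniform range)

def simulate_user(send_request, start="browse"):
    """Walk the model from `start` until 'exit', issuing one request per state."""
    state = start
    while state != "exit":
        send_request(state)                      # e.g. an HTTP call against the SUT
        time.sleep(random.uniform(*THINK_TIME))  # emulate user think time
        actions, weights = zip(*MODEL[state])
        state = random.choices(actions, weights=weights)[0]

if __name__ == "__main__":
    simulate_user(lambda action: print("request:", action))
```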
Abstract:
International audience
Abstract:
Thin film adhesion often determines microelectronic device reliability, and it is therefore essential to have experimental techniques that accurately and efficiently characterize it. Laser-induced delamination is a novel technique that uses laser-generated stress waves to load thin films at high strain rates and extract the fracture toughness of the film/substrate interface. The effectiveness of the technique in measuring the interface properties of metallic films has been documented in previous studies. The objective of the current effort is to model the effect of residual stresses on the dynamic delamination of thin films. Residual stresses can be high enough to affect the crack advance and the mode mixity of the delamination event, and must therefore be adequately modeled to make accurate and repeatable predictions of fracture toughness. The equivalent axial force and bending moment generated by the residual stresses are included in a dynamic, nonlinear finite element model of the delaminating film, and the impact of residual stresses on the final extent of the interfacial crack, the relative contribution of shear failure, and the deformed shape of the delaminated film is studied in detail. Another objective of the study is to develop techniques to address issues related to the testing of polymeric films. These types of films adhere well to silicon, and the resulting crack advance is often much smaller than for metallic films, making the extraction of the interface fracture toughness more difficult. The use of an inertial layer, which enhances the amount of kinetic energy trapped in the film and thus the crack advance, is examined. It is determined that the inertial layer does improve the crack advance, although in a relatively limited fashion. The high interface toughness of polymer films often causes the film to fail cohesively when the crack front leaves the weakly bonded region and enters the strong interface. The use of a tapered pre-crack region that provides a more gradual transition to the strong interface is examined. The tapered triangular pre-crack geometry is found to be effective in reducing the induced stresses, thereby making it an attractive option. We conclude by studying the impact of modifying the pre-crack geometry to enable the testing of multiple polymer films.
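For context on how residual stress enters such a model, the expressions below assume a residual stress σ_r that is uniform through the film thickness h; they are textbook thin-film results (per unit width, with the moment taken about the film/substrate interface, and the energy release rate for steady-state plane-strain delamination), not values or formulas quoted from the study:

```latex
% Equivalent axial force and moment per unit width for a uniform residual
% stress, and the classical steady-state (plane-strain) energy release rate.
N = \sigma_r h, \qquad
M = \frac{\sigma_r h^{2}}{2}, \qquad
G_{ss} = \frac{(1-\nu^{2})\,\sigma_r^{2}\, h}{2E}
```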
Abstract:
Master's in Finance
Abstract:
The issues influencing student engagement with high-stakes computer-based exams were investigated, drawing on feedback from two cohorts of international MA Education students encountering this assessment method for the first time. Qualitative data from surveys and focus groups on the students' examination experience were analysed, leading to the identification of engagement issues in the delivery of high-stakes computer-based assessments. The exam combined short-answer open-response questions with multiple-choice-style items to assess knowledge and understanding of research methods. The findings suggest that engagement with computer-based testing depends, to a lesser extent, on students' general levels of digital literacy and, to a greater extent, on their information technology (IT) proficiency for assessment and their ability to adapt their test-taking strategies, including organisational and cognitive strategies, to the online assessment environment. The socialisation and preparation of students for computer-based testing therefore emerge as key responsibilities for instructors to address, with students requesting increased opportunities for practice and training to develop the IT skills and test-taking strategies necessary to succeed in computer-based examinations. These findings and their implications in terms of instructional responsibilities form the basis of a proposal for a framework for Learner Engagement with e-Assessment Practices.