875 results for requirement-based testing
Abstract:
The unprecedented and relentless growth in the electronics industry is feeding the demand for integrated circuits (ICs) with increasing functionality and performance at minimum cost and power consumption. As predicted by Moore's law, ICs are being aggressively scaled to meet this demand. While the continuous scaling of process technology is reducing gate delays, the performance of ICs is being increasingly dominated by interconnect delays. In an effort to improve submicrometer interconnect performance, to increase packing density, and to reduce chip area and power consumption, the semiconductor industry is focusing on three-dimensional (3D) integration. However, volume production and commercial exploitation of 3D integration are not feasible yet due to significant technical hurdles.
At the present time, interposer-based 2.5D integration is emerging as a precursor to stacked 3D integration. All the dies and the interposer in a 2.5D IC must be adequately tested for product qualification. However, since the structure of 2.5D ICs is different from the traditional 2D ICs, new challenges have emerged: (1) pre-bond interposer testing, (2) lack of test access, (3) limited ability for at-speed testing, (4) high density I/O ports and interconnects, (5) reduced number of test pins, and (6) high power consumption. This research targets the above challenges and effective solutions have been developed to test both dies and the interposer.
The dissertation first introduces the basic concepts of 3D ICs and 2.5D ICs. Prior work on testing of 2.5D ICs is studied. An efficient method is presented to locate defects in a passive interposer before stacking. The proposed test architecture uses e-fuses that can be programmed to connect or disconnect functional paths inside the interposer. The concept of a die footprint is utilized for interconnect testing, and the overall assembly and test flow is described. Moreover, the concept of weighted critical area is defined and utilized to reduce test time. In order to fully determine the location of each e-fuse and the order of functional interconnects in a test path, we also present a test-path design algorithm. The proposed algorithm can generate all test paths for interconnect testing.
In order to test for opens, shorts, and interconnect delay defects in the interposer, a test architecture is proposed that is fully compatible with the IEEE 1149.1 standard and relies on an enhancement of the standard test access port (TAP) controller. To reduce test cost, a test-path design and scheduling technique is also presented that minimizes a composite cost function based on test time and the design-for-test (DfT) overhead in terms of additional through silicon vias (TSVs) and micro-bumps needed for test access. The locations of the dies on the interposer are taken into consideration in order to determine the order of dies in a test path.
To address the scenario of high density of I/O ports and interconnects, an efficient built-in self-test (BIST) technique is presented that targets the dies and the interposer interconnects. The proposed BIST architecture can be enabled by the standard TAP controller in the IEEE 1149.1 standard. The area overhead introduced by this BIST architecture is negligible; it includes two simple BIST controllers, a linear-feedback shift register (LFSR), a multiple-input signature register (MISR), and some extensions to the boundary-scan cells in the dies on the interposer. With these extensions, all boundary-scan cells can be used for self-configuration and self-diagnosis during interconnect testing. To reduce the overall test cost, a test scheduling and optimization technique under power constraints is described.
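The pattern-generation and response-compaction pair named above can be sketched in a few lines. The following is a minimal illustration only, not the dissertation's actual DfT logic; the 4-bit width and the tap positions (for the maximal-length polynomial x^4 + x^3 + 1) are arbitrary choices:

```python
def lfsr_patterns(seed, taps, width, count):
    """Fibonacci LFSR: shift left, feed back the XOR of the tapped bits."""
    state, mask, out = seed, (1 << width) - 1, []
    for _ in range(count):
        out.append(state)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & mask
    return out

def misr_signature(responses, taps, width):
    """MISR: the same shift register, but each response word is XORed in."""
    state, mask = 0, (1 << width) - 1
    for r in responses:
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = (((state << 1) | fb) ^ r) & mask
    return state

# A 4-bit maximal-length LFSR cycles through all 15 non-zero states.
patterns = lfsr_patterns(seed=1, taps=[3, 2], width=4, count=15)
golden = misr_signature(patterns, taps=[3, 2], width=4)  # fault-free signature
```

In hardware, the golden signature is computed once for the fault-free circuit; because the MISR update is linear and invertible, any single corrupted response word is guaranteed to change the final signature.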
In order to accomplish testing with a small number of test pins, the dissertation presents two efficient ExTest scheduling strategies that implement interconnect testing between tiles inside a system-on-chip (SoC) die on the interposer while satisfying the practical constraint that the number of required test pins cannot exceed the number of available pins at the chip level. The tiles in the SoC are divided into groups based on the manner in which they are interconnected. In order to minimize the test time, two optimization solutions are introduced. The first solution minimizes the number of input test pins, and the second minimizes the number of output test pins. In addition, two subgroup configuration methods are further proposed to generate subgroups inside each test group.
Finally, the dissertation presents a programmable method for shift-clock stagger assignment to reduce power supply noise during SoC die testing in 2.5D ICs. An SoC die in a 2.5D IC is typically composed of several blocks, and two neighboring blocks that share the same power rails should not be toggled at the same time during shift. Therefore, the proposed programmable method does not assign the same stagger value to neighboring blocks. The positions of all blocks are first analyzed and the shared boundary length between blocks is then calculated. Based on the position relationships between the blocks, a mathematical model is presented to derive optimal results for small-to-medium-sized problems. For larger designs, a heuristic algorithm is proposed and evaluated.
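The neighbor constraint above is, at its core, a graph-coloring condition. A minimal greedy sketch that satisfies it is shown below; the block names and adjacency are hypothetical, and the dissertation's boundary-length weighting and exact mathematical model are not reproduced here:

```python
def assign_staggers(adjacency):
    """Greedy assignment: give each block the smallest stagger value
    not already taken by a neighbour that shares its power rails.
    `adjacency` maps a block name to the list of blocks it borders."""
    stagger = {}
    # Visit the most-constrained (highest-degree) blocks first.
    for blk in sorted(adjacency, key=lambda b: -len(adjacency[b])):
        used = {stagger[n] for n in adjacency[blk] if n in stagger}
        value = 0
        while value in used:
            value += 1
        stagger[blk] = value
    return stagger

# Hypothetical floorplan: block C borders all others; A, B, D form its ring.
floorplan = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "D"], "D": ["C"]}
staggers = assign_staggers(floorplan)
```

By construction, no two rail-sharing neighbours ever receive the same stagger value, so their shift clocks never toggle simultaneously.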
In summary, the dissertation targets important design and optimization problems related to testing of interposer-based 2.5D ICs. The proposed research has led to theoretical insights, experimental results, and a set of test and design-for-test methods to make testing effective and feasible from a cost perspective.
Abstract:
Family health history (FHH) in the context of risk assessment has been shown to positively impact risk perception and behavior change. The added value of genetic risk testing is less certain. The aim of this study was to determine the impact of Type 2 Diabetes (T2D) FHH and genetic risk counseling on behavior and its cognitive precursors. Subjects were non-diabetic patients randomized to counseling that included FHH +/- T2D genetic testing. Measurements included weight, BMI, and fasting glucose at baseline and 12 months, and surveys of behavior and its cognitive precursors (T2D risk perception and control over disease development) at baseline, 3, and 12 months. 391 subjects enrolled, of which 312 completed the study. Behavioral and clinical outcomes did not differ across FHH or genetic risk, but cognitive precursors did. Higher FHH risk was associated with a stronger perceived T2D risk (pKendall < 0.001) and with a perception of "serious" risk (pKendall < 0.001). Genetic risk did not influence risk perception, but was correlated with an increase in perception of "serious" risk for moderate (pKendall = 0.04) and average FHH risk subjects (pKendall = 0.01), though not for the high FHH risk group. Perceived control over T2D risk was high and not affected by FHH or genetic risk. FHH appears to have a strong impact on cognitive precursors of behavior change, suggesting it could be leveraged to enhance risk counseling, particularly when lifestyle change is desirable. Genetic risk was able to alter perceptions about the seriousness of T2D risk in those with moderate and average FHH risk, suggesting that FHH could be used to selectively identify individuals who may benefit from genetic risk testing.
Abstract:
HIV testing has been promoted as a key HIV prevention strategy in low-resource settings, despite studies showing variable impact on risk behavior. We sought to examine rates of HIV testing and the association between testing and sexual risk behaviors in Kisumu, Kenya. Participants were interviewed about HIV testing and sexual risk behaviors. They then underwent HIV serologic testing. We found that 47% of women and 36% of men reported prior testing. Two-thirds of participants who tested HIV-positive in this study reported no prior HIV test. Women who had undergone recent testing were less likely to report high-risk behaviors than women who had never been tested; this was not seen among men. Although rates of HIV testing were higher than seen in previous studies, the majority of HIV-infected people were unaware of their status. Efforts should be made to increase HIV testing among this population.
Abstract:
When we study the variables that affect survival time, we usually estimate their effects by the Cox regression model. In biomedical research, effects of the covariates are often modified by a biomarker variable. This leads to covariate-biomarker interactions. Here the biomarker is an objective measurement of the patient characteristics at baseline. Liu et al. (2015) built a local partial likelihood bootstrap model to estimate and test this interaction effect of covariates and biomarker, but the R code developed by Liu et al. (2015) can only handle one variable and one interaction term and cannot fit the model with adjustment for nuisance variables. In this project, we expand the model to allow adjustment for nuisance variables, expand the R code to take any chosen interaction terms, and set up many parameters for users to customize their research. We also build an R package called "lplb" to integrate the complex computations into a simple interface. We conduct numerical simulations to show that the new method has excellent finite-sample properties under both the null and alternative hypotheses. We also apply the method to analyze data from a prostate cancer clinical trial with the acid phosphatase (AP) biomarker.
Testing a gravity-based accessibility instrument to engage stakeholders into integrated LUT planning
Abstract:
The paper starts from the concern that while there is a large body of literature focusing on the theoretical definitions and measurements of accessibility, the extent to which such measures are used in planning practice is less clear. Previous reviews of accessibility instruments have in fact identified a gap between the clear theoretical assumptions and the infrequent application of accessibility instruments in spatial and transport planning. In this paper we present the results of a structured workshop involving private and public stakeholders to test the usability of gravity-based accessibility measures (GraBaM) for assessing integrated land-use and transport policies. The research is part of the COST Action TU1002 “Accessibility Instruments for Planning Practice”, during which different accessibility instruments were tested on different case studies. Here we report on the empirical case study of Rome.
Abstract:
Introduction In 2007, St Luke’s Mission Hospital initiated a district-wide door-to-door HIV counselling and testing (HCT) programme in Zomba district. The intent of the programme was to provide quality HCT services to people in their homes and to effectively refer those found to be HIV positive to appropriate services. Methodology This was a cross-sectional study using a questionnaire consecutively administered to a sample of 105 counsellors who had resided in the community for a period of over one year. The questionnaire sought to establish knowledge gained, experiences, and recommendations on how the programme had been implemented, and to assist the running of similar future programmes. Data analysis was done manually using both qualitative and quantitative methodologies. Results We report that nearly 23% of the counsellors thought that during their training as door-to-door HCT counsellors they had benefited from learning to work with communities, an aspect they found to be highly applicable in the discharge of their duties. The major setbacks during the training were the lack of daily allowances, the small amount of time spent on understanding child counselling, and a training manual that was difficult to follow. Over 32% of the counsellors were satisfied with the participation of their clients during pre-test counselling sessions; however, the major challenge they faced was the misconception that they were blood suckers, a view reported by nearly 17% of the counsellors. Close to 72% reported not having met any problems during post-test counselling, compared to 24% who reported challenges. Conclusion The study has revealed that there is a need to re-examine child counselling, especially in the training of door-to-door HCT counsellors. It has also revealed a prevalent allowance culture despite the benefits of training. The common challenges were refusal of test results and failure to understand discordance.
Misconceptions may still exist in the community regarding anything dealing with removing blood. There is still a need for more information regarding discordance, especially among couples in the community.
Abstract:
Although DMTA is nowadays one of the most widely used techniques to characterize the thermo-mechanical behaviour of polymers, it is only effective for small-amplitude oscillatory tests and is limited to single-frequency analysis (linear regime). In this thesis work, a Fourier-transform-based experimental system has proven to give hints on structural and chemical changes in specimens during large-amplitude oscillatory tests by exploiting multi-frequency spectral analysis, proving to be a more sensitive tool than the classical linear approach. The test campaign focused on three test typologies: strain sweep tests, damage investigation, and temperature sweep tests.
Abstract:
The aim of the present study is to test a theory-based model of suicide in a low-risk nonclinical sample. A community sample of convenience of 200 adults, 102 men and 98 women, responded to the Depressive Experiences Questionnaire, the Center for the Epidemiologic Studies of Depression Scale, the Psychache Scale, the Interpersonal Needs Questionnaire, and the Suicide Behaviors Questionnaire Revised. The hypothesized structural equation model, including trait dimensions of self-criticism and neediness, and state dimensions of depression, psychache, perceived burdensomeness, and thwarted belongingness, fit the observed data well and significantly explained 49% of the variance of suicidality.
Abstract:
In the field of vibration qualification testing, with the popular Random Control mode of shakers, the specimen is excited by random vibrations typically set in the form of a Power Spectral Density (PSD). The corresponding signals are stationary and Gaussian, i.e. featuring a normal distribution. Conversely, real-life excitations are frequently non-Gaussian, exhibiting high peaks and/or burst signals and/or deterministic harmonic components. The so-called kurtosis is a parameter often used to statistically describe the occurrence and significance of high peak values in a random process. Since the similarity between test input profiles and real-life excitations is fundamental for qualification test reliability, some methods of kurtosis-control can be implemented to synthesize realistic (non-Gaussian) input signals. Durability tests are performed to check the resistance of a component to vibration-based fatigue damage. A procedure to synthesize test excitations which starts from measured data and preserves both the damage potential and the characteristics of the reference signals is desirable. The Fatigue Damage Spectrum (FDS) is generally used to quantify the fatigue damage potential associated with the excitation. The signal synthesized for accelerated durability tests (i.e. with a limited duration) must feature the same FDS as the reference vibration computed for the component’s expected lifetime. Current standard procedures are efficient in synthesizing signals in the form of a PSD, but prove inaccurate if reference data are non-Gaussian. This work presents novel algorithms for the synthesis of accelerated durability test profiles with prescribed FDS and a non-Gaussian distribution. An experimental campaign is conducted to validate the algorithms, by testing their accuracy, robustness, and practical effectiveness. Moreover, an original procedure is proposed for the estimation of the fatigue damage potential, aiming to minimize the computational time. 
The research is thus supposed to improve both the effectiveness and the efficiency of excitation profile synthesis for accelerated durability tests.
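As a concrete anchor for the kurtosis parameter discussed above (an illustration only, not one of the thesis's synthesis algorithms): kurtosis is the fourth standardized moment, equal to 3 for a Gaussian signal and larger when rare high peaks are present, as in real-life vibration records:

```python
import random
import statistics

def kurtosis(samples):
    """Fourth standardized moment E[(x - mu)^4] / sigma^4 (3.0 for Gaussian)."""
    mu = statistics.fmean(samples)
    var = statistics.fmean([(x - mu) ** 2 for x in samples])
    m4 = statistics.fmean([(x - mu) ** 4 for x in samples])
    return m4 / var ** 2

random.seed(42)
# A stationary Gaussian record, as produced by standard Random Control mode.
gaussian = [random.gauss(0.0, 1.0) for _ in range(50_000)]
# Inject rare high peaks to mimic a bursty, non-Gaussian field measurement.
bursty = gaussian + [12.0 if i % 2 else -12.0 for i in range(100)]
```

Here `kurtosis(gaussian)` is close to 3 while `kurtosis(bursty)` is far larger, which is exactly the discrepancy that kurtosis-controlled synthesis aims to reproduce in test input profiles.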
Abstract:
The final goal of the thesis should be a real-world application in the production test data environment. This includes pre-processing the data, building models, and visualizing the results. To this end, different machine learning models oriented toward outlier prediction should be investigated using a real dataset. Finally, the different outlier prediction algorithms should be compared and their performance discussed.
Abstract:
This work deals with the development of calibration procedures and control systems to improve the performance and efficiency of modern spark ignition turbocharged engines. The algorithms developed are used to optimize and manage the spark advance and the air-to-fuel ratio to control the knock and the exhaust gas temperature at the turbine inlet. The described work falls within the activity that the research group started in the previous years with the industrial partner Ferrari S.p.A. The first chapter deals with the development of a control-oriented engine simulator based on a neural network approach, with which the main combustion indexes can be simulated. The second chapter deals with the development of a procedure to calibrate offline the spark advance and the air-to-fuel ratio to run the engine under knock-limited conditions and with the maximum admissible exhaust gas temperature at the turbine inlet. This procedure is then converted into a model-based control system and validated with a Software in the Loop approach using the engine simulator developed in the first chapter. Finally, it is implemented in a rapid control prototyping hardware to manage the combustion in steady-state and transient operating conditions at the test bench. The third chapter deals with the study of an innovative and cheap sensor for the in-cylinder pressure measurement, which is a piezoelectric washer that can be installed between the spark plug and the engine head. The signal generated by this kind of sensor is studied, developing a specific algorithm to adjust the value of the knock index in real-time. Finally, with the engine simulator developed in the first chapter, it is demonstrated that the innovative sensor can be coupled with the control system described in the second chapter and that the performance obtained could be the same reachable with the standard in-cylinder pressure sensors.
Abstract:
Knowledge graphs and ontologies are closely related concepts in the field of knowledge representation. In recent years, knowledge graphs have gained increasing popularity and are serving as essential components in many knowledge engineering projects that view them as crucial to their success. The conceptual foundation of the knowledge graph is provided by ontologies. Ontology modeling is an iterative engineering process that consists of steps such as the elicitation and formalization of requirements, the development, testing, refactoring, and release of the ontology. The testing of the ontology is a crucial and occasionally overlooked step of the process due to the lack of integrated tools to support it. As a result of this gap in the state of the art, the testing of the ontology is completed manually, which requires a considerable amount of time and effort from the ontology engineers. The lack of tool support is noticed in the requirement elicitation process as well. In this aspect, the rise in the adoption and accessibility of knowledge graphs allows for the development and use of automated tools to assist with the elicitation of requirements from such a complementary source of data. Therefore, this doctoral research is focused on developing methods and tools that support the requirement elicitation and testing steps of an ontology engineering process. To support the testing of the ontology, we have developed XDTesting, a web application integrated with the GitHub platform that serves as an ontology testing manager. Concurrently, to support the elicitation and documentation of competency questions, we have defined and implemented RevOnt, a method to extract competency questions from knowledge graphs. Both methods are evaluated through their implementation and the results are promising.
Abstract:
Vision systems are powerful tools playing an increasingly important role in modern industry, to detect errors and maintain product standards. With the enlarged availability of affordable industrial cameras, computer vision algorithms have been increasingly applied in the monitoring of industrial manufacturing processes. Until a few years ago, industrial computer vision applications relied only on ad-hoc algorithms designed for the specific object and acquisition setup being monitored, with a strong focus on co-designing the acquisition and processing pipeline. Deep learning has overcome these limits, providing greater flexibility and faster re-configuration. In this work, the process to be inspected consists of vial pack formation entering a freeze-dryer, which is a common scenario in pharmaceutical active-ingredient packaging lines. To ensure that the machine produces proper packs, a vision system is installed at the entrance of the freeze-dryer to detect possible anomalies with execution times compatible with the production specifications. Other constraints come from the sterility and safety standards required in pharmaceutical manufacturing. This work presents an overview of the production line, with particular focus on the vision system designed, and of all the trials conducted to obtain the final performance. Transfer learning, which alleviates the requirement for a large amount of training data, combined with data augmentation methods consisting in the generation of synthetic images, was used to effectively increase performance while reducing the cost of data acquisition and annotation. The proposed vision algorithm is composed of two main subtasks, designed respectively for vial counting and discrepancy detection. The first was trained on more than 23k vials (about 300 images) and tested on 5k more (about 75 images), whereas 60 training images and 52 testing images were used for the second.