962 results for Testing of embedded cores
Abstract:
The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data-logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for the time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an £80,000 programme to implement the system proposed by the author.
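As a rough illustration of the proposed arrangement (a sketch only, not the author's implementation; the detector count, angles and synthetic beam model below are invented), a one-dimensional photodetector array swept through a single mechanical axis can assemble the two-dimensional illuminance map from which iso-lux contours are drawn:

import numpy as np

N_DETECTORS = 64     # photodetectors in the one-dimensional (vertical) array
N_STEPS = 128        # positions along the single remaining mechanical (horizontal) axis

def read_detector_array(h_angle_deg):
    """Stand-in for one hardware read-out: illuminance (lux) at every detector
    for the current horizontal angle, using a synthetic Gaussian beam."""
    v = np.linspace(-10.0, 10.0, N_DETECTORS)   # vertical angles covered by the array, degrees
    return 1000.0 * np.exp(-(h_angle_deg**2 / 50.0 + v**2 / 20.0))

# One horizontal sweep replaces the conventional point-by-point two-axis scan.
h_angles = np.linspace(-20.0, 20.0, N_STEPS)
illuminance = np.stack([read_detector_array(h) for h in h_angles])   # shape (N_STEPS, N_DETECTORS)

# Iso-lux contours are threshold sets of this two-dimensional array.
for level in (500.0, 250.0, 100.0):
    print(f"{level:>5.0f} lx contour encloses {np.count_nonzero(illuminance >= level)} grid points")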
Abstract:
The primary objective of this research was to understand what kinds of knowledge and skills people use in 'extracting' relevant information from text, and to assess the extent to which expert systems techniques could be applied to automate the process of abstracting. The approach adopted in this thesis is based on research in cognitive science, information science, psycholinguistics and textlinguistics. The study addressed the significance of domain knowledge and heuristic rules by developing an information extraction system, called INFORMEX. This system, which was implemented partly in SPITBOL and partly in PROLOG, used a set of heuristic rules to analyse five scientific papers of the expository type, to interpret their content in relation to the key abstract elements, and to extract a set of sentences recognised as relevant for abstracting purposes. The analysis of these extracts revealed that an adequate abstract could be generated. Furthermore, INFORMEX showed that a rule-based system is a suitable computational model for representing experts' knowledge and strategies. This computational technique provided the basis for a new approach to the modelling of cognition: it showed how experts tackle the task of abstracting by integrating formal knowledge with experiential learning. This thesis demonstrated that empirical and theoretical knowledge can be effectively combined in expert systems technology to provide a valuable starting point for automatic abstracting.
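A minimal sketch of the general idea of rule-based sentence extraction (INFORMEX itself was written in SPITBOL and Prolog; the cue phrases and weights below are hypothetical, not the system's actual heuristics):

import re

CUE_RULES = [
    (re.compile(r"\b(we|this paper|this study)\b", re.I), 2),        # self-reference cues
    (re.compile(r"\b(conclude|results show|we found)\b", re.I), 3),  # result cues
    (re.compile(r"\b(method|approach|procedure)\b", re.I), 1),       # method cues
]

def extract(text, threshold=2):
    """Return sentences whose cumulative rule score meets the threshold."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    selected = []
    for s in sentences:
        score = sum(weight for pattern, weight in CUE_RULES if pattern.search(s))
        if score >= threshold:
            selected.append(s)
    return selected

sample = ("The apparatus is described elsewhere. We found that yield rose by 12%. "
          "This study proposes a new method for measuring it.")
print(extract(sample))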
Abstract:
This work has concentrated on the testing of induction machines to determine their temperature rise at full load without mechanically coupling them to a load machine. The achievements of this work are outlined as follows.
1. Four distinct categories of mixed-frequency test using an inverter have been identified by the author. The simulation results of these tests, as well as of the conventional two-supply test, have been analysed in detail.
2. Experimental work on mixed-frequency tests has been carried out on a small (4 kW) squirrel-cage induction machine using a voltage-source PWM inverter. Two of the four suggested categories of test have been performed, and the temperature-rise results were found to be similar to those of a direct loading test. Further, one of the proposed categories of test has been performed on a 3.3 kW slip-ring induction machine for confirmation of the rotor values.
3. A low-current-supply mixed-frequency test rig has been proposed. For this purpose, a resonant bank was connected to the DC link of the inverter in order to maintain the exchange of power between the test machine and the resonant bank instead of between the main supply and the test machine. The resonant bank was then replaced with a special electro-mechanical energy storage unit, and the current drawn from the main power supply was thereby reduced in amplitude.
4. A variable-inertia test for full-load temperature-rise testing of induction machines has been introduced. This test is purely mechanical in nature and does not require any electrical connection of the test machine to any other machine. It has the advantage of drawing very little net power from the supply.
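The principle behind the mixed-frequency tests in points 1 and 2 can be sketched as follows (an illustration only, not the author's test procedure; the frequencies and current amplitudes are arbitrary): superimposing a second supply frequency makes the machine alternately motor and generate, so the windings carry a full-load RMS current, and therefore heat up, while little net power is drawn.

import numpy as np

f_main, f_aux = 50.0, 40.0          # Hz: main and auxiliary supply components (arbitrary)
I_main, I_aux = 6.0, 4.0            # A: current amplitudes of the two components (arbitrary)

t = np.linspace(0.0, 1.0, 100_000)  # one second of the combined current, finely sampled
i = I_main * np.sin(2 * np.pi * f_main * t) + I_aux * np.sin(2 * np.pi * f_aux * t)

# The heating is governed by the combined RMS current, which can be matched to
# the rated full-load current by choosing the two amplitudes.
i_rms = np.sqrt(np.mean(i**2))
print(f"Combined RMS current: {i_rms:.2f} A")

# The beat at (f_main - f_aux) Hz swings the machine between motoring and
# generating, so the average power drawn from the supply stays small.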
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
Software applications built on the service-oriented architecture (SOA) are increasingly popular, but testing them remains a challenge. In this paper, a framework named TASSA for testing the functional and non-functional behaviour of service-based applications is presented. The paper focuses on the concept of design-time testing, the corresponding testing approach and the architectural integration of the constituent TASSA tools. The individual TASSA tools, together with sample validation scenarios, have already been presented, along with a general view of how they relate to one another. This paper’s contribution is the structured testing approach, based on the integral use of the tools and on their architectural integration. The framework is based on SOA principles and is composable depending on user requirements.
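A generic sketch of what design-time testing of a service-based application can look like (this is not the TASSA API; the service, payloads and the response-time budget are hypothetical): a not-yet-deployed partner service is replaced by a stub, and the composition is checked against both a functional expectation and a non-functional bound.

import time

def currency_service_stub(amount, currency):
    """Stub standing in for a remote currency-conversion service."""
    rates = {"EUR": 1.0, "USD": 1.1}
    time.sleep(0.01)                      # simulated network latency
    return round(amount * rates[currency], 2)

def order_total(items, currency, convert):
    """Fragment of the composite application under design-time test."""
    return convert(sum(items), currency)

def test_order_total():
    start = time.perf_counter()
    total = order_total([10.0, 5.5], "USD", convert=currency_service_stub)
    elapsed = time.perf_counter() - start
    assert total == 17.05                 # functional expectation
    assert elapsed < 0.5                  # non-functional (response-time) budget
    print("design-time test passed:", total, f"({elapsed * 1000:.1f} ms)")

test_order_total()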
Abstract:
2000 Mathematics Subject Classification: 14B05, 32S25.
Preparation and property testing of compatibilized poly(l-lactide)/thermoplastic polyurethane blends
Abstract:
Poly(l-lactide) (PLL) has been blended with a polycaprolactone-based thermoplastic polyurethane (TPU) elastomer as a toughening agent and a poly(l-lactide-co-caprolactone) (PLLCL) copolymer as a compatibilizer. Both 2-component (PLL/TPU) and 3-component (PLL/TPU/PLLCL) blends were prepared by melt mixing, characterized, hot-pressed into thin sheets and their tensile properties tested. The results showed that, although the TPU could toughen the PLL, the blends were largely immiscible, leading to phase separation. However, addition of the PLLCL copolymer improved blend compatibility. The best all-round properties were found for the 3-component blend of composition PLL/TPU/PLLCL = 90/10/10 parts by weight.
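For reference, the stated 90/10/10 parts-by-weight composition corresponds to the following weight fractions:

\[
w_{\mathrm{PLL}} = \frac{90}{90 + 10 + 10} \approx 0.818, \qquad
w_{\mathrm{TPU}} = w_{\mathrm{PLLCL}} = \frac{10}{110} \approx 0.091 .
\]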
Abstract:
Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods, and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2. We assigned an alphanumeric code based on layers 1 and 2 to every finding in layer 3 to link the aims, methods, and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding. We also showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings of the multiple methods from ENIGMA. We subsequently tested it successfully by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
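A hypothetical illustration of the layer and coding scheme described above (the effects, methods and findings below are invented, not taken from ENIGMA or MINuET):

effects = {1: "Waiting times", 2: "Patient experience"}           # layer 1: numbered effects
methods = {"A": "Routine-data analysis", "B": "Patient survey"}   # layer 2: lettered methods

# Layer 3: each finding carries one or more <effect><method> codes, so analogous
# findings from different methods can be linked, and contradictory ones can be
# placed in adjacent rows.
findings = [
    {"codes": ["1A"], "text": "Median waiting time fell after the service change."},
    {"codes": ["1B", "2B"], "text": "Patients reported shorter waits and higher satisfaction."},
]

for finding in findings:
    labels = "; ".join(f"{effects[int(c[:-1])]} / {methods[c[-1]]}" for c in finding["codes"])
    print(f"[{', '.join(finding['codes'])}] ({labels}): {finding['text']}")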
Abstract:
Constant load, progressive load and multipass nanoscratch (nanowear) tests were carried out on 500 and 1500 nm TiN coatings on M42 steel, chosen as model systems. The influences of film thickness, coating roughness and scratch direction relative to the grinding grooves on the critical load in the progressive load test, and on the number of cycles to failure in the wear test, have been determined. Progress towards the development of a suitable methodology for determining scratch hardness from nanoscratch tests is discussed. © 2011 W. S. Maney & Son Ltd.
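For context, one commonly used definition of scratch hardness (as in ASTM G171, assuming a spherically tipped stylus whose load is borne by the leading half of the circular contact) is

\[
HS_p = \frac{8 F_N}{\pi w^2},
\]

where $F_N$ is the applied normal load and $w$ is the residual scratch width; whether and how this definition transfers to the nanoscratch regime is part of the methodological question the paper discusses.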
Abstract:
Individuals of Hispanic origin are the nation's largest minority (13.4%). Therefore, there is a need for models and methods that are culturally appropriate for mental health research with this burgeoning population. This is an especially salient issue when applying family systems theories to Hispanics, who are heavily influenced by family bonds in a way that appears to be different from the more individualistic non-Hispanic White culture. Bowen asserted that his family systems concept of differentiation of self, which values both individuality and connectedness, could be universally applied. However, there is a paucity of research systematically assessing the applicability of the differentiation of self construct in ethnic minority populations. This dissertation tested a multivariate model of differentiation of self with a Hispanic sample. The manner in which the construct of differentiation of self was being assessed, and how accurately it represented this particular ethnic minority group's functioning, was examined. Additionally, the proposed model included key contextual variables (e.g., anxiety, relationship satisfaction, attachment and acculturation-related variables) which have been shown to be related to the differentiation process. The results from structural equation modeling (SEM) analyses confirmed and extended previous research, and helped to illuminate the complex relationships between key factors that need to be considered in order to better understand individuals with this cultural background. Overall results indicated that the manner in which Hispanic individuals negotiate the boundaries of interconnectedness with a sense of individual expression appears to be different from that of their non-Hispanic White counterparts in some important ways. These findings illustrate the need for research on Hispanic individuals that provides a more culturally sensitive framework.
Abstract:
Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, The Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality as argued by feminist theories of aging. Based on the results of the study it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age. Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprised of eleven unweighted indicators which were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
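A schematic sketch of the stated index construction (the domain and indicator groupings here are placeholders, not the actual ESVI variables): indicators are normalized as z-scores, averaged without weights within each of the three domains, and the domain scores are aggregated linearly with equal weights.

import numpy as np

rng = np.random.default_rng(0)
n_persons = 1000

# Eleven indicators grouped into the three resource domains; the 4/4/3 split and
# the synthetic data are placeholders.
domains = {
    "human":    rng.normal(size=(n_persons, 4)),
    "material": rng.normal(size=(n_persons, 4)),
    "social":   rng.normal(size=(n_persons, 3)),
}

def zscore(x):
    """Normalize each indicator (column) to zero mean and unit variance."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Unweighted mean of the indicators within each domain, then equal-weight
# linear aggregation of the three domain scores.
domain_scores = {name: zscore(x).mean(axis=1) for name, x in domains.items()}
esvi = sum(domain_scores.values()) / len(domain_scores)

print("ESVI range:", round(float(esvi.min()), 2), "to", round(float(esvi.max()), 2))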
Abstract:
Ensuring the correctness of software has been the major motivation in software research, constituting a Grand Challenge. Due to its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received a lot of attention in recent years, with several methods, techniques and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets, and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures. The software architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking, and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for its correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined; it includes the evaluation of test cases, based on Petri net testing theory, for use in the testing process at the design level. Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (the SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to the testing and verification of SAM models with the aid of the SAM tool.
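A toy illustration of the kind of design-level analysis involved (this is not the SAM tool, Spin, or the dissertation's translation; the net is a standard two-process mutual-exclusion example with invented place and transition names): a small Petri net is explored exhaustively, and a simple safety property, which in LTL would be written as an always-formula, is checked over the reachable markings.

from itertools import chain

# Each transition consumes one set of places and produces another.
transitions = {
    "enter1": ({"idle1", "lock"}, {"crit1"}),
    "exit1":  ({"crit1"}, {"idle1", "lock"}),
    "enter2": ({"idle2", "lock"}, {"crit2"}),
    "exit2":  ({"crit2"}, {"idle2", "lock"}),
}
initial = frozenset({"idle1", "idle2", "lock"})

def successors(marking):
    """Fire every enabled transition once and yield the resulting markings."""
    for pre, post in transitions.values():
        if pre <= marking:
            yield frozenset((marking - pre) | post)

# Breadth-first exploration of the reachable markings.
seen, frontier = {initial}, [initial]
while frontier:
    frontier = [m for m in chain.from_iterable(map(successors, frontier)) if m not in seen]
    seen.update(frontier)

# Safety property (in LTL: always not (crit1 and crit2)), checked over all
# reachable markings.
violations = [m for m in seen if {"crit1", "crit2"} <= m]
print(f"{len(seen)} reachable markings; mutual exclusion holds: {not violations}")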
Abstract:
Stereotype threat theory proposes that the possibility of being judged in terms of a negative stereotype in a particular domain negatively affects one’s performance. The proposed mixed-methods research will investigate the influences of stereotype threat on African American third-graders in a post-No Child Left Behind environment.