875 results for requirement-based testing


Relevance:

30.00%

Publisher:

Abstract:

Effective interaction with personal computers is a basic requirement for many of the functions that are performed in our daily lives. With the rapid emergence of the Internet and the World Wide Web, computers have become one of the premier means of communication in our society. Unfortunately, these advances have not become equally accessible to physically handicapped individuals. In reality, a significant number of individuals with severe motor disabilities, due to a variety of causes such as Spinal Cord Injury (SCI), Amyotrophic Lateral Sclerosis (ALS), etc., may not be able to utilize the computer mouse as a vital input device for computer interaction. The purpose of this research was to further develop and improve an existing alternative input device for computer cursor control to be used by individuals with severe motor disabilities. This thesis describes the development and underlying principles of a practical hands-off human-computer interface based on Electromyogram (EMG) signals and Eye Gaze Tracking (EGT) technology, compatible with the Microsoft Windows operating system (OS). Results of the software developed in this thesis show a significant improvement in the performance and usability of the EMG/EGT cursor control HCI.

Relevance:

30.00%

Publisher:

Abstract:

Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our national highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida, was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. In addition, DWT was found to be able to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights into the design of AID for arterial street applications.
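
As a rough illustration of the evaluation metrics described above, the sketch below computes a detection rate and false alarm rate from interval-level incident flags; these per-interval definitions are a common convention and an assumption here, not necessarily the exact definitions used in the dissertation.

```python
# Hedged sketch: per-interval DR/FAR computation (definitions assumed,
# not taken from the dissertation).
def detection_metrics(predicted, actual):
    """predicted, actual: lists of 0/1 incident flags, one per detection
    interval (1 = incident declared / incident actually present)."""
    true_detections = sum(1 for p, a in zip(predicted, actual) if p and a)
    false_alarms = sum(1 for p, a in zip(predicted, actual) if p and not a)
    total_incidents = sum(actual)
    dr = true_detections / total_incidents if total_incidents else 0.0
    far = false_alarms / len(predicted) if predicted else 0.0
    return dr, far
```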

Relevance:

30.00%

Publisher:

Abstract:

Damages during extreme wind events highlight the weaknesses of mechanical fasteners at the roof-to-wall connections in residential timber frame buildings. The allowable capacity of the metal fasteners is based on results of unidirectional component testing that do not simulate realistic tri-axial aerodynamic loading effects. The first objective of this research was to simulate hurricane effects and study hurricane-structure interaction at full scale, facilitating better understanding of the combined impacts of wind, rain, and debris on inter-component connections at spatial and temporal scales. The second objective was to evaluate the performance of a non-intrusive roof-to-wall connection system using fiber reinforced polymer (FRP) materials and compare its load capacity to the capacity of an existing metal fastener under simulated aerodynamic loads. The Wall of Wind (WoW) testing, performed using FRP connections on a one-story gable-roof timber structure instrumented with a variety of sensors, was used to create a database on aerodynamic and aero-hydrodynamic loading on roof-to-wall connections tested under several parameters: angles of attack, wind-turbulence content, internal pressure conditions, and with and without the effects of rain. Based on the aerodynamic loading results obtained from WoW tests, sets of three force components (tri-axial mean loads) were combined into a series of resultant mean forces, which were used to test the FRP and metal connections in the structures laboratory up to failure. A new component testing system and test protocol were developed for testing fasteners under simulated tri-axial loading as opposed to uni-axial loading. The tri-axial and uni-axial test results were compared for hurricane clips. Also, comparison was made between the tri-axial load capacities of FRP and metal connections. The research findings demonstrate that the FRP connection is a viable option for use in timber roof-to-wall connection systems. Findings also confirm that current testing methods of mechanical fasteners tend to overestimate the actual load capacities of a connector. Additionally, the research contributes to the development of a new testing protocol for fasteners using tri-axial simultaneous loads based on the aerodynamic database obtained from the WoW testing.
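
A natural way to combine the three measured mean force components into a single resultant mean force, assuming a vector-magnitude combination (the thesis may use a different combination rule), is:

$$\bar{F}_{R} = \sqrt{\bar{F}_{x}^{2} + \bar{F}_{y}^{2} + \bar{F}_{z}^{2}},$$

where $\bar{F}_{x}$, $\bar{F}_{y}$, and $\bar{F}_{z}$ are the tri-axial mean load components measured at a roof-to-wall connection.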

Relevance:

30.00%

Publisher:

Abstract:

Adaptation is an important requirement for mobile applications due to the varying levels of resource availability that characterize mobile environments. However, without proper control, multiple applications may each adapt independently in response to a range of different adaptive stimuli, causing conflicts or suboptimal performance. This thesis presents a framework that enables multiple adaptation mechanisms to coexist on one platform. The key component of this framework is the 'Policy Server', which holds all the system policies and governs the rules for adaptation. We also simulated the framework and subjected it to various adaptation scenarios to demonstrate the behavior of the system as a whole; the simulation showed that the framework enables seamless adaptation of multiple applications.
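
A minimal sketch of the 'Policy Server' concept described above, with hypothetical class and method names (the thesis does not specify an API): a single component holds system-wide adaptation policies and arbitrates among applications that would otherwise adapt independently.

```python
# Hypothetical sketch only; names and structure are illustrative,
# not taken from the thesis.
class PolicyServer:
    def __init__(self):
        self.policies = {}  # adaptive stimulus -> ordered list of rules

    def register_policy(self, stimulus, rule):
        """Register a system policy (a callable rule) for a given stimulus,
        e.g. 'low_bandwidth' or 'low_battery'."""
        self.policies.setdefault(stimulus, []).append(rule)

    def decide(self, stimulus, app, context):
        """Return the adaptation action an application is allowed to take
        for a stimulus, or None if no policy permits an adaptation."""
        for rule in self.policies.get(stimulus, []):
            action = rule(app, context)
            if action is not None:
                return action
        return None
```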

Relevance:

30.00%

Publisher:

Abstract:

Ensuring the correctness of software has been the major motivation in software research, constituting a Grand Challenge. Due to its impact on the final implementation, one critical aspect of software is its architectural design. By guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received much attention in recent years, with several methods, techniques and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at FIU (Florida International University). This framework is based on SAM (Software Architecture Model), an ADL (Architecture Description Language) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets, and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures. The software architectures studied are expressed in SAM. For the formal verification approach, the technique applied was model checking, and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated to a model in the input language of Spin and verified for its correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined that includes the evaluation of test cases based on Petri net testing theory, to be used in the testing process at the design level. Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (SAM tool) was implemented to help support the design and analysis of SAM models. The results show the applicability of the approach to testing and verification of SAM models with the aid of the SAM tool.
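
As an illustration (not a property taken from the dissertation), a typical behavioral property verified by Spin against the translated SAM model could be expressed in linear temporal logic as

$$\Box\,(\mathit{request} \rightarrow \Diamond\,\mathit{response}),$$

that is, every request issued through a connector is eventually followed by a response.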

Relevance:

30.00%

Publisher:

Abstract:

We develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility in essay 1. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant. Specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into the information arrival component and the noise factor component. This decomposition methodology differs from previous studies in that both the informational variance and the noise variance are time-varying. Furthermore, the covariance of the informational component and the noisy component is no longer restricted to be zero. The resultant measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resultant seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
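
Schematically, and in notation assumed here rather than taken from the essays, the decomposition in essay 2 can be written as

$$\sigma_{r,t}^{2} = \sigma_{I,t}^{2} + \sigma_{N,t}^{2} + 2\,\operatorname{Cov}_{t}(I, N), \qquad \text{Informativeness}_{t} = \frac{\sigma_{I,t}^{2}}{\sigma_{r,t}^{2}},$$

where $\sigma_{I,t}^{2}$ is the time-varying informational variance, $\sigma_{N,t}^{2}$ the time-varying noise variance, and the covariance term is not restricted to zero.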

Relevance:

30.00%

Publisher:

Abstract:

Assays that assess cell-mediated immune responses performed under Good Clinical Laboratory Practice (GCLP) guidelines are required to provide specific and reproducible results. Defined validation procedures are required to establish the Standard Operating Procedure (SOP), including pass and fail criteria, as well as positivity criteria. However, little to no guidance is provided on how to perform longitudinal assessment of the key reagents utilized in the assay. Through the External Quality Assurance Program Oversight Laboratory (EQAPOL), an Interferon-gamma (IFN-γ) Enzyme-linked immunosorbent spot (ELISpot) assay proficiency testing program is administered. A limit of acceptable within-site variability was estimated after six rounds of proficiency testing (PT). Previously, a send-out-specific within-site variability limit was calculated based on the dispersion (variance/mean) of the nine replicate wells of data. Now, an overall 'dispersion limit' for within-site variability in the ELISpot PT program has been calculated as a dispersion of 3.3. The utility of this metric was assessed using a control sample to calculate within-experiment (precision) and between-experiment (accuracy) variability, to determine whether the dispersion limit could be applied to bridging studies (studies that assess lot-to-lot variation of key reagents) for comparing results obtained with new lots against results obtained with old lots. Finally, simulations were conducted to explore how this dispersion limit could guide the number of replicate wells needed for within- and between-experiment variability and the appropriate donor reactivity (number of antigen-specific cells) to use when evaluating new reagents. Our bridging-study simulations indicate that using a minimum of six replicate wells of a control donor sample, with reactivity of at least 150 spot-forming cells per well, is optimal. To determine significant lot-to-lot variation, use the 3.3 dispersion limit for between- and within-experiment variability.
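
The dispersion statistic described above (variance divided by mean of replicate-well spot counts) is simple to compute; the sketch below checks it against the 3.3 acceptance limit. The example counts and the use of the sample variance are illustrative assumptions.

```python
import numpy as np

DISPERSION_LIMIT = 3.3  # overall within-site variability limit from the PT program

def dispersion(spot_counts):
    """Dispersion (variance / mean) of spot-forming cells per replicate well."""
    counts = np.asarray(spot_counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

# Six replicate wells of a control donor sample (~150 SFC/well), example data only
wells = [152, 160, 148, 171, 158, 149]
d = dispersion(wells)
print(f"dispersion = {d:.2f}, acceptable = {d <= DISPERSION_LIMIT}")
```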

Relevance:

30.00%

Publisher:

Abstract:

Knowledge-based radiation treatment is an emerging concept in radiotherapy. It mainly refers to techniques that can guide or automate treatment planning in the clinic by learning from prior knowledge. Different models have been developed to realize it, one of which was proposed by Yuan et al. at Duke for lung IMRT planning. This model can automatically determine both beam configuration and optimization objectives with non-coplanar beams based on patient-specific anatomical information. Although plans automatically generated by this model demonstrate equivalent or better dosimetric quality compared to clinically approved plans, its validity and generality are limited due to the empirical assignment of a coefficient, called the angle spread constraint, defined in the beam efficiency index used for beam ranking. To eliminate these limitations, a systematic study of this coefficient is needed to acquire evidence for its optimal value.

To achieve this purpose, eleven lung cancer patients with complex tumor shapes, whose clinically approved plans used non-coplanar beams, were retrospectively studied within the framework of the automatic lung IMRT treatment algorithm. The primary and boost plans used in three patients were treated as different cases because of the different target sizes and shapes. A total of 14 lung cases, therefore, were re-planned using the knowledge-based automatic lung IMRT planning algorithm, varying the angle spread constraint from 0 to 1 in increments of 0.2. A modified beam angle efficiency index used to navigate the beam selection was adopted. Great effort was made to ensure that the quality of the plans associated with every angle spread constraint value was as good as possible. Important dosimetric parameters for the PTV and OARs, quantitatively reflecting plan quality, were extracted from the DVHs and analyzed as a function of the angle spread constraint for each case. These parameters were compared between clinical plans and model-based plans using two-sample Student's t-tests, and regression analysis was performed on a composite index, built from the percentage errors between dosimetric parameters in the model-based plans and those in the clinical plans, as a function of the angle spread constraint.

Results show that model-based plans generally have equivalent or better quality than clinically approved plans, both qualitatively and quantitatively. All dosimetric parameters except those for the lungs in the automatically generated plans are statistically better than or comparable to those in the clinical plans. On average, reductions of more than 15% in the conformity index and homogeneity index for the PTV and in V40 and V60 for the heart are observed, while V5 and V20 for the lungs increase by 8% and 3%, respectively. The intra-plan comparison among model-based plans demonstrates that plan quality does not change much for angle spread constraints larger than 0.4. Further examination of the variation curve of the composite index as a function of the angle spread constraint shows that 0.6 is the optimal value, resulting in statistically the best achievable plans.
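
As a sketch of how such a composite index might be evaluated for one re-planned case, the code below aggregates absolute percentage errors of dosimetric parameters between a model-based plan and the corresponding clinical plan; the parameter names, example values, and the aggregation rule are assumptions, not the thesis's exact definition.

```python
import numpy as np

def percentage_error(model_value, clinical_value):
    """Percentage error of a model-based dosimetric parameter relative
    to the corresponding clinical-plan value."""
    return 100.0 * (model_value - clinical_value) / clinical_value

def composite_index(model_params, clinical_params):
    """Aggregate absolute percentage errors across dosimetric parameters
    (e.g. PTV conformity/homogeneity indices, OAR Vx values)."""
    errors = [abs(percentage_error(model_params[k], clinical_params[k]))
              for k in clinical_params]
    return np.mean(errors)

# Example with made-up numbers for a single case and one value of the
# angle spread constraint:
clinical = {"CI_PTV": 1.10, "HI_PTV": 0.12, "V20_lung": 22.0, "V40_heart": 15.0}
model = {"CI_PTV": 1.05, "HI_PTV": 0.10, "V20_lung": 23.5, "V40_heart": 12.0}
print(f"composite index = {composite_index(model, clinical):.2f}")
```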

Relevance:

30.00%

Publisher:

Abstract:

Banked, unrelated umbilical cord blood provides access to hematopoietic stem cell transplantation for patients lacking matched bone marrow donors, yet 10% to 15% of patients experience graft failure or delayed engraftment. This may be due, at least in part, to inadequate potency of the selected cord blood unit (CBU). CBU potency is typically assessed before cryopreservation, neglecting changes in potency occurring during freezing and thawing. Colony-forming units (CFUs) have been previously shown to predict CBU potency, defined as the ability to engraft in patients by day 42 posttransplant. However, the CFU assay is difficult to standardize and requires 2 weeks to perform. Consequently, we developed a rapid multiparameter flow cytometric CBU potency assay that enumerates cells expressing high levels of the enzyme aldehyde dehydrogenase (ALDH bright [ALDH(br)]), along with viable CD45(+) or CD34(+) cell content. These measurements are made on a segment that was attached to a cryopreserved CBU. We validated the assay with prespecified criteria testing accuracy, specificity, repeatability, intermediate precision, and linearity. We then prospectively examined the correlations among ALDH(br), CD34(+), and CFU content of 3908 segments over a 5-year period. ALDH(br) (r = 0.78; 95% confidence interval [CI], 0.76-0.79), but not CD34(+) (r = 0.25; 95% CI, 0.22-0.28), was strongly correlated with CFU content as well as ALDH(br) content of the CBU. These results suggest that the ALDH(br) segment assay (based on unit characteristics measured before release) is a reliable assessment of potency that allows rapid selection and release of CBUs from the cord blood bank to the transplant center for transplantation.
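
A minimal sketch, with placeholder data, of how a correlation of the kind reported above and its 95% confidence interval can be computed (Pearson r with a Fisher z-transformation interval; this restates a standard procedure, not the paper's exact analysis).

```python
import numpy as np
from scipy import stats

# Placeholder arrays standing in for segment ALDH(br) content and CBU CFU content
rng = np.random.default_rng(0)
aldh_br = rng.lognormal(size=500)
cfu = aldh_br * rng.lognormal(sigma=0.3, size=500)

r, p = stats.pearsonr(aldh_br, cfu)          # Pearson correlation
z = np.arctanh(r)                             # Fisher z-transformation
se = 1.0 / np.sqrt(len(aldh_br) - 3)
ci = np.tanh([z - 1.96 * se, z + 1.96 * se])  # back-transformed 95% CI
print(f"r = {r:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```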

Relevance:

30.00%

Publisher:

Abstract:

Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels, heat pumps, etc., as they have potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, the requirement for maintenance management also increases. The standard routine for building maintenance is inspection, which results in repair or replacement when a fault is found. This routine leads to unnecessary inspections, which incur costs in component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation-based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques, Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes; these processes determine the limits based on the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump, and a set of bearings. The degradation points identified for each case study by a PF, a GP, and a hybrid (combined PF and GP) DbM implementation are assessed against known degradation points. The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components which are in continuous operation. For components which have seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from the operation of BSCs.
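
A minimal sketch, under assumed kernel, metric, and limit choices, of GP-based degradation tracking of the kind described above: a GP is fitted to a degradation metric over time and maintenance is suggested when the predicted metric approaches a pre-defined limit. scikit-learn is used here for illustration; it is not necessarily the implementation used in the thesis.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Synthetic degradation metric with a slow upward drift (placeholder data)
t = np.linspace(0, 100, 200).reshape(-1, 1)                 # time in days
metric = 0.01 * t.ravel() + 0.05 * np.random.randn(200)

gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(0.01),
                              normalize_y=True)
gp.fit(t, metric)

t_future = np.linspace(0, 150, 300).reshape(-1, 1)
pred, std = gp.predict(t_future, return_std=True)

limit = 1.0                                                  # assumed degradation limit
crossed = t_future[pred + 2 * std > limit]                   # conservative upper bound
if crossed.size:
    print(f"Maintenance suggested around t = {crossed[0, 0]:.1f} days")
```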

Relevance:

30.00%

Publisher:

Abstract:

The use of structural health monitoring of civil structures is ever-expanding, and by assessing the dynamic condition of structures, informed maintenance management can be conducted at both individual and network levels. With the continued growth of information age technology, the potential arises for smart monitoring systems to be integrated with civil infrastructure to provide efficient information on the condition of a structure. The focus of this thesis is the integration of smart technology with civil infrastructure for the purposes of structural health monitoring. The technologies considered in this regard are devices based on energy harvesting materials. While there has been considerable focus on the development and optimisation of such devices under steady-state loading conditions, their applications to civil infrastructure are less well known. Although research is still at an early stage, studies into such applications are very promising. Using the dynamic response of structures to a variety of loading conditions, the energy harvesting outputs from such devices are established and the potential power output is determined. Through a power-variance output approach, damage detection of deteriorating structures using the energy harvesting devices is investigated. Further applications of the integration of energy harvesting devices with civil infrastructure investigated by this research include the use of the power output as an indicator for control. Four approaches are undertaken to determine the potential applications arising from integrating smart technology with civil infrastructure, namely:

• Theoretical analysis to determine the applications of energy harvesting devices for vibration-based health monitoring of civil infrastructure.
• Laboratory experimentation to verify the performance of different energy harvesting configurations for civil infrastructure applications.
• Scaled model testing as a method to experimentally validate the integration of the energy harvesting devices with civil infrastructure.
• Full-scale deployment of an energy harvesting device on a bridge structure.

These four approaches validate the application of energy harvesting technology with civil infrastructure from a theoretical, experimental and practical perspective.
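
A hypothetical sketch of the power-variance output approach mentioned above: the variance of the harvested-power signal is computed in successive windows and compared against a baseline from the healthy structure; the window length and threshold factor are illustrative assumptions.

```python
import numpy as np

def power_variance_indicator(power, window=1024, baseline_windows=10, k=2.0):
    """Flag windows whose power variance exceeds k times the baseline variance.

    power: 1-D array of harvested-power samples;
    baseline_windows: number of initial (healthy-structure) windows used as baseline.
    """
    windows = [power[i:i + window]
               for i in range(0, len(power) - window + 1, window)]
    variances = np.array([np.var(w) for w in windows])
    baseline = variances[:baseline_windows].mean()
    return variances / baseline > k   # True where a variance shift suggests damage
```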

Relevance:

30.00%

Publisher:

Abstract:

This thesis involved the development of two biosensors and their associated assays for the detection of diseases, namely IBR and BVD for veterinary use and C1q protein as a biomarker for pancreatic cancer for medical application, using Surface Plasmon Resonance (SPR) and nanoplasmonics. SPR techniques have been used by a number of groups, both in research [1-3] and commercially [4, 5], as a diagnostic tool for the detection of various biomolecules, especially antibodies [6-8]. The biosensor market is an ever-expanding field, with new technology and new companies rapidly emerging on the market, for both human [8] and veterinary applications [9, 10]. In Chapter 2, we discuss the development of a simultaneous IBR and BVD virus assay for the detection of antibodies in bovine serum on an SPR-2 platform. Pancreatic cancer is the most lethal cancer by organ site, partially due to the lack of a reliable molecular signature for diagnostic testing. C1q protein has been recently proposed as a biomarker within a panel for the detection of pancreatic cancer. The third chapter discusses the fabrication, assays and characterisation of nanoplasmonic arrays. We discuss the development of C1q scFv antibody assays, clone screening of the antibodies, and subsequently moving the assays onto the nanoplasmonic array platform for static assays, as well as a custom hybrid benchtop system, as a diagnostic method for the detection of pancreatic cancer. Finally, in Chapter 4, we move on to Guided Mode Resonance (GMR) sensors as a low-cost option for potential use in Point-of-Care diagnostics. C1q and BVD assays used in the prior formats are transferred to this platform to ascertain its usability as a cost-effective, reliable sensor for diagnostic testing. We discuss the fabrication, characterisation and assay development, as well as their use in the benchtop hybrid system.

Relevance:

30.00%

Publisher:

Abstract:

Key life history traits such as breeding time and clutch size are frequently both heritable and under directional selection, yet many studies fail to document micro-evolutionary responses. One general explanation is that selection estimates are biased by the omission of correlated traits that have causal effects on fitness, but few valid tests of this exist. Here we show, using a quantitative genetic framework and six decades of life-history data on two free-living populations of great tits Parus major, that selection estimates for egg-laying date and clutch size are relatively unbiased. Predicted responses to selection based on the Robertson-Price Identity were similar to those based on the multivariate breeder’s equation, indicating that unmeasured covarying traits were not missing from the analysis. Changing patterns of phenotypic selection on these traits (for laying date, linked to climate change) therefore reflect changing selection on breeding values, and genetic constraints appear not to limit their independent evolution. Quantitative genetic analysis of correlational data from pedigreed populations can be a valuable complement to experimental approaches to help identify whether apparent associations between traits and fitness are biased by missing traits, and to parse the roles of direct versus indirect selection across a range of environments.
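
In standard quantitative-genetic notation (textbook form, not necessarily the symbols used in the paper), the two predicted responses being compared are

$$\Delta\bar{\mathbf{z}} = \mathbf{G}\mathbf{P}^{-1}\mathbf{s} \quad \text{(multivariate breeder's equation)}, \qquad \Delta\bar{z} = \sigma_{A}(w, z) \quad \text{(Robertson-Price identity)},$$

where $\mathbf{G}$ and $\mathbf{P}$ are the additive genetic and phenotypic covariance matrices, $\mathbf{s}$ is the vector of selection differentials, and $\sigma_{A}(w, z)$ is the additive genetic covariance between relative fitness $w$ and the trait $z$. Agreement between the two predictions is what indicates that unmeasured covarying traits are not biasing the selection estimates.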

Relevance:

30.00%

Publisher:

Abstract:

Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, the Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework, which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality as argued by feminist theories of aging. Based on the results of the study, it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age. Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprising eleven unweighted indicators that were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
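
A minimal sketch of the index construction described above: indicators are normalized as z-scores, averaged within each of three domains, and the three domain scores aggregated with equal weights. The indicator names and domain groupings a user would pass in are hypothetical placeholders, not the study's actual variables.

```python
import numpy as np

def zscore(x):
    # z-score normalization of one indicator across persons
    x = np.asarray(x, dtype=float)
    return (x - np.nanmean(x)) / np.nanstd(x)

def esvi(indicators, domains):
    """indicators: dict of indicator name -> array of values (one per person);
    domains: dict of domain name -> list of indicator names (three domains)."""
    z = {name: zscore(values) for name, values in indicators.items()}
    domain_scores = [np.mean([z[name] for name in names], axis=0)
                     for names in domains.values()]
    return np.mean(domain_scores, axis=0)   # equal weights across domains
```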