48 results for requirement-based testing


Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To determine the accuracy, acceptability and cost-effectiveness of polymerase chain reaction (PCR) and optical immunoassay (OIA) rapid tests for maternal group B streptococcal (GBS) colonisation at labour. DESIGN: A test accuracy study was used to determine the accuracy of rapid tests for GBS colonisation of women in labour. Acceptability of testing to participants was evaluated through a questionnaire administered after delivery, and acceptability to staff through focus groups. A decision-analytic model was constructed to assess the cost-effectiveness of various screening strategies. SETTING: Two large obstetric units in the UK. PARTICIPANTS: Women booked for delivery at the participating units other than those electing for a Caesarean delivery. INTERVENTIONS: Vaginal and rectal swabs were obtained at the onset of labour, and the results of vaginal and rectal PCR and OIA (index) tests were compared with the reference standard of enriched culture of combined vaginal and rectal swabs. MAIN OUTCOME MEASURES: The accuracy of the index tests, the relative accuracies of tests on vaginal and rectal swabs, and whether test accuracy varied according to the presence or absence of maternal risk factors. RESULTS: PCR was significantly more accurate than OIA for the detection of maternal GBS colonisation. Combined vaginal or rectal swab index tests were more sensitive than either test considered individually [combined swab sensitivity for PCR 84% (95% CI 79-88%); vaginal swab 58% (52-64%); rectal swab 71% (66-76%)]. The highest sensitivity for PCR came at the cost of lower specificity [combined specificity 87% (95% CI 85-89%); vaginal swab 92% (90-94%); rectal swab 92% (90-93%)]. The sensitivity and specificity of the rapid tests varied according to the presence or absence of maternal risk factors, but not consistently. PCR results were determinants of neonatal GBS colonisation, but maternal risk factors were not. Overall levels of acceptability for rapid testing amongst participants were high, and vaginal swabs were more acceptable than rectal swabs. South Asian women were least likely to have participated in the study and were less happy with the sampling procedure and with the prospect of rapid testing as part of routine care. Midwives were generally positive towards rapid testing but had concerns that it might lead to overtreatment and unnecessary interference in births. Modelling analysis revealed that the most cost-effective strategy was to provide routine intravenous antibiotic prophylaxis (IAP) to all women without screening. When this strategy, which is unlikely to be acceptable to most women and midwives, was excluded, the most cost-effective option was screening based on a culture test at 35-37 weeks' gestation, with antibiotics provided to all women who screened positive, assuming that all women in premature labour would receive IAP. The results were sensitive to very small increases in costs and changes in other assumptions. Screening using a rapid test was not cost-effective based on its current sensitivity, specificity and cost. CONCLUSIONS: Neither rapid test was sufficiently accurate to recommend it for routine use in clinical practice. IAP directed by screening with enriched culture at 35-37 weeks' gestation is likely to be the most acceptable cost-effective strategy, although it is premature to suggest the implementation of this strategy at present.
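The headline accuracy figures above are simple proportions with binomial confidence intervals. As a minimal illustration (with hypothetical counts, not the study's data), sensitivity, specificity and a 95% Wilson score interval can be computed as follows:

```python
# A minimal sketch, not the study's actual analysis: accuracy statistics
# from a 2x2 table of index-test results against the reference culture.
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a proportion."""
    p = successes / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = (z / (1 + z**2 / n)) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical counts for illustration only: true/false positives/negatives.
tp, fn, tn, fp = 168, 32, 870, 130

sensitivity = tp / (tp + fn)   # 0.84, matching the reported combined-swab PCR value
specificity = tn / (tn + fp)   # 0.87
print(f"sensitivity {sensitivity:.2f}, 95% CI {wilson_ci(tp, tp + fn)}")
print(f"specificity {specificity:.2f}, 95% CI {wilson_ci(tn, tn + fp)}")
```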

Relevance:

30.00%

Publisher:

Abstract:

Structural Health Monitoring (SHM) ensures the structural health and safety of critical structures across a wide range of application areas. This thesis presents novel, low-cost, high-performance fibre Bragg grating (FBG) based systems for the detection of Acoustic Emission (AE) in aircraft structures, which is a part of SHM. Importantly, a key aim during the design of these systems was to produce systems sufficiently small to install in an aircraft for lifetime monitoring. Two important techniques for monitoring high-frequency AE were developed as part of this research: the quadrature recombination technique and the active tracking technique. The active tracking technique was used extensively and was further developed to overcome limitations observed while testing it at several test facilities and with different optical fibre sensors. This system eliminates any low-frequency spectral shift due to environmental perturbation and keeps the sensor working at its optimum operating point at all times, which is highly desirable in harsh industrial and operationally active environments. Experimental work carried out in the laboratory has shown that such systems can be used for high-frequency detection and can detect signals up to 600 kHz; however, the frequency range depends upon the requirements and design of the interrogation system, which can be altered accordingly for different applications. Several optical fibre configurations for wavelength detection were designed during the course of this work in collaboration with industrial partners. Fibre Bragg grating Fabry-Perot (FBG-FP) sensors were shown experimentally to offer higher sensitivity and usability with such systems than uniform FBGs. The author is confident that further research will lead to the development of a commercially marketable product, and that the use of active tracking systems can be extended to areas such as healthcare and civil infrastructure monitoring. Finally, the AE detection system was developed to aerospace requirements and was tested on A350 testing panels at the NDT & Testing Technology test facility based at Airbus, Filton, UK.
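The core idea of active tracking is a slow servo that cancels environmental drift of the sensor's operating point while leaving high-frequency AE signals untouched. The following is a conceptual sketch only, not the thesis's implementation; the sampling rate, integrator gain and signal values are assumptions:

```python
# Conceptual sketch of an active tracking loop: a slow integrator servo
# suppresses low-frequency drift while a high-frequency AE burst passes.
import numpy as np

fs = 2_000_000                       # 2 MHz sampling rate (assumed)
t = np.arange(0, 0.05, 1 / fs)

drift = 0.5 * np.sin(2 * np.pi * 3 * t)                               # slow 3 Hz thermal drift
ae = 0.05 * np.sin(2 * np.pi * 300e3 * t) * (t > 0.02) * (t < 0.021)  # 300 kHz AE burst

setpoint, correction, ki = 0.0, 0.0, 200.0   # integrator gain (assumed)
out = np.empty_like(t)
for i, x in enumerate(drift + ae):
    measured = x - correction                      # detector output after servo
    correction += ki * (measured - setpoint) / fs  # slow integral action (~30 Hz cutoff)
    out[i] = measured

# 'out' retains the 300 kHz burst while the 3 Hz drift is suppressed,
# holding the sensor at its optimum (quadrature) operating point.
```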

Relevance:

30.00%

Publisher:

Abstract:

This thesis contributes to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is exiguous data on the performance of CDC architectures in a real-time environment, and performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesise the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results of our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places a minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
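The capture latency definition above reduces to a difference of two timestamps. A minimal sketch of how such a measurement might be wired into a benchmark harness (the hooks and names are hypothetical, not the thesis's actual TPC-C extension):

```python
# Capture latency = time the CDC component emits a change minus the time
# that change was committed to the OLTP database.
import time

def committed(change_id: str) -> float:
    """Record the commit timestamp for a change (hypothetical hook)."""
    return time.monotonic()

def capture_latency(commit_ts: float) -> float:
    """Called when the CDC pipeline (pull poller or push DAS) emits the change."""
    return time.monotonic() - commit_ts

ts = committed("order-42")
# ... change propagates through the CDC pipeline ...
print(f"capture latency: {capture_latency(ts) * 1000:.2f} ms")
# Reported alongside the TPC-C transaction rate to compare pull vs push CDC.
```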

Relevance:

30.00%

Publisher:

Abstract:

The focus of this study is the development of parallelised versions of severely sequential, iterative numerical algorithms on a multi-threaded parallel platform such as a graphics processing unit (GPU). This requires the design and development of a platform-specific numerical solution that can benefit from the parallel capabilities of the chosen platform. The GPU was chosen as the parallel platform for the design and development of a numerical solution for a specific physical model in non-linear optics. This problem arises in describing ultra-short pulse propagation in bulk transparent media, which has recently been the subject of several theoretical and numerical studies. The mathematical model describing this phenomenon is a challenging and complex problem, and its numerical modelling is limited on current workstations. Numerical modelling of this problem requires the parallelisation of essentially serial algorithms and the elimination of numerical bottlenecks. The main challenge to overcome is the parallelisation of the globally non-local mathematical model. This thesis presents a numerical solution that eliminates the numerical bottleneck associated with the non-local nature of the mathematical model. The accuracy and performance of the parallel code are verified by back-to-back testing against a similar serial version.
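Back-to-back testing means marching the serial reference and the parallelised implementation on identical inputs and requiring agreement within floating-point tolerance at every step. A minimal sketch under assumed stand-in solvers (the actual propagation equations are not in the abstract):

```python
# Back-to-back test harness: both code paths evolve the same field and
# must agree to tight tolerance at each iteration.
import numpy as np

def step_serial(field: np.ndarray) -> np.ndarray:
    # stand-in for one iteration of the serial reference solver
    return np.fft.ifft(np.fft.fft(field) * np.exp(-0.01j))

def step_parallel(field: np.ndarray) -> np.ndarray:
    # stand-in for the GPU code path; same mathematics, different routine
    return np.fft.ifft(np.exp(-0.01j) * np.fft.fft(field))

rng = np.random.default_rng(0)
field = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)
for _ in range(100):                     # march both solvers together
    field_s, field_p = step_serial(field), step_parallel(field)
    assert np.allclose(field_s, field_p, rtol=1e-10)
    field = field_s
print("back-to-back test passed")
```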

Relevance:

30.00%

Publisher:

Abstract:

Lock-in is observed in real-world markets of experience goods: goods whose characteristics are difficult to determine in advance but are ascertained upon consumption. We create an agent-based simulation of consumers choosing between two experience goods available in a virtual market, modelling consumers on a grid that represents their spatial network. Using simple assumptions, including identical distributions of product experience and a degree of follower tendency among consumers, we explore the dynamics of the model through simulations. We conduct simulations to create a lock-in before testing several hypotheses on how to break an existing lock-in, including the effect of advertising and free give-aways. Our experiments show that the key to successfully breaking a lock-in was the creation of regions in the consumer population. Regions arise from the degree of local conformity between agents within them, and spread throughout the population when a mildly superior competitor is available. These regions may be likened to a niche in a market, which gains in popularity and transitions into the mainstream.
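A toy version of such a model (assumptions mine: grid size, update rule and follower weight are illustrative, not the paper's calibration) shows how follower tendency plus identically distributed experience produces regional conformity and lock-in:

```python
# Consumers on a lattice pick product A or B from their own experience
# draw blended with the share of neighbours currently choosing B.
import numpy as np

rng = np.random.default_rng(1)
N, follower_weight, steps = 50, 0.7, 200
choice = rng.integers(0, 2, size=(N, N))          # 0 = product A, 1 = product B

def neighbour_share(c: np.ndarray) -> np.ndarray:
    """Fraction of the four lattice neighbours currently choosing B."""
    return (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
            np.roll(c, 1, 1) + np.roll(c, -1, 1)) / 4.0

for _ in range(steps):
    experience = rng.random((N, N))               # identically distributed for A and B
    social = neighbour_share(choice)              # local conformity pressure
    score_b = follower_weight * social + (1 - follower_weight) * experience
    choice = (score_b > 0.5).astype(int)

print(f"share choosing B after lock-in dynamics: {choice.mean():.2f}")
```

With a high follower weight, local majorities reinforce themselves into stable regions, which is the mechanism the paper identifies as both creating and breaking lock-in.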

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes an integrative framework for a more thorough and robust analysis of the linkage between Human Resource Management (HRM) and business performance. To provide the basis for the proposed framework, the core aspects of the main HRM models predicting business performance are first analysed. The framework incorporates both the principle of mediation (i.e. HRM outcomes mediate the relationship between organisational strategies and business performance) and the perspective of simultaneity in firms' decision-making with regard to business strategies and HRM policies. To test this framework empirically, the methodological approach of structural equation modelling is employed. The empirical research is based on a sample of 178 organisations operating in the Greek manufacturing sector. The paper concludes that both the mediation principle and the simultaneity perspective are supported, further emphasising the positive role of HRM outcomes in organisational performance.
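The mediation principle implies an indirect effect: the product of the strategy-to-HRM-outcomes path and the HRM-outcomes-to-performance path. A simplified regression-based illustration on synthetic data (not the paper's SEM, and the coefficients are invented):

```python
# Mediation logic: indirect effect = a*b, where a is the strategy -> HRM
# outcomes path and b is the HRM outcomes -> performance path.
import numpy as np

rng = np.random.default_rng(2)
n = 178                                   # same sample size as the study
strategy = rng.standard_normal(n)
hrm_outcomes = 0.6 * strategy + rng.standard_normal(n)               # path a
performance = 0.5 * hrm_outcomes + 0.1 * strategy + rng.standard_normal(n)

def slope(x: np.ndarray, y: np.ndarray) -> float:
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

a = slope(strategy, hrm_outcomes)
X = np.column_stack([np.ones(n), hrm_outcomes, strategy])
b, c_prime = np.linalg.lstsq(X, performance, rcond=None)[0][1:]
print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```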

Relevance:

30.00%

Publisher:

Abstract:

Many papers claim that a Log Periodic Power Law (LPPL) model fitted to financial market bubbles that precede large market falls or 'crashes' contains parameters that are confined within certain ranges. Further, it is claimed that the underlying model is based on influence percolation and a martingale condition. This paper examines these claims and their validity for capturing large price falls in the Hang Seng stock market index over the period 1970 to 2008. The fitted LPPLs have parameter values within the ranges specified post hoc by Johansen and Sornette (2001) for only seven of the 11 crashes in this period. Interestingly, the LPPL fit could have predicted the substantial fall in the Hang Seng index during the recent global downturn. Overall, the mechanism posited as underlying the LPPL model does not in fact underlie it, and the data used to support the fit of the LPPL model to bubbles do so only partially.
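For reference, the LPPL functional form fitted to log prices in this literature is reproduced below; the parameter values in the sketch are placeholders, not fitted Hang Seng values:

```python
# Log-periodic power law for log prices, t < tc (critical time):
# ln p(t) = A + B*(tc-t)^m + C*(tc-t)^m * cos(omega*ln(tc-t) + phi)
import numpy as np

def lppl(t, tc, A, B, C, m, omega, phi):
    dt = tc - t
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) + phi)

t = np.linspace(0, 9.9, 500)          # time axis with critical time tc = 10
log_price = lppl(t, tc=10.0, A=7.0, B=-0.8, C=0.1, m=0.5, omega=8.0, phi=0.0)
# B < 0 with 0 < m < 1 gives the accelerating super-exponential growth,
# decorated by log-periodic oscillations, on which the crash-prediction
# claims (and the post hoc parameter ranges) rest.
```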

Relevance:

30.00%

Publisher:

Abstract:

This paper is a cross-national study testing a framework relating cultural descriptive norms to entrepreneurship in a sample of 40 nations. Based on data from the Global Leadership and Organizational Behavior Effectiveness project, we identify two higher-order dimensions of culture – socially supportive culture (SSC) and performance-based culture (PBC) – and relate them to entrepreneurship rates and associated supply-side and demand-side variables available from the Global Entrepreneurship Monitor. The findings provide strong support for a social capital/SSC and supply-side explanation of entrepreneurship rates. PBC predicts demand-side variables, such as opportunity existence and the quality of formal institutions to support entrepreneurship.

Relevance:

30.00%

Publisher:

Abstract:

Aston University has been working closely with key companies from within the electricity industry for several years, initially in the development and delivery of an employer-led foundation degree programme in electrical power engineering and, more recently, in the development of a progression pathway for foundation degree graduates to achieve a Bachelors-level qualification. The Electrical Power Engineering foundation degree was developed in close consultation with the industry, such that the programme is essentially owned by the sector. Programme delivery has required significant shifts away from traditional HE teaching patterns while maintaining the quality requirements and without compromising the academic standard of the degree. Block teaching (two-week slots), partnership delivery, off-site student support and work-based learning have all presented challenges as we have sought to maximise the student learning experience and to ensure that graduates are fit for purpose and "hit the ground running" within a defined career structure at sponsoring companies. This paper outlines the skills challenges facing the sector and describes programme developments and delivery challenges, before offering observations and conclusions on programme effectiveness, the impact of foundation degree graduates in the workplace, and the significance of the close working relationship with key sponsoring companies.

Relevance:

30.00%

Publisher:

Abstract:

Background - Modelling the interaction between potentially antigenic peptides and Major Histocompatibility Complex (MHC) molecules is a key step in identifying potential T-cell epitopes. For Class II MHC alleles, the binding groove is open at both ends, causing ambiguity in the positional alignment between the groove and the peptide, as well as uncertainty as to which parts of the peptide interact with the MHC. Moreover, the antigenic peptides have variable lengths, making naive modelling methods difficult to apply. This paper introduces a kernel method that can handle variable-length peptides effectively by quantifying similarities between peptide sequences and integrating these into the kernel. Results - The kernel approach presented here shows increased prediction accuracy, with a significantly higher number of true positives and negatives, on multiple MHC Class II alleles when tested on data sets from MHCPEP [1], MHCBN [2], and MHCBench [3]. Evaluation by cross-validation, when segregating binders and non-binders, produced an average AROC of 0.824 for the MHCBench data sets (up from 0.756), and an average AROC of 0.96 for multiple alleles of the MHCPEP database. Conclusion - The method improves performance over existing state-of-the-art methods of MHC Class II peptide binding prediction by using a custom, knowledge-based representation of peptides. Similarity scores, in contrast to a fixed-length, pocket-specific representation of amino acids, provide a flexible and powerful way of modelling MHC binding, and can easily be applied to other dynamic sequence problems.
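As a simplified stand-in for the paper's knowledge-based kernel, the sketch below shows the general idea of comparing variable-length peptides without a fixed alignment; here a normalised 3-mer spectrum kernel counts shared subsequences (the real method uses custom similarity scores rather than exact k-mer matches):

```python
# Toy alignment-free kernel over variable-length peptide strings.
from collections import Counter
from math import sqrt

def spectrum(peptide: str, k: int = 3) -> Counter:
    """Counts of all length-k subsequences (k-mers) in the peptide."""
    return Counter(peptide[i:i + k] for i in range(len(peptide) - k + 1))

def kernel(p1: str, p2: str, k: int = 3) -> float:
    """Cosine-normalised dot product of the two k-mer count vectors."""
    s1, s2 = spectrum(p1, k), spectrum(p2, k)
    dot = sum(s1[m] * s2[m] for m in s1)
    norm = sqrt(sum(v * v for v in s1.values()) * sum(v * v for v in s2.values()))
    return dot / norm if norm else 0.0

# Variable-length peptides can now be fed to any kernel machine (e.g. an SVM).
print(kernel("GILGFVFTL", "GILGFVFTLTV"))   # hypothetical 9-mer vs 11-mer
```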

Relevance:

30.00%

Publisher:

Abstract:

This work investigates the adhesive/cohesive molecular and physical interactions, together with the nanoscopic features, of the commonly used orally disintegrating tablet (ODT) excipients microcrystalline cellulose (MCC) and D-mannitol. This helps to elucidate the underlying physico-chemical and mechanical mechanisms responsible for powder densification and optimum product functionality. Atomic force microscopy (AFM) contact-mode analysis was performed to measure nano-adhesion forces and surface energies between excipient-drug particle pairs (6-10 different particles per pair). Moreover, surface topography images (100 nm² to 10 μm²) and roughness data were acquired in AFM tapping mode. The AFM data were related to macro/microscopic ODT properties obtained from SEM, FTIR, XRD, thermal analysis using DSC and TGA, disintegration testing, and Heckel and tabletability profiles. The results showed a good association between the adhesive molecular and physical forces of paired particles and the densification mechanisms responsible for the mechanical strength of tablets. The micro-roughness of MCC was three times that of D-mannitol, which explains the high hardness of MCC ODTs through mechanical interlocking. Hydrogen bonding between MCC particles could not be established from either the AFM or the FTIR solid-state investigations. By contrast, D-mannitol produced fragile ODTs owing to fragmentation of surface crystallites during compression, a consequence of its weak crystal structure. Furthermore, AFM analysis showed the presence of extensive microfibril structures harbouring nano-pores, which further supports the use of MCC as a disintegrant. Overall, the excipients (and model drugs) showed mechanistic behaviour on the nano/micro scale that could be related to the functionality of the materials on the macro scale.
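Converting an AFM pull-off (adhesion) force into a surface energy requires a contact-mechanics model; the abstract does not state which was used, so the sketch below simply shows the two standard options (DMT and JKR) with illustrative numbers:

```python
# Work of adhesion W from a measured AFM pull-off force F and tip radius R.
from math import pi

R = 20e-9    # AFM tip radius in metres (hypothetical)
F = 5e-9     # measured pull-off force in newtons (hypothetical)

W_dmt = F / (2 * pi * R)    # DMT model: F = 2*pi*R*W (stiff, weakly adhesive contacts)
W_jkr = F / (1.5 * pi * R)  # JKR model: F = (3/2)*pi*R*W (compliant, adhesive contacts)
print(f"work of adhesion: DMT {W_dmt * 1e3:.1f} mJ/m^2, JKR {W_jkr * 1e3:.1f} mJ/m^2")
```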

Relevance:

30.00%

Publisher:

Abstract:

The behaviour of self-adaptive systems can be emergent, which means that the system's behaviour may be unexpected to its customers and its developers. Therefore, a self-adaptive system needs to garner confidence from its customers, and it also needs to resolve any surprise on the part of the developer during testing and maintenance. We believe that these two functions can only be achieved if a self-adaptive system is also capable of self-explanation. We argue that a self-adaptive system's behaviour needs to be explained in terms of the satisfaction of its requirements. Since self-adaptive system requirements may themselves be emergent, we propose the use of goal-based requirements models at runtime to offer self-explanation of how a system is meeting its requirements. We demonstrate the analysis of runtime requirements models to yield a self-explanation codified in a domain-specific language, and discuss possible future work.
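A minimal sketch of the idea (the goal structure and DSL-like output are assumptions, not the paper's notation): each goal in a runtime model carries its satisfaction state, and an explanation is produced by walking the goal tree:

```python
# Self-explanation from a runtime goal model: report which goals are
# (not) satisfied, so an adaptation can be explained in requirements terms.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    satisfied: bool
    subgoals: list["Goal"] = field(default_factory=list)

    def explain(self, depth: int = 0) -> str:
        status = "satisfied" if self.satisfied else "NOT satisfied"
        lines = [f"{'  ' * depth}{self.name}: {status}"]
        for g in self.subgoals:
            lines.append(g.explain(depth + 1))
        return "\n".join(lines)

# Hypothetical runtime model: the root goal stays satisfied because the
# system adapted from a failed subgoal to an alternative.
root = Goal("MaintainResponseTime", True, [
    Goal("UsePrimaryServer", False),       # failed at runtime
    Goal("FailoverToReplica", True),       # adaptation chosen instead
])
print(root.explain())
```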

Relevance:

30.00%

Publisher:

Abstract:

While the literature has suggested the possibility that breach is composed of multiple facets, no previous study has investigated this possibility empirically. This study examined the factor structure of typical component forms in order to develop a multiple-component-form measure of breach. Two studies were conducted. In study 1 (N = 420), multi-item measures based on causal indicators representing promissory obligations were developed for the five potential component forms (delay, magnitude, type/form, inequity and reciprocal imbalance). Exploratory factor analysis showed that the five components loaded onto one higher-order factor, namely psychological contract breach, suggesting that breach is composed of different aspects rather than distinct types of breach. Confirmatory factor analysis provided further evidence for the proposed model. In addition, the model achieved high construct reliability and showed good construct, convergent, discriminant and predictive validity. Study 2 (N = 189), conducted to validate the study 1 results, compared the multiple-component measure with an established multi-item measure of breach (rather than a single item, as in study 1) and also tested for discriminant validity against an established multi-item measure of violation. The findings replicated those of study 1, and have important implications for considering alternative, more comprehensive and elaborate ways of assessing breach.
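Construct reliability for a multi-item scale is commonly assessed with statistics such as Cronbach's alpha or composite reliability; the abstract does not say which was used, so the following is only a generic numeric illustration of the former on synthetic data:

```python
# Cronbach's alpha for a k-item scale:
# alpha = k/(k-1) * (1 - sum(item variances) / variance of the scale total)
import numpy as np

rng = np.random.default_rng(3)
n, k = 420, 5                               # respondents (study 1's N), items
latent = rng.standard_normal(n)             # underlying breach perception
items = latent[:, None] + 0.8 * rng.standard_normal((n, k))   # 5 noisy items

item_var = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")    # high values indicate reliability
```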

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an effective decision-making system for leak detection based on multiple generalized linear models and clustering techniques. The training data for the proposed decision system were obtained from an experimental, fully operational pipeline distribution system equipped with data logging for three variables: inlet pressure, outlet pressure and outlet flow. The experimental setup was designed so that multiple operational conditions of the distribution system, including multiple pressures and flows, could be obtained. We then showed statistically that the pressure and flow variables can be used as signatures of a leak under the designed multi-operational conditions. We further show that detecting leakages by training and testing the proposed multi-model decision system with prior data clustering, under multi-operational conditions, produces better recognition rates than training based on a single-model approach. The decision system is then equipped with the estimation of confidence limits, and a method is proposed for using these confidence limits to obtain more robust leakage recognition results.
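A condensed sketch of the cluster-then-model idea (assumptions mine: simple least-squares models stand in for the paper's generalized linear models, and the cluster labels are given rather than learned):

```python
# Fit one linear model per operating-condition cluster relating pressures
# to flow; flag a leak when the residual leaves the cluster's confidence band.
import numpy as np

def fit_cluster_models(X, y, labels):
    """Per-cluster least-squares model and residual spread."""
    models = {}
    for c in np.unique(labels):
        idx = labels == c
        A = np.column_stack([np.ones(idx.sum()), X[idx]])
        coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
        models[c] = (coef, (y[idx] - A @ coef).std(ddof=1))
    return models

def is_leak(x, y_obs, cluster, models, z=3.0):
    """Leak if observed flow departs from the model's confidence limits."""
    coef, sigma = models[cluster]
    return abs(y_obs - (coef[0] + x @ coef[1:])) > z * sigma

# Synthetic data: columns of X = (inlet pressure, outlet pressure), y = flow.
rng = np.random.default_rng(4)
X = rng.uniform(1, 3, size=(200, 2))
labels = (X[:, 0] > 2).astype(int)          # two operating conditions
y = 2 * X[:, 0] - X[:, 1] + 0.05 * rng.standard_normal(200)
models = fit_cluster_models(X, y, labels)
print(is_leak(np.array([2.5, 1.0]), 3.0, 1, models))   # flow deficit -> True
```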

Relevance:

30.00%

Publisher:

Abstract:

We perform numerical simulations on a model describing a Brillouin-based temperature and strain sensor, testing its response when it is probed with relatively short pulses. Recently published experimental results [e.g., Opt. Lett. 24, 510 (1999)] showed a broadening of the Brillouin loss curve as the probe pulse duration is reduced, followed by a sudden and rather surprising reduction of the linewidth when the pulse duration becomes shorter than the acoustic relaxation time. Our study reveals the processes responsible for this behaviour and gives a clear physical insight into the problem, allowing us to define the best experimental conditions for taking advantage of this effect.
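The broadening part of this behaviour can be illustrated with a crude spectral picture (assumptions mine: a typical ~35 MHz intrinsic linewidth and a square probe pulse); the measured loss curve is roughly the intrinsic Lorentzian convolved with the pulse's own spectrum, which widens as the pulse shortens. Note this simple picture does not capture the sudden re-narrowing below the acoustic relaxation time, which is the transient effect the paper analyses:

```python
# Measured Brillouin loss linewidth vs probe pulse duration, modelled as a
# convolution of the intrinsic Lorentzian with the pulse power spectrum.
import numpy as np

f = np.linspace(-500e6, 500e6, 4001)          # detuning grid, Hz
dnu_B = 35e6                                  # intrinsic linewidth (typical value)
lorentzian = (dnu_B / 2)**2 / (f**2 + (dnu_B / 2)**2)

def measured_width(pulse_ns: float) -> float:
    T = pulse_ns * 1e-9
    pulse_spec = np.sinc(f * T) ** 2          # power spectrum of a square pulse
    spec = np.convolve(lorentzian, pulse_spec, mode="same")
    half = spec >= spec.max() / 2
    return (f[half][-1] - f[half][0]) / 1e6   # approximate FWHM in MHz

for tau in (100, 30, 10):                     # probe pulse durations in ns
    print(f"{tau:4d} ns pulse -> measured linewidth ~{measured_width(tau):.0f} MHz")
```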