946 results for software failure prediction


Relevance:

30.00%

Publisher:

Abstract:

Using current software engineering technology, the robustness required for safety-critical software is not assurable. However, different approaches are possible which can help to assure software robustness to some extent. To achieve highly reliable software, methods should be adopted which avoid introducing faults (fault avoidance); testing should then be carried out to identify any faults which persist (error removal). Finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). The verification of correctness in system design specifications and the performance analysis of the model are basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysing the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language in which interprocess interaction takes place through communication. This may lead to deadlock due to communication failure. Systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems. Where Occam is used as a design language, state-space methods such as Petri nets can be used in analysis and simulation to determine the dynamic behaviour of the software and to identify structures which may be prone to deadlock, so that they may be eliminated from the design before the program is ever run. This design tool consists of two parts. One takes an input program and translates it into a mathematical model (a Petri net), which is used for modelling and analysis of the concurrent software. The second part is a Petri-net simulator that takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies `deadlock potential', which the user can explore further. Finally, the software tool has been applied to a number of Occam programs. Two examples are presented to show how the tool works in the early design phase for fault prevention, before the program is ever run.
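The reachability-tree analysis this tool performs can be pictured with a minimal sketch. The Python fragment below is a hypothetical illustration of the general technique, not the thesis tool: it encodes a small Petri net (two processes acquiring two shared resources, a classic deadlock-prone structure analogous to a communication pathology), enumerates the reachability set breadth-first, and reports any reachable marking in which no transition is enabled, i.e., a deadlock.

```python
from collections import deque

# Toy net: two processes each acquire two shared resources in opposite
# order, the classic deadlock-prone structure (hypothetical example).
# Places: 0=R1_free, 1=R2_free, 2=A_idle, 3=A_has_R1, 4=B_idle, 5=B_has_R2.
TRANSITIONS = [
    ((2, 0), (3,)),       # A acquires R1
    ((3, 1), (2, 0, 1)),  # A acquires R2, works, releases both
    ((4, 1), (5,)),       # B acquires R2
    ((5, 0), (4, 0, 1)),  # B acquires R1, works, releases both
]

def enabled(marking, t):
    inputs, _ = TRANSITIONS[t]
    return all(marking[p] > 0 for p in inputs)

def fire(marking, t):
    inputs, outputs = TRANSITIONS[t]
    m = list(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] += 1
    return tuple(m)

def deadlock_markings(initial):
    """Breadth-first reachability analysis: returns every reachable
    marking in which no transition is enabled (a potential deadlock)."""
    seen, frontier, deadlocks = {initial}, deque([initial]), []
    while frontier:
        m = frontier.popleft()
        successors = [fire(m, t) for t in range(len(TRANSITIONS))
                      if enabled(m, t)]
        if not successors:
            deadlocks.append(m)
        for s in successors:
            if s not in seen:
                seen.add(s)
                frontier.append(s)
    return deadlocks

# Both resources free, both processes idle: the analysis finds the marking
# where A holds R1, B holds R2 and neither can proceed.
print(deadlock_markings((1, 1, 1, 0, 1, 0)))  # [(0, 0, 0, 1, 0, 1)]
```

Detecting such markings on the model, before any code runs, is precisely the kind of early fault prevention the thesis tool aims at.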

Relevance:

30.00%

Publisher:

Abstract:

Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches have been built around semiqualitative classification methods, but these are now giving way to quantitative regression methods. We review three methods: a 2D-QSAR additive partial least squares (PLS) method and a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method, which can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets; and an iterative self-consistent (ISC) PLS-based additive method, a recently developed extension to the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(s), I-E(d), and I-E(k). In this chapter we give a step-by-step guide to the prediction method and show that the resulting models represent an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online at http://www.jenner.ac.uk/MHCPred.
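For reference, the additive method that underlies two of the three reviewed approaches treats binding affinity as a sum of independent contributions; a commonly quoted general form (an illustrative sketch rather than the exact equation of the reviewed work) is:

```latex
% Additive model for a nonamer: predicted affinity is a constant plus the
% contribution P_i of the amino acid at position i, optionally extended
% with terms I_{i,i+1} for interactions between adjacent side chains.
\begin{equation}
  \log\frac{1}{IC_{50}} \;=\; c \;+\; \sum_{i=1}^{9} P_i \;+\; \sum_{i=1}^{8} I_{i,\,i+1}
\end{equation}
```

PLS is then used to estimate the contribution terms from a training set of peptides with measured IC50 values.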

Relevance:

30.00%

Publisher:

Abstract:

The economic and efficient exploitation of composite materials in critical load bearing applications relies on the ability to predict safe operational lives without excessive conservatism. Developing life prediction and monitoring techniques in these complex, inhomogeneous materials requires an understanding of the various failure mechanisms which can take place. This article describes a range of damage mechanisms which are observed in polymer, metal and ceramic matrix composites.

Relevance:

30.00%

Publisher:

Abstract:

The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes is dependent on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide-MHC binding affinity. The ISC-PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide-MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method is applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek) and includes peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited a satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms - q2, SEP, and NC - ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms r2 and SEE ranged between 0.98 and 0.995 and 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
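A minimal sketch of the ISC loop is given below, assuming hypothetical training data; it illustrates the published scheme rather than reproducing the authors' implementation. Each long class II peptide is represented by one of its 9-mer subsequences; the PLS model is fitted on the currently selected 9-mers, each peptide's best-scoring 9-mer is then reselected using that model, and the loop repeats until the selection is self-consistent.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def encode(nonamer):
    """One-hot encode a 9-mer: 9 positions x 20 amino acids = 180 features."""
    x = np.zeros(9 * 20)
    for i, aa in enumerate(nonamer):
        x[i * 20 + AMINO.index(aa)] = 1.0
    return x

def isc_pls(peptides, affinities, n_components=6, max_iter=20):
    """Iterative self-consistent PLS (sketch). `peptides` are sequences of
    length >= 9; `affinities` are log-transformed binding affinities."""
    chosen = [0] * len(peptides)          # start from each N-terminal 9-mer
    for _ in range(max_iter):
        X = np.array([encode(p[c:c + 9]) for p, c in zip(peptides, chosen)])
        model = PLSRegression(n_components=n_components)
        model.fit(X, np.array(affinities))
        # Re-select, per peptide, the 9-mer the current model scores highest.
        new = []
        for p in peptides:
            frames = [p[i:i + 9] for i in range(len(p) - 8)]
            scores = model.predict(np.array([encode(f) for f in frames]))
            new.append(int(np.argmax(scores.ravel())))
        if new == chosen:                 # converged: selection is stable
            break
        chosen = new
    return model, chosen
```

Convergence within the 4th to 17th iteration, as reported above, corresponds to the point at which `new == chosen` first holds.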

Relevance:

30.00%

Publisher:

Abstract:

Motivation: The immunogenicity of peptides depends on their ability to bind to MHC molecules. MHC binding affinity prediction methods can save significant amounts of experimental work. The class II MHC binding site is open at both ends, making epitope prediction difficult because of the ability of long peptides to bind in multiple registers. Results: An iterative self-consistent partial least squares (PLS)-based additive method was applied to a set of 66 peptides no longer than 16 amino acids binding to DRB1*0401. A regression equation containing the quantitative contributions of the amino acids at each of the nine positions was generated. Its predictivity was tested using two external test sets, which gave r_pred = 0.593 and r_pred = 0.655, respectively. Furthermore, it was benchmarked using 25 known T-cell epitopes restricted by DRB1*0401, and the results were compared with four other online predictive methods. The additive method showed the best result, finding 24 of the 25 T-cell epitopes. Availability: Peptides used in the study are available from http://www.jenner.ac.uk/JenPep. The PLS method is available commercially in the SYBYL molecular modelling software package. The final model for affinity prediction of peptides binding to the DRB1*0401 molecule is available at http://www.jenner.ac.uk/MHCPred. Models developed for DRB1*0101 and DRB1*0701 are also available in MHCPred.
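Once fitted, such a regression equation reduces prediction to a table lookup. The fragment below is a toy illustration with made-up contribution values (a real model tabulates all 20 amino acids at each of the nine positions):

```python
# Hypothetical position-wise contributions (in pIC50 units); values and
# anchor positions are invented purely for illustration.
CONST = 5.0
CONTRIB = {
    (1, "Y"): 0.42, (1, "F"): 0.31,   # anchor residues at position 1
    (6, "L"): 0.25, (6, "V"): 0.18,   # anchor residues at position 6
}

def predict_pic50(nonamer):
    """Predicted affinity = constant + sum of tabulated contributions."""
    return CONST + sum(CONTRIB.get((i + 1, aa), 0.0)
                       for i, aa in enumerate(nonamer))

print(round(predict_pic50("YAAAALAAA"), 2))  # 5.67 with this toy table
```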

Relevance:

30.00%

Publisher:

Abstract:

The nation's freeway systems are becoming increasingly congested. A major contributor to traffic congestion on freeways is traffic incidents. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary roadway capacity reduction, and they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic to avoid incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict. In addition, the duration of an incident is affected by many contributing factors. Determining and understanding these factors can help in identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, yet with limited success.

This dissertation research attempts to improve on this previous effort by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid decision making about traffic diversion in the event of an ongoing incident. Multiple data mining techniques were applied and evaluated in the research. Multiple linear regression analysis and a decision-tree-based method were applied to develop the offline models, and a rule-based method and a tree algorithm called M5P were used to develop the online models.

The results show that the models in general can achieve high prediction accuracy within acceptable time intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of District 4 FDOT for actual applications.
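The decision-tree side of such models can be sketched generically. The following fragment is not the dissertation's model: it trains scikit-learn's DecisionTreeRegressor on synthetic incident records, with hypothetical features loosely inspired by the contributing factors such studies examine, and reports accuracy within a fixed time interval of the actual duration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic incident records (all features and coefficients are invented):
# lanes blocked, vehicles involved, peak-hour flag, heavy-truck flag.
n = 2000
lanes    = rng.integers(0, 4, n)
vehicles = rng.integers(1, 5, n)
peak     = rng.integers(0, 2, n)
truck    = rng.integers(0, 2, n)
duration = (15 + 12 * lanes + 6 * vehicles + 10 * truck + 5 * peak
            + rng.normal(0, 8, n))        # minutes, with noise

X = np.column_stack([lanes, vehicles, peak, truck])
X_tr, X_te, y_tr, y_te = train_test_split(X, duration, random_state=0)

tree = DecisionTreeRegressor(max_depth=5, min_samples_leaf=20)
tree.fit(X_tr, y_tr)

# One common way to express accuracy "within acceptable time intervals":
# the share of test predictions within +/- 15 minutes of the actual value.
err = np.abs(tree.predict(X_te) - y_te)
print(f"predictions within 15 min: {(err <= 15).mean():.0%}")
```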

Relevance:

30.00%

Publisher:

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial-order models are automatically extracted from instrumented concurrent program executions, and potential atomicity-violation bugs are automatically verified based on the partial-order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method to mine Petri net models automatically from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoffs between precision and coverage. Building on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; and 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
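The single-variable focus that makes McPatom scalable can be illustrated with a simplified check, shown below. This is a generic sketch of atomicity-violation detection over an access trace, not McPatom itself: for each pair of consecutive accesses by one thread to a shared variable, it tests whether an interleaved access by another thread forms one of the classic unserializable patterns.

```python
# Unserializable patterns (local-before, remote, local-after) for a single
# shared variable, as identified in the atomicity-violation literature.
UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"),
                  ("R", "W", "W"), ("W", "R", "W")}

def potential_violations(trace):
    """`trace`: list of (thread_id, op) events on one shared variable,
    op in {"R", "W"}. Returns (i, j, k) index triples where the remote
    access at j may break the atomicity of the local block i..k."""
    found = []
    for i, (t, op1) in enumerate(trace):
        for k in range(i + 1, len(trace)):
            t2, op2 = trace[k]
            if t2 != t:
                continue
            for j in range(i + 1, k):
                tr, opr = trace[j]
                if tr != t and (op1, opr, op2) in UNSERIALIZABLE:
                    found.append((i, j, k))
            break  # only the *next* access by t closes the local block
    return found

# Thread 0 performs a read-modify-write of x; thread 1 writes in between.
print(potential_violations([(0, "R"), (1, "W"), (0, "W")]))  # [(0, 1, 2)]
```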

Relevance:

30.00%

Publisher:

Abstract:

The oil industry's need to ensure the safety of its facilities, employees and the environment, together with the search for maximum efficiency of its facilities, leads it to pursue a high level of excellence in all stages of its production processes in order to obtain the required quality of the final product. Knowing the reliability of equipment and what it represents for a system is of fundamental importance for ensuring operational safety. Reliability analysis has been increasingly applied in the oil industry as a tool for predicting faults and undesirable events that can affect business continuity. It is an applied scientific methodology that combines engineering and statistics to assess the performance of components, equipment and systems, in order to ensure that they perform their function without failure, for a given period of time and under specified conditions. The results of reliability analyses help in making decisions about the best maintenance strategy for petrochemical plants. Reliability analysis was applied to a motor-driven centrifugal fan between 2010 and 2014 at the Petrobras Guamaré industrial complex, located in the rural area of the municipality of Guamaré in the state of Rio Grande do Norte; field data were collected, the equipment history was analysed, and the behaviour of failures and their impacts was observed. The data were processed in the commercial reliability software ReliaSoft BlockSim 9. The results were compared with a study conducted by experts in the field in order to identify the best maintenance strategy for the studied system. With the results obtained from the reliability analysis tools, it was possible to determine the availability of the motor-driven centrifugal fan and its impact on the safety of the process units should it fail. A new maintenance strategy was established to improve the reliability, availability and maintainability of the motor-driven centrifugal fan and to decrease the likelihood of its failure; it comprises a series of actions to increase system reliability and, consequently, the life cycle of the asset. The strategy sets out preventive measures to reduce the probability of failure and mitigating measures aimed at minimizing its consequences.
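The core quantities such an analysis produces can be reproduced in a few lines. The sketch below uses invented failure and repair data, not the Guamaré records, and a constant-failure-rate (exponential) model; BlockSim supports richer distributions, but the structure of the calculation is the same.

```python
import numpy as np

# Hypothetical maintenance history for a motor-driven centrifugal fan:
# times between failures and times to repair, in hours (values invented).
tbf = np.array([1900.0, 2400.0, 1600.0, 3100.0, 2200.0, 2700.0])
ttr = np.array([12.0, 30.0, 18.0, 24.0, 16.0, 20.0])

mtbf = tbf.mean()                      # mean time between failures
mttr = ttr.mean()                      # mean time to repair
availability = mtbf / (mtbf + mttr)    # steady-state (inherent) availability

# With a constant failure rate, mission reliability is R(t) = exp(-t/MTBF).
mission = 720.0                        # one month of continuous operation
reliability = np.exp(-mission / mtbf)

print(f"MTBF = {mtbf:.0f} h, A = {availability:.4f}, "
      f"R(720 h) = {reliability:.3f}")
```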

Relevance:

30.00%

Publisher:

Abstract:

Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels, heat pumps, etc., as they have the potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, so does the requirement for maintenance management. The standard routine for building maintenance is inspection, which results in repairs or replacement when a fault is found. This routine leads to unnecessary inspections, which carry a cost in terms of component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques, Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes, which determine the limits according to the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump, and a set of bearings. The degradation points identified for each case study by a PF, a GP and a hybrid (PF and GP combined) DbM implementation are assessed against known degradation points. The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components which are in continuous operation; for components which have seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from the operation of BSCs.
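The GP half of the DbM methodology can be sketched with scikit-learn. The fragment below is an illustration under invented data and an invented limit, not the thesis implementation: it fits a Gaussian Process to a weekly degradation metric and invokes maintenance at the first forecast week whose lower confidence bound exceeds the limit.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Hypothetical degradation metric for a heat exchanger (e.g. a normalised
# fouling indicator) sampled weekly; it drifts upward as the unit degrades.
weeks = np.arange(0, 52, dtype=float)
metric = 0.02 * weeks + 0.15 * np.sin(weeks / 6) + rng.normal(0, 0.05, 52)

kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(weeks.reshape(-1, 1), metric)

LIMIT = 0.9                                  # degradation limit (invented)
future = np.arange(52, 104, dtype=float).reshape(-1, 1)
mean, std = gp.predict(future, return_std=True)
exceeds = mean - 2 * std > LIMIT             # conservative lower-bound test
if exceeds.any():
    week = int(future[np.argmax(exceeds)][0])
    print(f"schedule maintenance around week {week}")
else:
    print("no degradation point within the forecast horizon")
```

A PF implementation would instead propagate a population of state particles through a degradation model; the hybrid approach reported above combines the two sets of identified degradation points.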

Relevance:

30.00%

Publisher:

Abstract:

The large upfront investments required for game development pose a severe barrier for the wider uptake of serious games in education and training. Also, there is a lack of well-established methods and tools that support game developers at preserving and enhancing the games’ pedagogical effectiveness. The RAGE project, which is a Horizon 2020 funded research project on serious games, addresses these issues by making available reusable software components that aim to support the pedagogical qualities of serious games. In order to easily deploy and integrate these game components in a multitude of game engines, platforms and programming languages, RAGE has developed and validated a hybrid component-based software architecture that preserves component portability and interoperability. While a first set of software components is being developed, this paper presents selected examples to explain the overall system’s concept and its practical benefits. First, the Emotion Detection component uses the learners’ webcams for capturing their emotional states from facial expressions. Second, the Performance Statistics component is an add-on for learning analytics data processing, which allows instructors to track and inspect learners’ progress without bothering about the required statistics computations. Third, a set of language processing components accommodate the analysis of textual inputs of learners, facilitating comprehension assessment and prediction. Fourth, the Shared Data Storage component provides a technical solution for data storage - e.g. for player data or game world data - across multiple software components. The presented components are exemplary for the anticipated RAGE library, which will include up to forty reusable software components for serious gaming, addressing diverse pedagogical dimensions.
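The portability goal can be pictured with a minimal, hypothetical component contract; the sketch below illustrates the component-based idea in general and is not the actual RAGE architecture or API. The host game engine talks to every component through one narrow interface, which is what keeps components reusable across engines.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict

class GameComponent(ABC):
    """Hypothetical engine-agnostic contract: the host engine configures a
    component once, then feeds it events and consumes its results."""

    @abstractmethod
    def configure(self, settings: Dict[str, Any]) -> None: ...

    @abstractmethod
    def process(self, event: Dict[str, Any]) -> Dict[str, Any]: ...

class PerformanceStatistics(GameComponent):
    """Toy stand-in for a learning-analytics add-on: it merely counts the
    events seen per learner (purely illustrative)."""

    def configure(self, settings: Dict[str, Any]) -> None:
        self.counts: Dict[str, int] = {}

    def process(self, event: Dict[str, Any]) -> Dict[str, Any]:
        learner = event["learner_id"]
        self.counts[learner] = self.counts.get(learner, 0) + 1
        return {"learner_id": learner, "events_seen": self.counts[learner]}

stats = PerformanceStatistics()
stats.configure({})
print(stats.process({"learner_id": "a42", "type": "level_complete"}))
```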

Relevance:

30.00%

Publisher:

Abstract:

Most liquid electrolytes used in commercial lithium-ion batteries are composed of alkyl carbonate mixtures containing a lithium salt. The decomposition of these solvents by oxidation or reduction during cycling of the cell induces the generation of gases (CO2, CH4, C2H4, CO, ...), increasing the pressure in the sealed cell and causing a safety problem [1]. A prior understanding of the parameters which influence gas solubility and the vapor pressure of electrolytes, such as the structure and nature of the salt, temperature, pressure, concentration, salting effects and solvation parameters, is required to formulate safer and more suitable electrolytes, especially at high temperature.

In this work we present the CO2, CH4, C2H4 and CO solubility in different pure alkyl carbonate solvents (PC, DMC, EMC, DEC) and their binary or ternary mixtures, as well as the effect of temperature and of the lithium salt (LiPF6, LiTFSI or LiFAP) structure and concentration on these properties. Furthermore, in order to understand the parameters that influence the choice of solvent structure and the solvents' ability to dissolve gas upon the addition of a salt, we first analyzed experimentally the transport properties (self-diffusion coefficient (D), fluidity (η⁻¹), conductivity (σ) and lithium transport number (tLi)) using the Stokes-Einstein and extended Jones-Dole equations [2]. Measured data for the CO2, C2H4, CH4 and CO solubility in pure alkyl carbonates and their mixtures containing LiPF6, LiFAP or LiTFSI salt are then reported as a function of temperature and salt concentration. Based on the experimental solubility data, the Henry's law constants of the gases in these solvents and electrolytes were deduced and compared with values predicted using the COSMO-RS methodology within the COSMOthermX software. From these results, the molar thermodynamic functions of dissolution, such as the standard Gibbs energy, enthalpy and entropy, as well as the mixing enthalpy of the solvents and electrolytes with the gases in their hypothetical liquid state, were calculated and discussed [3]. Finally, the variation of CO2 solubility with salt addition was evaluated by determining specific ion parameters Hi using the Setchenov coefficients in solution. This study showed that the gas solubility is entropy driven and can be influenced by the shape, charge density and size of the anions in the lithium salt.
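The step from solubility measurements to Henry's law constants and dissolution thermodynamics can be sketched numerically. The fragment below uses invented solubility values, not the measured data of this work: it computes K_H = p/x at each temperature, the standard Gibbs energy of dissolution, and the dissolution enthalpy from a van't Hoff fit of ln K_H against 1/T.

```python
import numpy as np

R = 8.314        # gas constant, J/(mol K)
p = 1.0e5        # equilibrium partial pressure of the gas, Pa
p0 = 1.0e5       # standard pressure, Pa

# Hypothetical CO2 mole-fraction solubility in an alkyl carbonate solvent
# (values invented, with the usual trend: solubility falls as T rises).
T = np.array([283.15, 298.15, 313.15, 333.15])    # K
x = np.array([0.0125, 0.0085, 0.0061, 0.0042])    # mole fraction at p

KH = p / x                        # Henry's law constant, Pa
dG = R * T * np.log(KH / p0)      # standard Gibbs energy of dissolution

# van't Hoff: ln(KH/p0) is linear in 1/T with slope dH/R, so a least-
# squares fit yields the dissolution enthalpy (negative: exothermic).
slope, _ = np.polyfit(1.0 / T, np.log(KH / p0), 1)
dH = R * slope
dS = (dH - dG.mean()) / T.mean()  # rough average dissolution entropy

print(f"K_H(298 K) = {KH[1] / 1e5:.0f} bar, dH = {dH / 1e3:.1f} kJ/mol, "
      f"dS = {dS:.1f} J/(mol K)")
```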

References

[1] S.A. Freunberger, Y. Chen, Z. Peng, J.M. Griffin, L.J. Hardwick, F. Bardé, P. Novák, P.G. Bruce, Journal of the American Chemical Society 133 (2011) 8040-8047.

[2] P. Porion, Y.R. Dougassa, C. Tessier, L. El Ouatani, J. Jacquemin, M. Anouti, Electrochimica Acta 114 (2013) 95-104.

[3] Y.R. Dougassa, C. Tessier, L. El Ouatani, M. Anouti, J. Jacquemin, The Journal of Chemical Thermodynamics 61 (2013) 32-44.

Relevance:

30.00%

Publisher:

Abstract:

Supply Chain Simulation (SCS) is applied to acquire information to support outsourcing decisions, but obtaining enough detail in key parameters can often be a barrier to making well-informed decisions.
One aspect of SCS that has been relatively unexplored is the impact of inaccurate data around delays within the supply chain (SC). The impact of the magnitude and variability of process cycle time on typical performance indicators in an SC context is studied.
System cycle time, work-in-progress (WIP) levels and throughput are more sensitive to the magnitude of deterministic deviations in process cycle time than to variable deviations; the sketch following this summary illustrates the distinction with a minimal queue model. Manufacturing costs are not very sensitive to these deviations.
Future opportunities include investigating the impact of process failure or product defects, including logistics and transportation between SC members, and using alternative costing methodologies.
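A minimal single-stage queue model, sketched below under invented parameters, reproduces the qualitative finding; it is not the SCS model used in the study. A deterministic +10% deviation in cycle time pushes a near-saturated stage into overload and inflates WIP far more than a +/-10% random deviation around the same mean.

```python
import random

def simulate(n_jobs, cycle_time, jitter=0.0, interarrival=1.0, seed=0):
    """Single-stage queue: jobs arrive every `interarrival` time units and
    are processed one at a time; `jitter` adds a uniform +/- deviation to
    the cycle time. Returns (average WIP, throughput), WIP via Little's law."""
    rng = random.Random(seed)
    free_at = wip_area = last_done = 0.0
    for k in range(n_jobs):
        arrive = k * interarrival
        ct = cycle_time + rng.uniform(-jitter, jitter)
        start = max(arrive, free_at)
        free_at = start + ct
        wip_area += free_at - arrive   # this job's time in the system
        last_done = free_at
    return wip_area / last_done, n_jobs / last_done

base = 0.95
print(simulate(10_000, base))                      # baseline
print(simulate(10_000, base * 1.10))               # +10% deterministic shift
print(simulate(10_000, base, jitter=base * 0.10))  # +/-10% random deviation
```

With these invented settings the deterministic shift makes the stage unstable (WIP grows with the run length), while the random deviation leaves WIP close to the baseline.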

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Increased arterial stiffness is a common finding in patients with end-stage renal disease. Following creation of an arteriovenous fistula (AVF), appropriate dilation of the feeding artery must occur to facilitate AVF maturation. Arterial stiffness may impair the arterial dilation required to facilitate AVF development and contribute to subsequent failure to mature (FTM). The aim of this pilot study was to investigate the association between measurements of central and peripheral arterial stiffness, and AVF FTM.

METHODS: Patients undergoing AVF creation in a single centre (Belfast City Hospital, UK) between January and December 2015 were invited to have their carotid-femoral pulse wave velocity (PWV), brachial-radial PWV and augmentation index (AI) measured prior to AVF creation. Subsequent AVF outcomes were identified.

RESULTS: Fifty-nine patients who underwent an AVF procedure were included in the final analysis (mean age 62 years); 50.8% had diabetes mellitus. The mean pre-operative arterial diameter for all AVFs was 3.9 mm. Average values were 9.5 m/s for carotid-femoral PWV, 7.7 m/s for brachial-radial PWV and 25.6% for AI. In logistic regression analyses, these arterial stiffness parameters did not predict AVF FTM: carotid-femoral PWV (P = 0.20), brachial-radial PWV (P = 0.13), AI (P = 0.50).

CONCLUSIONS: This is the largest study to date exploring the association between arterial stiffness and AVF FTM. The measured central and peripheral arterial stiffness parameters were not associated with AVF FTM. Further research is needed to define if non-invasive arterial physiological measurements would be clinically useful in the prediction of AVF FTM.

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-07

Relevance:

30.00%

Publisher:

Abstract:

Protective factors are neglected in risk assessment in adult psychiatric and criminal justice populations. This review investigated the predictive efficacy of selected tools that assess protective factors. Five databases were searched using comprehensive terms for records up to June 2014, resulting in 17 studies (n = 2,198). Results were combined in a multilevel meta-analysis using the metafor package (Viechtbauer, Journal of Statistical Software, 2010, 36, 1) for R (R Core Team, R: A Language and Environment for Statistical Computing, Vienna, Austria: R Foundation for Statistical Computing, 2015). Prediction of outcomes was poor relative to a reference category of violent offending, with the exception of the prediction of discharge from secure units. There were no significant differences between the predictive efficacy of risk scales, protective scales, and summary judgments. Protective factor assessment may be clinically useful, but more development is required. Claims that use of these tools is therapeutically beneficial require testing.