933 results for Concurrent exception handling
Abstract:
This thesis has sought to investigate disinfection agents and procedures which may provide sanitisation against bacterial spores. A hard-surface disinfection test method was designed to ascertain which combinations of biocide and application method were most effective against bacterial spores. A combination of spraying and wiping was the most effective method of disinfection against Bacillus spores, with wiping found to play a key role in spore removal. The most efficacious of the biocides investigated was 6% hydrogen peroxide. Vaporised hydrogen peroxide (VHP) gassing was more effective than traditional disinfection. In addition to efficacy, the toxic potential of the biocides to human airway epithelial cells in vitro was evaluated. Toxicity against human bronchial and nasal epithelial cells was assessed by determining cell viability, inflammatory status, protein oxidation and epithelial cell layer integrity. In addition, the cell death mechanism following biocide exposure was investigated. There was a decrease in viable cells following exposure to all biocides when applied at practical concentrations. Almost all of the biocides tested elicited a pro-inflammatory response from the cells, as measured by IL-8 production. All biocides increased protein oxidation, as measured by thiol and carbonyl levels. Measurement of transepithelial electrical resistance and paracellular permeability indicated a biocide-dependent decrease in epithelial cell barrier function. The cellular response was biased towards necrotic rather than apoptotic death. The use of biocides, although efficacious to some extent against Bacillus spores, will require careful monitoring for adverse health effects on personnel.
Abstract:
Research on the drivers of satisfaction with complaint handling (SATCOM) underlines the importance of procedural, relational, and interactional justice (Orsingher, Valentini, & de Angelis, 2010). Since these SATCOM studies are largely conducted in business-to-consumer (B2C) markets, it is unclear what drives SATCOM in business-to-business (B2B) markets. We therefore replicate the justice model in an industrial context and find significant differences for procedural justice and interactional justice but not for distributive justice. While distributive justice is equally important in both contexts, procedural justice is more important in B2B markets, whereas interactional justice drives SATCOM only in B2C markets. © 2013 Elsevier B.V.
Abstract:
Fluctuations of liquids at the scales where the hydrodynamic and atomistic descriptions overlap are considered. The importance of these fluctuations for atomistic motions is discussed and examples of their accurate modelling with a multi-space-time-scale fluctuating hydrodynamics scheme are provided. To resolve microscopic details of liquid systems, including biomolecular solutions, together with macroscopic fluctuations in space-time, a novel hybrid atomistic-fluctuating hydrodynamics approach is introduced. For a smooth transition between the atomistic and continuum representations, an analogy with two-phase hydrodynamics is used that leads to a strict preservation of macroscopic mass and momentum conservation laws. Examples of numerical implementation of the new hybrid approach for the multiscale simulation of liquid argon in equilibrium conditions are provided. © 2014 The Author(s) Published by the Royal Society.
Abstract:
We describe a novel and potentially important tool for candidate subunit vaccine selection through in silico reverse-vaccinology. A set of Bayesian networks able to make individual predictions for specific subcellular locations is implemented in three pipelines with different architectures: a parallel implementation with a confidence level-based decision engine and two serial implementations with a hierarchical decision structure, one initially rooted by prediction between membrane types and another rooted by soluble versus membrane prediction. The parallel pipeline outperformed the serial pipeline, but took twice as long to execute. The soluble-rooted serial pipeline outperformed the membrane-rooted predictor. Assessment using genomic test sets was more equivocal, as many more predictions are made by the parallel pipeline, yet the serial pipeline identifies 22 more of the 74 proteins of known location.
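The contrast between the two architectures compared above can be sketched as follows. This is a toy illustration only: the function names, thresholds, location labels and scores are invented for this sketch, and simple score lookups stand in for the abstract's actual Bayesian networks.

```python
def predict_location_parallel(posteriors):
    """Parallel architecture: every location-specific predictor scores the
    protein, and a confidence-based decision engine keeps the top call."""
    location, confidence = max(posteriors.items(), key=lambda kv: kv[1])
    return location if confidence >= 0.5 else "unknown"

def predict_location_serial(coarse, posteriors):
    """Serial architecture rooted at a soluble-vs-membrane decision, then
    descending the hierarchy within the winning branch only."""
    if coarse["soluble"] >= coarse["membrane"]:
        branch = ("cytoplasm", "secreted")
    else:
        branch = ("inner_membrane", "outer_membrane")
    return max(branch, key=lambda loc: posteriors[loc])

posteriors = {"cytoplasm": 0.8, "secreted": 0.4,
              "inner_membrane": 0.2, "outer_membrane": 0.1}
coarse = {"soluble": 0.7, "membrane": 0.3}
print(predict_location_parallel(posteriors))         # cytoplasm
print(predict_location_serial(coarse, posteriors))   # cytoplasm
```

The sketch also shows why the parallel variant costs more: it must evaluate every location-specific predictor, whereas the serial variant only descends one branch of the hierarchy.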
Abstract:
Completing projects faster than the normal duration is always a challenge to the management of any project, as it often demands many paradigm shifts. Opportunities of globalization and competition from private-sector and multinational firms force the management of public sector organizations in the Indian petroleum sector to adopt aggressive strategies to maintain their profitability. Constructing infrastructure for handling petroleum products is one of them. Moreover, these projects are required to be completed faster than normal schedules allow in order to remain competitive, achieve a faster return on investment, and give a longer project life. However, using conventional tools and techniques of project management, it is impossible to handle the problem of reducing the project duration from a normal period. This study proposes the use of concurrent engineering in managing projects to radically reduce project duration: the phases of the project are accomplished concurrently instead of in series. The complexities that arise in managing projects are tackled through restructuring the project organization, improving management commitment, strengthening project-planning activities, ensuring project quality, managing project risk objectively and integrating project activities through management information systems. These measures would not only ensure completion of projects on a fast track, but also improve project effectiveness in terms of quality, cost effectiveness, team building, etc., and in turn improve the overall productivity of the project organization.
Abstract:
Completing projects faster than normal is always a challenge as it often demands many paradigm shifts. Globalization opportunities and competition from private sectors and multinationals are forcing the management of public sector organizations in India's petroleum industry to take various aggressive strategies to maintain profitability. These projects are required to be completed sooner than with a typical schedule to remain competitive, get faster return on investment and give longer project life.
Abstract:
One of the main challenges of classifying clinical data is determining how to handle missing features. Most research favours imputing missing values or discarding records that include missing data, both of which can degrade accuracy when missing values exceed a certain level. In this research we propose a methodology to handle data sets with a large percentage of missing values and with high variability in which particular data are missing. Feature selection is effected by picking variables sequentially in order of maximum correlation with the dependent variable and minimum correlation with variables already selected. Classification models are generated individually for each test case, based on its particular feature set and the matching data values available in the training population. The method was applied to real patients' anonymised mental-health data, where the task was to predict the suicide risk judgement clinicians would give for each patient's data, with eleven possible outcome classes: zero to ten, representing no risk to maximum risk. The results compare favourably with alternative methods and have the advantage of ensuring that explanations of risk are based only on the data given, not on imputed data. This is important for clinical decision support systems using human expertise for modelling and explaining predictions.
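The sequential selection step described above ("maximum correlation with the dependent variable, minimum correlation with variables already selected") can be sketched as below. This is a minimal toy under stated assumptions: Pearson correlation and a simple relevance-minus-redundancy score are used, which is one common way to combine the two criteria; the paper's exact scoring and its per-case handling of missing values are not reproduced here.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def select_features(features, target, k):
    """Pick k features greedily: high correlation with the target,
    low correlation with features already chosen."""
    selected = []
    while len(selected) < k:
        def score(name):
            relevance = abs(pearson(features[name], target))
            redundancy = max((abs(pearson(features[name], features[s]))
                              for s in selected), default=0.0)
            return relevance - redundancy
        best = max((n for n in features if n not in selected), key=score)
        selected.append(best)
    return selected

# x and y jointly determine the outcome; x_copy is perfectly redundant with x.
features = {"x": [0, 0, 1, 1], "x_copy": [0, 0, 1, 1],
            "y": [0, 1, 0, 1], "noise": [1, 0, 0, 1]}
target = [0, 1, 1, 2]
print(select_features(features, target, 2))  # ['x', 'y']
```

The redundancy penalty is what keeps `x_copy` out of the selection even though it correlates with the target exactly as strongly as `x` does.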
Abstract:
The foot and mouth disease (FMD) epidemic of 2001 was a disaster for sections of the agricultural industry, a number of businesses and for the Ministry of Agriculture, Fisheries and Food (MAFF), which met its demise as a government department during the crisis, being replaced by the Department of the Environment, Food and Rural Affairs (DEFRA). There were some 2,030 confirmed cases and over four million animals slaughtered. It caused the postponement of local elections and of a general election. From a public policy perspective it raised questions about contingency planning, the adjustment of policy to take account of change and how to manage a crisis. This article focuses on the background to the crisis and how it was handled.
Abstract:
Lyophilisation, or freeze drying, is the preferred dehydration method for pharmaceuticals liable to thermal degradation. Most biologics are unstable in aqueous solution and may rely on freeze drying to prolong their shelf life. Lyophilisation is, however, expensive, and much work has been aimed at reducing its cost. This thesis is motivated by the potential cost savings foreseen with the adoption of a cost-efficient bulk drying approach for large and small molecules. Initial studies identified ideal formulations that adapted well to bulk drying and to further powder handling requirements downstream in production. Low-cost techniques were used to disrupt large dried cakes into powder, while the effects of carrier agent concentration on powder flowability were investigated using standard pharmacopoeia methods. This revealed the superiority of crystalline mannitol over amorphous sucrose matrices and established that the cohesive and very poor-flowing nature of freeze-dried powders was a potential barrier to success. Powder characterisation studies showed that increased powder densification was mainly responsible for significant improvements in flow behaviour, and an initial bulking agent concentration of 10-15% w/v was recommended. Further optimisation studies evaluated the effects of freezing rates and thermal treatment on powder flow behaviour. Slow cooling (0.2 °C/min) with a -25 °C annealing hold (2 hrs) provided adequate mechanical strength and densification at 0.5-1 M mannitol concentrations. Stable bulk powders require powder transfer into either final vials or intermediate storage closures. The targeted dosing of powder formulations using volumetric and gravimetric powder dispensing systems was evaluated using Immunoglobulin G (IgG), Lactate Dehydrogenase (LDH) and Beta Galactosidase models. Final protein content uniformity in dosed vials was assessed using activity and protein recovery assays, drawing conclusions from deviations and pharmacopeia acceptance values.
A correlation between very poor flowability (p<0.05), solute concentration, dosing time and accuracy was revealed. LDH and IgG lyophilised in 0.5 M and 1 M mannitol passed pharmacopeia acceptance value criteria with 0.1-4, while formulations with micro collapse showed the best dose accuracy (0.32-0.4% deviation). Bulk mannitol content above 0.5 M provided no additional benefit to dosing accuracy or content uniformity of dosed units. This study identified key considerations, including the type of protein, annealing, the cake disruption process, the physical form of the phases present and humidity control, and recommended gravimetric transfer as optimal for dispensing powder. Dosing lyophilised powders from bulk was demonstrated to be practical, time efficient and economical, and met regulatory requirements in the cases studied. Finally, the use of a new non-destructive technique, X-ray micro-computed tomography (MCT), was explored for cake and particle characterisation. Studies demonstrated good correlation with traditional gas porosimetry (R2 = 0.93) and with morphology studies using microscopy. Flow characterisation from sample sizes of less than 1 mL was demonstrated using three-dimensional quantitative X-ray image analyses. A platinum-mannitol dispersion model revealed a relationship between freezing rate, ice nucleation sites and variations in homogeneity between the top and bottom segments of a formulation.
Abstract:
Recent literature has argued that whereas remembering the past and imagining the future make use of shared cognitive substrates, simulating future events places heavier demands on executive resources. These propositions were explored in 3 experiments comparing the impact of imagery and concurrent task demands on the speed and accuracy of past event retrieval and future event simulation. Results provide support for the suggestion that both past and future episodes can be constructed through 2 mechanisms: a noneffortful "direct" pathway and a controlled, effortful "generative" pathway. However, limited evidence emerged for the suggestion that simulating future episodes, compared with retrieving past ones, places heavier demands on executive resources; only under certain conditions did it emerge as a more error-prone and lengthier process. The findings are discussed in terms of how retrieval and simulation make use of the same cognitive substrates in subtly different ways. © 2011 American Psychological Association.
Abstract:
Java software or libraries can evolve via subclassing. Unfortunately, subclassing may not properly support code adaptation when there are dependencies between classes. More precisely, subclassing in collections of related classes may require the reimplementation of otherwise valid classes. This problem is known as the subclassing anomaly, which is an issue when software evolution or code reuse is a goal of the programmer who is using existing classes. Object Teams offers an implicit fix to this problem and is largely compatible with existing JVMs. In this paper, we evaluate how well Object Teams succeeds in providing a solution for a complex, real-world project. Our results indicate that while Object Teams is a suitable solution for simple examples, it does not meet the requirements for large-scale projects. The reasons why Object Teams fails in certain usages may prove useful to those who create linguistic modifications in languages or those who seek new methods for code adaptation.
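The anomaly described above (the paper's setting is Java and Object Teams) can be illustrated with a small sketch; the class names here are invented for illustration. Because `Graph` hard-codes which node class it instantiates, a subclass must re-implement an otherwise valid method just to swap that dependency:

```python
class Node:
    def __init__(self, key):
        self.key = key

class Graph:
    def __init__(self):
        self.nodes = {}
    def add(self, key):
        self.nodes[key] = Node(key)   # hard-wired dependency on Node
        return self.nodes[key]

class ColoredNode(Node):
    def __init__(self, key, color="black"):
        super().__init__(key)
        self.color = color

# The anomaly: ColoredGraph must re-implement add() even though its logic is
# unchanged, solely because the dependency on Node is fixed in the superclass.
class ColoredGraph(Graph):
    def add(self, key):
        self.nodes[key] = ColoredNode(key)
        return self.nodes[key]

g = ColoredGraph()
print(type(g.add("a")).__name__)  # ColoredNode
```

In a larger family of mutually dependent classes, every method that instantiates a collaborator has to be rewritten in this way; language-level mechanisms such as Object Teams' team/role binding aim to remove exactly this kind of boilerplate.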
Abstract:
The paper describes the cluster management software and hardware of the SCIT supercomputer clusters built at the Glushkov Institute of Cybernetics, NAS of Ukraine. The paper presents the performance results obtained on the systems that were built and the specific means used to achieve the performance increases. It should be useful for scientists and engineers who are practically engaged in cluster supercomputer system design, integration and services.
Abstract:
We propose a method for detecting and analyzing the so-called replay attacks in intrusion detection systems, when an intruder contributes a small amount of hostile actions to a recorded session of a legitimate user or process, and replays this session back to the system. The proposed approach can be applied if an automata-based model is used to describe behavior of active entities in a computer system.
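A minimal sketch of such an automata-based behaviour model is shown below. The transition table and action names are invented for illustration, and the actual detection method in the abstract is not reproduced; the sketch only shows the underlying idea that a session is flagged when it takes a transition the model of the legitimate entity does not allow, which is how hostile actions spliced into a replayed legitimate session become visible.

```python
# Automaton over the legitimate entity's actions: state -> {action: next_state}.
LEGIT = {
    "start":  {"login": "authed"},
    "authed": {"read": "authed", "write": "authed", "logout": "start"},
}

def is_anomalous(session, state="start"):
    """Walk the session through the automaton; flag any undefined transition."""
    for action in session:
        if action not in LEGIT.get(state, {}):
            return True          # transition absent from the behaviour model
        state = LEGIT[state][action]
    return False

recorded = ["login", "read", "read", "logout"]           # legitimate session
replayed = ["login", "read", "chmod", "read", "logout"]  # injected hostile action
print(is_anomalous(recorded), is_anomalous(replayed))    # False True
```

A pure byte-for-byte replay of a legitimate session would, of course, still be accepted by this check alone; distinguishing replays of valid sessions requires the additional analysis the abstract proposes.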
Abstract:
Concurrent coding is an encoding scheme with 'holographic' type properties that are shown here to be robust against a significant amount of noise and signal loss. This single encoding scheme is able to correct for random errors and burst errors simultaneously, without relying on cyclic codes. A simple and practical scheme has been tested that displays perfect decoding when the signal-to-noise ratio is of order -18 dB. The same scheme also displays perfect reconstruction when a contiguous block of 40% of the transmission is missing. In addition, this scheme is 50% more efficient in terms of transmitted power requirements than equivalent cyclic codes. A simple model is presented that describes the process of decoding and can determine the expected computational load, as well as the critical levels of noise and missing data at which false messages begin to be generated.
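A toy in the spirit of concurrent codes is sketched below; the hash, codeword length and message are illustrative choices, not the abstract's scheme. Each prefix of the message sets one mark at a hash-determined position, and decoding expands a prefix tree, keeping only prefixes whose marks are all present. This naive version tolerates spurious marks (extra bits flipped on by noise) but not erased ones; the robustness to missing data claimed in the abstract requires the fuller scheme.

```python
import hashlib

N = 4096  # number of mark positions in the codeword

def pos(prefix):
    """Hash a bit-string prefix to one of N mark positions."""
    return int.from_bytes(hashlib.sha256(prefix.encode()).digest()[:4], "big") % N

def encode(bits):
    """Mark one position for every prefix of the message."""
    return {pos(bits[:i]) for i in range(1, len(bits) + 1)}

def decode(marks, n):
    """Expand a prefix tree, keeping prefixes whose marks are all present."""
    candidates = [""]
    for _ in range(n):
        candidates = [p + b for p in candidates for b in "01"
                      if pos(p + b) in marks]
    return candidates

msg = "1011001110"
marks = encode(msg)
# Spurious marks add false branches to explore but do not erase the true
# message, so decoding still recovers it.
noisy = marks | {7, 99, 1234}
print(msg in decode(noisy, len(msg)))  # True
```

Because a false branch survives only if every one of its prefix positions happens to be marked, the expected number of false candidates stays small until the density of spurious marks crosses a critical level, which is the kind of threshold behaviour the abstract's decoding model characterises.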