28 results for Concurrent exception handling


Relevance:

20.00%

Publisher:

Abstract:

Research on the drivers of satisfaction with complaint handling (SATCOM) underlines the importance of distributive, procedural, and interactional justice (Orsingher, Valentini, & de Angelis, 2010). Since these SATCOM studies are largely conducted in business-to-consumer (B2C) markets, it is unclear what drives SATCOM in business-to-business (B2B) markets. We therefore replicate the justice model in an industrial context and find significant differences for procedural justice and interactional justice but not for distributive justice: distributive justice is equally important in both contexts, procedural justice is more important in B2B markets, and interactional justice drives SATCOM only in B2C markets. © 2013 Elsevier B.V.

Relevance:

20.00%

Publisher:

Abstract:

Fluctuations of liquids at the scales where the hydrodynamic and atomistic descriptions overlap are considered. The importance of these fluctuations for atomistic motions is discussed and examples of their accurate modelling with a multi-space-time-scale fluctuating hydrodynamics scheme are provided. To resolve microscopic details of liquid systems, including biomolecular solutions, together with macroscopic fluctuations in space-time, a novel hybrid atomistic-fluctuating hydrodynamics approach is introduced. For a smooth transition between the atomistic and continuum representations, an analogy with two-phase hydrodynamics is used that leads to a strict preservation of macroscopic mass and momentum conservation laws. Examples of numerical implementation of the new hybrid approach for the multiscale simulation of liquid argon in equilibrium conditions are provided. © 2014 The Author(s) Published by the Royal Society.
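One way to picture the two-phase analogy (a minimal sketch in assumed notation, not the paper's own equations) is to write the blended density and momentum as concentration-weighted sums of the atomistic (MD) and fluctuating-hydrodynamics (FH) fields, with a prescribed mixing field s(x,t) controlling which description dominates locally; the blended fields then satisfy the exact macroscopic conservation laws provided the exchange terms between the two "phases" cancel.

```latex
% Illustrative two-phase blending (assumed notation):
% s(x,t) in [0,1] is a prescribed mixing field: s -> 1 in the pure
% continuum region, s -> 0 in the pure atomistic region.
\begin{align}
  \rho           &= s\,\rho_{\mathrm{FH}} + (1-s)\,\rho_{\mathrm{MD}},\\
  \rho\mathbf{u} &= s\,\rho_{\mathrm{FH}}\mathbf{u}_{\mathrm{FH}}
                    + (1-s)\,\rho_{\mathrm{MD}}\mathbf{u}_{\mathrm{MD}},\\
  \frac{\partial\rho}{\partial t} + \nabla\cdot(\rho\mathbf{u}) &= 0,
\end{align}
% with an analogous statement for momentum: mass and momentum of the
% blended field are conserved exactly as long as the exchange terms
% between the two "phases" cancel.
```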

Relevance:

20.00%

Publisher:

Abstract:

We describe a novel and potentially important tool for candidate subunit vaccine selection through in silico reverse vaccinology. A set of Bayesian networks able to make individual predictions for specific subcellular locations is implemented in three pipelines with different architectures: a parallel implementation with a confidence-level-based decision engine, and two serial implementations with a hierarchical decision structure, one rooted by prediction between membrane types and the other rooted by soluble-versus-membrane prediction. The parallel pipeline outperformed the serial pipeline but took twice as long to execute. The soluble-rooted serial pipeline outperformed the membrane-rooted predictor. Assessment using genomic test sets was more equivocal: the parallel pipeline makes many more predictions, yet the serial pipeline identified 22 more of the 74 proteins of known location.
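A minimal sketch of the two architectures, assuming hypothetical predictor functions that each return a location call and a confidence value (the real system uses Bayesian networks, whose internals are not reproduced here):

```python
# Illustrative sketch only: predictor names and the confidence API are
# hypothetical placeholders, not the paper's actual implementations.
from typing import Callable, Dict, Tuple

# Each location-specific predictor returns (is_this_location, confidence in [0, 1]).
Predictor = Callable[[str], Tuple[bool, float]]

def parallel_pipeline(seq: str, predictors: Dict[str, Predictor]) -> str:
    """Run every location-specific network at once and let a confidence-based
    decision engine pick the highest-confidence positive call."""
    calls = {loc: p(seq) for loc, p in predictors.items()}
    positives = {loc: conf for loc, (hit, conf) in calls.items() if hit}
    return max(positives, key=positives.get) if positives else "unknown"

def serial_soluble_rooted(seq: str, predictors: Dict[str, Predictor]) -> str:
    """Hierarchical pipeline rooted by a soluble-vs-membrane decision:
    later networks are consulted only on the branch the root selects."""
    soluble, _ = predictors["soluble_vs_membrane"](seq)
    branch = (["cytoplasm", "periplasm", "extracellular"] if soluble
              else ["inner_membrane", "outer_membrane"])
    for loc in branch:
        hit, _ = predictors[loc](seq)
        if hit:
            return loc
    return "unknown"
```

The parallel version always consults every network (hence more predictions and more compute), while the serial version prunes branches early, which matches the trade-off described in the abstract.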

Relevance:

20.00%

Publisher:

Abstract:

Completing projects faster than the normal duration is always a challenge to the management of any project, as it often demands many paradigm shifts. The opportunities of globalization and competition from private-sector firms and multinationals force the management of public sector organizations in the Indian petroleum sector to adopt aggressive strategies to maintain their profitability. Constructing infrastructure for handling petroleum products is one such strategy. Moreover, these projects must be completed faster than normal schedules allow in order to remain competitive, achieve a faster return on investment and give a longer project life. However, with conventional project management tools and techniques it is impossible to reduce project duration substantially below the normal period. This study proposes the use of concurrent engineering in managing projects to radically reduce project duration: the phases of the project are carried out concurrently rather than in series. The complexities that arise in managing such projects are tackled by restructuring the project organization, improving management commitment, strengthening project-planning activities, ensuring project quality, managing project risk objectively and integrating project activities through management information systems. These measures not only ensure fast-track completion of projects, but also improve project effectiveness in terms of quality, cost effectiveness, team building and so on, and in turn improve the overall productivity of the project organization.

Relevance:

20.00%

Publisher:

Abstract:

Completing projects faster than normal is always a challenge, as it often demands many paradigm shifts. Globalization opportunities and competition from private sectors and multinationals are forcing the management of public sector organizations in India's petroleum industry to adopt aggressive strategies to maintain profitability. These projects must be completed sooner than a typical schedule allows in order to remain competitive, get a faster return on investment and give a longer project life.

Relevance:

20.00%

Publisher:

Abstract:

One of the main challenges in classifying clinical data is deciding how to handle missing features. Most research favours imputing missing values or discarding records that contain missing data, both of which can degrade accuracy once the proportion of missing values exceeds a certain level. In this research we propose a methodology for handling data sets with a large percentage of missing values and with high variability in which particular data are missing. Feature selection is performed by picking variables sequentially in order of maximum correlation with the dependent variable and minimum correlation with the variables already selected. Classification models are then generated individually for each test case, based on its particular feature set and the matching data values available in the training population. The method was applied to real patients' anonymised mental-health data, where the task was to predict the suicide-risk judgement clinicians would give for each patient's data, with eleven possible outcome classes: zero to ten, representing no risk to maximum risk. The results compare favourably with alternative methods and have the advantage that explanations of risk are based only on the data given, not on imputed data. This is important for clinical decision support systems that use human expertise for modelling and for explaining predictions.
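A minimal sketch of the selection rule and the per-case data handling, assuming a numeric pandas feature matrix and a simple relevance-minus-redundancy score to combine the two correlation criteria (the abstract does not specify the exact combination):

```python
# Hedged sketch: column handling and the scoring rule are assumptions.
import pandas as pd

def select_features(X: pd.DataFrame, y: pd.Series, k: int = 5) -> list[str]:
    """Pick variables sequentially: maximise |correlation with y| while
    minimising |correlation with variables already selected|."""
    selected: list[str] = []
    remaining = list(X.columns)
    while remaining and len(selected) < k:
        def score(col: str) -> float:
            relevance = abs(X[col].corr(y))
            redundancy = max((abs(X[col].corr(X[s])) for s in selected), default=0.0)
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

def per_case_training_set(train: pd.DataFrame, y: pd.Series, case: pd.Series):
    """For one test case, keep only the features that case actually has and
    the training rows where those features are observed; a classifier is
    then fitted on this reduced set for that case alone."""
    available = [c for c in train.columns if pd.notna(case[c])]
    rows = train[available].dropna().index
    return available, train.loc[rows, available], y.loc[rows]
```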

Relevance:

20.00%

Publisher:

Abstract:

The foot and mouth disease (FMD) epidemic of 2001 was a disaster for sections of the agricultural industry, a number of businesses and for the Ministry of Agriculture, Fisheries and Food (MAFF), which met its demise as a government department during the crisis, being replaced by the Department for Environment, Food and Rural Affairs (DEFRA). There were some 2,030 confirmed cases and over four million animals slaughtered. It caused the postponement of local elections and of a general election. From a public policy perspective it raised questions about contingency planning, the adjustment of policy to take account of change and how to manage a crisis. This article focuses on the background to the crisis and how it was handled.

Relevance:

20.00%

Publisher:

Abstract:

Lyophilisation, or freeze drying, is the preferred dehydration method for pharmaceuticals liable to thermal degradation. Most biologics are unstable in aqueous solution and may rely on freeze drying to prolong their shelf life. Lyophilisation is, however, expensive, and much work has been aimed at reducing its cost. This thesis is motivated by the potential cost savings foreseen with the adoption of a cost-efficient bulk-drying approach for large and small molecules. Initial studies identified ideal formulations that adapted well to bulk drying and to further powder-handling requirements downstream in production. Low-cost techniques were used to disrupt large dried cakes into powder, while the effects of carrier-agent concentration on powder flowability were investigated using standard pharmacopoeia methods. This revealed the superiority of crystalline mannitol over amorphous sucrose matrices and established that the cohesive, very poor flow nature of freeze-dried powders was a potential barrier to success. Powder characterisation studies showed that increased powder densification was mainly responsible for significant improvements in flow behaviour, and an initial bulking-agent concentration of 10-15 %w/v was recommended. Further optimisation studies evaluated the effects of freezing rates and thermal treatment on powder flow behaviour. Slow cooling (0.2 °C/min) with a -25 °C annealing hold (2 h) provided adequate mechanical strength and densification at 0.5-1 M mannitol concentrations. Stable bulk powders require transfer into either final vials or intermediate storage closures. Targeted dosing of powder formulations using volumetric and gravimetric powder-dispensing systems was evaluated using Immunoglobulin G (IgG), Lactate Dehydrogenase (LDH) and Beta Galactosidase models. Final protein content uniformity in dosed vials was assessed using activity and protein recovery assays, drawing conclusions from deviations and pharmacopeia acceptance values. A correlation between very poor flowability (p<0.05), solute concentration, dosing time and accuracy was revealed. LDH and IgG lyophilised in 0.5 M and 1 M mannitol passed pharmacopeia acceptance-value criteria (0.1-4), while formulations with micro-collapse showed the best dose accuracy (0.32-0.4% deviation). Bulk mannitol content above 0.5 M provided no additional benefit to dosing accuracy or content uniformity of dosed units. This study identified key considerations, including the type of protein, annealing, the cake-disruption process, the physical form of the phases present and humidity control, and recommended gravimetric transfer as optimal for dispensing powder. Dosing lyophilised powders from bulk was demonstrated to be practical, time efficient and economical, and met regulatory requirements in the cases studied. Finally, the use of a new non-destructive technique, X-ray micro-computed tomography (MCT), was explored for cake and particle characterisation. Studies demonstrated good correlation with traditional gas porosimetry (R² = 0.93) and with morphology studies using microscopy. Flow characterisation from sample sizes of less than 1 mL was demonstrated using three-dimensional X-ray quantitative image analyses. A platinum-mannitol dispersion model revealed a relationship between freezing rate, ice nucleation sites and variations in homogeneity between the top and bottom segments of a formulation.

Relevance:

20.00%

Publisher:

Abstract:

Recent literature has argued that whereas remembering the past and imagining the future make use of shared cognitive substrates, simulating future events places heavier demands on executive resources. These propositions were explored in 3 experiments comparing the impact of imagery and concurrent task demands on the speed and accuracy of past event retrieval and future event simulation. The results support the suggestion that both past and future episodes can be constructed through 2 mechanisms: a noneffortful "direct" pathway and a controlled, effortful "generative" pathway. However, only limited evidence emerged for the suggestion that simulating future episodes, compared with retrieving past ones, places heavier demands on executive resources; only under certain conditions did simulation emerge as a more error-prone and lengthier process. The findings are discussed in terms of how retrieval and simulation make use of the same cognitive substrates in subtly different ways. © 2011 American Psychological Association.

Relevance:

20.00%

Publisher:

Abstract:

Concurrent coding is an encoding scheme with 'holographic'-type properties that are shown here to be robust against a significant amount of noise and signal loss. This single encoding scheme is able to correct for random errors and burst errors simultaneously, yet does not rely on cyclic codes. A simple and practical scheme has been tested that displays perfect decoding when the signal-to-noise ratio is of order -18 dB. The same scheme also displays perfect reconstruction when a contiguous block of 40% of the transmission is missing. In addition, this scheme is 50% more efficient in terms of transmitted power requirements than equivalent cyclic codes. A simple model is presented that describes the decoding process and can determine the expected computational load, as well as the critical levels of noise and missing data at which false messages begin to be generated.
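The abstract does not spell out the encoder itself; the sketch below assumes a BBC-style (Baird-Bahn-Collins) prefix-hash construction, which may differ in detail from the scheme tested here, purely to illustrate why additive noise can create false candidate messages but cannot erase a genuine one.

```python
# Hedged sketch of prefix-hash concurrent coding (an assumption, not
# necessarily the scheme evaluated in the paper).
import hashlib

N = 1024  # number of mark positions in the codeword; illustrative value

def mark(prefix: str) -> int:
    """Hash a bit-string prefix to one of N mark positions."""
    digest = hashlib.sha256(prefix.encode()).digest()
    return int.from_bytes(digest[:4], "big") % N

def encode(message: str) -> set[int]:
    """Mark one position for every prefix of the message."""
    return {mark(message[:i]) for i in range(1, len(message) + 1)}

def decode(marks: set[int], length: int) -> list[str]:
    """Breadth-first search over the prefix tree: a prefix survives only if
    its mark is present, so spurious marks from noise can add false branches
    but cannot remove the true one."""
    candidates = [""]
    for _ in range(length):
        nxt = []
        for p in candidates:
            for bit in "01":
                q = p + bit
                if mark(q) in marks:
                    nxt.append(q)
        candidates = nxt
    return candidates

# Usage: noise modelled as extra marks; the true message is always recovered,
# and the size of the candidate list indicates when false messages appear.
msg = "1011001"
received = encode(msg) | {3, 77, 901}
print(msg in decode(received, len(msg)))  # True
```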

Relevance:

20.00%

Publisher:

Abstract:

Feature selection is important in the medical field for many reasons, but selecting important variables is a difficult task in the presence of censoring, a feature unique to survival data analysis. This paper proposes an approach to deal with the censoring problem in endovascular aortic repair survival data through Bayesian networks, merged and embedded with a hybrid feature selection process that combines Cox's univariate analysis with machine learning approaches such as ensemble artificial neural networks to select the most relevant predictive variables. The proposed algorithm was compared with common survival variable selection approaches, namely the least absolute shrinkage and selection operator (LASSO) and Akaike information criterion (AIC) methods. The results showed that it was capable of dealing with high censoring in the datasets. Moreover, ensemble classifiers increased the area under the ROC curves for the two datasets, collected separately from two centers located in the United Kingdom. Furthermore, ensembles constructed with center 1 data enhanced the concordance index of center 2 predictions compared with a model built with a single network. Although the final reduced model using the neural networks and their ensembles is larger than those of the other methods, it outperformed them in both concordance index and sensitivity for center 2 prediction, indicating that the reduced model is more powerful for cross-center prediction.
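A hedged sketch of the hybrid selection step, assuming lifelines and scikit-learn as the tooling and 'time'/'event' as column names (the Bayesian-network handling of censoring and the final survival model are omitted):

```python
# Hedged sketch: column names, the p-value cut-off and the bagged-MLP ensemble
# are assumptions standing in for the paper's own choices.
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.ensemble import BaggingClassifier
from sklearn.neural_network import MLPClassifier

def cox_univariate_filter(df: pd.DataFrame, features: list[str],
                          p_cutoff: float = 0.1) -> list[str]:
    """Keep only variables whose univariate Cox model suggests an
    association with survival (small p-value)."""
    keep = []
    for f in features:
        cph = CoxPHFitter()
        cph.fit(df[[f, "time", "event"]], duration_col="time", event_col="event")
        if cph.summary.loc[f, "p"] < p_cutoff:
            keep.append(f)
    return keep

def ensemble_of_networks(X, y, n_members: int = 10) -> BaggingClassifier:
    """Bagged ensemble of small neural networks, one possible realisation of
    the 'ensemble artificial neural networks' step."""
    base = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
    return BaggingClassifier(base, n_estimators=n_members).fit(X, y)
```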

Relevance:

20.00%

Publisher:

Abstract:

As the world's synchrotrons and X-FELs endeavour to meet the need to analyse ever-smaller protein crystals, there is a growing requirement for a new technique to present nano-dimensional samples to the beam for X-ray diffraction experiments. The work presented here details developmental work to reconfigure the nano-tweezer technology developed by Optofluidics (PA, USA) for the trapping of nano-dimensional protein crystals for X-ray crystallography experiments. In its standard configuration the system is used to trap nanoparticles for optical microscopy. It uses silicon nitride laser waveguides that bridge a microfluidic channel. These waveguides contain 180 nm apertures, enabling the system to use biologically compatible 1.6 micron wavelength laser light to trap nano-dimensional biological samples. Using conventional laser tweezers, the wavelength required to trap such nano-dimensional samples would destroy them. In its optical configuration the system has trapped protein molecules as small as 10 nanometres.
