895 results for multiple approach


Relevance:

30.00%

Publisher:

Abstract:

In the contemporary customer-driven supply chain, maximization of customer service plays as important a role as minimization of costs in retaining and increasing a company's competitiveness. This article develops a multiple-criteria optimization approach, combining the analytic hierarchy process (AHP) and an integer linear programming (ILP) model, to aid the design of an optimal logistics distribution network. The proposed approach outperforms traditional cost-based optimization techniques because it considers both quantitative and qualitative factors and aims at maximizing the benefits to both the deliverer and the customers. In the approach, the AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to critical customer-oriented criteria. The results of the AHP prioritization are used as input to the ILP model, the objective of which is to select the best warehouses at the lowest possible cost. Two commercial packages are used in this article: Expert Choice and LINDO.
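
The abstract names the tools but not the model itself; purely as a hedged sketch of the pattern it describes, the Python snippet below feeds hypothetical AHP priority weights into a toy warehouse-selection problem, solved by brute-force enumeration rather than by an ILP solver such as LINDO. All names and numbers are invented for illustration.

# Hypothetical sketch: AHP-derived priorities feed a warehouse-selection
# problem (solved here by enumeration, not by LINDO). Numbers are invented.
from itertools import combinations

warehouses = ["W1", "W2", "W3", "W4"]
priority = {"W1": 0.35, "W2": 0.28, "W3": 0.22, "W4": 0.15}  # AHP weights (sum to 1)
cost = {"W1": 120, "W2": 90, "W3": 70, "W4": 60}             # annual cost (k$)
budget = 200
required = 2  # number of warehouses the network needs

best, best_score = None, -1.0
for subset in combinations(warehouses, required):
    if sum(cost[w] for w in subset) > budget:
        continue  # infeasible under the budget constraint
    score = sum(priority[w] for w in subset)  # maximise AHP benefit
    if score > best_score:
        best, best_score = subset, score

print(best, best_score)  # ('W1', 'W3') with total priority 0.57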

Relevance:

30.00%

Publisher:

Abstract:

The ability to define and manipulate the interaction of peptides with MHC molecules has immense immunological utility, with applications in epitope identification, vaccine design, and immunomodulation. However, the methods currently available for prediction of peptide-MHC binding are far from ideal. We recently described the application of a bioinformatic prediction method based on quantitative structure-affinity relationship methods to peptide-MHC binding. In this study we demonstrate the predictivity and utility of this approach. We determined the binding affinities of a set of 90 nonamer peptides for the MHC class I allele HLA-A*0201 using an in-house, FACS-based, MHC stabilization assay, and from these data we derived an additive quantitative structure-affinity relationship model for peptide interaction with the HLA-A*0201 molecule. Using this model we then designed a series of high affinity HLA-A2-binding peptides. Experimental analysis revealed that all these peptides showed high binding affinities to the HLA-A*0201 molecule, significantly higher than the highest previously recorded. In addition, by the use of systematic substitution at principal anchor positions 2 and 9, we showed that high binding peptides are tolerant to a wide range of nonpreferred amino acids. Our results support a model in which the affinity of peptide binding to MHC is determined by the interactions of amino acids at multiple positions with the MHC molecule and may be enhanced by enthalpic cooperativity between these component interactions.
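
A minimal sketch of the additive idea described above, assuming a one-hot position encoding and fitting by least squares; the peptides and affinities below are synthetic stand-ins, not the paper's 90-peptide HLA-A*0201 dataset.

# Illustrative additive QSAR model for nonamer peptides: predicted
# affinity = constant + a position-specific contribution for the amino
# acid at each of the 9 positions. Data below are synthetic.
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"
aa_index = {a: i for i, a in enumerate(AA)}

def encode(peptide):
    """One-hot encode a 9-mer: 9 positions x 20 amino acids."""
    x = np.zeros(9 * 20)
    for pos, aa in enumerate(peptide):
        x[pos * 20 + aa_index[aa]] = 1.0
    return x

peptides = ["YLLPAIVHI", "FLPSDFFPS", "ILKEPVHGV", "GILGFVFTL"]
log_affinity = np.array([7.2, 6.8, 6.1, 7.9])  # e.g. -log IC50, synthetic

X = np.array([encode(p) for p in peptides])
X = np.hstack([np.ones((len(X), 1)), X])       # intercept term
coef, *_ = np.linalg.lstsq(X, log_affinity, rcond=None)

print(float(X[0] @ coef))  # fitted affinity for the first peptide (~7.2)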

Relevance:

30.00%

Publisher:

Abstract:

Defining 'effectiveness' in the context of community mental health teams (CMHTs) has become increasingly difficult under the current pattern of provision required in National Health Service mental health services in England. The aim of this study was to establish the characteristics of multi-professional team working effectiveness in adult CMHTs to develop a new measure of CMHT effectiveness. The study was conducted between May and November 2010 and comprised two stages. Stage 1 used a formative evaluative approach based on the Productivity Measurement and Enhancement System to develop the scale with multiple stakeholder groups over a series of qualitative workshops held in various locations across England. Stage 2 analysed responses from a cross-sectional survey of 1500 members in 135 CMHTs from 11 Mental Health Trusts in England to determine the scale's psychometric properties. Based on an analysis of its structural validity and reliability, the resultant 20-item scale demonstrated good psychometric properties and captured one overall latent factor of CMHT effectiveness comprising seven dimensions: improved service user well-being, creative problem-solving, continuous care, inter-team working, respect between professionals, engagement with carers and therapeutic relationships with service users. The scale will be of significant value to CMHTs and healthcare commissioners both nationally and internationally for monitoring, evaluating and improving team functioning in practice.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis is to discover how informal decisions are reached by screeners when filtering out undesirable job applications. Grounded theory techniques were employed in the field to observe and analyse informal decisions at their source, as made by screeners, in three distinct empirical studies. Whilst grounded theory provided the method for case and cross-case analysis, literature from academic and non-academic sources was evaluated and integrated to strengthen this research and create a foundation for understanding informal decisions. As informal decisions in early hiring processes have been under-researched, this thesis contributes to current knowledge in several ways. First, it introduces the Cycle of Employment, which enhances Robertson and Smith's (1993) Selection Paradigm by integrating the stages that individuals occupy whilst seeking employment. Secondly, a general depiction of the Workflow of General Hiring Processes provides a template for practitioners to map and further develop their organisational processes. Finally, it highlights the emergence of the Locality Effect, a geographically driven heuristic and bias that can significantly affect recruitment and informal decisions. Although screeners make informal decisions using multiple variables, informal decisions are made in stages, as evidenced in the Cycle of Employment. Moreover, informal decisions can be erroneous as a result of majority and minority influence, the weighting of information, the injection of inappropriate information and criteria, and the influence of an assessor. This thesis considers these faults and develops a basic framework for understanding informal decisions from which future research can be launched.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an effective decision-making system for leak detection based on multiple generalized linear models and clustering techniques. The training data for the proposed decision system were obtained from an experimental, fully operational pipeline distribution system equipped with data logging for three variables: inlet pressure, outlet pressure, and outlet flow. The experimental setup is designed so that multiple operational conditions of the distribution system, including multiple pressures and flows, can be obtained. We statistically tested and showed that the pressure and flow variables can be used as signatures of a leak under the designed multi-operational conditions. We then show that leak detection based on training and testing the proposed multi-model decision system, with prior clustering of the data, produces better recognition rates under multi-operational conditions than training based on a single-model approach. The decision system is further equipped with confidence-limit estimation, and a method is proposed for using these confidence limits to obtain more robust leakage recognition results.
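
A minimal sketch of the multi-model idea, under the assumption (not stated in the abstract) that the GLMs are logistic-regression classifiers: operating conditions are clustered first, then one model is fitted per cluster, and a new sample is routed to its cluster's model. Data below are synthetic placeholders for the three logged variables.

# Hedged sketch: cluster operating conditions, fit one GLM per cluster,
# classify leak vs no-leak by routing samples to their cluster's model.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))  # inlet pressure, outlet pressure, outlet flow
y = (X[:, 2] + rng.normal(scale=0.5, size=400) > 0).astype(int)  # toy leak label

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

models = {}
for c in np.unique(km.labels_):
    mask = km.labels_ == c
    models[c] = LogisticRegression().fit(X[mask], y[mask])

x_new = X[:1]                      # pretend this is a fresh measurement
c_new = km.predict(x_new)[0]       # route to the matching operating regime
print("leak" if models[c_new].predict(x_new)[0] else "no leak")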

Relevance:

30.00%

Publisher:

Abstract:

In this work a self-referenced technique for fiber-optic intensity sensors using virtual lock-in amplifiers is proposed and discussed. The topology is compatible with WDM networks, so multiple remote sensors can be interrogated simultaneously. A hybrid approach combining silica fiber Bragg gratings and polymer optical fiber Bragg gratings is analyzed. The feasibility of the proposed solution for potential medical environments and biomedical applications is demonstrated and tested using a self-referenced configuration based on a power-ratio parameter.
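
As a rough illustration of what a virtual (software) lock-in amplifier does, the sketch below recovers the amplitude of a noisy modulated intensity signal by quadrature mixing and averaging. The parameters, and the closing remark about the ratio step, are assumptions for illustration, not the paper's configuration.

# Minimal virtual lock-in: mix the modulated sensor signal with
# in-phase/quadrature references and average to recover its amplitude.
import numpy as np

fs, f_ref, n = 50_000, 1_000, 50_000        # sample rate, ref freq, samples
t = np.arange(n) / fs
amplitude = 0.8                              # sensor-dependent intensity
signal = amplitude * np.sin(2 * np.pi * f_ref * t) + 0.1 * np.random.randn(n)

i = np.mean(signal * np.sin(2 * np.pi * f_ref * t))  # in-phase component
q = np.mean(signal * np.cos(2 * np.pi * f_ref * t))  # quadrature component
recovered = 2 * np.hypot(i, q)               # ~0.8 despite the noise

print(recovered)
# A self-referenced measurement would then form a ratio of two such
# recovered powers (sensing vs reference channel), cancelling common
# power fluctuations in the link.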

Relevance:

30.00%

Publisher:

Abstract:

In our recent work in different bioreactors up to 2.5 L in scale, we have successfully cultured hMSCs using the minimum agitator speed required for complete microcarrier suspension, N_JS. In addition, we also reported a scaleable protocol for the detachment of hMSCs from two donors from microcarriers in spinner flasks. The essence of the protocol is the use of a short period of intense agitation in the presence of enzymes such that the cells are detached; once detachment is achieved, the cells are smaller than the Kolmogorov scale of turbulence and hence not damaged. Here, the same approach has been effective for culture at N_JS and detachment in situ in 15 mL ambr™ bioreactors, 100 mL spinner flasks and 250 mL Dasgip bioreactors. In these experiments, cells from four different donors were used along with two types of microcarrier, with and without surface coatings (two coating types), four different enzymes and three different growth media (with and without serum): a total of 22 different combinations. In all cases, the cells after detachment were shown to retain their desired quality attributes and were able to proliferate. This agitation strategy for culture and harvest therefore offers a sound basis for a wide range of scales of operation.
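
The detachment rationale rests on the Kolmogorov length scale, eta = (nu^3 / epsilon)^(1/4). The back-of-envelope Python sketch below checks that a detached cell is smaller than eta for an assumed dissipation rate; the numbers are illustrative, not the paper's measured values.

# Back-of-envelope check: detached cells (~20 micrometres) should be
# smaller than the Kolmogorov scale eta = (nu^3 / epsilon)^0.25.
nu = 1.0e-6        # kinematic viscosity of medium, m^2/s (water-like)
epsilon = 0.1      # local specific energy dissipation rate, W/kg (assumed)

eta = (nu**3 / epsilon) ** 0.25
print(f"Kolmogorov scale: {eta * 1e6:.0f} micrometres")  # ~56 um

cell_diameter = 20e-6
print("cell smaller than eta:", cell_diameter < eta)     # True here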

Relevance:

30.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 62P10, 92D10, 92D30, 94A17, 62L10.

Relevance:

30.00%

Publisher:

Abstract:

We analyze a business model for e-supermarkets to enable multi-product sourcing capacity through co-opetition (collaborative competition). The logistics aspect of our approach is to design and execute a network system where “premium” goods are acquired from vendors at multiple locations in the supply network and delivered to customers. Our specific goals are to: (i) investigate the role of premium product offerings in creating critical mass and profit; (ii) develop a model for the multiple-pickup single-delivery vehicle routing problem in the presence of multiple vendors; and (iii) propose a hybrid solution approach. To solve the problem introduced in this paper, we develop a hybrid metaheuristic approach that uses a Genetic Algorithm for vendor selection and allocation, and a modified savings algorithm for the capacitated VRP with multiple pickup, single delivery and time windows (CVRPMPDTW). The proposed Genetic Algorithm guides the search for optimal vendor pickup location decisions, and for each generated solution in the genetic population, a corresponding CVRPMPDTW is solved using the savings algorithm. We validate our solution approach against published VRPTW solutions and also test our algorithm with Solomon instances modified for CVRPMPDTW.
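
The modified savings algorithm itself is not reproduced in the abstract; the sketch below implements the classic Clarke-Wright savings heuristic for a plain single-depot CVRP, omitting the paper's multiple-pickup and time-window extensions. The instance is invented.

# Classic Clarke-Wright savings heuristic (deliveries only, one depot).
import math

depot = (0, 0)
customers = {1: (2, 3), 2: (5, 1), 3: (6, 4), 4: (1, 6)}
demand = {1: 3, 2: 4, 3: 2, 4: 3}
capacity = 7

d0 = {i: math.dist(depot, p) for i, p in customers.items()}

# savings s(i,j) = d(0,i) + d(0,j) - d(i,j), processed in descending order
savings = sorted(
    ((d0[i] + d0[j] - math.dist(customers[i], customers[j]), i, j)
     for i in customers for j in customers if i < j),
    reverse=True)

routes = {i: {"nodes": [i], "load": demand[i]} for i in customers}
for s, i, j in savings:
    ri, rj = routes[i], routes[j]
    if ri is rj or ri["load"] + rj["load"] > capacity:
        continue
    if ri["nodes"][-1] == i and rj["nodes"][0] == j:    # join ...-i to j-...
        first, second = ri, rj
    elif rj["nodes"][-1] == j and ri["nodes"][0] == i:  # join ...-j to i-...
        first, second = rj, ri
    else:
        continue
    first["nodes"].extend(second["nodes"])
    first["load"] += second["load"]
    for k in second["nodes"]:
        routes[k] = first

unique = {id(r): r["nodes"] for r in routes.values()}
print(list(unique.values()))  # [[1, 4], [2, 3]]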

Relevance:

30.00%

Publisher:

Abstract:

In this talk we investigate the use of spectrally shaped amplified spontaneous emission (ASE) to emulate highly dispersed wavelength division multiplexed (WDM) signals in an optical transmission system. Such a technique offers various simplifications to large-scale WDM experiments. Not only does it reduce transmitter complexity, removing the need for multiple source lasers, it potentially reduces test and measurement complexity by requiring only the centre channel of a WDM system to be measured in order to estimate WDM worst-case performance.

The use of ASE as a test and measurement tool is well established in optical communication systems, and several measurement techniques will be discussed [1, 2]. One of the most prevalent uses of ASE is in the measurement of receiver sensitivity, where ASE is introduced to degrade the optical signal-to-noise ratio (OSNR) and the resulting bit error rate (BER) is measured at the receiver. From an analytical point of view, noise has also been used to emulate system performance: the Gaussian Noise model serves as an estimate of highly dispersed signals and has attracted considerable interest [3].

The work presented here extends the use of ASE by employing it to emulate highly dispersed WDM signals, in the process reducing WDM transmitter complexity and receiver measurement time in a lab environment (a toy numerical illustration of the spectral-shaping idea follows the references below). Results thus far have indicated [2] that such a transmitter configuration is consistent with an AWGN model for transmission, with modulation format complexity and nonlinearities playing a key role in estimating the performance of systems utilising the ASE channel emulation technique. We conclude this work by investigating techniques capable of characterising the nonlinear and damage limits of optical fibres and the resulting information capacity limits.

REFERENCES

1. McCarthy, M. E., N. Mac Suibhne, S. T. Le, P. Harper, and A. D. Ellis, "High spectral efficiency transmission emulation for non-linear transmission performance estimation for high order modulation formats," 2014 European Conference on Optical Communication (ECOC), 2014.
2. Ellis, A., N. Mac Suibhne, F. Gunning, and S. Sygletos, "Expressions for the nonlinear transmission performance of multi-mode optical fiber," Opt. Express, Vol. 21, 22834–22846, 2013.
3. Vacondio, F., O. Rival, C. Simonneau, E. Grellier, A. Bononi, L. Lorcy, J. Antona, and S. Bigo, "On nonlinear distortions of highly dispersive optical coherent systems," Opt. Express, Vol. 20, 1022–1032, 2012.
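
A toy numerical counterpart to the spectral-shaping idea (the experiment does this optically, with ASE and programmable filters): complex white Gaussian noise is band-shaped in the frequency domain into a WDM-like channel grid. The channel plan and normalised units are invented.

# Shape white Gaussian noise into WDM-like channel bands.
import numpy as np

n, fs = 2**16, 1.0                      # samples, normalised sample rate
noise = (np.random.randn(n) + 1j * np.random.randn(n)) / np.sqrt(2)

freq = np.fft.fftfreq(n, d=1 / fs)
spacing, bandwidth = 0.1, 0.08          # channel grid and per-channel width
mask = np.zeros(n)
for k in (-2, -1, 0, 1, 2):             # five "WDM channels"
    mask[np.abs(freq - k * spacing) < bandwidth / 2] = 1.0

shaped = np.fft.ifft(np.fft.fft(noise) * mask)  # band-shaped "ASE"
# Per channel the shaped noise keeps Gaussian statistics, mimicking a
# highly dispersed WDM signal (consistent with the AWGN/GN picture).
print(np.var(shaped))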

Relevance:

30.00%

Publisher:

Abstract:

Background: Intensive risk factor management is recommended for individuals with diabetes. However, it is not known whether such an approach is appropriate in the elderly with multiple comorbidities and limited life expectancy. The aim of this study was to characterise a cohort of very elderly individuals with diabetes and assess the impact of known risk factors on mortality. Methods: This was a retrospective audit approved by the clinical audit lead. All patients aged >80 years who attended diabetes outpatient clinics in the 2 years prior to the date of the audit (April 2012) were identified from clinic records. A detailed history, including demographics, comorbidities and treatment, was collected. Blood pressure readings, HbA1c, cholesterol and renal function were extracted and the mean of these readings was recorded. Survival status at 2 years was recorded for all patients. Statistical analysis was performed using SPSS 19. Results: Data were available for 864 patients (381 male, 483 female). The majority (75%) lived in their own home. More than 60% had multiple comorbidities and 25% had a prior history of cardiovascular disease. Two-thirds of the patients had more than one hospital admission in 2 years and a third had more than three admissions. 60% were on either insulin or a sulfonylurea. Mean HbA1c was 7.6%, cholesterol 4.2 mmol/l, systolic blood pressure 145 mmHg and eGFR 53 ml/min. Over 2 years, 174 (20%) had died. Age, creatinine and previous coronary heart disease were significant predictors of death. Conclusion: The benefits of intensive diabetes management appear uncertain in very elderly patients. The need for intensive treatment must therefore be individualised for each patient.

Relevance:

30.00%

Publisher:

Abstract:

Belief-desire reasoning is a core component of 'Theory of Mind' (ToM), which can be used to explain and predict the behaviour of agents. Neuroimaging studies reliably identify a network of brain regions comprising a 'standard' network for ToM, including temporoparietal junction and medial prefrontal cortex. Whilst considerable experimental evidence suggests that executive control (EC) may support a functioning ToM, co-ordination of neural systems for ToM and EC is poorly understood. We report here use of a novel task in which psychologically relevant ToM parameters (true versus false belief; approach versus avoidance desire) were manipulated orthogonally. The valence of these parameters not only modulated brain activity in the 'standard' ToM network but also in EC regions. Varying the valence of both beliefs and desires recruits anterior cingulate cortex, suggesting a shared inhibitory component associated with negatively valenced mental state concepts. Varying the valence of beliefs additionally draws on ventrolateral prefrontal cortex, reflecting the need to inhibit self perspective. These data provide the first evidence that separate functional and neural systems for EC may be recruited in the service of different aspects of ToM.

Relevance:

30.00%

Publisher:

Abstract:

From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly affecting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reducing suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with the emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created, comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases. The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and showed that there is room for improvement in Mexican organisations' flood management.
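
The abstract does not give the models; as a loose, hypothetical illustration of the preparedness step, the sketch below first discards floodable candidate sites (standing in for the GIS cartographic model) and then brute-forces a small facility-location choice. All coordinates, flood flags and parameters are invented.

# Discard floodable sites, then pick p facilities minimising the total
# distance from demand points to their nearest chosen facility.
from itertools import combinations
import math

candidates = {"A": (0, 0), "B": (4, 1), "C": (2, 5), "D": (6, 4)}
floodable = {"B"}                        # output of the cartographic model
demand_points = [(1, 1), (3, 2), (5, 5), (2, 4)]
p = 2

safe = {k: v for k, v in candidates.items() if k not in floodable}

def total_distance(chosen):
    # each demand point is served by its nearest chosen facility
    return sum(min(math.dist(dp, safe[f]) for f in chosen)
               for dp in demand_points)

best = min(combinations(safe, p), key=total_distance)
print(best, round(total_distance(best), 2))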

Relevance:

30.00%

Publisher:

Abstract:

Access control (AC) limits access to the resources of a system to authorized entities only. Given that information systems today are increasingly interconnected, AC is extremely important. The implementation of an AC service is a complicated task, yet the requirements for an AC service vary widely. Accordingly, the design of an AC service should be flexible and extensible in order to save development effort and time. Unfortunately, with conventional object-oriented techniques, when an extension has not been anticipated at design time, the modification incurred by the extension is often invasive. Invasive changes destroy design modularity, further deteriorate design extensibility, and, even worse, reduce product reliability.

A concern is crosscutting if it spans multiple object-oriented classes. It was identified that invasive changes were due to the crosscutting nature of most unplanned extensions. To overcome this problem, an aspect-oriented design approach for AC services was proposed, as aspect-oriented techniques can effectively encapsulate crosscutting concerns. The proposed approach was applied to develop an AC framework that supports the role-based access control model. In the framework, the core role-based access control mechanism is given in an object-oriented design, while each extension is captured as an aspect. The resulting framework is well modularized, flexible, and, most importantly, supports noninvasive adaptation.

In addition, a process to formalize the aspect-oriented design was described, the purpose being to provide high assurance for AC services. Object-Z was used to specify the static structure, and Predicate/Transition nets were used to model the dynamic behavior. Object-Z was extended to facilitate specification in an aspect-oriented style. The process of formal modeling helps designers enhance their understanding of the design and hence detect problems. Furthermore, the specification can be mathematically verified, which provides confidence that the design is correct. It was illustrated through an example that the model was ready for formal analysis.
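
The dissertation's framework is aspect-oriented; purely as a Python analogy (not the dissertation's code), the sketch below shows the flavour of a noninvasive, crosscutting extension: an auditing concern is woven around an untouched core RBAC check via a decorator. All class, role and rule names are invented.

# Core RBAC check stays untouched; the auditing "aspect" is woven in
# from outside, i.e. a noninvasive extension.
import functools

class RBAC:
    def __init__(self):
        self.role_perms = {"admin": {"read", "write"}, "viewer": {"read"}}
        self.user_roles = {"alice": "admin", "bob": "viewer"}

    def check(self, user, perm):
        return perm in self.role_perms.get(self.user_roles.get(user), set())

def audit(func):
    """Crosscutting concern: log every access decision."""
    @functools.wraps(func)
    def wrapper(self, user, perm):
        allowed = func(self, user, perm)
        print(f"AUDIT {user} {perm} -> {'grant' if allowed else 'deny'}")
        return allowed
    return wrapper

# Weave the aspect without editing the RBAC class itself
RBAC.check = audit(RBAC.check)

ac = RBAC()
ac.check("bob", "write")   # AUDIT bob write -> deny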

Relevance:

30.00%

Publisher:

Abstract:

The primary aim of this dissertation is to develop data mining tools for knowledge discovery in biomedical data when multiple (homogeneous or heterogeneous) sources of data are available. The central hypothesis is that, when information from multiple sources of data is used appropriately and effectively, knowledge discovery can be achieved better than is possible from a single source alone.

Recent advances in high-throughput technology have enabled biomedical researchers to generate large volumes of diverse types of data on a genome-wide scale. These data include DNA sequences, gene expression measurements, and much more; they provide the motivation for building analysis tools to elucidate the modular organization of the cell. The challenges include efficiently and accurately extracting information from the multiple data sources, representing the information effectively, developing analytical tools, and interpreting the results in the context of the domain.

The first part considers the application of feature-level integration to design classifiers that discriminate between soil types. The machine learning tools, SVM and KNN, were used to successfully distinguish between several soil samples.

The second part considers clustering using multiple heterogeneous data sources. The resulting Multi-Source Clustering (MSC) algorithm was shown to perform better than clustering methods that use only a single data source or a simple feature-level integration of heterogeneous data sources.

The third part proposes a new approach to effectively incorporate incomplete data into clustering analysis. Adapted from the K-means algorithm, the Generalized Constrained Clustering (GCC) algorithm makes use of incomplete data in the form of constraints to perform exploratory analysis. Novel approaches for extracting constraints were proposed. For sufficiently large constraint sets, the GCC algorithm outperformed the MSC algorithm.

The last part considers the problem of providing a theme-specific environment for mining multi-source biomedical data. The database, called PlasmoTFBM, focusing on gene regulation of Plasmodium falciparum, contains diverse information and has a simple interface to allow biologists to explore the data. It provides a framework for comparing different analytical tools for predicting regulatory elements and for designing useful data mining tools.

The conclusion is that the experiments reported in this dissertation strongly support the central hypothesis.
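
As a small hypothetical illustration of the feature-level integration used in the first part, the sketch below concatenates feature blocks from two synthetic sources before training a single SVM classifier; the data and labels are stand-ins, not the soil dataset.

# Feature-level integration: concatenate per-sample feature blocks from
# two sources, then train one classifier on the combined representation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 120
source_a = rng.normal(size=(n, 10))      # e.g. spectral features
source_b = rng.normal(size=(n, 5))       # e.g. chemical measurements
y = (source_a[:, 0] + source_b[:, 0] > 0).astype(int)  # toy soil label

X = np.hstack([source_a, source_b])      # feature-level integration
print(cross_val_score(SVC(), X, y, cv=5).mean())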