81 results for Decision-analysis


Relevance: 30.00%

Abstract:

Projects that are exposed to uncertain environments can be effectively controlled with the application of risk analysis during the planning stage. The Analytic Hierarchy Process, a multiattribute decision-making technique, can be used to analyse and assess project risks, whether objective or subjective in nature. Among other advantages, the process logically integrates the various elements of the planning process. The results from risk analysis and activity analysis are then used to develop a logical contingency allowance for the project through the application of probability theory. The contingency allowance is created in two parts: (a) a technical contingency, and (b) a management contingency. This provides a basis for decision making in a changing project environment. Effective control of the project is made possible by limiting changes to within the monetary contingency allowance for the work package concerned, and by utilizing the contingency through proper appropriation. The whole methodology is applied to a pipeline-laying project in India, and its effectiveness in project control is demonstrated.
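The AHP weighting step described above can be sketched numerically. This is a minimal illustration: the three risk factors and the pairwise judgments are hypothetical, not data from the pipeline-laying case study, and priorities are approximated by averaging normalised columns rather than solving the full eigenproblem.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Approximate the principal eigenvector by averaging normalised columns."""
    m = np.asarray(pairwise, dtype=float)
    return (m / m.sum(axis=0)).mean(axis=1)

def consistency_ratio(pairwise, random_index=0.58):
    """Saaty's consistency ratio for a 3x3 matrix (random index 0.58);
    values below 0.1 are conventionally acceptable."""
    m = np.asarray(pairwise, dtype=float)
    n = m.shape[0]
    w = ahp_priorities(m)
    lam = float((m @ w / w).mean())   # estimate of lambda_max
    return ((lam - n) / (n - 1)) / random_index

# Hypothetical judgments: technical risk vs. cost risk vs. schedule risk.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
w = ahp_priorities(A)   # priority weights, summing to 1
```

The priority vector would then weight each risk's contribution to the technical and management contingencies.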

Relevance: 30.00%

Abstract:

Oxidation of proteins has received much attention in recent decades because oxidized proteins have been shown to accumulate and to be implicated in the progression and pathophysiology of several diseases, such as Alzheimer's disease and coronary heart disease. As a result, research scientists have become eager to measure accurately the level of oxidized protein in biological materials and to determine the precise site of oxidative attack on the protein, in order to gain insight into the molecular mechanisms involved in the progression of disease. Several methods for measuring protein carbonylation have been implemented in laboratories around the world; however, to date no method prevails as the most accurate, reliable and robust. The present paper gives an overview of the common methods used to determine protein carbonylation in biological material and highlights their limitations and potential. The ultimate goal is to give quick tips for rapid decision making when a method has to be selected, taking into consideration the advantages and drawbacks of each method.

Relevance: 30.00%

Abstract:

This paper explores differences in how primary care doctors process the clinical presentation of depression by African American and African-Caribbean patients compared with white patients in the US and the UK. The aim is to gain a better understanding of possible pathways by which racial disparities arise in depression care. One hundred and eight doctors described their thought processes after viewing video-recorded simulated patients presenting with identical symptoms strongly suggestive of depression. These descriptions were analysed using the CliniClass system, which captures information about micro-components of clinical decision making and permits a systematic, structured and detailed analysis of how doctors arrive at diagnostic, intervention and management decisions. Video recordings of actors portraying black (both African American and African-Caribbean) and white (both White American and White British) male and female patients (aged 55 and 75 years) were presented to doctors randomly selected from the Massachusetts Medical Society list and from Surrey/South West London and West Midlands National Health Service lists, stratified by country (US v. UK), gender, and years of clinical experience (less v. very experienced). Findings demonstrated little evidence of bias affecting doctors' decision-making processes, with the exception of less attention being paid to the potential outcomes associated with different treatment options for African American compared with White American patients in the US. Instead, findings suggest greater clinical uncertainty in diagnosing depression among black compared with white patients, particularly in the UK. This was evident in the larger number of potential diagnoses considered. There was also a tendency for doctors in both countries to focus more on black patients' physical rather than psychological symptoms and to identify endocrine problems, most often diabetes, as a presenting complaint for them. This suggests that doctors in both countries have a less well-developed mental model of depression for black compared with white patients. © 2014 The Authors.

Relevance: 30.00%

Abstract:

Data envelopment analysis (DEA) has gained a wide range of applications in measuring the comparative efficiency of decision making units (DMUs) with multiple incommensurate inputs and outputs. The standard DEA method requires that the status of all input and output variables be known exactly. However, in many real applications, the status of some measures is not clearly known as inputs or outputs. These measures are referred to as flexible measures. This paper proposes a flexible slacks-based measure (FSBM) of efficiency in which each flexible measure can play an input role for some DMUs and an output role for others, so as to maximize the relative efficiency of the DMU under evaluation. Further, we show that when an operational unit is efficient in a specific flexible measure, this measure can play both input and output roles for this unit. In this case, the optimal input/output designation for the flexible measure is the one that optimizes the efficiency of the artificial average unit. An application in assessing UK higher education institutions is used to show the applicability of the proposed approach. © 2013 Elsevier Ltd. All rights reserved.
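For context, the classical input-oriented CCR envelopment model that slacks-based and flexible-measure variants build on can be written as a small linear program. This is the standard model, not the paper's FSBM, and the four DMUs below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """min theta  s.t.  X^T lam <= theta * x_o,  Y^T lam >= y_o,  lam >= 0.
    Variables are [theta, lam_1, ..., lam_n]; all bounded below by 0."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n = X.shape[0]                                        # number of DMUs
    c = np.r_[1.0, np.zeros(n)]                           # minimise theta
    A_in = np.hstack([-X[o][:, None], X.T])               # inputs within theta*x_o
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])  # outputs at least y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]])
    return res.x[0]

X = [[2.0], [4.0], [3.0], [5.0]]   # one input per DMU (hypothetical)
Y = [[2.0], [2.0], [3.0], [1.0]]   # one output per DMU (hypothetical)
scores = [ccr_efficiency(X, Y, o) for o in range(4)]
```

A flexible measure would appear as an extra column whose assignment to X or Y is itself a decision variable, which is the problem the FSBM addresses.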

Relevance: 30.00%

Abstract:

To be competitive in contemporary turbulent environments, firms must be capable of processing huge amounts of information and of converting it effectively into actionable knowledge. This is particularly the case in the marketing context, where problems are also usually highly complex, unstructured and ill-defined. In recent years, the development of marketing management support systems has paralleled this evolution in the informational problems faced by managers, leading to a growth in the study (and use) of artificial intelligence and soft computing methodologies. Here, we present and implement a novel intelligent system that incorporates fuzzy logic and genetic algorithms to operate in an unsupervised manner. This approach allows the discovery of interesting association rules, which can be linguistically interpreted, in large-scale databases (knowledge discovery in databases, KDD). We then demonstrate its application to a distribution channel problem. It is shown how the proposed system is able to return a number of novel and potentially interesting associations among variables. Thus, it is argued that our method has significant potential to improve the analysis of marketing and business databases in practice, especially in non-programmed decisional scenarios, as well as to assist scholarly researchers in their exploratory analysis. © 2013 Elsevier Inc.
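The kind of rule such a system surfaces can be illustrated with crisp support and confidence on toy transactions; the paper's system adds fuzzy membership functions and a genetic algorithm on top of this basic idea. All attribute names below are hypothetical.

```python
def rule_stats(transactions, antecedent, consequent):
    """Return (support, confidence) of the rule antecedent -> consequent."""
    ante = [t for t in transactions if antecedent <= t]   # subset test
    both = [t for t in ante if consequent <= t]
    support = len(both) / len(transactions)
    confidence = len(both) / len(ante) if ante else 0.0
    return support, confidence

# Hypothetical channel records: each set is one retailer's observed attributes.
data = [{"high_volume", "urban", "promo"},
        {"high_volume", "urban"},
        {"low_volume", "rural", "promo"},
        {"high_volume", "urban", "promo"}]
s, c = rule_stats(data, {"high_volume", "urban"}, {"promo"})
```

A genetic algorithm would search the space of candidate rules, scoring each by measures like these, while fuzzy sets let numeric attributes enter rules as linguistic terms.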

Relevance: 30.00%

Abstract:

Four-bar mechanisms are basic components of many important mechanical devices. The kinematic synthesis of four-bar mechanisms is a difficult design problem. A novel method that combines genetic programming and decision tree learning is presented. We give a structural description for the class of mechanisms that produce desired coupler curves. Constructive induction is used to find and characterize feasible regions of the design space. Decision trees constitute the learning engine, and the new features are created by genetic programming.

Relevance: 30.00%

Abstract:

The objective of this study was to investigate the effects of circularity, comorbidity, prevalence and presentation variation on the accuracy of differential diagnoses made in optometric primary care using a modified form of naïve Bayesian sequential analysis. No such investigation has been reported before. Data were collected for 1422 cases seen over one year. Positive test outcomes were recorded for case history (ethnicity, age, symptoms and ocular and medical history) and clinical signs in relation to each diagnosis. For this reason, only positive likelihood ratios were used for this modified form of Bayesian analysis, which was carried out with Laplacian correction and Chi-square filtration. Accuracy was expressed as the percentage of cases for which the diagnoses made by the clinician appeared at the top of a list generated by Bayesian analysis. Preliminary analyses were carried out on 10 diagnoses and 15 test outcomes. Accuracy of 100% was achieved in the absence of presentation variation but dropped by 6% when variation existed. Circularity artificially elevated accuracy by 0.5%. Surprisingly, removal of Chi-square filtering increased accuracy by 0.4%. Decision tree analysis showed that accuracy was influenced primarily by prevalence, followed by presentation variation and comorbidity. Analysis of 35 diagnoses and 105 test outcomes followed. This explored the use of positive likelihood ratios, derived from the case history, to recommend signs to look for. Accuracy of 72% was achieved when all clinical signs were entered. The drop in accuracy, compared to the preliminary analysis, was attributed to the fact that some diagnoses lacked strong diagnostic signs; the accuracy increased by 1% when only recommended signs were entered. Chi-square filtering improved recommended test selection. Decision tree analysis showed that accuracy was again influenced primarily by prevalence, followed by comorbidity and presentation variation.
Future work will explore the use of likelihood ratios based on positive and negative test findings prior to considering naïve Bayesian analysis as a form of artificial intelligence in optometric practice.
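The sequential use of positive likelihood ratios described above can be sketched as an odds update: prior odds from prevalence are multiplied by one LR+ per positive finding, under the naive independence assumption. The prevalence and LR+ values below are invented for illustration, not taken from the study's data.

```python
def posterior_probability(prevalence, positive_lrs):
    """Odds-form Bayes: posterior odds = prior odds * product of LR+ values."""
    odds = prevalence / (1.0 - prevalence)
    for lr in positive_lrs:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical diagnosis with 2% prevalence and two positive signs carrying
# LR+ of 10 and 5: the probability rises from 0.02 to about 0.51.
p = posterior_probability(0.02, [10.0, 5.0])
```

Ranking all candidate diagnoses by such posterior probabilities yields the ordered list against which the study scored clinician accuracy.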

Relevance: 30.00%

Abstract:

Aim: To explore current risk assessment processes in general practice and Improving Access to Psychological Therapies (IAPT) services, and to consider whether the Galatean Risk and Safety Tool (GRiST) can help support improved patient care. Background: Much has been written about risk assessment practice in secondary mental health care, but little is known about how it is undertaken at the beginning of patients' care pathways, within general practice and IAPT services. Methods: Interviews with eight general practice and eight IAPT clinicians from two primary care trusts in the West Midlands, UK, and eight service users from the same region. Interviews explored current practice and participants' views and experiences of mental health risk assessment. Two focus groups were also carried out, one with general practice and one with IAPT clinicians, to review interview findings and to elicit views about GRiST from a demonstration of its functionality. Data were analysed using thematic analysis. Findings: Variable approaches to mental health risk assessment were observed. Clinicians were anxious that important risk information was being missed and that risk communication was undermined. Patients felt uninvolved in the process, and both clinicians and patients expressed anxiety about risk assessment skills. Clinicians were positive about the potential for GRiST to provide solutions to these problems. Conclusions: A more structured and systematic approach to risk assessment in general practice and IAPT services is needed, to ensure important risk information is captured and communicated across the care pathway. GRiST has the functionality to support this aspect of practice.

Relevance: 30.00%

Abstract:

This paper introduces a new technique for optimizing the trading strategy of brokers that autonomously trade in retail and wholesale markets. Simultaneous optimization of retail and wholesale strategies has been considered intractable by existing studies. Therefore, each of these strategies is optimized separately and their interdependence is generally ignored, with the resulting broker agents not aiming for a globally optimal retail and wholesale strategy. In this paper, we propose a novel formalization, based on a semi-Markov decision process (SMDP), which globally and simultaneously optimizes retail and wholesale strategies. The SMDP is solved using hierarchical reinforcement learning (HRL) in multi-agent environments. To address the curse of dimensionality, which arises when applying SMDP and HRL to complex decision problems, we propose an efficient knowledge transfer approach. This enables the reuse of learned trading skills in order to speed up the learning in new markets, at the same time as making the broker transportable across market environments. The proposed SMDP-broker has been thoroughly evaluated in two well-established multi-agent simulation environments within the Trading Agent Competition (TAC) community. Analysis of controlled experiments shows that this broker can outperform the top TAC-brokers. Moreover, our broker is able to perform well in a wide range of environments by re-using knowledge acquired in previously experienced settings.
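The value update at the heart of SMDP-based learning can be sketched in tabular form. The discount is raised to the sojourn time tau spent completing the action, which is what distinguishes SMDP learning from ordinary Q-learning; the states, actions, rewards and market names below are toy stand-ins, not the broker's actual design.

```python
def smdp_q_update(Q, s, a, reward, tau, s_next, actions, alpha=0.1, gamma=0.95):
    """One update of Q[(s, a)] toward reward + gamma**tau * max_b Q[(s', b)]."""
    best_next = max(Q.get((s_next, b), 0.0) for b in actions)
    old = Q.get((s, a), 0.0)
    Q[(s, a)] = old + alpha * (reward + gamma ** tau * best_next - old)
    return Q[(s, a)]

# One learning step: a retail action that took 3 time slots to resolve.
Q = {("wholesale", "bid_low"): 1.0}
v = smdp_q_update(Q, "retail", "raise_price", reward=2.0, tau=3,
                  s_next="wholesale", actions=["bid_low", "bid_high"])
```

Hierarchical RL would stack such updates across levels of the task decomposition, and knowledge transfer would seed a new market's Q-table from a previously learned one.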

Relevance: 30.00%

Abstract:

This paper seeks to advance the theory and practice of the dynamics of complex networks in relation to direct and indirect citations. It applies social network analysis (SNA) and the ordered weighted averaging (OWA) operator to study a patent citation network. To date, SNA studies investigating long chains of patent citations have rarely been undertaken, and the importance of a node in a network has been associated mostly with its number of direct ties. In this research, OWA is used to analyse complex networks, assess the role of indirect ties, and provide guidance to reduce complexity for decision makers and analysts. An empirical example of a set of European patents published in 2000 in the renewable energy industry is provided to show the usefulness of the proposed approach for the preference ranking of patent citations.
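The OWA operator itself is simple to state: arguments are reordered in descending order before the weighted sum, so weights attach to rank positions rather than to particular criteria. The citation scores and weights below are hypothetical.

```python
def owa(values, weights):
    """Ordered weighted averaging: weights applied to values sorted descending."""
    assert abs(sum(weights) - 1.0) < 1e-9, "OWA weights must sum to 1"
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Weights biased toward the first positions emphasise a patent's strongest
# (direct or indirect) citation ties; the scores are invented.
score = owa([0.2, 0.9, 0.5], [0.5, 0.3, 0.2])
```

Choosing weights concentrated at the top positions behaves like a max-leaning aggregation, while uniform weights reduce OWA to a plain average; this is the lever for balancing direct against indirect ties.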

Relevance: 30.00%

Abstract:

The purpose of this paper is to investigate the technological development of electronic inventory solutions from the perspective of patent analysis. We first applied the International Patent Classification to identify the top categories of data processing technologies and their corresponding top patenting countries. We then identified the core technologies by calculating patent citation strength and applying a standard deviation criterion for each patent. To eliminate core innovations having no reference relationships with the other core patents, relevance strengths between core technologies were also evaluated. Our findings provide market intelligence not only for the research and development community, but also for decision making on advanced inventory solutions.

Relevance: 30.00%

Abstract:

Fuzzy data envelopment analysis (DEA) models emerge as another class of DEA models to account for imprecise inputs and outputs for decision making units (DMUs). Although several approaches for solving fuzzy DEA models have been developed, they have drawbacks, ranging from the inability to provide satisfactory discrimination power to simplistic numerical examples that handle only triangular or symmetrical fuzzy numbers. To address these drawbacks, this paper proposes using the concept of expected value in a generalized DEA (GDEA) model. This allows the unification of three models - fuzzy expected CCR, fuzzy expected BCC, and fuzzy expected FDH - and enables these models to handle both symmetrical and asymmetrical fuzzy numbers. We also explore the role of the fuzzy GDEA model as a ranking method and compare it to existing super-efficiency evaluation models. Our proposed model is always feasible, while infeasibility problems remain in certain cases under existing super-efficiency models. In order to illustrate the performance of the proposed method, it is first tested using two established numerical examples and compared with the results obtained from alternative methods. A third example, on energy dependency among 23 European Union (EU) member countries, is further used to validate and describe the efficacy of our approach under asymmetric fuzzy numbers.
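The expected-value idea used to de-fuzzify the data can be sketched assuming the credibility-theoretic definition (the paper may use a variant): a triangular fuzzy number (a, b, c) has expected value (a + 2b + c) / 4, which handles asymmetric shapes as naturally as symmetrical ones.

```python
def expected_value_triangular(a, b, c):
    """Credibility-theoretic expected value of the triangular fuzzy
    number (a, b, c): (a + 2b + c) / 4. Not specific to symmetric shapes."""
    return (a + 2.0 * b + c) / 4.0

# A hypothetical asymmetric fuzzy input "around 5, skewed right":
ev = expected_value_triangular(4.0, 5.0, 8.0)   # pulled right of the mode 5
```

Replacing each fuzzy input and output by such an expected value turns the fuzzy DEA program back into a crisp one that standard solvers can handle.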

Relevance: 30.00%

Abstract:

Decision making on product quality is an indispensable stage in product development, needed to reduce product development risk. Based on the identification of the deficiencies of quality function deployment (QFD) and failure modes and effects analysis (FMEA), a novel decision-making method is presented that draws upon a knowledge network of failure scenarios. An ontological expression of failure scenarios is presented together with a framework for a failure knowledge network (FKN). According to the roles of quality characteristics (QCs) in failure processing, QCs are set into three categories, namely perceptible QCs, restrictive QCs, and controllable QCs, which represent the monitoring targets, control targets and improvement targets, respectively, for quality management. A mathematical model and algorithms based on the analytic network process (ANP) are introduced for calculating the priority of QCs with respect to different development scenarios. A case study is provided according to the proposed decision-making procedure based on the FKN. This methodology is applied in the propeller design process to solve the problem of prioritising QCs. This paper provides a practical approach for decision making on product quality. Copyright © 2011 Inderscience Enterprises Ltd.

Relevance: 30.00%

Abstract:

Local Government Authorities (LGAs) are mainly characterised as information-intensive organisations. To satisfy their information requirements, effective information sharing within and among LGAs is necessary. Nevertheless, the dilemma of Inter-Organisational Information Sharing (IOIS) has been regarded as an inevitable issue for the public sector. Despite a decade of active research and practice, the field lacks a comprehensive framework to examine the factors influencing Electronic Information Sharing (EIS) among LGAs. The research presented in this paper contributes towards resolving this problem by developing a conceptual framework of factors influencing EIS in Government-to-Government (G2G) collaboration. By presenting this model, we attempt to clarify that EIS in LGAs is affected by a combination of environmental, organisational, business process, and technological factors and that it should not be scrutinised merely from a technical perspective. To validate the conceptual rationale, a multiple case study research strategy was selected. From an analysis of the empirical data from two case organisations, this paper exemplifies the importance (i.e. prioritisation) of these factors in influencing EIS by utilising the Analytic Hierarchy Process (AHP) technique. The intent herein is to offer LGA decision-makers a systematic decision-making process for realising the importance (i.e. from most important to least important) of EIS influential factors. This systematic process will also assist LGA decision-makers in better interpreting EIS and its underlying problems. The research reported herein should be of interest to both academics and practitioners who are involved in IOIS, in general, and collaborative e-Government, in particular. © 2013 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

One major drawback of coherent optical orthogonal frequency-division multiplexing (CO-OFDM) that hitherto remains unsolved is its vulnerability to nonlinear fiber effects, due to its high peak-to-average power ratio. Several digital signal processing techniques have been investigated for the compensation of fiber nonlinearities, e.g., digital back-propagation, nonlinear pre- and post-compensation, and nonlinear equalizers (NLEs) based on the inverse Volterra-series transfer function (IVSTF). Alternatively, nonlinearities can be mitigated using nonlinear decision classifiers such as artificial neural networks (ANNs) based on a multilayer perceptron. In this paper, an ANN-NLE is presented for a 16-QAM CO-OFDM system. The capability of the proposed approach to compensate fiber nonlinearities is numerically demonstrated at up to 100 Gb/s over 1000 km of transmission and compared to the benchmark IVSTF-NLE. Results show that, in terms of Q-factor, for 100 Gb/s at 1000 km, the ANN-NLE outperforms linear equalization and the IVSTF-NLE by 3.2 dB and 1 dB, respectively.
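The Q-factor metric used in such comparisons can be related to bit error rate through the Gaussian tail model, BER = Φ(−Q), i.e. Q = −Φ⁻¹(BER), and is usually quoted in dB as 20·log10(Q). The BER values below are illustrative only, not the paper's simulation results.

```python
import math
from statistics import NormalDist

def q_factor_db(ber):
    """Q-factor in dB from a bit error rate, assuming Gaussian noise statistics."""
    q = -NormalDist().inv_cdf(ber)   # inverse standard normal CDF
    return 20.0 * math.log10(q)

# A dB-scale Q-factor improvement corresponds to a large BER reduction:
gain = q_factor_db(1e-5) - q_factor_db(1e-3)   # roughly 2.8 dB
```

This is why a 3.2 dB Q-factor advantage, as reported for the ANN-NLE, represents a reduction in error rate of several orders of magnitude rather than a marginal gain.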