853 results for Probabilistic decision process model
Abstract:
Network building and the exchange of information by people within networks are crucial to the innovation process. Contrary to older models, in social networks the flow of information is noncontinuous and nonlinear. There are critical barriers to information flow that operate in a problematic manner. New models and new analytic tools are needed for these systems. This paper introduces the concept of virtual circuits and draws on recent concepts of network modelling and design to introduce a probabilistic switch theory that can be described using matrices. It can be used to model multistep information flow between people within organisational networks, to provide formal definitions of efficient and balanced networks and to describe distortion of information as it passes along human communication channels. The concept of multi-dimensional information space arises naturally from the use of matrices. The theory and the use of serial diagonal matrices have applications to organisational design and to the modelling of other systems. It is hypothesised that opinion leaders or creative individuals are more likely to emerge at information-rich nodes in networks. A mathematical definition of such nodes is developed and it does not invariably correspond with centrality as defined by early work on networks.
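The multistep matrix idea can be sketched with a toy transition matrix. The network, its probabilities, and the "information richness" score below are illustrative inventions, not the paper's formal definitions:

```python
import numpy as np

# Hypothetical 4-person network: entry (i, j) is the probability that
# information held by person i is passed intact to person j in one step.
P = np.array([
    [0.0, 0.6, 0.3, 0.0],
    [0.2, 0.0, 0.5, 0.2],
    [0.0, 0.4, 0.0, 0.5],
    [0.1, 0.0, 0.3, 0.0],
])

# Multistep flow: entry (i, j) of P^k gives the probability mass of
# information reaching j from i along paths of exactly k steps.
P3 = np.linalg.matrix_power(P, 3)

# A crude richness score per node: total inflow accumulated over
# paths of length 1..3 (an invented proxy, not the paper's measure).
inflow = sum(np.linalg.matrix_power(P, k) for k in range(1, 4)).sum(axis=0)
richest = int(np.argmax(inflow))
```

Note how an information-rich node under this kind of path-accumulation score need not be the node with the most direct connections, which echoes the paper's point that such nodes do not invariably coincide with classical centrality.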
Abstract:
The Operator Choice Model (OCM) was developed to model the behaviour of operators attending to complex tasks involving interdependent concurrent activities, such as in Air Traffic Control (ATC). The purpose of the OCM is to provide a flexible framework for modelling and simulation that can be used for quantitative analyses in human reliability assessment, comparison between human-computer interaction (HCI) designs, and analysis of operator workload. The OCM virtual operator is essentially a cycle of four processes: Scan, Classify, Decide Action, Perform Action. Once a cycle is complete, the operator returns to the Scan process. It is also possible to truncate a cycle and return to Scan after any of the processes. These processes are described using Continuous Time Probabilistic Automata (CTPA). The details of the probability and timing models are specific to the domain of application, and need to be specified with input from domain experts. We are building an application of the OCM for use in ATC. In order to develop a realistic model we are calibrating the probability and timing models that comprise each process using experimental data from a series of experiments conducted with student subjects. These experiments have identified the factors that influence perception and decision making in simplified conflict detection and resolution tasks. This paper presents an application of the OCM approach to a simple ATC conflict detection experiment. The aim is to calibrate the OCM so that its behaviour resembles that of the experimental subjects when it is challenged with the same task. Its behaviour should also interpolate when challenged with scenarios similar to those used to calibrate it. The approach illustrated here uses logistic regression to model the classifications made by the subjects. This model is fitted to the calibration data, and provides an extrapolation to classifications in scenarios outside of the calibration data.
A simple strategy is used to calibrate the timing component of the model, and the results for reaction times are compared between the OCM and the student subjects. While this approach to timing does not capture the full complexity of the reaction time distribution seen in the data from the student subjects, the mean and the tail of the distributions are similar.
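The logistic-regression calibration step can be illustrated with a self-contained sketch. The features, data points, and optimiser settings below are invented for illustration and are not the experimental data:

```python
import numpy as np

# Hypothetical calibration data: each row is a simplified conflict scenario
# (invented features, e.g. miss distance and time to closest approach);
# y = 1 if the subject classified the aircraft pair as in conflict.
X = np.array([[0.5, 2.0], [1.0, 3.0], [3.0, 8.0], [0.2, 1.0],
              [2.5, 6.0], [0.8, 2.5], [3.5, 9.0], [0.4, 1.5]])
y = np.array([1, 1, 0, 1, 0, 1, 0, 1])

def sigmoid(z):
    # Clip to avoid overflow in exp for large |z|.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

# Fit P(conflict | x) = sigmoid(w.x + b) by plain gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(5000):
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * float(np.mean(p - y))

# The fitted model extrapolates to a scenario outside the calibration set.
p_new = float(sigmoid(np.array([0.6, 2.2]) @ w + b))
```

The fitted probability surface plays the role described in the abstract: it reproduces the subjects' classifications on the calibration scenarios and interpolates to nearby unseen ones.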
Abstract:
Expert systems, and artificial intelligence more generally, can provide a useful means for representing decision-making processes. By linking expert systems software to simulation software an effective means of including these decision-making processes in a simulation model can be achieved. This paper demonstrates how a commercial-off-the-shelf simulation package (Witness) can be linked to an expert systems package (XpertRule) through a Visual Basic interface. The methodology adopted could be used for models, and possibly software, other than those presented here.
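The linkage pattern described, a simulation engine consulting an external rule base at decision points, can be sketched generically. The rules and state below are hypothetical and do not use the Witness, XpertRule, or Visual Basic APIs:

```python
# Minimal sketch: the simulation advances state, and at each decision point
# it defers to a separate rule-based component, mimicking the call out to
# an expert system. Rules and quantities here are invented.

def expert_decision(queue_length, machine_busy):
    """Stand-in for the expert-system rule base."""
    if machine_busy:
        return "wait"
    if queue_length > 5:
        return "start_overtime"
    return "process_next"

def simulate(arrivals):
    queue, log = 0, []
    for arriving in arrivals:
        queue += arriving
        action = expert_decision(queue, machine_busy=False)
        if action == "process_next" and queue > 0:
            queue -= 1                    # normal capacity: one job per step
        elif action == "start_overtime":
            queue = max(0, queue - 3)     # overtime clears a backlog burst
        log.append(action)
    return queue, log

final_queue, actions = simulate([1, 2, 0, 4, 3, 0])
```

The design point is the separation of concerns: the simulation owns time and state, while the decision logic lives in an independently maintainable component, which is what the Witness-to-XpertRule bridge achieves.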
Abstract:
This thesis reviews the main methodological developments in public sector investment appraisal and finds growing evidence that appraisal techniques are not fulfilling their earlier promise. It is suggested that an important reason for this failure lies in the inability of these techniques to handle uncertainty except in a highly circumscribed fashion. It is argued that a more fruitful approach is to strive for flexibility. Investment projects should be formulated with a view to making them responsive to a wide range of possible future events, rather than embodying a solution which is optimal for one configuration of circumstances only. The distinction drawn in economics between the short and the long run is used to examine the nature of flexibility. The concept of long run flexibility is applied to the pre-investment range of choice open to the decision-maker. It is demonstrated that flexibility is reduced at a very early stage of decision-making by the conventional system of appraisal, which evaluates only a small number of options. The pre-appraisal filtering process is considered further in relation to decision-making models. It is argued that for public sector projects the narrowing down of options is best understood in relation to an amended mixed scanning model which places importance on the process by which the 'national interest' is determined. Short run flexibility deals with operational characteristics, the degree to which particular projects may respond to changing demands when the basic investment is already in place. The tension between flexibility and cost is noted. A short case study on the choice of electricity generating plant is presented. The thesis concludes with a brief examination of the approaches used by successive British governments to public sector investment, particularly in relation to the nationalised industries.
Abstract:
This thesis explores the process of developing a principled approach for translating a model of mental-health risk expertise into a probabilistic graphical structure. Probabilistic graphical structures combine graph and probability theory and offer numerous advantages for representing domains involving uncertainty, such as the mental health domain; this thesis builds on those advantages. The Galatean Risk Screening Tool (GRiST) is a psychological model for mental health risk assessment based on fuzzy sets. The knowledge encapsulated in the psychological model was used to develop the structure of the probability graph by exploiting the semantics of the clinical expertise. The thesis describes how a chain graph can be developed from the psychological model to provide a probabilistic evaluation of risk that complements the one generated by GRiST's clinical expertise: the GRiST knowledge structure is decomposed into component parts, which are in turn mapped onto equivalent probabilistic graphical structures such as Bayesian Belief Nets and Markov Random Fields to produce a composite chain graph that provides a probabilistic classification of risk expertise to complement the expert clinical judgements.
Abstract:
Despite concerted academic interest in the strategic decision-making process (SDMP) since the 1980s, a coherent body of theory capable of guiding practice has not materialised. This is because many prior studies focus only on a single process characteristic, often rationality or comprehensiveness, and have paid insufficient attention to context. To further develop theory, research is required which examines: (i) the influence of context from multiple theoretical perspectives (e.g. upper echelons, environmental determinism); (ii) different process characteristics from both synoptic formal (e.g. rationality) and political incremental (e.g. politics) perspectives; and (iii) the effects of context and process characteristics on a range of SDMP outcomes. Using data from 30 interviews and 357 questionnaires, this thesis addresses several opportunities for theory development by testing an integrative model which incorporates: (i) five SDMP characteristics representing both synoptic formal (procedural rationality, comprehensiveness, and behavioural integration) and political incremental (intuition, and political behaviour) perspectives; (ii) four SDMP outcome variables—strategic decision (SD) quality, implementation success, commitment, and SD speed; and (iii) contextual variables from the four theoretical perspectives—upper echelons, SD-specific characteristics, environmental determinism, and firm characteristics. The present study makes several substantial and original contributions to knowledge. First, it provides empirical evidence of the contextual boundary conditions under which intuition and political behaviour positively influence SDMP outcomes. Second, it establishes the predominance of the upper echelons perspective, with top management team (TMT) variables explaining significantly more variance in SDMP characteristics than SD-specific characteristics, the external environment, and firm characteristics.
A newly developed measure of top management team expertise also demonstrates highly significant direct and indirect effects on the SDMP. Finally, it is evident that SDMP characteristics and contextual variables influence a number of SDMP outcomes, not just overall SD quality, but also implementation success, commitment, and SD speed.
Developing a probabilistic graphical structure from a model of mental-health clinical risk expertise
Abstract:
This paper explores the process of developing a principled approach for translating a model of mental-health risk expertise into a probabilistic graphical structure. The Galatean Risk Screening Tool [1] is a psychological model for mental health risk assessment based on fuzzy sets. This paper details how the knowledge encapsulated in the psychological model was used to develop the structure of the probability graph by exploiting the semantics of the clinical expertise. These semantics are formalised by a detailed specification for an XML structure used to represent the expertise. The component parts were then mapped to equivalent probabilistic graphical structures such as Bayesian Belief Nets and Markov Random Fields to produce a composite chain graph that provides a probabilistic classification of risk expertise to complement the expert clinical judgements. © Springer-Verlag 2010.
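The mapping onto probabilistic structures can be illustrated with a toy Bayesian-net fragment. The node names, priors, and conditional probability table below are invented for illustration and are not taken from GRiST:

```python
# Toy fragment: a "Risk" node with two parent concept nodes.
# All structure and numbers here are illustrative assumptions.

p_hopeless = 0.3          # P(Hopelessness = true), invented prior
p_isolated = 0.4          # P(Isolation = true), invented prior

# Conditional probability table P(Risk = high | Hopelessness, Isolation)
cpt = {(True, True): 0.85, (True, False): 0.55,
       (False, True): 0.45, (False, False): 0.10}

# Marginal probability of high risk, summing over the parent states.
p_risk = sum(
    cpt[(h, i)]
    * (p_hopeless if h else 1 - p_hopeless)
    * (p_isolated if i else 1 - p_isolated)
    for h in (True, False)
    for i in (True, False)
)
```

In the composite chain graph described above, many such local fragments (some directed, as here; some undirected, as in a Markov Random Field) are composed so that risk is evaluated probabilistically alongside the clinical judgements.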
Abstract:
This research describes a computerized model of human classification which has been constructed to represent the process by which assessments are made for psychodynamic psychotherapy. The model assigns membership grades (MGs) to clients so that the most suitable ones have high values in the therapy category. Categories consist of a hierarchy of components, one of which, ego strength, is analysed in detail to demonstrate the way it has captured the psychotherapist's knowledge. The bottom of the hierarchy represents the measurable factors being assessed during an interview. A questionnaire was created to gather the identified information and was completed by the psychotherapist after each assessment. The results were fed into the computerized model, demonstrating a high correlation between the model MGs and the suitability ratings of the psychotherapist (r = .825 for 24 clients). The model has successfully identified the relevant data involved in assessment and simulated the decision-making process of the expert. Its cognitive validity enables decisions to be explained, which means that it has potential for therapist training and also for enhancing the referral process, with benefits in cost effectiveness as well as in the reduction of trauma to clients. An adapted version measuring client improvement would give quantitative evidence for the benefit of therapy, thereby supporting auditing and accountability. © 1997 The British Psychological Society.
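The hierarchical membership-grade computation can be sketched as follows. The component names, weights, and factor scores are hypothetical stand-ins, not the published model's values:

```python
# Illustrative sketch of hierarchical membership-grade (MG) aggregation.
# Weights and scores below are invented for the example.

def weighted_mg(children):
    """Combine child membership grades (each in [0, 1]) by weighted mean."""
    total = sum(w for w, _ in children)
    return sum(w * mg for w, mg in children) / total

# Bottom-level factors as scored from the assessment questionnaire
# (pairs of (weight, membership grade)).
ego_strength = weighted_mg([(2.0, 0.8), (1.0, 0.6), (1.0, 0.9)])
motivation   = weighted_mg([(1.0, 0.7), (1.0, 0.75)])

# Top-level membership grade: suitability for the therapy category.
suitability = weighted_mg([(3.0, ego_strength), (2.0, motivation)])
```

Because each intermediate grade is computed from named components, the final suitability value can be explained by tracing back down the hierarchy, which is the cognitive-validity property the abstract emphasises.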
Abstract:
This study draws upon effectuation and causation as examples of planning-based and flexible decision-making logics, and investigates dynamics in the use of both logics. The study applies a longitudinal process research approach to investigate strategic decision-making in new venture creation over time. Combining qualitative and quantitative methods, we analyze 385 decision events across nine technology-based ventures. Our observations suggest a hybrid perspective on strategic decision-making, demonstrating how effectuation and causation logics are combined, and how entrepreneurs’ emphasis on these logics shifts and re-shifts over time. We induce a dynamic model which extends the literature on strategic decision-making in venture creation.
Abstract:
2000 Mathematics Subject Classification: 60J80.
Abstract:
Local Government Authorities (LGAs) are mainly characterised as information-intensive organisations. To satisfy their information requirements, effective information sharing within and among LGAs is necessary. Nevertheless, the dilemma of Inter-Organisational Information Sharing (IOIS) has been regarded as an inevitable issue for the public sector. Despite a decade of active research and practice, the field lacks a comprehensive framework to examine the factors influencing Electronic Information Sharing (EIS) among LGAs. The research presented in this paper contributes towards resolving this problem by developing a conceptual framework of factors influencing EIS in Government-to-Government (G2G) collaboration. By presenting this model, we attempt to clarify that EIS in LGAs is affected by a combination of environmental, organisational, business process, and technological factors and that it should not be scrutinised merely from a technical perspective. To validate the conceptual rationale, a multiple case study based research strategy was selected. From an analysis of the empirical data from two case organisations, this paper exemplifies the importance (i.e. prioritisation) of these factors in influencing EIS by utilising the Analytical Hierarchy Process (AHP) technique. The intent herein is to offer LGA decision-makers a systematic decision-making process for realising the importance (i.e. from most important to least important) of EIS influential factors. This systematic process will also assist LGA decision-makers in better interpreting EIS and its underlying problems. The research reported herein should be of interest to both academics and practitioners who are involved in IOIS, in general, and collaborative e-Government, in particular. © 2013 Elsevier Ltd. All rights reserved.
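The AHP prioritisation step can be sketched with the row geometric-mean approximation to the principal eigenvector. The pairwise comparison matrix below is invented, not the case-study data:

```python
import numpy as np

# Hypothetical pairwise comparisons of three factor groups on Saaty's
# 1-9 scale, e.g. organisational vs technological vs environmental.
# A[i, j] > 1 means factor i is judged more important than factor j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Row geometric-mean approximation to the principal eigenvector.
gm = A.prod(axis=1) ** (1.0 / A.shape[1])
weights = gm / gm.sum()            # priorities, normalised to sum to 1
ranking = np.argsort(weights)[::-1]  # factor indices, most important first
```

The resulting weights give the "most important to least important" ordering of EIS factors that the paper derives for its two case organisations.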
Abstract:
The cell:cell bond between an immune cell and an antigen presenting cell is a necessary event in the activation of the adaptive immune response. At the juncture between the cells, cell surface molecules on the opposing cells form non-covalent bonds and a distinct patterning is observed that is termed the immunological synapse. An important binding molecule in the synapse is the T-cell receptor (TCR), that is responsible for antigen recognition through its binding with a major-histocompatibility complex with bound peptide (pMHC). This bond leads to intracellular signalling events that culminate in the activation of the T-cell, and ultimately leads to the expression of the immune effector function. The temporal analysis of the TCR bonds during the formation of the immunological synapse presents a problem to biologists, due to the spatio-temporal scales (nanometers and picoseconds) that compare with experimental uncertainty limits. In this study, a linear stochastic model, derived from a nonlinear model of the synapse, is used to analyse the temporal dynamics of the bond attachments for the TCR. Mathematical analysis and numerical methods are employed to analyse the qualitative dynamics of the nonequilibrium membrane dynamics, with the specific aim of calculating the average persistence time for the TCR:pMHC bond. A single-threshold method, that has been previously used to successfully calculate the TCR:pMHC contact path sizes in the synapse, is applied to produce results for the average contact times of the TCR:pMHC bonds. This method is extended through the development of a two-threshold method, that produces results suggesting the average time persistence for the TCR:pMHC bond is in the order of 2-4 seconds, values that agree with experimental evidence for TCR signalling.
The study reveals two distinct scaling regimes in the time persistent survival probability density profile of these bonds, one dominated by thermal fluctuations and the other associated with the TCR signalling. Analysis of the thermal fluctuation regime reveals a minimal contribution to the average time persistence calculation, that has an important biological implication when comparing the probabilistic models to experimental evidence. In cases where only a few statistics can be gathered from experimental conditions, the results are unlikely to match the probabilistic predictions. The results also identify a rescaling relationship between the thermal noise and the bond length, suggesting a recalibration of the experimental conditions, to adhere to this scaling relationship, will enable biologists to identify the start of the signalling regime for previously unobserved receptor:ligand bonds. Also, the regime associated with TCR signalling exhibits a universal decay rate for the persistence probability, that is independent of the bond length.
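The single-threshold idea can be illustrated by simulating a one-dimensional linear (Ornstein-Uhlenbeck) bond coordinate and recording how long it stays below a contact threshold. Every parameter here is invented, and the model is a generic stand-in rather than the study's calibrated membrane model:

```python
import numpy as np

# Euler-Maruyama simulation of dx = -kappa*x*dt + sigma*dW, with the bond
# treated as "attached" while x < threshold. All values are illustrative.
rng = np.random.default_rng(0)
dt, steps = 1e-3, 100_000
kappa, sigma = 1.0, 0.5        # relaxation rate, thermal noise strength
threshold = 0.3                # contact threshold for the bond coordinate

x, current = 0.0, 0.0
persist_times = []
for _ in range(steps):
    x += -kappa * x * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    if x < threshold:
        current += dt                     # still within the bound interval
    elif current > 0.0:
        persist_times.append(current)     # one completed bound interval
        current = 0.0

mean_persistence = float(np.mean(persist_times))
```

The two-threshold extension described above adds a second, separated level so that rapid thermal recrossings of a single level are not counted as genuine detachment events.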
Abstract:
The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nation-wide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions. A purposive, criterion-based sampling technique was used, with respondents being selected according to size of hospital, primary responsibility for orchestration of the change, influence over budgetary factors, and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide designed by the researcher was used, and respondents were encouraged to amplify on their recollections as desired. Audiotapes were transcribed and the resulting computer files were analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns. Two types of verification were used in the study. Internal verification was done through interview transcript review and feedback by respondents. External verification was done through review and feedback on data analysis by readers who were experienced in the management of staff development departments. All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring that influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience, relationality, corporeality, temporality and spatiality, were also examined to reveal the everydayness of making change.
Abstract:
The most important factor that affects the decision making process in finance is risk, which is usually measured by variance (total risk) or systematic risk (beta). Since investor sentiment (whether an investor is an optimist or a pessimist) plays a very important role in the choice of beta measure, any decision made for the same asset within the same time horizon will be different for different individuals. In other words, there will neither be homogeneity of beliefs nor the rational expectation prevalent in the market, due to behavioral traits. This dissertation consists of three essays. In the first essay, "Investor Sentiment and Intrinsic Stock Prices", a new technical trading strategy was developed using a firm-specific individual sentiment measure. This behavioral-based trading strategy forecasts a range within which a stock price moves in a particular period and can be used for stock trading. Results indicate that sample firms trade within a range and give signals as to when to buy or sell. The second essay, "Managerial Sentiment and the Value of the Firm", examined the effect of managerial sentiment on the project selection process using the net present value criterion, and also its effect on the value of the firm. The final analysis reported that high sentiment and low sentiment managers obtain different values for the same firm before and after the acceptance of a project. Changes in the cost of capital and the weighted average cost of capital were found to result from managerial sentiment. The last essay, "Investor Sentiment and Optimal Portfolio Selection", analyzed how investor sentiment affects the nature and composition of the optimal portfolio as well as the portfolio performance. Results suggested that the choice of the investor sentiment completely changes the portfolio composition, i.e., the high sentiment investor will have a completely different choice of assets in the portfolio in comparison with the low sentiment investor.
The results indicated the practical application of behavioral model based technical indicator for stock trading. Additional insights developed include the valuation of firms with a behavioral component and the importance of distinguishing portfolio performance based on sentiment factors.
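The sentiment-dependent valuation in the second essay can be illustrated with a plain net present value computation. Modelling sentiment as a shift in the discount rate is an assumption made here for illustration, not the dissertation's exact specification, and the cash flows are invented:

```python
# NPV of a project under two hypothetical managers: an optimist who
# perceives lower risk (lower discount rate) and a pessimist who
# perceives higher risk. All numbers are invented for the example.

def npv(cash_flows, rate):
    """Net present value: sum of cash flows discounted back to t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

flows = [-1000, 300, 400, 500, 200]          # outlay at t=0, then inflows

npv_high_sentiment = npv(flows, rate=0.08)   # optimist: lower discount rate
npv_low_sentiment  = npv(flows, rate=0.14)   # pessimist: higher discount rate
```

Because the two managers discount the same cash flows differently, they assign different values to the same project and firm, and for borderline projects the accept/reject decision itself can flip with sentiment.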
Abstract:
The most important factor that affects the decision making process in finance is risk, which is usually measured by variance (total risk) or systematic risk (beta). Since investor sentiment (whether an investor is an optimist or a pessimist) plays a very important role in the choice of beta measure, any decision made for the same asset within the same time horizon will be different for different individuals. In other words, there will neither be homogeneity of beliefs nor the rational expectation prevalent in the market, due to behavioral traits. This dissertation consists of three essays. In the first essay, Investor Sentiment and Intrinsic Stock Prices, a new technical trading strategy is developed using a firm-specific individual sentiment measure. This behavioral-based trading strategy forecasts a range within which a stock price moves in a particular period and can be used for stock trading. Results show that sample firms trade within a range and give signals as to when to buy or sell. The second essay, Managerial Sentiment and the Value of the Firm, examines the effect of managerial sentiment on the project selection process using the net present value criterion, and also its effect on the value of the firm. Findings show that high sentiment and low sentiment managers obtain different values for the same firm before and after the acceptance of a project. The last essay, Investor Sentiment and Optimal Portfolio Selection, analyzes how investor sentiment affects the nature and composition of the optimal portfolio as well as the performance measures. Results suggest that the choice of the investor sentiment completely changes the portfolio composition, i.e., the high sentiment investor will have a completely different choice of assets in the portfolio in comparison with the low sentiment investor. The results indicate the practical application of behavioral model based technical indicators for stock trading.
Additional insights developed include the valuation of firms with a behavioral component and the importance of distinguishing portfolio performance based on sentiment factors.