787 results for decision-making, decision modelling, value of information
Abstract:
Design management research usually deals with the processes within the professional design team and yet, in the UK, the volume of the total project information produced by the specialist trade contractors equals or exceeds that produced by the design team. There is a need to understand the scale of this production task and to plan and manage it accordingly. The model of the process on which the plan is to be based, while generic, must be sufficiently robust to cover the majority of instances. An approach using design elements, in sufficient depth to possibly develop tools for a predictive model of the process, is described. The starting point is that each construction element and its components have a generic sequence of design activities. Specific requirements tailor the element's application to the building. Then there are the constraints produced due to the interaction with other elements. Therefore, the selection of a component within the element may impose a set of constraints that will affect the choice of other design elements. Thus, a design decision can be seen as an interrelated element-constraint-element (ECE) sub-net. To illustrate this approach, an example of the process within precast concrete cladding has been used.
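The element-constraint-element (ECE) idea above can be pictured as constraint propagation over a small graph: choosing a component for one element prunes the options left for others. The sketch below is a minimal toy illustration; the element names, components, and constraint table are hypothetical and not taken from the paper.

```python
# Hypothetical design elements and their candidate components.
options = {
    "cladding": ["precast_concrete", "curtain_wall"],
    "frame": ["steel", "in_situ_concrete"],
}

# Constraints: (element, chosen component) -> allowed components elsewhere.
constraints = {
    ("cladding", "precast_concrete"): {"frame": ["steel"]},
}

def propagate(element, component, options):
    """Return the remaining options after the constraints triggered
    by choosing `component` for `element` are applied."""
    remaining = {e: list(opts) for e, opts in options.items()}
    triggered = constraints.get((element, component), {})
    for other, allowed in triggered.items():
        remaining[other] = [o for o in remaining[other] if o in allowed]
    return remaining

remaining = propagate("cladding", "precast_concrete", options)
# Choosing precast concrete cladding restricts the frame to steel.
```

Here a single design decision (the ECE sub-net rooted at "cladding") narrows the solution space of a neighbouring element, which is the interaction the abstract describes.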
Abstract:
Information technology has become heavily embedded in business operations. As business needs change over time, IT applications are expected to continue providing the required support. Whether the existing IT applications are still fit for the business purpose for which they were intended, or whether new IT applications should be introduced, is a strategic decision for business, IT and business-aligned IT. In this paper, we present a method which aims to analyse business functions and IT roles, and to evaluate business-aligned IT from both social and technical perspectives. The method introduces a set of techniques that systematically supports the evaluation of the existing IT applications in relation to their technical capabilities for maximising business value. Furthermore, we discuss the evaluation process and results, which are illustrated and validated through a real-life case study of a UK borough council, followed by a discussion of implications for researchers and practitioners.
Abstract:
This thesis provides three original contributions to the field of Decision Sciences. The first contribution explores the field of heuristics and biases. New variations of the Cognitive Reflection Test (CRT--a test to measure "the ability or disposition to resist reporting the response that first comes to mind") are provided. The original CRT (S. Frederick [2005] Journal of Economic Perspectives, v. 19:4, pp. 24-42) has items in which the response is immediate--and erroneous. It is shown that by merely varying the numerical parameters of the problems, large deviations in response are found. Not only are the final results affected by the proposed variations, but so is processing fluency. It seems that the magnitudes of the numbers serve as a cue to activate system-2 type reasoning. The second contribution explores Managerial Algorithmics Theory (M. Moldoveanu [2009] Strategic Management Journal, v. 30, pp. 737-763), an ambitious research program which states that managers display cognitive choices with a "preference towards solving problems of low computational complexity". An empirical test of this hypothesis is conducted, with results showing that this premise is not supported. A number of problems are designed with the intent of testing the predictions of managerial algorithmics against the predictions of cognitive psychology. The results demonstrate (once again) that framing effects profoundly affect choice, and (an original insight) that managers are unable to distinguish computational complexity problem classes. The third contribution explores a new approach to a computationally complex problem in marketing: the shelf space allocation problem (M-H Yang [2001] European Journal of Operational Research, v. 131, pp. 107-118). A new representation for a genetic algorithm is developed, and computational experiments demonstrate its feasibility as a practical solution method.
These studies lie at the interface of psychology and economics (through bounded rationality and the heuristics and biases programme); of psychology, strategy, and computational complexity; and of heuristics for computationally hard problems in management science.
Abstract:
This study has been prepared with the aim of bringing together the main research lines that seek to identify and convey the value of participating in initiatives like the ISE. This document is structured on three major fronts. On the first front, we conducted a survey to collect the main data on SRI trends in the two largest market niches, Europe and the United States, as well as results from an MIT research study conducted among executives on the value of sustainability in the current market. Next, we carried out a non-exhaustive survey of academic studies that seek to identify the intangible and tangible gains achieved by companies through their participation in the ISE or related initiatives. Finally, we present the results of a survey of major Brazilian pension funds aimed at investigating their knowledge of the ISE and how it could be used in their analysis and decision-making processes.
Abstract:
Information flows form naturally, or are formally induced, in organizational settings, passing from the strategic level to the operational level and shaping the processes that make up the organization, including the decision-making process and, therefore, the organization's action strategies. Managing organizational environments on the basis of information requires careful attention to the various kinds of language used for communication between the organization's sectors and employees, whose goal is to share, disseminate and socialize the information produced in this environment.
Abstract:
Modern policy-making is increasingly influenced by different types of uncertainty. Political actors are supposed to behave differently in the context of uncertainty than in "usual" decision-making processes. Actors exchange information in order to convince other actors and decision-makers, to coordinate their lobbying activities and form coalitions, and to obtain information and learn about the substantive issue. The literature suggests that preference similarity, social trust, perceived power and functional interdependence are particularly important drivers of information exchange. We assume that social trust, as well as being connected to scientific actors, is more important under uncertainty than in a setting with less uncertainty. To investigate information exchange under uncertainty, we analyze the case of unconventional shale gas development in the UK from 2008 to 2014. Our study relies on statistical analyses of survey data on a diverse set of actors dealing with shale gas development and regulation in the UK.
Abstract:
In this work we propose the adoption of a statistical framework used in the evaluation of forensic evidence as a tool for evaluating and presenting circumstantial "evidence" of a disease outbreak from syndromic surveillance. The basic idea is to exploit the predicted distributions of reported cases to calculate the ratio of the likelihood of observing n cases given an ongoing outbreak over the likelihood of observing n cases given no outbreak. The likelihood ratio defines the Value of Evidence (V). Using Bayes' rule, the prior odds for an ongoing outbreak are multiplied by V to obtain the posterior odds. This approach was applied to time series on the number of horses showing clinical respiratory symptoms or neurological symptoms. The separation between prior beliefs about the probability of an outbreak and the strength of evidence from syndromic surveillance offers a transparent reasoning process suitable for supporting decision makers. The value of evidence can be translated into a verbal statement, as often done in forensics or used for the production of risk maps. Furthermore, a Bayesian approach offers seamless integration of data from syndromic surveillance with results from predictive modeling and with information from other sources such as disease introduction risk assessments.
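The reasoning in the abstract reduces to two formulas: the value of evidence V = P(n | outbreak) / P(n | no outbreak), and the odds form of Bayes' rule, posterior odds = prior odds x V. The sketch below illustrates this with Poisson-distributed case counts; the expected counts and the 1% prior are hypothetical numbers chosen for illustration, not values from the study.

```python
import math

def poisson_pmf(n, lam):
    """Probability of observing exactly n cases when lam are expected."""
    return math.exp(-lam) * lam**n / math.factorial(n)

def value_of_evidence(n, lam_outbreak, lam_baseline):
    """Likelihood ratio V = P(n | outbreak) / P(n | no outbreak)."""
    return poisson_pmf(n, lam_outbreak) / poisson_pmf(n, lam_baseline)

# Hypothetical scenario: 12 syndromic reports in a week, against an
# expected 4 under normal conditions and 10 during an outbreak.
V = value_of_evidence(12, lam_outbreak=10.0, lam_baseline=4.0)

# Bayes' rule in odds form: posterior odds = prior odds * V.
prior_odds = 0.01 / 0.99                      # 1% prior belief in an outbreak
posterior_odds = prior_odds * V
posterior_prob = posterior_odds / (1 + posterior_odds)
```

The separation the authors emphasize is visible here: the prior odds encode beliefs held before seeing the surveillance data, while V carries only the strength of the syndromic evidence, so a decision maker can inspect each factor independently.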
Abstract:
This paper integrates the literatures on the social value of lawsuits, the evolution of the law, and judicial preferences to evaluate the hypothesis that the law evolves toward efficiency. The setting is a simple accident model with costly litigation where the efficient law minimizes the sum of accident plus litigation costs. In the steady state equilibrium, the distribution of legal rules is not necessarily efficient but instead depends on a combination of selective litigation, judicial bias, and precedent.
Abstract:
Retrospective clinical data presents many challenges for data mining and machine learning. The transcription of patient records from paper charts and the subsequent manipulation of data often result in high volumes of noise as well as a loss of other important information. In addition, such datasets often fail to represent expert medical knowledge and reasoning in any explicit manner. In this research we describe applying data mining methods to retrospective clinical data to build a prediction model for asthma exacerbation severity for pediatric patients in the emergency department. Difficulties in building such a model forced us to investigate alternative strategies for analyzing and processing retrospective data. This paper describes this process together with an approach to mining retrospective clinical data by incorporating formalized external expert knowledge (secondary knowledge sources) into the classification task. This knowledge is used to partition the data into a number of coherent sets, where each set is explicitly described in terms of the secondary knowledge source. Instances from each set are then classified in a manner appropriate to the characteristics of the particular set. We present our methodology and outline a set of experimental results that demonstrate some advantages and some limitations of our approach. © 2008 Springer-Verlag Berlin Heidelberg.
Abstract:
This paper seeks to advance the theory and practice of the dynamics of complex networks in relation to direct and indirect citations. It applies social network analysis (SNA) and the ordered weighted averaging (OWA) operator to study a patent citations network. To date, SNA studies investigating long chains of patent citations have rarely been undertaken, and the importance of a node in a network has been associated mostly with its number of direct ties. In this research, OWA is used to analyse complex networks, assess the role of indirect ties, and provide guidance to reduce complexity for decision makers and analysts. An empirical example of a set of European patents published in 2000 in the renewable energy industry is provided to show the usefulness of the proposed approach for the preference ranking of patent citations.
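For readers unfamiliar with the OWA operator used above: it aggregates a set of values by first sorting them in descending order and then applying a fixed weight vector to the sorted positions, so the weights express an attitude (e.g. favouring the strongest ties) rather than attaching to particular inputs. A minimal sketch, with tie strengths and weights that are purely illustrative and not taken from the paper:

```python
def owa(values, weights):
    """Ordered weighted averaging: weights apply to positions in the
    descending-sorted values, not to the original inputs."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    assert len(weights) == len(values)
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Hypothetical tie strengths for one patent: a direct citation and two
# indirect citations reached through chains of length 2 and 3.
tie_strengths = [0.2, 1.0, 0.5]

# Weights biased toward the strongest ties (an "optimistic" aggregation).
weights = [0.6, 0.3, 0.1]

score = owa(tie_strengths, weights)
# 0.6 * 1.0 + 0.3 * 0.5 + 0.1 * 0.2 = 0.77
```

Because the weights act on ranked positions, the same weight vector can be reused across patents with very different citation profiles, which is what makes OWA convenient for preference ranking over a whole network.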
Abstract:
This article presents a literature review of current issues in the field of library science related to competence in the management and use of information, information and communication technology, and the information and knowledge society, among others. It further seeks to highlight the importance of users acquiring these skills so that they can deal effectively with decision-making, problem solving, conducting research and their own training. This is possible only if information and documentation systems act as dynamic agents engaged in distributing scientific and technical knowledge.
Abstract:
The objective of this research is to identify the factors that influence the migration from free software to proprietary software, or vice versa. The theoretical framework was developed in light of the Diffusion of Innovations Theory (DIT) proposed by Rogers (1976, 1995) and the Unified Theory of Acceptance and Use of Technology (UTAUT) proposed by Venkatesh, Morris, Davis and Davis (2003). The research was structured in two phases: the first phase was exploratory, characterized by adjustments of the revised theory to fit the Brazilian reality and by the identification of companies that could be the subject of investigation; the second phase was qualitative, in which case studies were conducted at ArcelorMittal Tubarão (AMT), a private company that migrated from proprietary software (Unix) to free software (Linux), and at the city government of Serra, in Espírito Santo state, a public organization that migrated from free software (OpenOffice) to proprietary software (MS Office). The results show that the software migration decision takes into account factors that go beyond technical or cost considerations, such as cultural barriers, user rejection and resistance to change. These results underscore the importance of social aspects, which can play a decisive role in the decision regarding software migration and its successful implementation.