896 results for Decision-support tools
Abstract:
Objectives: To develop a decision support system (DSS), myGRaCE, that integrates service user (SU) and practitioner expertise about mental health and associated risks of suicide, self-harm, harm to others, self-neglect, and vulnerability. The intention is to help SUs assess and manage their own mental health collaboratively with practitioners. Methods: An iterative process involving interviews, focus groups, and agile software development with 115 SUs, to elicit and implement myGRaCE requirements. Results: Findings highlight shared understanding of mental health risk between SUs and practitioners that can be integrated within a single model. However, important differences were revealed in SUs' preferred process of assessing risks and safety, which are reflected in the distinctive interface, navigation, tool functionality and language developed for myGRaCE. A challenge was how to provide flexible access without overwhelming and confusing users. Conclusion: The methods show that practitioner expertise can be reformulated in a format that simultaneously captures SU expertise, to provide a tool highly valued by SUs. A stepped process adds necessary structure to the assessment, each step with its own feedback and guidance. Practice Implications: The GRiST web-based DSS (www.egrist.org) links and integrates myGRaCE self-assessments with GRiST practitioner assessments for supporting collaborative and self-managed healthcare.
Abstract:
The use of digital games and gamification has demonstrated potential to improve many aspects of how businesses provide training to staff and communicate with consumers. However, there is still a need for a better understanding of how the adoption of games and gamification would influence the process of decision-making in organisations across different industries. This article provides a structured review of the existing literature on the use of games in the business environment, seeks to consolidate findings to address research questions regarding their perception and proven efficacy, and identifies key areas for future work. The findings highlight that serious games can have positive and effective impacts in multiple areas of a business, including training, decision support, and consumer outreach. They also emphasise the challenges and pitfalls of applying serious games and gamification principles within a business context, and discuss the implications of development and evaluation methodologies for the success of a game-based solution.
Abstract:
Introduction: There is increasing evidence that electronic prescribing (ePrescribing) or computerised provider/physician order entry (CPOE) systems can improve the quality and safety of healthcare services. However, it has also become clear that their implementation is not straightforward and may create unintended or undesired consequences once in use. In this context, qualitative approaches have been particularly useful and their interpretative synthesis could make an important and timely contribution to the field. This review will aim to identify, appraise and synthesise qualitative studies on ePrescribing/CPOE in hospital settings, with or without clinical decision support. Methods and analysis: Data sources will include the following bibliographic databases: MEDLINE, MEDLINE In Process, EMBASE, PsycINFO, Social Policy and Practice via Ovid, CINAHL via EBSCO, The Cochrane Library (CDSR, DARE and CENTRAL databases), Nursing and Allied Health Sources, Applied Social Sciences Index and Abstracts via ProQuest and SCOPUS. In addition, other sources will be searched for ongoing studies (ClinicalTrials.gov) and grey literature: Healthcare Management Information Consortium, Conference Proceedings Citation Index (Web of Science) and Sociological abstracts. Studies will be independently screened for eligibility by 2 reviewers. Qualitative studies, either standalone or in the context of mixed-methods designs, reporting the perspectives of any actors involved in the implementation, management and use of ePrescribing/CPOE systems in hospital-based care settings will be included. Data extraction will be conducted by 2 reviewers using a piloted form. Quality appraisal will be based on criteria from the Critical Appraisal Skills Programme checklist and Standards for Reporting Qualitative Research. Studies will not be excluded based on quality assessment. A postsynthesis sensitivity analysis will be undertaken. Data analysis will follow the thematic synthesis method. Ethics and dissemination: The study does not require ethical approval as primary data will not be collected. The results of the study will be published in a peer-reviewed journal and presented at relevant conferences.
Abstract:
Incomplete pairwise comparison matrices were introduced by Harker in 1987 for the case in which the decision maker does not fill in the whole matrix, e.g., due to time limitations. However, incomplete matrices occur in a natural way even if the decision maker ultimately provides a completely filled-in matrix: at each of the n(n−1)/2 filling-in steps the pairwise comparison matrix is incomplete, except for the last one, at which it becomes complete. Recent results on incomplete matrices make it possible to estimate the inconsistency indices CR and CM by computing tight lower bounds at each step of the filling-in process. Additional information on ordinal inconsistency is also provided. The results can be applied in any decision support system based on pairwise comparison matrices: the decision maker gets immediate feedback in the case of mistypes that might cause a high level of inconsistency.
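As an illustration of the kind of immediate feedback such a system can give, below is a minimal Python sketch of the standard eigenvalue-based computation of Saaty's consistency ratio CR for a complete pairwise comparison matrix. It is a sketch only: the tight lower bounds for incomplete matrices described above are not implemented, and the example matrix is hypothetical.

import numpy as np

# Saaty's random index RI by matrix size n
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def consistency_ratio(A):
    """CR = CI / RI, where CI = (lambda_max - n) / (n - 1)."""
    n = A.shape[0]
    lambda_max = max(np.linalg.eigvals(A).real)
    ci = (lambda_max - n) / (n - 1)
    return ci / RI[n]

# A fully consistent 3x3 reciprocal matrix (weights 6:3:1), so CR ~ 0;
# a mistyped entry would push CR above the usual 0.1 acceptance threshold.
A = np.array([[1.0, 2.0, 6.0],
              [0.5, 1.0, 3.0],
              [1/6, 1/3, 1.0]])
print(consistency_ratio(A))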
Abstract:
The traditional approach to crisis management suggests autocratic leadership, which carries risks of its own (the leader becomes the bottleneck of problem solving, learning stays single-loop, and crisis management is treated as a matter of efficiency). However, managing today's crises is rather a matter of effectiveness, requiring double-loop learning (second-order change) and a leadership role in the sense of Kotter's theory. The paper discusses top management's leadership responsibilities and their special tasks in the problem-solving process of change. Inappropriate perception of leadership responsibilities, and insistence on a first-order change strategy, results in becoming part of the problem rather than part of the solution.
Abstract:
The increasing emphasis on mass customization, shortened product lifecycles, and synchronized supply chains, coupled with advances in information systems, is driving most firms towards make-to-order (MTO) operations. Increasing global competition, lower profit margins, and higher customer expectations force MTO firms to plan their capacity by managing the effective demand. The goal of this research was to maximize the operational profits of a make-to-order operation by selectively accepting incoming customer orders and simultaneously allocating capacity for them at the sales stage. To integrate the two decisions, a Mixed-Integer Linear Program (MILP) was formulated which can aid an operations manager in an MTO environment to select a set of potential customer orders such that all the selected orders are fulfilled by their deadlines. The proposed model combines the order acceptance/rejection decision with detailed scheduling. Experiments with the formulation indicate that for larger problem sizes, the computational time required to determine an optimal solution is prohibitive. The formulation has a block diagonal structure and can be decomposed into one or more sub-problems (one per customer order) and a master problem by applying Dantzig-Wolfe decomposition principles. To solve the original MILP efficiently, an exact Branch-and-Price algorithm was developed. Various approximation algorithms were developed to further improve the runtime, and experiments unequivocally show the efficiency of these algorithms compared to a commercial optimization solver. The existing literature addresses the static order acceptance problem for a single-machine environment with regular capacity, an objective of maximizing profit, and a penalty for tardiness. This dissertation solves the order acceptance and capacity planning problem for a job-shop environment with multiple resources, considering both regular and overtime resources. The Branch-and-Price algorithms developed in this dissertation are fast enough to be incorporated in a decision support system that can be used on a daily basis to help make intelligent decisions in an MTO operation.
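To make the acceptance decision concrete, here is a minimal sketch using the open-source PuLP modeller: a knapsack-style relaxation that picks a profit-maximizing subset of orders under a single capacity budget. The dissertation's actual MILP additionally embeds detailed job-shop scheduling with deadlines and overtime and is solved by Branch-and-Price; the order data below are hypothetical.

from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

profit = [40, 25, 55, 30]   # profit per candidate order (hypothetical)
hours = [10, 6, 15, 8]      # capacity each order consumes
capacity = 24               # regular-time capacity available

prob = LpProblem("order_acceptance", LpMaximize)
accept = [LpVariable(f"accept_{i}", cat=LpBinary) for i in range(len(profit))]
prob += lpSum(p * a for p, a in zip(profit, accept))             # total profit
prob += lpSum(h * a for h, a in zip(hours, accept)) <= capacity  # capacity limit
prob.solve()
print([int(a.value()) for a in accept])  # 0/1 acceptance decisions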
Abstract:
Construction projects are complex endeavors that require the involvement of different professional disciplines in order to meet project objectives that are often conflicting. The level of complexity and the multi-objective nature of construction projects lend themselves to collaborative design and construction approaches such as integrated project delivery (IPD), in which the relevant disciplines work together during project conception, design and construction. Traditionally, the main objectives of construction projects have been to build in the least amount of time at the lowest possible cost, so the inherent and well-established relationship between cost and time has been the focus of many studies. The importance of being able to effectively model relationships among multiple objectives in building construction has been emphasized in a wide range of research. In general, the trade-off relationship between time and cost is well understood and there is ample research on the subject. However, despite the push for sustainable building designs, the relationships between time and environmental impact, and between cost and environmental impact, have not been fully investigated. The objectives of this research were mainly to analyze and identify the relationships of time, cost, and environmental impact, in terms of CO2 emissions, at different levels of a building (material, component, and building level) during the pre-use phase, including manufacturing and construction, and the relationships of life cycle cost and life cycle CO2 emissions during the usage phase. Additionally, this research aimed to develop a robust simulation-based multi-objective decision-support tool, called SimulEICon, which takes construction data uncertainty into account and is capable of incorporating life cycle assessment information into the decision-making process. The findings of this research supported the trade-off relationship between time and cost at the different building levels. Moreover, the relationship between time and CO2 emissions showed trade-off behavior at the pre-use phase. Interestingly, cost and CO2 emissions were found to be proportional at the pre-use phase, a pattern that persisted from construction through the usage phase. Understanding the relationships between these objectives is key to successfully planning and designing environmentally sustainable construction projects.
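A core operation in any such multi-objective tool is reducing the candidate designs to the non-dominated (Pareto) set before presenting trade-offs to the decision maker. The following minimal Python sketch performs that filtering over hypothetical (time, cost, CO2) alternatives, with all three objectives minimized; it is illustrative only and not part of SimulEICon.

def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly better in one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(alternatives):
    return [a for a in alternatives
            if not any(dominates(b, a) for b in alternatives if b != a)]

# (time in days, cost in $1000s, CO2 in tonnes) per hypothetical design
alts = [(120, 950, 310), (135, 880, 290), (110, 990, 350), (135, 900, 300)]
print(pareto_front(alts))  # (135, 900, 300) is dropped: dominated by (135, 880, 290)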
Abstract:
The work presented in this thesis concerns the development of an alerting system able to proactively monitor one or more company data sources and flag any irregular conditions it detects; the system is to be embedded within existing tools dedicated to data analysis and planning, that is, Decision Support Systems. A decision support system can provide clear information for the management of the whole enterprise, measuring its performance and providing projections of future trends. These systems fall within the broader field of Business Intelligence, which encompasses the set of methodologies capable of transforming business data into information useful to the decision-making process. The entire thesis work was carried out during an internship at Iconsulting S.p.A., a Bologna-based IT system integrator specialised mainly in the development of Business Intelligence, Enterprise Data Warehouse and Corporate Performance Management projects. The software illustrated in this thesis was built to fit within a wider context, meeting the requirements of a multinational client that is a leader in the mobile and fixed telephony sector.
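The core alerting pattern can be sketched as follows: take a snapshot of metrics from a data source, evaluate each monitoring rule, and emit an alert whenever a value falls outside its expected range. This is a minimal Python sketch; the metric names and thresholds are hypothetical, and the actual system was built against the client's data warehouse.

from dataclasses import dataclass

@dataclass
class Rule:
    metric: str
    low: float
    high: float

def check(snapshot: dict, rules: list) -> list:
    """Return an alert message for every rule whose metric is out of range."""
    alerts = []
    for r in rules:
        value = snapshot.get(r.metric)
        if value is not None and not (r.low <= value <= r.high):
            alerts.append(f"ALERT: {r.metric}={value} outside [{r.low}, {r.high}]")
    return alerts

snapshot = {"daily_activations": 180, "churn_rate": 0.09}  # hypothetical KPIs
rules = [Rule("daily_activations", 500, 5000), Rule("churn_rate", 0.0, 0.05)]
for alert in check(snapshot, rules):
    print(alert)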
Abstract:
This paper derives a theoretical framework for considering both the technologically driven dimensions of mobile payment solutions and the associated value proposition for customers. Banks promote traditional payment instruments whose value proposition is the management of risk for both consumers and merchants. These instruments are centralised, costly, and lack decision-support functionality. The ubiquity of the mobile phone has provided a decentralised platform for managing payment processes in a new way, but the value proposition for customers has yet to be elaborated clearly. This inertia has stalled the design of sustainable revenue models for a mobile payments ecosystem. Merchants and consumers, in the meantime, are being seduced by the convenience of online and mobile payment solutions. Adopting the purchase and payment process as the unit of analysis, the current mobile payment landscape is reviewed with respect to the creation and consumption of customer value. From this analysis, a framework is derived juxtaposing customer value, related to what is being paid for, with payment integration, related to how payments are being made. The framework provides a theoretical and practical basis for considering the contribution of mobile technologies to the payment industry. The framework is then used to describe the components of a mobile payments pilot project run with a trial population of 250 students on a campus in Ireland; in this manner, weaknesses in the value proposition for consumers and merchants were highlighted. Limitations of the framework as a research tool are also discussed.
Abstract:
Precision medicine is an emerging approach to disease treatment and prevention that considers variability in patient genes, environment, and lifestyle. However, little has been written about how such research impacts emergency care. Recent advances in analytical techniques have made it possible to characterize patients in a more comprehensive and sophisticated fashion at the molecular level, promising highly individualized diagnosis and treatment. Among these techniques are various systematic molecular phenotyping analyses (e.g., genomics, transcriptomics, proteomics, and metabolomics). Although a number of emergency physicians use such techniques in their research, widespread discussion of these approaches has been lacking in the emergency care literature and many emergency physicians may be unfamiliar with them. In this article, we briefly review the underpinnings of such studies, note how they already impact acute care, discuss areas in which they might soon be applied, and identify challenges in translation to the emergency department (ED). While such techniques hold much promise, it is unclear whether the obstacles to translating their findings to the ED will be overcome in the near future. Such obstacles include validation, cost, turnaround time, user interface, decision support, standardization, and adoption by end-users.
Abstract:
BACKGROUND: It is unclear whether diagnostic protocols based on cardiac markers to identify low-risk chest pain patients suitable for early release from the emergency department can be applied to patients older than 65 years or with traditional cardiac risk factors. METHODS AND RESULTS: In a single-center retrospective study of 231 consecutive patients with a high risk-factor burden, in which a first cardiac troponin (cTn) level was measured in the emergency department and a second cTn sample was drawn 4 to 14 hours later, we compared the performance of a modified 2-Hour Accelerated Diagnostic Protocol to Assess Patients with Chest Pain Using Contemporary Troponins as the Only Biomarker (ADAPT) rule to a new risk classification scheme that identifies patients as low risk if they have no known coronary artery disease, a nonischemic electrocardiogram, and 2 cTn levels below the assay's limit of detection. Demographic and outcome data were abstracted through chart review. The median age of our population was 64 years, and 75% had a Thrombolysis In Myocardial Infarction (TIMI) risk score ≥2. Using our risk classification rule, 53 (23%) patients were low risk, with a negative predictive value for 30-day cardiac events of 98%. Applying a modified ADAPT rule to our cohort, 18 (8%) patients were identified as low risk, with a negative predictive value of 100%. In a sensitivity analysis, the negative predictive value of our risk algorithm did not change when we relied only on an undetectable baseline cTn and eliminated the second cTn assessment. CONCLUSIONS: If confirmed in prospective studies, this less-restrictive risk classification strategy could be used to safely identify chest pain patients with more traditional cardiac risk factors for early emergency department release.
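For illustration only, the study's low-risk rule as stated above can be expressed as a simple predicate. The limit-of-detection value below is a hypothetical placeholder (it is assay-specific), and this sketch is not a validated clinical tool.

# Hypothetical assay limit of detection, in ng/mL
LIMIT_OF_DETECTION = 0.01

def is_low_risk(known_cad: bool, ischemic_ecg: bool,
                ctn_first: float, ctn_second: float) -> bool:
    # Low risk per the abstract: no known coronary artery disease,
    # a nonischemic ECG, and both troponin values below the limit of detection.
    return (not known_cad and not ischemic_ecg
            and ctn_first < LIMIT_OF_DETECTION
            and ctn_second < LIMIT_OF_DETECTION)

print(is_low_risk(False, False, 0.004, 0.006))  # True: candidate for early release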
Abstract:
The amount and quality of available biomass is a key factor for a sustainable livestock industry and for agricultural-management decision making. Globally, 31.5% of land cover is grassland, while 80% of Ireland's agricultural land is grassland. In Ireland, grasslands are intensively managed and provide the cheapest feed source for animals. This dissertation presents a detailed state-of-the-art review of satellite remote sensing of grasslands, and the potential application of optical (Moderate-Resolution Imaging Spectroradiometer (MODIS)) and radar (TerraSAR-X) time series imagery to estimate grassland biomass at two study sites (Moorepark and Grange) in the Republic of Ireland, using both statistical and state-of-the-art machine learning algorithms. High-quality weather data available from the on-site weather station were also used to calculate the Growing Degree Days (GDD) for Grange, to determine the impact of ancillary data on biomass estimation. In situ and satellite data covering 12 years for Moorepark and 6 years for Grange were used to predict grassland biomass using multiple linear regression and Adaptive Neuro-Fuzzy Inference System (ANFIS) models. The results demonstrate that a dense (8-day composite) MODIS image time series, along with high-quality in situ data, can be used to retrieve grassland biomass with high performance (R² = 0.86, p < 0.05, RMSE = 11.07 for Moorepark). The model for Grange was modified to evaluate the synergistic use of vegetation indices derived from remote sensing time series and accumulated GDD information. As GDD is strongly linked to plant development, or phenological stage, an improvement in biomass estimation would be expected. Using the ANFIS model, the biomass estimation accuracy increased from R² = 0.76 (p < 0.05) to R² = 0.81 (p < 0.05) and the root mean square error was reduced by 2.72%. The work on the application of optical remote sensing was further developed using a TerraSAR-X Staring Spotlight mode time series over the Moorepark study site, to explore the extent to which very high resolution Synthetic Aperture Radar (SAR) data of interferometrically coherent paddocks can be exploited to retrieve grassland biophysical parameters. After filtering out the non-coherent plots, it is demonstrated that interferometric coherence can be used to retrieve grassland biophysical parameters (i.e., height, biomass), and that it is possible to detect changes due to grass growth and to grazing and mowing events when the temporal baseline is short (11 days). However, it is not possible to uniquely identify the cause of these changes based only on the SAR backscatter and coherence, owing to the ambiguity caused by tall grass laid down by the wind. Overall, the work presented in this dissertation has demonstrated the potential of dense remote sensing and weather data time series to predict grassland biomass using machine-learning algorithms, where high-quality ground data were used for training. At present, a major limitation for national-scale biomass retrieval is the lack of spatial and temporal ground samples, which can be partially resolved by minor modifications to the existing PastureBaseIreland database, namely adding the location and extent of each grassland paddock to the database.
As far as remote sensing data requirements are concerned, MODIS is useful for large-scale evaluation, but due to its coarse resolution it is not possible to detect variations within and between fields at the farm scale. However, this issue will be resolved in terms of spatial resolution by the Sentinel-2 mission, and when both satellites (Sentinel-2A and Sentinel-2B) are operational the revisit time will be reduced to 5 days, which, together with Landsat-8, should provide sufficient cloud-free data for operational biomass estimation at a national scale. The Synthetic Aperture Radar Interferometry (InSAR) approach is feasible if enough coherent interferometric pairs are available; however, this is difficult to achieve due to temporal decorrelation of the signal. For repeat-pass InSAR over a vegetated area, even an 11-day temporal baseline is too large. To achieve better coherence, a very high resolution is required at the cost of spatial coverage, which limits its scope for use in an operational context at a national scale. Future InSAR missions with pair acquisition in tandem mode will minimize the temporal decorrelation over vegetated areas for more focused studies. The proposed approach complements the current paradigm of Big Data in Earth Observation and illustrates the feasibility of integrating data from multiple sources. In future, this framework can be used to build an operational decision support system for the retrieval of grassland biophysical parameters based on data from long-term planned optical missions (e.g., Landsat, Sentinel) that will ensure the continuity of data acquisition. Similarly, the Spanish X-band PAZ and TerraSAR-X2 missions will ensure the continuity of TerraSAR-X and COSMO-SkyMed.
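As a concrete illustration of the simplest model family used in the dissertation, the sketch below fits a multiple linear regression predicting biomass from a vegetation index and accumulated GDD with scikit-learn. The feature values and biomass figures are hypothetical; the reported models were trained on MODIS composites and on-site weather data, with ANFIS performing best.

import numpy as np
from sklearn.linear_model import LinearRegression

# columns: [NDVI from an 8-day MODIS composite, accumulated GDD]; values hypothetical
X = np.array([[0.55, 120], [0.62, 180], [0.70, 260], [0.66, 220], [0.58, 150]])
y = np.array([900, 1250, 1700, 1500, 1050])  # biomass, kg DM/ha (hypothetical)

model = LinearRegression().fit(X, y)
print(model.score(X, y))             # in-sample R^2
print(model.predict([[0.68, 240]]))  # predicted biomass for an unseen paddock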
Abstract:
Accessibility concepts are increasingly acknowledged as fundamental to understanding cities and urban regions. Accordingly, accessibility instruments have been recognised as valuable support tools for land-use and transport planning. However, despite the relatively large number of instruments available in the literature, they are not widely used in planning practice. This paper aims to explore why. To this end, we focus our research on the perceived user-friendliness and usefulness of accessibility instruments. First, we surveyed instrument developers, providing an overview of the characteristics of available accessibility instruments and of developers' perceptions of their user-friendliness in planning practice. Second, we brought developers and planning practitioners together in local workshops across Europe and Australia, where participants were asked to use insights provided by accessibility instruments to develop planning strategies. We found that most practitioners are convinced of the usefulness of accessibility instruments in planning practice, as they generate new and relevant insights for planners. Findings suggest that not only user-friendliness problems but mainly organisational barriers and the lack of institutionalisation of accessibility instruments are the main causes of the implementation gap. Thus, improving user-friendliness may contribute only modestly to the successful implementation of accessibility concepts in planning practice; there seems to be more to gain from the active and continued engagement of instrument developers with planning practitioners and from the institutionalisation of accessibility planning.
Abstract:
Abstract: Decision support systems have been used for years in companies to gain insights from internal data and thus make successful decisions. Lately, thanks to the increasing availability of open data, these systems are also integrating open data to enrich the decision-making process with external data. On the other hand, within an open-data scenario, decision support systems can also be useful for deciding which data should be opened, considering not only technical or legal constraints but also other requirements, such as the "reuse potential" of the data. In this talk, we focus on both issues: (i) open data for decision making, and (ii) decision making for opening data. We will first briefly comment on some research problems regarding the use of open data for decision making. Then, we will outline a novel decision-making approach (based on how open data is actually used in open-source projects hosted on GitHub) for supporting open data publication.
Bio of the speaker: Jose-Norberto Mazón holds a PhD from the University of Alicante (Spain). He is head of the "Cátedra Telefónica" on Big Data and coordinator of the Computing degree at the University of Alicante. He is also a member of the WaKe research group at the University of Alicante. His research focuses on open data management, data integration and business intelligence within "big data" scenarios, and their application to the tourism domain (smart tourism destinations). He has published his research in international journals such as Decision Support Systems, Information Sciences, Data & Knowledge Engineering and ACM Transactions on the Web. Finally, he is involved in the open data project at the University of Alicante, including its open data portal at http://datos.ua.es