825 results for decision support tool
Abstract:
Acoustics is a rich source of environmental information that can reflect ecological dynamics. To deal with escalating volumes of acoustic data, a variety of automated classification techniques have been used for acoustic pattern or scene recognition, including urban soundscapes such as streets and restaurants, and natural soundscapes such as rain and thunder. It is common to classify acoustic patterns under the assumption that a single type of soundscape is present in an audio clip. This assumption is reasonable for some carefully selected audio clips. However, few experiments have focused on classifying simultaneous acoustic patterns in long-duration recordings. This paper proposes a binary-relevance-based multi-label classification approach to recognise simultaneous acoustic patterns in one-minute audio clips. By using acoustic indices as global features and a multilayer perceptron as the base classifier, we achieve good classification performance on in-the-field data. Compared with single-label classification, the multi-label approach provides more detailed information about the distributions of various acoustic patterns in long-duration recordings. These results will benefit further biodiversity investigations, such as bird species surveys.
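To make the binary-relevance idea above concrete, here is a minimal sketch in Python, assuming scikit-learn's MLPClassifier as the base learner; the acoustic-index features and the four pattern labels are synthetic placeholders, not the paper's data or feature set.

```python
# Minimal binary-relevance sketch: one MLP per acoustic pattern (label).
# X is a matrix of acoustic indices per one-minute clip; Y is a binary
# label matrix (clips x patterns). All values are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 12))                     # 200 clips, 12 acoustic indices
Y = (rng.random((200, 4)) > 0.7).astype(int)  # 4 hypothetical patterns (e.g. birds, rain, insects, wind)

# Binary relevance: train an independent base classifier for each label.
models = [MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X, Y[:, j])
          for j in range(Y.shape[1])]

# Predict the set of simultaneous patterns for a new clip.
x_new = rng.random((1, 12))
predicted = [j for j, m in enumerate(models) if m.predict(x_new)[0] == 1]
print("active pattern indices:", predicted)
```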
Abstract:
This thesis investigates factors that impact the energy efficiency of a mining operation. An innovative mathematical framework and solution approach are developed to model, solve and analyse an open-pit coal mine. A case study in South East Queensland is investigated to validate the approach and explore opportunities for using it to aid long-, medium- and short-term decision makers.
Abstract:
The aim of this study was to identify and describe the clinical reasoning characteristics of diagnostic experts. A group of 21 experienced general practitioners were asked to complete the Diagnostic Thinking Inventory (DTI) and a set of 10 clinical reasoning problems (CRPs) to evaluate their clinical reasoning. Both the DTI and the CRPs were scored, and the CRP response patterns of each GP examined in terms of the number and type of errors contained in them. Analysis of these data showed that six GPs were able to reach the correct diagnosis using significantly less clinical information than their colleagues. These GPs also made significantly fewer interpretation errors but scored lower on both the DTI and the CRPs. Additionally, this analysis showed that more than 20% of misdiagnoses occurred despite no errors being made in the identification and interpretation of relevant clinical information. These results indicate that these six GPs diagnose efficiently, effectively and accurately using relatively few clinical data and can therefore be classified as diagnostic experts. They also indicate that a major cause of misdiagnoses is failure to properly integrate clinical data. We suggest that increased emphasis on this step in the reasoning process should prove beneficial to the development of clinical reasoning skill in undergraduate medical students.
Abstract:
Modern database systems incorporate a query optimizer to identify the most efficient "query execution plan" for executing the declarative SQL queries submitted by users. A dynamic-programming-based approach is used to exhaustively enumerate the combinatorially large search space of plan alternatives and, using a cost model, to identify the optimal choice. While dynamic programming (DP) works very well for moderately complex queries with up to around a dozen base relations, it usually fails to scale beyond this stage due to its inherent exponential space and time complexity. Therefore, DP becomes practically infeasible for complex queries with a large number of base relations, such as those found in current decision-support and enterprise management applications. To address the above problem, a variety of approaches have been proposed in the literature. Some completely jettison the DP approach and resort to alternative techniques such as randomized algorithms, whereas others have retained DP by using heuristics to prune the search space to computationally manageable levels. In the latter class, a well-known strategy is "iterative dynamic programming" (IDP), wherein DP is employed bottom-up until it hits its feasibility limit and is then iteratively restarted with a significantly reduced subset of the execution plans currently under consideration. The experimental evaluation of IDP indicated that, with an appropriate choice of algorithmic parameters, it was possible to almost always obtain "good" (within a factor of two of the optimal) plans, in the few remaining cases mostly "acceptable" (within an order of magnitude of the optimal) plans, and only rarely a "bad" plan. While IDP is certainly an innovative and powerful approach, we have found that there are a variety of common query frameworks wherein it can fail to consistently produce good plans, let alone the optimal choice. This is especially so when star or clique components are present, increasing the complexity of the join graphs. Worse, this shortcoming is exacerbated when the number of relations participating in the query is scaled upwards.
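As an illustration of the bottom-up enumeration that IDP truncates and restarts, here is a minimal sketch of Selinger-style dynamic programming over join subsets; the base-table cardinalities, the flat join selectivity, and the additive cost model are toy assumptions, not the optimizer's actual cost model.

```python
# Sketch of bottom-up dynamic-programming join enumeration (the procedure
# that IDP truncates and restarts). Relations and the cost model are toy
# placeholders: cost(plan) = sum of estimated intermediate cardinalities.
from itertools import combinations

card = {"A": 1000, "B": 500, "C": 2000, "D": 100}   # base-table cardinalities (hypothetical)
SEL = 0.01                                          # flat join selectivity (assumption)

def join_card(c1, c2):
    return c1 * c2 * SEL

# subset of relations -> (plan tree, output cardinality, accumulated cost)
best = {frozenset([r]): (r, card[r], 0.0) for r in card}

for size in range(2, len(card) + 1):
    for subset in map(frozenset, combinations(card, size)):
        for k in range(1, size):
            for left in map(frozenset, combinations(subset, k)):
                right = subset - left
                lp, lc, lcost = best[left]
                rp, rc, rcost = best[right]
                out = join_card(lc, rc)
                cost = lcost + rcost + out
                if subset not in best or cost < best[subset][2]:
                    best[subset] = ((lp, rp), out, cost)

plan, _, cost = best[frozenset(card)]
print("best plan:", plan, "estimated cost:", cost)
```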
Abstract:
The Intelligent Decision Support System (IDSS), also called an expert system, is explained. It is then applied to choose the right composition and firing temperature of a ZnO-based varistor. 17 refs.
Abstract:
The IRTORISKI project investigated how the use of cost–benefit analysis in climate change adaptation planning could be made easier, so that it could be applied cost-effectively both for prioritising climate-related hazards and for comparing preventive measures. The study used a river flood and a flood caused by heavy rainfall in an urban setting as example cases. Event tree analysis was extended so that it captures both the direct damages and the resulting macroeconomic impacts. Estimates of direct economic damages were based on earlier studies, while the macroeconomic impacts were simulated with a general equilibrium model. The selection of the case studies, the use of the event tree and its extension, and the computed macroeconomic impacts were discussed with stakeholder representatives in three expert sessions.
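A minimal sketch of the extended event-tree idea, assuming illustrative branch probabilities, direct-damage figures, and a flat macroeconomic multiplier standing in for the general equilibrium model; none of the numbers come from the project.

```python
# Illustrative extended event tree: each branch carries a probability, a
# direct damage estimate, and a macroeconomic impact (approximated here by
# a simple multiplier; the project used a general equilibrium model).
# All numbers are made up for this sketch.
branches = [
    # (description, annual probability, direct damage in EUR)
    ("river flood, defences hold",          0.05, 1.0e6),
    ("river flood, defences fail",          0.01, 2.5e7),
    ("pluvial flood in urban area, minor",  0.10, 5.0e5),
    ("pluvial flood in urban area, severe", 0.02, 8.0e6),
]

MACRO_MULTIPLIER = 1.4  # stand-in for model-derived indirect effects (assumption)

expected_direct = sum(p * d for _, p, d in branches)
expected_total = sum(p * d * MACRO_MULTIPLIER for _, p, d in branches)

print(f"expected annual direct damage: {expected_direct:,.0f} EUR")
print(f"expected annual total impact:  {expected_total:,.0f} EUR")
```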
Abstract:
Reduction of carbon emissions is of paramount importance in the context of global warming. Countries and global companies are now engaged in understanding systematic ways of achieving well-defined emission targets. In fact, carbon credits have become significant and strategic instruments of finance for countries and global companies. In this paper, we formulate and suggest a solution to the carbon allocation problem, which involves determining a cost-minimizing allocation of carbon credits among different emitting agents. We address this challenge in the context of a global company that must determine an allocation of carbon credit caps among its divisions in a cost-effective way. The problem is formulated as a reverse auction in which the company plays the role of a buyer or carbon planning authority and the different divisions within the company are the emitting agents that specify cost curves for carbon credit reductions. Two natural variants of the problem are considered: (a) with an unlimited budget and (b) with a limited budget. Suitable assumptions are made on the cost curves, and in each of the two cases we show that the resulting problem formulation is a knapsack problem that can be solved optimally using a greedy heuristic. The solution of the allocation problem provides critical decision support to global companies seriously engaged in green programs.
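A minimal sketch of the greedy idea for the limited-budget variant, assuming hypothetical stepwise cost curves per division and fractional purchases of reduction blocks; it simply orders blocks by marginal cost and does not reproduce the paper's auction formulation or assumptions.

```python
# Greedy sketch of budget-limited carbon allocation: each division reports a
# stepwise cost curve (successive reduction blocks with a cost per tonne);
# the planner buys reductions in increasing order of marginal cost until the
# budget is exhausted. Cost curves and budget are hypothetical.
divisions = {
    "manufacturing": [(1000, 12.0), (1000, 20.0)],   # (tonnes, cost per tonne)
    "logistics":     [(500, 8.0), (500, 15.0)],
    "data_centres":  [(800, 10.0), (400, 25.0)],
}
budget = 30000.0

# Flatten all blocks and sort by marginal cost.
blocks = sorted(
    ((cost, tonnes, div) for div, curve in divisions.items() for tonnes, cost in curve),
    key=lambda b: b[0],
)

allocation = {div: 0.0 for div in divisions}
for cost, tonnes, div in blocks:
    if budget <= 0:
        break
    affordable = min(tonnes, budget / cost)   # fractional blocks allowed (assumption)
    allocation[div] += affordable
    budget -= affordable * cost

print("reduction allocation (tonnes):", allocation)
```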
Abstract:
Growing concern over the status of global and regional bioenergy resources has necessitated the analysis and monitoring of land cover and land use parameters on spatial and temporal scales. Knowledge of land cover and land use is very important in understanding natural resource utilization, conversion and management. Land cover, land use intensity and land use diversity are land quality indicators for sustainable land management. Optimal management of resources aids in maintaining the ecosystem balance and thereby ensures the sustainable development of a region. Thus sustainable development of a region requires a synoptic ecosystem approach to the management of natural resources that relates to the dynamics of natural variability and the effects of human intervention on key indicators of biodiversity and productivity. Spatial and temporal tools such as remote sensing (RS), geographic information systems (GIS) and the global positioning system (GPS) provide spatial and attribute data at regular intervals and, with the functionalities of a decision support system, aid in visualisation, querying and analysis, thereby supporting sustainable management of natural resources. Remote sensing data and GIS technologies play an important role in spatially evaluating bioresource availability and demand. This paper explores various land cover and land use techniques that could be used for bioresource monitoring, considering the spatial data of Kolar district, Karnataka state, India. Slope- and distance-based vegetation indices are computed for qualitative and quantitative assessment of land cover using remote spectral measurements. Different-scale mapping of the land use pattern in Kolar district is done using supervised classification approaches. Slope-based vegetation indices show the area under vegetation ranging from 47.65% to 49.05%, while distance-based vegetation indices show a range of 40.40% to 47.41%. Land use analysis using the maximum likelihood classifier indicates that 46.69% is agricultural land, 42.33% is wasteland (barren land), 4.62% is built up, 3.07% is plantation, 2.77% is natural forest and 0.53% is water bodies. The comparative analysis of various classifiers indicates that the Gaussian maximum likelihood classifier has the fewest errors. The computation of taluk-wise bioresource status shows that Chikballapur Taluk has better availability of resources compared to other taluks in the district.
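As an example of the slope-based family of indices mentioned above, here is a minimal NDVI computation, assuming synthetic red and near-infrared reflectance values and an arbitrary vegetation threshold rather than the Kolar district imagery.

```python
# Minimal sketch of a slope-based vegetation index (NDVI) computed from
# red and near-infrared reflectance; the band values are synthetic.
import numpy as np

red = np.array([[0.12, 0.30], [0.08, 0.25]])   # red-band reflectance
nir = np.array([[0.45, 0.35], [0.50, 0.28]])   # NIR-band reflectance

ndvi = (nir - red) / (nir + red + 1e-9)        # slope-based index in [-1, 1]
vegetated = ndvi > 0.3                         # simple threshold (assumption)

print(ndvi.round(2))
print("fraction under vegetation:", vegetated.mean())
```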
Abstract:
It is a well-known fact that most developing countries have intermittent water supply, and the quantity of water supplied from the source is not distributed equitably among consumers. Aged pipelines, pump failures, and improper management of water resources are some of the main reasons for this. This study presents the application of a nonlinear control technique to overcome this problem in different zones of the city of Bangalore. Water is pumped to the city over a distance of approximately 100 km and a rise in elevation of approximately 400 m. The city has a large, undulating terrain across its zones, which leads to unequal distribution of water. The Bangalore inflow water-distribution system (WDS) has been modeled. A dynamic inversion (DI) nonlinear controller with proportional-integral-derivative (PID) features (DI-PID) is used for valve throttling to achieve the target flows to different zones of the city. This novel approach to equitable water distribution using DI-PID controllers, which can be used as a decision support system, is discussed in this paper.
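A minimal sketch of the PID portion of the DI-PID idea, assuming a toy first-order plant in place of the modeled Bangalore WDS hydraulics and the dynamic-inversion term; gains, flows, and the actuation scaling are illustrative, not the paper's values.

```python
# Minimal discrete PID loop for throttling a valve toward a target zone flow.
# The dynamic-inversion term and the actual WDS hydraulics are replaced by a
# toy first-order plant; all parameters are illustrative.
KP, KI, KD = 0.8, 0.2, 0.05       # controller gains (illustrative)
DT = 1.0                          # time step [s]

target_flow = 120.0               # desired zone inflow [L/s] (hypothetical)
flow = 80.0                       # current flow
valve = 0.5                       # valve opening in [0, 1]
integral, prev_error = 0.0, 0.0

for step in range(50):
    error = target_flow - flow
    integral += error * DT
    derivative = (error - prev_error) / DT
    valve += (KP * error + KI * integral + KD * derivative) * 0.001  # scaled actuation
    valve = min(max(valve, 0.0), 1.0)
    prev_error = error
    # Toy plant: flow relaxes toward a valve-dependent capacity.
    flow += 0.3 * (200.0 * valve - flow) * DT

print(f"valve={valve:.2f}, flow={flow:.1f} L/s")
```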
Abstract:
Most pattern mining methods yield a large number of frequent patterns, and isolating a small, relevant subset of patterns is a challenging problem of current interest. In this paper, we address this problem in the context of discovering frequent episodes from symbolic time-series data. Motivated by the Minimum Description Length principle, we formulate the problem of selecting a relevant subset of patterns as one of searching for a subset of patterns that achieves the best data compression. We present algorithms for discovering small sets of relevant non-redundant episodes that achieve good data compression. The algorithms employ a novel encoding scheme and use serial episodes with inter-event constraints as the patterns. We present extensive simulation studies with both synthetic and real data, comparing our method with existing schemes such as GoKrimp and SQS. We also demonstrate the effectiveness of these algorithms on event sequences from a composable conveyor system; this system represents a new application area where the use of frequent patterns for compressing the event sequence is likely to be important for decision support and control.
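A minimal sketch of MDL-style greedy episode selection, assuming a highly simplified description-length function and a made-up event sequence and candidate set; the paper's encoding scheme and inter-event constraints are not reproduced.

```python
# Greedy MDL-style selection sketch: repeatedly add the candidate episode
# that most reduces a (very simplified) description length of the sequence.
def dl(seq, patterns):
    """Toy description length: model cost + cost of the rewritten sequence."""
    model_bits = sum(8 * len(p) for p in patterns)
    s = seq
    for p in patterns:
        s = s.replace(p, "#")          # each occurrence becomes one reference symbol
    return model_bits + 8 * len(s)

sequence = "abcabcxyabcqrxyxyabc"       # symbolic event stream (made up)
candidates = ["abc", "xy", "bca", "qr"] # candidate episodes (made up)

chosen = []
best = dl(sequence, chosen)
improved = True
while improved:
    improved = False
    for c in (c for c in candidates if c not in chosen):
        score = dl(sequence, chosen + [c])
        if score < best:
            best, pick, improved = score, c, True
    if improved:
        chosen.append(pick)

print("selected episodes:", chosen, "description length:", best)
```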
Abstract:
This publication introduces the methods and results of a research project that has developed a set of decision-support tools to identify places and sets of conditions for which a particular target aquaculture technology is considered feasible and therefore good to promote. The tools also identify the nature of constraints to aquaculture development and thereby shed light on appropriate interventions to realize the potential of the target areas. The project results will be useful for policy planners and decision makers in national, regional and local governments and development funding agencies, aquaculture extension workers in regional and local governments, and researchers in aquaculture systems and rural livelihoods. (Document contains 40 pages)
Abstract:
This monograph is the result of a 3-year project to produce a decision-support toolkit with supporting databases and case studies to help researchers, planners and extension agents working on freshwater pond aquaculture. The purpose of the work was to provide tools and information to help practitioners identify places and conditions where pond aquaculture can benefit the poor, both as producers and as consumers of fish. This monograph is the case study for Malawi. Written in three parts, it describes the historical background, practices, stakeholder profiles, production levels, economic and institutional environment, policy issues, and prospects for aquaculture in the country. First, it documents the history and current status of aquaculture in the country. Second, it assesses the technologies and approaches that either succeeded or failed to foster aquaculture development and discusses why. Third, it identifies the key reasons for aquaculture adoption.
Abstract:
Harmful Algal Research and Response: A Human Dimensions Strategy (HARR-HD) justifies and guides a coordinated national commitment to human dimensions research critical to prevent and respond to impacts of harmful algal blooms (HABs). Beyond HABs, it serves as a framework for developing human dimensions research as a cross-cutting priority of ecosystem science supporting coastal and ocean management, including hazard research and mitigation planning. Measuring and promoting community resilience to hazards require human dimensions research outcomes such as effective risk communication strategies; assessment of community vulnerability; identification of susceptible populations; comprehensive assessment of environmental, sociocultural, and economic impacts; development of effective decision support tools; and improved coordination among agencies and stakeholders. HARR-HD charts a course for human dimensions research to achieve these and other priorities through coordinated implementation by the Joint Subcommittee on Ocean Science and Technology (JSOST) Interagency Working Group on HABs, Hypoxia and Human Health (IWG-4H); national HAB funding programs; national research and response programs; and state research and monitoring programs. (PDF contains 72 pages)
Abstract:
The implementation of various types of marine protected areas is one of several management tools available for conserving representative examples of the biological diversity within marine ecosystems in general and National Marine Sanctuaries in particular. However, deciding where and how many sites to establish within a given area is frequently hampered by incomplete knowledge of the distribution of organisms and an understanding of the potential tradeoffs that would allow planners to address frequently competing interests in an objective manner. Fortunately, this is beginning to change. Recent studies on the continental shelf of the northeastern United States suggest that substrate and water mass characteristics are highly correlated with the composition of benthic communities and may therefore serve as proxies for the distribution of biological diversity. A detailed geo-referenced interpretative map of major sediment types within Stellwagen Bank National Marine Sanctuary (SBNMS) has recently been developed, and computer-aided decision support tools have reached new levels of sophistication. We demonstrate the use of simulated annealing, a type of mathematical optimization, to identify suites of potential conservation sites within SBNMS that equally represent 1) all major sediment types and 2) derived habitat types based on both sediment and depth in the smallest amount of space. The Sanctuary was divided into 3,610 sampling units of 0.5 min² each. Simulations incorporated constraints on the physical dispersion of sampling units to varying degrees such that solutions included between one and four site clusters. Target representation goals were set at 5, 10, 15, 20, and 25 percent of each sediment type, and 10 and 20 percent of each habitat type. Simulations consisted of 100 runs, from which we identified the best solution (i.e., smallest total area) and four near-optimal alternates. We also plotted the total instances in which each sampling unit occurred in the solution sets of the 100 runs as a means of gauging the variety of spatial configurations available under each scenario. Results suggested that the total combined area needed to represent each of the sediment types in equal proportions was equal to the percent representation level sought. Slightly larger areas were required to represent all habitat types at the same representation levels. Total boundary length increased in direct proportion to the number of sites at all levels of representation for simulations involving sediment and habitat classes, but increased more rapidly with the number of sites at higher representation levels. There were a large number of alternate spatial configurations at all representation levels, although generally fewer among one- and two-site versus three- and four-site solutions. These differences were less pronounced among simulations targeting habitat representation, suggesting that a similar degree of flexibility is inherent in the spatial arrangement of potential protected area systems containing one versus several sites for similar levels of habitat representation. We attribute these results to the distribution of sediment and depth zones within the Sanctuary, and to the fact that even levels of representation were sought in each scenario. (PDF contains 33 pages.)
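A minimal sketch of simulated annealing for this kind of site-selection problem, assuming a synthetic grid of sampling units, a single 20 percent sediment-representation target, and an ad hoc penalty weight in place of the SBNMS data and dispersion constraints.

```python
# Simulated-annealing sketch for reserve selection: choose few sampling
# units such that each sediment class reaches a 20% representation target.
# The grid, classes, and penalty weighting are synthetic stand-ins.
import math, random

random.seed(1)
N_UNITS, CLASSES, TARGET = 400, ["sand", "gravel", "mud", "rock"], 0.20
sediment = [random.choice(CLASSES) for _ in range(N_UNITS)]
class_total = {c: sediment.count(c) for c in CLASSES}

def cost(selected):
    shortfall = 0.0
    for c in CLASSES:
        have = sum(1 for i in selected if sediment[i] == c)
        shortfall += max(0.0, TARGET * class_total[c] - have)
    return len(selected) + 100.0 * shortfall     # area plus heavy penalty for unmet targets

selected = set(random.sample(range(N_UNITS), 50))
current = cost(selected)
T = 10.0
for step in range(20000):
    i = random.randrange(N_UNITS)
    candidate = set(selected)
    candidate.symmetric_difference_update({i})   # flip unit i in or out
    c = cost(candidate)
    if c < current or random.random() < math.exp((current - c) / T):
        selected, current = candidate, c
    T *= 0.9995                                  # geometric cooling

print(f"{len(selected)} units selected, cost {current:.1f}")
```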
Abstract:
Since the early years of the 21st century, and in particular since 2007, the U.S. has been awakening rapidly to the fact that climate change is underway and that, even if stringent efforts are undertaken to mitigate greenhouse gas emissions, adaptation to the unavoidable impacts from the existing commitment to climate change is still needed and must begin now. This report provides an historical overview of the public, political, and scientific concern with adaptation in the United States. It begins by briefly distinguishing ongoing, historical adaptation to environmental circumstances from deliberate adaptation to human-induced climate change. It then describes the shift from the early concerns with climate change and adaptation to the more recent awakening to the need for a comprehensive approach to managing the risks from climate change. Ranging from the treatment of the topic in the news media to the drafting of bills in Congress, to state and local government activities with considerable engagement of NGOs, scientists and consultants, it is apparent that adaptation has finally, and explosively, emerged on the political agenda as a legitimate and needed subject for debate. At the same time, the current policy rush is not underlain by widespread public engagement and mobilization, nor does it rest on a solid research foundation. Funding for vulnerability and adaptation research, the establishment of adequate decision support institutions, and the building of the necessary capacity in science, the consulting world, and government agencies lag far behind the need. (PDF contains 42 pages)