898 results for Decision Theory
Abstract:
Climate change is a major threat to global biodiversity, and its impacts can act synergistically to heighten the severity of other threats. Most research on projecting species range shifts under climate change has not been translated into priority management strategies on the ground. We develop a prioritization framework to assess strategies for managing threats to biodiversity under climate change and apply it to the management of invasive animal species across one-sixth of the Australian continent, the Lake Eyre Basin. We collected information from key stakeholders and experts on the impacts of invasive animals on 148 of the region's most threatened species and on 11 potential strategies. Assisted by models of the current and projected distributions of threatened species, experts estimated the cost, feasibility, and potential benefits of each strategy for improving the persistence of threatened species with and without climate change. We find that the relative cost-effectiveness of invasive animal control strategies is robust to climate change, with the management of feral pigs being the highest priority for conserving threatened species overall. Complementary sets of strategies to protect as many threatened species as possible under limited budgets change when climate change is considered, with additional strategies required to avoid impending extinctions from the region. Overall, the ranking of strategies by cost-effectiveness was relatively unaffected by incorporating climate change into decision-making, even though the benefits of the strategies were lower. Future climate conditions and impacts on range shifts become most important to consider when designing comprehensive management plans for the control of invasive animals under limited budgets to maximize the number of threatened species that can be protected.
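As a rough illustration of the kind of ranking such a prioritization framework produces, the sketch below scores strategies by the common benefit × feasibility / cost formulation; the strategy names and numbers are invented placeholders, not values from the study.

```python
# Illustrative sketch of a cost-effectiveness ranking of threat-management
# strategies, assuming the common formulation CE = (benefit * feasibility) / cost.
# All strategy names and numbers below are hypothetical placeholders.

strategies = {
    # name: (expected benefit in species persisting, feasibility 0-1, cost in $M)
    "feral pig control": (24.0, 0.8, 12.0),
    "fox baiting":       (15.0, 0.9, 10.0),
    "cat exclosures":    (18.0, 0.5, 30.0),
}

def cost_effectiveness(benefit, feasibility, cost):
    """Expected benefit per dollar, discounted by feasibility of success."""
    return benefit * feasibility / cost

ranked = sorted(strategies.items(),
                key=lambda kv: cost_effectiveness(*kv[1]),
                reverse=True)

for name, (b, f, c) in ranked:
    print(f"{name}: CE = {cost_effectiveness(b, f, c):.3f} species per $M")
```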
Abstract:
This thesis evaluates the effectiveness of the prescribed design and distribution requirements of the Australian Government's home loan key facts sheets (KFS), which are aimed at helping borrowers compare loan costs. The findings show that although KFS effectively improve borrower decision-making, few borrowers were aware of their existence and function. KFS have also had limited market impact over the four-year window since their introduction, likely because lenders are not required to provide a KFS unless a borrower formally requests one. Recommendations include transferring the burden of disclosure to lenders in the first instance to address this information gap.
Abstract:
Long-running datasets from aerial surveys of kangaroos (Macropus giganteus, Macropus fuliginosus, Macropus robustus and Macropus rufus) across Queensland, New South Wales and South Australia have been analysed, seeking better predictors of rates of increase which would allow aerial surveys to be undertaken less frequently than annually. Early models of changes in kangaroo numbers in response to rainfall had shown great promise, but much variability. We used normalised difference vegetation index (NDVI) instead, reasoning that changes in pasture condition would provide a better predictor than rainfall. However, except at a fine scale, NDVI proved no better; although two linked periods of rainfall proved useful predictors of rates of increase, this was only in some areas for some species. The good correlations reported in earlier studies were a consequence of data dominated by large drought-induced adult mortality, whereas over a longer time frame and where changes between years are less dramatic, juvenile survival has the strongest influence on dynamics. Further, harvesting, density dependence and competition with domestic stock are additional and important influences and it is now clear that kangaroo movement has a greater influence on population dynamics than had been assumed. Accordingly, previous conclusions about kangaroo populations as simple systems driven by rainfall need to be reassessed. Examination of this large dataset has permitted descriptions of shifts in distribution of three species across eastern Australia, changes in dispersion in response to rainfall, and an evaluation of using harvest statistics as an index of density and harvest rate. These results have been combined into a risk assessment and decision theory framework to identify optimal monitoring strategies.
Abstract:
The aim of this study was to identify and describe the types of errors in clinical reasoning that contribute to poor diagnostic performance at different levels of medical training and experience. Three cohorts of subjects, second- and fourth- (final) year medical students and a group of general practitioners, completed a set of clinical reasoning problems. The responses of those whose scores fell below the 25th centile were analysed to establish the stage of the clinical reasoning process - identification of relevant information, interpretation or hypothesis generation - at which most errors occurred and whether this was dependent on problem difficulty and level of medical experience. Results indicate that hypothesis errors decrease as expertise increases but that identification and interpretation errors increase. This may be due to inappropriate use of pattern recognition or to failure of the knowledge base. Furthermore, although hypothesis errors increased in line with problem difficulty, identification and interpretation errors decreased. A possible explanation is that as problem difficulty increases, subjects at all levels of expertise are less able to differentiate between relevant and irrelevant clinical features and so give equal consideration to all information contained within a case. It is concluded that the development of clinical reasoning in medical students throughout the course of their pre-clinical and clinical education may be enhanced by both an analysis of the clinical reasoning process and a specific focus on each of the stages at which errors commonly occur.
Abstract:
We consider the problem of optimally scheduling a processor executing a multilayer protocol in an intelligent Network Interface Controller (NIC). In particular, we assume a typical LAN environment with a class 4 transport service, a connectionless network service, and a class 1 link-level protocol. We develop a queueing model for the problem. In the most general case this becomes a cyclic queueing network in which some queues have dedicated servers and the others share a common schedulable server. We use sample-path arguments and Markov decision theory to determine optimal service schedules. The optimal throughputs are compared with those obtained with simple policies. The optimal policy yields up to 25% improvement in some cases; in others, it does only slightly better than much simpler policies.
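To illustrate the flavor of the Markov-decision approach (not the paper's model), here is a minimal value-iteration sketch for a server choosing which of two queues to serve; the arrival and service probabilities, holding cost, and discount factor are assumptions for illustration.

```python
# A minimal value-iteration sketch: a single schedulable server choosing
# which of two queues to serve in each slot. All parameters are invented.

import itertools

N = 10            # buffer capacity per queue (assumed)
p = (0.3, 0.4)    # per-slot arrival probabilities (assumed)
mu = (0.9, 0.6)   # service-completion probabilities (assumed)
gamma = 0.95      # discount factor

states = list(itertools.product(range(N + 1), repeat=2))

def transitions(state, a):
    """Return (probability, next_state) pairs for serving queue a."""
    n = list(state)
    outcomes = [(1.0, tuple(n))]
    if n[a] > 0:  # service succeeds with probability mu[a]
        served = n[:]; served[a] -= 1
        outcomes = [(mu[a], tuple(served)), (1 - mu[a], tuple(n))]
    result = []
    for prob, s in outcomes:           # superpose independent arrivals
        for a1 in (0, 1):
            for a2 in (0, 1):
                q = prob * (p[0] if a1 else 1 - p[0]) * (p[1] if a2 else 1 - p[1])
                result.append((q, (min(s[0] + a1, N), min(s[1] + a2, N))))
    return result

V = {s: 0.0 for s in states}
for _ in range(200):                   # value iteration with holding cost -(n1+n2)
    V = {s: -sum(s) + gamma * max(
            sum(q * V[ns] for q, ns in transitions(s, a)) for a in (0, 1))
         for s in states}

policy = {s: max((0, 1), key=lambda a: sum(q * V[ns] for q, ns in transitions(s, a)))
          for s in states}
print("serve queue", policy[(3, 3)], "in state (3, 3)")
```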
Abstract:
We consider a visual search problem studied by Sripati and Olson, where the objective is to identify an oddball image embedded among multiple distractor images as quickly as possible. We model this visual search task as an active sequential hypothesis testing (ASHT) problem. In 1959, Chernoff proposed a policy whose expected delay to decision is asymptotically optimal as the error probability vanishes. We first prove a stronger property on the moments of the delay until a decision, under the same asymptotics. Applying the result to the visual search problem, we then propose a "neuronal metric" on the measured neuronal responses that captures the discriminability between images. From an empirical study we obtain a remarkable correlation (r = 0.90) between the proposed neuronal metric and the speed of discrimination between images. Although this correlation is lower than that of the L1 metric used by Sripati and Olson, our metric has the advantage of being firmly grounded in formal decision theory.
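A toy sketch of the sequential-testing idea is given below. It is not Chernoff's policy itself: locations are probed uniformly at random rather than chosen adaptively, and the Gaussian response distributions are invented for illustration.

```python
# Toy sequential hypothesis test for the visual search task: hypothesis k
# says "the oddball is at location k". Probing is uniform at random (a
# simplification of Chernoff's adaptive action selection); responses are
# oddball ~ N(1,1), distractor ~ N(0,1), both assumed for illustration.

import math, random

K = 4                      # number of image locations (assumed)
truth = 2                  # true oddball location
delta = 1e-3               # target error probability

def loglik(x, oddball):
    """Log density of an observation x under N(mean, 1)."""
    mean = 1.0 if oddball else 0.0
    return -0.5 * (x - mean) ** 2

logpost = [0.0] * K        # uniform prior, in log space
steps = 0
while True:
    steps += 1
    loc = random.randrange(K)                       # probe a location
    x = random.gauss(1.0 if loc == truth else 0.0, 1.0)
    for k in range(K):                              # Bayes update
        logpost[k] += loglik(x, oddball=(k == loc))
    m = max(logpost)
    post = [math.exp(lp - m) for lp in logpost]
    Z = sum(post)
    post = [pk / Z for pk in post]
    if max(post) > 1 - delta:                       # stop when confident
        break

print(f"declared location {post.index(max(post))} after {steps} samples")
```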
Abstract:
Structural design is a decision-making process in which a wide spectrum of requirements, expectations, and concerns needs to be properly addressed. Engineering design criteria are considered together with societal and client preferences, and most of these design objectives are affected by the uncertainties surrounding a design. Therefore, realistic design frameworks must be able to handle multiple performance objectives and incorporate uncertainties from numerous sources into the process.
In this study, a multi-criteria based design framework for structural design under seismic risk is explored. The emphasis is on reliability-based performance objectives and their interaction with economic objectives. The framework has analysis, evaluation, and revision stages. In the probabilistic response analysis, seismic loading uncertainties as well as modeling uncertainties are incorporated. For evaluation, two approaches are suggested: one based on preference aggregation and the other based on socio-economics. Both implementations of the general framework are illustrated with simple but informative design examples to explore the basic features of the framework.
The first approach uses concepts similar to those found in multi-criteria decision theory, and directly combines reliability-based objectives with others. This approach is implemented in a single-stage design procedure. In the socio-economics based approach, a two-stage design procedure is recommended in which societal preferences are treated through reliability-based engineering performance measures, but emphasis is also given to economic objectives because these are especially important to the structural designer's client. A rational net asset value formulation including losses from uncertain future earthquakes is used to assess the economic performance of a design. A recently developed assembly-based vulnerability analysis is incorporated into the loss estimation.
The presented performance-based design framework allows investigation of various design issues and their impact on a structural design. It is flexible, readily allowing the incorporation of new methods and concepts in seismic hazard specification, structural analysis, and loss estimation.
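As a schematic illustration of the economic side of such a framework, the sketch below computes a net present value that nets expected annual seismic losses against income; all hazard probabilities, losses, and financial figures are invented placeholders, not values from the thesis.

```python
# Schematic net-asset-value evaluation of a design including expected losses
# from uncertain future earthquakes. All numbers are hypothetical.

annual_hazard = [
    # (annual exceedance probability, loss if that event occurs, $M)
    (0.10, 0.5),    # frequent, minor damage
    (0.02, 5.0),    # rare, moderate damage
    (0.002, 40.0),  # very rare, severe damage
]

def expected_annual_loss(hazard):
    return sum(p * loss for p, loss in hazard)

def net_present_value(initial_cost, annual_income, hazard, rate, years):
    """NPV of the design: income minus expected seismic losses, discounted."""
    eal = expected_annual_loss(hazard)
    npv = -initial_cost
    for t in range(1, years + 1):
        npv += (annual_income - eal) / (1 + rate) ** t
    return npv

print(f"EAL = {expected_annual_loss(annual_hazard):.3f} $M/yr")
print(f"NPV = {net_present_value(20.0, 3.0, annual_hazard, 0.05, 50):.1f} $M")
```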
Abstract:
This research proposes a model for quantifying the port competitiveness that a commercial port can achieve for container traffic. From this objective, a tailored strategy for a specific port could be derived, oriented fundamentally toward the search for markets. For this latter aspect, the research focuses on the Port of Bilbao and direct regular container line services on the intercontinental routes connecting Europe with America and the Caribbean. The principal aim is the development of a theoretical model within the field of business economics as the immediate and basic antecedent of strategic management. Several factors in the port selection process that may contribute to choice preferences are considered, relating both to the port's own characteristics and to its hinterland, for the various user groups. Studies of port competitiveness models and methods in the specialized literature of recent years are also reviewed. The methodology combines different methods for achieving each of the objectives. The research draws on Importance-Performance analysis (IPA), which evaluates both customer satisfaction and service quality. IPA is central to obtaining a competitiveness index for a container port. This index could be computed using the postulates and rules of mathematics, specifically of decision theory and operations research.
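A schematic version of an Importance-Performance calculation of the kind such an index could rest on is sketched below; the factor names, weights, and scores are illustrative placeholders, not data from the thesis.

```python
# Schematic IPA: each port-selection factor gets an importance weight (from
# users) and a performance score (for the port); the competitiveness index is
# the importance-weighted mean performance. All values are hypothetical.

factors = {
    # factor: (importance 1-5, performance of the port 1-5)
    "hinterland connections":  (4.5, 3.8),
    "liner service frequency": (4.8, 3.2),
    "port tariffs":            (4.0, 4.1),
    "terminal productivity":   (4.3, 3.9),
}

total_importance = sum(i for i, _ in factors.values())
index = sum(i * p for i, p in factors.values()) / total_importance
print(f"competitiveness index: {index:.2f} / 5")

# The classic IPA grid: high-importance, low-performance factors fall in the
# "concentrate here" quadrant, i.e. where competitiveness is being lost.
mean_i = total_importance / len(factors)
mean_p = sum(p for _, p in factors.values()) / len(factors)
for name, (i, p) in factors.items():
    if i >= mean_i and p < mean_p:
        print("concentrate here:", name)
```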
Abstract:
Over almost five years of work, the Municipal Quality Index - Green (Índice de Qualidade dos Municípios - Verde, IQM-Verde) was developed by Fundação CIDE. The work sought to portray the characteristics of forest fragmentation in the state of Rio de Janeiro. As an element to support environmental management of the territory, the project identified priority ecological corridors for connecting forest fragments. Its major conceptual contribution was to reorient the debate about forest fragmentation in the State of Rio de Janeiro. The IQM-Verde project exhaustively mapped places where stocks of arboreal vegetation were lost and gained, broken down by municipality, river basin, and conservation unit. Important questions were raised that still await better answers. One is to explain, from the standpoint of landscape ecology, which mechanisms facilitate or hinder the natural process of forest succession. The state of forest succession differs completely from one region of the state to another: in the Northwest there are clear signs of retraction and fragmentation of the remnants, while in the mountainous region of the southern part of the state there are clear signs of forest recovery and recomposition. New concepts of environmental management seek to minimize the effects of fragmentation and of the spatial isolation of species. Increasing connectivity through ecological corridors between conservation units, and even between the best-preserved fragments, is identified by many researchers as one of the most effective ways to maintain forest remnants over the long term, and even to promote the functional recovery of ecological units that are currently isolated. The current generation of researchers and public managers faces the problem of controlling the processes that trigger forest fragmentation. It is therefore urgent to understand all the consequences associated with forest fragmentation and, at the same time, to discover what inhibits this complex phenomenon, which has physical, natural, and social roots. The central objective of the thesis is, drawing on elements of the history of mentalities and of decision theory, to construct scenarios of anthropic pressure on the forest remnants and to propose a feasible program of economic, legal, and political intervention, called in this study "bolsa floresta" (forest allowance), capable of easing the current process of forest fragmentation.
Abstract:
In order to generate skilled and efficient actions, the motor system must find solutions to several problems inherent in sensorimotor control, including nonlinearity, nonstationarity, delays, redundancy, uncertainty, and noise. We review these problems and five computational mechanisms that the brain may use to limit their deleterious effects: optimal feedback control, impedance control, predictive control, Bayesian decision theory, and sensorimotor learning. Together, these computational mechanisms allow skilled and fluent sensorimotor behavior.
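As a minimal worked example of the Bayesian decision theory ingredient, the sketch below combines a Gaussian prior with a Gaussian sensory likelihood; under squared-error loss the optimal estimate is the precision-weighted posterior mean. The numbers are illustrative.

```python
# Bayesian integration of a prior and a noisy observation: with Gaussian
# prior and likelihood, the posterior is Gaussian and its mean is a
# precision-weighted average. Values below are illustrative.

prior_mean, prior_var = 0.0, 4.0   # learned statistics of the task (assumed)
obs, obs_var = 2.0, 1.0            # noisy sensory measurement (assumed)

# Precision (inverse variance) weighting of observation versus prior.
w_obs = (1 / obs_var) / (1 / obs_var + 1 / prior_var)
posterior_mean = w_obs * obs + (1 - w_obs) * prior_mean
posterior_var = 1 / (1 / obs_var + 1 / prior_var)

# Under squared-error loss the optimal estimate is the posterior mean.
print(f"posterior mean {posterior_mean:.2f}, variance {posterior_var:.2f}")
# -> posterior mean 1.60, variance 0.80
```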
Abstract:
Humans have been shown to adapt to the temporal statistics of timing tasks so as to optimize the accuracy of their responses, in agreement with the predictions of Bayesian integration. This suggests that they build an internal representation of both the experimentally imposed distribution of time intervals (the prior) and of the error (the loss function). The responses of a Bayesian ideal observer depend crucially on these internal representations, which have only been previously studied for simple distributions. To study the nature of these representations we asked subjects to reproduce time intervals drawn from underlying temporal distributions of varying complexity, from uniform to highly skewed or bimodal, while also varying the error mapping that determined the performance feedback. Interval reproduction times were affected by both the distribution and feedback, in good agreement with a performance-optimizing Bayesian observer and actor model. Bayesian model comparison highlighted that subjects were integrating the provided feedback and represented the experimental distribution with a smoothed approximation. A nonparametric reconstruction of the subjective priors from the data shows that they are generally in agreement with the true distributions up to third-order moments, but with systematically heavier tails. In particular, higher-order statistical features (kurtosis, multimodality) seem much harder to acquire. Our findings suggest that humans have only minor constraints on learning lower-order statistical properties of unimodal (including peaked and skewed) distributions of time intervals under the guidance of corrective feedback, and that their behavior is well explained by Bayesian decision theory.
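A minimal sketch of such a Bayesian observer-actor model is given below: the subject measures a sample interval with Gaussian noise, forms a posterior against the experimental prior, and reproduces the posterior mean (the squared-error-optimal estimate). The prior shape and noise level are assumptions, not the paper's fitted values.

```python
# Bayesian observer-actor sketch for interval reproduction: noisy measurement,
# posterior over a discretized prior, Bayes least-squares (posterior-mean)
# response. Prior shape and noise level are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)
ts = np.linspace(0.4, 1.2, 200)                  # candidate intervals (s)
prior = np.exp(-0.5 * ((ts - 0.8) / 0.15) ** 2)  # a peaked prior (assumed)
prior /= prior.sum()
sigma = 0.1                                      # measurement noise (assumed)

def reproduce(t_sample):
    """Mean response to a sample interval, averaged over measurement noise."""
    tm = t_sample + sigma * rng.standard_normal(5000)   # noisy measurements
    # Posterior over ts for each measurement, then the posterior-mean estimate.
    lik = np.exp(-0.5 * ((ts[None, :] - tm[:, None]) / sigma) ** 2)
    post = lik * prior[None, :]
    post /= post.sum(axis=1, keepdims=True)
    return (post * ts[None, :]).sum(axis=1).mean()

for t in (0.5, 0.8, 1.1):
    print(f"sample {t:.2f} s -> mean reproduction {reproduce(t):.3f} s")
# Responses are biased toward the prior mean (central tendency), as observed.
```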
Abstract:
Through extensive study and design, a technical plan for establishing the exploration database center was made, combining imported and self-developed techniques. Through research and repeated experiments, a modern database center has been set up, with high-performance hardware and networking, a well-configured system, complete data storage and management, and fast, direct data support. Based on study of the theory, methods, and models of decision-making, an exploration decision assistant schema was designed, with one decision plan, the well location decision support system, evaluated and put into action.
1. Establishment of the Shengli exploration database center. Research was conducted on the hardware configuration of the database center, including its workstations and all connected hardware and systems. The hardware comprises workstations, microcomputer workstations, disk arrays, and equipment used for seismic processing and interpretation. Research on data storage and management covered the contents to be managed, data flow, data standards, data QC, backup and restore policy, and optimization of the database system. A reasonable data-management regulation and workflow were established, creating a scientific exploration data management system. Data loading followed a schedule; ultimately more than 200 seismic survey projects were loaded, amounting to 25 TB.
2. Exploration work support system and its application. Seismic data processing support has the following features: automatic extraction of seismic attributes, GIS navigation, data ordering, extraction of data cubes of any size, pseudo huge-capacity disk array, and a standard output exchange format. Prestack data can be accessed by the processing system directly, or transferred to other processing systems through the standard exchange format. Support for seismic interpretation includes automatic scanning and storage of interpretation results and internal data quality control; the interpretation system is connected directly to the database center for real-time access to seismic, formation, and well data. Comprehensive geological study is supported through the intranet, with the ability to query or display data graphically on the navigation system under geological constraints. The production management support system is mainly used to collect, analyze, and display production data, with its core technology in controlled data collection and the creation of multiple standard forms.
3. Exploration decision support system design. By classifying the workflow and data flow of all exploration stages and studying decision theory and methods, the target of each decision step, the decision model, and the requirements, three conceptual models were formed for the Shengli exploration decision support system: the exploration distribution support system, the well location support system, and the production management support system. The well location decision support system has passed evaluation and been put into action.
4. Technical advances. Hardware and software are matched for high performance. By combining a parallel computer system, database server, huge-capacity ATL, disk arrays, network, and firewall, the first exploration database center in China was created, with a reasonable configuration, high performance, and the ability to manage the complete exploration data sets. Technology for managing huge volumes of exploration data was developed, with exploration data standards and management regulations made to guarantee data quality, safety, and security. A multifunction query and support system provides comprehensive exploration information support, including support for geological study, seismic processing and interpretation, and production management; many new database and computer technologies are used to provide real-time information support for exploration work. Finally, the Shengli exploration decision support system was designed.
5. Application and benefit. Data storage has reached 25 TB, with thousands of users in the Shengli oil field accessing data and improving work efficiency several times over. The technology has also been applied by many other units of SINOPEC. Providing data to the project "Exploration Achievements and Evaluation of Favorable Targets in the Hekou Area" shortened the data preparation period from 30 days to 2 days and enriched data abundance by 15 percent, with full information support from the database center. Providing previously processed results for the project "Prestack Depth Migration in the Guxi Fracture Zone" reduced repeated processing, shortened the work period by one month, improved processing precision and quality, and saved about 30 million yuan in data processing investment. Automatically providing a project database for the project "Geological and Seismic Study of the Southern Slope Zone of the Dongying Sag" shortened data preparation time, giving researchers more time for research and thus improving interpretation precision and quality.
Abstract:
3D wave-equation prestack depth migration is an effective tool for obtaining accurate images of complex geological structures, and it is part of 3D seismic data processing. 3D seismic data processing is high-dimensional signal processing, which poses several difficult problems: how to handle high-dimensional operators, how to improve focusing, and how to construct the deconvolution operator. The realization of 3D wave-equation prestack depth migration not only achieved the leap from poststack to prestack, but also provided an important means of attacking these problems. In this thesis I address these problems around 3D wave-equation prestack depth migration, both to serve its realization and to improve the migration result. The thesis develops five parts, summarized as follows. In the first part, I project the 3D data volume to lower dimensions using big-matrix transposition and trace rearrangement, realizing linear processing of the high-dimensional signal: I present the mathematical expression of 3D seismic data and its physical meaning, the basic idea of big-matrix transposition with five transposition models as examples, and the basic idea and rules for trace rearrangement and parallel computation, with an example. On conventional DMO focusing, I recall the history of DMO processing, give its fundamentals, and derive the DMO equation and its impulse response. I also prove the kinematic equivalence between DMO and prestack time migration, and derive the relationship between wave-equation-based DMO and prestack time migration, closing with an example DMO processing flow and synthetic data from theoretical models. On wave-equation prestack depth migration, I recall the history of migration from time to depth, from poststack to prestack, and from 2D to 3D; survey the main migration methods, pointing out their merits and shortcomings; and obtain common-image-point gathers using the decomposed migration program code. On residual moveout, I describe the Viterbi algorithm, based on Markov processes and compound decision theory, and how it solves the shortest-path problem; on this basis I implement residual moveout correction after 3D wave-equation prestack depth migration, with an example on real 3D seismic data. On the migration Green function, I give the concept of the migration Green function and the 2D Green function migration equation in the far-field approximation; prove the equivalence of wave-equation depth extrapolation algorithms; derive the Green function migration equation; and present the response and migration result of the Green function for a point source, analyzing the effect of migration aperture on the prestack migration result. This research helps clarify the effect of migration aperture on migration results, and supports the study of Green function deconvolution to improve the focusing of migration.
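A generic sketch of the Viterbi-style dynamic program used for residual-moveout picking is given below: it finds the path through a depth-by-shift score panel that maximizes total score subject to a smoothness constraint. The random panel stands in for the semblance or coherence values that would be measured on common-image gathers.

```python
# Viterbi-style dynamic programming path picker: choose one shift per depth
# level, maximizing total panel score while moving at most one shift step per
# level. The random panel is a stand-in for measured semblance values.

import numpy as np

rng = np.random.default_rng(0)
panel = rng.random((50, 21))          # rows: depth levels, cols: trial shifts

ndepth, nshift = panel.shape
score = panel[0].copy()
back = np.zeros((ndepth, nshift), dtype=int)

for z in range(1, ndepth):            # forward pass
    new = np.empty(nshift)
    for s in range(nshift):
        lo, hi = max(0, s - 1), min(nshift, s + 2)   # smoothness: step <= 1
        prev = lo + int(np.argmax(score[lo:hi]))
        back[z, s] = prev
        new[s] = panel[z, s] + score[prev]
    score = new

path = [int(np.argmax(score))]        # backtrack the best path
for z in range(ndepth - 1, 0, -1):
    path.append(int(back[z, path[-1]]))
path.reverse()
print("picked shifts per depth:", path[:10], "...")
```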
Abstract:
The problem of terrorism can be analyzed by means of a wide array of research theories and models. The question, however, is which of these may be regarded as especially useful for analyzing different aspects of terrorism, such as its causes, characteristics, and effects. Among the concepts and theories that more or less fulfill this requirement are: chaos theory, decision theory, spatial competition theory, exchange theory, black box theory, theory of disaster, the system model, the billiard-ball model, the core model, the asymmetrical model, the network model, and the concept of hybridity.
Abstract:
Making a decision is often a matter of listing and comparing positive and negative arguments. In such cases, the evaluation scale for decisions should be considered bipolar, that is, negative and positive values should be explicitly distinguished. That is what is done, for example, in Cumulative Prospect Theory. However, contrary to the latter framework that presupposes genuine numerical assessments, human agents often decide on the basis of an ordinal ranking of the pros and the cons, and by focusing on the most salient arguments. In other terms, the decision process is qualitative as well as bipolar. In this article, based on a bipolar extension of possibility theory, we define and axiomatically characterize several decision rules tailored for the joint handling of positive and negative arguments in an ordinal setting. The simplest rules can be viewed as extensions of the maximin and maximax criteria to the bipolar case, and consequently suffer from poor decisive power. More decisive rules that refine the former are also proposed. These refinements agree both with principles of efficiency and with the spirit of order-of-magnitude reasoning, that prevails in qualitative decision theory. The most refined decision rule uses leximin rankings of the pros and the cons, and the ideas of counting arguments of equal strength and cancelling pros by cons. It is shown to come down to a special case of Cumulative Prospect Theory, and to subsume the “Take the Best” heuristic studied by cognitive psychologists.
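A toy sketch of the flavor of the most refined rule is given below: arguments carry ordinal strength levels, pros are cancelled by cons of equal strength, and options are compared lexicographically from the strongest level down. It illustrates the idea only, not the paper's axiomatized definition.

```python
# Toy bipolar qualitative comparison: cancel pros by cons of equal ordinal
# strength, then compare net counts lexicographically, strongest level first.

LEVELS = (3, 2, 1)   # ordinal strengths, strongest first

def net_counts(pros, cons):
    """At each level, count pros minus cons (cancellation of equal strength)."""
    return [sum(1 for s in pros if s == lv) - sum(1 for s in cons if s == lv)
            for lv in LEVELS]

def prefer(option_a, option_b):
    """Return the lexicographically better option, or None on a tie."""
    a, b = net_counts(*option_a), net_counts(*option_b)
    for na, nb in zip(a, b):
        if na != nb:
            return "A" if na > nb else "B"
    return None

# Option A: one strong pro, two medium pros, one strong con.
# Option B: one strong pro, one strong con, one medium con.
A = ([3, 2, 2], [3])
B = ([3], [3, 2])
print("preferred:", prefer(A, B))
# Strong arguments cancel on both sides; A wins on its medium-strength pros.
```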