69 results for multicriteria decision tools
Abstract:
As an introduction to a series of articles focused on the exploration of particular tools and/or methods for bringing together digital technology and historical research, the aim of this paper is mainly to highlight and discuss to what extent those methodological approaches can contribute to improving the analytical and interpretative capabilities available to historians. At a moment when the digital world presents us with an ever-increasing variety of tools to perform extraction, analysis and visualization of large amounts of text, we thought it would be relevant to bring the digital closer to the vast historical academic community. Rather than repeating the idea of a digital revolution in historical research, recurrent in the literature since the 1980s, the aim was to show the validity and usefulness of digital tools and methods as another set of highly relevant instruments that historians should consider. To this end, several case studies were used, combining the exploration of specific themes of historical knowledge with the development or discussion of digital methodologies, in order to highlight some changes and challenges that, in our opinion, are already affecting historians' work, such as a greater focus on interdisciplinarity and collaborative work, and the need for the communication of historical knowledge to become more interactive.
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biology, Neuroscience
Abstract:
Forest managers, stakeholders and investors want to be able to evaluate economic, environmental and social benefits in order to improve the outcomes of their decisions and enhance sustainable forest management. This research developed a spatial decision support system that provides: (1) an approach to identify the most beneficial locations for agroforestry projects based on their biophysical properties and to evaluate their economic, social and environmental impact; (2) a tool to inform prospective investors and stakeholders of the potential and opportunities for integrated agroforestry management; (3) a simulation environment that enables evaluation via a dashboard, with the opportunity to perform interactive sensitivity analysis on key parameters of the project; (4) a 3D interactive geographic visualization of the economic, environmental and social outcomes, which facilitates understanding and eases planning. Although the tool and methodology presented are generic, a case study was performed in East Kalimantan, Indonesia. For the whole study area, the most suitable locations were simulated for three different plantation schemes: a timber monoculture, a specific recipe (cassava, banana and sugar palm), and different recipes per geographic unit. The results indicate that a mixed-cropping plantation scheme, with different recipes applied to the most suitable locations, returns higher economic, environmental and social benefits.
Abstract:
Geographic information systems give us the possibility to analyze, produce, and edit geographic information. However, these systems fall short in the analysis and support of complex spatial problems. Therefore, when a spatial problem, like land use management, requires a multi-criteria perspective, multi-criteria decision analysis is embedded into spatial decision support systems. The analytic hierarchy process is one of many multi-criteria decision analysis methods that can be used to support these complex problems. Using its capabilities, we set out to develop a spatial decision support system to support land use management. Land use management can encompass a broad spectrum of spatial decision problems. The developed decision support system had to accept various formats and types of data as input, in raster or vector format, with vector data of polygon, line or point type. The system was designed to perform its analysis for the study area, the Zambezi River Valley in Mozambique. The possible solutions for the emerging problems had to cover the entire region. This required the system to process large sets of data and constantly adjust to new problems' needs. The developed decision support system is able to process thousands of alternatives using the analytic hierarchy process and to produce an output suitability map for the problems faced.
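The abstract does not give the computational details, so the following is only a minimal, illustrative sketch (not the thesis's actual system) of the two AHP steps it implies: deriving criterion weights from a pairwise comparison matrix via the principal eigenvector, then combining normalized criterion layers into a suitability map by weighted linear overlay. The criteria, the pairwise judgments and the tiny grids are hypothetical; a consistency ratio below about 0.1 is conventionally taken as acceptable.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive criterion weights from an AHP pairwise comparison matrix
    using the principal eigenvector, and report the consistency ratio."""
    pairwise = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    n = pairwise.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty's random index (fallback is a rough placeholder)
    return weights, ci / ri

# Hypothetical 3-criterion comparison (e.g. slope vs. soil vs. distance to roads).
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
weights, cr = ahp_weights(pairwise)

# Normalized criterion "rasters" (values in 0..1); tiny 2x2 grids stand in for real layers.
criteria = np.stack([
    np.array([[0.9, 0.4], [0.7, 0.2]]),   # slope suitability
    np.array([[0.6, 0.8], [0.5, 0.3]]),   # soil suitability
    np.array([[0.2, 0.9], [0.6, 0.4]]),   # road-access suitability
])
suitability = np.tensordot(weights, criteria, axes=1)  # weighted linear overlay

print("weights:", weights.round(3), "CR:", round(cr, 3))
print("suitability map:\n", suitability.round(3))
```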
Abstract:
Spatial analysis and social network analysis typically take into consideration social processes in specific contexts of geographical or network space. Research in political science increasingly strives to model heterogeneity and spatial dependence. The primary objective of the current study was to better understand and geographically model the relationship between "non-political" events, streaming data from social networks, and the political climate. Geographic information systems (GIS) are useful tools in the organization and analysis of streaming data from social networks. In this study, geographical and statistical analyses were combined in order to define the temporal and spatial nature of the data emanating from the popular social network Twitter during the 2014 FIFA World Cup. The study spans the entire globe because Twitter's geotagging function, the fundamental data that makes this study possible, is not limited to a geographic area. By examining the public reactions to an inherently non-political event, this study serves to illuminate broader questions about social behavior and spatial dependence. From a practical perspective, the analyses demonstrate how the discussion of political topics fluctuates according to football matches. Tableau and Rapidminer, in addition to a set of basic statistical methods, were applied to find patterns in social behavior in space and time in different geographic regions. Some insight was gained into the relationship between an ostensibly non-political event, the World Cup, and public opinion transmitted by social media. The methodology could serve as a prototype for future studies and guide policy makers in governmental and non-governmental organizations in gauging public opinion in certain geographic locations.
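The thesis worked with Tableau and Rapidminer; purely to illustrate the kind of spatio-temporal binning such an analysis involves, the sketch below flags politically themed geotagged tweets and counts them per coarse latitude/longitude cell and per hour with pandas. The column names, keyword list and grid resolution are hypothetical stand-ins, not the study's actual data model.

```python
import pandas as pd

# Hypothetical geotagged-tweet table: timestamp, latitude, longitude, text.
tweets = pd.DataFrame({
    "created_at": pd.to_datetime([
        "2014-06-12 20:05", "2014-06-12 20:40", "2014-06-12 23:10"]),
    "lat":  [-23.55, 38.72, 40.71],
    "lon":  [-46.63, -9.14, -74.01],
    "text": ["protesto e copa", "great match!", "election talk"],
})

POLITICAL_TERMS = ("protest", "protesto", "election", "governo")  # illustrative keywords

# Flag politically themed tweets and bin them by hour and by a crude
# lat/lon grid cell (a stand-in for proper GIS regions).
tweets["political"] = tweets["text"].str.lower().str.contains("|".join(POLITICAL_TERMS))
tweets["hour"] = tweets["created_at"].dt.floor("h")
tweets["cell"] = list(zip((tweets["lat"] // 10).astype(int),
                          (tweets["lon"] // 10).astype(int)))

volume = (tweets.groupby(["cell", "hour"])["political"]
                .agg(total="size", political="sum")
                .reset_index())
print(volume)
```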
Abstract:
Retail services are a main contributor to municipal budgets and an activity that affects perceived quality of life, especially for those with mobility difficulties (e.g. the elderly and low-income citizens). However, there is evidence of a decline in some of the services market towns provide to their citizens, a decline that has been reported all over the western world, from North America to Australia. The aim of this research was to understand retail decline and shed light on some ways of addressing it, using a case study of Thornbury, a small town in the Southwest of England. Data were collected through two participatory approaches: photo-surveys and multicriteria mapping. The data were interpreted by using participants as analysts, but also by using systems thinking (systems diagramming and social trap theory) for theory building. This research moves away from mainstream economic and town-planning perspectives by making use of methods and concepts from anthropology and visual sociology (photo-surveys) and from decision-making and ecological economics (multicriteria mapping and social trap theory). In sum, this research has experimented with different methods, outside their usual context, to analyse retail decline in a small town. It developed a conceptual model of retail decline and identified the existence of conflicting goals and interests, their implications for retail decline, and the causes behind them. Most of the potential causes have had little attention in the literature. This research also identified that some of the measures commonly used for dealing with retail decline may themselves be contributing to its causes. Additionally, it reviewed some of the measures that can be used to deal with retail decline and the implications for policy-making, and reflected on the use of the data collection and analysis methods in the context of small to medium-sized towns.
Abstract:
Based on the report for the unit “Foresight Methods Analysis” of the PhD programme on Technology Assessment at the Universidade Nova de Lisboa, under the supervision of Prof. Dr. António B. Moniz
Abstract:
ABSTRACT: This work sought to contribute to the characterization of the pathophysiology of the coronary microcirculation in different forms of disease with the aid of transthoracic echocardiography. With transthoracic Doppler echocardiography, the coronary reserve of the left anterior descending artery was studied, and with myocardial contrast echocardiography, myocardial perfusion parameters such as coronary microcirculation velocity, myocardial blood volume and myocardial flow reserve were analyzed. These techniques were used in different pathophysiological settings, with particular interest in left ventricular hypertrophy of different etiologies, such as arterial hypertension, aortic stenosis and hypertrophic cardiomyopathy. We also studied coronary microcirculation changes in type 2 diabetes mellitus and in atherosclerotic coronary artery disease. With the same myocardial contrast echocardiography technique, myocardial perfusion was analyzed in an experimental animal model subjected to an atherogenic diet. Beyond the specific conclusions of each of the studies performed, the general conclusions to note are the easy applicability and feasibility of these techniques in the clinical setting, their reproducibility and their precision. When compared with reference techniques, they showed results with statistically significant correlation. In all patients and in the control groups it was possible to demonstrate and quantify the transmural perfusion gradient at rest and during vasodilator stress, highlighting the importance of subendocardial perfusion for left ventricular function. The study of the coronary microcirculation in the group of patients with left ventricular hypertrophy revealed that, in the group with arterial hypertension, coronary microcirculation dysfunction exists even before an increase in left ventricular mass is observed, and that this dysfunction differs according to ventricular geometry. In patients with aortic stenosis it was shown that, in addition to the coronary microcirculation dysfunction explained by hypertrophy, there is another, extrinsic component which, once corrected by valve replacement surgery, led to a partial normalization of coronary reserve values. In hypertrophic cardiomyopathy, a marked heterogeneity of transmural perfusion was observed, and the absence of myocardial perfusion in the subendocardial region during vasodilator stress was documented in myocardial contrast echocardiography images after parametric analysis, with coronary reserve values reduced in early stages of the disease. It was shown that coronary reserve in type 2 diabetes was significantly reduced at more advanced stages. We also described, in patients with type 2 diabetes and without angiographic coronary disease, the existence of coronary microcirculation dysfunction. During vasodilator stress, we observed and documented in this group of patients transient perfusion defects or a reduction in coronary microcirculation velocity. In the group of patients with coronary artery disease, we confirmed the value of assessing coronary reserve after percutaneous intervention in defining post-infarction prognosis in terms of left ventricular functional recovery. In patients with left bundle branch block, in whom risk stratification is difficult, it was possible to calculate the coronary reserve value and stratify the risk of coronary disease.
In an experimental animal model we demonstrated the feasibility of the myocardial contrast echocardiography technique and verified that, under those experimental conditions, an atherogenic dietary load severely compromised coronary reserve after 6 weeks. These results were partially reversible when a statin was added to the diet. Because they are non-invasive, easily accessible, repeatable and harmless, these techniques promise to be of great utility in the characterization of patients with coronary microcirculation dysfunction, in the different areas of diagnosis, therapy and prevention. The possibility of adapting the technique to experimental animal models also seems likely to be of great use in research.----------------ABSTRACT: This work is intended to be a contribution to the study of the coronary microcirculation, applying new echocardiographic techniques such as transthoracic Doppler echocardiography of the coronary arteries and myocardial contrast echocardiography. Coronary flow reserve may be assessed by transthoracic Doppler echocardiography, and important functional microcirculation parameters such as microcirculation flow velocity, myocardial blood volume and myocardial flow reserve may be evaluated through myocardial contrast echocardiography. Microcirculation was analysed in different pathophysiological settings. We addressed situations with increased left ventricular mass such as systemic arterial hypertension, aortic stenosis and hypertrophic cardiomyopathy. Coronary microcirculation was also studied in type 2 diabetes and in different clinical forms of atherosclerotic coronary artery disease. Specific and detailed conclusions were drawn from each experimental work. Overall, it was concluded that these two techniques are important tools to easily assess, at the bedside, specific pathophysiological information about the coronary microcirculation that would be difficult to obtain through other techniques. When compared with gold-standard techniques, similar sensitivity and specificity were found. Because of their better temporal and spatial resolution, it was possible to analyse the importance of transmural perfusion gradients, both at baseline and during vasodilatation, and their relation to ischemia and to mechanical wall kinetics such as wall thickening and motion. Coronary microcirculation dysfunction was found in early stages of systemic arterial hypertension and was related to different left ventricular geometric patterns. Different etiopathogenic explanations for coronary microcirculation dysfunction in aortic stenosis were analysed and compared after aortic valve replacement. A pattern of transmural myocardial perfusion heterogeneity was observed in hypertrophic cardiomyopathy, which was aggravated during adenosine challenge. Coronary microcirculation dysfunction was diagnosed in type 2 diabetes, both with coronary artery disease and with normal angiographic coronary arteries. Dynamic, transitory subendocardial perfusion defects under adenosine vasodilatation were visualized in these patients. In patients with left bundle branch block, transthoracic Doppler echocardiography was able to suggest a coronary reserve cut-off value for risk stratification. It was also possible with this technique to calculate coronary flow reserve and predict restenosis after PTCA. Again, in an experimental animal model, applying the myocardial contrast echocardiography technique it was possible to study the consequences of an atherogenic diet, and of statin action, on coronary microcirculation function.
Because these techniques are easily performed at the bedside, are harmless and use no ionizing radiation, and because of their repeatability, reproducibility and accuracy, they are promising tools to assess the coronary microcirculation. Both in clinical and research settings, these techniques will probably have a role in diagnosis, prevention and therapeutic decision-making.
Abstract:
Field lab: Consumer insights
Abstract:
This work project has the objective of exploring the importance of making good decisions on supplier selection, so that the purchasing department can contribute to the success of a company. To that end, a short literature review is presented of the latest insights found relevant on the subjects of purchasing, technology, outsourcing, supplier selection and decision-making techniques. For a better understanding of how to deal with a decision-making situation, a case study is also presented: Digital Printing Solutions (DPS) is a Portuguese company that provides complete and integrated printing solutions and has been planning to contract a software supplier. DPS has no formal supplier-selection model and has to choose between two suppliers. The case study was solved using the M-MACBETH software. I found that complex decision-making situations can be easily overcome by using the M-MACBETH decision model. Moreover, using a model, instead of a decision that follows no formal procedure, provides the decision maker with insights that can be useful when negotiating with the supplier.
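M-MACBETH itself builds interval value scales and weights from qualitative pairwise difference judgments checked for consistency; the sketch below is only a minimal additive value model, with hypothetical criteria, weights and scores for two candidate suppliers, to show the kind of aggregation such a decision model ultimately performs.

```python
# Minimal additive value model in the spirit of MACBETH/M-MACBETH (illustrative only:
# real MACBETH derives the value scales and weights from qualitative judgments,
# not from directly assigned scores as done here).

criteria = {              # hypothetical weights summing to 1
    "price": 0.35,
    "functionality": 0.30,
    "support": 0.20,
    "integration": 0.15,
}

# Hypothetical 0-100 value scores of the two software suppliers on each criterion.
suppliers = {
    "Supplier A": {"price": 70, "functionality": 85, "support": 60, "integration": 80},
    "Supplier B": {"price": 90, "functionality": 65, "support": 75, "integration": 70},
}

def overall_value(scores, weights):
    """Weighted additive aggregation of per-criterion value scores."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in suppliers.items():
    print(name, round(overall_value(scores, criteria), 1))
```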
Abstract:
The aim of this study is to assess institutionalized children's skills as consumers, and also to assess how their knowledge can be improved through an intervention. The sample was composed of two subgroups (38 institutionalized children and 36 non-institutionalized children). In order to assess the children's knowledge, a questionnaire and an interview were used. The intervention consisted of a 30-minute class. Results suggested that institutionalized children have lower levels of knowledge regarding consumption-related practices and lower accuracy at estimating prices than non-institutionalized children. However, results also showed that institutionalized children's attitudes towards advertising, and their decisions based on price/quantity evaluation or on the use of the same strategy in different situations, are not significantly different from those of non-institutionalized children. Regarding the intervention, it was possible to conclude that a single class is not the best method to improve children's knowledge: institutionalized children need a longer and more practical intervention.
Abstract:
Due to external constraints (imposed by the market and the legal system) and internal changes, nonprofit organizations have been converting to for-profit entities that combine commercial revenue with social value creation. To create an understanding of the conversion process and its challenges, the reasons, the decision-making process and the key success factors of a conversion are examined. To this end, a two-step research procedure is used, combining literature research and a multiple case study approach based on expert interviews with known companies. The outcome is a helpful guideline (including a decision matrix) for social entrepreneurs who might face a conversion.
Abstract:
In the current paper, the determinants of firms' international relocation decisions in twenty-six European countries during the period 2004-2014 are analyzed. We demonstrate, in light of three different but complementary theories, that neoclassical, behavioural and institutional "push" factors have an impact on a firm's decision-making process. The findings support that firm size, access to a global network, foreign capital, and negative internal growth in the workforce induce firm relocation. On the other hand, the degree of sunk assets has a negative effect on the probability of relocation. Delocalization decisions are also sector-dependent, with low-tech manufacturing firms that pay high salaries being more likely to relocate abroad.
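The abstract does not name the estimator used; a discrete-choice model such as a logit is one standard way to analyze relocation decisions of this kind. The sketch below, with simulated data and hypothetical variable names mirroring the determinants discussed above, shows how such a specification could be estimated with statsmodels; it is an illustration, not the paper's actual model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical firm-level covariates (stand-ins for the determinants discussed above).
firms = pd.DataFrame({
    "log_size":        rng.normal(4.0, 1.0, n),
    "global_network":  rng.integers(0, 2, n),
    "foreign_capital": rng.integers(0, 2, n),
    "employment_drop": rng.integers(0, 2, n),
    "sunk_assets":     rng.normal(0.5, 0.2, n),
})

# Simulated relocation outcome, only so the example runs end to end.
latent = (-3 + 0.4 * firms["log_size"] + 0.8 * firms["global_network"]
          + 0.6 * firms["foreign_capital"] + 0.7 * firms["employment_drop"]
          - 1.5 * firms["sunk_assets"])
firms["relocated"] = (rng.random(n) < 1 / (1 + np.exp(-latent))).astype(int)

# Logit of the relocation decision on the firm characteristics.
X = sm.add_constant(firms.drop(columns="relocated"))
model = sm.Logit(firms["relocated"], X).fit(disp=False)
print(model.summary())
```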
Abstract:
Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices in the face of needs to be fulfilled and a limited amount of resources to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and the predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they are able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity to account for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function). If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly, in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, that cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout. Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decides alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players he believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller party may also win, provided its relative size is not too small; more self-delusional voters in the minority party decrease this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006), and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
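A minimal numeric sketch of the Chapter 2 setup described above, assuming a standard linear inverse demand P = a - b(q1 + q2) and constant marginal cost c: each firm either pays a thinking cost k and best-responds, or produces a default quantity for free. The parameter values and the equilibrium-checking routine are illustrative, not taken from the thesis; for an intermediate thinking cost, the asymmetric think/default profiles emerge as the equilibria, as the abstract describes.

```python
# Illustrative linear symmetric Cournot duopoly with a "think or take the default" choice.

a, b, c = 100.0, 1.0, 10.0      # inverse demand P = a - b*(q1 + q2), marginal cost c

def profit(qi, qj):
    return (a - b * (qi + qj) - c) * qi

def best_response(qj):
    return max((a - c - b * qj) / (2 * b), 0.0)

def equilibria(q_default, k):
    """Return the strategy profiles (think/default for each firm) that are
    Nash equilibria of the 2x2 meta-game, given the default quantity q_default
    and the thinking cost k."""
    q_nash = (a - c) / (3 * b)                 # quantity when both firms think
    br_to_default = best_response(q_default)   # thinker's reply to a defaulting rival
    profiles = {}
    # Payoffs in each of the four profiles (thinking cost subtracted when thinking).
    profiles[("think", "think")] = (profit(q_nash, q_nash) - k,) * 2
    profiles[("think", "default")] = (profit(br_to_default, q_default) - k,
                                      profit(q_default, br_to_default))
    profiles[("default", "think")] = profiles[("think", "default")][::-1]
    profiles[("default", "default")] = (profit(q_default, q_default),) * 2

    other = {"think": "default", "default": "think"}

    def is_equilibrium(s1, s2):
        u1, u2 = profiles[(s1, s2)]
        return (u1 >= profiles[(other[s1], s2)][0] and
                u2 >= profiles[(s1, other[s2])][1])

    return [p for p in profiles if is_equilibrium(*p)]

# With an intermediate thinking cost, only the asymmetric profiles survive.
print(equilibria(q_default=40.0, k=120.0))   # [('think', 'default'), ('default', 'think')]
```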