961 results for Set covering theory
Abstract:
Double Degree. A Work Project presented as part of the requirements for the award of a Master's Degree in Management from the NOVA School of Business and Economics and a Master's Degree in Business Engineering from the Louvain School of Management.
Abstract:
Software as a service (SaaS) is a service model in which applications are accessed from various client devices over the internet. Several studies report possible factors driving the adoption of SaaS, but none has considered the perception of SaaS features together with the pressures existing in the organization's environment. We propose an integrated research model that combines process virtualization theory (PVT) and institutional theory (INT). PVT seeks to explain whether SaaS processes are suitable for migration into virtual environments via an information technology-based mechanism. INT seeks to explain the effects of the institutionalized environment on the structure and actions of the organization. The research makes three contributions. First, it addresses a gap in the SaaS adoption literature by studying the internal perception of the technical features of SaaS and the external coercive, normative, and mimetic pressures faced by an organization. Second, it empirically tests many of the propositions of PVT and INT in the SaaS context, thereby helping to determine how the theories operate in practice. Third, the integration of PVT and INT contributes to the information systems (IS) discipline, deepening the applicability and strengths of these theories.
Abstract:
In this work I propose an additional test to be implemented in EDP's residential electricity-use feedback trials, within the scope of the InovCity project. The proposed product to be tested is an interface between the smart meter and the television, delivered through a set-top box. I provide a theoretical framework on the importance of feedback, an analysis of results from past studies involving smart metering, and a detailed description of my proposal. The results of a self-developed questionnaire addressing the proposal and segmentation issues are also analyzed. Finally, general conclusions are drawn and potential future improvements and challenges are presented.
Abstract:
Economics is a social science which, therefore, focuses on people and on the decisions they make, be it in an individual context or in group situations. It studies human choices, in the face of needs to be fulfilled and a limited amount of resources with which to fulfill them. For a long time, there was a convergence between the normative and positive views of human behavior, in that the ideal and predicted decisions of agents in economic models were entangled in one single concept. That is, it was assumed that the best that could be done in each situation was exactly the choice that would prevail. Or, at least, that the facts that economics needed to explain could be understood in the light of models in which individual agents act as if they are able to make ideal decisions. However, in the last decades, the complexity of the environment in which economic decisions are made and the limits on the ability of agents to deal with it have been recognized and incorporated into models of decision making, in what came to be known as the bounded rationality paradigm. This was triggered by the incapacity of the unbounded rationality paradigm to explain observed phenomena and behavior. This thesis contributes to the literature in three different ways. Chapter 1 is a survey on bounded rationality, which gathers and organizes the contributions to the field since Simon (1955) first recognized the necessity to account for the limits on human rationality. The focus of the survey is on theoretical work rather than on the experimental literature, which presents evidence of actual behavior that differs from what classic rationality predicts. The general framework is as follows. Given a set of exogenous variables, the economic agent needs to choose an element from the choice set that is available to him, in order to optimize the expected value of an objective function (assuming his preferences are representable by such a function). If this problem is too complex for the agent to deal with, one or more of its elements is simplified. Each bounded rationality theory is categorized according to the most relevant element it simplifies. Chapter 2 proposes a novel theory of bounded rationality. Much in the same fashion as Conlisk (1980) and Gabaix (2014), we assume that thinking is costly in the sense that agents have to pay a cost for performing mental operations. In our model, if they choose not to think, such cost is avoided, but they are left with a single alternative, labeled the default choice. We exemplify the idea with a very simple model of consumer choice and identify the concept of isofin curves, i.e., sets of default choices which generate the same utility net of thinking cost. Then, we apply the idea to a linear symmetric Cournot duopoly, in which the default choice can be interpreted as the most natural quantity to be produced in the market. We find that, as the thinking cost increases, the number of firms thinking in equilibrium decreases. More interestingly, for intermediate levels of thinking cost, there exists an equilibrium in which one of the firms chooses the default quantity and the other best responds to it, generating asymmetric choices in a symmetric model. Our model is able to explain well-known regularities identified in the Cournot experimental literature, such as the adoption of different strategies by players (Huck et al., 1999), the intertemporal rigidity of choices (Bosch-Domènech & Vriend, 2003) and the dispersion of quantities in the context of difficult decision making (Bosch-Domènech & Vriend, 2003). Chapter 3 applies a model of bounded rationality in a game-theoretic setting to the well-known turnout paradox: in large elections, pivotal probabilities vanish very quickly and no one should vote, in sharp contrast with the observed high levels of turnout. Inspired by the concept of rhizomatic thinking, introduced by Bravo-Furtado & Côrte-Real (2009a), we assume that each person is self-delusional in the sense that, when making a decision, she believes that a fraction of the people who support the same party decides alike, even if no communication is established between them. This kind of belief simplifies the decision of the agent, as it reduces the number of players she believes to be playing against; it is thus a bounded rationality approach. Studying a two-party first-past-the-post election with a continuum of self-delusional agents, we show that the turnout rate is positive in all the possible equilibria, and that it can be as high as 100%. The game displays multiple equilibria, at least one of which entails a victory of the bigger party. The smaller one may also win, provided its relative size is not too small; more self-delusional voters in the minority party decrease this threshold size. Our model is able to explain some empirical facts, such as the possibility that a close election leads to low turnout (Geys, 2006), a lower margin of victory when turnout is higher (Geys, 2006) and high turnout rates favoring the minority (Bernhagen & Marsh, 1997).
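The abstract does not spell out the duopoly's functional forms. As an illustrative sketch of the Chapter 2 mechanism, the snippet below checks when a profile in which one firm plays the default quantity for free while the other pays the thinking cost and best responds is an equilibrium, assuming linear inverse demand P = a - b(q1 + q2), constant marginal cost c, thinking cost k, and default quantity q_d; all parameter values are hypothetical.

```python
# Minimal sketch of the costly-thinking Cournot idea under an assumed
# parametrization: linear inverse demand P = a - b*(q1 + q2), constant
# marginal cost c, thinking cost k, default quantity q_d. The thesis's
# exact specification is not given in the abstract.

def best_response(q_other, a, b, c):
    """Profit-maximizing quantity against a rival producing q_other."""
    return max(0.0, (a - c - b * q_other) / (2 * b))

def profit(q_own, q_other, a, b, c):
    """Cournot profit with linear demand and constant marginal cost."""
    price = a - b * (q_own + q_other)
    return (price - c) * q_own

def asymmetric_equilibrium(a, b, c, k, q_d):
    """Check whether 'firm 1 defaults, firm 2 thinks' is an equilibrium.

    Firm 2 pays k to best-respond to q_d; firm 1 plays q_d for free.
    """
    q2 = best_response(q_d, a, b, c)
    # Thinker: best-responding net of k must beat playing the default for free.
    thinker_ok = profit(q2, q_d, a, b, c) - k >= profit(q_d, q_d, a, b, c)
    # Defaulter: the free default must beat paying k to best-respond to q2.
    q1_dev = best_response(q2, a, b, c)
    defaulter_ok = profit(q_d, q2, a, b, c) >= profit(q1_dev, q2, a, b, c) - k
    return thinker_ok and defaulter_ok, q2

if __name__ == "__main__":
    # At an intermediate thinking cost, asymmetric play can survive in a
    # symmetric model, as the abstract describes.
    ok, q2 = asymmetric_equilibrium(a=12.0, b=1.0, c=0.0, k=1.0, q_d=5.0)
    print(f"asymmetric equilibrium: {ok}, thinker's quantity: {q2:.2f}")
```

With these toy numbers the check passes: neither the defaulter nor the thinker gains from switching roles once the thinking cost is taken into account.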
Abstract:
This thesis proposes a methodology for modelling business interoperability in a context of cooperative industrial networks. The purpose is to develop a methodology that enables the design of cooperative industrial network platforms that are able to deliver business interoperability, and the analysis of its impact on the performance of these platforms. To achieve the proposed objective, two modelling tools have been employed: the Axiomatic Design Theory for the design of interoperable platforms, and Agent-Based Simulation for the analysis of the impact of business interoperability. The sequence in which the two modelling tools are applied depends on the scenario under analysis, i.e., whether the cooperative industrial network platform already exists. If it does not exist, the methodology suggests first applying the Axiomatic Design Theory to design different configurations of interoperable cooperative industrial network platforms, and then using Agent-Based Simulation to analyse or predict the business interoperability and operational performance of the designed configurations. Otherwise, one should start by analysing the performance of the existing platform and, based on the results, decide whether it is necessary to redesign it. If a redesign is needed, simulation is once again used to predict the performance of the redesigned platform. To explain how these two modelling tools can be applied in practice, a theoretical modelling framework, a theoretical Axiomatic Design model and a theoretical Agent-Based Simulation model are proposed. To demonstrate the applicability of the proposed methodology and to validate the proposed theoretical models, a case study regarding a Portuguese Reverse Logistics cooperative network (the Valorpneu network) and a case study regarding a Portuguese construction project (the Baixo Sabor Dam network) are presented. The findings of applying the proposed methodology to these two case studies suggest that the Axiomatic Design Theory can indeed contribute effectively to the design of interoperable cooperative industrial network platforms, and that Agent-Based Simulation provides an effective set of tools for analysing the impact of business interoperability on the performance of those platforms. However, these conclusions cannot be generalised, as only two case studies have been carried out. In terms of relevance to theory, this is the first time that the network effect is addressed in the analysis of the impact of business interoperability on the performance of networked companies, and also the first time that a holistic approach is proposed for the design of interoperable cooperative industrial network platforms. Regarding the practical implications, the proposed methodology is intended to provide industrial managers with a management tool that can guide them easily, and in a practical and systematic way, in designing configurations of interoperable cooperative industrial network platforms and/or in analysing the impact of business interoperability on the performance of their companies and the networks in which they operate.
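As a toy illustration of the Agent-Based Simulation side of the methodology, one can let pairs of agents attempt transactions whose success depends on both sides' ability to interoperate. The agents, the interoperability parameter and the throughput metric below are all invented for illustration; the thesis's actual simulation model is not described in the abstract.

```python
# Toy sketch: how a network platform's interoperability level could affect
# operational performance. All modelling choices here are assumptions.
import random

def simulate(n_agents=20, steps=1000, interop=0.8, seed=42):
    """Return the fraction of attempted transactions that complete.

    Each agent gets an interoperability capability drawn around `interop`;
    a transaction between two partners succeeds only if both sides manage
    to interoperate.
    """
    rng = random.Random(seed)
    caps = [min(1.0, max(0.0, rng.gauss(interop, 0.1))) for _ in range(n_agents)]
    completed = 0
    for _ in range(steps):
        a, b = rng.sample(range(n_agents), 2)  # two partners attempt a transaction
        if rng.random() < caps[a] * caps[b]:   # both sides must interoperate
            completed += 1
    return completed / steps

# Higher platform interoperability should translate into higher throughput.
for level in (0.5, 0.8, 0.95):
    print(f"interoperability {level:.2f} -> throughput {simulate(interop=level):.3f}")
```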
Abstract:
Making the transition between plans and unexpected occurrences is something organizations are used to doing every day. However, not much is known about how actors cope with unanticipated events and how they accommodate them within predefined schedules. In this study, we draw on an inductive analysis of aspiring filmmakers' film sets to elaborate on how they plan their shooting activities every day, only to adjust them when unforeseen complications arise. We find that film crews anchor their expectations for the day in a planned shooting schedule, yet with the built-in assumption that it will inevitably be disrupted. We argue that they resort to triage processes and "troubleshooting protocols" that help them decipher incoming problems. Familiar problems are solved by drawing on experience from past situations, whereas unprecedented problems are solved through a tacit protocol used to quickly devise an appropriate game plan. This study contributes to the literature on sense-making and offers insight into the underexplored world of filmmaking.
Abstract:
Promotions can make you happy if you get the "best" deal, or miserable if you miss it. Previous research on this topic has shown that people favor products associated with a past miss over products associated with a future miss, and that people in a maximizing mind-set, i.e., people who search for the best across different domains, feel more regret in a consumption domain. This research confirms that consumers prefer purchasing a product associated with a past miss (Experiments 1 and 2) and that regret levels are higher when participants come across the future miss under a maximizing mind-set (Experiment 2). These studies add to the notion that information on regret might prompt people to make decisions towards a more optimistic outcome.
Abstract:
Master's internship report in Pre-School Education and Teaching in the 1st Cycle of Basic Education
Abstract:
Doctoral Thesis in Information Systems and Technologies
Abstract:
Doctoral Thesis in Communication Sciences
Abstract:
Doctoral Thesis in Education Sciences - Specialization in Philosophy of Education
Abstract:
This article describes a search for high-mass resonances decaying to a pair of photons using a sample of 20.3 fb⁻¹ of pp collisions at √s = 8 TeV recorded with the ATLAS detector at the Large Hadron Collider. The data are found to be in agreement with the Standard Model prediction, and limits are reported in the framework of the Randall-Sundrum model. This theory leads to the prediction of graviton states, the lightest of which could be observed at the Large Hadron Collider. A lower limit of 2.66 (1.41) TeV at 95% confidence level is set on the mass of the lightest graviton for couplings of k/M̄_Pl = 0.1 (0.01).
Abstract:
This paper deals with the problem of estimating maintenance costs for the pitch control system of wind farm turbines. Previous investigations have estimated these costs as (traditional) "crisp" values, simply ignoring the uncertain nature of the data and information available. This paper proposes an extended version of the estimation model that makes use of Fuzzy Set Theory. The results alert decision-makers to the uncertainty of the estimations along with their overall level, thus improving the information given to the maintenance support system.
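The abstract does not give the cost model's functional form. As a minimal sketch of the fuzzy extension it describes, the snippet below propagates maintenance-cost inputs as triangular fuzzy numbers using standard fuzzy arithmetic; all figures are hypothetical.

```python
# Minimal sketch: propagating uncertain maintenance-cost inputs as
# triangular fuzzy numbers (lo, mode, hi). The arithmetic below is the
# standard fuzzy extension of addition and positive scalar multiplication;
# the paper's actual cost model is not given in the abstract.
from dataclasses import dataclass

@dataclass
class TriFuzzy:
    lo: float    # smallest plausible value
    mode: float  # most plausible value
    hi: float    # largest plausible value

    def __add__(self, other: "TriFuzzy") -> "TriFuzzy":
        return TriFuzzy(self.lo + other.lo, self.mode + other.mode, self.hi + other.hi)

    def scale(self, s: float) -> "TriFuzzy":
        # Valid for s >= 0; a negative scalar would swap lo and hi.
        return TriFuzzy(s * self.lo, s * self.mode, s * self.hi)

# Hypothetical inputs for one pitch-control system (numbers are invented).
repair_cost = TriFuzzy(800.0, 1000.0, 1500.0)  # EUR per intervention
spare_parts = TriFuzzy(200.0, 300.0, 450.0)    # EUR per intervention
failures_per_year = 2.0                        # assumed crisp here

annual_cost = (repair_cost + spare_parts).scale(failures_per_year)
print(f"annual cost: [{annual_cost.lo}, {annual_cost.mode}, {annual_cost.hi}] EUR")
```

Instead of a single crisp estimate, the decision-maker sees the whole plausible range, which is exactly the kind of uncertainty information the paper argues the maintenance support system should receive.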
Abstract:
Doctoral Thesis in Information Systems and Technologies
Abstract:
The inclusive jet cross-section is measured in proton-proton collisions at a centre-of-mass energy of 7 TeV using a data set corresponding to an integrated luminosity of 4.5 fb⁻¹ collected with the ATLAS detector at the Large Hadron Collider in 2011. Jets are identified using the anti-kt algorithm with radius parameter values of 0.4 and 0.6. The double-differential cross-sections are presented as a function of the jet transverse momentum and the jet rapidity, covering jet transverse momenta from 100 GeV to 2 TeV. Next-to-leading-order QCD calculations corrected for non-perturbative effects and electroweak effects, as well as Monte Carlo simulations with next-to-leading-order matrix elements interfaced to parton showering, are compared to the measured cross-sections. A quantitative comparison of the measured cross-sections to the QCD calculations using several sets of parton distribution functions is performed.
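For reference, the anti-kt algorithm cited above repeatedly merges the pair of objects with the smallest distance d_ij, declaring a jet when a beam distance d_iB is smallest (Cacciari, Salam & Soyez). A minimal sketch of the distance computation is shown below with invented toy constituents; real analyses use FastJet.

```python
# Minimal sketch of the anti-kt distance measure:
#   d_ij = min(pT_i^-2, pT_j^-2) * dR_ij^2 / R^2,   d_iB = pT_i^-2,
# with dR_ij^2 = (y_i - y_j)^2 + (phi_i - phi_j)^2. Toy inputs are invented.
import math

def delta_r2(y1, phi1, y2, phi2):
    """Squared rapidity-azimuth distance, with dphi wrapped into [-pi, pi]."""
    dphi = math.remainder(phi1 - phi2, 2 * math.pi)
    return (y1 - y2) ** 2 + dphi ** 2

def dij(pt1, y1, phi1, pt2, y2, phi2, R=0.4):
    """Anti-kt pairwise distance; the smallest d_ij in the event merges first."""
    return min(pt1 ** -2, pt2 ** -2) * delta_r2(y1, phi1, y2, phi2) / R ** 2

def di_beam(pt):
    """Anti-kt particle-beam distance; if smallest overall, declare a jet."""
    return pt ** -2

# Toy constituents (pT in GeV, rapidity, azimuth): a hard core plus soft radiation.
hard = (500.0, 0.0, 0.0)
soft = (5.0, 0.2, 0.1)
print("d_ij =", dij(*hard, *soft))  # dominated by the hard particle's 1/pT^2
print("d_iB =", di_beam(hard[0]))
```

Because the measure is weighted by the inverse squared transverse momentum, soft particles cluster onto nearby hard ones first, which is what gives anti-kt jets their regular, cone-like shape for the R = 0.4 and 0.6 choices used in the measurement.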