877 results for Engineering Asset Management, Optimisation, Preventive Maintenance, Reliability Based Preventive Maintenance, Multiple Criteria Decision Making
Abstract:
Peer-to-peer information sharing has fundamentally changed the customer decision-making process. Recent developments in information technologies have enabled digital sharing platforms to influence granular aspects of the information-sharing process. Despite the growing importance of digital information sharing, little research has examined the optimal design choices for a platform seeking to maximize returns from information sharing. My dissertation seeks to fill this gap. Specifically, I study novel interventions that a platform can implement at different stages of information sharing. In collaboration with a leading for-profit platform and a non-profit platform, I conduct three large-scale field experiments to causally identify the impact of these interventions on customers' sharing behaviors as well as the sharing outcomes. The first essay examines whether and how a firm can enhance social contagion by simply varying the message customers share with their friends. Using a large randomized field experiment, I find that (i) adding only information about the sender's purchase status increases the likelihood of recipients' purchase; (ii) adding only information about the referral reward increases recipients' follow-up referrals; and (iii) adding information about both the sender's purchase and the referral reward increases neither the likelihood of purchase nor follow-up referrals. I then discuss the underlying mechanisms. The second essay studies whether and how a firm can design unconditional incentives to engage customers who have already revealed a willingness to share. I conduct a field experiment to examine the impact of incentive design on senders' purchases as well as their further referral behavior. I find evidence that the incentive structure has a significant, but interestingly opposing, impact on the two outcomes. The results also provide insights into senders' motives for sharing. The third essay examines whether and how a non-profit platform can use mobile messaging to leverage recipients' social ties to encourage blood donation. I design a large field experiment to causally identify the impact of different types of information and incentives on donors' self-donation and group donation behavior. My results show that non-profits can stimulate a group effect and increase blood donation, but only with a group reward; such a group reward works by motivating a different donor population. In summary, the findings from the three studies offer valuable insights for platforms and social enterprises on how to engineer digital platforms to create social contagion. The rich data from the randomized experiments and complementary sources (archival and survey) also allow me to test the underlying mechanisms at work. In this way, my dissertation provides both managerial implications and theoretical contributions to the study of peer-to-peer information sharing.
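As a hedged illustration only (the dissertation's data and tests are not shown here), the sketch below shows the kind of analysis such randomized messaging experiments typically rely on: comparing recipients' purchase rates across two message arms with a two-proportion z-test. The arm names and counts are hypothetical placeholders.

```python
# Illustrative only: comparing purchase rates between two randomized message
# arms with a two-proportion z-test. All counts are hypothetical placeholders.
from math import sqrt

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z statistic for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return p_b - p_a, (p_b - p_a) / se

# Hypothetical arms: baseline message vs. message adding the sender's purchase status.
lift, z = two_proportion_ztest(conv_a=420, n_a=10_000, conv_b=495, n_b=10_000)
print(f"lift = {lift:.4f}, z = {z:.2f}")  # |z| > 1.96 -> significant at the 5% level
```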
Abstract:
Efficient crop monitoring and pest damage assessments are key to protecting the Australian agricultural industry and ensuring its leading position internationally. An important element in pest detection is gathering reliable crop data frequently and integrating analysis tools for decision making. Unmanned aerial systems are emerging as a cost-effective solution to a number of precision agriculture challenges. An important advantage of this technology is that it provides a non-invasive aerial sensor platform to accurately monitor broad-acre crops. In this presentation, we will give an overview of how unmanned aerial systems and machine learning can be combined to address crop protection challenges. A 2015 study on insect damage in sorghum illustrates the effectiveness of this methodology. A UAV platform equipped with a high-resolution camera was deployed to autonomously perform a flight pattern over the target area. We describe the image processing pipeline implemented to create a georeferenced orthoimage and visualize the spatial distribution of the damage. An image analysis tool has been developed to minimize human input requirements. The computer program is based on a machine learning algorithm that automatically creates a meaningful partition of the image into clusters. Results show the algorithm delivers decision boundaries that accurately classify the field into crop health levels. The methodology presented in this paper represents an avenue for further research towards automated crop protection assessments in the cotton industry, with applications in detecting, quantifying and monitoring the presence of mealybug, mite and aphid pests.
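As a rough sketch of the unsupervised partitioning step described above (the study's actual algorithm and pixel features are not specified in the abstract; k-means on per-pixel colour values is assumed here purely for illustration):

```python
# Illustrative sketch only: clustering orthoimage pixels into crop-health
# classes. The study's actual algorithm and features are not specified in the
# abstract; k-means on per-pixel RGB values is assumed for demonstration.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in for a georeferenced orthoimage: H x W x 3 (RGB), values in [0, 1].
image = rng.random((120, 160, 3))

pixels = image.reshape(-1, 3)                          # one row per pixel
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pixels)
damage_map = kmeans.labels_.reshape(image.shape[:2])   # cluster id per pixel

# Each cluster can then be inspected, mapped to a crop-health level, and its
# spatial distribution visualised over the georeferenced orthoimage.
print(np.bincount(damage_map.ravel()))
```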
Abstract:
Information Technology (IT) can be an important enabler of innovation: by supporting e-learning, it creates conditions under which an organization can develop new business and improved processes. In this regard, Learning Management Systems (LMS) allow communication and interaction between teachers and students in virtual spaces. However, the literature indicates that there are gaps in the research, especially concerning the use of IT for the management of e-learning. The purpose of this paper is to analyze the available literature on the application of LMS to e-learning management, and to present possibilities for research in the field. An integrative literature review was performed considering the Web of Science, Scopus, Ebsco and Scielo databases, where 78 references were found, of which 25 were full papers. The analysis identifies salient characteristics of the scientific studies, highlighting gaps and guidelines for future research.
Abstract:
The U.S. railroad companies spend billions of dollars every year on railroad track maintenance in order to ensure safety and operational efficiency of their railroad networks. Besides maintenance costs, other costs such as train accident costs, train and shipment delay costs and rolling stock maintenance costs are also closely related to track maintenance activities. Optimizing the track maintenance process on the extensive railroad networks is a very complex problem with major cost implications. Currently, the decision making process for track maintenance planning is largely manual and primarily relies on the knowledge and judgment of experts. There is considerable potential to improve the process by using operations research techniques to develop solutions to the optimization problems on track maintenance. In this dissertation study, we propose a range of mathematical models and solution algorithms for three network-level scheduling problems on track maintenance: track inspection scheduling problem (TISP), production team scheduling problem (PTSP) and job-to-project clustering problem (JTPCP). TISP involves a set of inspection teams which travel over the railroad network to identify track defects. It is a large-scale routing and scheduling problem where thousands of tasks are to be scheduled subject to many difficult side constraints such as periodicity constraints and discrete working time constraints. A vehicle routing problem formulation was proposed for TISP, and a customized heuristic algorithm was developed to solve the model. The algorithm iteratively applies a constructive heuristic and a local search algorithm in an incremental scheduling horizon framework. The proposed model and algorithm have been adopted by a Class I railroad in its decision making process. Real-world case studies show the proposed approach outperforms the manual approach in short-term scheduling and can be used to conduct long-term what-if analyses to yield managerial insights. PTSP schedules capital track maintenance projects, which are the largest track maintenance activities and account for the majority of railroad capital spending. A time-space network model was proposed to formulate PTSP. More than ten types of side constraints were considered in the model, including very complex constraints such as mutual exclusion constraints and consecution constraints. A multiple neighborhood search algorithm, including a decomposition and restriction search and a block-interchange search, was developed to solve the model. Various performance enhancement techniques, such as data reduction, augmented cost function and subproblem prioritization, were developed to improve the algorithm. The proposed approach has been adopted by a Class I railroad for two years. Our numerical results show the model solutions are able to satisfy all hard constraints and most soft constraints. Compared with the existing manual procedure, the proposed approach is able to bring significant cost savings and operational efficiency improvement. JTPCP is an intermediate problem between TISP and PTSP. It focuses on clustering thousands of capital track maintenance jobs (based on the defects identified in track inspection) into projects so that the projects can be scheduled in PTSP. A vehicle routing problem based model and a multiple-step heuristic algorithm were developed to solve this problem. Various side constraints such as mutual exclusion constraints and rounding constraints were considered. 
The proposed approach has been applied in practice and has shown good performance in both solution quality and efficiency.
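A minimal sketch of the "constructive heuristic plus local search" pattern described for TISP appears below. It is a toy: the real problem is a large vehicle routing problem with periodicity and discrete working-time constraints, all of which are omitted here, and the tasks, durations and due periods are fabricated for illustration.

```python
# Toy sketch of the "constructive heuristic + local search" pattern the
# abstract describes for TISP. The real model is a large vehicle routing
# problem with periodicity and working-time constraints, omitted here.
import random

random.seed(0)
tasks = [{"id": i, "dur": random.randint(1, 4), "due": random.randint(5, 30)}
         for i in range(40)]
TEAMS = 3

def schedule_cost(assign):
    """Total lateness across teams, given tasks assigned per team in order."""
    cost = 0
    for team_tasks in assign:
        t = 0
        for task in team_tasks:
            t += task["dur"]
            cost += max(0, t - task["due"])
    return cost

# Constructive phase: earliest-due-date first, to the least-loaded team.
assign = [[] for _ in range(TEAMS)]
loads = [0] * TEAMS
for task in sorted(tasks, key=lambda x: x["due"]):
    k = loads.index(min(loads))
    assign[k].append(task)
    loads[k] += task["dur"]

# Local search phase: first-improvement task swaps between pairs of teams.
improved = True
while improved:
    improved = False
    base = schedule_cost(assign)
    for a in range(TEAMS):
        for b in range(a + 1, TEAMS):
            for i in range(len(assign[a])):
                for j in range(len(assign[b])):
                    assign[a][i], assign[b][j] = assign[b][j], assign[a][i]
                    if schedule_cost(assign) < base:
                        improved = True
                        base = schedule_cost(assign)
                    else:  # revert a non-improving swap
                        assign[a][i], assign[b][j] = assign[b][j], assign[a][i]
print("total lateness:", schedule_cost(assign))
```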
Abstract:
Portuguese rural landscapes, built up and maintained over time by traditional agricultural systems, are today threatened by reasons as diverse as an ageing population, rural abandonment, intensification, the homogenization of production systems and a loss of competitiveness. Despite these problems, these agricultural landscapes still support a multitude of non-commodity functions; in particular, they constitute an important reservoir of biodiversity, so their maintenance is important for the conservation of these habitats and species. New forms of management of these landscapes may need to be created, namely by combining several functions in a multifunctional perspective, through the adaptation and integration of public policies. With the new Rural Development programme, the definition of the future Agri-Environmental Measures, and the management and financing of the Natura 2000 Network currently under discussion, we are at a critical moment for future decisions, which will necessarily have to connect agriculture, the environment and the development of Portuguese rural areas. To better understand these problems, namely the transformations taking place in peripheral Portuguese rural areas and, in particular, the role of the Agri-Environmental Measures, and to present possible solutions, a case study was carried out in the municipality of Marvão, typical of the marginal agricultural areas of the interior of southern Portugal.
Abstract:
Understanding and predicting patterns of distribution and abundance of marine resources is important for conservation and management purposes in small-scale artisanal fisheries and industrial fisheries worldwide. The goose barnacle (Pollicipes pollicipes) is an important shellfish resource and its distribution is closely related to wave exposure at different spatial scales. We modelled the abundance (percent coverage) of P. pollicipes as a function of a simple wave exposure index based on fetch estimates from digitized coastlines at different spatial scales. The model explained 47.5% of the deviance and indicated that barnacle abundance increases non-linearly with wave exposure at both the smallest (metres) and largest (kilometres) spatial scales considered in this study. Distribution maps were predicted for the study region in SW Portugal. Our study suggests that the relationship between fetch-based exposure indices and P. pollicipes percent cover may be used as a simple tool for providing stakeholders with information on barnacle distribution patterns. This information may improve the assessment of harvesting grounds and the dimension of exploitable areas, aiding management plans and supporting decision making on conservation, harvesting pressure and surveillance strategies for this highly appreciated and socio-economically important marine resource.
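For readers unfamiliar with fetch-based indices, the sketch below illustrates one simple way such an index can be computed: from a shore point, sum the open-water distance (capped at a maximum fetch) along a set of compass bearings until a ray hits the digitized coastline. The paper's exact index and model are not reproduced; the coastline test, bearings and caps here are assumptions.

```python
# Hedged sketch of a simple fetch-based wave exposure index. The paper's
# exact index and the model linking it to P. pollicipes cover are not
# reproduced; bearings, step size and fetch cap below are assumptions.
import math

def fetch_index(point, is_land, n_bearings=16, max_fetch_km=50.0, step_km=0.5):
    """Sum of open-water distances from `point` along n_bearings directions."""
    total = 0.0
    for k in range(n_bearings):
        theta = 2 * math.pi * k / n_bearings
        d = 0.0
        while d < max_fetch_km:
            d += step_km
            x = point[0] + d * math.cos(theta)
            y = point[1] + d * math.sin(theta)
            if is_land(x, y):      # ray hits the digitized coastline
                break
        total += min(d, max_fetch_km)
    return total

# Toy coastline: land occupies x < 0, open sea x >= 0.
exposure = fetch_index((0.0, 0.0), is_land=lambda x, y: x < 0)
print(f"exposure index: {exposure:.1f} km of summed fetch")
```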
Abstract:
Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for progressing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists. We will discuss our experiences of implementing these concepts in collaboration with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes to the actual application of these guidelines, considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented. Often they are not very useful for answering speculative 'what-if' questions, particularly when one is interested in the development of the system over time rather than just the state of the system at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of simulated actors at the micro level. Structures emerge at the macro level as a result of the actions of the agents and their interactions with other agents and the environment. We will show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as "will staff setting their own break times improve performance?" can be investigated.
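A minimal agent-based sketch of the break-time question above is shown below. It is not the authors' model: the staffing level, arrival rate and decision rule are invented, but it illustrates how a micro-level decision rule (self-set versus fixed breaks) produces an emergent macro-level performance figure.

```python
# Minimal agent-based sketch (illustrative only, not the authors' model):
# staff agents either take fixed breaks or choose their own, and the daily
# number of customers served emerges from the agents' micro-level decisions.
import random

random.seed(1)

class StaffAgent:
    def __init__(self, self_set_breaks):
        self.self_set_breaks = self_set_breaks
        self.energy = 1.0

    def on_break(self, minute, queue_len):
        if self.self_set_breaks:
            # Agent decides: break only when tired and the queue is short.
            return self.energy < 0.4 and queue_len < 3
        return 240 <= minute < 270            # fixed half-hour break

def simulate(self_set_breaks, minutes=480):
    staff = [StaffAgent(self_set_breaks) for _ in range(5)]
    queue, served = 0, 0
    for minute in range(minutes):
        queue += random.random() < 0.5        # ~1 arrival every 2 minutes
        for agent in staff:
            if agent.on_break(minute, queue):
                agent.energy = min(1.0, agent.energy + 0.02)
            elif queue > 0:
                queue -= 1
                served += 1
                agent.energy = max(0.0, agent.energy - 0.005)
    return served

print("fixed breaks:   ", simulate(False))
print("self-set breaks:", simulate(True))
```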
Abstract:
When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but is run, and the changes of system states can be observed at any point in time. This provides an insight into system dynamics rather than just predicting the output of a system based on specific inputs. Simulation is not a decision making tool but a decision support tool, allowing better informed decisions to be made. Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system's performance. It can be viewed as an artificial white room which allows one to gain insight, but also to test new theories and practices, without disrupting the daily routine of the focal organisation. What you can expect to gain from a simulation study is well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, the model allows you to answer questions such as:
· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?
The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends, rather than giving precise and absolute predictions of the target system's performance. The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best practice guidelines. One needs a good working knowledge of the behaviour of the real system to fully exploit the understanding gained from simulation experiments. The goal of this chapter is to prepare the newcomer for what we think is a valuable addition to the toolset of analysts and decision makers. We will give you a summary of information we have gathered from the literature and of the first-hand experience we have gained during the last five years, while obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some pitfalls that we have unwittingly encountered.
Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science, with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements, to prepare you for Section 4, where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system. Section 6 provides a collection of resources for further study, and finally, in Section 7, we conclude the chapter with a short summary.
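As a small illustration of the statistical treatment recommended above (assuming a stand-in stochastic model rather than any model from this chapter), the sketch below replicates a simulation with independent seeds and reports a mean with a 95% confidence interval instead of a single-run figure.

```python
# Sketch of the recommended statistical treatment: never report a single
# simulation run; replicate with independent seeds and report a confidence
# interval. The "model" here is a stand-in stochastic experiment.
import random
import statistics

def one_replication(seed):
    rng = random.Random(seed)
    # Placeholder for a full simulation run returning one performance figure.
    return sum(rng.gauss(10.0, 2.0) for _ in range(100))

results = [one_replication(seed) for seed in range(30)]
mean = statistics.mean(results)
half_width = 1.96 * statistics.stdev(results) / len(results) ** 0.5
print(f"mean = {mean:.1f}, 95% CI = [{mean - half_width:.1f}, {mean + half_width:.1f}]")
```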
Abstract:
Polymer aluminum electrolytic capacitors were introduced to provide an alternative to liquid electrolytic capacitors. The capacitance and ESR of polymer electrolytic capacitors are less temperature dependent than those of liquid aluminum electrolytic capacitors. Furthermore, the electrical conductivity of the polymer used in these capacitors, poly(3,4-ethylenedioxythiophene) (PEDOT), is orders of magnitude higher than that of the electrolytes used in liquid aluminum electrolytic capacitors, resulting in capacitors with much lower equivalent series resistance that are suitable for use in high ripple-current applications. The presence of the moisture-sensitive polymer PEDOT raises concerns about the reliability of polymer aluminum capacitors in high-humidity conditions. Highly accelerated stress testing (HAST) at 110 °C and 85% relative humidity, in which the parts were subjected to unbiased HAST conditions for 700 hours, was performed to understand the design factors that contribute to the degradation of polymer aluminum electrolytic capacitors exposed to HAST conditions. A large-scale study involving capacitors of different electrical ratings (2.5 V – 16 V, 100 µF – 470 µF), mounting types (surface-mount and through-hole) and manufacturers (6 different manufacturers) was conducted to determine the relationship between package geometry and reliability in high temperature-humidity conditions. A geometry-based HAST test, in which part selection limited variations between capacitor samples to geometric differences only, was performed to analyze the effect of package geometry on humidity-driven degradation more closely. Raman spectroscopy, X-ray imaging, environmental scanning electron microscopy and destructive analysis of the capacitors after HAST exposure were used to determine the failure mechanisms of polymer aluminum capacitors under high temperature-humidity conditions.
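The abstract specifies the HAST conditions (110 °C, 85% RH, 700 hours) but no acceleration model. For context, Peck's temperature-humidity relationship is a commonly used model for relating such stress conditions to field conditions; the sketch below computes its acceleration factor with typical literature values for the exponent and activation energy, which are assumptions, not values from this study.

```python
# Hedged illustration: Peck's temperature-humidity acceleration model, often
# used for HAST-like conditions. Exponent n and activation energy Ea below
# are typical literature values, not values from this study.
from math import exp

K_BOLTZMANN = 8.617e-5  # eV/K

def peck_acceleration_factor(t_use_c, rh_use, t_stress_c, rh_stress,
                             n=3.0, ea_ev=0.8):
    """Peck model: AF = (RH_s/RH_u)^n * exp[(Ea/k)(1/T_u - 1/T_s)], T in kelvin."""
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    humidity_term = (rh_stress / rh_use) ** n
    arrhenius_term = exp((ea_ev / K_BOLTZMANN) * (1 / t_use - 1 / t_stress))
    return humidity_term * arrhenius_term

# Assumed field conditions: 40 degC / 60% RH.
af = peck_acceleration_factor(t_use_c=40, rh_use=60, t_stress_c=110, rh_stress=85)
print(f"AF ~ {af:.0f}; 700 h of HAST ~ {700 * af / 8760:.0f} years at 40 C / 60% RH")
```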
Abstract:
This dissertation was developed during the second year of the Master's programme in Mechanical Engineering – Industrial Management branch at the Instituto Superior de Engenharia do Porto. The project was carried out in an industrial environment, namely at Tubembal, S.A., a company located in the municipality of Trofa, district of Porto. The company converts paper and trades in packaging, producing cardboard tubes and edge protectors, and is currently the largest company in its sector in the Iberian Peninsula. This dissertation is based on the application of Lean tools to improve an industrial production environment, improving the performance of existing processes and consequently the productivity of the company under study, with the goal of making it more competitive in a global environment. The main objective of the Lean methodology is the elimination of waste throughout the value chain, and it is therefore fundamental to the culture of continuous improvement and customer focus that the company intends to establish. A thorough analysis of the entire value chain was performed to identify the biggest sources of waste, and measures to combat them were then proposed, thereby reducing costs. The main actions in the improvement project presented to the organization are the implementation of the 5S methodology, an essential tool for changing employees' habits and for integrating and involving everyone in a common project in the pursuit of continuous improvement, and the simulation of several proposals for reorganizing the layout, in order to find the one that minimizes the cost of material movements and guarantees a controlled and safe flow of products and people inside the factory. The proposals presented show that reorganizing the factory layout can bring significant gains for the company: a direct reduction in time lost in travel, greater availability of resources and a consequent direct reduction in costs. All of the proposals aim to adapt the company to a more dynamic business model, capable of responding quickly and effectively to its customers, adapting to the market and ensuring its sustainability in the near future.
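As a hedged illustration of the layout-evaluation logic described above (the company's actual flows, departments and distances are not in the abstract; all numbers below are invented), a candidate layout can be scored by total daily travel, summing trips times rectilinear distance over department pairs:

```python
# Illustrative sketch of layout comparison: score each candidate layout by
# total travel effort, summing (trips per day) x (rectilinear distance) over
# department pairs. Flows, departments and coordinates are made up.

# Daily material flows between departments (trips/day) - placeholder values.
flows = {("cutting", "winding"): 60, ("winding", "packing"): 45,
         ("cutting", "warehouse"): 20, ("packing", "warehouse"): 50}

# Candidate layouts: department -> (x, y) position in metres.
layouts = {
    "current":  {"cutting": (0, 0), "winding": (40, 0),
                 "packing": (0, 30), "warehouse": (40, 30)},
    "proposal": {"cutting": (0, 0), "winding": (15, 0),
                 "packing": (30, 0), "warehouse": (45, 0)},
}

def movement_cost(layout):
    cost = 0.0
    for (a, b), trips in flows.items():
        (xa, ya), (xb, yb) = layout[a], layout[b]
        cost += trips * (abs(xa - xb) + abs(ya - yb))  # rectilinear distance
    return cost

for name, layout in layouts.items():
    print(f"{name}: {movement_cost(layout):.0f} m travelled per day")
```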
Abstract:
The anticipated growth of air traffic worldwide requires enhanced Air Traffic Management (ATM) technologies and procedures to increase the system capacity, efficiency, and resilience, while reducing environmental impact and maintaining operational safety. To deal with these challenges, new automation and information exchange capabilities are being developed through different modernisation initiatives toward a new global operational concept called Trajectory Based Operations (TBO), in which aircraft trajectory information becomes the cornerstone of advanced ATM applications. This transformation will lead to higher levels of system complexity requiring enhanced Decision Support Tools (DST) to aid humans in the decision making processes. These will rely on accurate predicted aircraft trajectories, provided by advanced Trajectory Predictors (TP). The trajectory prediction process is subject to stochastic effects that introduce uncertainty into the predictions. Regardless of the assumptions that define the aircraft motion model underpinning the TP, deviations between predicted and actual trajectories are unavoidable. This thesis proposes an innovative method to characterise the uncertainty associated with a trajectory prediction based on the mathematical theory of Polynomial Chaos Expansions (PCE). Assuming univariate PCEs of the trajectory prediction inputs, the method describes how to generate multivariate PCEs of the prediction outputs that quantify their associated uncertainty. Arbitrary PCE (aPCE) was chosen because it allows a higher degree of flexibility to model input uncertainty. The obtained polynomial description can be used in subsequent prediction sensitivity analyses thanks to the relationship between polynomial coefficients and Sobol indices. The Sobol indices enable ranking the input parameters according to their influence on trajectory prediction uncertainty. The applicability of the aPCE-based uncertainty quantification detailed herein is analysed through a case study. This case study represents a typical aircraft trajectory prediction problem in ATM, in which uncertain parameters regarding aircraft performance, aircraft intent description, weather forecast, and initial conditions are considered simultaneously. Numerical results are compared to those obtained from a Monte Carlo simulation, demonstrating the advantages of the proposed method. The thesis includes two examples of DSTs (Demand and Capacity Balancing tool, and Arrival Manager) to illustrate the potential benefits of exploiting the proposed uncertainty quantification method.
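The sketch below illustrates the core PCE idea on a toy two-input problem: fit a polynomial chaos surrogate by least squares and read first-order Sobol indices directly off the squared coefficients of an orthonormal basis. Uniform inputs with a Legendre basis are assumed for simplicity; the thesis itself uses arbitrary PCE (aPCE), which generalises the basis to arbitrary input distributions, and the stand-in model function here is not a real trajectory predictor.

```python
# Hedged sketch of the PCE idea: fit a polynomial chaos surrogate, then read
# Sobol sensitivity indices off its coefficients. Uniform inputs and a
# Legendre basis are assumed; the thesis uses aPCE for general inputs.
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

def model(x1, x2):
    # Stand-in for a trajectory predictor output (e.g., arrival-time error).
    return x1 + 0.5 * x2**2 + 0.3 * x1 * x2

def psi(k, x):
    """Orthonormal Legendre polynomial of degree k on [-1, 1]."""
    return np.sqrt(2 * k + 1) * legendre.Legendre.basis(k)(x)

# Multi-indices up to total degree 2 for two inputs.
alphas = [(i, j) for i in range(3) for j in range(3) if i + j <= 2]

# Fit coefficients by least squares on random samples (non-intrusive PCE).
x = rng.uniform(-1, 1, size=(500, 2))
y = model(x[:, 0], x[:, 1])
A = np.column_stack([psi(i, x[:, 0]) * psi(j, x[:, 1]) for (i, j) in alphas])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# With an orthonormal basis, each squared coefficient is a variance share.
var_total = sum(c**2 for (i, j), c in zip(alphas, coeffs) if (i, j) != (0, 0))
s1 = sum(c**2 for (i, j), c in zip(alphas, coeffs) if i > 0 and j == 0) / var_total
s2 = sum(c**2 for (i, j), c in zip(alphas, coeffs) if j > 0 and i == 0) / var_total
print(f"first-order Sobol indices: S1 = {s1:.3f}, S2 = {s2:.3f}")
```

The remaining variance share (1 − S1 − S2) is attributable to the x1·x2 interaction term, which is exactly the kind of decomposition that lets the input parameters be ranked by their influence on prediction uncertainty.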
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives in the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, to develop an appropriate methodology for risk management, to evaluate existing preservation evaluation approaches and metrics, to structure best practice knowledge and, lastly, to demonstrate a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. We summarise the extent of the challenge facing the digital preservation community (and by extension users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, the increasing complexity and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively, they demand an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome, and an expression of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity, a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies, in terms of both their manifestation and mitigation. They can be deconstructed into their atomic units, and responsibility for their resolution can be delegated appropriately. We go on to describe how the manifestation of risks typically spans an entire organisational environment, and how taking risk as the focus of our analysis safeguards against the omissions that may occur when pursuing functional, departmental or role-based assessments. We discuss the importance of relating risk factors, whether through the risks themselves or through associated system elements.
Doing so yields the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community. We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally, we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and by exposing the resource and associated applications to evaluation by the digital preservation community.
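As a minimal illustration of the risk characterisation above (likelihood and impact combined into an exposure score, with mitigation responsibilities attached), the sketch below ranks toy preservation risks; it does not reproduce the PORRO ontology's actual classes or properties.

```python
# Minimal sketch of risk characterisation: a risk carries a likelihood and an
# impact, exposure combines them, and mitigations can be delegated. This is
# an illustration only and does not reproduce the PORRO ontology.
from dataclasses import dataclass, field

@dataclass
class Risk:
    name: str
    likelihood: float            # probability of the negative outcome, 0..1
    impact: float                # severity if it occurs, 0..1
    mitigations: list = field(default_factory=list)

    @property
    def exposure(self) -> float:
        return self.likelihood * self.impact

risks = [
    Risk("format obsolescence", 0.6, 0.9, ["migration policy"]),
    Risk("storage media failure", 0.3, 0.8, ["replication", "fixity checks"]),
    Risk("loss of rendering context", 0.4, 0.7, ["representation information"]),
]

# Rank risks so mitigation effort can be delegated to the highest exposures.
for risk in sorted(risks, key=lambda r: r.exposure, reverse=True):
    print(f"{risk.exposure:.2f}  {risk.name}  -> {', '.join(risk.mitigations)}")
```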
Abstract:
This dissertation investigates customer behavior modeling in service outsourcing and revenue management in the service sector (e.g., the airline and hotel industries). In particular, it focuses on a common theme of improving firms' strategic decisions through the understanding of customer preferences. Decisions concerning degrees of outsourcing, such as firms' capacity choices, are important to performance outcomes. These choices are especially important in high-customer-contact services (e.g., the airline industry) because of the characteristics of services: simultaneity of consumption and production, and intangibility and perishability of the offering. Essay 1 estimates how outsourcing affects customer choices and market share in the airline industry, and consequently the revenue implications of outsourcing. However, outsourcing decisions are typically endogenous: a firm may choose whether or not to outsource based on what it expects to be the best outcome. Essay 2 contributes to the literature by proposing a structural model that captures a firm's profit-maximizing decision-making behavior in a market. This makes it possible to predict the consequences (i.e., performance outcomes) of future strategic moves. Another emerging area in service operations management is revenue management. Choice-based revenue systems incorporate discrete choice models into traditional revenue management algorithms. To successfully implement a choice-based revenue system, it is necessary to estimate customer preferences as a valid input to the optimization algorithms. The third essay investigates how to estimate customer preferences when part of the market is consistently unobserved. This issue is especially prominent in choice-based revenue management systems: normally a firm observes only its own purchases, while customers who purchase from competitors or do not purchase at all are unobserved. Most current estimation procedures depend on unrealistic assumptions about customer arrivals. This study proposes a new estimation methodology that does not require any prior knowledge about the customer arrival process and allows for arbitrary demand distributions. Compared with previous methods, this model performs better when true demand is highly variable.
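As background for the choice-based revenue management discussion, the sketch below shows the standard multinomial logit building block such systems incorporate: choice probabilities over an offered set plus a no-purchase option. The product names and utilities are invented, and the sketch deliberately ignores the essay's central difficulty, namely that no-purchase and competitor choices are unobserved in a firm's own sales data.

```python
# Sketch of the discrete-choice building block referenced above: a
# multinomial logit over offered fare products plus a no-purchase option.
# Product names and utilities are illustrative placeholders.
import math

def mnl_probabilities(utilities, u_no_purchase=0.0):
    """P(j) = exp(u_j) / (exp(u0) + sum_k exp(u_k)) for offered products j."""
    denom = math.exp(u_no_purchase) + sum(math.exp(u) for u in utilities.values())
    probs = {j: math.exp(u) / denom for j, u in utilities.items()}
    probs["no_purchase"] = math.exp(u_no_purchase) / denom
    return probs

offer_set = {"economy_restricted": -0.2, "economy_flex": -0.8, "business": -1.5}
for product, p in mnl_probabilities(offer_set).items():
    print(f"{product:>18}: {p:.3f}")
```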
Abstract:
Master's programme in Mediterranean Forestry and Natural Resources Management - Instituto Superior de Agronomia - UL
Abstract:
In this study, we examine the relationship between good corporate governance practices and the creation of value/performance of credit unions from 2010 to 2012. The objective was to create and validate a corporate governance index for credit unions, and then to analyse the relationship between good governance practices and the creation of value/performance. The research question is: do good corporate governance practices create value for credit unions? The research started by creating indices from factor analysis to identify latent dependent variables related to value creation and performance; next, indices were created from principal component analysis to obtain independent latent variables related to corporate governance. Finally, panel-data regression models were used to verify the influence of the corporate governance variables and indices on the indices of value creation and performance. The research shows that the Corporate Governance Index (IGC) is mainly driven by Executive Management, which accounts for 40.31% of the IGC value, followed by the Representation and Participation dimension, with 34.07% of the IGC value. The academic contribution is the creation of a Corporate Governance Index (IGC) applicable to credit unions. As for the contribution to the credit union system, the highlight is the evidence on the effectiveness of the mechanisms for economic-financial and asset management adopted by BACEN, the credit unions and OCEMG.
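As a hedged illustration of index construction by principal component analysis (the study's indicators, loadings and weighting scheme are not reproduced; the data below are random placeholders), a composite governance index can be built by standardising indicators, extracting components and weighting the component scores by explained variance:

```python
# Illustrative sketch of PCA-based index construction: standardise
# governance indicators, extract principal components, and combine them into
# one index weighted by explained variance. Data and weights are placeholders.
import numpy as np

rng = np.random.default_rng(0)
# Rows: credit unions; columns: governance indicators (placeholder data).
X = rng.random((50, 6))

Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardise indicators
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = S**2 / np.sum(S**2)                  # variance share per component

k = 2                                            # retain the first k components
scores = Xs @ Vt[:k].T                           # component scores per union
igc = scores @ explained[:k]                     # variance-weighted composite

# Rescale to 0-100 for readability, as composite indices often are.
igc = 100 * (igc - igc.min()) / (igc.max() - igc.min())
print("first five index values:", np.round(igc[:5], 1))
```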