Abstract:
Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for progressing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists. We will discuss our experiences of implementing these concepts working with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes down to the actual application of these guidelines considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented. Often they are not very useful for giving answers to speculative ‘what-if’ questions, particularly when one is interested in the development of the system over time rather than just the state of the system at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of simulated actors at the micro level. Structures emerge at the macro level as a result of the actions of the agents and their interactions with other agents and the environment. We will show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as “will staff setting their own break times improve performance?” can be investigated.
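The break-time question above can be sketched as a toy agent-based model. The code below is a minimal illustration only, not the authors' department-store model: the staffing level, demand profile, and break rule (empowered staff avoid the lunchtime peak) are all invented assumptions.

```python
import random

class StaffAgent:
    """Toy service agent; an 'empowered' agent picks its own break slot."""
    def __init__(self, empowered, rng):
        # hypothetical decision rule: empowered staff choose an off-peak break,
        # otherwise a fixed mid-shift break is imposed by management
        self.break_hour = rng.choice([10, 15]) if empowered else 13

    def available(self, hour):
        return hour != self.break_hour

def simulate(empowered, staff_n=5, seed=0):
    """One trading day; macro outcome (customers served/lost) emerges
    from micro-level agent availability decisions."""
    rng = random.Random(seed)
    staff = [StaffAgent(empowered, rng) for _ in range(staff_n)]
    served = lost = 0
    for hour in range(9, 17):
        demand = 8 if hour in (12, 13) else 4    # assumed lunchtime peak
        capacity = sum(s.available(hour) for s in staff)
        served += min(demand, capacity)
        lost += max(0, demand - capacity)
    return served, lost
```

Comparing `simulate(False)` with `simulate(True)` is the kind of 'what-if' experiment the abstract describes: under these toy assumptions, imposing every break at the lunchtime peak loses more customers than letting agents choose off-peak slots.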
Abstract:
This dissertation focuses on coordinated pricing and inventory management problems; the relevant background is provided in Chapter 1. Several periodic-review models are then discussed in Chapters 2 to 5. Chapter 2 analyzes a deterministic single-product model in which a price adjustment cost is incurred whenever the selling price changes from the previous period. We develop exact algorithms for the problem under different conditions and find that computational complexity varies significantly with the cost structure. Moreover, our numerical study indicates that dynamic pricing strategies may outperform static pricing strategies even when the price adjustment cost accounts for a significant portion of the total profit. Chapter 3 develops a single-product model in which demand in a period depends not only on the current selling price but also on past prices through the so-called reference price. Strongly polynomial time algorithms are designed for the case with no fixed ordering cost, and a heuristic with an error bound is proposed for the general case. Moreover, we illustrate through numerical studies that incorporating the reference price effect into coordinated pricing and inventory models can have a significant impact on firms' profits. Chapter 4 discusses the stochastic version of the Chapter 3 model when customers are loss averse. It extends the associated results in the literature and proves that a reference-price-dependent base-stock policy is optimal under certain conditions. Rather than dealing with specific problems, Chapter 5 establishes the preservation of supermodularity in a class of optimization problems.
This property and its extensions include several existing results in the literature as special cases, and provide powerful tools as we illustrate their applications to several operations problems: the stochastic two-product model with cross-price effects, the two-stage inventory control model, and the self-financing model.
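The Chapter 2 trade-off between margin gains and the price adjustment cost can be illustrated with a small dynamic program. This is a hedged sketch, not the dissertation's exact algorithm: prices are restricted to a hypothetical discrete grid, the demand curve and costs are invented, and inventory is assumed to be produced to order at unit cost, so only the pricing decision remains.

```python
# hypothetical data: T periods, a discrete price grid, linear demand
T = 4
prices = [8.0, 9.0, 10.0]
unit_cost, adjust_cost = 5.0, 6.0          # production cost and price-change fee

def demand(p):
    return max(0.0, 20.0 - 1.5 * p)        # assumed demand curve

def best_profit():
    # value[p]: best total profit over the periods so far, ending at price p
    value = {p: demand(p) * (p - unit_cost) for p in prices}   # period 1: no fee
    for _ in range(T - 1):
        # pay adjust_cost only when this period's price p differs from last period's q
        value = {p: demand(p) * (p - unit_cost)
                    + max(value[q] - (adjust_cost if q != p else 0.0)
                          for q in prices)
                 for p in prices}
    return max(value.values())
```

With these numbers the per-period margin gain from any price change is smaller than the adjustment fee, so the DP settles on a static price, illustrating how the cost structure drives the static-versus-dynamic outcome.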
Abstract:
The anticipated growth of air traffic worldwide requires enhanced Air Traffic Management (ATM) technologies and procedures to increase the system capacity, efficiency, and resilience, while reducing environmental impact and maintaining operational safety. To deal with these challenges, new automation and information exchange capabilities are being developed through different modernisation initiatives toward a new global operational concept called Trajectory Based Operations (TBO), in which aircraft trajectory information becomes the cornerstone of advanced ATM applications. This transformation will lead to higher levels of system complexity requiring enhanced Decision Support Tools (DST) to aid humans in the decision making processes. These will rely on accurate predicted aircraft trajectories, provided by advanced Trajectory Predictors (TP). The trajectory prediction process is subject to stochastic effects that introduce uncertainty into the predictions. Regardless of the assumptions that define the aircraft motion model underpinning the TP, deviations between predicted and actual trajectories are unavoidable. This thesis proposes an innovative method to characterise the uncertainty associated with a trajectory prediction based on the mathematical theory of Polynomial Chaos Expansions (PCE). Assuming univariate PCEs of the trajectory prediction inputs, the method describes how to generate multivariate PCEs of the prediction outputs that quantify their associated uncertainty. Arbitrary PCE (aPCE) was chosen because it allows a higher degree of flexibility to model input uncertainty. The obtained polynomial description can be used in subsequent prediction sensitivity analyses thanks to the relationship between polynomial coefficients and Sobol indices. The Sobol indices enable ranking the input parameters according to their influence on trajectory prediction uncertainty. 
The applicability of the aPCE-based uncertainty quantification detailed herein is analysed through a case study representing a typical aircraft trajectory prediction problem in ATM, in which uncertain parameters regarding aircraft performance, aircraft intent description, weather forecast, and initial conditions are considered simultaneously. Numerical results are compared to those obtained from a Monte Carlo simulation, demonstrating the advantages of the proposed method. The thesis includes two examples of DSTs (a Demand and Capacity Balancing tool and an Arrival Manager) to illustrate the potential benefits of exploiting the proposed uncertainty quantification method.
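As a rough illustration of the PCE-to-Sobol pipeline, the sketch below fits a multivariate Hermite PCE to a toy flight-time predictor with two standardised Gaussian inputs and reads first-order Sobol indices straight off the coefficients. This is not the thesis's aPCE method (which constructs the basis from arbitrary input distributions); the predictor, its inputs, and its coefficients are invented for the example.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)

def he(n, x):
    """Probabilists' Hermite polynomial He_n, orthonormal under N(0,1)."""
    c = np.zeros(n + 1); c[n] = 1.0
    return np.polynomial.hermite_e.hermeval(x, c) / np.sqrt(factorial(n))

# toy trajectory predictor: flight time [s] driven by mass and wind errors
def predict(mass_err, wind_err):
    return 3600 + 40 * mass_err + 120 * wind_err + 15 * mass_err * wind_err

N = 20000
m = rng.standard_normal(N)     # standardised mass uncertainty
w = rng.standard_normal(N)     # standardised wind uncertainty
y = predict(m, w)

# multivariate PCE basis: tensor products up to total degree 2
multi = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]
A = np.column_stack([he(i, m) * he(j, w) for i, j in multi])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# by orthonormality, output variance is the sum of squared non-constant coefficients
var_total = np.sum(coef[1:] ** 2)
S_mass = sum(c**2 for (i, j), c in zip(multi, coef) if i > 0 and j == 0) / var_total
S_wind = sum(c**2 for (i, j), c in zip(multi, coef) if j > 0 and i == 0) / var_total
```

Here the wind term dominates the variance, so `S_wind` far exceeds `S_mass`, exactly the kind of ranking of input influences the abstract describes.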
Abstract:
Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, the development of an appropriate methodology for risk management, the evaluation of existing preservation evaluation approaches and metrics, the structuring of best practice knowledge and lastly the demonstration of a range of tools that utilise our findings. We describe a mixed methodology that uses interview and survey, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. 
We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, increasing complexity, and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they demand an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is a particular priority. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome combined with the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies, in terms of both their manifestation and their mitigation; they can be deconstructed into atomic units, and responsibility for their resolution delegated appropriately. We go on to describe how the manifestation of risks typically spans an entire organisational environment, and how adopting risk as the focus of our analysis safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or associated system elements; doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community.
We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses in the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are exposed by revisiting legacy studies and exposing the resource and associated applications to evaluation by the digital preservation community.
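To make the notion of atomic, related risk elements concrete, here is a minimal data-structure sketch. The class and field names are hypothetical illustrations in the spirit of PORRO, not its actual vocabulary, and the likelihood-times-impact exposure score is a common risk-management convention rather than a claim about the thesis's metrics.

```python
from dataclasses import dataclass, field

@dataclass
class Mitigation:
    """An atomic mitigation, with responsibility delegated to a role."""
    action: str
    owner: str

@dataclass
class Risk:
    """A risk characterised by likelihood, impact, and related elements."""
    name: str
    likelihood: float                 # probability of the negative outcome
    impact: float                     # severity if it occurs
    activities: list = field(default_factory=list)
    mitigations: list = field(default_factory=list)

    def exposure(self):
        # conventional scalar expression of risk: likelihood x impact
        return self.likelihood * self.impact

# hypothetical example instance
obsolescence = Risk("Format obsolescence", likelihood=0.3, impact=0.9,
                    mitigations=[Mitigation("Migrate to an open format",
                                            "Archivist")])
```

Linking such instances to repository and object descriptions is what lifts a flat risk register into the kind of relatable knowledge base the thesis proposes.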
Abstract:
Part 15: Performance Management Frameworks
Abstract:
Part 14: Interoperability and Integration
Abstract:
Part 2: Behaviour and Coordination
Abstract:
Doctorate in Agronomic Engineering - Instituto Superior de Agronomia - UL
Abstract:
Master's dissertation, Tourism Economics and Regional Development, Faculdade de Economia, Universidade do Algarve, 2016
Abstract:
Benefitting from Web 2.0 features, Social Media allows organisations to be where the users are, creating proximity, talking to them, and knowing what they want. Going viral and word-of-mouth become easier, as these platforms allow us to share, to like, and to use multimedia and convergence – as they can interact with each other, communicating on a large scale. Given that online portals provide for a highly competitive environment, players strive to get more visits, better search rankings, and even aspire to be the homepage for the Web universe. We discuss the integration of Social Media tools in a Web Portal, and explore how using these together may improve the competitiveness of a Web Portal. A large Web Portal was selected to develop this case study. We found that, although for this particular Web Portal conditions were created to accommodate and integrate the chosen Social Media platforms, this was done in an organic and fluid way, with great focus on community construction and less focus on absorptive capacity. Based on the findings of this case study, we propose a dynamic cycle of benefits for integrating Social Media tools in a Web Portal.
Abstract:
Bubo bubo is the largest owl in the world, with a wide geographical distribution throughout the Palaearctic region. It underwent a demographic decline in many European countries during the last century and was considered "vulnerable" (Annex II of CITES). Nowadays, it is classified as "Least Concern" by the IUCN. Despite its ecological importance and conservation status, few polymorphic molecular markers are available to study its diversity and population genetics. We report the isolation and development of 10 new microsatellites for the Eagle owl, B. bubo. All 10 loci (tetra-nucleotide repeats) show high levels of polymorphism: the number of alleles ranged from 5 to 13, and expected heterozygosity varied from 0.733 to 0.840. These microsatellites will be very useful for assessing the genetic diversity, connectivity patterns and parentage of B. bubo. This information will make it possible to establish new conservation strategies and improve the management of the species.
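Expected heterozygosity values like those reported follow a standard population-genetics formula. The sketch below uses Nei's unbiased estimator, H = (n/(n-1))(1 - Σp²); this choice of estimator is an assumption, since the abstract does not state which one was used.

```python
from collections import Counter

def expected_heterozygosity(alleles):
    """Nei's unbiased expected heterozygosity from a list of sampled alleles.

    `alleles` holds one entry per gene copy (two per diploid individual).
    """
    n = len(alleles)                                   # number of gene copies
    freqs = [count / n for count in Counter(alleles).values()]
    return (n / (n - 1)) * (1.0 - sum(p * p for p in freqs))
```

For instance, two alleles at equal frequency in a sample of four gene copies give an expected heterozygosity of 2/3.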
Abstract:
A better understanding of grapevine responses to drought and high air temperatures can help optimize vineyard management to improve water use efficiency, yield and berry quality. Fast and robust field phenotyping tools are needed in modern precision viticulture, in particular in dry and hot regions such as the Mediterranean. Canopy temperature (Tc) is commonly used to monitor water stress in plants/crops and to characterize stomatal physiology in different woody species, including Vitis vinifera. Thermography permits remote determination of leaf surface or canopy temperature in the field, as well as assessment of the range and spatial distribution of temperature across different parts of the canopy. Our hypothesis is that grapevine genotypes may show different Tc patterns throughout the day due to different stomatal behaviour and heat dissipation strategies. We monitored the diurnal and seasonal course of Tc in two grapevine genotypes, Aragonez (syn. Tempranillo) and Touriga Nacional, subjected to deficit irrigation under typical Mediterranean climate conditions. Temperature measurements were complemented by determination of the diurnal course of leaf water potential (ψleaf) and leaf gas exchange. Measurements were done in two seasons (2013 and 2014) at different phenological stages: i) mid-June (green berry stage), ii) mid-July (veraison), iii) early August (early ripening) and iv) before harvest (late ripening). Correlations between Tc and minimal stomatal conductance are presented for the two genotypes throughout the day. Results are discussed with regard to the use of thermal imagery to derive information on genotype physiology in response to changing environmental conditions and to mild water stress induced by deficit irrigation. Strategies to optimize the use of thermal imaging in field conditions are also proposed.
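One common way to turn canopy temperature into a water-stress indicator is the Crop Water Stress Index, which normalises Tc between wet (fully transpiring) and dry (non-transpiring) reference temperatures. The abstract does not name the index actually used, so this is an illustrative assumption:

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = fully transpiring, 1 = fully water-stressed.

    t_canopy, t_wet, t_dry are temperatures in the same units (e.g. deg C),
    with t_wet and t_dry taken from reference surfaces or baselines.
    """
    if t_dry <= t_wet:
        raise ValueError("dry reference must exceed wet reference")
    return (t_canopy - t_wet) / (t_dry - t_wet)
```

For example, a canopy at 30 °C between 25 °C wet and 35 °C dry references gives a CWSI of 0.5, i.e. moderate stress on this normalised scale.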