961 results for complexity management


Relevance: 30.00%

Abstract:

Increased complexity in large design and manufacturing organisations requires improvements at the operations management (OM)–applied service (AS) interface to improve project effectiveness. The aim of this paper is to explore the role of Lean in improving the longitudinal efficiency of the OM–AS interface within a large aerospace organisation, drawing on Lean principles and boundary spanning theory. The methodology was an exploratory longitudinal case approach including exploratory interviews (n = 21), focus groups (n = 2), facilitated action-research workshops (n = 2) and two trials or experiments using longitudinal data involving both OM and AS personnel working at the interface. Lean principles and boundary spanning theory were used to guide and interpret the findings. It was found that misinterpretation, and forced implementation, of OM-based Lean terminology and practice in the OM–AS interface space led to delays and misplaced resources. Instead, both OM and AS staff were challenged to develop a cross-boundary understanding of Lean-based boundary (knowledge) objects when interpreting OM requests. The longitudinal findings from the experiments showed that the development of Lean performance measurements and Lean value stream constructs was more successful when these constructs were treated as boundary (knowledge) objects requiring transformation over time, orchestrating improved effectiveness and leading to consistent terminology and understanding within the OM–AS boundary spanning team.

Relevance: 30.00%

Abstract:

Computer-based simulation games (CSG) are a form of innovation in learning and teaching. CSG are used pervasively in various ways, such as class activities (formative exercises) and as part of summative assessments (Leemkuil and De Jong, 2012; Zantow et al., 2005). This study investigates the current and potential use of CSG in Worcester Business School's (WBS) Business Management undergraduate programmes. The initial survey of off-the-shelf simulations reveals that there are various categories of simulation, each offering varying levels of complexity and learning opportunities depending on the field of study. The findings suggest that whilst there is only marginal adoption of CSG in learning and teaching, there is significant opportunity to increase the use of CSG to enhance learning and learner achievement, especially in Level 5 modules. The use of CSG is situational and its adoption should be undertaken on a case-by-case basis. WBS can play a major role by creating an environment that encourages and supports the use of CSG as well as other forms of innovative learning and teaching methods. Thus the key recommendation involves providing module teams with further support in embedding and integrating CSG into their modules.

Relevance: 30.00%

Abstract:

This thesis examines the spatial and temporal variation in nitrogen dioxide (NO2) levels in Guernsey and the impacts on pre-existing asthmatics. Whilst air quality in Guernsey is generally good, NO2 levels exceed UK standards in several locations. The evidence indicates that people suffering from asthma experience exacerbation of their symptoms if exposed to elevated levels of air pollutants, including NO2, although such research has never been carried out in Guernsey before. In addition, exposure assessment of individuals is rarely carried out, and research in this area is limited owing to the complexity of undertaking such a study, which must combine exposures in the home and the workplace with ambient exposures, all of which vary with each individual's daily experience. For the first time in Guernsey, this research has examined NO2 levels in relation to hospital admissions of asthma patients, together with assessments of NO2 exposure in typical homes and workplaces. The data showed a temporal correlation between NO2 levels and the number of hospital admissions, and the trend from 2008-2012 was upwards. Statistical analysis of the data did not show a significant linear correlation, owing to the small size of the data sets. Exposure assessment of individuals showed a spatial variation in exposures in Guernsey, and assessment in indoor environments showed that real-time analysis of NO2 levels needs to be undertaken if indoor microenvironments for NO2 are to be assessed adequately. There was temporal and spatial variation in NO2 concentrations measured using diffusion tubes, which provide a monthly mean value, and analysers measuring NO2 concentrations in real time. The research shows that building layout and design are important factors for good air flow, ventilation and the dispersion of NO2 indoors. Environmental Health Officers have statutory responsibilities for ambient air quality, hygiene of buildings and workplace environments, and this role needs to be co-ordinated with healthcare professionals to improve health outcomes for asthmatics. The outcomes of the thesis were the development of a risk management framework for pre-existing asthmatics at work, for use by regulators of workplaces, and an information leaflet to assist in improving health outcomes for asthmatics in Guernsey.
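
The temporal analysis described above pairs monthly mean NO2 concentrations with monthly counts of asthma admissions. A minimal sketch of such a test is shown below; the variable names and values are hypothetical placeholders, not the Guernsey data analysed in the thesis.

    # Pearson correlation between monthly mean NO2 and asthma admissions.
    # The values below are illustrative placeholders only.
    from scipy.stats import pearsonr

    no2_monthly_mean = [28.1, 31.4, 26.9, 35.2, 33.0, 29.7]   # ug/m3 (hypothetical)
    asthma_admissions = [12, 15, 11, 18, 16, 13]              # counts (hypothetical)

    r, p_value = pearsonr(no2_monthly_mean, asthma_admissions)
    print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
    # With only a handful of monthly pairs, p-values are typically not
    # significant, consistent with the small-sample limitation noted above.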

Relevance: 30.00%

Abstract:

This dissertation mainly focuses on coordinated pricing and inventory management problems; the related background is provided in Chapter 1. Several periodic-review models are then discussed in Chapters 2, 3, 4 and 5, respectively. Chapter 2 analyzes a deterministic single-product model in which a price adjustment cost is incurred if the current selling price is changed from that of the previous period. We develop exact algorithms for the problem under different conditions and find that computational complexity varies significantly with the cost structure. Moreover, our numerical study indicates that dynamic pricing strategies may outperform static pricing strategies even when the price adjustment cost accounts for a significant portion of the total profit. Chapter 3 develops a single-product model in which demand in a period depends not only on the current selling price but also on past prices through the so-called reference price. Strongly polynomial time algorithms are designed for the case without fixed ordering cost, and a heuristic is proposed for the general case together with an error bound estimation. Moreover, numerical studies illustrate that incorporating the reference price effect into coordinated pricing and inventory models can have a significant impact on firms' profits. Chapter 4 discusses the stochastic version of the model in Chapter 3 when customers are loss averse. It extends the associated results in the literature and proves that the reference-price-dependent base-stock policy is optimal under certain conditions. Instead of dealing with specific problems, Chapter 5 establishes the preservation of supermodularity in a class of optimization problems. This property and its extensions include several existing results in the literature as special cases, and provide powerful tools, as we illustrate through their applications to several operations problems: the stochastic two-product model with cross-price effects, the two-stage inventory control model, and the self-financing model.
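
A common way to formalise the reference price mechanism used in Chapter 3 is exponential smoothing of past prices; the formulation below is the standard one from the reference-price literature and is given for orientation only, not as the exact model of the dissertation.

    \[
    r_{t+1} = \alpha r_t + (1 - \alpha) p_t, \qquad 0 \le \alpha < 1,
    \]
    \[
    d_t(p_t, r_t) = d(p_t) + \eta\,(r_t - p_t),
    \]

Here $r_t$ is the reference price carried into period $t$, $p_t$ the posted price, and the term $\eta\,(r_t - p_t)$ captures the perceived gain ($p_t < r_t$) or loss ($p_t > r_t$) that shifts demand away from the base level $d(p_t)$.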

Relevance: 30.00%

Abstract:

Purpose: Current thinking about ‘patient safety’ emphasises the causal relationship between the work environment and the delivery of clinical care. This research draws on the theory of Normal Accidents to extend this analysis and better understand the ‘organisational factors’ that threaten safety. Methods: Ethnographic research methods were used, with observations of the operating department setting over 18 months and interviews with 80 members of hospital staff. The setting for the study was the Operating Department of a large teaching hospital in the North-West of England. Results: The work of the operating department is determined by inter-dependent, ‘tightly coupled’ organisational relationships between hospital departments, based upon the timely exchange of information, services and resources required for the delivery of care. Failures within these processes, manifested as ‘breakdowns’ within inter-departmental relationships, lead to situations of constraint, rapid change and uncertainty in the work of the operating department that require staff to break with established routines and work under increased time and emotional pressures. This means that staff focus on working quickly, as opposed to working safely. Conclusion: Analysis of safety needs to move beyond a focus on the immediate work environment and individual practice to consider the more complex and deeply structured organisational systems of hospital activity. For departmental managers, the scope for service planning to control for safety may be limited, as the structured ‘real world’ situation of service delivery is shaped by inter-departmental and organisational factors that are perhaps beyond the scope of departmental management.

Relevance: 30.00%

Abstract:

The anticipated growth of air traffic worldwide requires enhanced Air Traffic Management (ATM) technologies and procedures to increase the system capacity, efficiency, and resilience, while reducing environmental impact and maintaining operational safety. To deal with these challenges, new automation and information exchange capabilities are being developed through different modernisation initiatives toward a new global operational concept called Trajectory Based Operations (TBO), in which aircraft trajectory information becomes the cornerstone of advanced ATM applications. This transformation will lead to higher levels of system complexity requiring enhanced Decision Support Tools (DST) to aid humans in the decision-making processes. These will rely on accurate predicted aircraft trajectories, provided by advanced Trajectory Predictors (TP). The trajectory prediction process is subject to stochastic effects that introduce uncertainty into the predictions. Regardless of the assumptions that define the aircraft motion model underpinning the TP, deviations between predicted and actual trajectories are unavoidable. This thesis proposes an innovative method to characterise the uncertainty associated with a trajectory prediction based on the mathematical theory of Polynomial Chaos Expansions (PCE). Assuming univariate PCEs of the trajectory prediction inputs, the method describes how to generate multivariate PCEs of the prediction outputs that quantify their associated uncertainty. Arbitrary PCE (aPCE) was chosen because it allows a higher degree of flexibility to model input uncertainty. The obtained polynomial description can be used in subsequent prediction sensitivity analyses thanks to the relationship between polynomial coefficients and Sobol indices. The Sobol indices enable ranking the input parameters according to their influence on trajectory prediction uncertainty. The applicability of the aPCE-based uncertainty quantification detailed herein is analysed through a case study. This case study represents a typical aircraft trajectory prediction problem in ATM, in which uncertain parameters regarding aircraft performance, aircraft intent description, weather forecast, and initial conditions are considered simultaneously. Numerical results are compared to those obtained from a Monte Carlo simulation, demonstrating the advantages of the proposed method. The thesis includes two examples of DSTs (Demand and Capacity Balancing tool, and Arrival Manager) to illustrate the potential benefits of exploiting the proposed uncertainty quantification method.
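
In generic terms (standard PCE notation, not necessarily that of the thesis), a scalar prediction output $Y$ is expanded over polynomials $\Phi_k$ that are orthogonal with respect to the distribution of the uncertain inputs $\boldsymbol{\xi}$, and the Sobol indices follow directly from the expansion coefficients:

    \[
    Y \approx \sum_{k=0}^{P} c_k\, \Phi_k(\boldsymbol{\xi}), \qquad
    \mathbb{E}[Y] = c_0, \qquad
    \operatorname{Var}[Y] = \sum_{k=1}^{P} c_k^2\, \langle \Phi_k^2 \rangle,
    \]
    \[
    S_i = \frac{\sum_{k \in \mathcal{A}_i} c_k^2\, \langle \Phi_k^2 \rangle}{\operatorname{Var}[Y]},
    \]

where $\mathcal{A}_i$ indexes the basis polynomials that depend only on input $\xi_i$; ranking the $S_i$ identifies which inputs dominate the prediction uncertainty.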

Relevance: 30.00%

Abstract:

Maintaining accessibility to and understanding of digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organizations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity. Both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in terms of diversity, but are integral to the establishment of classes of risk exposure, and the planning and deployment of appropriate preservation strategies. We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims of surveying the contexts within which preservation has been undertaken successfully, developing an appropriate methodology for risk management, evaluating existing preservation evaluation approaches and metrics, structuring best practice knowledge and, lastly, demonstrating a range of tools that utilise our findings. We describe a mixed methodology that uses interviews and surveys, extensive content analysis, practical case study and iterative software and ontology development. We build on a robust foundation, the development of the Digital Repository Audit Method Based on Risk Assessment. We summarise the extent of the challenge facing the digital preservation community (and by extension users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, the increasing complexity and the increasing onus on communities with no formal training to offer assurances of data management and sustainability. These collectively imply a challenge that demands an intuitive and adaptable means of evaluating digital preservation efforts. The need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised. We introduce our approach, based on risk management. Risk is an expression of the likelihood of a negative outcome, and an expression of the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity, a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to their associated goals, activities, responsibilities and policies, in terms of both their manifestation and their mitigation. They can be deconstructed into their atomic units, and responsibility for their resolution delegated appropriately. We continue to describe how the manifestation of risks typically spans an entire organisational environment, and how risk, as the focus of our analysis, safeguards against omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk-factors, through the risks themselves or associated system elements.
To do so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community. We present as research outcomes an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses through the formal evaluation of memory institutions in the UK, US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation. Finally, we document our results in a range of technical reports and conference and journal articles. We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and by exposing the resource and associated applications to evaluation by the digital preservation community.
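
As a toy illustration of the risk decomposition described above, a risk can be recorded with its likelihood, its impact, and the atomic mitigation responsibilities it delegates. The class names and the 1-5 scoring scale are illustrative assumptions and are not drawn from the PORRO ontology itself.

    # Toy sketch: likelihood x impact exposure scoring plus delegation of
    # atomic mitigation responsibilities. Names and scales are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Mitigation:
        action: str
        owner: str          # role or department responsible for resolution

    @dataclass
    class Risk:
        description: str
        likelihood: int     # 1 (rare) .. 5 (almost certain)
        impact: int         # 1 (negligible) .. 5 (catastrophic)
        mitigations: list = field(default_factory=list)

        def exposure(self) -> int:
            return self.likelihood * self.impact

    format_risk = Risk(
        "Master copies held only in an obsolescent file format",
        likelihood=3, impact=4,
        mitigations=[Mitigation("Migrate to an open, documented format",
                                "Repository manager")],
    )
    print(format_risk.exposure())   # 12 -> prioritised against other risks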

Relevance: 30.00%

Abstract:

Supply chains are ubiquitous in any commercial delivery system. The exchange of goods and services, from different supply points to distinct destinations scattered across a given geographical area, requires the management of stocks and vehicle fleets in order to minimize costs while maintaining good quality of service. Even if the operating conditions remain constant over a given time horizon, managing a supply chain is a very complex task. Its complexity increases exponentially with both the number of network nodes and the dynamic operational changes. Moreover, the management system must be adaptive in order to cope easily with several disturbances, such as machinery and vehicle breakdowns or changes in demand. This work proposes the use of a model predictive control paradigm in order to tackle the above issues. The obtained simulation results suggest that this strategy promotes easy task rescheduling in case of disturbances or anticipated changes in operating conditions. © Springer International Publishing Switzerland 2017
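
The receding-horizon idea behind model predictive control can be outlined as follows. This is a generic sketch assuming a discrete-time stock-balance model and a hypothetical optimiser solve_horizon; it is not the specific controller formulated in the paper.

    # Generic receding-horizon (MPC) loop for a supply-chain node.
    # solve_horizon is a hypothetical optimiser that, given current stocks
    # and a demand forecast, returns an order/shipment plan over the next
    # H periods; only the first decision of the plan is ever applied.
    def mpc_step(stock, forecast, solve_horizon, horizon=10):
        plan = solve_horizon(stock, forecast[:horizon])   # optimise over H periods
        return plan[0]                                    # apply first action only

    def run(stock, demands, forecaster, solve_horizon, dynamics):
        history = []
        for t, demand in enumerate(demands):
            action = mpc_step(stock, forecaster(t), solve_horizon)
            stock = dynamics(stock, action, demand)       # stock-balance update
            history.append((t, action, stock))            # then re-plan next step,
        return history                                    # absorbing disturbances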

Relevance: 30.00%

Abstract:

Early water resources modeling efforts were aimed mostly at representing hydrologic processes, but the need for interdisciplinary studies has led to increasing complexity and integration of environmental, social, and economic functions. The gradual shift from merely employing engineering-based simulation models to applying more holistic frameworks is an indicator of promising changes in the traditional paradigm for the application of water resources models, supporting more sustainable management decisions. This dissertation contributes to the application of a quantitative-qualitative framework for sustainable water resources management using system dynamics simulation, as well as environmental systems analysis techniques, to provide insights for water quality management in the Great Lakes basin. The traditional linear thinking paradigm lacks the mental and organizational framework for sustainable development trajectories, and may lead to quick-fix solutions that fail to address key drivers of water resources problems. To facilitate holistic analysis of water resources systems, systems thinking seeks to understand interactions among the subsystems. System dynamics provides a suitable framework for operationalizing systems thinking and its application to water resources problems by offering useful qualitative tools such as causal loop diagrams (CLD), stock-and-flow diagrams (SFD), and system archetypes. The approach provides a high-level quantitative-qualitative modeling framework for "big-picture" understanding of water resources systems, stakeholder participation, policy analysis, and strategic decision making. While quantitative modeling using extensive computer simulations and optimization is still very important and needed for policy screening, qualitative system dynamics models can improve understanding of general trends and the root causes of problems, and thus promote sustainable water resources decision making. Within the system dynamics framework, a growth and underinvestment (G&U) system archetype governing Lake Allegan's eutrophication problem was hypothesized to explain the system's problematic behavior and identify policy leverage points for mitigation. A system dynamics simulation model was developed to characterize the lake's recovery from its hypereutrophic state and assess a number of proposed total maximum daily load (TMDL) reduction policies, including phosphorus load reductions from point sources (PS) and non-point sources (NPS). It was shown that, for a TMDL plan to be effective, it should be considered a component of a continuous sustainability process, one that accounts for the dynamic feedback relationships between socio-economic growth, land use change, and environmental conditions. Furthermore, a high-level simulation-optimization framework was developed to guide watershed-scale best management practice (BMP) implementation in the Kalamazoo watershed. Agricultural BMPs should be given priority in the watershed in order to facilitate cost-efficient attainment of Lake Allegan's total phosphorus (TP) concentration target. However, without adequate support policies, agricultural BMP implementation may adversely affect agricultural producers. Results from a case study of the Maumee River basin show that coordinated BMP implementation across upstream and downstream watersheds can significantly improve the cost efficiency of TP load abatement.
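
A stock-and-flow structure of the kind used in system dynamics can be sketched with simple Euler integration. The one-stock phosphorus balance below uses made-up parameter values purely for illustration; it is not the calibrated Lake Allegan model.

    # Illustrative one-stock model: lake phosphorus mass as the stock,
    # with an external load inflow and a first-order settling/outflow loss.
    def simulate(p0=50.0, load=12.0, loss_rate=0.25, dt=0.25, years=20):
        stock, t, trajectory = p0, 0.0, []
        while t <= years:
            inflow = load                       # external TP load (tonnes/yr)
            outflow = loss_rate * stock         # settling + outflow (tonnes/yr)
            stock += dt * (inflow - outflow)    # Euler integration of the stock
            t += dt
            trajectory.append((round(t, 2), round(stock, 2)))
        return trajectory                       # stock tends to load / loss_rate

    print(simulate()[-1])   # close to the equilibrium of 12 / 0.25 = 48 tonnes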

Relevance: 30.00%

Abstract:

The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are multifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they are actually experiencing; on the other hand, administrators will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Network and Support Vector Machine, to accurately model the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these modeling tools. Third, we presented an approach to optimal VM sizing by employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
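
The performance models described above map resource allocations to an application-level metric. A minimal sketch with scikit-learn's support vector regression is given below; the feature set and the synthetic data are illustrative assumptions, not the benchmark workloads used in the thesis.

    # Learn application performance (e.g. mean response time) as a function
    # of resource allocations using Support Vector Regression.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    # features: [cpu_share, memory_GB, io_bandwidth_MBps] (hypothetical)
    X = rng.uniform([0.1, 1, 10], [1.0, 16, 200], size=(200, 3))
    y = 50 / X[:, 0] + 200 / X[:, 1] + 500 / X[:, 2] + rng.normal(0, 2, 200)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
    model.fit(X, y)

    candidate = np.array([[0.5, 4, 100]])   # a proposed VM sizing
    print(model.predict(candidate))         # predicted latency for that sizing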

Relevance: 30.00%

Abstract:

This exploratory work studies the political movement Mesa de la Unidad Democrática (MUD), created in order to oppose the existing socialist government in Venezuela. The critique made in this document starts from the standpoint of complexity science. Some key concepts from complex systems have been used to explain the functioning and organisation of the MUD, with the aim of producing a comprehensive diagnosis of the problems it faces and of highlighting new insights into the harmful behaviours the party currently exhibits. The complexity approach is intended to help better understand the context framing the party and, finally, to contribute a series of solutions to the cohesion problems it presents.

Relevance: 30.00%

Abstract:

This thesis introduces a new innovation methodology called IDEAS(R)EVOLUTION, a methodology for the co-creation of a product/brand crossing marketing, Design Thinking, creativity and management, developed within an ongoing experimental research project started in 2007. This new approach to innovation was initially based on the theory and practice of Design Thinking for innovation. The concept of Design Thinking for innovation has received much attention in recent years. This innovation approach has moved from the field of design and designers' knowledge towards other knowledge areas, mainly business management and marketing. A human-centred approach, radical collaboration, creativity and breakthrough thinking are the founding principles of Design Thinking that were adopted by those knowledge areas because of their fit with the business context and the evolving complexity of the market. Open Innovation, user-centred innovation and, later on, Living Labs models also emerged as answers to market and consumer pressure and the desire for new products, new services or new business models. Innovation became the principal focus and strategic orientation of business management. All these changes also had an impact on marketing theory. It is now possible to devise better strategies, communication plans and continuous dialogue systems with the target audience, incorporating their insights and promoting them as the main ambassadors for disseminating innovations in the market. Drawing upon data from five case studies, the empirical findings in this dissertation suggest that companies need to shift from a Design Thinking for innovation approach to a holistic, multidimensional and integrated innovation system. The innovation context is complex, and companies need deeper systems than the success formulas that "commercial" Design Thinking for innovation preaches. They need to learn how to change their organisational culture, how to empower their workforce and collaborators, how to incorporate external stakeholders in their innovation processes, how to measure the innovation process and create key performance indicators throughout it so as to support better decision making, and how to integrate meaning and purpose into their innovation philosophy. Finally, they need to understand that the strategic innovation effort is not a "one shot" story: it is about creating a continuous flow of interaction and dialogue with their clients within a "value creation chain" mindset.

Relevance: 30.00%

Abstract:

In this work we analyze an optimal control problem for a system of two hydroelectric power stations in cascade with reversible turbines. The objective is to optimize the profit of power production while respecting the system's restrictions. Some of these restrictions translate into state constraints, and the cost function is nonconvex, which increases the complexity of the optimal control problem. The problem is solved numerically and two different approaches are adopted. These approaches focus on global optimization techniques (the Chen-Burer algorithm) and on a projection estimation refinement method (PER method). The PER method is used as a technique to reduce the dimension of the problem. The results and execution times of the two procedures are compared.
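
In generic optimal control notation the problem has the schematic form below; this is given for orientation only, as the actual dynamics, profit function and constraints are those specified in the paper.

    \[
    \max_{u(\cdot)} \; \int_{0}^{T} L\bigl(x(t), u(t)\bigr)\, dt
    \quad \text{s.t.} \quad
    \dot{x}(t) = f\bigl(x(t), u(t)\bigr), \quad x(0) = x_0,
    \]
    \[
    g\bigl(x(t)\bigr) \le 0, \qquad u(t) \in U, \qquad t \in [0, T],
    \]

where $x$ would collect the reservoir water volumes, $u$ the turbine/pump flow controls, $L$ the instantaneous profit from power production, $g$ the state constraints, and $U$ the admissible control set.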

Relevance: 30.00%

Abstract:

In recent decades the automotive sector has seen a technological revolution, due mainly to more restrictive regulations, newly introduced technologies and, lastly, to the dwindling fossil fuel resources remaining on Earth. Promising solutions for vehicle propulsion are represented by alternative architectures and energy sources, for example fuel cells and pure electric vehicles. The automotive transition to new and green vehicles is passing through the development of hybrid vehicles, which usually combine the positive aspects of each technology. To fully exploit the potential of hybrid vehicles, however, it is important to manage the powertrain's degrees of freedom in the smartest way possible, otherwise hybridization would be worthless. To this aim, this dissertation focuses on the development of energy management strategies and predictive control functions. Such algorithms have the goal of increasing overall powertrain efficiency while also increasing driver safety. These control algorithms have been applied to an axle-split Plug-in Hybrid Electric Vehicle with a complex architecture that allows more than one driving mode, including a pure electric one. The energy management strategies investigated are mainly three: the vehicle's baseline heuristic controller, referred to in the following as the rule-based controller; a sub-optimal controller that can also include predictive functionalities, referred to as the Equivalent Consumption Minimization Strategy; and a global optimum control technique, called Dynamic Programming, which also includes the thermal management of the high-voltage battery. During this project, different modelling approaches have been applied to the powertrain, including Hardware-in-the-Loop, and several high-level powertrain controllers have been developed and implemented, increasing their complexity at each step. The potential of using sophisticated powertrain control techniques has been demonstrated, and the attainable benefits in terms of fuel economy are largely influenced by the chosen energy management strategy, even considering the powerful vehicle investigated.
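
The core idea of the Equivalent Consumption Minimization Strategy mentioned above can be sketched as an instantaneous cost minimisation over candidate power splits. This is the generic textbook form with hypothetical model functions; it is not the controller implemented in the dissertation.

    # Generic ECMS sketch: at each step pick the engine/battery power split
    # that minimises fuel flow plus battery power weighted by an equivalence
    # factor s. fuel_rate() and the candidate grid are hypothetical
    # placeholders for the vehicle's powertrain model.
    H_LHV = 42.5e6   # lower heating value of gasoline [J/kg] (typical value)

    def ecms_split(p_request, s, fuel_rate, p_batt_candidates):
        best = None
        for p_batt in p_batt_candidates:
            p_engine = p_request - p_batt
            cost = fuel_rate(p_engine) + s * p_batt / H_LHV   # equivalent fuel flow [kg/s]
            if best is None or cost < best[0]:
                best = (cost, p_batt, p_engine)
        return best[1], best[2]   # chosen battery and engine power [W]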

Relevance: 30.00%

Abstract:

The present work proposes different approaches to extend the mathematical methods of supervisory energy management used in terrestrial applications to the maritime sector, which differs in its constraints, variables and disturbances. The aim is to find an optimal real-time solution that includes the minimization of a defined track time while maintaining the classical energetic approach. Starting from the analysis and modelling of the powertrain and boat dynamics, the energy economy problem is formulated following the mathematical principles behind optimal control theory. Then, an adaptation aimed at finding a winning strategy for the Monaco Energy Boat Challenge endurance trial is performed via ECMS and A-ECMS control strategies, which lead to a more accurate knowledge of the energy sources and the boat's behaviour. The simulations show that the algorithm accomplishes the fuel economy and time optimization targets, but the latter adds considerable tuning and calculation complexity. In order to assess a practical implementation on real hardware, the knowledge gained from the previous approaches has been translated into a rule-based algorithm that can run on an embedded CPU. Finally, the algorithm has been tuned and tested in a real-world race scenario, showing promising results.
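
The adaptive variant (A-ECMS) typically adjusts the equivalence factor online from the battery state-of-charge error. The feedback form below is a standard formulation from the literature, not necessarily the adaptation law tuned in this work:

    \[
    s_{k+1} = s_k
    + k_p \bigl(\mathrm{SOC}_{\mathrm{ref}} - \mathrm{SOC}_k\bigr)
    + k_i \sum_{j \le k} \bigl(\mathrm{SOC}_{\mathrm{ref}} - \mathrm{SOC}_j\bigr)\, \Delta t,
    \]

so that battery use is penalised more heavily when the state of charge falls below its reference, steering the charge trajectory back toward the target over the race.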