839 results for uncertainty-based coordination


Relevance:

80.00%

Publisher:

Abstract:

The problem of planning for multiple vehicles deals with the design of an effective algorithm that enables multiple autonomous vehicles on the road to communicate and generate a collaborative, optimal travel plan. Our modelling of the problem considers vehicles that vary greatly in both size and speed, which makes it suboptimal to have a faster vehicle follow a slower vehicle or for vehicles to drive in predefined speed lanes. It is essential to have a fast planning algorithm that is nevertheless probabilistically complete. The Rapidly Exploring Random Trees (RRT) algorithm developed and reported on here uses a problem-specific coordination axis, a local optimization algorithm, priority-based coordination, and a module for deciding travel speeds. Vehicles are assumed to remain in their current relative lateral position on the road unless instructed otherwise. The experimental results presented here show regular driving behaviours, namely vehicle following, overtaking, and complex obstacle avoidance. The ability to exhibit complex behaviours in the absence of speed lanes is characteristic of the solution developed.
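
As a rough illustration of how priority-based coordination can be layered on top of an RRT planner (a generic sketch, not the authors' implementation: the obstacle model, clearance values, sampling bias and vehicle data are assumptions made for the example):

```python
# Minimal sketch: priority-based coordination with a basic 2D RRT.
# Vehicles are planned in priority order; lower-priority vehicles treat the
# paths already planned by higher-priority vehicles as obstacles.
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def collision_free(p, obstacles, clearance=1.0):
    """Point p is valid if it keeps `clearance` from every obstacle point."""
    return all(dist(p, o) >= clearance for o in obstacles)

def rrt(start, goal, obstacles, bounds, step=1.0, iters=3000, goal_tol=1.5):
    """Grow a tree from start toward random samples; return a path to the goal region."""
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        sample = goal if random.random() < 0.1 else (
            random.uniform(*bounds[0]), random.uniform(*bounds[1]))
        i = min(range(len(nodes)), key=lambda k: dist(nodes[k], sample))
        near = nodes[i]
        d = dist(near, sample)
        if d == 0:
            continue
        new = (near[0] + step * (sample[0] - near[0]) / d,
               near[1] + step * (sample[1] - near[1]) / d)
        if not collision_free(new, obstacles):
            continue
        parent[len(nodes)] = i
        nodes.append(new)
        if dist(new, goal) < goal_tol:            # reached the goal region
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return list(reversed(path))
    return None

def plan_in_priority_order(vehicles, static_obstacles, bounds):
    """Plan each vehicle in turn; earlier (higher-priority) paths become obstacles."""
    reserved, plans = [], {}
    for name, start, goal in vehicles:            # list assumed pre-sorted by priority
        path = rrt(start, goal, static_obstacles + reserved, bounds)
        plans[name] = path
        if path:
            reserved.extend(path)                 # later vehicles must avoid this path
    return plans

if __name__ == "__main__":
    vehicles = [("fast_car", (0.0, 2.0), (40.0, 2.0)),
                ("slow_truck", (0.0, 5.0), (40.0, 5.0))]
    obstacles = [(20.0, 3.5)]                     # a single blockage mid-road
    print(plan_in_priority_order(vehicles, obstacles, bounds=((0, 40), (0, 8))))
```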

Relevance:

80.00%

Publisher:

Abstract:

The most widely used updating rule for non-additive probabilities is the Dempster-Shafer rule. Schmeidler and Gilboa have developed a model of decision making under uncertainty based on non-additive probabilities, and in their paper “Updating Ambiguous Beliefs” they justify the Dempster-Shafer rule by a maximum likelihood procedure. This note shows, in the context of Schmeidler-Gilboa preferences under uncertainty, that the Dempster-Shafer rule is in general not ex-ante optimal. This contrasts with Brown's result that Bayes' rule is ex-ante optimal for standard Savage preferences with additive probabilities.
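
For context, the two updating rules contrasted here are usually written as follows for a capacity (non-additive probability) $\nu$ conditioned on an event $B$ with $\nu(B^{c}) < 1$; this is the textbook form, not a formula quoted from the note itself:

$$\nu_{DS}(A \mid B) \;=\; \frac{\nu\big((A \cap B) \cup B^{c}\big) - \nu(B^{c})}{1 - \nu(B^{c})}, \qquad \nu_{Bayes}(A \mid B) \;=\; \frac{\nu(A \cap B)}{\nu(B)}.$$

When $\nu$ is additive, $\nu\big((A \cap B) \cup B^{c}\big) = \nu(A \cap B) + \nu(B^{c})$, and both rules collapse to ordinary Bayesian conditioning.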

Relevance:

80.00%

Publisher:

Abstract:

There is a degree of uncertainty that is inherent to the adjudicative activity and cannot be mitigated, owing to the very nature of judgments about legal norms. Judicial decisions are not, and cannot be, absolutely predictable. There is, however, a degree of uncertainty that is avoidable and should be avoided, because it is harmful to the health of a legal system. Other researchers in Brazil have worked with this notion, and the formulation of the concepts of structural uncertainty and pathological uncertainty by Joaquim Falcão, Luís Fernando Schuartz and Diego Arguelhes was very successful. We believe, however, that the conception of pathological uncertainty presented by these authors needs to be reformulated, especially so that it can be verified from elements of the judicial decision itself and not only from sociological and psychological elements. We propose a conception of pathological uncertainty grounded in the quality of the reasoning of judicial decisions, and we conclude that cultivating a culture of precedent is necessary in Brazil to mitigate the harmful effects of pathological uncertainty.

Relevance:

80.00%

Publisher:

Abstract:

The transformations brought about by globalization are multidimensional in character, going beyond the mere internationalization of the economy and translating into the diffusion of ideas, values, modes of production and management, public and private organizational formulas, and even lifestyles that spread across national borders. A new scenario thus emerges for administrative knowledge, one whose scope needs to be defined, since it affects practices and actors as well as the underlying values. This paper examines the influence of postmodern transformations (analyzed here as breaks or ruptures of modernity) on the crisis of institutions, understood as the extended practices that condense social values, make them operational, and at the same time make it possible to reduce uncertainty. Starting from this concept of institutional crisis and its typology, some management approaches are analyzed that appear either as alternatives to the crisis described or as adaptive responses to the reality of a globalized society.

Relevance:

80.00%

Publisher:

Abstract:

This work addresses the problem of modeling real dynamical systems from the study of their time series, using a standard formulation intended to be a universal abstraction of dynamical systems, irrespective of their deterministic, stochastic or hybrid nature. Separate models of deterministic and stochastic systems are developed first, and then merged into a hybrid model that allows the study of generic mixed systems, that is, systems showing a combination of deterministic and random behavior. This model has two components: a deterministic one, consisting of a difference equation obtained from an autocorrelation study, and a stochastic one, which models the error made by the first. The stochastic component is a universal generator of probability distributions, based on a process composed of random variables uniformly distributed within a time-varying interval. This universal generator is derived in the thesis from a new theory of supply and demand for a generic resource. The resulting model can be described conceptually as an entity with three fundamental elements: an engine generating deterministic dynamics, an internal source of noise generating uncertainty, and an exposure to the environment representing the interactions of the real system with the outside world. In applications, these three elements are fitted to the historical time series of the dynamical system. Once its components have been fitted, the model behaves adaptively, taking the new values of the system's time series as inputs and computing predictions about its future behavior. Each prediction is given as an interval within which every value is equally probable, while any value outside the interval has zero probability. In this way the model computes the future behavior, and its level of uncertainty, from the current state of the system.

The model is applied in this thesis to very different systems and proves flexible enough to tackle fields of disparate nature. The exchange of telephone traffic between telephony operators, the evolution of financial markets and the flow of information between Internet servers are studied in depth. All of these systems are successfully modeled with the same language, even though they are physically very different. The study of telephony networks shows that telephone traffic patterns exhibit a strong weekly pseudo-periodicity contaminated with a large amount of noise, especially in the case of international calls. The study of financial markets shows that their fundamental nature is random, with a relatively bounded range of behavior; part of the thesis is devoted to explaining some of the most important empirical observations in financial markets, such as “fat tails”, “power laws” and “volatility clustering”. Finally, it is shown that communication between Internet servers has, as in the case of financial markets, an underlying component that is entirely stochastic but fairly “docile”, and this docility becomes more pronounced as the distance between servers increases.

Two aspects of the model stand out: its adaptability and its universality. The first is due to the fact that, once the general parameters have been fitted, the model is fed with the observable values of the system and uses them to compute future behavior; although the parameters are fixed, the variability of the observables used as inputs leads to a great richness of possible outputs. The second is due to the generic formulation of the hybrid model and to the fact that its parameters are fitted from external manifestations of the system under study rather than from its physical characteristics. These factors make the model applicable to a wide variety of fields. Finally, the thesis proposes other fields in which very promising preliminary results have been obtained, such as financial risk modeling, routing algorithms for telecommunication networks and climate change.
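
A minimal sketch of the hybrid idea described above, namely a deterministic difference equation fitted from the series plus a stochastic term bounded by a time-varying interval, so each forecast is an interval of equally probable values. This is an illustrative reduction (AR(1)-style deterministic part, rolling residual bound, synthetic data), not the model developed in the thesis:

```python
import numpy as np

def fit_ar1(series):
    """Fit x[t] ~ a * x[t-1] + b from the lag-1 autocovariance of the series."""
    x, y = series[:-1], series[1:]
    a = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    b = y.mean() - a * x.mean()
    return a, b

def predict_interval(series, a, b, window=50):
    """One-step prediction interval: deterministic forecast +/- recent max |error|."""
    forecasts = a * series[:-1] + b
    residuals = series[1:] - forecasts
    half_width = np.abs(residuals[-window:]).max()   # time-varying noise amplitude
    center = a * series[-1] + b
    return center - half_width, center + half_width

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic series: weekly-like pseudo-periodicity plus uniform noise.
    t = np.arange(500)
    series = 10 + 3 * np.sin(2 * np.pi * t / 7) + rng.uniform(-1, 1, size=t.size)
    a, b = fit_ar1(series)
    lo, hi = predict_interval(series, a, b)
    print(f"next value predicted in [{lo:.2f}, {hi:.2f}]")
```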

Relevance:

80.00%

Publisher:

Abstract:

Modeling is the process of idealizing real-world situations through simplifications in order to obtain a model. The estimates obtained with a model, however, carry an implicit uncertainty that must be evaluated. A sensitivity analysis can increase confidence in the results, yet this step is often omitted, mainly because of the effort such an analysis involves. In addition, when building a model, a balance has to be kept between obtaining results that are as accurate as possible and keeping the model as simple as possible. For this reason, once a model has been created, it is essential to check whether processes that were initially left out need to be included. Ecosystem services are the conditions and processes through which natural ecosystems, and their constituent species, sustain and fulfill human well-being. The relevance of ecosystem services and their associated benefits, together with the need to manage them well, have stimulated the emergence of models and tools to quantify them. InVEST (Integrated Valuation of Ecosystem Services and Tradeoffs) is one such ecosystem-services tool, developed by the Natural Capital Project (Stanford University, USA). As a result of the growing interest in quantifying ecosystem services, the use of InVEST is expected to increase in the coming years.

The research developed in this thesis aims to help with those important stages that follow the creation of a model, and comprises two pieces of work. The first is a sensitivity analysis of the model in a specific catchment, using the most suitable methodology. The second concerns the in-stream processes that are not currently included in the model, and consists of creating and applying a methodology to study the role these processes play in the InVEST nutrient retention model in the study area. The results of this thesis will contribute to understanding the uncertainty involved in the modeling process. They will also highlight the need to check the behavior of a model before using it and when interpreting the results obtained. The work in this thesis will contribute to improving the InVEST platform, an important tool in the field of ecosystem services, and will benefit future users of the tool, whether researchers (in future research) or practitioners (in future decision-making or ecosystem management).
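
To make the first piece of work concrete, a one-at-a-time sensitivity analysis can be sketched as below. This is illustrative only, not the thesis methodology and not the InVEST API; `toy_nutrient_export` and all parameter names are assumptions invented for the example:

```python
import itertools

def toy_nutrient_export(params):
    """Hypothetical stand-in for a model run; returns a single scalar output."""
    load, retention_eff, runoff = params["load"], params["retention_eff"], params["runoff"]
    return load * runoff * (1.0 - retention_eff)

def oat_sensitivity(model, baseline, perturbation=0.10):
    """Perturb each parameter +/- `perturbation` and report the relative output change."""
    y0 = model(baseline)
    results = {}
    for name, sign in itertools.product(baseline, (+1, -1)):
        perturbed = dict(baseline)
        perturbed[name] = baseline[name] * (1.0 + sign * perturbation)
        results[(name, sign)] = (model(perturbed) - y0) / y0
    return results

if __name__ == "__main__":
    baseline = {"load": 120.0, "retention_eff": 0.4, "runoff": 0.6}
    for (name, sign), change in oat_sensitivity(toy_nutrient_export, baseline).items():
        print(f"{name} {'+' if sign > 0 else '-'}10%: output change {change:+.1%}")
```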

Relevance:

80.00%

Publisher:

Abstract:

In the Light Controlled Factory, part-to-part assembly and reduced weight will be enabled through the use of predictive fitting processes; low-cost, high-accuracy reconfigurable tooling will be made possible by active compensation; improved control will allow accurate robotic machining; and quality will be improved through the use of traceable, uncertainty-based quality control throughout the production system. A number of challenges must be overcome before this vision can be realized: 1) controlling industrial robots for accurate machining; 2) compensation of measurements for thermal expansion; 3) compensation of measurements for refractive index changes; 4) development of Embedded Metrology Tooling for in-tooling measurement and active tooling compensation; and 5) development of Software for the Planning and Control of Integrated Metrology Networks based on Quality Control with Uncertainty Evaluation, together with control systems for predictive processes. This paper describes how these challenges are being addressed, in particular the central challenge of developing large-volume measurement process models within an integrated dimensional variation management (IDVM) system.
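
As a small illustration of the kind of uncertainty evaluation involved in challenge 2, the sketch below compensates a length measurement for thermal expansion and propagates the standard uncertainties of the inputs with a GUM-style root-sum-of-squares. It is an assumption-laden example, not the project's software, and all numbers are invented:

```python
import math

def compensate_length(L_meas, u_L, alpha, u_alpha, T, u_T, T_ref=20.0):
    """Refer a measured length back to T_ref; return (corrected length, combined std. uncertainty)."""
    dT = T - T_ref
    L_corr = L_meas / (1.0 + alpha * dT)                 # remove thermal expansion
    # First-order (GUM) sensitivity coefficients of L_corr with respect to each input.
    c_L = 1.0 / (1.0 + alpha * dT)
    c_alpha = -L_meas * dT / (1.0 + alpha * dT) ** 2
    c_T = -L_meas * alpha / (1.0 + alpha * dT) ** 2
    u_comb = math.sqrt((c_L * u_L) ** 2 + (c_alpha * u_alpha) ** 2 + (c_T * u_T) ** 2)
    return L_corr, u_comb

if __name__ == "__main__":
    # A 5 m part measured 3 degrees C above the 20 degrees C reference, aluminium-like alpha.
    L, u = compensate_length(L_meas=5000.123, u_L=0.020,   # mm
                             alpha=23e-6, u_alpha=2e-6,    # 1/K
                             T=23.0, u_T=0.5)              # degrees C
    print(f"corrected length {L:.3f} mm, combined standard uncertainty {u:.3f} mm")
```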

Relevance:

40.00%

Publisher:

Abstract:

Principal Topic: There is increasing recognition that the organizational configurations of corporate venture units should depend on the types of ventures the unit seeks to develop (Burgelman, 1984; Hill and Birkinshaw, 2008). Distinctions have been made between internal and external as well as exploitative versus explorative ventures (Hill and Birkinshaw, 2008; Narayan et al., 2009; Schildt et al., 2005). Assuming that firms do not want to limit themselves to a single type of venture, but rather employ a portfolio of ventures, the logical consequence is that firms should employ multiple corporate venture units, each tailor-made for the type of venture it seeks to develop. Surprisingly, the literature pays limited attention to the challenges of managing multiple corporate venture units in a single firm. Maintaining multiple venture units within one firm provides easier access to funding for new ideas (Hamel, 1999). It allows for the freedom and flexibility to tie the organizational systems (Rice et al., 2000), autonomy (Hill and Rothaermel, 2003), and involvement of management (Day, 1994; Wadhwa and Kotha, 2006) to the requirements of the individual ventures. Yet the strategic objectives of a venture may change when uncertainty around the venture is resolved (Burgelman, 1984). For example, firms may decide to spin in external ventures (Chesbrough, 2002) or spin out ventures that prove strategically unimportant (Burgelman, 1984). This suggests that ventures might need to be transferred between venture units, e.g. from a more internally driven corporate venture division to a corporate venture capital unit. Several studies have suggested that ventures require different managerial skills across their phases of development (Desouza et al., 2007; O'Connor and Ayers, 2005; Kazanjian and Drazin, 1990; Westerman et al., 2006). To facilitate effective transfer between venture units and manage the overall venturing process, it is important that firms set up and manage integrative linkages. Integrative linkages provide synergies and coordination between differentiated units (Lawrence and Lorsch, 1967). Prior findings point to the important role of senior management (Westerman et al., 2006; Gilbert, 2006) and a shared organizational vision (Burgers et al., 2009) in coordinating venture units with mainstream businesses. We draw on these literatures to investigate the key question of how to integratively manage multiple venture units. ---------- Methodology/Key Propositions: To answer the research question, we employ a case study approach that provides unique insights into how firms can break up their venturing process. We selected three Fortune 500 companies that employ multiple venturing units, IBM, Royal Dutch/Shell and Nokia, and investigated and compared their approaches. It was important that the case companies differed somewhat in the types of venture units they employed as well as in the way they integrate and coordinate those units. The data are based on extensive interviews and a variety of internal and external company documents to triangulate our findings (Eisenhardt, 1989). The key proposition of the article is that firms can best manage their multiple venture units through an ambidextrous design of loosely coupled units. This provides venture units with sufficient flexibility to employ organizational configurations that best support the type of venture they seek to develop, as well as sufficient integration to facilitate the smooth transfer of ventures between units. Based on the case findings, we develop a generic framework for a new way of managing the venturing process through multiple corporate venture units. ---------- Results and Implications: One of our main findings is that these firms tend to organize their venture units according to phases in the venture development process. That is, they tend to have venture units aimed at the incubation of venture ideas as well as units aimed more at the commercialization of ventures into a new business unit for the firm or a start-up. The companies in our case studies tended to coordinate venture units through integrative management skills or a coordinating venture unit that spanned multiple phases. We believe this paper makes two significant contributions. First, we extend the prior venturing literature by addressing how firms manage a portfolio of venture units, each achieving different strategic objectives. Second, our framework provides recommendations on how firms should manage such an approach to venturing. This helps to increase the likelihood of success of their venturing programs.

Relevance:

40.00%

Publisher:

Abstract:

Distributed generators (DGs) are defined as generators that are connected to a distribution network. With DGs present, the direction of power flow and the short-circuit current in a network can change compared with a network without DGs, and conventional protective relay schemes no longer meet the requirements of this emerging situation. As the number and capacity of DGs in the distribution network increase, the problem of coordinating protective relays becomes more challenging. Against this background, the protective relay coordination problem in distribution systems is investigated, with directional overcurrent relays taken as an example, and formulated as a mixed-integer nonlinear programming problem. A mathematical model describing the problem is first developed, and the well-established differential evolution algorithm is then used to solve it. Finally, a sample system is used to demonstrate the feasibility and efficiency of the developed method.
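
A compact sketch of the general approach described above: inverse-time relay operating characteristics, a backup/primary coordination margin enforced as a penalty, and a small hand-rolled differential evolution loop tuning the time-dial settings. The fault currents, pickup currents, margin and two-relay layout are invented for illustration, and the formulation is far simpler than the mixed-integer model in the paper:

```python
import numpy as np

A, B, CTI = 0.14, 0.02, 0.3          # IEC standard inverse constants, coordination margin (s)

def op_time(tds, i_fault, i_pickup):
    """Inverse-time overcurrent relay operating time."""
    return tds * A / ((i_fault / i_pickup) ** B - 1.0)

def objective(tds):
    """Sum of operating times plus a penalty for violating the coordination margin."""
    t_primary = op_time(tds[0], i_fault=2000.0, i_pickup=400.0)   # relay 1 for its own fault
    t_backup = op_time(tds[1], i_fault=2000.0, i_pickup=500.0)    # relay 2 backing it up
    violation = max(0.0, CTI - (t_backup - t_primary))
    return t_primary + t_backup + 100.0 * violation

def differential_evolution(obj, bounds, pop_size=30, gens=200, F=0.7, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    fit = np.array([obj(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)        # DE/rand/1 mutation
            cross = rng.random(len(bounds)) < CR
            trial = np.where(cross, mutant, pop[i])          # binomial crossover
            f_trial = obj(trial)
            if f_trial < fit[i]:                             # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = fit.argmin()
    return pop[best], fit[best]

if __name__ == "__main__":
    bounds = np.array([[0.05, 1.1], [0.05, 1.1]])            # allowable TDS range per relay
    tds, cost = differential_evolution(objective, bounds)
    print("best TDS:", tds.round(3), "objective:", round(cost, 3))
```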