943 results for Conformal Field Models in String Theory
Abstract:
Microfinance institutions employ various kinds of incentive schemes, but estimating the effect of each scheme is difficult due to endogeneity bias. We conducted field experiments in Vietnam to capture the role of joint liability, monitoring, cross-reporting, social sanctions, communication and group formation in borrowers' repayment behavior. We find that joint liability contracts cause serious free-riding problems, inducing strategic default and lowering repayment rates. When group members observe each other's investment returns, participants are more likely to choose strategic default. Even after introducing a cross-reporting system and/or penalties among borrowers, the default rates and the proportions of participants who chose strategic default under joint liability remain higher than those under individual lending. We also find that joint liability lending often failed to induce mutual insurance among borrowers. Those who had been helped or who had repaid a little in the previous round were more likely to default strategically and repay a little again in the current round, while those who paid large amounts were always the same individuals.
Abstract:
Membrane systems are computationally equivalent to Turing machines. However, their distributed and massively parallel nature yields polynomial-time solutions to problems whose traditional solutions are non-polynomial. It is therefore important to develop dedicated hardware and software implementations that exploit these two features of membrane systems. In distributed implementations of P systems, a communication bottleneck problem arises: as the number of membranes grows, the network becomes congested. The purpose of distributed architectures is to reach a compromise between the massively parallel character of the system and the evolution step time needed to transit from one configuration of the system to the next, solving the communication bottleneck problem. The goal of this paper is twofold. Firstly, to survey in a systematic and uniform way the main results regarding how membranes can be placed on processors in order to obtain a software/hardware simulation of P systems in a distributed environment. Secondly, we improve some results on the membrane dissolution problem, prove that it is connected, and discuss the possibility of simulating this property in the distributed model. All this improves the implementation of the system's parallelism, since it increases the parallelism of the external communication among processors. The proposed ideas improve on previous architectures for tackling the communication bottleneck problem by reducing the total time of an evolution step, increasing the number of membranes that can run on a processor, and reducing the number of processors.
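A minimal sketch of the kind of membrane-to-processor placement such architectures reason about, assuming a membrane tree with uniform per-membrane workload; the greedy subtree-packing heuristic and all names here are illustrative, not the architecture proposed in the paper:

```python
# Hypothetical placement heuristic: keep whole membrane subtrees on one
# processor when they fit (few tree edges cross processors, so little
# external communication), and split at membrane boundaries otherwise.
from collections import defaultdict

def place_membranes(parent, n_procs):
    """parent maps membrane id -> parent id (None for the skin membrane)."""
    children = defaultdict(list)
    root = None
    for node, par in parent.items():
        if par is None:
            root = node
        else:
            children[par].append(node)

    size = {}
    def subtree_size(node):  # workload proxy: number of membranes below
        size[node] = 1 + sum(subtree_size(c) for c in children[node])
        return size[node]
    subtree_size(root)

    target = size[root] / n_procs
    assignment, load = {}, [0] * n_procs

    def assign(node, proc):
        if load[proc] + size[node] <= 1.5 * target or not children[node]:
            stack = [node]           # whole subtree fits: keep it together
            while stack:
                m = stack.pop()
                assignment[m] = proc
                load[proc] += 1
                stack.extend(children[m])
        else:                        # split: children go to least-loaded procs
            assignment[node] = proc
            load[proc] += 1
            for c in children[node]:
                assign(c, min(range(n_procs), key=load.__getitem__))

    assign(root, 0)
    return assignment

# Toy membrane structure: skin membrane 0 with two branches.
parent = {0: None, 1: 0, 2: 0, 3: 1, 4: 1, 5: 2, 6: 2}
print(place_membranes(parent, 2))  # each branch ends up on one processor
```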
Abstract:
At present there is a large literature on the advantages and disadvantages of different methods of statistical and dynamical downscaling of climate variables projected by climate models. Less attention has been paid to indirect variables, such as runoff, which play a significant role in evaluating the impact of climate change on hydrological systems. Runoff presents a much greater bias in climate models than climate variables such as temperature or precipitation. It is therefore very important to identify the methods that minimize bias when downscaling runoff from the gridded results of climate models to the basin scale.
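For concreteness, here is a minimal sketch of one standard bias-minimizing step, empirical quantile mapping, on synthetic gamma-distributed runoff; this is a generic textbook technique with made-up data, not the specific methods or data evaluated in the paper:

```python
# Empirical quantile mapping: correct model runoff so its distribution matches
# observations over a calibration period, then apply the same transfer
# function to projected runoff. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
obs = rng.gamma(shape=2.0, scale=30.0, size=3000)         # observed runoff (m3/s)
model_hist = rng.gamma(shape=2.0, scale=45.0, size=3000)  # biased historical model runoff
model_proj = rng.gamma(shape=2.2, scale=45.0, size=3000)  # projected model runoff

def quantile_map(x, model_ref, obs_ref, n_q=99):
    """Map values x through the model -> observation quantile transfer function."""
    q = np.linspace(1, 99, n_q)
    mq = np.percentile(model_ref, q)
    oq = np.percentile(obs_ref, q)
    return np.interp(x, mq, oq)  # linear interpolation between quantile pairs

corrected = quantile_map(model_proj, model_hist, obs)
print(f"raw projection mean {model_proj.mean():.1f}, "
      f"corrected {corrected.mean():.1f}, observed {obs.mean():.1f}")
```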
Abstract:
Innovations in the current interconnected world of organizations have led to a focus on business models as a fundamental statement of direction and identity. Although industry transformations generally emanate from technological changes, recent examples suggest they may also be driven by the introduction of new business models. In the past, different types of airline business models could be clearly separated from each other. This has changed in recent years, however, partly due to the concentration process and partly as a reaction to competitive pressure. At the least, it can be concluded that in the future the distinction between different business models will become less clear. To advance the use of business models as a concept, it is essential to be able to compare them and perform analyses that identify the business models with the highest potential. This can contribute substantially to understanding the synergies and incompatibilities when two airlines undertake a merger, as illustrated by the example of the Swiss Air-Lufthansa merger analysis. The idea is to develop quantitative methods and tools for comparing and analyzing aeronautical/airline business models. The paper identifies available methods of comparing airline business models and lays the groundwork for a quantitative model for comparing them, which can be a useful tool for business model analysis when two airlines are merged.
Abstract:
Thanks to their inherent properties, probabilistic graphical models are one of the prime candidates for machine learning and decision-making tasks, especially in uncertain domains. Their capabilities, such as representation, inference and learning, if used effectively, can greatly help in building intelligent systems that act appropriately in different problem domains. Evolutionary algorithms are one such discipline that has employed probabilistic graphical models to improve the search for optimal solutions in complex problems. This paper shows how probabilistic graphical models have been used in evolutionary algorithms to improve their performance in solving complex problems. Specifically, we give a survey of probabilistic-model-building evolutionary algorithms, called estimation of distribution algorithms, and compare different methods for probabilistic modeling in these algorithms.
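A minimal sketch of the simplest estimation of distribution algorithm, UMDA, on the classic OneMax toy problem; the parameter values are illustrative, and the EDAs surveyed in the paper use richer probabilistic models (up to full Bayesian networks) than the independent Bernoulli marginals shown here:

```python
# Univariate Marginal Distribution Algorithm (UMDA), the simplest EDA:
# estimate independent Bernoulli marginals from the selected individuals,
# then sample the next population from that probabilistic model.
import numpy as np

def umda(fitness, n_bits=40, pop=100, elite_frac=0.5, gens=50, seed=1):
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)                  # initial model: uniform bits
    best = -np.inf
    for _ in range(gens):
        X = (rng.random((pop, n_bits)) < p).astype(int)    # sample the model
        f = np.apply_along_axis(fitness, 1, X)
        best = max(best, f.max())
        elite = X[np.argsort(f)[-int(pop * elite_frac):]]  # truncation selection
        p = elite.mean(axis=0).clip(0.02, 0.98)            # re-estimate marginals
    return p, best

p, best = umda(fitness=np.sum)  # OneMax: fitness = number of one-bits
print("best fitness found:", best, "out of 40")
```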
Abstract:
International Conference on Dynamics of the Media and Content Industry. European Forum for Science and Industry.
Abstract:
Wind farms have been extensively simulated with engineering models for the estimation of wind speed and power deficits inside wind farms. These models were initially designed for a few wind turbines located in flat terrain. Other models, based on the parabolic approximation of the Navier-Stokes equations, were later developed, making the operational simulation of large wind farms in flat terrain and offshore sites more realistic and feasible. These models have proved accurate enough when solving wake effects for this type of environment. Nevertheless, few analyses exist of how complex terrain can affect the behaviour of wind farm wake flow. Recent numerical studies have demonstrated that topographical wakes induce a significant effect on wind turbine wakes compared with flat terrain. This has motivated the development of elliptic CFD models that allow global simulation of wind turbine wakes in complex terrain. An accurate simplification for the analysis of wind turbine wakes is the actuator disk technique. Coupling this technique with CFD wind models enables the estimation of wind farm wakes while preserving the extraction of axial momentum present inside wind farms. This paper describes the analysis and validation of the elliptical wake model CFDWake 1.0 against experimental data from an operating wind farm located in complex terrain. The analysis also examines whether or not the effects of terrain and wind turbine wakes can be superimposed linearly. It also represents one of the first attempts to assess how engineering models perform in large wind farms in complex terrain.
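For flavor of the "engineering models" the paper benchmarks, here is a minimal sketch of the classic Jensen (Park) wake model on flat terrain; the thrust coefficient, rotor diameter and wake decay constant below are illustrative assumptions, and CFDWake 1.0 itself is a CFD code not reproduced here:

```python
# Jensen (Park) engineering wake model: the velocity deficit behind a turbine
# decays as the wake radius expands linearly downstream. Flat-terrain textbook
# formula with assumed parameter values.
import numpy as np

def jensen_deficit(x, u_inf, ct=0.8, rotor_d=80.0, k=0.075):
    """Wind speed deficit (m/s) on the wake axis a distance x downstream."""
    r0 = rotor_d / 2.0
    a = (1.0 - np.sqrt(1.0 - ct)) / 2.0      # axial induction from thrust coeff
    rw = r0 + k * x                           # wake radius expands linearly
    return u_inf * 2.0 * a * (r0 / rw) ** 2   # axial momentum deficit

u_inf = 10.0  # free-stream wind speed, m/s
for x in (200.0, 400.0, 800.0):
    print(f"x = {x:4.0f} m: wake wind speed ~ {u_inf - jensen_deficit(x, u_inf):.2f} m/s")
```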
Abstract:
The four-dimensional flow in the phase space of three amplitudes of circularly polarized Alfvén waves and one relative phase, resulting from a resonant three-wave truncation of the derivative nonlinear Schrödinger equation, has been analyzed; wave 1 is linearly unstable with growth rate Γ, and waves 2 and 3 are stable with dampings Γ₂ and Γ₃, respectively. The dependence of gross dynamical features on the damping model, as characterized by the relation between the damping and wave-vector ratios Γ₂/Γ₃ and k₂/k₃, and on the polarization of the waves, is discussed; two damping models, Landau (Γ ∝ k) and resistive (Γ ∝ k²), are studied in depth. Very complex dynamics, such as multiple blue-sky catastrophes, chaotic attractors arising from Feigenbaum sequences, and explosive bifurcations involving intermittency-I chaos, are shown to be associated with the existence and loss of stability of a certain fixed point P of the flow. Independently of the damping model, P may only exist as an attractor under flow contraction, which just requires Γ < Γ₂ + Γ₃. In the case of right-hand (RH) polarization, point P may exist for all models other than Landau damping; for the resistive model, P may exist for RH polarization only if Γ > Γ₂ + Γ₃/2.
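A minimal sketch of a generic resonant three-wave system of this kind (three real amplitudes plus one relative phase, one unstable and two damped waves); the coupling coefficients and detuning are generic placeholders, not the actual DNLS truncation coefficients, which the abstract does not give:

```python
# Generic three-wave resonance with growth G on wave 1 and dampings G2, G3
# on waves 2 and 3; a 4-D flow (a1, a2, a3, phi) of the type analyzed.
import numpy as np
from scipy.integrate import solve_ivp

G, G2, G3, delta = 1.0, 0.6, 0.7, 0.1   # growth, dampings, detuning (assumed)

def rhs(t, y):
    a1, a2, a3, phi = y
    s, c = np.sin(phi), np.cos(phi)
    eps = 1e-9                            # guard against division by zero
    return [G * a1 - a2 * a3 * s,
            -G2 * a2 + a1 * a3 * s,
            -G3 * a3 + a1 * a2 * s,
            delta + c * (a2 * a3 / (a1 + eps)
                         - a1 * a3 / (a2 + eps)
                         - a1 * a2 / (a3 + eps))]

sol = solve_ivp(rhs, (0, 200), [0.1, 0.1, 0.1, 0.5], max_step=0.01)
a1 = sol.y[0]
print("late-time a1 range:", a1[-2000:].min().round(3), "to", a1[-2000:].max().round(3))
```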
Abstract:
Molecular computing is a discipline concerned with the design and implementation of information-processing devices on a biological substrate, such as deoxyribonucleic acid (DNA), ribonucleic acid (RNA) or proteins. After Watson and Crick discovered the double-helix molecular structure of DNA in the 1950s, further discoveries followed, such as restriction enzymes and the polymerase chain reaction (PCR), contributing decisively to the emergence of recombinant DNA technology. Thanks to this technology and to the steep decline in the cost of DNA sequencing and synthesis, biomolecular computing was able to move beyond its purely theoretical conception. The work presented by Adleman (1994) solved an NP-complete problem (the directed Hamiltonian path problem) using only DNA molecules. The massive parallel processing capacity offered by recombinant DNA techniques allowed Adleman to solve the problem in polynomial time, albeit at the cost of an exponential consumption of DNA molecules. Using brute-force algorithms similar to Adleman's, other NP-complete problems were later solved, such as the satisfiability of logical formulas, SAT (Lipton, 1995). It soon became clear that biomolecular computing could not compete with silicon computers in speed or precision, so its focus and goals shifted towards solving problems with biomedical applications (Simmel, 2007), leaving aside the solution of classical computing problems. Since then, various models of biomolecular devices have been proposed which, autonomously (without a bio-engineer performing laboratory operations), can process a biological substrate as input and produce an output also in biological form: processors that exploit polymerase extension (Hagiya et al., 1997), automata driven by restriction enzymes (Benenson et al., 2001) or by deoxyribozymes (Stojanovic et al., 2002), and competitive hybridization circuits (Yurke et al., 2000). This thesis presents a set of nucleic acid device models capable of implementing various logical computing operations using biomolecular computing techniques (competitive DNA hybridization and enzymatic reactions), with applications in genetic diagnosis. The first set of models, presented in Chapter 5 and published in Sainz de Murieta and Rodríguez-Patón (2012b), Rodríguez-Patón et al. (2010a) and Sainz de Murieta and Rodríguez-Patón (2010), defines a type of biosensor that uses single DNA strands to encode simple rules, such as "IF DNA-strand-1 AND DNA-strand-2 are present, THEN disease-B". These rules interact with input signals (DNA or RNA of any kind) to produce an output signal (also in the form of a nucleic acid). This output signal represents a diagnosis, which can be read out using fluorescent particles (FRET techniques) or can even be a treatment administered in response to a set of symptoms. The model presented in Chapter 5, published in Rodríguez-Patón et al. (2011), can execute resolution chains over logical formulas in conjunctive normal form. Each clause of a formula is encoded in a DNA molecule.
Each proposition p is encoded by assigning it a single DNA strand, with the complementary strand corresponding to the proposition ¬p. Clauses are encoded by including several propositions on the same DNA strand. The model can run Horn clause logic programs by applying multiple cascaded resolution iterations, thereby implementing the function of a programmable autonomous nanodevice. This technique can also be used to solve SAT without external assistance. The model presented in Chapter 6 has been published in Sainz de Murieta and Rodríguez-Patón (2012c), and the model presented in Chapter 7 has been published in Sainz de Murieta and Rodríguez-Patón (2013c). Although they exploit different biomolecular computing methods (competitive DNA hybridization in Chapter 6 versus enzymatic reactions in Chapter 7), both models are capable of performing Bayesian inference. They take single DNA strands as input, representing the presence or absence of a specific molecular indicator (the evidence). The prior probability of a disease, together with the conditional probability of a signal (or symptom) given the disease, constitutes the knowledge base, and is encoded by combining different DNA molecules and their relative concentrations. When the input molecules interact with those of the knowledge base, two kinds of DNA strands are released, whose relative proportion represents the application of Bayes' theorem: the conditional probability of the disease given the signal (or symptom). All these devices can be seen as building blocks which, combined modularly, enable the implementation of in vitro systems built from DNA sensors, capable of sensing and processing biological signals. Automata of this kind currently hold great potential, as well as considerable scientific impact. A perfect example was the publication by Xie et al. (2011) in Science, presenting a biomolecular diagnostic automaton able to selectively trigger apoptosis in cancer cells without affecting healthy cells.
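The concentration arithmetic behind the Bayesian devices can be sketched in a few lines; the probabilities and strand names below are illustrative assumptions, not values from the thesis:

```python
# Sketch of the concentration arithmetic behind the Bayesian DNA devices:
# the prior P(D) and likelihood P(S|D) are encoded as relative concentrations
# of gate molecules; when the input (evidence S) arrives, two output strand
# species are released whose relative proportion realizes Bayes' theorem.
p_D = 0.01             # prior probability of disease, encoded in concentration
p_S_given_D = 0.95     # P(signal | disease)
p_S_given_notD = 0.10  # P(signal | no disease)

# Gate concentrations proportional to the joint probabilities:
conc_out_D    = p_D * p_S_given_D             # gate releasing strand "D-and-S"
conc_out_notD = (1 - p_D) * p_S_given_notD    # gate releasing strand "notD-and-S"

# Relative proportion of released strands = posterior P(D | S):
posterior = conc_out_D / (conc_out_D + conc_out_notD)
print(f"P(disease | signal) ~ {posterior:.3f}")
```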
Abstract:
In this paper, we investigate the real demand for climate protection when the purely individual perspective of existing revealed preference studies is relaxed. This is achieved in two treatments: in the first, we vary the information subjects receive about the demand revealed by other subjects in a similar decision-making situation; in the second, collective action is implemented, whereby all subjects are required to purchase the group's median quantity at a given price. Participants in the experiment were offered the opportunity to contribute to climate protection by purchasing European Union Allowances. Allowances purchased were withdrawn from the European Emissions Trading Scheme. In our experiment, information about other subjects' behaviour has no treatment effect on the demand for climate protection. Under collective action, however, the probability of purchasing allowances is higher than in the reference treatment, an individual contribution mechanism. Furthermore, we observe a strong correlation between subjects' demand and their expectations about other participants' behaviour. When collective action is not available, subjects' expectations are consistent with free-rider behaviour.
Abstract:
The objective of this study is to analyze common pool resource appropriation and public good provision decisions in a dynamic setting, testing the differences in behavior and performance between lab and field subjects. We performed a total of 45 games in Nicaragua, including 88 villagers in rural communities and 92 undergraduate students. In order to analyze sequential decision making, we introduce a dynamic and asymmetric irrigation game that combines the typical social dilemmas associated with irrigation system management. In addition, in 9 out of 22 villagers' groups, we implemented a treatment that included the disclosure of subjects' appropriation of the common pool resource. The results reveal that disclosing individuals' appropriation levels leads to higher appropriation in subsequent rounds. In addition, the results show that non-treated villagers provide more public good than treated villagers, but the differences relative to students are not significant. The results also suggest that appropriation levels are below the Nash prediction of full appropriation, but above the socially efficient level. This produces an efficiency loss in the game that can be explained to a large extent by individual decisions on appropriation and public good contribution and by group appropriation behavior.
Radar track segmentation with cubic splines for collision risk models in high density terminal areas
Abstract:
This paper presents a method to segment airplane radar tracks in high density terminal areas where the air traffic follows trajectories with several changes in heading, speed and altitude. The radar tracks are modelled with different types of segments: straight lines, cubic spline functions and shape-preserving cubic functions. The longitudinal, lateral and vertical deviations are calculated for terminal manoeuvring area scenarios. The most promising model of the radar tracks resulted from a mixed interpolation using straight lines for linear segments and cubic spline functions for curved segments. A sensitivity analysis is used to optimise the size of the window for the segmentation process.
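A minimal sketch of the interpolation step on a synthetic turning track, comparing a cubic spline with a shape-preserving cubic (PCHIP, used here as a stand-in for the paper's shape-preserving cubic function); the trajectory and knot spacing are illustrative assumptions:

```python
# Fit a curved radar-track segment with a cubic spline and with a
# shape-preserving cubic (PCHIP), then compare deviations at the radar
# plots dropped from the knot set. Synthetic single-coordinate track.
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

t = np.linspace(0.0, 60.0, 16)        # radar plot times, s
x = 2000.0 * np.cos(0.02 * t)         # a turning trajectory coordinate (m)
keep = np.arange(0, t.size, 3)        # keep every 3rd plot as a knot

cs = CubicSpline(t[keep], x[keep])
pchip = PchipInterpolator(t[keep], x[keep])  # monotone pieces, no overshoot

for name, f in (("cubic spline", cs), ("shape-preserving", pchip)):
    dev = np.abs(f(t) - x)                   # deviation at all radar plots
    print(f"{name:17s} max deviation: {dev.max():6.2f} m")
```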