928 results for Boolean Networks Complexity Measures Automatic Design Robot Dynamics
Abstract:
We propose and estimate a financial distress model that explicitly accounts for the interactions or spill-over effects between financial institutions, through the use of a spatial contiguity matrix that is built from financial network data on interbank transactions. This setup of the financial distress model allows for the empirical validation of the importance of network externalities in determining financial distress, in addition to institution-specific and macroeconomic covariates. The relevance of such a specification is that it simultaneously incorporates micro-prudential factors (Basel II) as well as macro-prudential and systemic factors (Basel III) as determinants of financial distress. Results indicate that network externalities are an important determinant of the financial health of financial institutions. The parameter that measures the effect of network externalities is both economically and statistically significant, and its inclusion as a risk factor reduces the importance of firm-specific variables such as the size or degree of leverage of the financial institution. In addition, we analyze the policy implications of the network factor model for capital requirements and deposit insurance pricing.
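As a rough illustration of the kind of specification described (and not the authors' estimator), the sketch below builds a row-normalised spatial contiguity matrix W from a hypothetical interbank exposure matrix and adds the network externality term W·y to a plain least-squares distress regression; all data and names are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # number of institutions (hypothetical)

# Hypothetical interbank exposure matrix: entry (i, j) > 0 if i transacts with j.
exposures = rng.random((n, n)) * (rng.random((n, n)) < 0.1)
np.fill_diagonal(exposures, 0.0)

# Row-normalised spatial contiguity matrix W built from the exposure data.
row_sums = exposures.sum(axis=1, keepdims=True)
W = np.divide(exposures, row_sums, out=np.zeros_like(exposures), where=row_sums > 0)

# Institution-specific covariates (e.g. size, leverage) and a distress index y.
X = rng.normal(size=(n, 2))
y = rng.normal(size=n)

# Network externality term W @ y alongside the firm-specific covariates and a constant.
# A proper spatial autoregressive model would use maximum likelihood or instrumental
# variables; plain least squares is used here only to show the structure.
Z = np.column_stack([W @ y, X, np.ones(n)])
beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
print("network-externality coefficient (illustrative):", beta[0])
```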
Abstract:
A new practical method to generate a subspace of active coordinates for quantum dynamics calculations is presented. These reduced coordinates are obtained as the normal modes of an analytical quadratic representation of the energy difference between excited and ground states within the complete active space self-consistent field method. At the Franck-Condon point, the largest negative eigenvalues of this Hessian correspond to the photoactive modes: those that reduce the energy difference and lead to the conical intersection; eigenvalues close to zero correspond to bath modes, while modes with large positive eigenvalues are photoinactive vibrations, which increase the energy difference. The efficacy of quantum dynamics run in the subspace of the photoactive modes is illustrated with the photochemistry of benzene, where theoretical simulations are designed to assist optimal control experiments.
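The mode-classification step can be illustrated with a minimal sketch: a symmetric stand-in for the Hessian of the excited/ground-state energy difference is diagonalised, and its eigenvectors are split into photoactive, bath and photoinactive modes by the sign and size of the eigenvalues. The matrix, threshold and mode count below are placeholders, not CASSCF quantities.

```python
import numpy as np

rng = np.random.default_rng(1)
n_modes = 12
A = rng.normal(size=(n_modes, n_modes))
hessian = 0.5 * (A + A.T)                  # symmetric stand-in for the real Hessian

eigvals, eigvecs = np.linalg.eigh(hessian)

tol = 0.5                                  # illustrative threshold, not from the paper
photoactive   = eigvecs[:, eigvals < -tol]   # reduce the gap, lead toward the intersection
bath          = eigvecs[:, np.abs(eigvals) <= tol]
photoinactive = eigvecs[:, eigvals > tol]    # increase the gap

print("photoactive modes :", photoactive.shape[1])
print("bath modes        :", bath.shape[1])
print("photoinactive     :", photoinactive.shape[1])
```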
Abstract:
This thesis studies robustness against large-scale failures in communication networks. When failures are isolated, they usually go unnoticed by users thanks to recovery mechanisms. However, such mechanisms are not effective against large-scale multiple failures, which can cause huge economic loss. A key requirement for devising mechanisms that lessen their impact is the ability to evaluate network robustness. This thesis focuses on multilayer networks with separated control and data planes. Most existing measures of robustness are unable to capture the true service degradation in such a setting, because they rely on purely topological features. One of the major contributions of this thesis is a new measure of functional robustness. Failure dynamics are modeled from the perspective of epidemic spreading, for which a new epidemic model is proposed. Another contribution is a taxonomy of multiple, large-scale failures, adapted to the needs and usage of the networking field.
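As a generic illustration of epidemic-style failure modelling (a plain SIS-like process, not the new epidemic model proposed in the thesis), the sketch below spreads failures over a random graph from a small seed and reports the fraction of failed nodes; all parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p_edge = 100, 0.05
upper = np.triu((rng.random((n, n)) < p_edge).astype(int), 1)
adj = upper + upper.T                                  # undirected random graph

beta, mu, steps = 0.1, 0.05, 200                       # spreading / repair probabilities
failed = np.zeros(n, dtype=bool)
failed[rng.choice(n, size=3, replace=False)] = True    # seed of the large-scale failure

for _ in range(steps):
    pressure = adj @ failed.astype(int)                # number of failed neighbours
    newly_failed = (~failed) & (rng.random(n) < 1 - (1 - beta) ** pressure)
    repaired = failed & (rng.random(n) < mu)
    failed = (failed | newly_failed) & ~repaired

print("fraction of failed nodes after spreading:", failed.mean())
```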
Abstract:
The aim of this thesis is to predict the performance of doctoral students at the University of Girona on the basis of the students' personal (background), attitudinal and social network characteristics. The population studied consists of third- and fourth-year doctoral students and their thesis supervisors. To obtain the data, a web questionnaire was designed, specifying its advantages and taking into account some traditional problems of non-coverage and non-response. A web questionnaire was chosen because of the complexity involved in social network questions. The electronic questionnaire makes it possible, through a series of instructions, to reduce response time and make the task less burdensome. This web questionnaire is also self-administered, which, according to the literature, yields more honest answers than an interviewer-administered questionnaire. The quality of social network questions in web questionnaires for egocentric data is analyzed. To this end, the reliability and validity of this type of question are estimated, for the first time, through the Multitrait-Multimethod (MTMM) model. Since the data are egocentric, they can be considered hierarchical, and for the first time a Multilevel Multitrait-Multimethod model is used. Reliability and validity can be obtained at the individual level (within-group component) or at the group level (between-group component), and they are used to carry out a meta-analysis with other European universities in order to examine certain questionnaire design characteristics. These characteristics examine whether social network questions asked in web questionnaires are more reliable and valid when asked "by questions" or "by alters", whether all frequency labels should be shown for the items or only the first and last, and whether it is better for the questionnaire to be designed in colour or in black and white. The quality of the social network as a whole is also analyzed; in this specific case, the networks are the university's research groups. The problems of missing data in complete networks are addressed, and a new alternative to the usual solutions of the egocentric network or proxy respondents is proposed. We have named this new alternative the "Nosduocentered Network", which is based on two central actors in a network. Estimating regression models, this Nosduocentered Network has more predictive power for the performance of doctoral students than the egocentric network. In addition, the correlations of the attitudinal variables are corrected for attenuation due to the small sample size. Finally, regressions are run on the three types of variables (background, attitudinal and social network), which are then combined to determine which best predicts the performance (in terms of academic publications) of doctoral students. The results lead us to conclude that the academic performance of doctoral students depends on personal (background) and attitudinal variables. The results obtained are also compared with other published studies.
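A minimal sketch of the final comparison step, with synthetic data and placeholder variable names, might look as follows: each predictor group (background, attitudinal, network) and their combination is used to predict publication counts, and the fits are compared by R².

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n = 120
background  = rng.normal(size=(n, 3))     # e.g. age, prior degree grade, funding
attitudinal = rng.normal(size=(n, 3))     # e.g. motivation, satisfaction scales
network     = rng.normal(size=(n, 2))     # e.g. Nosduocentered-network indicators

# Synthetic outcome: publication count driven mostly by background and attitudes.
publications = (background @ [0.5, 0.3, 0.0] + attitudinal @ [0.4, 0.0, 0.2]
                + 0.5 * rng.normal(size=n))

groups = {"background": background, "attitudinal": attitudinal, "network": network,
          "combined": np.hstack([background, attitudinal, network])}
for name, X in groups.items():
    r2 = LinearRegression().fit(X, publications).score(X, publications)
    print(f"{name:11s} R^2 = {r2:.2f}")
```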
Abstract:
The thesis aims to understand entrepreneurial processes that seek to create businesses or products with a high degree of complexity. This complexity stems from the fact that such products or initiatives are only viable with the concurrence of a large number of heterogeneous actors (public, private, from different regions, etc.) who interact in a relational context. A case with these characteristics is the Camí dels Bons Homes. The thesis analyzes the evolution of the relational network from the point of view of its structure and the content of its ties. The results show and explain the observed changes in the network structure and in the content of the ties. This analysis of tie content contributes a new systematization and operationalization of ties' content. Moreover, the analysis takes into account negative ties, an issue that is less discussed in the literature.
Abstract:
This thesis proposes a framework for identifying the root cause of a voltage disturbance, as well as its source location (upstream or downstream) relative to the monitoring point. The framework works with three-phase voltage and current waveforms collected in radial distribution networks without distributed generation, and it is tested on both real-world and synthetic waveforms. The framework involves features that are conceived on the basis of electrical principles and of hypotheses about the analyzed phenomena. The features considered are based on waveforms and timestamp information. Multivariate analysis of variance and rule induction algorithms are applied to assess the amount of meaningful information explained by each feature, according to the root cause of the disturbance and its source location. The obtained classification rates show that the proposed framework could be used for automatic diagnosis of voltage disturbances collected in radial distribution networks. Furthermore, the diagnostic results can subsequently be used to support power network operation, maintenance and planning.
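A minimal sketch of the feature-plus-classifier idea, using a single synthetic phase for brevity and a decision tree standing in for the rule-induction algorithms of the thesis; the features and labels are illustrative only.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
fs, f0 = 6400, 50                                # sampling rate (Hz), fundamental (Hz)
t = np.arange(0, 10 / f0, 1 / fs)                # ten cycles of signal

def make_waveform(sag_depth):
    """One phase voltage with a rectangular sag of the given depth mid-record."""
    v = np.sin(2 * np.pi * f0 * t)
    v[(t > 0.06) & (t < 0.14)] *= (1.0 - sag_depth)
    return v + 0.01 * rng.normal(size=v.size)

def features(v):
    """Simplified features: minimum sliding-window RMS and relative sag duration."""
    w = fs // f0
    rms = np.sqrt(np.convolve(v ** 2, np.ones(w) / w, mode="same"))
    return [rms.min(), float((rms < 0.9 * rms.max()).mean())]

depths = rng.uniform(0.0, 0.8, size=200)
X = np.array([features(make_waveform(d)) for d in depths])
y = (depths > 0.4).astype(int)                   # toy labels: deep vs shallow sag
clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print("training accuracy:", clf.score(X, y))
```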
Abstract:
The service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using the PC is compared with QoS parameters in buffer-less environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for connection admission control (CAC) in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimation is always conservative, allowing the network performance guarantees to be retained. Several experiments have been carried out and investigated to explain the deviation between the proposed method and simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments to confine the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay cannot be ignored for long-distance and interactive communications, so small buffers must be used in order to minimise delay. Under these premises, the convolution approach is the most accurate method for bandwidth allocation, giving sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations. To overcome these drawbacks, a new evaluation method is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage requirements, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each class j of traffic (CLRj), and an expression for evaluating CLRj is presented. We conclude that, by combining the ECA method with cut-off mechanisms, utilisation of the ECA in real-time CAC environments as a single-level scheme is always possible.
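The class-based convolution idea behind the ECA can be sketched as follows: on/off sources are grouped into classes of identical parameters, a binomial partial state is computed per class, and the partial states are convolved to estimate the CLR on a buffer-less link. The traffic parameters below are illustrative and are not taken from the thesis.

```python
import numpy as np
from scipy.stats import binom

# Each class: (number of sources, probability a source is active, peak rate in Mbit/s)
classes = [(40, 0.2, 2.0), (10, 0.5, 10.0)]
link_capacity = 80.0                       # Mbit/s
rate_step = 2.0                            # discretisation of the aggregate rate axis

def class_distribution(n, p, rate):
    """Probability mass of the aggregate rate produced by one class of identical sources."""
    pmf = np.zeros(int(n * rate / rate_step) + 1)
    for k in range(n + 1):
        pmf[int(round(k * rate / rate_step))] += binom.pmf(k, n, p)
    return pmf

# Multi-convolution of the per-class partial states gives the global state probabilities.
agg = np.array([1.0])
for n, p, rate in classes:
    agg = np.convolve(agg, class_distribution(n, p, rate))

rates = np.arange(agg.size) * rate_step
mean_rate = (rates * agg).sum()
lost_rate = (np.maximum(rates - link_capacity, 0.0) * agg).sum()
print("estimated CLR (buffer-less fluid approximation):", lost_rate / mean_rate)
```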
Abstract:
This thesis addresses the problem of learning in physical heterogeneous multi-agent systems (MAS) and analyses the benefits of using heterogeneous MAS with respect to homogeneous ones. An algorithm is developed for this task, building on previous work on stability in distributed systems by Tad Hogg and Bernardo Huberman, and combining two phenomena observed in natural systems: task partition and hierarchical dominance. The algorithm is devised to allow agents to learn which tasks are best to perform on the basis of each agent's skills and its contribution to the team's global performance. Agents learn by interacting with the environment and with other teammates, and receive rewards from the results of the actions they perform. The algorithm is especially designed for problems where all robots have to co-operate and work simultaneously towards the same goal. One example of such a problem is role distribution in a team of heterogeneous robots that form a soccer team, where all members take decisions and co-operate simultaneously. Soccer offers the possibility of conducting research in MAS where co-operation plays a very important role in a dynamic and changing environment. For these reasons, and given the experience of the University of Girona in this domain, soccer has been selected as the test-bed for this research. In the case of soccer, tasks are grouped by means of roles. One of the most interesting features of the algorithm is that it endows MAS with high adaptability to changes in the environment: it allows the team to perform its tasks while adapting to the environment. This is studied in several cases, for changes both in the environment and in the robots' bodies. Other features are also analysed, especially a parameter that defines the fitness (in the biological sense) of each agent in the system, which contributes to performance and team adaptability. The algorithm is then applied to allow agents in teams of homogeneous and heterogeneous robots to learn which roles they have to select in order to maximise team performance. The teams are compared and their performance is evaluated in games against three hand-coded teams and against the different homogeneous and heterogeneous teams built in this thesis. This part focuses on the analysis of performance and task partition, in order to study the benefits of heterogeneity in physical MAS. To study heterogeneity from a rigorous point of view, a diversity measure is developed, building on the hierarchic social entropy defined by Tucker Balch and adapting it to quantify physical diversity in robot teams. This tool presents very interesting features, as it can be used in the future to design heterogeneous teams on the basis of knowledge about other teams.
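A simplified sketch of a hierarchic-social-entropy-style diversity measure (in the spirit of Balch's definition, not its exact formulation): hypothetical physical feature vectors of the robots are clustered at increasing taxonomic levels, the Shannon entropy of each clustering is computed, and the entropy curve is integrated over the levels.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def social_entropy(labels):
    """Shannon entropy of a flat clustering of the team."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def hierarchic_social_entropy(features, h_max=5.0, steps=200):
    """Area under the entropy-versus-taxonomic-level curve (rectangle rule)."""
    Z = linkage(features, method="complete")
    levels = np.linspace(0.0, h_max, steps)
    entropies = [social_entropy(fcluster(Z, t=h, criterion="distance")) for h in levels]
    return float(np.mean(entropies) * h_max)

rng = np.random.default_rng(3)
homogeneous = np.tile(rng.normal(size=(1, 4)), (5, 1))    # five identical robots
heterogeneous = rng.normal(size=(5, 4))                   # five physically distinct robots
print("homogeneous team diversity  :", hierarchic_social_entropy(homogeneous))
print("heterogeneous team diversity:", hierarchic_social_entropy(heterogeneous))
```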
Abstract:
The UK construction industry has been criticised for being slow to adopt construction process innovations. Research shows that the idiosyncrasies of participants, their roles in the system and the contextual differences between sections of the industry make this a highly complex problem. There is considerable evidence that informal social networks play a key role in the diffusion of innovations. The aim is to identify the informal communication networks of project participants and the role these play in the diffusion of construction innovations. The characteristics of these networks will be analysed in order to understand how they can be used to accelerate innovation diffusion within and between projects. Social Network Analysis is used to determine informal communication routes. Control and experiment case-study projects are used within two different organizations. This allows informal communication routes concerning innovations to be mapped, whilst testing whether the informal routes can facilitate diffusion. Analysis will focus on understanding the combination of informal strong and weak ties, and how these impede or facilitate the diffusion of innovation. Initial work suggests the presence of an informal communication network. Actors within this informal network and the organization's management are unaware of its existence and of their informal roles within it. Thus, the network remains an untapped medium for innovation diffusion. It is proposed that successful innovation diffusion depends on understanding informal strong and weak ties at project, organization and industry level.
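A hedged sketch of the tie-strength analysis on a small made-up communication network: ties are split into strong and weak by contact frequency, and edge betweenness flags the weak tie that bridges the two project teams. The threshold and the data are illustrative only, not findings from the case studies.

```python
import networkx as nx

G = nx.Graph()
# (person_a, person_b, contacts_per_month) -- illustrative data only
edges = [("a1", "a2", 12), ("a2", "a3", 9), ("a1", "a3", 10),   # project team A
         ("b1", "b2", 11), ("b2", "b3", 8), ("b1", "b3", 9),    # project team B
         ("a3", "b1", 1)]                                       # occasional contact
G.add_weighted_edges_from(edges)

# Split ties by an assumed frequency threshold of four contacts per month.
strong = [(u, v) for u, v, w in G.edges(data="weight") if w >= 4]
weak = [(u, v) for u, v, w in G.edges(data="weight") if w < 4]

# Edge betweenness highlights ties that bridge otherwise separate groups.
bridging = nx.edge_betweenness_centrality(G)
print("weak ties:", weak)
print("most bridging tie:", max(bridging, key=bridging.get))
```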
Abstract:
The identification of criminal networks is not a routine exploratory process within the current practice of law enforcement authorities; rather, it is triggered by specific evidence of criminal activity being investigated. A network is identified when a criminal comes to notice, and any associates who could also be potentially implicated need to be identified, if only to be eliminated from the enquiries as suspects or witnesses, as well as to prevent and/or detect crime. However, an identified network may not be the one causing most harm in a given area. This paper presents a methodology to identify all of the criminal networks that are present within a Law Enforcement Area and to prioritise those that are causing most harm to the community. Each crime is allocated a score based on its crime type and how recently the crime was committed; the network score, which can be used as decision support to help prioritise the network for law enforcement purposes, is the sum of the individual crime scores.
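A minimal sketch of this scoring rule, with hypothetical crime-type weights and an assumed exponential recency decay (the paper's actual weights and decay are not reproduced here):

```python
from datetime import date

TYPE_WEIGHTS = {"burglary": 3.0, "assault": 5.0, "drug_supply": 8.0}  # hypothetical

def crime_score(crime_type, committed_on, today=date(2024, 1, 1), half_life_days=365):
    """Type weight discounted exponentially by the age of the crime (assumed decay)."""
    age = (today - committed_on).days
    return TYPE_WEIGHTS[crime_type] * 0.5 ** (age / half_life_days)

def network_score(crimes):
    """Sum of individual crime scores, used to prioritise networks."""
    return sum(crime_score(t, d) for t, d in crimes)

network_a = [("burglary", date(2023, 11, 2)), ("assault", date(2023, 6, 15))]
network_b = [("drug_supply", date(2022, 1, 10))]
ranking = sorted([("A", network_score(network_a)), ("B", network_score(network_b))],
                 key=lambda item: -item[1])
print("priority order:", ranking)
```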
Abstract:
Robot-mediated therapies offer a new approach to neurorehabilitation. This paper analyses the Fugl-Meyer data from the Gentle/S project and finds that the two intervention phases (sling suspension and robot-mediated therapy) have approximately equal value for the further recovery of chronic stroke subjects (on average 27 months post stroke). Both sling-suspension and robot-mediated interventions show a recovery over baseline, and further work is needed to establish the common factors in treatment and to establish intervention protocols for each that will give individual subjects a maximum level of recovery.
Abstract:
We consider a fully complex-valued radial basis function (RBF) network for regression applications. The locally regularised orthogonal least squares (LROLS) algorithm with D-optimality experimental design, originally derived for constructing parsimonious real-valued RBF network models, is extended to the fully complex-valued RBF network. Like its real-valued counterpart, the proposed algorithm aims to achieve maximum model robustness and sparsity by combining two effective and complementary approaches. The LROLS algorithm alone is capable of producing a very parsimonious model with excellent generalisation performance, while the D-optimality design criterion further enhances model efficiency and robustness. By specifying an appropriate weighting for the D-optimality cost in the combined model selection criterion, the entire model construction procedure becomes automatic. An example of identifying a complex-valued nonlinear channel is used to illustrate the regression application of the proposed fully complex-valued RBF network.
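Only the model structure is sketched below: a complex-valued RBF regression fitted by ordinary complex least squares, with a Gaussian basis on the complex distance. The LROLS subset selection and the D-optimality criterion themselves are not implemented in this sketch, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

def rbf_design(X, centres, width):
    """Gaussian basis evaluated on the magnitude of complex input-centre differences."""
    d = X[:, None, :] - centres[None, :, :]
    return np.exp(-np.sum(np.abs(d) ** 2, axis=2) / width ** 2)

# Synthetic complex-valued, channel-like data: y = f(x) + complex noise.
X = rng.normal(size=(200, 2)) + 1j * rng.normal(size=(200, 2))
y = X[:, 0] * np.conj(X[:, 1]) + 0.05 * (rng.normal(size=200) + 1j * rng.normal(size=200))

centres = X[rng.choice(len(X), size=20, replace=False)]   # centres picked at random
Phi = rbf_design(X, centres, width=1.5)                   # real-valued basis outputs
w, *_ = np.linalg.lstsq(Phi.astype(complex), y, rcond=None)  # complex output weights
nmse = np.mean(np.abs(Phi @ w - y) ** 2) / np.mean(np.abs(y) ** 2)
print("training NMSE:", nmse)
```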
Abstract:
This paper describes a structural design technique for a rehabilitation robot intended for upper-limb post-stroke therapy. First, a novel approach to a rehabilitation robot is proposed and the features of the robot are explained. Second, the direct kinematics and the inverse kinematics of the proposed robot structure are derived. Finally, a mechanical design procedure is explained that achieves a compromise between the required motion range and ensuring workspace safety. The suitability of a portable escort-type structure for upper-limb rehabilitation of both acute and chronic stroke is discussed.
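As a generic illustration of the direct/inverse kinematics step (using a planar two-link arm rather than the geometry of the proposed robot, which is not reproduced here), a minimal sketch with arbitrary link lengths:

```python
import numpy as np

L1, L2 = 0.3, 0.25          # link lengths in metres (hypothetical)

def forward(theta1, theta2):
    """End-effector position for given joint angles (direct kinematics)."""
    x = L1 * np.cos(theta1) + L2 * np.cos(theta1 + theta2)
    y = L1 * np.sin(theta1) + L2 * np.sin(theta1 + theta2)
    return x, y

def inverse(x, y, elbow_up=True):
    """Joint angles that reach (x, y), if the point lies in the workspace."""
    c2 = (x ** 2 + y ** 2 - L1 ** 2 - L2 ** 2) / (2 * L1 * L2)
    if abs(c2) > 1:
        raise ValueError("target outside the reachable workspace")
    s2 = np.sqrt(1 - c2 ** 2) * (1 if elbow_up else -1)
    theta2 = np.arctan2(s2, c2)
    theta1 = np.arctan2(y, x) - np.arctan2(L2 * s2, L1 + L2 * c2)
    return theta1, theta2

t1, t2 = inverse(0.35, 0.2)
print("round trip:", forward(t1, t2))   # should return (0.35, 0.2) up to rounding
```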