5 results for Network re-configuration
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Motivated by the need to understand the underlying forces that trigger network evolution, we develop a multilevel, theoretically grounded and empirically testable model of the relationship between changes in the external environment and network change. We refer to network change as the dissolution or replacement of an interorganizational tie, as well as the formation of new ties with new or preexisting partners. Previous research has paid scant attention to the organizational consequences of quantum change enveloping entire industries, favoring instead an emphasis on continuous change. To highlight radical change we introduce the concept of the environmental jolt. The September 11 terrorist attacks provide us with a natural experiment to test our hypotheses on the antecedents and consequences of network change. Since network change can be explained at multiple levels, we incorporate firm-level variables as moderators. The empirical setting is the global airline industry, which can be regarded as a constantly changing network of alliances. The study reveals that firms react to environmental jolts by forming homophilous ties and transitive triads, in contrast to non-jolt periods. Moreover, we find that, all else being equal, firms that adopt a brokerage posture earn positive returns. However, we find that in the face of an environmental jolt brokerage relates negatively to firm performance, and that this negative relationship is more pronounced for larger firms. Our findings suggest that jolts are an important predictor of network change, that they significantly affect operational returns, and that they should therefore be incorporated in studies of network dynamics.
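The two network measures this abstract relies on can be made concrete with a small sketch. This is not the thesis model, only an illustration of the concepts: transitivity (how often connected triples close into triangles) and brokerage, read here in the structural-holes sense as the number of a firm's partner pairs that are not directly tied to each other. The toy "alliance" graph and all names are hypothetical.

```python
# Illustrative sketch of two measures named in the abstract; the toy
# adjacency structure and function names are hypothetical, not the
# thesis's actual variables.

def transitivity(adj):
    """Fraction of connected triples that close into triangles."""
    closed = opened = 0
    for v in adj:
        nbrs = list(adj[v])
        for i in range(len(nbrs)):
            for j in range(i + 1, len(nbrs)):
                opened += 1                      # a connected triple centred on v
                if nbrs[j] in adj[nbrs[i]]:
                    closed += 1                  # ...that is also a triangle
    return closed / opened if opened else 0.0

def brokerage(adj, v):
    """Structural holes around v: neighbour pairs with no direct tie."""
    nbrs = list(adj[v])
    return sum(1 for i in range(len(nbrs)) for j in range(i + 1, len(nbrs))
               if nbrs[j] not in adj[nbrs[i]])

# Toy alliance network: airline A brokers between unconnected B and C.
alliances = {"A": {"B", "C"}, "B": {"A"}, "C": {"A"}}
print(transitivity(alliances))    # 0.0 -> no closed triads
print(brokerage(alliances, "A"))  # 1   -> one structural hole to broker
```

Forming "transitive triads" after a jolt would, in these terms, push transitivity up while closing the structural holes that brokerage counts.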
Abstract:
This research argues for an analysis of textual and cultural forms in the American horror film (1968–1998) by defining its so-called postmodern characters. The term “postmodern” will not denote a period in the history of cinema, but a series of forms and strategies recognizable in many American films. From a bipolar, re-mediation and cognitive point of view, the postmodern phenomenon is considered a formal and epistemological re-configuration of the “modern” cultural system. The first section of the work examines theoretical problems about the “postmodern phenomenon” by defining its cultural and formal constants in different areas (epistemology, economy, mass media): convergence, fragmentation, manipulation and immersion represent the first, while “excess” is the morphology of the change, realizing the “fluctuation” of the previously consolidated system. The second section classifies the textual and cultural forms of American postmodern film, generally non-horror. The “classic narrative” structure – a coherent and consequent chain of causal cues toward a conclusion – is scattered by the postmodern constant of “fragmentation”. New textual models arise, fragmenting the narrative ones into aggregations of data without causal-temporal logic. Considering the processes of “transcoding” and “remediation” between media, and the principle of “convergence” in the phenomenon, the essay aims to define these structures in postmodern film as “database forms” and “navigable space forms.” The third section applies this classification to the American horror film (1968–1998).
The formal constant of “excess” in the horror genre works on the paradigm of “vision”: if postmodern film shows a crisis of “truth” in the vision, in horror movies the excess of vision becomes “hyper-vision” – that is, the “multiplication” of death/blood/torture visions – and “intra-vision”, which shows the impossibility of distinguishing the “real” vision from the virtual/imaginary. In this perspective, the textual and cultural forms and strategies of postmodern horror film are predominantly: the “database-accumulation” forms, where the events result from a very simple “remote cause” serving as a pretext (as in Night of the Living Dead); and the “database-catalogue” forms, where the events follow one another around a “central” character or theme. Within the catalogue form, the syntagms are connected either by “consecutive” elements, building stories linked by the actions of a single character (usually the killer), or by non-consecutive episodes around a general theme: examples of the first kind are built on the model of The Wizard of Gore; the second, on films such as Mario Bava’s I tre volti della paura. The “navigable space” forms are defined as: hyperlink a, where one universe fluctuates between reality and dream, as in Rosemary’s Baby; hyperlink b, where two non-hierarchical universes converge, one real and the other fictional, as in the Nightmare series; hyperlink c, where several worlds are separate but become contiguous in the last sequence, as in Targets; and the last form, the navigable-loop, a textual line which suddenly stops and starts again, reflecting the pattern of a “loop”, as in Lost Highway. This essay analyses in detail the organization of “visual space” in postmodern horror film by tracing representative patterns. It concludes by examining the “convergence” of the technologies and cognitive structures of cinema and new media.
Abstract:
Multi-Processor SoC (MPSoC) design brings to the foreground a large number of challenges, one of the most prominent of which is the design of the chip interconnection. With the number of on-chip blocks presently ranging in the tens, and quickly approaching the hundreds, the novel issue of how best to provide on-chip communication resources is clearly felt. The scaling down of process technologies has increased process and dynamic variations as well as transistor wearout. As a result, delay variations increase and impact the performance of MPSoCs. The interconnect architecture in MPSoCs becomes a single point of failure, as it connects all other components of the system. A faulty processing element may be shut down entirely, but the interconnect architecture must be able to tolerate partial failure and variations and keep operating, at some performance, power or latency overhead. This dissertation focuses on techniques at different levels of abstraction to address the reliability and variability issues in on-chip interconnection networks. By showing the test results of a GALS NoC testchip, this dissertation motivates the need for techniques to detect and work around manufacturing faults and process variations in the MPSoC interconnection infrastructure. As a physical design technique, we propose the bundle routing framework as an effective way to route the Network-on-Chip global links. At the architecture level, two cases are addressed: (i) intra-cluster communication, for which we propose a low-latency interconnect robust to variability; (ii) inter-cluster communication, for which online functional testing and a reliable NoC configuration are proposed. We also propose dual-Vdd as an orthogonal way of compensating variability at the post-fabrication stage; this is an alternative to the design-time techniques, since it enforces compensation at the post-silicon stage.
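The idea of "working around" a faulty interconnect link can be sketched in miniature: route a packet on a 2D mesh NoC while avoiding links that a test phase has marked faulty. This is only an illustration of the general concept, not the dissertation's bundle routing or NoC configuration framework; the mesh size, the fault set, and the function names are assumptions.

```python
# Hedged sketch: shortest-path routing on an N x N mesh NoC that skips
# links marked faulty.  Purely illustrative of fault workaround routing;
# not the dissertation's actual algorithm.
from collections import deque

def mesh_route(size, src, dst, faulty_links):
    """BFS route from src to dst on a size x size mesh, avoiding faults."""
    def neighbours(node):
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < size and 0 <= ny < size:
                yield (nx, ny)
    frontier, prev = deque([src]), {src: None}
    while frontier:
        node = frontier.popleft()
        if node == dst:                      # rebuild the path backwards
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in neighbours(node):
            link = frozenset((node, nxt))    # links are undirected
            if nxt not in prev and link not in faulty_links:
                prev[nxt] = node
                frontier.append(nxt)
    return None  # destination unreachable: no fault-free path exists

# Link (0,0)-(1,0) failed at manufacturing: the route detours via row 1.
faults = {frozenset(((0, 0), (1, 0)))}
print(mesh_route(4, (0, 0), (3, 0), faults))
```

A faulty link thus costs a bounded latency overhead (a longer path) instead of taking the whole interconnect down, which is the tolerance property the abstract asks of the NoC.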
Abstract:
In this thesis we will see that the DNA sequence is constantly shaped by interactions with its environment at multiple levels, showing footprints of DNA methylation, of its 3D organization and, in the case of bacteria, of the interaction with host organisms. In the first chapter, we will see that, by analyzing the distribution of distances between consecutive dinucleotides of the same type along the sequence, we can detect epigenetic and structural footprints. In particular, we will see that the CG distance distribution makes it possible to distinguish among organisms of different biological complexity, depending on how much CG sites are involved in DNA methylation. Moreover, we will see that CG and TA can be described by the same fitting function, suggesting a relationship between the two. We will also provide an interpretation of the observed trend, simulating a positioning process guided by the presence and absence of memory. Finally, we will focus on the TA distance distribution, characterizing deviations from the trend predicted by the best fitting function and identifying specific patterns that might be related to peculiar mechanical properties of the DNA as well as to epigenetic and structural processes. In the second chapter, we will see how the 3D structure of the DNA can be mapped onto its sequence. In particular, we devised a network-based algorithm that produces a genome assembly starting from its 3D configuration, using Hi-C contact maps as inputs. Specifically, we will see how the different chromosomes can be identified and their sequences reconstructed by exploiting the spectral properties of the Laplacian operator of a network. In the third chapter, we will see a novel network-based method for source clustering and source attribution that identifies host-bacteria interactions starting from the detection of Single-Nucleotide Polymorphisms along bacterial genome sequences.
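The first-chapter measurement, distances between consecutive occurrences of a given dinucleotide, is simple to sketch. This is only a toy illustration of the quantity whose distribution the thesis fits; the example sequence and function name are assumptions, not the thesis code.

```python
# Illustrative sketch: gaps between consecutive occurrences of a
# dinucleotide (e.g. CG) along a sequence.  The thesis studies the
# distribution of these gaps; the toy sequence here is made up.
from collections import Counter

def dinucleotide_distances(seq, dinuc="CG"):
    """Start positions of `dinuc` hits, returned as consecutive gaps."""
    hits = [i for i in range(len(seq) - 1) if seq[i:i + 2] == dinuc]
    return [b - a for a, b in zip(hits, hits[1:])]

seq = "ACGTTACGGCGTA"
print(dinucleotide_distances(seq, "CG"))            # [5, 3]
print(Counter(dinucleotide_distances(seq, "TA")))   # gap histogram for TA
```

On a real genome, the histogram of these gaps (the `Counter` above) is the distance distribution whose shape, per the abstract, separates organisms by how much their CG sites undergo methylation.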
Abstract:
Water Distribution Networks (WDNs) play a vital role in communities, ensuring well-being and supporting economic growth and productivity. The need for greater investment requires design choices that will impact the efficiency of management in the coming decades. This thesis proposes an algorithmic approach to address two related problems: (i) identify the fundamental asset of large WDNs in terms of main infrastructure; (ii) sectorize large WDNs into isolated sectors while respecting the minimum service to be guaranteed to users. Two methodologies have been developed to meet these objectives, and they were subsequently integrated into an overall process that optimizes the sectorized configuration of the WDN while treating problems (i) and (ii) in a single global vision. With regard to problem (i), the methodology developed introduces the concept of the primary network and answers with a dual approach: connecting the main nodes of the WDN in terms of hydraulic infrastructure (reservoirs, tanks, pump stations), and identifying hypothetical paths with minimal energy losses. The primary network thus identified can be used as an initial basis to design the sectors. The sectorization problem (ii) has been addressed using optimization techniques, through the development of a new dedicated Tabu Search algorithm able to deal with real case studies of WDNs. For this purpose, three new large WDN models have been developed to test the capabilities of the algorithm on different and complex real cases. The developed methodology also automatically identifies the deficient parts of the primary network and dynamically includes new edges in order to support a sectorized configuration of the WDN. The application of the overall algorithm to the new real case studies, and to others from the literature, has yielded applicable solutions even in specific complex situations.
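The Tabu Search metaheuristic mentioned above can be sketched on a toy version of the sectorization problem: split nodes into two sectors while balancing their demand. This is only a minimal illustration of the Tabu Search mechanics (best non-tabu move, tabu tenure, best-so-far tracking); the objective, the move, and every name are assumptions, not the thesis's dedicated algorithm or its hydraulic constraints.

```python
# Toy Tabu Search sketch: partition nodes into two sectors minimising
# the absolute demand imbalance.  Illustrative only; the real algorithm
# works on hydraulic WDN models with service constraints.
import random

def tabu_search(demands, iters=200, tenure=5, seed=0):
    rng = random.Random(seed)
    assign = [rng.randint(0, 1) for _ in demands]   # random initial sectors

    def cost(a):
        return abs(sum(d for d, s in zip(demands, a) if s == 0)
                   - sum(d for d, s in zip(demands, a) if s == 1))

    best, best_cost = assign[:], cost(assign)
    tabu = {}  # node -> iteration until which reassigning it is forbidden
    for it in range(iters):
        moves = [n for n in range(len(demands)) if tabu.get(n, -1) < it]
        # pick the best non-tabu single-node reassignment (even if worsening)
        n = min(moves, key=lambda m: cost(assign[:m] + [1 - assign[m]] + assign[m + 1:]))
        assign[n] = 1 - assign[n]
        tabu[n] = it + tenure          # forbid undoing this move for a while
        if cost(assign) < best_cost:
            best, best_cost = assign[:], cost(assign)
    return best, best_cost

sectors, imbalance = tabu_search([7, 3, 5, 4, 6, 2, 8, 1])
print(imbalance)  # residual demand imbalance between the two sectors
```

Accepting the best non-tabu move even when it worsens the cost, while the tabu list blocks immediate backtracking, is what lets Tabu Search escape the local optima that a pure greedy sectorization would get stuck in.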