759 results for Analytic Network Process (ANP)
Abstract:
The production of artistic prints in the sixteenth- and seventeenth-century Netherlands was an inherently social process. Turning out prints at any reasonable scale depended on fluid coordination between designers, plate cutters, and publishers, roles that, by the sixteenth century, were considered distinguished enough to merit distinct credits engraved on the plates themselves: invenit, fecit/sculpsit, and excudit. While any one designer, plate cutter, or publisher could potentially exercise a great deal of influence over the production of a single print, their individual decisions (Whom to select as an engraver? What subjects to create for a print design? What market to sell to?) would have been variously constrained or encouraged by their position in this larger network (Whom do they already know? And whom, in turn, do their contacts know?). This dissertation addresses the impact of these constraints and affordances through a novel application of computational social network analysis to major databases of surviving prints from this period. This approach is used to evaluate several questions about trends in early modern print production that traditional, case-study-based literature has not satisfactorily addressed: Did the social capital demanded by print production result in centralized or distributed production of prints? When, and to what extent, did printmakers and publishers in the Low Countries favor international over domestic collaborators? And were printmakers under the same pressure as painters to specialize in particular artistic genres? This dissertation ultimately suggests how simple professional incentives endemic to the practice of printmaking may, at large scales, have resulted in quite complex patterns of collaboration and production. The framework of network analysis surfaces the role of certain printmakers who tend to be neglected in aesthetically focused histories of art. It also highlights important issues concerning how art historians balance individual influence against the impact of longue durée trends. Finally, this dissertation raises questions about the current limitations and future possibilities of combining computational methods with cultural heritage datasets in the pursuit of historical research.
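To make the approach concrete, here is a minimal sketch of how collaboration records of this kind might be modeled as a social network in Python with the networkx library. The records, names, and the choice of betweenness centrality are illustrative assumptions, not the dissertation's actual data or pipeline.

```python
# A toy collaboration network built from per-print credit records.
# Each record is (designer, plate cutter, publisher), i.e. the roles
# credited as invenit, sculpsit, and excudit. Names are hypothetical.
import networkx as nx

records = [
    ("Designer_A", "Cutter_X", "Publisher_P"),
    ("Designer_A", "Cutter_Y", "Publisher_P"),
    ("Designer_B", "Cutter_X", "Publisher_Q"),
]

G = nx.Graph()
for inv, sculp, exc in records:
    # Link everyone credited on the same plate.
    G.add_edge(inv, sculp)
    G.add_edge(sculp, exc)
    G.add_edge(inv, exc)

# Betweenness centrality surfaces brokers who connect otherwise
# separate production circles, often not the most famous names.
for name, score in sorted(nx.betweenness_centrality(G).items(),
                          key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
```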
Abstract:
The permeability of a rock is a dynamic property that varies spatially and temporally. Fractures provide the most efficient channels for fluid flow and thus contribute directly to the permeability of the system. Fractures usually form as a result of a combination of tectonic stresses, gravity (i.e. lithostatic pressure), and fluid pressures. High pressure gradients alone can cause fracturing, a process termed hydrofracturing, which can determine caprock (seal) stability or reservoir integrity. Fluids also transport mass and heat, and are responsible for the formation of veins by precipitating minerals within open fractures. Veining (healing) thus directly influences the rock's permeability. Upon deformation, these closed fractures (veins) can refracture and the cycle starts again. This fracturing-healing-refracturing cycle is fundamental to studying the deformation dynamics and permeability evolution of rock systems. Such studies are generally accompanied by fracture network characterization, focusing on the network topology that determines connectivity. Fracture characterization yields quantitative and qualitative data on fractures and forms an important part of reservoir modeling. This thesis highlights the importance of fracture healing and of veins' mechanical properties for deformation dynamics. It shows that permeability varies spatially and temporally, and that healed systems (veined rocks) should not be treated as fractured systems (rocks without veins). Field observations also demonstrate the influence of contrasting mechanical properties, in addition to the complexities of vein microstructures that can form in low-porosity, low-permeability layered sequences. The thesis also presents graph theory as a characterization method for obtaining statistical measures of evolving network connectivity, and proposes the measures a good reservoir should exhibit to combine potentially large permeability with robustness against healing. The results presented in the thesis have applications in hydrocarbon and geothermal reservoir exploration, mining, underground waste disposal, CO2 injection, and groundwater modeling.
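As an illustration of this graph-theoretic characterization, the following minimal sketch treats fracture intersections as nodes and fracture segments as edges, then tracks connectivity measures as segments are "healed". The grid geometry and random healing rule are assumptions for demonstration, not the thesis's field data.

```python
# Toy fracture network: a 2D grid stands in for a mapped trace set.
# Healing (vein sealing) is modeled as random edge removal.
import random
import networkx as nx

random.seed(0)
G = nx.grid_2d_graph(10, 10)

def connectivity_report(g):
    comps = nx.connected_components(g)
    largest = max(len(c) for c in comps) / g.number_of_nodes()
    avg_deg = 2 * g.number_of_edges() / g.number_of_nodes()
    return avg_deg, largest

for healed_fraction in (0.0, 0.2, 0.4, 0.6):
    h = G.copy()
    kill = random.sample(list(h.edges()),
                         int(healed_fraction * G.number_of_edges()))
    h.remove_edges_from(kill)  # sealed veins no longer carry flow
    avg_deg, largest = connectivity_report(h)
    print(f"healed {healed_fraction:.0%}: mean degree {avg_deg:.2f}, "
          f"largest cluster spans {largest:.0%} of nodes")
```

A network whose largest connected cluster degrades slowly under such edge removal is, in this sense, robust against healing.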
Abstract:
This thesis presents approximation algorithms for some NP-Hard combinatorial optimization problems on graphs and networks; in particular, we study problems related to Network Design. Under the widely-believed complexity-theoretic assumption that P is not equal to NP, there are no efficient (i.e., polynomial-time) algorithms that solve these problems exactly. Hence, if one desires efficient algorithms for such problems, it is necessary to consider approximate solutions: An approximation algorithm for an NP-Hard problem is a polynomial-time algorithm which, for any instance of the problem, finds a solution whose value is guaranteed to be within a multiplicative factor of the value of an optimal solution to that instance. We attempt to design algorithms for which this factor, referred to as the approximation ratio of the algorithm, is as small as possible. The field of Network Design comprises a large class of problems that deal with constructing networks of low cost and/or high capacity, routing data through existing networks, and many related issues. In this thesis, we focus chiefly on designing fault-tolerant networks. Two vertices u,v in a network are said to be k-edge-connected if deleting any set of k − 1 edges leaves u and v connected; similarly, they are k-vertex-connected if deleting any set of k − 1 other vertices or edges leaves u and v connected. We focus on building networks that are highly connected, meaning that even if a small number of edges or nodes fail, the remaining nodes will still be able to communicate. A brief description of some of our results is given below. We study the problem of building 2-vertex-connected networks that are large and have low cost. Given an n-node graph with costs on its edges and any integer k, we give an O(log n log k)-approximation for the problem of finding a minimum-cost 2-vertex-connected subgraph containing at least k nodes. We also give an algorithm of similar approximation ratio for maximizing the number of nodes in a 2-vertex-connected subgraph subject to a budget constraint on the total cost of its edges. Our algorithms are based on a pruning process that, given a 2-vertex-connected graph, finds a 2-vertex-connected subgraph of any desired size and of density comparable to the input graph, where the density of a graph is the ratio of its cost to the number of vertices it contains. This pruning algorithm is simple and efficient, and is likely to find additional applications. Recent breakthroughs on vertex-connectivity have made use of algorithms for element-connectivity problems. We develop an algorithm that, given a graph with some vertices marked as terminals, significantly simplifies the graph while preserving the pairwise element-connectivity of all terminals; in fact, the resulting graph is bipartite. We believe that our simplification/reduction algorithm will be a useful tool in many settings. We illustrate its applicability by giving algorithms to find many trees that each span a given terminal set, while being disjoint on edges and non-terminal vertices; such problems have applications in VLSI design and other areas. We also use this reduction algorithm to analyze simple algorithms for single-sink network design problems with high vertex-connectivity requirements; we give an O(k log n)-approximation for the problem of k-connecting a given set of terminals to a common sink.
We study similar problems in which different types of links, of varying capacities and costs, can be used to connect nodes; assuming there are economies of scale, we give algorithms to construct low-cost networks with sufficient capacity or bandwidth to simultaneously support flow from each terminal to the common sink along many vertex-disjoint paths. We further investigate capacitated network design, where edges may have arbitrary costs and capacities. Given a connectivity requirement R_uv for each pair of vertices u,v, the goal is to find a low-cost network which, for each such pair, can support a flow of R_uv units of traffic between u and v. We study several special cases of this problem, giving both algorithmic and hardness results. In addition to Network Design, we consider certain Traveling Salesperson-like problems, where the goal is to find short walks that visit many distinct vertices. We give a (2 + epsilon)-approximation for Orienteering in undirected graphs, achieving the best known approximation ratio, and the first approximation algorithm for Orienteering in directed graphs. We also give improved algorithms for Orienteering with time windows, in which vertices must be visited between specified release times and deadlines, and for other related problems. These problems are motivated by applications in the fields of vehicle routing, delivery and transportation of goods, and robot path planning.
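The connectivity definitions above can be checked directly with max-flow computations; the following small sketch does so with Python's networkx, as an illustration of the definitions rather than of the thesis's approximation algorithms (unit edge costs are assumed for the density example).

```python
import networkx as nx

G = nx.cycle_graph(6)   # on a cycle, every pair is 2-edge- and 2-vertex-connected
G.add_edge(0, 3)        # a chord raises connectivity for some pairs

# u, v are k-edge-connected iff edge_connectivity(G, u, v) >= k,
# and k-vertex-connected iff node_connectivity(G, u, v) >= k.
print(nx.edge_connectivity(G, 0, 3))  # 3: the chord plus both cycle arcs
print(nx.node_connectivity(G, 1, 4))  # 2: only two internally disjoint paths

# "Density" in the pruning process: cost of a subgraph divided by the
# number of vertices it contains (here, cost = number of edges).
H = G.subgraph([0, 1, 2, 3])
print(H.number_of_edges() / H.number_of_nodes())  # 4 edges / 4 nodes = 1.0
```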
Abstract:
The digital revolution of the 21st century contributed to the emergence of the Internet of Things (IoT). Trillions of embedded devices using the Internet Protocol (IP), also called smart objects, will become an integral part of the Internet. To support such an extremely large address space, a new Internet Protocol, Internet Protocol version 6 (IPv6), is being adopted. IPv6 over Low-Power Wireless Personal Area Networks (6LoWPAN) has accelerated the integration of wireless sensor networks (WSNs) into the Internet. At the same time, the Constrained Application Protocol (CoAP) has made it possible to provide resource-constrained devices with RESTful Web service functionality. This work builds upon previous experience in street lighting networks, for which a proprietary protocol, devised by the Lighting Living Lab, was implemented and used for several years. The proprietary protocol runs on a broad range of lighting control boards. In order to support heterogeneous applications with more demanding communication requirements and to improve the application development process, it was decided to port the Contiki OS to the four-channel LED driver (4LD) board from Globaltronic. This thesis describes the work done to adapt the Contiki OS to support the Microchip PIC24FJ128GA308 microprocessor and presents an IP-based solution to integrate sensors and actuators in smart lighting applications. Besides detailing the system's architecture and implementation, this thesis presents multiple results showing that the performance of CoAP-based resource retrieval in constrained nodes is adequate for supporting networking services in street lighting networks.
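For illustration, a CoAP resource retrieval of the kind evaluated here can be sketched with the Python aiocoap library; the node address and the /sensors/lux resource path are hypothetical, and the thesis's nodes run Contiki firmware rather than Python.

```python
# Minimal CoAP GET sketch (client side) using aiocoap.
# The IPv6 address and resource path below are illustrative only.
import asyncio
from aiocoap import Context, Message, GET

async def main():
    ctx = await Context.create_client_context()
    # Hypothetical 6LoWPAN node exposing a light-sensor resource.
    req = Message(code=GET, uri="coap://[2001:db8::1]/sensors/lux")
    resp = await ctx.request(req).response
    print(resp.code, resp.payload.decode())

asyncio.run(main())
```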
Abstract:
Hematopoiesis is the tightly controlled and complex process in which the entire blood system is formed and maintained by a rare pool of hematopoietic stem cells (HSCs); its dysregulation results in leukaemia. TRIB2, a member of the Tribbles family of serine/threonine pseudokinases, has been implicated in a variety of cancers and is a potent murine oncogene that induces acute myeloid leukaemia (AML) in vivo via modulation of the essential myeloid transcription factor CCAAT/enhancer-binding protein α (C/EBPα). C/EBPα, which is crucial for myeloid cell differentiation, is commonly dysregulated in a variety of cancers, including AML. Two isoforms of C/EBPα exist: the full-length p42 isoform and the truncated, oncogenic p30 isoform. TRIB2 has been shown to selectively degrade the p42 isoform of C/EBPα and induce p30 expression in AML. In this study, overexpression of the p30 isoform in a bone marrow transplant (BMT) model led to perturbation of myelopoiesis, and in the presence of physiological levels of p42 this oncogene exhibited weak transformative ability. BMT experiments also showed that, despite their degradative relationship, expression of C/EBPα was essential for TRIB2-mediated leukaemia. A conditional mouse model was used to demonstrate that oncogenic p30 cooperates with TRIB2 to reduce disease latency, but only in the presence of p42. At the molecular level, a ubiquitination assay showed that TRIB2 degrades p42 by K48-linked proteasomal ubiquitination and was unable to ubiquitinate p30. Mutation of a critical lysine residue in the C-terminus of C/EBPα abrogated TRIB2-mediated C/EBPα ubiquitination, suggesting that this site, which is frequently mutated in AML, is the site at which TRIB2 mediates its degradative effects. The TRIB2-C/EBPα axis was effectively targeted by proteasome inhibition. AML is a very difficult disease to target therapeutically because of the extensive array of chromosomal translocations and genetic aberrations that contribute to the disease. The cell from which a specific leukaemia arises, the leukaemia-initiating cell (LIC), can affect the phenotype and chemotherapeutic response of the resultant disease. The LIC has been elucidated for some common oncogenes but is unknown for TRIB2. The data presented in this thesis investigate the ability of the oncogene TRIB2 to transform hematopoietic stem and progenitor cells in vitro and in vivo. TRIB2 overexpression conferred serial replating ability in vitro on all stem and progenitor cells studied. Upon transplantation, however, only TRIB2-overexpressing HSCs and granulocyte/macrophage progenitors (GMPs) generated leukaemia in vivo. TRIB2 induced a mature myeloid leukaemia from the GMP and a mixed-lineage leukaemia from the HSC. The role of TRIB2 in steady-state hematopoiesis was therefore also explored using a Trib2-/- mouse, and it was determined that loss of Trib2 had no effect on lineage distribution in the hematopoietic compartment under steady-state conditions. The process of hematopoiesis is controlled by a host of lineage-restricted transcription factors. Recently, members of the Nuclear Factor 1 family of transcription factors (NFIA, NFIB, NFIC and NFIX) have been implicated in hematopoiesis. Little is known about the role of NFIX in lineage determination. Here we describe a novel role for NFIX in lineage fate determination. In human and murine datasets, the expression of Nfix was shown to decrease as cells differentiated along the lymphoid pathway.
NFIX overexpression resulted in enhanced myelopoiesis in vivo and in vitro and a block in B cell development at the pre-pro-B cell stage. Loss of NFIX resulted in disruption of myeloid and lymphoid differentiation in vivo. These effects on stem and progenitor cell fate correlated with changes in the expression levels of key transcription factors involved in hematopoietic differentiation, including a 15-fold increase in Cebpa expression in Nfix-overexpressing cells. The data presented support a role for NFIX as an important transcription factor influencing hematopoietic lineage specification. The identification of NFIX as a novel transcription factor influencing lineage determination will prompt further study of its role in hematopoiesis and contribute to a better understanding of the process of differentiation. Elucidating the relationship between TRIB2 and C/EBPα not only deepens our understanding of the pathophysiology of AML but is also relevant to other cancer types, including lung and liver cancer. In summary, the data presented in this thesis provide important insights into key areas that will facilitate the development of future therapeutic approaches in cancer treatment.
Abstract:
Many existing encrypted Internet protocols leak information through packet sizes and timing. Though such leakage may seem innocuous, prior work has shown that it can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of such critical services as e-commerce, remote login, and anonymity networks, together with the increasing feasibility of attacks on these services, represents a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements. Furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is the use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic, further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process, including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect TrafficMimic. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable to, and in some cases more efficient than, generic constant-rate defenses. We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate parameters for an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, this model lets us better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that match the cover traffic to the real traffic while bounding external information leakage. We study these bias methods using simulation and evaluate their security using a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic analysis and defense detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide a 3-5x improvement in bandwidth for bulk transfers and a 2.5-9.5x speedup for Web browsing over tunneling without biasing.
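The core cover-traffic idea can be sketched in a few lines: the observable packet schedule is dictated entirely by a cover model, and real bytes ride inside whatever packets the model emits. This is a minimal simulation of that shaping logic, not TrafficMimic's implementation; the traffic model and payload are illustrative assumptions.

```python
# Real traffic embedded in model-driven cover traffic (toy simulation).
import random
from collections import deque

random.seed(1)
# Pretend application data waiting to be sent (illustrative payload).
real_queue = deque(b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n" * 20)

def cover_model():
    # Stand-in for a realistic protocol model: (inter-arrival s, size B).
    while True:
        yield random.expovariate(20.0), random.choice([64, 512, 1460])

clock = 0.0
for gap, size in cover_model():
    clock += gap
    take = min(size, len(real_queue))
    real = bytes(real_queue.popleft() for _ in range(take))
    pad = size - len(real)  # after encryption, padding is indistinguishable
    print(f"t={clock:.3f}s send {size}B ({len(real)}B real, {pad}B pad)")
    if not real_queue:
        break  # a real system would keep emitting pure cover traffic here
```

The key property is that packet times and sizes depend only on the cover model, so an observer learns nothing from them about the embedded traffic.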
Abstract:
This PhD thesis is an empirical research project in the field of modern Polish history. The thesis focuses on Solidarity, the Network, and the idea of workers' self-management, and is based on an in-depth analysis of Solidarity archival material. The Solidarity trade union was born in August 1980 after talks between the communist government and strike leaders at the Gdansk Lenin Shipyards. In 1981 a group called the Network emerged out of cooperation between Poland's great industrial factory plants. The Network grew out of Solidarity; it was made up of Solidarity activists, and the group acted as an economic partner to the union. The Network was the base of a grass-roots, nationwide workers' self-management movement. Solidarity and the self-management movement were crushed by the imposition of Martial Law in December 1981. Solidarity revived itself immediately, and the union created an underground society. The Network also revived underground, and it continued to promote self-management activity where possible. When Solidarity regained its legal status in April 1989, workers' self-management no longer had the same importance in the union. Solidarity's new politico-economic strategy focused on free markets, foreign investment, and privatization. This research project ends in July 1990, when the new Solidarity-backed government enacted a privatization law. The government decided to transform the property ownership structure through a centralized privatization process, which was a blow to supporters of workers' self-management. This PhD thesis provides new insight into the evolution of the Solidarity union from 1980 to 1990 by analyzing the fate of workers' self-management. The project also examines the role of the Network throughout the 1980s, and analyzes the important link between workers' self-management and the core ideas of Solidarity. In addition, the link between political and economic reform is an important theme in this research project: the Network was aware that authentic workers' self-management required reforms to the authoritarian political system. Workers' self-management competed against other politico-economic ideas during the 1980s in Poland, and the outcome of this competition between different reform concepts has shaped modern-day Polish politics, economics, and society.
Abstract:
This paper considers a stochastic SIR (susceptible-infective-removed) epidemic model in which individuals may make infectious contacts in two ways, both within 'households' (which for ease of exposition are assumed to have equal size) and along the edges of a random graph describing additional social contacts. Heuristically motivated branching process approximations are described, which lead to a threshold parameter for the model and to methods for calculating the probability of a major outbreak, given few initial infectives, and the expected proportion of the population ultimately infected by such a major outbreak. These approximate results are shown to be exact as the number of households tends to infinity, by proving associated limit theorems. Moreover, simulation studies indicate that the asymptotic results provide good approximations for modestly sized finite populations. The extension to households of unequal size is discussed briefly.
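The two-level contact structure is easy to simulate directly, which is how such asymptotic results are typically checked. Below is a minimal Monte Carlo sketch, assuming illustrative infection probabilities and a generation-based (Reed-Frost-style) dynamic rather than the paper's exact model and parameters.

```python
# Households of equal size plus a random graph of additional contacts.
import random
import networkx as nx

random.seed(2)
H, SIZE = 200, 4                     # number of households x household size
N = H * SIZE
p_house, p_edge = 0.4, 0.15          # per-contact infection probabilities

G = nx.fast_gnp_random_graph(N, 3.0 / N, seed=2)  # social-contact graph

def final_size():
    infective, removed = {random.randrange(N)}, set()
    while infective:
        new = set()
        for i in infective:
            mates = range(i // SIZE * SIZE, (i // SIZE + 1) * SIZE)
            contacts = [(j, p_house) for j in mates if j != i]   # household
            contacts += [(j, p_edge) for j in G[i]]              # graph edges
            for j, p in contacts:
                if j not in infective and j not in removed and random.random() < p:
                    new.add(j)
        removed |= infective   # infectives recover after one generation
        infective = new
    return len(removed)

sizes = [final_size() for _ in range(200)]
print("fraction of runs with a major outbreak (>10% infected):",
      sum(s > 0.1 * N for s in sizes) / len(sizes))
```

Repeating the run over many seeds estimates both the major-outbreak probability and the expected final size, the two quantities the branching process approximations target.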
Abstract:
No other technology has affected daily life as profoundly, or been adopted as quickly, as the mobile phone. At the same time, mobile media has developed into a serious marketing tool for all kinds of businesses, and the industry has grown explosively in recent years. The objective of this thesis is to examine the mobile marketing process of an international event. This thesis is a qualitative case study. The chosen case is the mobile marketing process of the Falun2015 FIS Nordic World Ski Championships, selected owing to the researcher's interest in the topic and contacts with people around the event. The empirical findings were acquired by conducting two interviews with three experts from the case organisation and its partner organisation. The interviews were performed as semi-structured interviews based on themes arising from the chosen theoretical framework. The framework distinguishes six phases in the process: (i) campaign initiation, (ii) campaign design, (iii) campaign creation, (iv) permission management, (v) delivery, and (vi) evaluation and analysis. Phases one and five were not examined in this thesis: campaign initiation was not seen purely as part of campaign implementation, and investigating phase five would have required a very technical viewpoint. In addition to the interviews, some pre-established documents were used as supporting data. The empirical findings of this thesis largely follow the theoretical framework, although some modifications to the model could be made, mainly related to the order of the phases. In the revised model, the actions are categorised according to when they should be conducted, i.e. before, during, or after the event. Regardless of this categorisation, the phases can occur in a different order and overlap. In addition, the business network was strongly emphasised in the empirical findings and is therefore added to the modified model. Five managerial recommendations can be drawn from the empirical findings of this thesis: (i) the importance of a business network should be highly valued in a mobile marketing process; (ii) clear goals should be defined for mobile marketing actions so that everyone involved is aware of them; (iii) interactivity should be perceived as part of mobile marketing communication; (iv) enough time should be allowed for the development of a mobile marketing process in order to exploit all the potential it can offer; and (v) attention should be paid to measuring and analysing matters of relevance.