963 results for network effects
Abstract:
As defined in the ATM 2000+ Strategy (Eurocontrol 2001), the mission of the Air Traffic Management (ATM) System is: "For all the phases of a flight, the ATM system should facilitate a safe, efficient, and expeditious traffic flow, through the provision of adaptable ATM services that can be dimensioned in relation to the requirements of all the users and areas of the European airspace. The ATM services should comply with the demand, be compatible, operate under uniform principles, respect the environment, and satisfy the national security requirements." The objective of this paper is to present a methodology designed to evaluate the status of the ATM system in terms of the relationship between offered capacity and traffic demand, identifying areas of weakness and proposing solutions. The first part of the methodology covers the characterization and evaluation of the current system, while the second part proposes an approach to analyze its possible development limit. As part of the work, general criteria are established to define the framework in which the analysis and diagnostic methodology is placed. They are: the use of Air Traffic Control (ATC) sectors as the unit of analysis, the presence of network effects, the tactical focus, the relative character of the analysis, objectivity, and a high-level assessment that allows assumptions about the human and Communications, Navigation and Surveillance (CNS) elements, considered the typical high-density air traffic resources. The steps of the methodology start with the definition of indicators and metrics, such as the nominal criticality or the nominal efficiency of a sector; scenario characterization, where the necessary data are collected; network effects analysis, to study the relations among the constitutive elements of the ATC system; diagnosis by means of the "System Status Diagram"; analytical study of the ATC system development limit; and finally, formulation of conclusions and proposals for improvement. This methodology was employed by Aena (Spanish Airports Manager and Air Navigation Service Provider) and INECO (Spanish Transport Engineering Company) in the analysis of the Spanish ATM system within the Spanish airspace capacity sustainability program, although it could be applied elsewhere.
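To make the indicator idea concrete, here is a minimal sketch in Python; the demand-over-capacity definition of nominal criticality, the threshold, and the sector names are illustrative assumptions, not the paper's actual formulas:

```python
from dataclasses import dataclass

@dataclass
class Sector:
    """An ATC sector, the unit of analysis."""
    name: str
    declared_capacity: int   # flights/hour the sector can handle
    peak_demand: int         # flights/hour at the busiest period

def nominal_criticality(sector: Sector) -> float:
    """Illustrative indicator: demand pressure relative to offered capacity."""
    return sector.peak_demand / sector.declared_capacity

def system_status(sectors: list[Sector], threshold: float = 1.0) -> dict[str, str]:
    """Toy 'System Status Diagram': flag sectors whose demand exceeds capacity."""
    return {
        s.name: "critical" if nominal_criticality(s) > threshold else "nominal"
        for s in sectors
    }

sectors = [Sector("LECM-N", 40, 46), Sector("LECB-E", 38, 31)]
print(system_status(sectors))  # {'LECM-N': 'critical', 'LECB-E': 'nominal'}
```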
Abstract:
In this paper we present the main trends in the evolution of the ICT sector. Its dynamics, supported by constant technical progress in ICs and compounded by "non-convexities" such as network effects and high sunk costs, may lead either to a Schumpeter Mark I or a Schumpeter Mark II competition regime. This means that in some segments the market will be more competitive (Mark I), while in others it will be more monopolistic (Mark II). Another key trend is the so-called "convergence": digitization makes it cost-effective to integrate different communications, information processing, and entertainment systems and devices. Hence, Schumpeter Mark II grows at the core, where software production dominates, while Schumpeter Mark I is established at the periphery. In this context, the European ICT industry is potentially squeezed between two forces: the cost advantages of Asian countries on the one hand, and the inventiveness and dynamism of the US industry on the other. The way out of this very difficult situation is to create in Europe the conditions for restoring knowledge accumulation in a key sub-sector of ICT, namely software production. To do this, Europe can rely on its tradition of cooperation and knowledge sharing and on a set of institutions that have shown their ability to stimulate inter-regional cooperation. By concentrating on an ambitious project of open-source software production in embedded systems and home networks, Europe could reach several objectives: make an essential facility freely accessible, stimulate competition, help reach the Lisbon objectives, and restore European competitiveness in ICT.
Abstract:
We investigate knowledge exchange among commercial organizations, the rationale behind it, and its effects on the market. Knowledge exchange is known to be beneficial for industry, but to explain it, authors have relied on high-level concepts like network effects, reputation, and trust. We attempt to formalize a plausible and elegant explanation of how and why companies adopt information exchange and why it benefits the market as a whole when this happens. This explanation is based on a multiagent model that simulates a market of software providers. Even though the model does not include any high-level concepts, information exchange naturally emerges during simulations as a successful, profitable behavior. The conclusions reached by this agent-based analysis are twofold: 1) a straightforward set of assumptions is enough to give rise to exchange in a software market, and 2) knowledge exchange is shown to increase the efficiency of the market.
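A minimal agent-based sketch of a software-provider market in this spirit; the agent attributes, the sharing rule, and the payoff are assumptions made for illustration, not the authors' actual model:

```python
import random

random.seed(1)

class Provider:
    """A software provider holding a set of 'knowledge' items."""
    def __init__(self, name):
        self.name = name
        self.knowledge = {random.randrange(20) for _ in range(5)}
        self.shares = random.random() < 0.5  # strategy: exchange or hoard
        self.profit = 0.0

def step(providers):
    # Sharing agents pool knowledge pairwise (the exchange behavior).
    sharers = [p for p in providers if p.shares]
    for a in sharers:
        for b in sharers:
            a.knowledge |= b.knowledge
    # Revenue grows with the breadth of knowledge an agent can sell.
    for p in providers:
        p.profit += len(p.knowledge)

providers = [Provider(f"firm-{i}") for i in range(6)]
for _ in range(10):
    step(providers)

for p in providers:
    print(p.name, "shares" if p.shares else "hoards", round(p.profit, 1))
```

In runs like this, sharing agents accumulate broader knowledge and thus higher profit, which is the kind of emergent advantage of exchange the paper describes.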
Abstract:
This paper presents an analysis of whether a consumer's decision to switch from one mobile phone provider to another is driven by individual consumer characteristics or by the actions of other consumers in her social network. Such consumption interdependencies are estimated using a unique dataset, which contains transaction data based on anonymized call records from a large European mobile phone carrier to approximate a consumer's social network. Results show that network effects have an important impact on consumers' switching decisions: switching decisions are interdependent between consumers who interact with each other, and this interdependence increases with the closeness between two consumers as measured by the calling data. In other words, if a subscriber switches carriers, she also affects the switching probabilities of other individuals in her social circle. The paper argues that such an approach is highly relevant both to switching of providers and to the adoption of new products.
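One hedged way to express such interdependence is a logistic switching model in which a contact's influence is weighted by closeness; the functional form and coefficients below are illustrative assumptions, not the paper's estimated specification:

```python
import math

def switch_probability(base_utility: float,
                       neighbor_closeness: list[float],
                       neighbor_switched: list[bool],
                       peer_effect: float = 1.2) -> float:
    """Logistic probability of switching carriers.

    base_utility      : individual characteristics folded into one score
    neighbor_closeness: tie strength per contact (e.g., normalized call volume)
    neighbor_switched : whether each contact already switched
    peer_effect       : weight of the social (network-effect) term
    """
    social = sum(c for c, s in zip(neighbor_closeness, neighbor_switched) if s)
    z = base_utility + peer_effect * social
    return 1.0 / (1.0 + math.exp(-z))

# A subscriber whose two closest contacts switched is far more likely to follow.
print(switch_probability(-2.0, [0.9, 0.7, 0.1], [True, True, False]))    # ~0.48
print(switch_probability(-2.0, [0.9, 0.7, 0.1], [False, False, False]))  # ~0.12
```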
Abstract:
The lack of analytical models that can accurately describe large-scale networked systems makes empirical experimentation indispensable for understanding complex behaviors. Research on network testbeds for testing network protocols and distributed services, including physical, emulated, and federated testbeds, has made steady progress. Although the success of these testbeds is undeniable, they fail to provide: 1) scalability, for handling large-scale networks with hundreds or thousands of hosts and routers organized in different scenarios; 2) flexibility, for testing new protocols or applications in diverse settings; and 3) interoperability, for combining simulated and real network entities in experiments. This dissertation tackles these issues along three different dimensions. First, we present SVEET, a system that enables interoperability between real and simulated hosts. To increase the scalability of the networks under study, SVEET enables time-dilated synchronization between real hosts and the discrete-event simulator. Realistic TCP congestion control algorithms are implemented in the simulator to allow seamless interactions between real and simulated hosts. SVEET is validated via extensive experiments, and its capabilities are assessed through case studies involving real applications. Second, we present PrimoGENI, a system that allows a distributed discrete-event simulator, running in real time, to interact with real network entities in a federated environment. PrimoGENI greatly enhances the flexibility of network experiments, through which a great variety of network conditions can be reproduced to examine what-if questions. Furthermore, PrimoGENI performs resource management functions, on behalf of the user, for instantiating network experiments on shared infrastructures. Finally, to further increase the scalability of network testbeds to handle large-scale, high-capacity networks, we present a novel symbiotic simulation approach. We present SymbioSim, a testbed for large-scale network experimentation in which a high-performance simulation system closely cooperates with an emulation system in a mutually beneficial way. On the one hand, the simulation system benefits from incorporating traffic metadata from real applications in the emulation system to reproduce realistic traffic conditions. On the other hand, the emulation system benefits from receiving continuous updates from the simulation system to calibrate the traffic between real applications. Specific techniques that support the symbiotic approach include: 1) a model downscaling scheme that can significantly reduce the complexity of the large-scale simulation model, resulting in an efficient emulation system for modulating the high-capacity network traffic between real applications; 2) a queuing network model for the downscaled emulation system to accurately represent the network effects of the simulated traffic; and 3) techniques for reducing the synchronization overhead between the simulation and emulation systems.
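A minimal sketch of the time-dilation idea behind SVEET's synchronization, assuming a simple constant dilation factor; the class and its interface are illustrative, not SVEET's actual API:

```python
import time

class DilatedClock:
    """Maps real (wall-clock) time to virtual time slowed by a dilation factor.

    With factor=10, one virtual second takes ten real seconds, giving the
    discrete-event simulator 10x more real time to keep pace with real hosts.
    """
    def __init__(self, factor: float):
        self.factor = factor
        self.start = time.monotonic()

    def virtual_now(self) -> float:
        return (time.monotonic() - self.start) / self.factor

    def sleep_virtual(self, dt: float) -> None:
        """Block a real host for dt virtual seconds."""
        time.sleep(dt * self.factor)

clock = DilatedClock(factor=10.0)
clock.sleep_virtual(0.05)             # 0.05 virtual s == 0.5 real s
print(round(clock.virtual_now(), 2))  # ~0.05
```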
Abstract:
Thanks to the growth, expansion, and popularization of the World Wide Web, its technological development has a growing importance in society. The symbiosis between these two environments has fostered a greater social influence on the platform's innovations and a much more practical focus. Our objective in this article is to describe, characterize, and analyze the emergence and diffusion of the new hypertext standard that governs the Web: HTML5. At the same time, we explore this process in the light of several theories that bring together technology and society. We devote special attention to the users of the World Wide Web and to their generic use of Social Media. We suggest that the development of web standards is influenced by the everyday use of this new type of technologies and applications.
Abstract:
International regimes are composite historical constructions. They are built up through bricolage, as resource-strapped officials combine operational capacities, frequently turning to outside assistance. Who wins and loses, and why, when organisations are added or subtracted? What happens when inter-organisational relations are recalibrated? Why do regimes cohere as they do? By comparing the development of financial-regulatory regimes and probing other illustrative cases, I offer an explanatory framework that emphasizes the importance of timing and sequencing in determining outcomes. Thinking beyond interstate network effects and switching costs, I distil new data and theoretical insights into how and why temporality matters in global politics. I find that time structures the strategic bargaining contexts that mediate the intense distributional struggles between organisations driving key institutional reforms. The explanatory power of this framework upsets the conventional wisdom whereby the distribution of state power, and the dynamics of interstate bargaining, are assumed to be the critical sources of institutional reform.
Abstract:
Phenotypic plasticity can increase tolerance to heterogeneous environments but the elevations and slopes of reaction norms are often population specific. Disruption of locally adapted reaction norms through outcrossing can lower individual viability. Here, we sampled five genetically distinct populations of brown trout (Salmo trutta) from within a river network, crossed them in a full-factorial design, and challenged the embryos with the opportunistic pathogen Pseudomonas fluorescens. By virtue of our design, we were able to disentangle effects of genetic crossing distance from sire and dam effects on early life-history traits. While pathogen infection did not increase mortality, it was associated with delayed hatching of smaller larvae with reduced yolk sac reserves. We found no evidence of a relationship between genetic distance (W, FST) and the expression of early-life history traits. Moreover, hybrids did not differ in phenotypic means or reaction norms in comparison to offspring from within-population crosses. Heritable variation in early life-history traits was found to remain stable across the control and pathogen environments. Our findings show that outcrossing within a rather narrow geographical scale can have neutral effects on F1 hybrid viability at the embryonic stage, i.e. at a stage when environmental and genetic effects on phenotypes are usually large.
Abstract:
Higher risk for long-term behavioral and emotional sequelae, with attentional problems (with or without hyperactivity), is now becoming one of the hallmarks of extremely premature (EP) birth and of birth after pregnancy conditions leading to intrauterine growth restriction (IUGR) [1,2]. However, little is known so far about the neurostructural basis of these complex functional brain abnormalities, which seem to have their origins in early critical periods of brain development. The development of cortical axonal pathways happens in a series of sequential events. The preterm phase (24-36 postconceptional weeks, PCW) is known to be crucial for the growth of the thalamocortical fiber bundles as well as for the development of long association, commissural, and projection fibers [3]. It is thus logical to expect that exposure to an altered intrauterine environment (altered nutrition), or to the extrauterine environment earlier than expected, leads to alterations in structural organization and, consequently, alters the underlying white matter (WM) structure. Understanding the rate and variability of normal brain development, and detecting differences from typical development, may offer insight into neurodevelopmental anomalies that can be imaged at later stages. Owing to its unique ability to non-invasively visualize and quantify white matter tracts in the brain in vivo, in this study we used diffusion MRI (dMRI) tractography to derive brain graphs [4,5,6]. This relatively simple way of modeling the brain enables us to use graph theory to study topological properties of brain graphs in order to study the effects of EP and IUGR on children's brain connectivity at age 6 years.
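A minimal sketch of the graph-theoretic step, assuming networkx and a toy connectivity matrix standing in for a tractography-derived brain graph:

```python
import networkx as nx
import numpy as np

# Toy symmetric connectivity matrix standing in for a dMRI-derived brain graph
# (entries would normally be streamline counts between cortical regions).
rng = np.random.default_rng(0)
A = rng.random((10, 10))
A = (A + A.T) / 2          # make symmetric
np.fill_diagonal(A, 0)
A[A < 0.5] = 0             # threshold out weak connections

G = nx.from_numpy_array(A)

# Topological properties typically compared between groups (e.g., EP/IUGR vs. controls)
print("mean degree:", np.mean([d for _, d in G.degree()]))
print("clustering :", nx.average_clustering(G, weight="weight"))
if nx.is_connected(G):
    print("char. path :", nx.average_shortest_path_length(G))
```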
Abstract:
In this paper we extend the results presented in (de Ponte, Mizrahi and Moussa 2007 Phys. Rev. A 76 032101) to treat quantitatively the effects of reservoirs at finite temperature in a bosonic dissipative network: a chain of coupled harmonic oscillators of arbitrary topology, i.e., for any way the oscillators are coupled together, any strength of their couplings, and any natural frequencies. Starting with the case where distinct reservoirs are considered, each one coupled to a corresponding oscillator, we also analyze the case where a common reservoir is assigned to the whole network. Master equations are derived for both situations and for both regimes of weak and strong coupling strengths between the network oscillators. Solutions of these master equations are presented through the normal-ordered characteristic function. These solutions are shown to become significantly involved when temperature effects are considered, making the analysis of collective decoherence and dispersion in dissipative bosonic networks difficult. To circumvent these difficulties, we turn to the Wigner distribution function, which enables us to present a technique to estimate the decoherence time of network states. Our technique proceeds by computing separately the effects of dispersion and the attenuation of the interference terms of the Wigner function. A detailed analysis of the dispersion mechanism is also presented through the evolution of the Wigner function. The interesting collective dispersion effects are discussed and applied to the analysis of decoherence of a class of network states. Finally, the entropy and the entanglement of a pure bipartite system are discussed.
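For orientation, master equations of this kind take the Lindblad form; a generic version for a network of N coupled oscillators with individual finite-temperature reservoirs is sketched below, where the couplings, damping rates, and mean thermal occupations are schematic rather than the paper's exact expressions:

```latex
\frac{d\rho}{dt} = -\frac{i}{\hbar}\left[H,\rho\right]
  + \sum_{m=1}^{N}\frac{\gamma_m}{2}\left(\bar n_m + 1\right)
      \left(2 a_m \rho a_m^{\dagger} - a_m^{\dagger} a_m \rho - \rho\, a_m^{\dagger} a_m\right)
  + \sum_{m=1}^{N}\frac{\gamma_m}{2}\,\bar n_m
      \left(2 a_m^{\dagger} \rho a_m - a_m a_m^{\dagger} \rho - \rho\, a_m a_m^{\dagger}\right),
\qquad
H = \hbar\sum_{m}\omega_m a_m^{\dagger}a_m
  + \hbar\sum_{m\neq n}\lambda_{mn}\, a_m^{\dagger}a_n .
```

Here \(\gamma_m\) is the damping rate of oscillator m, \(\bar n_m\) the mean thermal occupation of its reservoir, and \(\lambda_{mn}\) the oscillator-oscillator couplings encoding the network topology.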
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
This work describes an application of a multilayer perceptron neural network technique to correct dome emission effects on longwave atmospheric radiation measurements carried out using an Eppley Precision Infrared Radiometer (PIR) pyrgeometer. It is shown that approximately seven months of measurements of dome and case temperatures and of meteorological variables available at regular surface stations (global solar radiation, air temperature, and air relative humidity) are enough to train the neural network algorithm and correct the observed longwave radiation for dome temperature effects at surface stations with climates similar to that of the city of São Paulo, Brazil. The network was trained using data from 15 October 2003 to 7 January 2004 and verified using data, not seen during the training period, from 8 January to 30 April 2004. The longwave radiation values generated by the neural network technique were very similar to the values obtained by Fairall et al., assumed here as the reference approach to correct dome emission effects in PIR pyrgeometers. Compared to the empirical approach, the neural network technique is less limited by sensor type and time of day (it allows nighttime corrections).
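A minimal sketch of this kind of correction model, assuming scikit-learn; the feature list mirrors the variables named in the abstract, but the network size and the synthetic data are placeholders:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
n = 2000

# Inputs available at regular surface stations plus PIR housekeeping temperatures:
# global solar radiation, air temperature, relative humidity, dome T, case T.
X = np.column_stack([
    rng.uniform(0, 1000, n),    # global solar radiation (W/m^2)
    rng.uniform(5, 35, n),      # air temperature (degC)
    rng.uniform(20, 100, n),    # relative humidity (%)
    rng.uniform(5, 45, n),      # dome temperature (degC)
    rng.uniform(5, 40, n),      # case temperature (degC)
])
# Placeholder target: dome-corrected longwave radiation (in practice it would
# come from the reference correction, e.g., Fairall et al., during training).
y = 300 + 0.02 * X[:, 0] + 4.0 * (X[:, 4] - X[:, 3]) + rng.normal(0, 5, n)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X[:1500], y[:1500])           # "training period"
print(model.score(X[1500:], y[1500:]))  # verification on held-out data
```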
Abstract:
Vaquero AR, Ferreira NE, Omae SV, Rodrigues MV, Teixeira SK, Krieger JE, Pereira AC. Using gene-network landscape to dissect genotype effects of TCF7L2 genetic variant on diabetes and cardiovascular risk. Physiol Genomics 44: 903-914, 2012. First published August 7, 2012; doi:10.1152/physiolgenomics.00030.2012. The single nucleotide polymorphism (SNP) within the TCF7L2 gene, rs7903146, is, to date, the most significant genetic marker associated with Type 2 diabetes mellitus (T2DM) risk. Nonetheless, its functional role in disease pathology is poorly understood. The aim of the present study was to investigate, in vascular smooth muscle cells from 92 patients undergoing aortocoronary bypass surgery, the contribution of this SNP to T2DM using expression-level and expression-correlation comparison approaches, which were visually represented as gene interaction networks. Initially, the expression levels of 41 genes (seven TCF7L2 splice forms and 40 other T2DM-relevant genes) were compared between rs7903146 wild-type (CC) and T2DM-risk (CT + TT) genotype groups. Next, we compared the expression correlation patterns of these 41 genes between groups to observe whether the relationships between genes were different. Five TCF7L2 splice forms and nine genes showed significant expression differences between groups. The RXR alpha gene showed the most distinct expression correlation pattern with other genes. Therefore, T2DM risk alleles appear to influence TCF7L2 splice forms' expression in vascular smooth muscle cells, and the RXR alpha gene is pointed out as a candidate treatment target for risk reduction in individuals at high risk of developing T2DM, especially those harboring TCF7L2 risk genotypes.
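A minimal sketch of the expression-correlation comparison, assuming pandas; the gene labels, group sizes, and random data are placeholders for illustration:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
genes = ["TCF7L2_sf1", "RXRA", "PPARG", "INSR"]

# Toy expression matrices (patients x genes) for the two genotype groups.
expr_cc = pd.DataFrame(rng.normal(size=(46, 4)), columns=genes)     # CC (wild type)
expr_ct_tt = pd.DataFrame(rng.normal(size=(46, 4)), columns=genes)  # CT + TT (risk)

# Pairwise co-expression within each group, then the differential network:
# gene pairs whose correlation shifts most between genotype groups.
diff = (expr_cc.corr() - expr_ct_tt.corr()).abs()
edges = diff.where(np.triu(np.ones_like(diff, dtype=bool), k=1)).stack()
print(edges.sort_values(ascending=False).head())  # most rewired gene pairs
```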
Abstract:
OBJECTIVE: To determine the effect of glucosamine, chondroitin, or the two in combination on joint pain and on radiological progression of disease in osteoarthritis of the hip or knee. DESIGN: Network meta-analysis. Direct comparisons within trials were combined with indirect evidence from other trials by using a Bayesian model that allowed the synthesis of multiple time points. MAIN OUTCOME MEASURE: Pain intensity. Secondary outcome was change in minimal width of joint space. The minimal clinically important difference between preparations and placebo was prespecified at -0.9 cm on a 10 cm visual analogue scale. DATA SOURCES: Electronic databases and conference proceedings from inception to June 2009, expert contact, relevant websites. ELIGIBILITY CRITERIA FOR SELECTING STUDIES: Large-scale randomised controlled trials in more than 200 patients with osteoarthritis of the knee or hip that compared glucosamine, chondroitin, or their combination with placebo or head to head. RESULTS: 10 trials in 3803 patients were included. On a 10 cm visual analogue scale the overall difference in pain intensity compared with placebo was -0.4 cm (95% credible interval -0.7 to -0.1 cm) for glucosamine, -0.3 cm (-0.7 to 0.0 cm) for chondroitin, and -0.5 cm (-0.9 to 0.0 cm) for the combination. For none of the estimates did the 95% credible intervals cross the boundary of the minimal clinically important difference. Industry-independent trials showed smaller effects than commercially funded trials (P=0.02 for interaction). The differences in changes in minimal width of joint space were all minute, with 95% credible intervals overlapping zero. CONCLUSIONS: Compared with placebo, glucosamine, chondroitin, and their combination do not reduce joint pain or have an impact on narrowing of joint space. Health authorities and health insurers should not cover the costs of these preparations, and new prescriptions to patients who have not received treatment should be discouraged.
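For intuition about how direct and indirect evidence combine in a network meta-analysis, here is a sketch of the simplest (Bucher) adjusted indirect comparison under normal approximations; the numbers are invented, and the paper's actual model is Bayesian and synthesizes multiple time points:

```python
import math

def indirect_comparison(d_ac, se_ac, d_bc, se_bc):
    """Effect of A vs B inferred via a common comparator C (placebo):
    d_AB = d_AC - d_BC, with the variances adding."""
    d_ab = d_ac - d_bc
    se_ab = math.sqrt(se_ac**2 + se_bc**2)
    return d_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

# Made-up direct estimates vs placebo on a 10 cm visual analogue scale:
d, ci = indirect_comparison(d_ac=-0.4, se_ac=0.15,   # glucosamine vs placebo
                            d_bc=-0.3, se_bc=0.18)   # chondroitin vs placebo
print(f"glucosamine vs chondroitin: {d:.2f} cm, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```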