964 results for Traffic Conflict Techniques
Abstract:
This paper presents a statistical aircraft trajectory clustering approach aimed at discriminating between typical manned and expected unmanned traffic patterns. First, a resampled version of each trajectory is modelled using a mixture of Von Mises distributions (circular statistics). Second, the remodelled trajectories are globally aligned using tools from bioinformatics. Third, the alignment scores are used to cluster the trajectories using an iterative k-medoids approach and an appropriate distance function. The approach is then evaluated using synthetically generated unmanned aircraft flights combined with real air traffic position reports taken over a sector of Northern Queensland, Australia. Results suggest that the technique is useful in distinguishing between expected unmanned and manned aircraft traffic behaviour, as well as identifying some common conventional air traffic patterns.
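The clustering step named in the abstract, iterative k-medoids over pairwise alignment scores, can be sketched in plain Python. This is a minimal stand-in, not the paper's implementation: the precomputed distance matrix, the choice of `k`, and the tie-breaking rules are all illustrative assumptions.

```python
import random

def k_medoids(dist, k, iters=100, seed=0):
    """Iterative k-medoids over a precomputed distance matrix.

    dist[i][j] is the dissimilarity between trajectories i and j,
    e.g. derived from global alignment scores."""
    n = len(dist)
    rng = random.Random(seed)
    medoids = rng.sample(range(n), k)
    clusters = {}
    for _ in range(iters):
        # assign each trajectory to its nearest medoid
        clusters = {m: [] for m in medoids}
        for i in range(n):
            nearest = min(medoids, key=lambda c: dist[i][c])
            clusters[nearest].append(i)
        # move each medoid to the member minimising total in-cluster distance
        new_medoids = []
        for members in clusters.values():
            best = min(members, key=lambda c: sum(dist[c][j] for j in members))
            new_medoids.append(best)
        if set(new_medoids) == set(medoids):
            break
        medoids = new_medoids
    return medoids, clusters
```

Because the update step only needs pairwise distances, any alignment-based distance function can be plugged in without change.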
Abstract:
Prediction of variable bit rate compressed video traffic is critical to dynamic allocation of resources in a network. In this paper, we propose a technique for preprocessing the dataset used for training a video traffic predictor. The technique involves identifying the noisy instances in the data using a fuzzy inference system. We focus on three prediction techniques, namely, linear regression, neural network and support vector regression and analyze their performance on H.264 video traces. Our experimental results reveal that data preprocessing greatly improves the performance of linear regression and neural network, but is not effective on support vector regression.
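The preprocessing idea above, flagging noisy training instances with a fuzzy inference system, can be illustrated with a much-simplified sketch: a single fuzzy membership function on each frame's deviation from a local moving average. The membership shape, window, and cutoff are invented for illustration; the paper's actual fuzzy system is not reproduced here.

```python
def ramp(x, a, b):
    """Right-shoulder fuzzy membership: 0 below a, rising linearly to 1 at b."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def filter_noisy(trace, window=5, cutoff=0.5):
    """Drop frame sizes whose relative deviation from the local moving
    average has a high 'noisy' membership degree (assumed rule)."""
    clean = []
    for i, v in enumerate(trace):
        past = trace[max(0, i - window):i]
        avg = sum(past) / len(past) if past else v
        dev = abs(v - avg) / avg if avg else 0.0
        if ramp(dev, 0.5, 2.0) < cutoff:  # assumed membership parameters
            clean.append(v)
    return clean
```

A regression model trained on the filtered trace then sees fewer outlier frames, which is the effect the paper reports for linear regression and neural networks.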
Abstract:
Radar services are occasionally affected by wind farms. This paper presents a comprehensive description of the effects that a wind farm may cause on the different radar services, and it compiles a review of the recent research results regarding the mitigation techniques to minimize this impact. Mitigation techniques to be applied at the wind farm and on the radar systems are described. The development of thorough impact studies before the wind farm is installed is presented as the best way to analyze in advance the potential for interference, and subsequently identify the possible solutions to allow the coexistence of wind farms and radar services.
Abstract:
The solution behavior of linear polymer chains is well understood, having been the subject of intense study throughout the previous century. As plastics have become ubiquitous in everyday life, polymer science has grown into a major field of study. The conformation of a polymer in solution depends on the molecular architecture and its interactions with the surroundings. Developments in synthetic techniques have led to the creation of precision-tailored polymeric materials with varied topologies and functionalities. In order to design materials with the desired properties, it is imperative to understand the relationships between a polymer's architecture and its conformation and behavior. To meet that need, this thesis investigates the conformation and self-assembly of three architecturally complex macromolecular systems with rich and varied behaviors driven by the resolution of intramolecular conflicts. First, we describe the development of a robust and facile synthetic approach to reproducible bottlebrush polymers (Chapter 2). The method was used to produce homologous series of bottlebrush polymers with polynorbornene backbones, which revealed the effect of side-chain and backbone length on the overall conformation in both good and theta solvent conditions (Chapter 3). The side-chain conformation was obtained from a series of SANS experiments and determined to be indistinguishable from the behavior of free linear polymer chains. Using deuterium-labeled bottlebrushes, we were able for the first time to directly observe the backbone conformation of a bottlebrush polymer, which showed self-avoiding walk behavior. Secondly, a series of SANS experiments was conducted on a homologous series of Side Group Liquid Crystalline Polymers (SGLCPs) in a perdeuterated small molecule liquid crystal (5CB).
Monodomain, aligned, dilute samples of SGLCP-b-PS block copolymers were seen to self-assemble into complex micellar structures with mutually orthogonally oriented anisotropies at different length scales (Chapter 4). Finally, we present the results from the first scattering experiments on a set of fuel-soluble, associating telechelic polymers. We observed the formation of supramolecular aggregates in dilute (≤0.5wt%) solutions of telechelic polymers and determined that the choice of solvent has a significant effect on the strength of association and the size of the supramolecules (Chapter 5). A method was developed for the direct estimation of supramolecular aggregation number from SANS data. The insight into structure-property relationships obtained from this work will enable the more targeted development of these molecular architectures for their respective applications.
Abstract:
The safety of the flights, and in particular conflict resolution for separation assurance, is one of the main tasks of Air Traffic Control. Conflict resolution requires decision making in the face of the considerable levels of uncertainty inherent in the motion of aircraft. We present a Monte Carlo framework for conflict resolution which allows one to take into account such levels of uncertainty through the use of a stochastic simulator. A simulation example inspired by current air traffic control practice illustrates the proposed conflict resolution strategy. Copyright © 2005 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.
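The Monte Carlo idea described above can be sketched in a few lines: repeatedly simulate the two aircraft under motion uncertainty and count the runs in which separation is lost. This is a minimal illustration in which Gaussian velocity noise stands in for the paper's stochastic simulator; all units, thresholds, and parameter values are assumptions.

```python
import math
import random

def conflict_probability(p1, v1, p2, v2, sigma,
                         horizon=120, dt=5, sep=5.0, n_runs=2000, seed=1):
    """Monte Carlo estimate of the probability that two aircraft lose
    separation (come within `sep` NM) inside the look-ahead horizon.

    Positions in NM, velocities in NM/s; `sigma` is the std-dev of the
    per-step velocity noise (a stand-in for a stochastic simulator)."""
    rng = random.Random(seed)
    conflicts = 0
    for _ in range(n_runs):
        x1, y1 = p1
        x2, y2 = p2
        for _ in range(int(horizon / dt)):
            # propagate both aircraft with noisy velocities
            x1 += (v1[0] + rng.gauss(0, sigma)) * dt
            y1 += (v1[1] + rng.gauss(0, sigma)) * dt
            x2 += (v2[0] + rng.gauss(0, sigma)) * dt
            y2 += (v2[1] + rng.gauss(0, sigma)) * dt
            if math.hypot(x1 - x2, y1 - y2) < sep:
                conflicts += 1
                break
    return conflicts / n_runs
```

A resolution strategy can then rank candidate manoeuvres by the estimated conflict probability each one yields.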
Abstract:
Accurate knowledge of traffic demands in a communication network enables or enhances a variety of traffic engineering and network management tasks of paramount importance for operational networks. Directly measuring a complete set of these demands is prohibitively expensive because of the huge amounts of data that must be collected and the performance impact that such measurements would impose on the regular behavior of the network. As a consequence, we must rely on statistical techniques to produce estimates of actual traffic demands from partial information. The performance of such techniques is, however, constrained by their reliance on partial information and by the heavy computations they incur, which limits their convergence behavior. In this paper we study a two-step approach for inferring network traffic demands. First, we elaborate and evaluate a modeling approach for generating good starting points to be fed to iterative statistical inference techniques. We call these starting points informed priors, since they are obtained using actual network information such as packet traces and SNMP link counts. Second, we provide a very fast variant of the EM algorithm which extends its computation range, increasing its accuracy and decreasing its dependence on the quality of the starting point. Finally, we evaluate and compare alternative mechanisms for generating starting points and the convergence characteristics of our EM algorithm against a recently proposed Weighted Least Squares approach.
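The inference problem above, estimating origin-destination demands from link counts and a prior, is commonly attacked with a multiplicative EM-style update (after Vardi's network tomography work). The sketch below shows that standard baseline seeded with an informed prior; the paper's fast EM variant is not reproduced, and the routing matrix and counts are toy assumptions.

```python
def em_traffic_demands(A, y, x0, iters=200):
    """Multiplicative EM-style update estimating OD demands x from
    link counts y ~ A x, seeded with an informed prior x0.

    A is a 0/1 routing matrix (rows: links, columns: OD pairs)."""
    m, n = len(A), len(x0)
    x = list(x0)
    col = [sum(A[i][j] for i in range(m)) for j in range(n)]
    for _ in range(iters):
        Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        new_x = []
        for j in range(n):
            if col[j] == 0:
                new_x.append(x[j])  # OD pair not observed on any link
                continue
            # scale x[j] by the average measured/predicted ratio on its links
            ratio = sum(A[i][j] * y[i] / Ax[i] for i in range(m) if Ax[i] > 0)
            new_x.append(x[j] * ratio / col[j])
        x = new_x
    return x
```

A better prior `x0` moves the starting point closer to the fixed point, which is exactly why the paper's informed priors reduce the iterations needed.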
Abstract:
The conflict known as the "Troubles" in Northern Ireland began during the late 1960s and is defined by political and ethno-sectarian violence between state, pro-state, and anti-state forces. Reasons for the conflict are contested and complicated by social, religious, political, and cultural disputes, with much of the debate concerning the victims of violence hardened by competing propaganda-conditioning perspectives. This article introduces a database holding information on the location of individual fatalities connected with the contemporary Irish conflict. For each victim, it includes a demographic profile, home address, manner of death, and the organization responsible. Employing geographic information system (GIS) techniques, the database is used to measure, map, and analyze the spatial distribution of conflict-related deaths between 1966 and 2007 across Belfast, the capital city of Northern Ireland, with respect to levels of segregation, social and economic deprivation, and interfacing. The GIS analysis includes a kernel density estimator designed to generate smooth intensity surfaces of the conflict-related deaths by both incident and home locations. Neighborhoods with high-intensity surfaces of deaths were those with the highest levels of segregation (≥90 percent Catholic or Protestant) and deprivation, and they were located near physical barriers, the so-called peacelines, between predominantly Catholic and predominantly Protestant communities. Finally, despite the onset of peace and the formation of a power-sharing and devolved administration (the Northern Ireland Assembly), disagreements remain over the responsibility and "commemoration" of victims, sentiments that still uphold division and atavistic attitudes between spatially divided Catholic and Protestant populations.
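The smooth intensity surfaces mentioned in the abstract come from a kernel density estimator. A minimal Gaussian-kernel sketch over (x, y) incident locations follows; the grid, bandwidth, and normalisation choices are illustrative, not the article's actual parameters.

```python
import math

def kernel_density(points, grid, bandwidth):
    """Gaussian kernel density surface over (x, y) event locations.

    Returns one density value per grid point: the normalised sum of
    Gaussian bumps centred on each event."""
    h2 = 2.0 * bandwidth ** 2
    norm = 1.0 / (2.0 * math.pi * bandwidth ** 2 * len(points))
    surface = []
    for gx, gy in grid:
        s = sum(math.exp(-((gx - x) ** 2 + (gy - y) ** 2) / h2)
                for x, y in points)
        surface.append(norm * s)
    return surface
```

Evaluating the same estimator once on incident locations and once on home addresses gives the two intensity surfaces the article compares.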
Abstract:
Active network scanning injects traffic into a network and observes responses to draw conclusions about the network. Passive network analysis works by looking at network meta data or by analyzing traffic as it traverses a fixed point on the network. It may be infeasible or inappropriate to scan critical infrastructure networks. Techniques exist to uniquely map assets without resorting to active scanning. In many cases, it is possible to characterize and identify network nodes by passively analyzing traffic flows. These techniques are considered in particular with respect to their application to power industry critical infrastructure.
Abstract:
This work focuses on the study of Raman amplifiers in optical fiber and their applications in modern optical communication systems. Specific topics were addressed, such as the spatial simulation of the Raman amplifier, gain equalization and broadening, the use of hybrid amplification approaches combining fiber Raman amplifiers with Erbium-doped fiber amplifiers (EDFA), and transient effects in amplifier gain. The work was based on theoretical models, with the results validated experimentally. Among the most important contributions of this thesis are: (i) the development of an efficient simulator for Raman amplifiers that supports counter-propagating and bidirectional pumping architectures in a wavelength-division multiplexing (WDM) context; (ii) the implementation of a pump-allocation algorithm combining a genetic algorithm with the Nelder-Mead method; (iii) the assessment of hybrid amplification solutions combining Raman amplifiers with EDFAs in passive optical network scenarios, namely WDM/TDM-PON with extension to the C+L spectral band; and (iv) the evaluation and characterization of transient phenomena in amplifiers under optical burst/packet traffic, and the consequent development of mitigation solutions based on optical clamping techniques.
Abstract:
Keywords: Internet Traffic, Internet Applications, Internet Attacks, Traffic Profiling, Multi-Scale Analysis.
Nowadays, the Internet can be seen as an ever-changing platform where new and different types of services and applications are constantly emerging. In fact, many of the existing dominant applications, such as social networks, have appeared recently, being rapidly adopted by the user community. All these new applications required the implementation of novel communication protocols that present different network requirements, according to the service they deploy. All this diversity and novelty has led to an increasing need to accurately profile Internet users, by mapping their traffic to the originating application, in order to improve many network management tasks such as resource optimization, network performance, service personalization and security. However, accurately mapping traffic to its originating application is a difficult task due to the inherent complexity of existing network protocols and to several restrictions that prevent the analysis of the contents of the generated traffic. In fact, many technologies, such as traffic encryption, are widely deployed to assure and protect the confidentiality and integrity of communications over the Internet. On the other hand, many legal constraints also forbid the analysis of the clients' traffic in order to protect their confidentiality and privacy. Consequently, novel traffic discrimination methodologies are necessary for accurate traffic classification and user profiling. This thesis proposes several identification methodologies for accurate Internet traffic profiling while coping with the different restrictions mentioned and with the existing encryption techniques. By analyzing the several frequency components present in the captured traffic and inferring the presence of the different network and user related events, the proposed approaches are able to create a profile for each one of the analyzed Internet applications. The use of several probabilistic models allows the accurate association of the analyzed traffic with the corresponding application. Several enhancements are also proposed to allow the identification of hidden illicit patterns and the real-time classification of captured traffic. In addition, a new network management paradigm for wired and wireless networks is proposed. The analysis of layer-2 traffic metrics and of the different frequency components present in the captured traffic allows efficient user profiling in terms of the web application used. Finally, some usage scenarios for these methodologies are presented and discussed.
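The frequency-component analysis mentioned above can be illustrated with a plain discrete Fourier transform over a per-interval packet-count series: periodic application behaviour shows up as a dominant spectral peak, with no payload inspection needed. This is a simplified illustration, not the thesis's actual multi-scale method; the function and its parameters are invented for the example.

```python
import cmath

def dominant_period(counts):
    """Dominant periodicity (in samples) of a per-interval packet-count
    series, found as the strongest bin of a plain DFT.

    Frequency-domain features like this can help profile an application
    without inspecting (possibly encrypted) payloads."""
    n = len(counts)
    mean = sum(counts) / n
    x = [c - mean for c in counts]  # remove the DC component
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2 + 1):
        coeff = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return n / best_k
```

In a profiling pipeline, the peak's period and magnitude would become two features of the traffic fingerprint fed to a probabilistic classifier.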
Abstract:
Optical networks are under constant evolution. The growing demand for dynamism requires devices that can accommodate different types of traffic; hence the study of transparent optical networks. This approach makes optical networks more "elegant", due to a more efficient use of network resources. In this thesis, the author proposes devices intended to offer alternative approaches, both within the state of the art of these technologies and in fitting them into transparent optical networks. Given that full transparency is difficult to achieve with current technology (perhaps possible with more mature optical computing), the author proposes techniques with different levels of transparency. On the topic of optical network performance, the author proposes two techniques for monitoring chromatic dispersion with different levels of transparency. The technique proposed in Chapter 3 seems best suited to long-haul optical transmission links and high transmission rates, not only due to its moderate complexity but also to its potentially moderate/high cost. It is, however, applicable to several modulation formats, particularly those with a prominent clock component. In Chapter 4 the transparency level was not tested for various modulation formats; however, some transparency is achieved by not adding any electrical device after the receiver (other than an analog-to-digital converter). This allows the technique to operate at high transmission rates in excess of 100 Gbit/s if electro-optical asynchronous sampling is used before the optical receiver, so a low-cost, low-bandwidth photo-detector can be used. Chapter 5 demonstrates a technique for simultaneously monitoring multiple impairments of the optical network by generating novel performance-analysis diagrams and by use of artificial neural networks.
In Chapter 6 the author demonstrates an all-optical technique for controlling the optical state of polarization, and an example of how all-optical signal processing can fully cooperate with optical performance monitoring.
Abstract:
Use of bridge deck overlays is important in maximizing bridge service life. Overlays can replace the deteriorated part of the deck, thus extending the bridge's life. Even though overlay construction avoids building a whole new bridge deck, it still takes significant time before the bridge re-opens to traffic. Current processes and practices are time-consuming, and multiple opportunities may exist to reduce overall construction time by modifying construction requirements and/or the materials utilized. Reducing the construction time could reduce the socioeconomic costs associated with bridge deck rehabilitation and the inconvenience caused to travelers. This work included three major tasks: literature review, field investigation, and laboratory testing. The overlay concrete mix used in present construction requires long curing times, so an investigation was carried out to find fast-curing concrete mixes that could reduce construction time. Several fast-curing concrete mixes were found and suggested for further evaluation. An on-going overlay construction project was observed and documented. Through these observations, several opportunities were identified where small modifications in the process could lead to significant time savings. Under current Iowa standards for the removal depth of substrate concrete, the removal process takes many hours. Four different laboratory tests were performed under different loading conditions to determine the substrate concrete removal depth necessary for a proper bond between the substrate concrete and the new overlay concrete. Several parameters, such as failure load, bond stress, and stiffness, were compared for four different concrete removal depths. The results and observations of this investigation led to several conclusions that could reduce bridge deck overlay construction time.
Abstract:
The deployment of Quality of Service (QoS) techniques involves careful analysis of areas including business requirements, corporate strategy, and the technical implementation process, which can lead to conflict or contradiction between the goals of the various user groups involved in policy definition. In addition, long-term change management presents a challenge, as these implementations typically require a high skill set and experience level, which exposes organisations to effects such as "hyperthymestria" [1] and "The Seven Sins of Memory" defined by Schacter and discussed further within this paper. It is proposed that, given the information embedded within the packets of IP traffic, an opportunity exists to augment traffic management with a machine-learning, agent-based mechanism. This paper describes the process by which current policies are defined and the research required to support the development of an application that enables adaptive, intelligent Quality of Service controls to augment or replace the policy-based mechanisms currently in use.
Abstract:
This paper describes an urban traffic control system that aims to contribute to more efficient traffic management in Brazilian cities. It uses fuzzy sets, case-based reasoning, and genetic algorithms to handle dynamic and unpredictable traffic scenarios, as well as uncertain, incomplete, and inconsistent information. The system is composed of one supervisor and several controller agents, which cooperate with each other to improve the system's results through Artificial Intelligence techniques.