957 results for Traffic Speed Change.


Relevance: 30.00%

Abstract:

This paper assesses whether two sustainability policies currently in effect in London, a congestion charge zone and a low emission zone, have affected freight operations and reduced vehicle kilometres travelled. It investigates responses by freight operators, including re-timing trips, re-routing, reducing the number of trips, and replacing vehicles. Freight traffic trends from 1994 to 2012 were identified using road traffic estimates, cordon counts, and vehicle speed data, supplemented by interviews with freight industry experts and operators. Findings indicate that freight traffic increased throughout London during this timeframe but declined in the central boroughs that partly lie within the congestion charge zone. The congestion charge may have time-shifted some light goods trips, but most freight trips face a variety of constraints on operators’ delivery windows. No evidence was found of freight traffic re-routing around the charging zone to avoid it. The low emission zone spurred higher levels of operational change than the congestion charge zone and was effective at spurring freight vehicle replacement. The paper also discusses freight operators’ perceptions of these policies and how they could be improved.

Relevance: 30.00%

Abstract:

With new optical network technologies, an ever larger amount of data can be carried on a single wavelength; this amount can reach up to 40 gigabits per second (Gbps). Individual data flows, in contrast, require much less bandwidth. Traffic grooming is a technique that allows the bandwidth offered by a wavelength to be used efficiently: it assembles several low-rate data flows into a single data entity that can be carried on one wavelength. Wavelength Division Multiplexing (WDM) makes it possible to carry several wavelengths on the same fibre. Using the two techniques together, WDM and traffic grooming, allows a quantity of data on the order of terabits per second (Tbps) to be carried on a single optical fibre. Traffic protection in optical networks then becomes a vital operation for these networks, since a single failure can disrupt thousands of users and cause significant losses, up to several million dollars, to the network operator and its users. Protection consists in reserving additional capacity to carry the traffic in case of a failure in the network. This thesis studies traffic grooming and traffic protection techniques using p-cycles in optical networks under dynamic traffic. Most existing work considers static traffic, where the state of the network and the traffic are given at the outset and do not change; moreover, most of this work uses heuristics or methods that have difficulty solving large instances. Under dynamic traffic, two major difficulties are added to the problems studied, because the traffic in the network changes continually. The first is that the solution proposed for the previous period, even if it was optimized, is no longer necessarily optimized or optimal for the current period, so the solution to the problem must be optimized anew. The second is that solving the problem for a given period differs from solving it for the initial period, because the connections already in progress in the network must not be disturbed too much at each time period. The study of traffic grooming under dynamic traffic proposes different scenarios for coping with this type of traffic, with the objective of maximizing the bandwidth of the connections accepted at each time period. Mathematical formulations of the different scenarios considered for the grooming problem are proposed. Our work on the protection problem considers two types of p-cycles: link-protecting p-cycles (basic p-cycles) and FIPP p-cycles (path-protecting p-cycles). This work consisted first in proposing different scenarios for managing protection p-cycles under dynamic traffic, and then in studying the stability of p-cycles under dynamic traffic. Formulations of the different scenarios are proposed, and the solution methods used make it possible to tackle larger problem instances than those reported in the literature.
We rely on column generation to implicitly enumerate the most promising cycles. For the path-protecting p-cycles, or FIPP p-cycles, we proposed formulations for the master problem and the auxiliary (pricing) problem. We used a hierarchical decomposition of the problem that allows us to obtain better results within a reasonable time. As with basic p-cycles, we studied the stability of FIPP p-cycles under dynamic traffic. The results show that, depending on the optimization criterion, basic (link-protecting) p-cycles and FIPP (path-protecting) p-cycles can be very stable.
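
For orientation only, the classical static spare-capacity design model for link-protecting p-cycles (not the dynamic formulations developed in the thesis) can be written as follows, where n_j is the number of unit copies of candidate cycle j, c_j its spare-capacity cost, w_i the working capacity to be protected on span i, and x_ij equals 1 if span i lies on cycle j, 2 if it straddles the cycle, and 0 otherwise:

    \min \sum_{j} c_j \, n_j
    \quad \text{s.t.} \quad
    \sum_{j} x_{ij} \, n_j \ge w_i \;\; \forall i,
    \qquad n_j \in \mathbb{Z}_{\ge 0}.

Column generation, as used in the thesis, avoids enumerating all candidate cycles j explicitly: the master problem is solved over a restricted set of cycles and the auxiliary (pricing) problem generates additional promising cycles with negative reduced cost.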

Relevance: 30.00%

Abstract:

Timely detection of sudden changes in dynamics that adversely affect the performance of systems and the quality of products has great scientific relevance. This work focuses on effective detection of dynamical changes in real-time signals from mechanical as well as biological systems using a fast and robust technique, permutation entropy (PE). The results are used to detect chatter onset in machine turning and to identify vocal disorders from speech signals. Permutation entropy is a nonlinear complexity measure that can efficiently distinguish the regular and complex nature of a signal and extract information about changes in the dynamics of the underlying process by indicating sudden changes in its value. Here we propose the use of PE to detect dynamical changes in two nonlinear processes: turning, a mechanical system, and speech, a biological system. The effectiveness of PE in detecting changes in the dynamics of the turning process is studied from time series generated from samples of audio and current signals. Experiments are carried out on a lathe for a sudden increase in depth of cut and for a continuous increase in depth of cut on mild steel workpieces, keeping the speed and feed rate constant. The results are applied to detect chatter onset in machining and are verified using frequency spectra of the signals and the nonlinear measure normalized coarse-grained information rate (NCIR). PE analysis is also carried out to investigate the variation in surface texture caused by chatter on the machined workpiece. A statistical parameter from the optical grey-level intensity histogram of the laser speckle pattern, recorded using a charge-coupled device (CCD) camera, is used to generate the time series required for PE analysis; a standard optical roughness parameter is used to confirm the results. The application of PE in identifying vocal disorders is studied from speech signals recorded using a microphone. Here the analysis is carried out using speech signals of subjects with different pathological conditions and of normal subjects, and the results are used to identify vocal disorders; the standard linear technique of FFT is used to substantiate the results. The results of PE analysis in all three cases clearly indicate that this complexity measure is sensitive to changes in the regularity of a signal and hence can suitably be used for detecting dynamical changes in real-world systems. This work establishes the application of the simple, inexpensive and fast PE algorithm for the benefit of advanced manufacturing processes as well as clinical diagnosis of vocal disorders.
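
As a concrete illustration of the measure, a minimal permutation entropy computation over ordinal patterns (Bandt–Pompe style) might look like the following Python sketch; the embedding dimension m and delay tau shown are illustrative defaults, not the settings used in this work:

    import math
    from collections import Counter

    def permutation_entropy(signal, m=3, tau=1, normalize=True):
        """Permutation entropy of a 1-D signal from ordinal pattern frequencies."""
        patterns = Counter()
        for i in range(len(signal) - (m - 1) * tau):
            window = signal[i:i + m * tau:tau]
            # ordinal pattern: the permutation that sorts the window
            patterns[tuple(sorted(range(m), key=lambda k: window[k]))] += 1
        total = sum(patterns.values())
        pe = -sum((c / total) * math.log(c / total) for c in patterns.values())
        return pe / math.log(math.factorial(m)) if normalize else pe

    # A sudden change in dynamics shows up as a jump in PE computed over sliding windows.
    regular = [math.sin(0.2 * i) for i in range(500)]
    print(round(permutation_entropy(regular), 3))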

Relevance: 30.00%

Abstract:

In this article, a new technique for grooming low-speed traffic demands into high-speed optical routes is proposed. This enhancement allows a transparent wavelength-routing switch (WRS) to aggregate traffic en route over existing optical routes without incurring expensive optical-electrical-optical (OEO) conversions. This implies that: a) an optical route may be considered as having more than one ingress node (all inline) and b) traffic demands can partially use optical routes to reach their destination. The proposed optical routes are named "lighttours", since the traffic originating from different sources can be forwarded together in a single optical route, i.e., as a "tour" over different sources towards the same destination. The possibility of creating lighttours is the consequence of a novel WRS architecture proposed in this article, named "enhanced grooming" (G+). The ability to groom more traffic in the middle of a lighttour is achieved with the support of a simple optical device named the lambda-monitor (previously introduced in the RingO project). In this article, we present the new WRS architecture and its advantages. To compare the advantages of lighttours with respect to classical lightpaths, an integer linear programming (ILP) model is proposed for the well-known multilayer problem: traffic grooming, routing and wavelength assignment. The ILP model may be used for several objectives; this article, however, focuses on two: maximizing the network throughput and minimizing the number of OEO conversions used. Experiments show that G+ can route all the traffic using only half of the total OEO conversions needed by classical grooming. A heuristic is also proposed, aiming at achieving near-optimal results in polynomial time.
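
As a hedged, toy-scale illustration of the kind of grooming decision such an ILP captures (not the article's lighttour model itself), the following Python/PuLP sketch maximizes carried bandwidth on a single wavelength shared by demands along a two-link route; all names and numbers are invented for the example:

    import pulp  # assumes the PuLP package is available

    # Toy instance: three demands (source, destination, rate in OC-1 units)
    # groomed onto one wavelength of capacity 48 along the route A-B-C.
    demands = {"d1": ("A", "B", 12), "d2": ("A", "C", 24), "d3": ("B", "C", 36)}
    links = {("A", "B"): ["d1", "d2"], ("B", "C"): ["d2", "d3"]}
    capacity = 48

    prob = pulp.LpProblem("toy_grooming", pulp.LpMaximize)
    accept = {d: pulp.LpVariable(f"accept_{d}", cat="Binary") for d in demands}

    # Objective: maximize network throughput (carried bandwidth).
    prob += pulp.lpSum(demands[d][2] * accept[d] for d in demands)

    # Wavelength capacity constraint on every link of the route.
    for using in links.values():
        prob += pulp.lpSum(demands[d][2] * accept[d] for d in using) <= capacity

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print({d: int(accept[d].value()) for d in demands})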

Relevance: 30.00%

Abstract:

New network technologies allow us to transport ever larger volumes of information and network traffic with different priority levels. In this scenario, where a better quality of service is offered, the consequences of a failure in a link or a node become more serious. Multiprotocol Label Switching (MPLS), together with its extension to Generalized MPLS (GMPLS), provides fast failure-recovery mechanisms by establishing redundant Label Switched Paths (LSPs) to be used as backup paths. In case of failure, these paths can be used to redirect the traffic. The main objective of this thesis has been to improve some of the current MPLS/GMPLS failure-recovery mechanisms, with the aim of supporting the protection requirements of the services provided by the new Internet. For this evaluation, several protection-quality parameters have been taken into account, such as failure-recovery time, packet loss, and resource consumption. In this thesis we present a complete review and comparison of the main MPLS-based failure-recovery methods. This analysis includes the path-protection methods (global backups, reverse backups, and 1+1 protection), the local-protection methods, and the segment-protection methods. The extension of these mechanisms to optical networks through the control plane provided by GMPLS has also been taken into account. In a first phase of this work, each failure-recovery method is analyzed without considering resource or topology constraints. This analysis gives a first ranking of the best protection mechanisms in terms of packet loss and recovery time. This first analysis is not applicable to real networks. To take this new scenario into account, in a second phase we analyze routing algorithms that do consider these network limitations and constraints. Some of the main quality-of-service routing algorithms and some of the main routing proposals for MPLS networks are presented. Most current routing algorithms do not take the establishment of backup routes into account, or they use the same objectives to select the working and the protection paths. To improve the level of protection we introduce and formalize two new concepts: network failure probability and failure impact. An analysis of the network at the physical layer provides a first element for evaluating the level of protection in terms of the reliability and availability of the network. We formalize the impact of a failure as the degradation of quality of service (in terms of delay and packet loss). We explain our proposal for reducing the failure probability and the failure impact. Finally, we give a new definition and classification of network services according to the required values of failure probability and impact. One of the findings we highlight from the results of this thesis is that global path-protection mechanisms maximize network reliability, whereas local or segment-protection techniques minimize the failure impact. Minimum impact and maximum reliability can therefore be achieved by applying local protection to the whole network, but this is not a scalable proposal in terms of resource consumption.
We propose an intermediate mechanism, applying segment protection combined with our failure-probability evaluation model. In summary, this thesis presents several mechanisms for analyzing the level of protection of a network. The proposed models and mechanisms improve reliability and minimize the impact of a failure in the network.
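
As a simple, illustrative companion to the two concepts introduced above (not the thesis's formal definitions), the Python sketch below combines per-link availabilities into a path failure probability and computes a toy failure-impact score from packet loss during recovery and the added delay of the backup path:

    def path_failure_probability(link_availabilities):
        """Probability that at least one link of a path fails,
        assuming independent link failures."""
        p_all_up = 1.0
        for a in link_availabilities:
            p_all_up *= a
        return 1.0 - p_all_up

    def failure_impact(recovery_time_s, packet_rate_pps, extra_delay_s,
                       w_loss=1.0, w_delay=1.0):
        """Toy impact score: weighted packet loss during recovery plus
        the extra delay experienced on the backup path."""
        packets_lost = recovery_time_s * packet_rate_pps
        return w_loss * packets_lost + w_delay * extra_delay_s

    print(path_failure_probability([0.999, 0.9995, 0.998]))   # ~0.0035
    print(failure_impact(0.05, 10000.0, 0.002))               # loss term dominates here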

Relevance: 30.00%

Abstract:

Tropical cyclones (TC) under different climate conditions in the Northern Hemisphere have been investigated with the Max Planck Institute (MPI) coupled (ECHAM5/MPIOM) and atmosphere-only (ECHAM5) climate models. The intensity and size of the TC depend crucially on resolution, with higher wind speeds and smaller scales at the higher resolutions. The typical size of the TC is reduced by a factor of 2.3 from T63 to T319, using the distance of the maximum wind speed from the centre of the storm as a measure. The full three-dimensional structure of the storms becomes increasingly more realistic as the resolution is increased. For the T63 resolution, three ensemble runs are explored for the period 1860 to 2100 using the IPCC SRES scenario A1B and evaluated for three 30-year periods at the end of the 19th, 20th and 21st century, respectively. While there is no significant change between the 19th and the 20th century, there is a considerable reduction in the number of TC, by some 20%, in the 21st century, but no change in the number of the more intense storms. The reduction in the number of storms occurs in all regions. A single additional experiment at T213 resolution was run for the two latter 30-year periods. The T213 experiment is atmosphere-only, using the transient sea surface temperatures (SST) of the T63 experiment. Also in this case, there is a reduction of some 10% in the number of simulated TC in the 21st century compared to the 20th century, but a marked increase in the number of intense storms: the number of storms with maximum wind speeds greater than 50 m s-1 increases by a third. Most of the intensification takes place in the Eastern Pacific and in the Atlantic, where the number of storms also stays more or less the same. We identify two competing processes affecting TC in a warmer climate. First, the increase in static stability and the reduced vertical circulation are suggested to contribute to the reduction in the number of storms. Second, the increase in temperature and water vapor provides more energy for the storms, so that when favorable conditions occur, the higher SST and higher specific humidity contribute to more intense storms. As the maximum intensity depends crucially on resolution, higher resolution is required for this effect to be fully realized. The distribution of storms between different regions does not, to first approximation, depend on the temperature itself but on the distribution of the SST anomalies and their influence on the atmospheric circulation. Two additional transient experiments at T319 resolution were run for 20 years at the end of the 20th and 21st century, respectively, using the same conditions as in the T213 experiments. The results are consistent with the T213 study: the total number of tropical cyclones was similar to the T213 experiment, but the storms were generally more intense, and the change from the 20th to the 21st century was also similar, with fewer TC in total but more intense cyclones.

Relevance: 30.00%

Abstract:

This paper describes recent variations of the North Atlantic eddy-driven jet stream and analyzes the mean response of the jet to anthropogenic forcing in climate models. Jet stream changes are analyzed both using a direct measure of the near-surface westerly wind maximum and using an EOF-based approach. This allows jet stream changes to be related to the widely used leading patterns of variability: the North Atlantic Oscillation (NAO) and East Atlantic (EA) pattern. Viewed in NAO–EA state space, isolines of jet latitude and speed resemble a distorted polar coordinate system, highlighting the dependence of the jet stream quantities on both spatial patterns. Some differences in the results of the two methods are discussed, but both approaches agree on the general characteristics of the climate models. While there is some agreement between models on a poleward shift of the jet stream in response to anthropogenic forcing, there is still considerable spread between different model projections, especially in winter. Furthermore, the model responses to forcing are often weaker than their biases when compared to a reanalysis. Diagnoses of jet stream changes can be sensitive to the methodologies used, and several aspects of this are also discussed.
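
For concreteness, the direct jet diagnostic mentioned above can be reduced to reading off the latitude and speed of the maximum of the sector- and zonally-averaged low-level westerly wind; the sketch below is an assumed, simplified version of that procedure, with a synthetic wind profile standing in for reanalysis or model data:

    import numpy as np

    def jet_latitude_speed(u_low, lats):
        """Latitude and speed of the low-level westerly wind maximum.
        u_low: 1-D zonal-mean zonal wind (m/s) on the latitude grid `lats`."""
        i = int(np.argmax(u_low))
        return float(lats[i]), float(u_low[i])

    lats = np.arange(15.0, 75.0, 1.0)
    u = 10.0 * np.exp(-((lats - 45.0) / 10.0) ** 2)   # synthetic wind profile
    print(jet_latitude_speed(u, lats))                # -> (45.0, 10.0)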

Relevance: 30.00%

Abstract:

High spatial resolution environmental data give us a better understanding of the environmental factors affecting plant distributions at fine spatial scales. However, large environmental datasets dramatically increase compute times and the size of output species models, creating the need for an alternative computing solution. Cluster computing offers such a solution by allowing both multiple plant species Environmental Niche Models (ENMs) and individual tiles of high spatial resolution models to be computed concurrently on the same compute cluster. We apply our methodology to a case study of 4,209 species of Mediterranean flora (around 17% of the species believed present in the biome). We demonstrate a 16-times speed-up of ENM computation time when 16 CPUs are used on the compute cluster. Our custom Java ‘Merge’ and ‘Downsize’ programs reduce ENM output file sizes by 94%. The median test AUC score of 0.98 across species ENMs is aided by various species occurrence data filtering techniques. Finally, by calculating the percentage change of individual grid cell values, we map the projected percentages of plant species vulnerable to climate change in the Mediterranean region between 1950–2000 and 2020.
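
A minimal sketch of the final mapping step described above, assuming two gridded suitability surfaces for the baseline and future periods; the arrays, the vulnerability threshold and the variable names are illustrative, not the study's actual data or criteria:

    import numpy as np

    rng = np.random.default_rng(0)
    baseline = rng.random((100, 100))                          # ENM suitability, 1950-2000
    future = baseline * rng.uniform(0.5, 1.2, baseline.shape)  # projected suitability, 2020

    # Per-cell percentage change between the two periods.
    pct_change = 100.0 * (future - baseline) / baseline
    vulnerable = pct_change < -30.0                            # assumed threshold
    print(f"{100.0 * vulnerable.mean():.1f}% of cells flagged vulnerable")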

Relevance: 30.00%

Abstract:

Planning is one of the key problems for autonomous vehicles operating in road scenarios. Present planning algorithms operate with the assumption that traffic is organised in predefined speed lanes, which makes it impossible to allow autonomous vehicles in countries with unorganised traffic. Unorganised traffic is, however, capable of higher traffic bandwidths when the constituent vehicles vary in their speed capabilities and sizes. Diverse vehicles in unorganised traffic exhibit unique driving behaviours, which are analysed in this paper by a simulation study. The aim of the work reported here is to create a planning algorithm for mixed traffic consisting of both autonomous and non-autonomous vehicles without any inter-vehicle communication. The awareness (e.g. vision) of every vehicle is restricted to nearby vehicles only, and a straight infinite road is assumed for decision making regarding navigation in the presence of multiple vehicles. Exhibited behaviours include obstacle avoidance, overtaking, giving way for vehicles to overtake from behind, vehicle following, adjusting the lateral lane position, and so on. A conflict of plans is a major issue which will almost certainly arise in the absence of inter-vehicle communication; hence each vehicle needs to continuously track other vehicles and rectify plans whenever a collision seems likely. Further, it is observed here that driver aggression plays a vital role in overall traffic dynamics, and this has therefore been factored in accordingly. This work is hence a step forward towards achieving autonomous vehicles in unorganised traffic, while similar effort would be required for planning problems such as intersections, mergers and diversions, and for other modules like localisation.

Relevance: 30.00%

Abstract:

The planning of semi-autonomous vehicles in traffic scenarios is a relatively new problem that contributes towards the goal of making road travel by vehicles free of human drivers. An algorithm needs to ensure optimal real-time planning of multiple vehicles (moving in either direction along a road) in the presence of a complex obstacle network. Unlike other approaches, here we assume that speed lanes are not present and that different lanes do not need to be maintained for inbound and outbound traffic. Our basic hypothesis is to carry forward the planning task so as to ensure that a sufficient distance is maintained by each vehicle from all other vehicles, obstacles and road boundaries. We present here a four-layer planning algorithm consisting of road selection (selecting the individual roads of traversal to reach the goal), pathway selection (a strategy to avoid and/or overtake obstacles, road diversions and other blockages), pathway distribution (selecting the position of a vehicle at every instant of time within a pathway), and trajectory generation (generating a curve smooth enough to allow the maximum possible speed). Cooperation between vehicles is handled separately at the different levels, the aim being to maximize the separation between vehicles. Simulated results exhibit smooth, efficient and safe driving of vehicles in multiple scenarios, along with typical vehicle behaviours including following and overtaking.

Relevance: 30.00%

Abstract:

Chaotic traffic, prevalent in many countries, is marked by a large number of vehicles driving at different speeds without following any predefined speed lanes. Such traffic rules out using any planning algorithm for these vehicles that is based upon the maintenance of speed lanes and lane changes. The absence of speed lanes may imply more bandwidth and easier overtaking in cases where vehicles vary considerably in both their size and speed. Inspired by the performance of artificial potential fields in the planning of mobile robots, we propose here lateral potentials as measures that enable vehicles to decide their lateral positions on the road. Each vehicle is subjected to a potential from obstacles and vehicles in front, from road boundaries, from obstacles and vehicles to the side, and from higher-speed vehicles to the rear. All these potentials are lateral and govern only the steering of the vehicle; a speed control mechanism is used for longitudinal control. The proposed system is shown to perform well for obstacle avoidance, vehicle following and overtaking behaviors.
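
A minimal sketch of the idea, assuming simple inverse-distance repulsion terms; the functional forms, gains and geometry are placeholders rather than the paper's actual potential definitions:

    def lateral_potential(y, road_width, side_obstacles_y,
                          eps=0.5, k_boundary=1.0, k_obstacle=2.0):
        """Lateral potential at cross-road position y (measured from the left boundary)."""
        # repulsion from both road boundaries
        u = k_boundary / (y + eps) + k_boundary / (road_width - y + eps)
        # repulsion from obstacles/vehicles alongside, decaying with lateral distance
        for oy in side_obstacles_y:
            u += k_obstacle / (abs(y - oy) + eps)
        return u

    # Steer toward the lateral position with the lowest potential on a 7.5 m road
    # with one obstacle occupying the lateral position y = 2.0 m.
    candidates = [0.5 + 0.1 * i for i in range(70)]
    best_y = min(candidates, key=lambda y: lateral_potential(y, 7.5, [2.0]))
    print(round(best_y, 2))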

Relevance: 30.00%

Abstract:

Planning of autonomous vehicles in the absence of speed lanes is a less-researched problem. However, it is an important step toward extending the possibility of autonomous vehicles to countries where speed lanes are not followed. The advantages of having non-lane-oriented traffic include larger traffic bandwidth and more overtaking, features that are highlighted when vehicles vary in terms of speed and size. In the most general case, the road would be filled with a complex grid of static obstacles and vehicles of varying speeds. The optimal travel plan consists of a set of maneuvers that enables a vehicle to avoid obstacles and to overtake vehicles in an optimal manner and, in turn, enables other vehicles to overtake. The desired characteristics of this planning scenario include near completeness and near optimality in real time in an unstructured environment, with vehicles essentially displaying a high degree of cooperation and enabling every possible (safe) overtaking procedure to be completed as soon as possible. Challenges addressed in this paper include a (fast) method for initial path generation using an elastic strip, (re-)defining the notion of completeness specific to the problem, and inducing the notion of cooperation in the elastic strip. Using this approach, vehicular behaviors of overtaking, cooperation, vehicle following, obstacle avoidance, etc., are demonstrated.
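
The elastic-strip idea can be sketched as an iterative relaxation of waypoints in which an internal smoothing force pulls each waypoint toward its neighbours while obstacles within an influence radius push it away; the gains, radius and iteration count below are assumptions for illustration, not the paper's algorithm:

    def relax_elastic_strip(points, obstacles, k_internal=0.3, k_obstacle=0.8,
                            radius=2.0, iters=100):
        pts = [list(p) for p in points]
        for _ in range(iters):
            for i in range(1, len(pts) - 1):          # endpoints stay fixed
                x, y = pts[i]
                # internal (elastic) force toward the midpoint of the neighbours
                mx = 0.5 * (pts[i - 1][0] + pts[i + 1][0])
                my = 0.5 * (pts[i - 1][1] + pts[i + 1][1])
                fx, fy = k_internal * (mx - x), k_internal * (my - y)
                # repulsive force from obstacles inside the influence radius
                for ox, oy in obstacles:
                    dx, dy = x - ox, y - oy
                    d = (dx * dx + dy * dy) ** 0.5
                    if 1e-6 < d < radius:
                        fx += k_obstacle * (radius - d) * dx / d
                        fy += k_obstacle * (radius - d) * dy / d
                pts[i] = [x + fx, y + fy]
        return pts

    strip = [[float(x), 0.0] for x in range(11)]      # straight initial path
    print(relax_elastic_strip(strip, obstacles=[(5.0, 0.5)])[5])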

Relevance: 30.00%

Abstract:

Amounts of source gases with stratospheric sinks (CFCs, N2O, CH4) are affected by changes in the Brewer–Dobson circulation. Source gases and their degradation products are important for atmospheric chemistry and climate. With a simple model, we examine how the amounts and lifetimes of source gases and products depend on the speed of the circulation. Transient results differ from steady-state results, and stratospheric results differ from those for the stratosphere plus troposphere. Increases in speed increase the stratospheric burden of source gases, but reduce the products and reduce the total burdens and lifetimes of the source gases.
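
As an illustration of what such a simple model can look like (a toy two-box sketch, not the authors' model), the following Python code exchanges a source gas between a tropospheric and a stratospheric box at a rate standing in for the circulation speed, applies a stratospheric sink, and diagnoses the burden and lifetime; all rate constants are invented for the example:

    def two_box(years=300.0, dt=0.01, k_exchange=0.2, k_loss=0.02, emission=1.0):
        """k_exchange: troposphere-stratosphere exchange rate (1/yr), a stand-in
        for circulation speed; k_loss: stratospheric sink (1/yr);
        emission: surface source (amount/yr)."""
        trop, strat = 0.0, 0.0
        for _ in range(int(years / dt)):
            up, down = k_exchange * trop, k_exchange * strat
            trop += dt * (emission - up + down)
            strat += dt * (up - down - k_loss * strat)
        lifetime = (trop + strat) / (k_loss * strat)   # total burden / total loss rate
        return trop, strat, lifetime

    print(two_box(k_exchange=0.2))   # slower exchange
    print(two_box(k_exchange=0.4))   # faster exchange: smaller total burden and lifetime

In this toy version a faster exchange lowers the total burden and shortens the lifetime; the transient behaviour and the product species discussed above are beyond the sketch.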

Relevance: 30.00%

Abstract:

The current state of the art in the planning and coordination of autonomous vehicles is based upon the presence of speed lanes. In a traffic scenario where there is a large diversity between vehicles, the removal of speed lanes can generate a significantly higher traffic bandwidth. Vehicle navigation in such unorganized traffic is considered here. An evolutionary trajectory planning technique has the advantage of making driving efficient and safe; however, it also has to overcome the hurdle of computational cost. In this paper, we propose a real-time genetic algorithm with Bezier curves for trajectory planning. The main contribution is the integration of vehicle-following and overtaking behaviour for general traffic as heuristics for the coordination between vehicles. The resultant coordination strategy is fast and near-optimal. As the vehicles move, uncertainties may arise, which are constantly adapted to and may even lead to the cancellation of an overtaking procedure or the initiation of one. Higher-level planning is performed by Dijkstra's algorithm, which indicates the route to be followed by the vehicle in a road network. Re-planning is carried out when a road blockage or obstacle is detected. Experimental results confirm the success of the algorithm in high- and low-level planning, re-planning and overtaking.
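
A small sketch of the trajectory representation referred to above: a cubic Bezier curve evaluated from four control points, over which a genetic algorithm (not shown) would search; the control points and the lane-shift geometry are illustrative assumptions:

    def cubic_bezier(p0, p1, p2, p3, n=50):
        """Return n points on the cubic Bezier curve defined by control points p0..p3."""
        pts = []
        for i in range(n):
            t = i / (n - 1)
            x = ((1 - t) ** 3 * p0[0] + 3 * (1 - t) ** 2 * t * p1[0]
                 + 3 * (1 - t) * t ** 2 * p2[0] + t ** 3 * p3[0])
            y = ((1 - t) ** 3 * p0[1] + 3 * (1 - t) ** 2 * t * p1[1]
                 + 3 * (1 - t) * t ** 2 * p2[1] + t ** 3 * p3[1])
            pts.append((x, y))
        return pts

    # Candidate overtaking/lane-shift trajectory from (0, 0) to (30, 3.5);
    # a GA fitness function would score such candidates for clearance and smoothness.
    trajectory = cubic_bezier((0.0, 0.0), (10.0, 0.0), (20.0, 3.5), (30.0, 3.5))
    print(trajectory[0], trajectory[-1])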

Relevance: 30.00%

Abstract:

A statistical–dynamical downscaling (SDD) approach for the regionalization of wind energy output (Eout) over Europe with special focus on Germany is proposed. SDD uses an extended circulation weather type (CWT) analysis on global daily mean sea level pressure fields with the central point being located over Germany. Seventy-seven weather classes based on the associated CWT and the intensity of the geostrophic flow are identified. Representatives of these classes are dynamically downscaled with the regional climate model COSMO-CLM. By using weather class frequencies of different data sets, the simulated representatives are recombined to probability density functions (PDFs) of near-surface wind speed and finally to Eout of a sample wind turbine for present and future climate. This is performed for reanalysis, decadal hindcasts and long-term future projections. For evaluation purposes, results of SDD are compared to wind observations and to simulated Eout of purely dynamical downscaling (DD) methods. For the present climate, SDD is able to simulate realistic PDFs of 10-m wind speed for most stations in Germany. The resulting spatial Eout patterns are similar to DD-simulated Eout. In terms of decadal hindcasts, results of SDD are similar to DD-simulated Eout over Germany, Poland, the Czech Republic, and Benelux, for which high correlations between annual Eout time series of SDD and DD are detected for selected hindcasts. Lower correlation is found for other European countries. It is demonstrated that SDD can be used to downscale the full ensemble of the Earth System Model of the Max Planck Institute (MPI-ESM) decadal prediction system. Long-term climate change projections in the Special Report on Emissions Scenarios of ECHAM5/MPI-OM as obtained by SDD agree well with the results of other studies using DD methods, with increasing Eout over northern Europe and a negative trend over southern Europe. Despite some biases, it is concluded that SDD is an adequate tool to assess regional wind energy changes in large model ensembles.
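
The recombination step described above can be sketched as weighting class-conditional wind-speed PDFs by weather-class frequencies and folding the result with a turbine power curve to obtain Eout; the Weibull parameters, the two classes and the power curve below are illustrative stand-ins for the 77 classes and the sample turbine used in the study:

    import numpy as np

    v = np.linspace(0.0, 30.0, 301)                     # 10-m wind speed bins (m/s)

    def weibull_pdf(v, k, lam):
        return (k / lam) * (v / lam) ** (k - 1) * np.exp(-(v / lam) ** k)

    # Toy weather classes: (frequency, class-conditional wind-speed PDF).
    classes = [(0.7, weibull_pdf(v, 2.0, 6.0)),         # weak geostrophic flow
               (0.3, weibull_pdf(v, 2.2, 10.0))]        # strong geostrophic flow
    pdf = sum(freq * p for freq, p in classes)

    def power_curve(v, cut_in=3.0, rated=12.0, cut_out=25.0, p_rated=2.0e6):
        ramp = p_rated * ((v - cut_in) / (rated - cut_in)) ** 3
        p = np.where((v >= cut_in) & (v < rated), ramp, 0.0)
        return np.where((v >= rated) & (v <= cut_out), p_rated, p)

    mean_power = np.trapz(pdf * power_curve(v), v)      # W
    print(round(mean_power * 8760.0 / 1e6, 1), "MWh per year (toy Eout)")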