50 results for Traffic Flow Regimes
Abstract:
We evaluate the performance of different optimization techniques developed in the context of optical flow computation with different variational models. In particular, building on truncated Newton (TN) methods, which have been an effective approach for large-scale unconstrained optimization, we develop efficient multilevel schemes for computing the optical flow. More precisely, we compare the performance of a standard unidirectional multilevel algorithm, called multiresolution optimization (MR/OPT), with that of a bidirectional multilevel algorithm, called full multigrid optimization (FMG/OPT). The FMG/OPT algorithm treats the coarse grid correction as an optimization search direction and eventually scales it using a line search. Experimental results on different image sequences using four models of optical flow computation show that the FMG/OPT algorithm outperforms both the TN and MR/OPT algorithms in terms of computational work and the quality of the optical flow estimation.
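As a minimal sketch of the idea this abstract refers to (a coarse-grid correction used as a search direction and scaled by a line search), the following toy example works on a 1D periodic quadratic; the objective, the transfer operators and the step sizes are illustrative assumptions, not the paper's implementation.

```python
# Sketch, assuming a toy SPD quadratic: the coarse-grid correction is prolonged
# to the fine grid and treated as a search direction whose step length is chosen
# by a backtracking (Armijo) line search.
import numpy as np

def apply_A(u):
    # shifted periodic 1D Laplacian, symmetric positive definite
    return 3.0 * u - np.roll(u, 1) - np.roll(u, -1)

def f(u, b):
    """Quadratic objective 0.5*u'Au - b'u."""
    return 0.5 * u @ apply_A(u) - b @ u

def grad(u, b):
    return apply_A(u) - b

def restrict(v):   # fine -> coarse (simple injection)
    return v[::2]

def prolong(v):    # coarse -> fine (piecewise-constant interpolation)
    return np.repeat(v, 2)

def coarse_correction(u, b, iters=200, lr=0.1):
    """Approximately minimize the coarse version of the problem and return
    the prolonged correction as a fine-grid search direction."""
    uc, bc = restrict(u), restrict(b)
    vc = uc.copy()
    for _ in range(iters):
        vc -= lr * grad(vc, bc)
    return prolong(vc - uc)

def line_search(u, b, d, alpha=1.0, beta=0.5, c=1e-4):
    """Backtracking line search along direction d (returns a tiny step if d is not a descent direction)."""
    g = grad(u, b)
    while f(u + alpha * d, b) > f(u, b) + c * alpha * (g @ d):
        alpha *= beta
        if alpha < 1e-8:
            break
    return alpha

rng = np.random.default_rng(0)
n = 64
b = rng.standard_normal(n)
u = np.zeros(n)
for _ in range(10):
    d = coarse_correction(u, b)
    u += line_search(u, b, d) * d     # coarse correction as a scaled search direction
    u -= 0.1 * grad(u, b)             # a fine-grid smoothing/optimization step
print("objective:", f(u, b))
```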
Abstract:
The work in this paper deals with the development of momentum and thermal boundary layers when a power law fluid flows over a flat plate. At the plate we impose either constant temperature, constant flux or a Newton cooling condition. The problem is analysed using similarity solutions, integral momentum and energy equations and an approximation technique which is a form of the Heat Balance Integral Method. The fluid properties are assumed to be independent of temperature, hence the momentum equation uncouples from the thermal problem. We first derive the similarity equations for the velocity and present exact solutions for the case where the power law index n = 2. The similarity solutions are used to validate the new approximation method. This new technique is then applied to the thermal boundary layer, where a similarity solution can only be obtained for the case n = 1.
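For orientation only, and not reproducing the paper's non-dimensionalisation: the standard boundary-layer equations for a power law fluid over a flat plate with temperature-independent properties (so the momentum problem uncouples from the thermal one) take the form below, where m is the consistency index, ρ the density and α the thermal diffusivity.

```latex
u\,\frac{\partial u}{\partial x} + v\,\frac{\partial u}{\partial y}
  = \frac{m}{\rho}\,\frac{\partial}{\partial y}\!\left(
      \left|\frac{\partial u}{\partial y}\right|^{n-1}
      \frac{\partial u}{\partial y}\right),
\qquad
\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} = 0,
\qquad
u\,\frac{\partial T}{\partial x} + v\,\frac{\partial T}{\partial y}
  = \alpha\,\frac{\partial^{2} T}{\partial y^{2}}.
```

The velocity problem admits a similarity variable of the form η ∝ y x^{-1/(n+1)}, while, as the abstract notes, a similarity reduction of the thermal problem is available only in the Newtonian case n = 1.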
Abstract:
Systematic asymptotic methods are used to formulate a model for the extensional flow of a thin sheet of nematic liquid crystal. With no external body forces applied, the model is found to be equivalent to the so-called Trouton model for Newtonian sheets (and fibers), albeit with a modified "Trouton ratio". However, with a symmetry-breaking electric field gradient applied, the behavior deviates from the Newtonian case, and the sheet can undergo finite-time breakup if a suitable destabilizing field is applied. Some simple exact solutions are presented to illustrate the results in certain idealized limits, as well as sample numerical results for the full model equations.
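For reference (a standard fact, not taken from the paper), the classical Trouton ratio relating extensional to shear viscosity for a Newtonian fluid is

```latex
\mathrm{Tr} \;=\; \frac{\eta_E}{\eta} \;=\;
\begin{cases}
3, & \text{uniaxial extension (fibers)},\\
4, & \text{planar extension (sheets)},
\end{cases}
```

and the "modified Trouton ratio" mentioned above plays the analogous role for the nematic sheet model.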
Abstract:
TCP flows from applications such as the web or ftp are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose some mechanisms for the interconnection with neighboring domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class is assigned a different discarding priority in the queues. The AC method involves only edge nodes and uses a special probing packet flow (marked with the highest discarding priority class) that is sent continuously from ingress to egress through a path. The available throughput in the path is obtained at the egress using measurements of flow aggregates and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then admission controlled. If it is accepted, it receives the GMTS and its packets are marked with the lowest discarding priority class; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
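A hedged sketch of the ingress-side admission decision described above; the names and the simple reservation bookkeeping are illustrative assumptions, not the paper's mechanism.

```python
# Sketch: admit a newly detected TCP flow if the throughput measured at the
# egress (and reported back to the ingress) covers its guaranteed minimum;
# otherwise the flow is served best-effort.
from dataclasses import dataclass

@dataclass
class IngressState:
    available_throughput: float   # measured at the egress and fed back (bit/s)
    reserved: float = 0.0         # throughput promised to already admitted flows

def admit(state: IngressState, requested_min: float) -> str:
    spare = state.available_throughput - state.reserved
    if spare >= requested_min:
        state.reserved += requested_min
        return "GMTS"          # packets marked with the lowest discarding priority class
    return "best-effort"       # no guarantee, normal forwarding

state = IngressState(available_throughput=10e6)
print(admit(state, 2e6))   # -> GMTS
print(admit(state, 9e6))   # -> best-effort
```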
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP, which implies that the most restrictive QOS requirements must be applied to all services. Therefore, link utilization will be decreased because an unnecessarily stringent QOS is provided to all connections. With the segregation approach, the problem can be much simplified if different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). However, resources may not be efficiently utilized because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
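As a rough illustration of a convolution-based estimate of the probability of congestion, the following sketch convolves per-connection bandwidth distributions and reads off the probability that the aggregate rate exceeds the VP capacity; the on/off source model and all numbers are assumptions made for the example, not the paper's traffic model.

```python
# Sketch: PC = P(sum of instantaneous connection rates > VP capacity),
# obtained by convolving discrete per-connection rate distributions.
import numpy as np

def onoff_dist(peak_mbps, mean_mbps, unit=1.0):
    """Discrete rate distribution of an on/off source on a grid of `unit` Mbps."""
    p_on = mean_mbps / peak_mbps
    dist = np.zeros(int(peak_mbps / unit) + 1)
    dist[0] = 1.0 - p_on      # silent
    dist[-1] = p_on           # transmitting at peak rate
    return dist

def congestion_probability(sources, capacity_mbps, unit=1.0):
    agg = np.array([1.0])
    for dist in sources:
        agg = np.convolve(agg, dist)          # aggregate rate distribution
    rates = np.arange(agg.size) * unit
    return agg[rates > capacity_mbps].sum()

# e.g. 20 bursty connections (peak 2 Mbps, mean 0.5 Mbps) sharing a 20 Mbps VP
sources = [onoff_dist(2.0, 0.5) for _ in range(20)]
print(congestion_probability(sources, capacity_mbps=20.0))
```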
Abstract:
The automatic interpretation of conventional traffic signs is very complex and time consuming. The paper concerns an automatic warning system for driving assistance. It does not interpret the standard traffic signs on the roadside; the proposal is to incorporate into the existing signage another type of traffic sign whose information can be more easily interpreted by a processor. The information to be added is profuse, and therefore the most important objective is the robustness of the system. The basic proposal of this new philosophy is that the co-pilot system for automatic warning and driving assistance can more easily interpret the information contained in the new sign, whilst the human driver only has to interpret the "classic" sign. One of the codings that has been tested with good results, and which seems easy to implement, is a rectangular sign with four vertical bars of different colours. The size of these signs is equivalent to that of the conventional signs (approximately 0.4 m2). The colour information from the sign can be easily interpreted by the proposed processor, and this interpretation is much easier and quicker than that of the information shown by the pictographs of the classic signs.
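A purely illustrative sketch of how a processor might read such a sign once the four colour bars have been segmented; the colour set, its ordering, and the mapping to an integer code are assumptions for the example, not the coding proposed in the paper.

```python
# Sketch: interpret four vertical colour bars, read left to right,
# as a base-len(COLOURS) number identifying the sign's message.
COLOURS = ["red", "green", "blue", "yellow", "white", "black"]

def decode_sign(bars):
    if len(bars) != 4:
        raise ValueError("expected exactly 4 bars")
    code = 0
    for colour in bars:
        code = code * len(COLOURS) + COLOURS.index(colour)
    return code

print(decode_sign(["red", "blue", "blue", "yellow"]))   # one of 6**4 = 1296 codes
```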
Abstract:
In previous work we proposed a multi-objective traffic engineering scheme (the MHDB-S model) that uses different distribution trees to multicast several flows. In this paper, we propose a heuristic algorithm to create multiple point-to-multipoint (p2mp) LSPs based on the optimum sub-flow values obtained with our MHDB-S model. Moreover, a general problem for supporting multicasting in MPLS networks is the lack of labels. To reduce the number of labels used, a label space reduction algorithm is also considered.
Abstract:
The power of the State and traditional sovereignty are steadily deteriorating, above all in terms of the provision of certain fundamental public goods. States, in particular, are unable to manage the knowledge and information that is essential for maintaining competitiveness and sustainability in an interdependent economy. Reliable structures of global governance and international cooperation are far from being established. Energy, as an issue on the agendas of governments, private companies and civil society, is a clear example of this dynamic. The current system of global energy governance involves scattered policy actions by diverse actors. The International Energy Agency plays a prominent role, but it is weakened by its limited membership and by its knowledge-based, epistemic character rather than material or executive power. This paper argues that it is neither size nor available membership that is hampering global energy governance. Rather, energy is a set of public goods left in limbo, where states cannot afford their provision and diverse interests prevent the establishment of an international authority. After introducing international regime theory and the concept of knowledge-based epistemic communities, the article reviews the current state of global energy governance. It then compares this structure with national and regional governance regimes, on the one hand, and with global environmental and health regimes, on the other.
Abstract:
Several airline consolidation events have recently been completed both in Europe and in the United States. The model we develop considers two airlines operating hub-and-spoke networks, using different hubs to connect the same spoke airports. We assume the airlines to be vertically differentiated, which allows us to distinguish between primary and secondary hubs. We conclude that this differentiation in air services becomes more accentuated after consolidation, with an increased number of flights being channeled through the primary hub. However, congestion can act as a brake on the concentration of flight frequency in the primary hub following consolidation. Our empirical application involves an analysis of Delta's network following its merger with Northwest. We find evidence consistent with an increase in the importance of Delta's primary hubs at the expense of its secondary airports. We also find some evidence suggesting that the carrier chooses to divert traffic away from those hub airports that were more prone to delays prior to the merger, in particular New York's JFK airport. Keywords: primary hub; secondary hub; airport congestion; airline consolidation; airline networks. JEL Classification Numbers: D43; L13; L40; L93; R4.
Abstract:
This study explores whether firms have differential price-earnings multiples associated with their means of achieving a sequential pattern of increasing positive earnings. Our main findings show that market participants assign higher price-earnings multiples to firms when their pattern of increasing earnings is supported by the same pattern of increasing cash flows. Market participants assign lower price-earnings multiples to firms suspected of having engaged in accrual-based earnings management, sales manipulation, and overproduction to achieve the earnings pattern. We find, however, that market participants do not penalize firms suspected of having achieved the earnings pattern through the opportunistic reduction of discretionary expenses.
Abstract:
Background: Few studies have used longitudinal ultrasound measurements to assess the effect of traffic-related air pollution on fetal growth. Objective: We examined the relationship between exposure to nitrogen dioxide (NO2) and aromatic hydrocarbons [benzene, toluene, ethylbenzene, m/p-xylene, and o-xylene (BTEX)] and fetal growth assessed by 1,692 ultrasound measurements among 562 pregnant women from the Sabadell cohort of the Spanish INMA (Environment and Childhood) study. Methods: We used temporally adjusted land-use regression models to estimate exposures to NO2 and BTEX. We fitted mixed-effects models to estimate longitudinal growth curves for femur length (FL), head circumference (HC), abdominal circumference (AC), biparietal diameter (BPD), and estimated fetal weight (EFW). Unconditional and conditional SD scores were calculated at 12, 20, and 32 weeks of gestation. Sensitivity analyses were performed considering time–activity patterns during pregnancy. Results: Exposure to BTEX from early pregnancy was negatively associated with growth in BPD during weeks 20–32. None of the other fetal growth parameters were associated with exposure to air pollution during pregnancy. When considering only women who spent 2 hr/day in nonresidential outdoor locations, effect estimates were stronger and statistically significant for the association between NO2 and growth in HC during weeks 12–20 and growth in AC, BPD, and EFW during weeks 20–32. Conclusions: Our results lend some support to an effect of exposure to traffic-related air pollutants from early pregnancy on fetal growth during mid-pregnancy.
Abstract:
The identification and integration of reusable and customizable CSCL (Computer Supported Collaborative Learning) tools may benefit from the capture of best practices in collaborative learning structuring. The authors have proposed CLFPs (Collaborative Learning Flow Patterns) as a way of collecting these best practices. To facilitate the processing of CLFPs by software systems, the paper proposes to specify these patterns using IMS Learning Design (IMS-LD). Thus, teachers without technical knowledge can particularize and integrate CSCL tools. Nevertheless, the support of IMS-LD for describing collaborative learning activities has some deficiencies: the collaborative tools that can be defined in these activities are limited. Thus, this paper proposes and discusses an extension to IMS-LD that makes it possible to specify several characteristics of the use of tools that mediate collaboration. In order to obtain a Unit of Learning based on a CLFP, a three-stage process is also proposed. A CLFP-based Unit of Learning example is used to illustrate the process and the need for the proposed extension.
Per-antenna rate and power control for MIMO layered architectures in the low- and high-power regimes
Abstract:
In a MIMO layered architecture, several codewords are transmitted from a multiplicity of antennas. Although the spectral efficiency is maximized if the rates of these codewords are separately controlled, the feedback rate within the link adaptation loop is reduced if they are constrained to be identical. This poses a direct tradeoff between performance and feedback overhead. This paper provides analytical expressions that quantify the difference in spectral efficiency between both approaches for arbitrary numbers of antennas. Specifically, the characterization takes place in the realm of the low- and high-power regimes via expansions that are shown to have a wide range of validity. In addition, the possibility of adjusting the transmit power of each codeword individually is considered as an alternative to the separate control of their rates. Power allocation, however, turns out to be inferior to rate control within the context of this problem.
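A hedged numerical illustration of the tradeoff discussed above (not the paper's analysis): with separate rate control each layer is loaded according to its own SINR, whereas an identical-rate constraint is limited by the weakest layer. The zero-forcing per-layer SINR model and the specific channel draw are assumptions made for the example.

```python
# Sketch: spectral efficiency with per-codeword rates vs. a common rate,
# for a random 4x4 MIMO channel with a zero-forcing receiver, in the
# low- and high-power regimes.
import numpy as np

def layer_sinrs_zf(H, snr):
    """Per-layer post-processing SINRs for a zero-forcing receiver,
    equal power snr/nt per transmit antenna, unit noise variance."""
    nt = H.shape[1]
    inv = np.linalg.inv(H.conj().T @ H)
    return (snr / nt) / np.real(np.diag(inv))

rng = np.random.default_rng(1)
nt = nr = 4
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

for snr_db in (-5.0, 20.0):                      # low- and high-power regimes
    snr = 10 ** (snr_db / 10)
    sinrs = layer_sinrs_zf(H, snr)
    separate = np.sum(np.log2(1 + sinrs))        # rates controlled per codeword
    common = nt * np.log2(1 + sinrs.min())       # identical-rate constraint
    print(f"{snr_db:+.0f} dB: separate {separate:.2f}  common {common:.2f} bit/s/Hz")
```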
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of the initial solution that generates different alternative starting solutions of similar quality, attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic by simply incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
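The following is a minimal, generic ILS skeleton for the PFSP in the spirit of the framework described above; the adjacent-swap local search, the reinsertion perturbation, the acceptance rule and the random instance are plain illustrative choices, not the ILS-ESP operators or its biased-randomized construction of the initial solution.

```python
# Sketch: Iterated Local Search for the permutation flowshop problem
# (minimize makespan). Assumes p[j][m] = processing time of job j on machine m.
import random

def makespan(perm, p):
    """Completion time of the last job on the last machine."""
    m = len(p[0])
    c = [0.0] * m
    for j in perm:
        c[0] += p[j][0]
        for k in range(1, m):
            c[k] = max(c[k], c[k - 1]) + p[j][k]
    return c[-1]

def local_search(perm, p):
    """First-improvement search over adjacent swaps."""
    improved = True
    while improved:
        improved = False
        for i in range(len(perm) - 1):
            cand = perm[:]
            cand[i], cand[i + 1] = cand[i + 1], cand[i]
            if makespan(cand, p) < makespan(perm, p):
                perm, improved = cand, True
    return perm

def perturb(perm):
    """Remove two random jobs and reinsert them at random positions."""
    perm = perm[:]
    for _ in range(2):
        j = perm.pop(random.randrange(len(perm)))
        perm.insert(random.randrange(len(perm) + 1), j)
    return perm

def ils(p, iters=200, seed=0):
    random.seed(seed)
    best = local_search(list(range(len(p))), p)
    cur = best
    for _ in range(iters):
        cand = local_search(perturb(cur), p)
        if makespan(cand, p) <= makespan(cur, p):   # simple acceptance rule
            cur = cand
        if makespan(cur, p) < makespan(best, p):
            best = cur
    return best, makespan(best, p)

# tiny random instance: 10 jobs x 5 machines
random.seed(42)
p = [[random.randint(1, 20) for _ in range(5)] for _ in range(10)]
print(ils(p))
```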