917 results for traffic engineering computing
Abstract:
The macroscopic fundamental diagram (MFD) traffic modelling method has been validated for large urban road networks and freeway networks, but hysteresis and scatter have been observed in both. This paper investigates how incident variables affect the shape and scatter of the MFD, using both simulated data and real data collected from the M3 Pacific motorway in Brisbane, Australia. Three key components of incidents are investigated with the simulated data: incident location, incident duration and traffic demand. The simulation results indicate that the diagram shape is a property not only of the network itself but also of the incident variables. Diagrams for three types of real incidents (crash, hazard and vehicle breakdown) are explored separately, and the results based on the empirical data are consistent with the simulated results. The hysteresis phenomenon occurs both upstream and downstream of the incident location, but with opposite hysteresis loops. The gradient of the upstream diagram is greater than that of the downstream diagram at the incident site when traffic demand corresponds to an off-peak period.
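For readers unfamiliar with the MFD, each point of the diagram is a network-wide average of flow and density over one time interval. A minimal sketch of that aggregation step (illustrative only; the detector values and segment lengths below are hypothetical, not the Brisbane data):

```python
# Hedged sketch: one scatter point of a macroscopic fundamental diagram
# (MFD) is a length-weighted network average of flow and density over a
# time interval. All numbers below are hypothetical.
def mfd_point(flows, densities, lengths):
    """flows in veh/h, densities in veh/km, lengths in km (weights)."""
    total = sum(lengths)
    q = sum(f * l for f, l in zip(flows, lengths)) / total      # avg flow
    k = sum(d * l for d, l in zip(densities, lengths)) / total  # avg density
    return q, k

# Three hypothetical detector segments: free-flow, moderate, congested
q, k = mfd_point([1800, 1200, 600], [20, 35, 80], [1.0, 0.5, 0.8])
```

Plotting many such (k, q) points over a day traces the diagram; hysteresis appears when the loading and recovery phases of an incident follow different branches.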
Abstract:
Designed for undergraduate and postgraduate students, academic researchers and industrial practitioners, this book provides comprehensive case studies on numerical computing of industrial processes and step-by-step procedures for conducting industrial computing. It assumes minimal knowledge in numerical computing and computer programming, making it easy to read, understand and follow. Topics discussed include fundamentals of industrial computing, finite difference methods, the Wavelet-Collocation Method, the Wavelet-Galerkin Method, High Resolution Methods, and comparative studies of various methods. These are discussed using examples of carefully selected models from real processes of industrial significance. The step-by-step procedures in all these case studies can be easily applied to other industrial processes without a need for major changes and thus provide readers with useful frameworks for the applications of engineering computing in fundamental research problems and practical development scenarios.
Abstract:
This paper reviews the use of multi-agent systems (MAS) to model the impacts of high levels of photovoltaic (PV) system penetration in distribution networks, and presents some preliminary data obtained from the Perth Solar City high-penetration PV trial. The trial consists of a low-voltage distribution feeder supplying 75 customers, 29 of whom have rooftop photovoltaic systems. Data are collected from smart meters at each consumer's premises, from data loggers on the low-voltage (LV) side of the transformer, and from a nearby distribution network SCADA measurement point on the high-voltage (HV) side of the transformer. The data will be used to progressively develop MAS models.
Abstract:
A mathematical model for the steady flow of a non-Newtonian fluid through a stenotic region is presented. The results indicate that the general shape and size of the stenosis, together with the rheological properties of blood, are important in understanding the flow characteristics and the presence of flow separation.
Abstract:
A complete solution to the fundamental problem of delineation of an ECG signal into its component waves by filtering the discrete Fourier transform of the signal is presented. The set of samples in a component wave is transformed into a complex sequence with a distinct frequency band. The filter characteristics are determined from the time signal itself. Multiplication of the transformed signal with a complex sinusoidal function allows the use of a bank of low-pass filters for the delineation of all component waves. Data from about 300 beats have been analysed and the results are highly satisfactory both qualitatively and quantitatively.
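The shift-to-baseband-then-low-pass idea described above can be sketched as follows. This is a hedged illustration of the general technique, not the paper's exact filter bank; the sampling rate, centre frequency and cut-off are assumed values:

```python
import numpy as np

# Hedged sketch: isolate one frequency band of an ECG-like signal by
# multiplying with a complex sinusoid (shifting the band to 0 Hz) and
# applying a single low-pass filter in the DFT domain. Parameter values
# are assumptions, not from the paper.
def extract_band(x, fs, f_center, f_cut):
    n = len(x)
    t = np.arange(n) / fs
    shifted = x * np.exp(-2j * np.pi * f_center * t)  # band -> baseband
    X = np.fft.fft(shifted)
    freqs = np.fft.fftfreq(n, d=1 / fs)               # bin frequencies, Hz
    X[np.abs(freqs) > f_cut] = 0                      # ideal low-pass
    return np.fft.ifft(X)                             # complex baseband signal
```

A component at `f_center` survives the filter while components outside `f_center ± f_cut` are rejected, so a bank of such low-pass stages, one per component wave, can separate the waves.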
Abstract:
One of the main disturbances in EEG signals is EMG artefacts generated by muscle movements. In this paper, a linear-phase FIR digital low-pass filter with finite-wordlength-precision coefficients, designed using the compensation procedure, is proposed to minimise EMG artefacts in contaminated EEG signals. To make the filtering more effective, different structures are used in addition to simple low-pass filtering, i.e. cascading, twicing and sharpening of the designed FIR filter. Modifications are proposed to the twicing and sharpening structures to regain the linear-phase characteristics that are lost in conventional twicing and sharpening operations. The efficacy of all these transformed filters in minimising EMG artefacts is studied, using SNR improvement as a performance measure for simulated signals. Time plots of the signals are also compared. The studies show that the modified sharpening structure is superior in performance to all the other proposed methods. These algorithms have also been applied to a real recorded EMG-contaminated EEG signal. Comparison of time plots, and also of the output SNR, shows that the proposed modified sharpened structure works better in minimising EMG artefacts than the other methods considered.
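The twicing and sharpening operations mentioned above are classical amplitude-change functions: twicing replaces the filter response H by H(2 − H), and sharpening by H²(3 − 2H). A hedged sketch (illustrative, not the paper's exact design) for an odd-length linear-phase FIR filter, with the delay compensation that keeps the combined filter linear-phase:

```python
import numpy as np

# Hedged sketch of twicing H(2-H) and sharpening 3H^2 - 2H^3 for an
# odd-length linear-phase FIR filter h. Each term is delayed so its
# group delay matches the highest-order term; this alignment is what
# preserves the linear phase of the combined filter.
def twicing(h):
    N = len(h)              # assumed odd
    d = (N - 1) // 2
    out = -np.convolve(h, h)        # -H^2, group delay N-1
    out[d:d + N] += 2.0 * h         # +2H, delayed by d to match
    return out

def sharpening(h):
    N = len(h)              # assumed odd
    d = (N - 1) // 2
    h2 = np.convolve(h, h)
    h3 = np.convolve(h2, h)         # H^3, group delay 3(N-1)/2
    out = -2.0 * h3
    out[d:d + len(h2)] += 3.0 * h2  # +3H^2, delayed by d to match
    return out
```

For a low-pass prototype, sharpening flattens the passband and deepens the stopband simultaneously, which is consistent with the paper's finding that the sharpened structure performs best.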
Teracluster LSSC-II - Its Designing Principles and Applications in Large Scale Numerical Simulations
Abstract:
The teracluster LSSC-II, installed at the State Key Laboratory of Scientific and Engineering Computing, Chinese Academy of Sciences, is one of the most powerful PC clusters in China. It has a peak performance of 2 Tflops. With a Linpack performance of 1.04 Tflops, it was ranked 43rd in the 20th TOP500 list (November 2002), 51st in the 21st TOP500 list (June 2003), and 82nd in the 22nd TOP500 list (November 2003) with a new Linpack performance of 1.3 Tflops. In this paper, we present some design principles of this cluster, as well as its applications in some large-scale numerical simulations.
Abstract:
Basic research related to heavy-ion cancer therapy has been conducted at the Institute of Modern Physics (IMP), Chinese Academy of Sciences since 1995, and a plan for clinical trials with heavy ions has now been launched at IMP. First, treatment of superficially placed tumors with heavy ions is expected at the therapy terminal of the Heavy Ion Research Facility in Lanzhou (HIRFL), where carbon-ion beams with energies up to 100 MeV/u can be supplied. The shallow-seated tumor therapy terminal at HIRFL is equipped with a passive beam delivery system comprising two orthogonal dipole magnets, which continuously scan pencil beams laterally to generate a broad and uniform irradiation field, a motor-driven energy degrader and a multi-leaf collimator. Two different types of range modulator, a ripple filter and a ridge filter, with which Gaussian-shaped physical-dose and uniform biological-effective-dose Bragg peaks, respectively, can be shaped for therapeutic ion beams, have been designed and manufactured. Therefore, two-dimensional and three-dimensional conformal irradiation of tumors can be performed with the passive beam delivery system at this early therapy terminal. Both conformal irradiation methods have been verified experimentally, and carbon-ion conformal irradiation of patients with superficially placed tumors has been carried out at HIRFL since November 2006.
Abstract:
Seepage control in karstic rock masses is one of the most important problems in domestic hydroelectric, mining and traffic engineering. At present, permeability assessment and leakage analysis of multi-layer karstic rock masses are mainly qualitative and seldom quantitative. Quantitative analyses of the permeability coefficient and seepage amount are conducted in this report, providing a theoretical basis for the study of seepage laws and the seepage control treatment of karstic rocks. Based on field measurements in the horizontal grouting galleries of the seepage control curtains on the left bank of the Shuibuya Hydropower Project on the Qingjiang river, a hydraulic model is established, and the computation results will provide a scientific basis for optimization of the grouting curtain engineering. The following issues are addressed in the report. (1) Based on in-situ measurements of fissures and karstic cavities in the grouting galleries, the characteristics of the karstic rock mass are analyzed and a stochastic structural model of karstic rock masses is set up, providing the basis for calculating the permeability and leakage amount of the karstic rock mass. (2) According to the distribution of the joints measured in the grouting galleries and the results obtained from the stochastic structural model of the karstic rock mass between grouting galleries, a formula for computing the permeability tensor of the fracture system is set up, and a computation program is written in the Visual Basic language. The computation results will be helpful for zoning of fissured rock masses, calculation of seepage amounts and optimization of seepage control curtains. (3) Fractal theory is used to describe quantitatively the roughness of the conduit walls of karstic systems and the sinuosity of karstic conduits. It is proposed that the roughness coefficient of karstic caves can be expressed by two fractal dimensions, Ds and Dr, which represent the extension sinuosity of karstic caves and the roughness of the conduit walls, respectively. The existing formula for calculating the seepage amount of karstic conduits is revised and programmed. The seepage amount of the rock masses in the measured grouting galleries is estimated under the condition that no seepage control measures are taken before reservoir impoundment, and the results will be helpful for the design and construction optimization of the seepage curtains of the Shuibuya hydropower project. This report is one part of the subject "Karstic hydrogeology and the structural model and seepage hydraulics of karstic rock masses", a sub-program of "Study on seepage hydraulics of multi-layer karstic rock masses and its application in seepage control curtain engineering", which is financially supported by the Hubei Provincial key science and technology programme.
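The report's own permeability-tensor formula is not reproduced in the abstract. As a hedged illustration of the kind of computation involved, the classical parallel-plate ("cubic law") conductivity tensor for a set of fracture families (Snow, 1969) can be sketched as follows; all parameter values are hypothetical:

```python
import numpy as np

# Hedged illustration (not the report's actual formula): the classical
# fracture-set hydraulic conductivity tensor
#   K = sum_i (g * b_i^3 / (12 * nu * s_i)) * (I - n_i n_i^T)
# where b_i is aperture (m), s_i spacing (m), n_i the unit normal of
# fracture family i, g gravity and nu kinematic viscosity of water.
def fracture_conductivity(apertures, spacings, normals, g=9.81, nu=1.0e-6):
    K = np.zeros((3, 3))
    for b, s, n in zip(apertures, spacings, normals):
        n = np.asarray(n, float)
        n /= np.linalg.norm(n)               # ensure unit normal
        K += (g * b ** 3) / (12 * nu * s) * (np.eye(3) - np.outer(n, n))
    return K                                 # m/s

# One horizontal fracture family (normal along z), hypothetical values
K = fracture_conductivity([1e-4], [1.0], [[0.0, 0.0, 1.0]])
```

Each family contributes conductivity only in its own plane, which is why the tensor is anisotropic and why zoning by measured joint orientations matters.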
Abstract:
Recent measurement-based studies reveal that most Internet connections are short in terms of the amount of traffic they carry (mice), while a small fraction of connections carry a large portion of the traffic (elephants). A careful study of the TCP protocol shows that, without help from an Active Queue Management (AQM) policy, short connections tend to lose to long connections in their competition for bandwidth. This is because short connections do not gain detailed knowledge of the network state and are therefore doomed to be less competitive, owing to the conservative nature of the TCP congestion control algorithm. Inspired by the Differentiated Services (Diffserv) architecture, we propose to give preferential treatment to short connections inside the bottleneck queue, so that short connections experience a lower packet drop rate than long connections. This is done by employing the RIO (RED with In and Out) queue management policy, which uses different drop functions for different classes of traffic. Our simulation results show that: (1) in a highly loaded network, preferential treatment is necessary to provide short TCP connections with better response times and fairness without hurting the performance of long TCP connections; (2) the proposed scheme still delivers packets in FIFO order at each link, so it maintains the statistical multiplexing gain and does not reorder packets; (3) choosing a smaller default initial timeout value for TCP can help enhance the performance of short TCP flows, though not as effectively as our scheme and at the risk of congestion collapse; (4) in the worst case, our proposal works as well as a regular RED scheme in terms of response time and goodput.
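The core of the RIO idea is simply two RED drop curves: a lenient one for the preferred ("In") class and an aggressive one for the other ("Out") class. A minimal sketch, with illustrative threshold values that are assumptions rather than the paper's parameters:

```python
# Hedged sketch of RIO (RED with In and Out): short-flow packets see a
# gentler RED drop curve than long-flow packets. Threshold and max_p
# values are illustrative assumptions, not from the paper.
def red_drop_prob(avg_q, min_th, max_th, max_p):
    """Classic RED ramp: 0 below min_th, linear up to max_p, 1 above max_th."""
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return max_p * (avg_q - min_th) / (max_th - min_th)

def rio_drop_prob(avg_q, is_short):
    if is_short:   # "In" class: higher thresholds, gentler slope
        return red_drop_prob(avg_q, min_th=40, max_th=70, max_p=0.02)
    else:          # "Out" class: dropped earlier and more aggressively
        return red_drop_prob(avg_q, min_th=10, max_th=30, max_p=0.10)
```

Because the choice only affects the drop decision, packets of both classes still leave the queue in FIFO order, matching point (2) above.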
Abstract:
We present a thorough characterization of the access patterns in blogspace, which comprises a rich interconnected web of blog postings and comments by an increasingly prominent user community that collectively defines what has become known as the blogosphere. Our characterization of over 35 million read, write, and management requests spanning a 28-day period is done at three different levels. The user view characterizes how individual users interact with blogosphere objects (blogs); the object view characterizes how individual blogs are accessed; the server view characterizes the aggregate access patterns of all users to all blogs. The more interactive nature of the blogosphere leads to interesting traffic and communication patterns, which differ from those observed for traditional web content. We identify and characterize novel features of the blogosphere workload, and we show the similarities and differences between typical web server workloads and blogosphere server workloads. Finally, based on our main characterization results, we build a new synthetic blogosphere workload generator called GBLOT, which aims to closely mimic a stream of requests originating from a population of blog users. Given the increasing share of blogspace traffic, realistic workload models and tools are important for capacity planning and traffic engineering purposes.