918 results for macroscopic traffic flow models
Abstract:
The Long Term Evolution (LTE) cellular technology is expected to extend the capacity and improve the performance of current 3G cellular networks. Among the key mechanisms in LTE responsible for traffic management is the packet scheduler, which handles the allocation of resources to active flows in both the frequency and time dimensions. This paper investigates how various scheduling schemes affect the inter-cell interference characteristics and how the interference in turn affects the user's performance. A special focus of the analysis is the impact of flow-level dynamics resulting from random user behaviour. For this we use a hybrid analytical/simulation approach which enables fast evaluation of flow-level performance measures. Most interestingly, our findings show that the scheduling policy significantly affects the inter-cell interference pattern, but that the scheduler-specific pattern has little impact on flow-level performance.
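The flow-level dynamics mentioned above are classically abstracted as a processor-sharing queue, in which the cell capacity is shared equally among randomly arriving and departing flows. The sketch below is our illustration of that textbook abstraction, not the paper's hybrid analytical/simulation method.

# Classic M/G/1 processor-sharing model of flow-level performance in a cell.
# Flows arrive at arrival_rate (flows/s) with mean size mean_size (Mbit);
# the cell capacity cap (Mbit/s) is shared equally among active flows.
def ps_mean_transfer_time(arrival_rate, mean_size, cap):
    rho = arrival_rate * mean_size / cap      # offered load
    if rho >= 1:
        raise ValueError("unstable: offered load exceeds capacity")
    return mean_size / (cap * (1 - rho))      # mean flow transfer time

# Example: 0.5 flows/s of 10-Mbit flows on a 20-Mbit/s cell (rho = 0.25)
print(ps_mean_transfer_time(0.5, 10.0, 20.0))  # ~0.67 s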
Abstract:
Long Term Evolution (LTE) is a cellular technology foreseen to extend the capacity and improve the performance of current 3G cellular networks. A key mechanism in LTE traffic handling is the packet scheduler, which is in charge of allocating resources to active flows in both the frequency and time dimensions. In this paper we present a performance comparison of three distinct scheduling schemes for the LTE uplink, with a main focus on the impact of flow-level dynamics resulting from random user behaviour. We apply a combined analytical/simulation approach which enables fast evaluation of flow-level performance measures. The results show that by considering flow-level dynamics we are able to observe performance trends that would otherwise stay hidden if only a packet-level analysis were performed.
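As a concrete illustration of the kind of scheduling scheme compared in such studies, here is a minimal proportional-fair (PF) scheduler assigning resource blocks (RBs) each transmission time interval (TTI). The paper does not name its three schemes, so all names and parameters below are our assumptions.

import random

EWMA_ALPHA = 0.05                      # weight for the throughput average

class User:
    def __init__(self, uid):
        self.uid = uid
        self.avg_rate = 1e-6           # smoothed past throughput (avoids /0)

def schedule_tti(users, n_rbs, channel_rate):
    """channel_rate[uid][rb]: achievable rate of a user on an RB this TTI."""
    served = {u.uid: 0.0 for u in users}
    for rb in range(n_rbs):
        # PF metric: instantaneous rate divided by long-term average rate
        best = max(users, key=lambda u: channel_rate[u.uid][rb] / u.avg_rate)
        served[best.uid] += channel_rate[best.uid][rb]
    for u in users:                    # update the throughput averages
        u.avg_rate = (1 - EWMA_ALPHA) * u.avg_rate + EWMA_ALPHA * served[u.uid]
    return served

users = [User(i) for i in range(3)]
rates = {u.uid: [random.uniform(0.1, 1.0) for _ in range(6)] for u in users}
print(schedule_tti(users, n_rbs=6, channel_rate=rates))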
Abstract:
In an effort to understand the fate of inhaled submicron particles in the small sacs, or alveoli, comprising the gas-exchange region of the lung, we calculated the flow in three-dimensional (3D) rhythmically expanding models of alveolated ducts. Since convection toward the alveolar walls is a precursor to particle deposition, the goal of this paper was to investigate the streamline maps' dependence upon alveoli location along the acinar tree. On the alveolar midplane, the recirculating flow pattern exhibited closed streamlines with a stagnation saddle point. Off the midplane we found no closed streamlines but nested, funnel-like spiral structures (reminiscent of Russian nesting dolls) that were directed towards the expanding walls in inspiration and away from the contracting walls in expiration. These nested, funnel-like structures were surrounded by air that flowed into the cavity from the central channel over inspiration and flowed from the cavity to the central channel over expiration. We also found that fluid particle tracks exhibited similar nested, funnel-like spiral structures. We conclude that these unique alveolar flow structures may be of importance in enhancing deposition. In addition, due to inertia, the nested, funnel-like structures change shape and position slightly during a breathing cycle, resulting in flow mixing. Also, each inspiration feeds a fresh supply of particle-laden air from the central channel to the region surrounding the mixing region. Thus, this combination of flow mixer and flow feeder makes each individual alveolus an effective mixing unit, which is likely to play an important role in determining the overall efficiency of convective mixing in the acinus.
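Fluid particle tracks of the kind described above are typically obtained by integrating dx/dt = u(x, t) through the computed velocity field. Below is a generic fourth-order Runge-Kutta tracer with a made-up oscillatory field; it illustrates the technique only and is not the authors' solver.

import numpy as np

def rk4_step(u, x, t, dt):
    """Advance a particle position x one RK4 step through velocity field u."""
    k1 = u(x, t)
    k2 = u(x + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = u(x + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = u(x + dt * k3, t + dt)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Stand-in field with a slow oscillation mimicking rhythmic wall motion:
u = lambda x, t: np.array([-x[1], x[0], 0.1 * np.sin(t)])

x = np.array([1.0, 0.0, 0.0])
for n in range(100):                   # trace one particle over 100 steps
    x = rk4_step(u, x, n * 0.01, 0.01)
print(x)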
Abstract:
This thesis examines two panel data sets of 48 states from 1981 to 2009 and utilizes ordinary least squares (OLS) and fixed effects models to explore the relationship between rural Interstate speed limits and fatality rates, and whether rural Interstate speed limits affect non-Interstate safety. The models provide evidence that rural Interstate speed limits higher than 55 MPH lead to higher fatality rates on rural Interstates, though this effect is somewhat tempered by reductions in fatality rates on roads other than rural Interstates. These results provide some, but not unanimous, support for the traffic diversion hypothesis that rural Interstate speed limit increases lead to decreases in fatality rates on other roads. To the author's knowledge, this paper is the first econometric study to differentiate between the effects of 70 MPH speed limits and speed limits above 70 MPH on fatality rates using a multi-state data set. Considering both rural Interstates and other roads, rural Interstate speed limit increases above 55 MPH are responsible for 39,700 net fatalities, 4.1 percent of total fatalities from 1987 (the year limits were first raised) to 2009.
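A minimal sketch of a fixed-effects specification of the kind described, assuming hypothetical column names (the thesis's actual variables and data file are not given here): one row per state-year, with speed-limit indicator variables relative to the 55 MPH baseline.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("state_panel.csv")    # hypothetical file: state, year,
                                       # fatality_rate, limit_65, limit_70,
                                       # limit_over_70

# State and year fixed effects entered as dummy variables.
model = smf.ols(
    "fatality_rate ~ limit_65 + limit_70 + limit_over_70 + C(state) + C(year)",
    data=df,
).fit()
print(model.params[["limit_65", "limit_70", "limit_over_70"]])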
Abstract:
BACKGROUND: Individual adaptation of the processed patient's blood volume (PBV) should reduce the number and/or duration of autologous peripheral blood progenitor cell (PBPC) collections. STUDY DESIGN AND METHODS: The durations of leukapheresis procedures were adapted by means of an interim analysis of harvested CD34+ cells to obtain the intended CD34+ yield in as few and as short leukapheresis procedures as possible. Absolute efficiency (AE; CD34+/kg body weight) and relative efficiency (RE; total CD34+ yield of a single apheresis/total number of preapheresis CD34+) were calculated, intra-apheresis recruitment being assumed if RE was greater than 1, and a yield prediction model for adults was generated. RESULTS: A total of 196 adults required a total of 266 PBPC collections. The median AE was 7.99 x 10(6), and the median RE was 1.76. The prediction model for AE showed a satisfactory predictive value for preapheresis CD34+ only. The prediction model for RE showed a low predictive value (R2 = 0.36). Twenty-eight children underwent 44 PBPC collections. The median AE was 12.13 x 10(6), and the median RE was 1.62. Major complications comprised bleeding episodes related to central venous catheters (n = 4) and severe thrombocytopenia of less than 10 x 10(9) per L (n = 16). CONCLUSION: A CD34+ interim analysis is a suitable tool for individual adaptation of the duration of leukapheresis. During leukapheresis, a substantial recruitment of CD34+ was observed, resulting in an RE of greater than 1 in more than 75 percent of patients. The upper limit of processed PBV showing an intra-apheresis CD34+ recruitment is higher than in a standard large-volume leukapheresis. Therefore, a reduction of the individually needed PBPC collections by means of a further escalation of the processed PBV seems possible.
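The two efficiency measures defined above translate directly into code. The example numbers below are hypothetical, chosen to land near the reported adult medians.

def absolute_efficiency(cd34_yield, body_weight_kg):
    """AE: CD34+ cells collected per kg body weight."""
    return cd34_yield / body_weight_kg

def relative_efficiency(cd34_yield, preapheresis_cd34_total):
    """RE: yield of a single apheresis / circulating CD34+ before it.
    RE > 1 indicates intra-apheresis recruitment of CD34+ cells."""
    return cd34_yield / preapheresis_cd34_total

yield_cells = 6.0e8                               # hypothetical collection
print(absolute_efficiency(yield_cells, 75.0))     # 8.0e6 per kg
print(relative_efficiency(yield_cells, 3.4e8))    # ~1.76 -> recruitment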
Abstract:
BACKGROUND: Many studies showing effects of traffic-related air pollution on health rely on self-reported exposure, which may be inaccurate. We estimated the association between self-reported exposure to road traffic and respiratory symptoms in preschool children, and investigated whether the effect could have been caused by reporting bias. METHODS: In a random sample of 8700 preschool children in Leicestershire, UK, exposure to road traffic and respiratory symptoms were assessed by a postal questionnaire (response rate 80%). The association between traffic exposure and respiratory outcomes was assessed using unconditional logistic regression and conditional regression models (matching by postcode). RESULTS: Prevalence odds ratios (95% confidence intervals) for self-reported road traffic exposure, comparing the categories 'moderate' and 'dense', respectively, with 'little or no', were for current wheezing: 1.26 (1.13-1.42) and 1.30 (1.09-1.55); chronic rhinitis: 1.18 (1.05-1.31) and 1.31 (1.11-1.56); night cough: 1.17 (1.04-1.32) and 1.36 (1.14-1.62); and bronchodilator use: 1.20 (1.04-1.38) and 1.18 (0.95-1.46). A matched analysis comparing only symptomatic and asymptomatic children living at the same postcode (and thus exposed to similar road traffic) showed similar ORs, suggesting that parents of children with respiratory symptoms reported more road traffic than parents of asymptomatic children. CONCLUSIONS: Our study suggests that reporting bias could explain some or even all of the association between reported exposure to road traffic and disease. Over-reporting of exposure by only 10% of parents of symptomatic children would be sufficient to produce the effect sizes shown in this study. Future research should be based only on objective measurements of traffic exposure.
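A sketch of the unconditional logistic-regression analysis, with hypothetical column names for the outcome (wheeze, 0/1) and the three-level traffic-exposure category; prevalence odds ratios are recovered by exponentiating the fitted coefficients.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("questionnaire.csv")  # hypothetical survey data file

fit = smf.logit(
    "wheeze ~ C(traffic, Treatment(reference='little'))", data=df
).fit()
odds_ratios = np.exp(fit.params)       # ORs for 'moderate'/'dense' vs 'little'
conf_int = np.exp(fit.conf_int())      # 95% confidence intervals
print(odds_ratios, conf_int)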
Abstract:
The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of traffic accident victims, for example in collisions between motor vehicles and pedestrians or cyclists, is the impact situation. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of external body findings and of injury-inflicting instruments. The correlation of the injuries of the body to the injury-inflicting object and to the accident mechanism is of great importance. The applied methods include documentation of the external and internal body, of the involved vehicles and inflicting tools, and analysis of the acquired data. The body surface and the accident vehicles with their damages were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included processing of the obtained data into 3D models, determination of the driving direction of the vehicle, correlation of injuries to the vehicle damages, geometric determination of the impact situation, and evaluation of further findings of the accident. In this article, the benefits of 3D documentation and of computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damages to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are shown for two examined cases.
Abstract:
Traffic particle concentrations show considerable spatial variability within a metropolitan area. We consider latent variable semiparametric regression models for modeling the spatial and temporal variability of black carbon and elemental carbon concentrations in the greater Boston area. Measurements of these pollutants, which are markers of traffic particles, were obtained from several individual exposure studies conducted at specific household locations as well as 15 ambient monitoring sites in the city. The models allow both for flexible, nonlinear effects of covariates and for unexplained spatial and temporal variability in exposure. In addition, the different individual exposure studies recorded different surrogates of traffic particles, with some recording only outdoor concentrations of black or elemental carbon, some recording indoor concentrations of black carbon, and others recording both indoor and outdoor concentrations of black carbon. A joint model for outdoor and indoor exposure that specifies a spatially varying latent variable provides greater spatial coverage in the area of interest. We propose a penalised spline formulation of the model that relates to generalised kriging of the latent traffic pollution variable and leads to a natural Bayesian Markov chain Monte Carlo algorithm for model fitting. We propose methods that allow us to control the degrees of freedom of the smoother in a Bayesian framework. Finally, we present results from an analysis that applies the model to data from summer and winter separately.
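For readers unfamiliar with penalised splines, here is a generic fit (truncated-line basis with a ridge penalty on the knot coefficients), not the paper's full latent-variable model; the penalty weight lam is what controls the smoother's effective degrees of freedom.

import numpy as np

def pspline_fit(x, y, knots, lam):
    """Penalised-spline smoother: intercept, slope, and truncated-line basis."""
    X = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(x - k, 0.0) for k in knots])
    # penalise only the knot coefficients, not the intercept and slope
    D = np.diag([0.0, 0.0] + [1.0] * len(knots))
    beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
    return X @ beta

x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.3 * np.random.randn(200)
knots = np.linspace(0.05, 0.95, 20)
smooth = pspline_fit(x, y, knots, lam=1.0)   # larger lam -> smoother fit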
Abstract:
Water-saturated debris flows are among some of the most destructive mass movements. Their complex nature presents a challenge for quantitative description and modeling. In order to improve understanding of the dynamics of these flows, it is important to seek a simplified dynamic system underlying their behavior. Models currently in use to describe the motion of debris flows employ depth-averaged equations of motion, typically assuming negligible effects from vertical acceleration. However, in many cases debris flows experience significant vertical acceleration as they move across irregular surfaces, and it has been proposed that friction associated with vertical forces and liquefaction merit inclusion in any comprehensive mechanical model. The intent of this work is to determine the effect of vertical acceleration through a series of laboratory experiments designed to simulate debris flows, testing a recent model for debris flows experimentally. In the experiments, a mass of water-saturated sediment is released suddenly from a holding container, and parameters including rate of collapse, pore-fluid pressure, and bed load are monitored. Experiments are simplified to axial geometry so that variables act solely in the vertical dimension. Steady state equations to infer motion of the moving sediment mass are not sufficient to model accurately the independent solid and fluid constituents in these experiments. The model developed in this work more accurately predicts the bed-normal stress of a saturated sediment mass in motion and illustrates the importance of acceleration and deceleration.
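For reference, depth-averaged models of the kind referred to above evolve the flow thickness $h$ and the depth-averaged velocity $\bar{u}$; a generic one-dimensional momentum balance (our sketch of the standard form, not the model developed in this work) is

$$\frac{\partial (h\bar{u})}{\partial t} + \frac{\partial (h\bar{u}^{2})}{\partial x} = g h \sin\theta - \frac{\tau_b}{\rho} - \frac{\partial}{\partial x}\!\left(\frac{\kappa g h^{2} \cos\theta}{2}\right),$$

where $\theta$ is the bed slope, $\tau_b$ the basal shear stress, and $\kappa$ an earth-pressure coefficient. Neglecting vertical acceleration makes the bed-normal stress hydrostatic, $\rho g h \cos\theta$, which is precisely the assumption the experiments described here are designed to test.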
Abstract:
Single-screw extrusion is one of the most widely used processing methods in the plastics industry, which was the third largest manufacturing industry in the United States in 2007 [5]. In order to optimize the single-screw extrusion process, tremendous efforts have been devoted to the development of accurate models over the last fifty years, especially for polymer melting in screw extruders. This has led to a good qualitative understanding of the melting process; however, quantitative predictions of melting from various models often have a large error in comparison to experimental data. Thus, even nowadays, the process parameters and the geometry of the extruder channel for single-screw extrusion are determined by trial and error. Since new polymers are developed frequently, finding the optimum parameters to extrude these polymers by trial and error is costly and time consuming. In order to reduce the time and experimental work required for optimizing the process parameters and the geometry of the extruder channel for a given polymer, the main goal of this research was to perform a coordinated experimental and numerical investigation of melting in screw extrusion. In this work, a full three-dimensional finite element simulation of the two-phase flow in the melting and metering zones of a single-screw extruder was performed by solving the conservation equations for mass, momentum, and energy. The only previous attempt at such a three-dimensional simulation of melting in a screw extruder was made more than twenty years ago. However, that work had only limited success because of the capability of the computers and mathematical algorithms available at that time. The dramatic improvement in computational power and mathematical knowledge now makes it possible to run full 3-D simulations of two-phase flow in single-screw extruders on a desktop PC. In order to verify the numerical predictions from the full 3-D simulations of two-phase flow in single-screw extruders, a detailed experimental study was performed. This experimental study included Maddock screw-freezing experiments, Screw Simulator experiments, and material characterization experiments. Maddock screw-freezing experiments were performed in order to visualize the melting profile along the single-screw extruder channel with different screw geometry configurations. These melting profiles were compared with the simulation results. Screw Simulator experiments were performed to collect shear stress and melting flux data for various polymers. Cone-and-plate viscometer experiments were performed to obtain the shear viscosity data needed in the simulations. An optimization code was developed to optimize two screw geometry parameters, namely the screw lead (pitch) and the depth in the metering section of a single-screw extruder, such that the output rate of the extruder was maximized without exceeding the maximum temperature value specified at the exit of the extruder. This optimization code used a mesh partitioning technique in order to obtain the flow domain. The simulations in this flow domain were performed using the code developed to simulate the two-phase flow in single-screw extruders.
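The screw-geometry optimization described can be pictured as a constrained search over pitch and metering depth. In the sketch below, simulate(pitch, depth) is a hypothetical stand-in for the 3-D two-phase-flow solver, assumed to return an output rate and an exit melt temperature; all names and values are ours, not the thesis's.

import itertools

T_MAX = 230.0                          # max allowed exit temperature, deg C (made up)
pitches = [40.0, 45.0, 50.0, 55.0]     # mm, candidate screw leads
depths = [2.0, 2.5, 3.0, 3.5]          # mm, candidate metering depths

best = None
for pitch, depth in itertools.product(pitches, depths):
    rate, t_exit = simulate(pitch, depth)   # hypothetical solver call
    # keep the highest output rate that respects the temperature constraint
    if t_exit <= T_MAX and (best is None or rate > best[0]):
        best = (rate, pitch, depth)

print("max output %.1f kg/h at pitch %.0f mm, depth %.1f mm" % best)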
Abstract:
Lava flow modeling can be a powerful tool in hazard assessments; however, the ability to produce accurate models is usually limited by a lack of high resolution, up-to-date Digital Elevation Models (DEMs). This is especially obvious in places such as Kilauea Volcano (Hawaii), where active lava flows frequently alter the terrain. In this study, we use a new technique to create high resolution DEMs on Kilauea using synthetic aperture radar (SAR) data from the TanDEM-X (TDX) satellite. We convert raw TDX SAR data into a geocoded DEM using GAMMA software [Werner et al., 2000]. This process can be completed in several hours and permits creation of updated DEMs as soon as new TDX data are available. To test the DEMs, we use the Harris and Rowland [2001] FLOWGO lava flow model combined with the Favalli et al. [2005] DOWNFLOW model to simulate the 3-15 August 2011 eruption on Kilauea's East Rift Zone. Results were compared with simulations using the older, lower resolution 2000 SRTM DEM of Hawaii. Effusion rates used in the model are derived from MODIS thermal infrared satellite imagery. FLOWGO simulations using the TDX DEM produced a single flow line that matched the August 2011 flow almost perfectly, but could not recreate the entire flow field due to the relatively high DEM noise level. The issues with short model flow lengths can be resolved by filtering noise from the DEM. Model simulations using the outdated SRTM DEM produced a flow field that followed a different trajectory to that observed. Numerous lava flows have been emplaced at Kilauea since the creation of the SRTM DEM, leading the model to project flow lines in areas that have since been covered by fresh lava flows. These results show that DEMs can quickly become outdated on active volcanoes, but our new technique offers the potential to produce accurate, updated DEMs for modeling lava flow hazards.
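In the spirit of the cited DOWNFLOW approach [Favalli et al., 2005], flow paths over a DEM can be illustrated by steepest-descent routing repeated over randomly perturbed copies of the elevation grid; this toy sketch is ours, not the authors' code.

import numpy as np

def steepest_descent_path(dem, start, max_steps=10000):
    """Follow the locally steepest downhill cell until a local minimum."""
    path = [start]
    r, c = start
    for _ in range(max_steps):
        window = dem[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        dr, dc = np.unravel_index(np.argmin(window), window.shape)
        nr, nc = max(r - 1, 0) + dr, max(c - 1, 0) + dc
        if (nr, nc) == (r, c):         # local minimum: flow stops
            break
        r, c = nr, nc
        path.append((r, c))
    return path

dem = np.random.rand(200, 200) + np.linspace(2, 0, 200)[:, None]  # toy slope
paths = [steepest_descent_path(dem + np.random.uniform(-0.1, 0.1, dem.shape),
                               start=(5, 100)) for _ in range(50)]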
Abstract:
Micro-scale, two-phase flow is found in a variety of devices such as lab-on-a-chip systems, bio-chips, micro-heat exchangers, and fuel cells. Knowledge of the fluid behavior near the dynamic gas-liquid interface is required for developing accurate predictive models. Light is distorted near a curved gas-liquid interface, preventing accurate measurement of interfacial shape and internal liquid velocities. This research focused on the development of experimental methods designed to isolate and probe dynamic liquid films and measure velocity fields near a moving gas-liquid interface. A high-speed, reflectance, swept-field confocal (RSFC) imaging system was developed for imaging near curved surfaces. Experimental studies of the dynamic gas-liquid interface of micro-scale, two-phase flow were conducted in three phases. Dynamic liquid film thicknesses of segmented, two-phase flow were measured using the RSFC and compared to a classic film thickness deposition model. Flow fields near a steadily moving meniscus were measured using the RSFC and particle tracking velocimetry. The RSFC provided high-speed imaging near the menisci without distortion caused by the gas-liquid interface. Finally, interfacial morphology for internal two-phase flow and droplet evaporation was measured using interferograms produced by the RSFC imaging technique. Each technique can be used independently or simultaneously.
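The abstract does not name the classic film-thickness deposition model; a common benchmark for the film left behind a moving meniscus in a tube is Bretherton's law, h/R ≈ 1.34 Ca^(2/3) for small capillary number Ca, used below as an assumed stand-in.

def bretherton_film_thickness(radius, mu, velocity, sigma):
    """Deposited film thickness per Bretherton's law (valid for small Ca)."""
    ca = mu * velocity / sigma         # capillary number
    return 1.34 * radius * ca ** (2.0 / 3.0)

# Water-like plug moving at 0.1 m/s in a 100-micron channel (example values):
print(bretherton_film_thickness(radius=50e-6, mu=1e-3,
                                velocity=0.1, sigma=0.072))  # ~0.8 micron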
Abstract:
BACKGROUND: Activation of endothelial cells (EC) in xenotransplantation is mostly induced through binding of antibodies (Ab) and activation of the complement system. Activated EC lose their heparan sulfate proteoglycan (HSPG) layer and exhibit a procoagulant and pro-inflammatory cell surface. We have recently shown that the semi-synthetic proteoglycan analog dextran sulfate (DXS, MW 5000) blocks activation of the complement cascade and acts as an EC-protectant both in vitro and in vivo. However, DXS is a strong anticoagulant and systemic use of this substance in a clinical setting might therefore be compromised. It was the aim of this study to investigate a novel, fully synthetic EC-protectant with reduced inhibition of the coagulation system. METHOD: By screening with standard complement (CH50) and coagulation assays (activated partial thromboplastin time, aPTT), a conjugate of tyrosine sulfate to a polymer-backbone (sTyr-PAA) was identified as a candidate EC-protectant. The pathway-specificity of complement inhibition by sTyr-PAA was tested in hemolytic assays. To further characterize the substance, the effects of sTyr-PAA and DXS on complement deposition on pig cells were compared by flow cytometry and cytotoxicity assays. Using fluorescein-labeled sTyr-PAA (sTyr-PAA-Fluo), the binding of sTyr-PAA to cell surfaces was also investigated. RESULTS: Of all tested compounds, sTyr-PAA was the most effective substance in inhibiting all three pathways of complement activation. Its capacity to inhibit the coagulation cascade was significantly reduced as compared with DXS. sTyr-PAA also dose-dependently inhibited deposition of human complement on pig cells and this inhibition correlated with the binding of sTyr-PAA to the cells. Moreover, we were able to demonstrate that sTyr-PAA binds preferentially and dose-dependently to damaged EC. CONCLUSIONS: We could show that sTyr-PAA acts as an EC-protectant by binding to the cells and protecting them from complement-mediated damage. It has less effect on the coagulation system than DXS and may therefore have potential for in vivo application.
Abstract:
Conventional debugging tools present developers with means to explore the run-time context in which an error has occurred. In many cases this is enough to help the developer discover the faulty source code and correct it. However, rather often errors occur due to code that has executed in the past, leaving certain objects in an inconsistent state. The actual run-time error only occurs when these inconsistent objects are used later in the program. So-called back-in-time debuggers help developers step back through earlier states of the program and explore execution contexts not available to conventional debuggers. Nevertheless, even back-in-time debuggers do not help answer the question, "Where did this object come from?" The Object-Flow Virtual Machine, which we have proposed in previous work, tracks the flow of objects to answer precisely such questions, but this VM does not provide dedicated debugging support to explore faulty programs. In this paper we present a novel debugger, called Compass, to navigate between conventional run-time stack-oriented control flow views and object flows. Compass enables a developer to effectively navigate from an object contributing to an error back in time through all the code that has touched the object. We present the design and implementation of Compass, and we demonstrate how flow-centric, back-in-time debugging can be used to effectively locate the source of hard-to-find bugs.
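As a toy illustration of the flow-centric idea (in Python rather than the paper's VM setting), one can wrap an object so that every write to it is recorded, enabling a back-in-time query over where a value came from; all names below are ours, not Compass's API.

import time

class FlowTracked:
    """Wraps an object and logs each attribute write with a timestamp."""
    def __init__(self, target):
        object.__setattr__(self, "_target", target)
        object.__setattr__(self, "_history", [])

    def __setattr__(self, name, value):
        self._history.append((time.time(), name, value))
        setattr(self._target, name, value)

    def __getattr__(self, name):       # reads pass through to the target
        return getattr(self._target, name)

    def origin_of(self, name):
        """Back-in-time query: all recorded writes to attribute `name`."""
        return [(t, v) for (t, n, v) in self._history if n == name]

class Account: pass
acc = FlowTracked(Account())
acc.balance = 100
acc.balance = -5                       # the "inconsistent state"
print(acc.origin_of("balance"))        # trace where the bad value came from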