998 results for Trip Generation


Relevance:

20.00%

Publisher:

Abstract:

The importance of clean drinking water in any community is vital if consumers are to sustain a life of health and wellbeing. Suspended particles in surface waters not only provide a means of transporting micro-organisms that can cause serious infections and diseases, they can also reduce the performance capacity of a water treatment plant. In such situations, pre-treatment ahead of the main plant is recommended. Previous research using non-woven synthetic fabrics as pre-filter materials for protecting slow sand filters from high turbidity showed that filter run times can be extended severalfold and that filters can be regenerated simply by removing and washing the fabric (Mbwette and Graham, 1987; Mbwette, 1991). Geosynthetic materials have been used extensively for soil retention and dewatering in geotechnical applications, but little research exists on their application to turbidity reduction in water treatment. Given the development of new geosynthetic materials, it was hypothesized that turbidity removal efficiency could be improved further by selecting appropriate materials. Two different geosynthetic materials (75 micron) tested at a filtration rate of 0.7 m/h yielded a 30-45% reduction in turbidity with relatively minor head loss. The non-woven geotextile Propex 1701 showed the best performance in both filtration efficiency and head loss across the turbidity ranges tested, in comparison to the other geotextiles. With five layers of Propex 1701, an average reduction of approximately 67% was achieved, with an average head loss of 4 mm over the two-and-a-half-hour testing period. Using the data collected for Propex 1701, a mathematical model was developed to predict the expected percent reduction from the number of layers used, allowing the number of layers, and hence the cost, to be chosen for a given filtration scenario.
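The fitted model itself is not reproduced in the abstract. As a minimal sketch of the kind of layer-to-reduction relationship it describes, the snippet below assumes each geotextile layer removes a fixed fraction of the remaining turbidity; the per-layer efficiency of 20% is a hypothetical value chosen only because it reproduces the ~67% figure reported for five layers.

```python
# Illustrative sketch only: the paper's fitted model is not reproduced here.
# We assume a simple series-removal form in which each geotextile layer removes
# a fixed fraction of the remaining turbidity (the parameter value is hypothetical).

def predicted_reduction(layers: int, per_layer_efficiency: float = 0.20) -> float:
    """Fraction of turbidity removed by `layers` stacked geotextile sheets."""
    return 1.0 - (1.0 - per_layer_efficiency) ** layers

def layers_for_target(target_reduction: float, per_layer_efficiency: float = 0.20) -> int:
    """Smallest number of layers whose predicted reduction meets the target."""
    n = 1
    while predicted_reduction(n, per_layer_efficiency) < target_reduction:
        n += 1
    return n

if __name__ == "__main__":
    # With a hypothetical 20% removal per layer, 5 layers give ~67% reduction,
    # broadly consistent with the figure reported for Propex 1701.
    print(f"5 layers -> {predicted_reduction(5):.0%} predicted reduction")
    print(f"layers needed for 75%: {layers_for_target(0.75)}")
```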

Relevance:

20.00%

Publisher:

Abstract:

Platelet-derived microparticles that are produced during platelet activation bind to traumatized endothelium. Such endothelial injury occurs during percutaneous transluminal coronary angioplasty. Approximately 20% of these patients subsequently develop restenosis, although this is improved by treatment with abciximab, an anti-platelet glycoprotein IIb/IIIa receptor drug. As platelet activation occurs during angioplasty, it is likely that platelet-derived microparticles are produced and hence contribute to restenosis. The study population consisted of 113 angioplasty patients, of whom 38 received abciximab. Paired peripheral arterial blood samples were obtained following heparinization and subsequent to all vessel manipulation. Platelet-derived microparticles were identified using an anti-CD61 (glycoprotein IIIa) fluorescence-conjugated antibody and flow cytometry. Baseline clinical characteristics between patient groups were similar. The level of platelet-derived microparticles increased significantly following angioplasty in the group without abciximab (paired t test, P = 0.019). However, there was no significant change in the level of platelet-derived microparticles following angioplasty in patients who received abciximab, despite their requiring more complex angioplasty procedures. In this study, we have demonstrated that the level of platelet-derived microparticles increased during percutaneous transluminal coronary angioplasty, with no such increase seen with abciximab treatment. The increased platelet-derived microparticles may adhere to traumatized endothelium, contributing to re-occlusion of the arteries, but this remains to be determined.
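For readers unfamiliar with the paired analysis used here, the sketch below shows how a paired t test on pre- and post-procedure microparticle levels would be run; the counts are synthetic stand-ins, not the study's per-patient data.

```python
# Sketch of the paired analysis described above, using hypothetical
# pre/post-angioplasty platelet-derived microparticle counts (the real
# per-patient data are not reproduced here).
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
pre = rng.normal(loc=100.0, scale=15.0, size=30)         # baseline levels (arbitrary units)
post = pre + rng.normal(loc=8.0, scale=12.0, size=30)    # modest post-procedure rise

t_stat, p_value = ttest_rel(post, pre)                    # paired t test, as in the study
print(f"paired t = {t_stat:.2f}, P = {p_value:.3f}")
# A P value below 0.05 would indicate a significant rise, as reported
# for the group that did not receive abciximab (P = 0.019).
```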

Relevance:

20.00%

Publisher:

Abstract:

A new mesh adaptivity algorithm that combines a posteriori error estimation with a bubble-type local mesh generation (BLMG) strategy for elliptic differential equations is proposed. The size function used in the BLMG is defined at each vertex during the adaptive process based on the obtained error estimator. In order to avoid excessive coarsening and refining in each iterative step, two factor thresholds are introduced in the size function. The advantages of the BLMG-based adaptive finite element method, compared with other known methods, are as follows: refining and coarsening are handled smoothly within the same framework; the local a posteriori error estimation is easy to implement through the adjacency list of the BLMG method; and at all levels of refinement the updated triangles remain very well shaped, even if the mesh size at any particular refinement level varies by several orders of magnitude. Several numerical examples for elliptic problems with singularities, in which explicit error estimators are used, verify the efficiency of the algorithm. Analysis of the parameters introduced in the size function shows that the algorithm has good flexibility.
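As an informal sketch of the thresholded size-function update described above, the snippet below scales the local mesh size by the error estimator while clamping the change per iteration; the clamping factors (0.5 and 2.0) and the estimator scaling are assumptions standing in for the paper's two factor thresholds, not its actual formulation.

```python
# Minimal sketch of a vertex size-function update with two factor thresholds.
# The clamp values and the assumed h**order scaling of the estimator are
# hypothetical; they only illustrate how excessive refining/coarsening per
# adaptive step can be prevented.

def update_size(h_old: float, eta: float, eta_target: float,
                min_factor: float = 0.5, max_factor: float = 2.0,
                order: int = 1) -> float:
    """Return the new mesh size at a vertex from its local error estimator eta.

    For an estimator scaling like h**order, the ideal factor is
    (eta_target / eta) ** (1 / order); it is clamped so a single adaptive
    iteration cannot change the local size too aggressively.
    """
    ideal = (eta_target / eta) ** (1.0 / order)
    factor = min(max(ideal, min_factor), max_factor)
    return h_old * factor

# Example: a vertex with a large local error is refined, but by at most 2x;
# a vertex with a tiny error is coarsened, also by at most 2x.
print(update_size(h_old=0.1, eta=0.08, eta_target=0.01))   # -> 0.05 (refined)
print(update_size(h_old=0.1, eta=0.002, eta_target=0.01))  # -> 0.2  (coarsened, capped)
```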

Relevance:

20.00%

Publisher:

Abstract:

Nowadays, the integration of small-scale electricity generators, known as Distributed Generation (DG), into distribution networks has become increasingly popular. This trend, together with the falling price of DG units, gives DG a better chance to participate in the voltage regulation process alongside the other regulating devices already available in distribution systems. Voltage control then becomes a very challenging problem for distribution engineers, since existing control coordination schemes need to be reconsidered to take DG operation into account. In this paper, a control coordination approach is proposed that utilizes the ability of DG to act as a voltage regulator while minimizing its interaction with other DG units and with other active devices, such as the On-load Tap Changing transformer (OLTC). The proposed technique has been developed based on protection principles (magnitude grading and time grading) for coordinating the responses of DG and other regulating devices, and uses Advanced Line Drop Compensators (ALDCs) for implementation. A distribution feeder with a tap-changing transformer and DG units has been extracted from a practical system to test the proposed control technique. The results show that the proposed method provides an effective solution for coordinating DG with other DG units and voltage regulating devices, and that integrating protection principles considerably reduces control interaction while achieving the desired voltage correction.
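The following sketch illustrates the magnitude- and time-grading idea in its simplest form: each device acts only when a voltage deviation exceeds its own deadband and persists longer than its own delay, so a fast, finely graded DG responds before a slow, coarsely graded OLTC. The thresholds, delays, and device names are hypothetical; the paper's ALDC-based formulation is not reproduced.

```python
# Illustrative sketch of magnitude- and time-graded voltage correction,
# in the spirit of the coordination principle described above. All settings
# are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Regulator:
    name: str
    deadband_pu: float   # magnitude grading: minimum deviation before acting
    delay_s: float       # time grading: how long the deviation must persist

    def should_act(self, deviation_pu: float, duration_s: float) -> bool:
        return abs(deviation_pu) > self.deadband_pu and duration_s >= self.delay_s

# Grade the devices so the DG (fast, fine correction) responds first and the
# OLTC (slow, coarse correction) only acts on larger, sustained deviations.
devices = [
    Regulator("DG inverter", deadband_pu=0.01, delay_s=5.0),
    Regulator("OLTC",        deadband_pu=0.03, delay_s=60.0),
]

deviation, duration = 0.02, 30.0   # 2% undervoltage persisting for 30 s
responders = [d.name for d in devices if d.should_act(deviation, duration)]
print(responders)   # only the DG responds; the OLTC stays put, avoiding interaction
```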

Relevance:

20.00%

Publisher:

Abstract:

Simple and reliable formation of biodegradable nanoparticles of poly-ε-caprolactone was achieved using 1.645 MHz piston atomization of a source fluid of 0.5% w/v of the polymer dissolved in acetone; the particles were allowed to descend under gravity in air for 8 cm into a 1 mM solution of sodium dodecyl sulfate. After centrifugation to remove surface agglomerations, a symmetric, monodisperse distribution of particles with a diameter of 186 nm (SD = 5.7, n = 6) was obtained with a yield of 65.2%. © 2006 American Institute of Physics.

Relevance:

20.00%

Publisher:

Abstract:

Global efforts to reduce carbon emissions from power generation have favoured renewable energy resources such as wind and solar in recent years. Power generation from renewable energy resources has become attractive because of the various incentives provided by government policies supporting green power. Among the available renewable energy resources, power generation from wind has seen tremendous growth in the last decade. This article discusses the advantages of the emerging offshore wind technology and the associated considerations related to its construction. The conventional configuration of an offshore wind farm is based on alternating current internal links. With recent advances in commercialised converters, voltage source converter based high voltage direct current links for offshore wind farms are gaining popularity. The planning and construction phases of offshore wind farms, including related environmental issues, are discussed here.

Relevance:

20.00%

Publisher:

Abstract:

This paper examines the impact of the chosen bottle-point method when conducting ion exchange equilibrium experiments. As an illustration, potassium ion exchange with a strong acid cation resin was investigated, owing to its relevance to the treatment of various industrial effluents and groundwater. The “constant mass” bottle-point method was shown to be problematic in that the equilibrium isotherm profiles differed depending upon the resin mass used. Indeed, application of common equilibrium isotherm models revealed that the optimal fit could be obtained with either the Freundlich or the Temkin equation, depending upon the conditions employed. It could be inferred that the resin surface was heterogeneous in character, but precise conclusions regarding the variation in the heat of sorption were not possible. Estimation of the maximum potassium loading was also inconsistent when employing the “constant mass” method. The “constant concentration” bottle-point method showed that the Freundlich model was a good representation of the exchange process, and the isotherms recorded were relatively consistent compared with the “constant mass” approach. Unification of all the equilibrium isotherm data acquired was achieved by use of the Langmuir-Vageler expression. The maximum loading of potassium ions was predicted to be at least 116.5 g/kg resin.
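As a small sketch of the isotherm fitting referred to above, the snippet below fits the standard Freundlich form q = K_F * C^(1/n) to equilibrium data by nonlinear least squares. The data points, units, and initial guesses are hypothetical; the paper's measured values are not reproduced.

```python
# Sketch of fitting the Freundlich isotherm q = K_F * C**(1/n) to equilibrium
# data, as in the "constant concentration" series discussed above. The data
# below are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def freundlich(c_eq, k_f, inv_n):
    """Equilibrium loading (g/kg resin) as a function of solution concentration."""
    return k_f * np.power(c_eq, inv_n)

c_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0])    # K+ concentration (mmol/L), hypothetical
q_eq = np.array([22.0, 35.0, 58.0, 80.0, 110.0])   # resin loading (g/kg), hypothetical

(k_f, inv_n), _ = curve_fit(freundlich, c_eq, q_eq, p0=(10.0, 0.5))
print(f"K_F = {k_f:.1f}, 1/n = {inv_n:.2f}")
print(f"predicted loading at 150 mmol/L: {freundlich(150.0, k_f, inv_n):.1f} g/kg")
```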

Relevance:

20.00%

Publisher:

Abstract:

In this chapter, we explore methods for automatically generating game content—and games themselves—adapted to individual players in order to improve their playing experience or achieve a desired effect. This goes beyond notions of mere replayability and involves modeling player needs to maximize their enjoyment, involvement, and interest in the game being played. We identify three main aspects of this process: generation of new content and rule sets, measurement of this content and the player, and adaptation of the game to change player experience. This process forms a feedback loop of constant refinement, as games are continually improved while being played. Framed within this methodology, we present an overview of our recent and ongoing research in this area. This is illustrated by a number of case studies that demonstrate these ideas in action over a variety of game types, including 3D action games, arcade games, platformers, board games, puzzles, and open-world games. We draw together some of the lessons learned from these projects to comment on the difficulties, the benefits, and the potential for personalized gaming via adaptive game design.
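The generate-measure-adapt feedback loop can be sketched in a few lines. The content representation, player model, and adaptation rule below are deliberately trivial placeholders (a single "difficulty" knob and a skill-matching enjoyment model); the chapter's case studies each instantiate these components in their own, far richer, ways.

```python
# High-level sketch of the generate -> measure -> adapt loop described above.
# All components are hypothetical placeholders.
import random

def generate(params: dict) -> dict:
    """Produce a content candidate (e.g. a level) from the generator parameters."""
    return {"difficulty": params["difficulty"] + random.uniform(-0.1, 0.1)}

def measure(content: dict, player_skill: float) -> float:
    """Toy player model: enjoyment peaks when content difficulty matches skill."""
    return 1.0 - abs(content["difficulty"] - player_skill)

def adapt(params: dict, content: dict, player_skill: float) -> dict:
    """Nudge the generator towards content the player model prefers."""
    return {"difficulty": params["difficulty"] + 0.5 * (player_skill - content["difficulty"])}

params, player_skill = {"difficulty": 0.2}, 0.8
for _ in range(10):                            # continual refinement while playing
    content = generate(params)
    score = measure(content, player_skill)
    params = adapt(params, content, player_skill)
print(f"final difficulty ~ {params['difficulty']:.2f}, last modelled enjoyment {score:.2f}")
```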

Relevance:

20.00%

Publisher:

Abstract:

Understanding the dynamics of disease spread is essential in contexts such as estimating the load on medical services, as well as for risk assessment and intervention policies against large-scale epidemic outbreaks. However, most of the relevant information only becomes available after the outbreak itself, and preemptive assessment is far from trivial. Here, we report on an agent-based model developed to investigate such epidemic events in a stylised urban environment. For most diseases, infection of a new individual may occur through casual contact in crowds as well as through repeated interactions with social partners such as work colleagues or family members. Our model therefore accounts for both phenomena. Given the scale of the system, efficient parallel computing is required. In this presentation, we focus on aspects related to parallelisation for large network generation and massively multi-agent simulations.
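A stripped-down, serial sketch of the two infection channels is given below: repeated contacts over a fixed social network plus casual contacts redrawn at random each step. The population size, network degree, and transmission probabilities are hypothetical, and the parallelisation that the presentation actually focuses on is not shown.

```python
# Minimal serial sketch of the two contact channels described above.
# Parameters are hypothetical placeholders; no parallelisation is shown.
import random

N = 1000
p_social, p_casual, casual_contacts = 0.05, 0.01, 8
social = {i: random.sample(range(N), k=5) for i in range(N)}   # household/colleague links
infected = set(random.sample(range(N), k=10))                  # initial seed cases

def step(infected: set) -> set:
    new = set(infected)
    for person in infected:
        # channel 1: repeated interactions with fixed social partners
        for partner in social[person]:
            if random.random() < p_social:
                new.add(partner)
        # channel 2: casual contacts in crowds, redrawn every step
        for other in random.sample(range(N), k=casual_contacts):
            if random.random() < p_casual:
                new.add(other)
    return new

for day in range(30):
    infected = step(infected)
print(f"infected after 30 days: {len(infected)} / {N}")
```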

Relevance:

20.00%

Publisher:

Abstract:

Automatic Vehicle Identification (AVI) systems are being increasingly used as a new source of travel information. Because in past decades these systems relied on expensive new technologies, only a few detectors were scattered across a network, making travel-time and average-speed estimation their main objectives. However, as their price dropped, the opportunity of building dense AVI networks arose, as in Brisbane, where more than 250 Bluetooth detectors are now installed. As a consequence, this technology represents an effective means of acquiring accurate time-dependent origin-destination information. In order to obtain reliable estimations, however, a number of issues need to be addressed. Some of these problems stem from the structure of a network made of isolated detectors, while others are inherent to Bluetooth technology (overlapping detection areas, missing detections, ...). The aim of this paper is threefold. First, after presenting the level of detail that can be reached with a network of isolated detectors, we describe how we modelled Brisbane's network, keeping only the information valuable for retrieving trip information. Second, we give an overview of the issues inherent to Bluetooth technology and propose a methodology for cleansing, correcting and aggregating Bluetooth data and for retrieving the itineraries of individual Bluetooth-equipped vehicles. Last, through a comparison with results from the Brisbane Transport Strategic Model, we highlight the opportunities and limits of Bluetooth detector networks. We postulate that the methods introduced in this paper are the first crucial steps towards computing accurate origin-destination matrices in urban road networks.
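The snippet below sketches two of the cleansing steps mentioned above: collapsing the repeated detections a device generates while lingering inside one scanner's (possibly overlapping) detection zone, and chaining the surviving records into a per-device itinerary. The record fields and the 60-second merge window are assumptions for illustration, not Brisbane's actual configuration or the paper's full methodology.

```python
# Sketch of Bluetooth detection cleansing and itinerary reconstruction.
# Field names and the merge window are hypothetical.
from collections import defaultdict

def cleanse(detections, merge_window=60.0):
    """detections: iterable of (device_id, scanner_id, timestamp) tuples."""
    by_device = defaultdict(list)
    for dev, scanner, t in sorted(detections, key=lambda d: d[2]):
        records = by_device[dev]
        if records and records[-1][0] == scanner and t - records[-1][1] <= merge_window:
            continue                     # same zone, still inside: drop the duplicate
        records.append((scanner, t))
    return by_device                     # device_id -> ordered list of (scanner, time)

raw = [("aa:01", "S1", 0.0), ("aa:01", "S1", 12.0),    # duplicate within S1's zone
       ("aa:01", "S2", 300.0), ("aa:01", "S3", 640.0)]
print(cleanse(raw)["aa:01"])             # itinerary S1 -> S2 -> S3 with passage times
```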