994 results for sediment reduction


Relevance: 20.00%

Publisher:

Abstract:

Reset/inhibitor nets are Petri nets extended with reset arcs and inhibitor arcs. These extensions can be used to model cancellation and blocking. A reset arc allows a transition to remove all tokens from a certain place when the transition fires. An inhibitor arc can stop a transition from being enabled if the place contains one or more tokens. While reset/inhibitor nets increase the expressive power of Petri nets, they also result in increased complexity of analysis techniques. One way of speeding up Petri net analysis is to apply reduction rules. Unfortunately, many of the rules defined for classical Petri nets do not hold in the presence of reset and/or inhibitor arcs. Moreover, new rules can be added. This is the first paper systematically presenting a comprehensive set of reduction rules for reset/inhibitor nets. These rules are liveness and boundedness preserving and are able to dramatically reduce models and their state spaces. It can be observed that most of the modeling languages used in practice have features related to cancellation and blocking. Therefore, this work is highly relevant for all kinds of application areas where analysis is currently intractable.
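
The reset and inhibitor semantics described above lend themselves to a compact illustration. Below is a minimal Python sketch using illustrative class and field names (it is not the representation used by the authors or any particular tool): a reset arc empties its place when the transition fires, while an inhibitor arc blocks enabling whenever its place is non-empty.

```python
# Minimal sketch of reset/inhibitor firing semantics as described above.
# Names are illustrative; this is not the paper's tooling.

class ResetInhibitorNet:
    def __init__(self):
        self.marking = {}          # place -> token count
        self.inputs = {}           # transition -> {place: weight}
        self.outputs = {}          # transition -> {place: weight}
        self.reset_arcs = {}       # transition -> set of places emptied on firing
        self.inhibitor_arcs = {}   # transition -> set of places that must be empty

    def enabled(self, t):
        # Ordinary input arcs require enough tokens in each input place.
        if any(self.marking.get(p, 0) < w
               for p, w in self.inputs.get(t, {}).items()):
            return False
        # An inhibitor arc disables t when its place holds one or more tokens.
        return all(self.marking.get(p, 0) == 0
                   for p in self.inhibitor_arcs.get(t, set()))

    def fire(self, t):
        assert self.enabled(t)
        for p, w in self.inputs.get(t, {}).items():
            self.marking[p] = self.marking[p] - w
        # A reset arc removes all tokens from its place, regardless of the count.
        for p in self.reset_arcs.get(t, set()):
            self.marking[p] = 0
        for p, w in self.outputs.get(t, {}).items():
            self.marking[p] = self.marking.get(p, 0) + w
```

Any reduction rule applied to such a net must preserve liveness and boundedness at this level of the semantics, which is exactly the property the paper establishes for its rule set.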

Relevance: 20.00%

Publisher:

Abstract:

The flood flow in urbanised areas constitutes a major hazard to the population and infrastructure, as seen during the summer 2010-2011 floods in Queensland (Australia). Flood flows in urban environments have been studied relatively recently, although no study has considered the impact of turbulence in the flow. During the 12-13 January 2011 flood of the Brisbane River, turbulence measurements were conducted in an inundated urban environment in Gardens Point Road, next to Brisbane's central business district (CBD), at a relatively high frequency (50 Hz). The properties of the sediment flood deposits were characterised, and the acoustic Doppler velocimeter unit was calibrated to obtain both instantaneous velocity components and suspended sediment concentration in the same sampling volume with the same temporal resolution. While the flow motion in Gardens Point Road was subcritical, the water elevations and velocities fluctuated with a distinctive period between 50 and 80 s. The low-frequency fluctuations were linked with local topographic effects: a local choke induced by an upstream constriction between stairwells caused slow oscillations with a period close to the natural sloshing period of the car park. The instantaneous velocity data were analysed using a triple decomposition, and the same triple decomposition was applied to the water depth, velocity flux, suspended sediment concentration and suspended sediment flux data. The velocity fluctuation data showed a large energy component in the slow fluctuation range. For the first two tests at z = 0.35 m, the turbulence data suggested some isotropy; at z = 0.083 m, on the other hand, the findings indicated some flow anisotropy. The suspended sediment concentration (SSC) data presented a general trend of increasing SSC with decreasing water depth. During one test (T4), long-period oscillations were observed with a period of about 18 minutes; the cause of these oscillations remains unknown to the authors. The last test (T5) took place in very shallow water with high suspended sediment concentrations, and it is suggested that the flow in the car park was disconnected from the main channel. Overall, the flow conditions at the sampling sites corresponded to a specific momentum between 0.2 and 0.4 m², which would be near the upper end of the scale for safe evacuation of individuals in flooded areas. The authors do not, however, believe that evacuation of individuals in Gardens Point Road would have been safe, because of the intense water surges and flow turbulence. More generally, any criterion for safe evacuation based solely upon the flow velocity, water depth or specific momentum cannot account for the hazards caused by flow turbulence, water depth fluctuations and water surges.
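
The triple decomposition mentioned above splits each instantaneous signal into a time-averaged component, a slow (low-frequency) fluctuation and a fast turbulent fluctuation. A minimal sketch, assuming a Butterworth low-pass filter is used to separate the slow component; the 0.1 Hz cutoff is an illustrative choice, not a value from the study, while the 50 Hz sampling rate matches the one quoted above:

```python
# Minimal sketch of a triple decomposition v(t) = mean + slow(t) + turbulent(t),
# assuming a Butterworth low-pass filter isolates the slow fluctuations.
# The 0.1 Hz cutoff is illustrative; fs = 50 Hz matches the sampling rate above.
import numpy as np
from scipy.signal import butter, filtfilt

def triple_decompose(signal, fs=50.0, cutoff_hz=0.1):
    """Split a 1-D signal sampled at fs (Hz) into mean, slow and turbulent parts."""
    mean = np.mean(signal)
    residual = signal - mean
    b, a = butter(4, cutoff_hz / (fs / 2.0), btype="low")
    slow = filtfilt(b, a, residual)   # low-frequency fluctuation (e.g. 50-80 s surges)
    turbulent = residual - slow       # remaining fast, turbulent fluctuation
    return mean, slow, turbulent
```

The same decomposition can be applied unchanged to the water depth, velocity flux, suspended sediment concentration and suspended sediment flux time series.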

Relevance: 20.00%

Publisher:

Abstract:

There is worldwide interest in reducing aircraft emissions. The difficulty of reducing emissions, including water vapour, carbon dioxide (CO2) and oxides of nitrogen (NOx), stems mainly from the fact that a commercial aircraft is usually designed for a particular optimal cruise altitude but may be requested or required to operate at different altitudes and speeds to achieve a desired or commanded flight plan, resulting in increased emissions. This is a multi-disciplinary problem with multiple trade-offs, such as optimising engine efficiency and minimising fuel burn and emissions while maintaining aircraft separation and air safety. This project presents the coupling of an advanced optimisation technique with mathematical models and algorithms for aircraft emission reduction through flight optimisation. Numerical results show that the method is able to capture a set of useful trade-offs between aircraft range and NOx, and between mission fuel consumption and NOx. In addition, alternative cruise operating conditions, including Mach number and altitude, that produce minimum NOx and minimum CO2 (minimum mission fuel weight) are suggested.
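
The range-NOx and fuel-NOx trade-offs are reported as Pareto sets. A minimal sketch of extracting the non-dominated set from a list of evaluated cruise conditions; the candidate values below are made-up placeholders, not results from the paper:

```python
# Minimal sketch of extracting a Pareto front for two minimisation objectives,
# e.g. mission fuel burn and NOx emissions. Candidate data are illustrative.

def pareto_front(candidates):
    """Return candidates not dominated on (fuel, nox); both objectives are minimised."""
    front = []
    for c in candidates:
        dominated = any(
            o["fuel"] <= c["fuel"] and o["nox"] <= c["nox"]
            and (o["fuel"] < c["fuel"] or o["nox"] < c["nox"])
            for o in candidates
        )
        if not dominated:
            front.append(c)
    return front

# Hypothetical cruise conditions (Mach, altitude) with evaluated objectives.
candidates = [
    {"mach": 0.78, "alt_ft": 35000, "fuel": 18.2, "nox": 240.0},
    {"mach": 0.80, "alt_ft": 37000, "fuel": 17.9, "nox": 255.0},
    {"mach": 0.76, "alt_ft": 33000, "fuel": 18.6, "nox": 230.0},
]
print(pareto_front(candidates))
```

Each surviving point represents an alternative cruise Mach/altitude combination for which neither fuel burn nor NOx can be improved without worsening the other.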

Relevance: 20.00%

Publisher:

Abstract:

Graphene, functionalized with oleylamine (OA) and soluble in non-polar organic solvents, was produced on a large scale with a high yield by combining the Hummers process for graphite oxidation, an amine-coupling process to make OA-functionalized graphite oxide (OA-GO), and a novel reduction process using trioctylphosphine (TOP). TOP acts as both a reducing agent and an aggregation-preventing surfactant in the reduction of OA-GO in 1,2-dichlorobenzene (DCB). The reduction of OA-GO is confirmed by X-ray photoelectron spectroscopy, Fourier-transform infrared spectroscopy, X-ray diffraction, thermogravimetric analysis, and Raman spectroscopy. The exfoliation of GO, OA-GO, and OA-functionalized graphene (OA-G) is verified by atomic force microscopy. The conductivity of TOP-reduced OA-G, which is deduced from the current–voltage characteristics of a vacuum-filtered thin film, shows that the reduction of functionalized GO by TOP is as effective as the reduction of GO by hydrazine.

Relevance: 20.00%

Publisher:

Abstract:

Signal-degrading speckle is one factor that can reduce the quality of optical coherence tomography images. We demonstrate the use of a hierarchical model-based motion estimation processing scheme based on an affine-motion model to reduce speckle in optical coherence tomography imaging, by image registration and the averaging of multiple B-scans. The proposed technique is evaluated against other methods available in the literature. The results from a set of retinal images show the benefit of the proposed technique, which provides an improvement in signal-to-noise ratio by a factor of the square root of the number of averaged images, leading to clearer visual information in the averaged image. The benefits of the proposed technique are also explored in the case of ocular anterior segment imaging.
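
A minimal sketch of the register-and-average step follows, using OpenCV's ECC affine alignment as a stand-in for the paper's hierarchical model-based motion estimation scheme; it illustrates the workflow, not the authors' implementation.

```python
# Minimal sketch of affine registration and averaging of B-scans for speckle
# reduction. OpenCV's ECC alignment stands in for the hierarchical
# motion-estimation scheme of the paper.
import cv2
import numpy as np

def register_and_average(bscans):
    """Align each B-scan to the first one with an affine warp, then average."""
    reference = bscans[0].astype(np.float32)
    accumulator = reference.copy()
    for scan in bscans[1:]:
        moving = scan.astype(np.float32)
        warp = np.eye(2, 3, dtype=np.float32)
        criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
        _, warp = cv2.findTransformECC(reference, moving, warp,
                                       cv2.MOTION_AFFINE, criteria)
        aligned = cv2.warpAffine(moving, warp, reference.shape[::-1],
                                 flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
        accumulator += aligned
    return accumulator / len(bscans)
```

Averaging N well-registered B-scans is what yields the roughly square-root-of-N improvement in signal-to-noise ratio quoted above.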

Relevance: 20.00%

Publisher:

Abstract:

Gold nanoparticles supported on CeO2 were found to be efficient photocatalysts for three selective reductions of organic compounds at ambient temperature under visible-light irradiation; their reduction ability can be tuned by manipulating the irradiation wavelength.

Relevance: 20.00%

Publisher:

Abstract:

The paper investigates a detailed active shock control bump (SCB) design optimisation on a Natural Laminar Flow (NLF) aerofoil, the RAE 5243, to reduce cruise drag at transonic flow conditions using Evolutionary Algorithms (EAs) coupled to a robust design approach. The positions of boundary layer transition (xtr) and the coefficient of lift (Cl) are taken as the uncertain design parameters (250 stochastic samples in total). Two robust design methods are considered: the first is a standard robust design method, which evaluates a design model at all 250 stochastic conditions for uncertainty; the second combines the standard robust design method with the concept of hierarchical (multi-population) sampling (250, 50, 15) for uncertainty. Numerical results show that the evolutionary optimisation method coupled to uncertainty design techniques produces useful and reliable Pareto optimal SCB shapes that have low sensitivity and high aerodynamic performance while delivering a significant total drag reduction. In addition, the results show the benefit of using the hierarchical robust method for detailed uncertainty design optimisation.
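
One way to read the hierarchical (multi-population) sampling idea is as nested evaluation budgets: candidate designs are screened against a small sample of the stochastic conditions and only the best survivors are re-evaluated on progressively larger samples. The sketch below follows that reading with a placeholder objective function; only the 250/50/15 sample sizes come from the abstract, everything else is illustrative.

```python
# Minimal sketch of hierarchical uncertainty sampling for robust design:
# designs are screened on a small sample of stochastic conditions and only
# the best survivors are re-evaluated on larger samples. The objective
# function is a placeholder, not the aerodynamic solver used in the paper.
import random
import statistics

def mean_objective(design, conditions):
    """Placeholder robust objective: mean drag-like value over (xtr, Cl) conditions."""
    return statistics.mean(abs(design - xtr) + 0.1 * cl for xtr, cl in conditions)

def hierarchical_screen(designs, conditions, budgets=(15, 50, 250), keep=0.3):
    """Evaluate on growing sample sizes, keeping the best fraction at each level."""
    survivors = list(designs)
    for n in budgets:
        sample = random.sample(conditions, n)
        survivors.sort(key=lambda d: mean_objective(d, sample))
        survivors = survivors[:max(1, int(len(survivors) * keep))]
    return survivors

# Illustrative uncertain conditions (transition position, lift coefficient).
conditions = [(random.uniform(0.0, 1.0), random.uniform(0.4, 0.8)) for _ in range(250)]
designs = [random.uniform(0.0, 1.0) for _ in range(40)]
print(hierarchical_screen(designs, conditions))
```

The payoff of this layering is that the expensive full 250-sample evaluation is reserved for the few designs that already look robust on the cheaper samples.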

Relevance: 20.00%

Publisher:

Abstract:

This paper presents an analysis of the stream cipher Mixer, a bit-based cipher with structural components similar to the well-known Grain cipher and the LILI family of keystream generators. Mixer uses a 128-bit key and a 64-bit IV to initialise a 217-bit internal state. The analysis focuses on the initialisation function of Mixer and shows that there exist multiple key-IV pairs which, after initialisation, produce the same initial state and consequently generate the same keystream. Furthermore, if the number of iterations of the state-update function performed during initialisation is increased, then the number of distinct initial states that can be obtained decreases. It is also shown that there exist some distinct initial states which produce the same keystream, resulting in a further reduction of the effective key space.
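
The first observation, that several key-IV pairs can lead to the same post-initialisation state, can be demonstrated generically for any initialisation function by grouping pairs according to the state they produce. The sketch below does this for a deliberately small toy state update, since Mixer's actual initialisation function is not reproduced in the abstract:

```python
# Minimal sketch of searching for key/IV pairs that collide on the initial
# state. toy_init is a stand-in; it is NOT Mixer's real initialisation.
import itertools

def toy_init(key, iv, rounds=8, state_bits=10):
    """Toy state update mixing key and IV with rotations and XORs."""
    mask = (1 << state_bits) - 1
    state = (key ^ (iv << 4)) & mask
    for _ in range(rounds):
        state = (((state << 3) | (state >> (state_bits - 3))) ^ key ^ iv) & mask
    return state

def find_state_collisions(key_bits=8, iv_bits=4):
    """Group (key, iv) pairs by the initial state the toy init produces."""
    states = {}
    for key, iv in itertools.product(range(1 << key_bits), range(1 << iv_bits)):
        states.setdefault(toy_init(key, iv), []).append((key, iv))
    return {s: pairs for s, pairs in states.items() if len(pairs) > 1}

colliding = find_state_collisions()
print(f"{len(colliding)} initial states reached by more than one key/IV pair")
```

For the real cipher the state space is far too large to enumerate like this, which is why the paper's structural analysis of the initialisation function matters: it exposes the collisions without exhaustive search.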