926 results for Fluid mechanics - Data processing


Relevance: 100.00%

Abstract:

Field studies show that the internal screens in a gross pollutant trap (GPT) are often clogged with organic matter due to infrequent cleaning. The hydrodynamic performance of a GPT with fully blocked screens was comprehensively investigated under a typical range of onsite operating conditions. Using an acoustic Doppler velocimeter (ADV), velocity profiles across three critical sections of the GPT were measured and integrated to examine the net fluid flow at each section. The data revealed that when the screens are fully blocked, the flow structure within the GPT changes radically and, consequently, the capture/retention performance of the device rapidly deteriorates. Good agreement was achieved between the experimental velocity profiles and the previous 2D computational fluid dynamics (CFD) results at the lower GPT inlet flow conditions.
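The integration step mentioned above — combining point velocities into a net flow — can be sketched with the trapezoidal rule. The depth and velocity values below are assumed for illustration, not the study's measurements:

```python
import numpy as np

z = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25])   # measurement depths (m), assumed
u = np.array([0.00, 0.18, 0.24, 0.27, 0.28, 0.28])   # streamwise velocities (m/s), assumed

# Discharge per unit width (m^2/s) by the trapezoidal rule over the profile
q = np.sum(0.5 * (u[1:] + u[:-1]) * np.diff(z))
print(f"unit discharge q = {q:.4f} m^2/s")
```

Summing such unit discharges across the section width gives the net flow compared between sections.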

Relevance: 100.00%

Abstract:

A technique was developed to investigate the capture/retention characteristics of a gross pollutant trap (GPT) with fully and partially blocked internal screens. Custom-modified spheres of variable density, filled with liquid, were released into the GPT inlet and monitored at the outlet. The outlet data show that the capture/retention performance of a GPT with fully blocked screens deteriorates rapidly. At higher flow rates, screen blockages below 68% still allow near-maximum efficiency. At lower flow rates, this high-performance trend is reversed, and the variation in behaviour between pollutants of different densities becomes more noticeable. Additional experiments with a second GPT, configured with an upstream inlet, showed improved capture/retention performance. It was also noted that the bypass allows incoming pollutants to escape when the GPT is blocked; this useful feature prevents upstream blockages between cleaning intervals.
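The capture/retention performance implied above reduces to a simple ratio of captured to released spheres. A minimal sketch with assumed counts, not the study's data:

```python
# Hypothetical release/recovery counts for one test run (values assumed)
released_at_inlet = 50       # spheres released into the GPT inlet
recovered_at_outlet = 12     # spheres escaping through the outlet

captured = released_at_inlet - recovered_at_outlet
efficiency = captured / released_at_inlet
print(f"capture/retention efficiency = {efficiency:.0%}")
```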

Relevance: 100.00%

Abstract:

A computational fluid dynamics (CFD) analysis of a flat plate photocatalytic reactor was performed using the CFD code FLUENT. Under the simulated conditions (Reynolds number Re ≈ 2650), a detailed time-accurate computation shows the different stages of flow evolution and the effect of the reactor's finite length in creating flow instability, which is important for improving the performance of the reactor for storm- and wastewater reuse. The efficiency of a photocatalytic reactor for pollutant decontamination depends on the reactor's hydrodynamics and configuration. This study investigates the role of different parameters in optimizing the reactor design for improved performance. In this regard, further modelling and experimental efforts are ongoing to better understand the interplay of the parameters that influence the performance of the flat plate photocatalytic reactor.
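For context, the quoted Reynolds number follows from Re = ρUD_h/μ. The fluid properties and dimensions below are assumptions chosen only to land near the reported Re ≈ 2650, not values taken from the paper:

```python
# Reynolds number Re = rho * U * D_h / mu (all inputs assumed)
rho = 998.0    # water density (kg/m^3)
mu = 1.0e-3    # dynamic viscosity (Pa*s)
U = 0.0665     # mean velocity (m/s), assumed
D_h = 0.04     # hydraulic diameter (m), assumed

Re = rho * U * D_h / mu
print(f"Re = {Re:.0f}")   # near the reported value of ~2650
```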

Relevance: 100.00%

Abstract:

Wavelet transforms (WTs) are a powerful tool for extracting localized variations in non-stationary signals, and their applications in traffic engineering have been introduced; however, some important theoretical fundamentals are still lacking. In particular, there is little guidance on selecting an appropriate WT across potential transport applications. The research described in this paper contributes uniquely to the literature by first presenting a numerical experiment that demonstrates the shortcomings of commonly used data processing techniques in traffic engineering (i.e., averaging, moving averaging, second-order differencing, oblique cumulative curves, and the short-time Fourier transform). It then mathematically describes the WT's ability to detect singularities in traffic data. Next, the selection of a suitable WT for a particular research topic in traffic engineering is discussed in detail by objectively and quantitatively comparing candidate wavelets' performance in a numerical experiment. Finally, based on several case studies using both loop detector data and vehicle trajectories, it is shown that the choice of wavelet largely depends on the specific research topic, and that the Mexican hat wavelet generally gives satisfactory performance in detecting singularities in traffic and vehicular data.
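The singularity-detection role of the Mexican hat wavelet described above can be sketched with a toy example: convolve a synthetic signal containing an abrupt jump with a Ricker (Mexican hat) kernel and locate the peak response. The signal, kernel width and spike location are all assumed for illustration:

```python
import numpy as np

def ricker(half_width, scale):
    """Mexican hat (Ricker) wavelet sampled at integer lags."""
    t = np.arange(-half_width, half_width + 1, dtype=float)
    return (1 - (t / scale) ** 2) * np.exp(-(t**2) / (2 * scale**2))

# Synthetic 'traffic' series (assumed): smooth background plus an abrupt
# spike standing in for a singularity such as a sudden flow breakdown
n = 400
x = np.sin(np.linspace(0, 4 * np.pi, n))
x[250] += 5.0                                  # singularity at sample 250

# Convolve with the wavelet; the response magnitude peaks at the singularity
response = np.convolve(x, ricker(20, 4.0), mode="same")
detected = int(np.argmax(np.abs(response)))
print(f"singularity detected at sample {detected}")
```

The smooth sinusoidal background barely excites the wavelet, while the abrupt jump produces a sharp, well-localized response — the property the paper exploits on loop detector and trajectory data.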

Relevance: 100.00%

Abstract:

In natural waterways and estuaries, an understanding of turbulent mixing is critical to the knowledge of sediment transport, stormwater runoff during flood events, and the release of nutrient-rich wastewater into ecosystems. In the present study, field measurements were conducted in a small subtropical estuary with a micro-tidal range and semi-diurnal tides during king tide conditions, i.e., the largest tidal range of both 2009 and 2010. Turbulent velocity measurements were performed continuously at high frequency (50 Hz) for 60 h. Two acoustic Doppler velocimeters (ADVs) were sampled simultaneously in the middle estuarine zone, and a third ADV was deployed in the upper estuary for 12 h only. The results provided a unique characterisation of the turbulence in both the middle and upper estuarine zones under king tide conditions. The observations showed marked differences between king tide and neap tide conditions. During king tides, tidal forcing was the dominant water exchange and circulation mechanism in the estuary. In contrast, long-term oscillations linked with internal and external resonance played a major role in turbulent mixing during neap tides. The data set further showed that the upper estuarine zone was far less affected by the spring tide range: the flow motion remained slow, but the turbulent velocity data were affected by the propagation of a transient front during the very early flood tide motion at the sampling site. © 2012 Springer Science+Business Media B.V.
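Turbulence characterisation from such ADV records typically rests on the Reynolds decomposition u = U + u′. A minimal sketch on a synthetic 50 Hz record (all values assumed, not the field data):

```python
import numpy as np

# Synthetic 50 Hz stationary velocity record: mean flow plus random
# fluctuations (both values assumed for illustration)
rng = np.random.default_rng(0)
fs = 50.0
u = 0.30 + 0.02 * rng.standard_normal(int(600 * fs))   # 10 min of samples

U = u.mean()                      # time-averaged velocity (m/s)
u_prime = u - U                   # turbulent fluctuation u'
intensity = u_prime.std() / U     # relative turbulence intensity
print(f"U = {U:.3f} m/s, turbulence intensity = {intensity:.1%}")
```

In a tidal estuary the "mean" itself drifts over a tidal cycle, which is why careful windowing (or the triple decomposition used in related work) is needed before computing u′.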

Relevance: 100.00%

Abstract:

Typical flow fields in a stormwater gross pollutant trap (GPT) with blocked retaining screens were experimentally captured and visualised. Particle image velocimetry (PIV) software was used to capture the flow field data by tracking neutrally buoyant particles with a high-speed camera. A technique was developed to apply the Image Based Flow Visualization (IBFV) algorithm to the raw experimental dataset generated by the PIV software. The dataset consisted of scattered 2D point velocity vectors, and the IBFV visualisation facilitates flow feature characterisation within the GPT. These flow features play a pivotal role in understanding gross pollutant capture and retention within the GPT. The IBFV animations revealed otherwise unnoticed flow features and experimental artefacts; for example, a circular tracer marker in the IBFV program visually highlighted streamlines, making it possible to investigate specific areas and identify the flow features within the GPT.
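The streamline-highlighting behaviour of an IBFV-style tracer can be imitated by advecting a particle through a vector field. The solid-body vortex below is an assumed analytic field, not the GPT data, and forward Euler is the simplest possible integrator:

```python
import numpy as np

def velocity(p):
    """Idealised solid-body vortex (assumed field): u = -y, v = x."""
    x, y = p
    return np.array([-y, x])

# Trace a streamline from a seed point, as an IBFV-style tracer would,
# by taking small forward-Euler steps through the vector field
p = np.array([1.0, 0.0])
dt = 0.001
path = [p.copy()]
for _ in range(1000):
    p = p + dt * velocity(p)
    path.append(p.copy())

radius = np.linalg.norm(p)
print(f"tracer radius after advection: {radius:.3f}")  # stays near 1 for this vortex
```

Real IBFV additionally advects and blends a noise texture every frame; on scattered PIV vectors an interpolation step onto a grid would precede the advection.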

Relevance: 100.00%

Abstract:

This paper describes a safety data recording and analysis system that has been developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors that condition, influence or potentially contribute to an occurrence are captured both for safety occurrences and for precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning around accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models, and more effective use and prioritization of safety funds. Level crossing occurrences are a key focus in this paper, with data analysis scenarios describing causal factors around near-miss occurrences. The paper concludes with a discussion of how the system can also be applied to other types of railway safety occurrences.

Relevance: 100.00%

Abstract:

Modelling of food processing is complex because it involves sophisticated materials and transport phenomena. Most agricultural products, such as fruits and vegetables, are hygroscopic porous media containing free water, bound water, gas and a solid matrix. Models that consider all of these phases are still undeveloped. In this article, a comprehensive porous media model for drying has been developed that treats bound water and free water separately, as well as water vapour and air. Free water transport was modelled as diffusion, pressure-driven flow and evaporation. Bound water was assumed to convert to free water due to the concentration difference, and can also diffuse. Binary diffusion between water vapour and air was considered. Since the model is based on fundamental physics, it can be applied to any drying application and to other food processes where heat and mass transfer take place in porous media with significant evaporation and other phase change.
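The diffusive part of such a transport model can be sketched as an explicit finite-difference step for 1-D moisture diffusion, M_t = D·M_xx. The diffusivity, grid and boundary values below are assumptions for illustration, and the full model of course couples many more mechanisms (pressure-driven flow, evaporation, binary gas diffusion):

```python
import numpy as np

# Explicit FTCS scheme for 1-D moisture diffusion (all values assumed)
D = 1.0e-9            # effective moisture diffusivity (m^2/s)
dx = 1.0e-3           # grid spacing (m)
dt = 0.2 * dx**2 / D  # time step within the stability limit dt <= dx^2 / (2*D)

M = np.full(51, 0.8)          # initial moisture content (dry basis), uniform
M[0] = M[-1] = 0.1            # drying surfaces held at low moisture

for _ in range(500):
    # central-difference Laplacian; RHS is evaluated before the in-place update
    M[1:-1] += D * dt / dx**2 * (M[2:] - 2 * M[1:-1] + M[:-2])

print(f"centre moisture after 500 steps: {M[25]:.3f}")
```

Moisture is drawn out from both faces while the slab centre lags behind — the qualitative behaviour a drying model must capture before the other phases are added.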

Relevance: 100.00%

Abstract:

Numerical results are presented to investigate the performance of a partly filled porous heat exchanger for waste heat recovery units. A parametric study was conducted to investigate the effects of inlet velocity and porous block height on the pressure drop of the heat exchanger. The focus of this work is on modelling the interface between the porous and non-porous regions. As such, numerical simulation of the problem is conducted along with hot-wire measurements to better understand the physics of the problem. Results from the two sources are then compared to existing theoretical predictions in the literature, which are unable to predict the existence of the two separation regions before and after the porous block. More interestingly, a non-uniform interface velocity was observed along the streamwise direction in both the numerical and experimental data.
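For context, a common way to estimate the pressure drop through a porous block is the Darcy–Forchheimer relation; whether the study uses exactly this closure is not stated here, and all property values below are assumed:

```python
# Darcy-Forchheimer pressure gradient: dp/dx = (mu/K)*v + (rho*C_F/sqrt(K))*v^2
# (air properties; permeability K and Forchheimer coefficient C_F are assumed)
mu = 1.85e-5    # air dynamic viscosity (Pa*s)
rho = 1.2       # air density (kg/m^3)
K = 1.0e-7      # permeability (m^2), assumed
C_F = 0.1       # Forchheimer (inertial) coefficient, assumed

def pressure_gradient(v):
    """dp/dx (Pa/m) for superficial velocity v (m/s)."""
    return mu / K * v + rho * C_F / K**0.5 * v**2

for v in (0.5, 1.0, 2.0):
    print(f"v = {v} m/s -> dp/dx = {pressure_gradient(v):.1f} Pa/m")
```

The quadratic term dominates as the inlet velocity grows, which is why the parametric study above varies inlet velocity alongside block height.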

Relevance: 100.00%

Abstract:

In an estuary, mixing and dispersion result from the combination of large-scale advection and small-scale turbulence, both of which are complex to estimate. A field study was conducted in a small sub-tropical estuary in which high-frequency (50 Hz) turbulent data were recorded continuously for about 48 hours. A triple decomposition technique was introduced to isolate the contributions of tides, resonance and turbulence in the flow field. A striking feature of the data set was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude under neap tide conditions. The triple decomposition technique allowed the broader temporal scales of the high-frequency fluctuation data, sampled over a number of full tidal cycles, to be characterised.
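One simple way to realise a triple decomposition like the one described above is with nested moving averages: a long window isolates the tidal component, a shorter window extracts the slow fluctuation from the residual, and what remains is treated as turbulence. The synthetic record and window lengths below are assumptions for illustration, not the field data or the study's exact filters:

```python
import numpy as np

def moving_average(x, window):
    """Centred moving average with edge padding (window must be odd)."""
    kernel = np.ones(window) / window
    padded = np.pad(x, window // 2, mode="edge")
    smoothed = np.convolve(padded, kernel, mode="same")
    return smoothed[window // 2 : window // 2 + x.size]

# Synthetic 1 Hz record (assumed): tide + slow resonance + turbulence
rng = np.random.default_rng(1)
t = np.arange(0, 24 * 3600)                      # 24 h of 1 Hz samples
tide = 0.5 * np.sin(2 * np.pi * t / 44700)       # semi-diurnal component
slow = 0.1 * np.sin(2 * np.pi * t / 600)         # slow oscillation
turb = 0.02 * rng.standard_normal(t.size)
u = tide + slow + turb

# Triple decomposition u = <u> + u_slow + u'  via nested moving averages
u_tide = moving_average(u, 3601)                 # ~1 h window isolates the tide
u_slow = moving_average(u - u_tide, 61)          # ~1 min window: slow fluctuation
u_turb = u - u_tide - u_slow                     # residual: turbulence
print(f"std(tide) = {u_tide.std():.3f}, std(slow) = {u_slow.std():.3f}, "
      f"std(turb) = {u_turb.std():.3f}")
```

By construction the three components sum exactly to the original record, so no signal energy is lost in the split.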

Relevance: 100.00%

Abstract:

This paper combines experimental data with simple mathematical models to investigate the influence of spray formulation type and leaf character (wettability) on the shatter, bounce and adhesion of droplets impacting cotton, rice and wheat leaves. Impaction criteria that allow for different angles of the leaf surface and the droplet impact trajectory are presented; their predictions are based on whether combinations of droplet size and velocity lie above or below the bounce and shatter boundaries. In the experimental component, real leaves are used, with all their inherent natural variability, and commercial agricultural spray nozzles are employed, resulting in a range of droplet characteristics. Given this natural variability, there is broad agreement between the data and the predictions. As predicted, droplet shatter was found to increase as droplet size and velocity increased and as the surface became harder to wet. Bouncing occurred most frequently on hard-to-wet surfaces with high-surface-tension mixtures. On the other hand, a number of small droplets with low impact velocity were observed to bounce when predicted to lie well within the adhesion regime. We believe this discrepancy between the predictions and the experimental data could be due to air layer effects not accounted for in the current bounce equations. Other discrepancies between experiment and theory are thought to be due to the current assumption of a dry impact surface, whereas, in practice, the leaf surfaces became increasingly covered with fluid throughout the spray test runs.
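Size–velocity impaction criteria like those described above are often expressed through dimensionless groups such as the Weber number, We = ρv²d/σ. The threshold and property values below are assumptions for illustration only; the paper derives its own bounce/shatter boundaries per surface and impact angle:

```python
# Weber number check for droplet impaction (illustrative values only)
def weber(rho, v, d, sigma):
    """We = rho * v^2 * d / sigma for density rho (kg/m^3), impact speed v
    (m/s), droplet diameter d (m) and surface tension sigma (N/m)."""
    return rho * v**2 * d / sigma

rho = 1000.0      # water density (kg/m^3)
sigma = 0.072     # surface tension of water (N/m)

for d_um, v in ((150, 2.0), (300, 8.0)):
    We = weber(rho, v, d_um * 1e-6, sigma)
    regime = "shatter likely" if We > 80 else "adhere/bounce regime"  # threshold assumed
    print(f"d = {d_um} um, v = {v} m/s -> We = {We:.0f} ({regime})")
```

Larger, faster droplets carry far more kinetic energy relative to surface energy, which is why shatter grows with size and velocity in the experiments.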

Relevance: 100.00%

Abstract:

In estuaries and natural water channels, the estimation of velocity and dispersion coefficients is critical to the knowledge of scalar transport and mixing. Such estimates are rarely available experimentally at sub-tidal time scales in shallow water channels, where high-frequency sampling is required to capture their spatio-temporal variation. This study estimates Lagrangian integral scales and autocorrelation curves, which are key parameters for obtaining velocity fluctuations and dispersion coefficients, and their spatio-temporal variability from deployments of Lagrangian drifters sampled at 10 Hz over a 4-hour period. The power spectral densities of the velocities between 0.0001 and 0.8 Hz were well fitted by the −5/3 slope predicted by Kolmogorov's similarity hypothesis within the inertial subrange, and were similar to the Eulerian power spectra previously observed within the estuary. The results showed that large velocity fluctuations determine the magnitude of the integral time scale, TL. Overlapping short segments improved the stability of the TL estimate by taking advantage of the redundant data included in the autocorrelation function. The integral time scales were about 20 s and varied by up to a factor of 8. These results are essential inputs for the spatial binning of velocities, Lagrangian stochastic modelling, and single-particle analysis of the tidal estuary.
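The integral-time-scale estimation described above can be sketched as follows: compute the velocity autocorrelation and integrate it from zero lag to its first zero crossing. The synthetic AR(1) velocity record, its 20 s correlation time and the 10 Hz sample rate are assumptions for illustration, not the drifter data:

```python
import numpy as np

def integral_time_scale(u, dt, max_lag=2000):
    """Integral time scale: integrate the normalised velocity autocorrelation
    (FFT-based estimate) from zero lag to its first zero crossing."""
    up = u - u.mean()
    f = np.fft.rfft(up, 2 * up.size)                  # zero-padded spectrum
    acf = np.fft.irfft(f * np.conj(f))[:max_lag]      # raw autocovariance
    acf /= acf[0]                                     # normalise to rho(0)=1
    zero = np.argmax(acf <= 0) if np.any(acf <= 0) else acf.size
    seg = acf[:zero]
    return dt * (seg.sum() - 0.5 * (seg[0] + seg[-1]))  # trapezoidal rule

# Synthetic 10 Hz velocity with a known ~20 s correlation time (assumed)
rng = np.random.default_rng(2)
dt, tau = 0.1, 20.0
a = np.exp(-dt / tau)                     # AR(1) coefficient -> exp(-t/tau) ACF
b = np.sqrt(1 - a**2)
eps = rng.standard_normal(200_000)
u = np.empty_like(eps)
u[0] = eps[0]
for i in range(1, u.size):
    u[i] = a * u[i - 1] + b * eps[i]

T_L = integral_time_scale(u, dt)
print(f"estimated integral time scale T_L = {T_L:.1f} s")
```

Truncating at the first zero crossing is one common convention; the study's overlapping-segment averaging further stabilises exactly this kind of estimate.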

Relevance: 100.00%

Abstract:

In this paper, we used a nonconservative Lagrangian mechanics approach to formulate a new statistical algorithm for fluid registration of 3-D brain images, named SAFIRA (statistically-assisted fluid image registration algorithm). A nonstatistical version of this algorithm was first implemented, in which the deformation was regularized by penalizing deviations from a zero rate of strain; in the statistical versions, the terms regularizing the deformation also included the covariance of the deformation matrices (Σ) and of the vector fields (q). Here, we used a Lagrangian framework to reformulate the algorithm, showing that the regularizing terms essentially allow nonconservative work to occur during the flow. Given 3-D brain images from a group of subjects, vector fields and their corresponding deformation matrices are computed in a first round of registrations using the nonstatistical implementation. Covariance matrices for both the deformation matrices and the vector fields are then obtained and incorporated (separately or jointly) into the nonconservative terms, creating four versions of SAFIRA. We evaluated and compared our algorithms' performance on 92 3-D brain scans from healthy monozygotic and dizygotic twins; 2-D validations are also shown for corpus callosum shapes delineated at the midline in the same subjects. After preliminary tests to demonstrate each method, we compared their detection power using tensor-based morphometry (TBM), a technique for analyzing local volumetric differences in brain structure, and we compared the accuracy of each algorithm variant using various statistical metrics derived from the images and deformation fields. All these tests were also run with a traditional fluid method, which has been widely used in TBM studies. The versions incorporating vector-based empirical statistics on brain variation were consistently more accurate than their counterparts when used for automated volumetric quantification in new brain images, suggesting the advantages of this approach for large-scale neuroimaging studies.

Relevance: 100.00%

Abstract:

Increasingly large-scale applications are generating unprecedented amounts of data. However, the growing gap between computation and I/O capacity on high-end computing (HEC) machines creates a severe bottleneck for data analysis. Instead of moving data from its source to output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis incurs far more computing resource contention with simulations, and such contention severely degrades simulation performance on HEC machines. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the placement of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path and, to find the best strategy for reducing data movement in a given situation, we propose a flexible data analytics framework, FlexAnalytics. Based on this framework, a FlexAnalytics prototype system was developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, as well as large-scale data transfer. Two use cases – scientific data compression and remote visualization – were applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transmission bandwidth and improves application end-to-end transfer performance.
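The placement trade-off described above — whether to process (e.g., compress) data before moving it — can be sketched as a simple end-to-end-time comparison. All throughput numbers below are illustrative assumptions, not FlexAnalytics measurements:

```python
def best_strategy(data_mb, bw_mb_s, ratio, compress_mb_s):
    """Compare raw transfer vs compress-then-transfer end-to-end time (s).
    data_mb: data volume (MB); bw_mb_s: link bandwidth (MB/s);
    ratio: compression ratio; compress_mb_s: compression throughput (MB/s)."""
    t_raw = data_mb / bw_mb_s
    t_comp = data_mb / compress_mb_s + (data_mb / ratio) / bw_mb_s
    return ("compress", t_comp) if t_comp < t_raw else ("raw", t_raw)

# Slow shared link: compression reduces end-to-end time
print(best_strategy(10_000, 50, 4, 400))     # -> ('compress', 75.0)
# Fast local link: raw transfer wins
print(best_strategy(10_000, 1_000, 4, 400))  # -> ('raw', 10.0)
```

The crossover point shifts with link bandwidth, compression throughput and achievable ratio — which is precisely why analytics placement along the I/O path needs to be flexible rather than fixed.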