897 results for air annealing


Relevance: 20.00%

Abstract:

One important step in the design of air stripping operations for the removal of VOCs is the choice of operating conditions, which are based on the phase ratio. This parameter directly sets the stripping factor and the efficiency of the operation. Its value has an upper limit determined by the flooding regime, which is predicted using empirical correlations, namely the one developed by Eckert. This type of approach is not suitable for the development of algorithms. Using a pilot-scale column and a convenient solution, the pressure drop was determined under different operating conditions and the experimental values were compared with the estimates. This research will be incorporated into a global model for simulating the dynamics of air stripping using a multivariable distributed-parameter system.
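As an illustration of how the phase ratio fixes the stripping factor, the following sketch computes it for assumed values of the Henry's law constant and the air-to-water ratio (neither taken from the study):

```python
def stripping_factor(H, G, L):
    """Stripping factor S = H * (G / L), where H is the dimensionless
    Henry's law constant and G/L is the gas-to-liquid phase ratio."""
    return H * (G / L)

# Illustrative values (not from the study): a VOC with H ~ 0.12 at ~20 degC
H = 0.12          # dimensionless Henry's law constant (assumed)
G_over_L = 30.0   # air-to-water volumetric phase ratio (assumed)
S = stripping_factor(H, G_over_L, 1.0)   # S = 3.6
```

With H = 0.12, a phase ratio above roughly 8.3 is needed for a stripping factor greater than one, which is why the upper limit imposed by flooding matters.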

Relevance: 20.00%

Abstract:

STRIPPING is a software application developed for the automatic design of a randomly packed column in which the transfer of volatile organic compounds (VOCs) from water to air can be performed, and for simulating its steady-state behaviour. The software eliminates the need for experimental work in the selection of the column diameter and allows the most convenient hydraulic regime for this type of operation to be chosen a priori. It also lets the operator choose the model used for the calculation of some parameters, namely between the Eckert/Robbins and Billet models for estimating the pressure drop of the gaseous phase, and between the Billet and Onda/Djebbar models for the mass transfer. Illustrations of the graphical interface are presented.

Relevance: 20.00%

Abstract:

This work deals with the numerical simulation of the air stripping process for the pre-treatment of groundwater used for human consumption. The steady-state model has an exponential solution that is used, together with the Tau method, to obtain a spectral approximation of the solution of the system of partial differential equations associated with the transient-state model.
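A minimal sketch of the exponential steady-state solution mentioned above, with a hypothetical lumped transfer coefficient; the actual model couples two phases and is solved spectrally with the Tau method:

```python
import math

def steady_state_profile(c0, k, z):
    """Exponential steady-state concentration profile c(z) = c0 * exp(-k*z)
    for a first-order transfer model along the column height z."""
    return c0 * math.exp(-k * z)

# Hypothetical parameters: inlet concentration and lumped transfer coefficient
c0, k = 100.0, 0.5                                  # e.g. ug/L and 1/m (assumed)
removal = 1 - steady_state_profile(c0, k, 4.0) / c0  # fraction removed over 4 m
```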

Relevance: 20.00%

Abstract:

Volatile organic compounds are a common source of groundwater contamination that can be easily removed by air stripping in columns with random packing and a counter-current flow between the phases. This work proposes a new column design methodology, valid for any type of packing and contaminant, which avoids both an arbitrarily chosen diameter and the usual graphical Eckert correlations for pressure drop. The hydraulic features are chosen beforehand as a design criterion. The design procedure was translated into a convenient algorithm in C++. A column was built in order to test the design and the theoretical steady-state and dynamic behaviour. The experiments were conducted using a solution of chloroform in distilled water. The results allowed a correction of the theoretical global mass transfer coefficient previously estimated by the Onda correlations, which depend on several parameters that are not easy to control experimentally. To best describe the column behaviour under stationary and dynamic conditions, an original mathematical model was developed. It consists of a system of two nonlinear partial differential equations (distributed parameters). When the flows are steady, however, the system becomes linear, although no analytical solution is evident. In steady state the resulting ODE can be solved analytically, and in the dynamic state the discretization of the PDEs by finite differences overcomes this difficulty. A numerical algorithm was used to estimate the contaminant concentrations in both phases along the column. The large number of resulting algebraic equations and the impossibility of generating a recursive procedure prevented the construction of a generalized program, but an iterative procedure developed in a spreadsheet allowed the simulation. The solution is stable only for similar discretization values: if dissimilar time and space discretization parameters are used, the solution easily becomes unstable. The system's dynamic behaviour was simulated for the common liquid-phase perturbations: step, impulse, rectangular pulse and sinusoidal. The final results show no strange or unpredictable behaviour.
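The stability issue can be illustrated with an explicit upwind discretization of a simplified single-phase analogue of the column equations; all parameters below are assumed, and the scheme is stable only when the Courant number v·Δt/Δz stays at or below one, which is why time and space steps must be chosen together:

```python
def upwind_step(c, v, k, dz, dt, c_in):
    """One explicit upwind finite-difference step for
    dc/dt + v*dc/dz = -k*c (a simplified single-phase analogue of the
    column model; the two-phase system in the text couples two such PDEs)."""
    new = c[:]
    new[0] = c_in                         # inlet boundary condition
    for i in range(1, len(c)):
        new[i] = c[i] - v * dt / dz * (c[i] - c[i - 1]) - k * c[i] * dt
    return new

# Stability requires Courant number v*dt/dz <= 1: dissimilar time/space
# discretizations violate it and the solution blows up.
v, k, dz, dt = 0.1, 0.05, 0.1, 0.5       # assumed values; Courant = 0.5
c = [0.0] * 50
for _ in range(200):                      # step response to a unit inlet
    c = upwind_step(c, v, k, dz, dt, c_in=1.0)
```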

Relevance: 20.00%

Abstract:

Hydraulic systems are dynamically susceptible to entrapped air pockets, which lead to amplified transient reactions. In order to model the dynamic action of an entrapped air pocket in a confined system, a heuristic mathematical formulation based on a conceptual analogy to a mechanical spring-damper system is proposed. The formulation is based on the polytropic relationship of an ideal gas and includes an additional term encompassing the combined damping effects associated with the thermodynamic deviations from the theoretical transformation, as well as those arising from the transient vorticity developed in both fluid domains (air and water). These effects represent the key factors that account for flow energy dissipation and pressure damping. The model was validated via numerical simulation of experimental measurements.
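A heuristic sketch of the spring-damper analogy: the polytropic gas law acts as a nonlinear spring and a single lumped coefficient stands in for all dissipation. Every parameter below is illustrative, not taken from the paper's validation data:

```python
def simulate_air_pocket(steps=20000, dt=1e-4):
    """Damped oscillation of a water column against an entrapped air pocket:
    polytropic gas law p * V**n = const as the (nonlinear) spring, plus a
    lumped linear damper for all dissipative effects."""
    n = 1.2          # polytropic exponent (assumed, between isothermal and adiabatic)
    area = 0.01      # pipe cross-section, m^2
    length0 = 0.5    # initial air pocket length, m
    p0 = 101325.0    # initial absolute air pressure, Pa
    mass = 50.0      # moving water column mass, kg
    damping = 40.0   # lumped damping coefficient, N*s/m
    x, v = 0.05, 0.0    # initial pocket compression (m) and velocity (m/s)
    for _ in range(steps):
        p = p0 * (length0 / (length0 - x)) ** n   # polytropic gas pressure
        force = -(p - p0) * area - damping * v    # restoring spring + damper
        v += force / mass * dt                    # semi-implicit Euler step
        x += v * dt
    return x, v

x_end, v_end = simulate_air_pocket()   # oscillation decays toward x = 0
```

The semi-implicit (symplectic) Euler update keeps the undamped part of the oscillation from artificially gaining energy, so the decay seen in the simulation comes from the damping term alone.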

Relevance: 20.00%

Abstract:

The design of magnetic cores can be carried out by taking into account the optimization of different parameters in accordance with the application requirements. Considering the specifications of the fast field cycling nuclear magnetic resonance (FFC-NMR) technique, the magnetic flux density distribution at the sample insertion volume is one of the core parameters that needs to be evaluated. Recently, it has been shown that FFC-NMR magnets can be built on the basis of solenoid coils with ferromagnetic cores. Since this type of apparatus requires magnets with high magnetic flux density uniformity, a new type of magnet using a ferromagnetic core, copper coils, and superconducting blocks was designed with an improved magnetic flux density distribution. In this paper, the design aspects of the magnet are described and discussed with emphasis on the improvement of the magnetic flux density homogeneity (ΔB/B0) in the air gap. The magnetic flux density distribution is analyzed based on 3-D simulations and NMR experimental results.
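The homogeneity figure ΔB/B0 can be computed from a field map as the peak-to-peak deviation over the mean flux density; the sample values below are hypothetical, not measured data:

```python
def flux_homogeneity(samples):
    """Homogeneity figure Delta B / B0 over a set of magnetic flux density
    samples taken in the air gap: peak-to-peak deviation divided by the
    mean value B0."""
    b0 = sum(samples) / len(samples)
    return (max(samples) - min(samples)) / b0

# Hypothetical field-map values in tesla (not the paper's data)
b = [0.2101, 0.2099, 0.2103, 0.2098, 0.2100]
homogeneity = flux_homogeneity(b)   # ~2.4e-3
```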

Relevance: 20.00%

Abstract:

Master's project in New Media and Web Practices (Novos Media e Práticas Web)

Relevance: 20.00%

Abstract:

The massification of electric vehicles (EVs) can have a significant impact on the power system, requiring a new approach to energy resource management. Energy resource management aims to obtain the optimal scheduling of the available resources, considering distributed generators, storage units, demand response and EVs. The large number of resources makes the problem more complex: reaching the optimal solution can take several hours, while a solution for the next day is needed quickly. It is therefore necessary to use adequate optimization techniques to determine a good solution in a reasonable amount of time. This paper presents a hybrid artificial intelligence technique to solve a complex energy resource management problem with a large number of resources, including EVs, connected to the electric network. The hybrid approach combines simulated annealing (SA) and ant colony optimization (ACO). The case study concerns different EV penetration levels, and comparisons with a previous SA approach and a deterministic technique are also presented. For the 2000-EV scenario, the proposed hybrid approach found a better solution than the previous SA version, resulting in a cost reduction of 1.94%, and is approximately 94 times faster than the deterministic approach.
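A minimal sketch of the SA half of the hybrid approach on a toy one-dimensional cost function; the actual problem schedules thousands of resources, and the paper couples SA with ACO rather than running SA alone:

```python
import math, random

def simulated_annealing(cost, neighbor, x0, t0=1.0, alpha=0.95, iters=2000):
    """Generic simulated annealing loop: accept worse solutions with
    probability exp(-delta/T) and cool the temperature geometrically."""
    random.seed(0)
    x, fx, t = x0, cost(x0), t0
    best, fbest = x, fx
    for _ in range(iters):
        y = neighbor(x)
        fy = cost(y)
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha                         # geometric cooling schedule
    return best, fbest

# Toy stand-in for the scheduling cost (assumed, not the paper's model)
cost = lambda x: (x - 3.0) ** 2
neighbor = lambda x: x + random.uniform(-0.5, 0.5)
best, fbest = simulated_annealing(cost, neighbor, x0=10.0)
```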

Relevance: 20.00%

Abstract:

The ventilation efficiency concept is an attempt to quantify a parameter that can easily distinguish the different options for air diffusion in building spaces. Thirteen air diffusion strategies were measured in a test chamber using the tracer gas method, with the objective of validating calculations by computational fluid dynamics (CFD). The Air Change Efficiency (ACE) and the Contaminant Removal Effectiveness (CRE), the two most internationally accepted indicators, were compared. The main results show that the numerical simulations are in good agreement with the experimental measurements and that, to maximize ventilation efficiency, the schemes to adopt are those operating with low supply air speeds and small differences between the supply air temperature and the room temperature.
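The two indicators have simple defining formulas, sketched below with illustrative tracer-gas numbers rather than the chamber data:

```python
def contaminant_removal_effectiveness(c_exhaust, c_mean, c_supply=0.0):
    """CRE: contaminant concentration in the exhaust over the mean
    concentration in the occupied zone (supply concentration subtracted).
    CRE > 1 means the diffusion scheme beats perfect mixing."""
    return (c_exhaust - c_supply) / (c_mean - c_supply)

def air_change_efficiency(nominal_time_constant, mean_age_of_air):
    """ACE: nominal time constant over twice the room-mean age of air;
    0.5 corresponds to perfect mixing, 1.0 to ideal piston flow."""
    return nominal_time_constant / (2.0 * mean_age_of_air)

# Illustrative values (not the chamber measurements)
cre = contaminant_removal_effectiveness(c_exhaust=120.0, c_mean=100.0)        # 1.2
ace = air_change_efficiency(nominal_time_constant=600.0, mean_age_of_air=400.0)  # 0.75
```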

Relevance: 20.00%

Abstract:

Considering tobacco smoke as one of the most health-relevant indoor sources, the aim of this work was to further understand its negative impacts on human health. The specific objectives were to evaluate the levels of particulate-bound PAHs in smoking and non-smoking homes and to assess the risks associated with inhalation exposure to these compounds. The work applied the toxicity equivalency factors approach (including the estimation of lifetime lung cancer risks, WHO) and the methodology established by the USEPA (considering three different age categories) to 18 PAHs detected in inhalable (PM10) and fine (PM2.5) particles at two homes. The total concentration of the 18 PAHs (ΣPAHs) was 17.1 and 16.6 ng m^-3 in PM10 and PM2.5 at the smoking home and 7.60 and 7.16 ng m^-3 in PM10 and PM2.5 at the non-smoking one. Compounds with five and six rings made up the majority of the particulate PAH content (73 and 78 % of ΣPAHs at the smoking and non-smoking home, respectively). Target carcinogenic risks exceeded the USEPA health-based guideline at the smoking home for two of the age categories. Estimated lifetime lung cancer risks largely exceeded (68–200 times) the health-based guideline levels at both homes, demonstrating that long-term exposure to PAHs at these levels would eventually cause a risk of developing cancer. The high cancer risks determined in the absence of smoking were probably caused by the contribution of PAHs from outdoor sources.
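The toxicity equivalency factor (TEF) approach reduces a PAH mixture to a benzo[a]pyrene-equivalent concentration, which the WHO unit risk then converts into a lifetime lung cancer risk. The TEF subset and the concentrations below are illustrative, not the study's measured values:

```python
# Nisbet & LaGoy-style toxicity equivalency factors (small assumed subset)
TEF = {"benzo[a]pyrene": 1.0, "benzo[a]anthracene": 0.1,
       "chrysene": 0.01, "pyrene": 0.001}

def bap_equivalent(concentrations):
    """Sum of PAH concentrations (ng/m3) weighted by their TEFs, giving a
    benzo[a]pyrene-equivalent concentration."""
    return sum(c * TEF[name] for name, c in concentrations.items())

def lifetime_lung_cancer_risk(bap_eq, unit_risk=8.7e-5):
    """WHO unit risk: 8.7e-5 excess lifetime lung cancer cases per ng/m3
    of BaP-equivalent exposure."""
    return bap_eq * unit_risk

# Hypothetical indoor concentrations in ng/m3 (not the measured values)
conc = {"benzo[a]pyrene": 1.0, "benzo[a]anthracene": 0.8,
        "chrysene": 2.0, "pyrene": 3.0}
risk = lifetime_lung_cancer_risk(bap_equivalent(conc))
```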

Relevance: 20.00%

Abstract:

Due to their detrimental effects on human health, scientific interest in ultrafine particles (UFP) has been increasing, but the available information is far from comprehensive. Compared to the rest of the population, the elderly are potentially highly susceptible to the effects of outdoor air pollution. Thus, this study aimed to (1) determine the levels of outdoor pollutants in an urban area with emphasis on UFP concentrations and (2) estimate the respective dose rates of exposure for elderly populations. UFP were continuously measured over 3 weeks at 3 sites in north Portugal: 2 urban (U1 and U2) and 1 rural used as reference (R1). Meteorological parameters and outdoor pollutants including particulate matter (PM10), ozone (O3), nitric oxide (NO), and nitrogen dioxide (NO2) were also measured. The dose rates of inhalation exposure to UFP were estimated for three elderly age categories: 64–70, 71–80, and >81 years. Over the sampling period the levels of PM10, O3 and NO2 were in compliance with European legislation. Mean UFP were 1.7 × 10^4 and 1.2 × 10^4 particles/cm^3 at U1 and U2, respectively, whereas levels at the rural site were 20–70% lower (mean of 1 × 10^4 particles/cm^3). Vehicular traffic and local emissions were the predominant identified sources of UFP at the urban sites. In addition, correlation analysis showed that UFP were meteorologically dependent. Exposure dose rates were 1.2- to 1.4-fold higher at the urban sites than at the reference site, with the highest levels noted for the 71–80 yr category, attributed mainly to higher inhalation rates.
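Inhalation dose rates of the kind estimated in the study follow from concentration times inhalation rate per unit body weight; the inhalation rate and body weight below are assumptions, not the study's age-specific parameters:

```python
def dose_rate(concentration, inhalation_rate, body_weight):
    """Inhalation exposure dose rate D = C * IR / BW: particles (or mass)
    inhaled per kg of body weight per day."""
    return concentration * inhalation_rate / body_weight

# Hypothetical inputs: UFP number concentration converted from particles/cm3
# to particles/m3, an assumed inhalation rate of 16 m3/day, and 70 kg BW
c = 1.7e4 * 1e6        # particles/m3
d = dose_rate(c, inhalation_rate=16.0, body_weight=70.0)
```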

Relevance: 20.00%

Abstract:

We have calculated the equilibrium shape of the axially symmetric meniscus along which a spherical bubble contacts a flat liquid surface by analytically integrating the Young-Laplace equation in the presence of gravity, in the limit of large Bond numbers. This method has the advantage that it provides semianalytical expressions for key geometrical properties of the bubble in terms of the Bond number. Results are in good overall agreement with experimental data and are consistent with fully numerical (Surface Evolver) calculations. In particular, we are able to describe how the bubble shape changes from hemispherical, with a flat, shallow bottom, to lenticular, with a deeper, curved bottom, as the Bond number is decreased.
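The Bond number that controls the transition between the hemispherical and lenticular regimes can be evaluated directly; the bubble radius below is illustrative:

```python
def bond_number(rho, g, r, gamma):
    """Bond number Bo = rho * g * r**2 / gamma, comparing gravity to
    surface tension; large Bo gives the flattened, hemispherical regime."""
    return rho * g * r ** 2 / gamma

# Air bubble at a water surface (illustrative values, not the paper's data)
Bo = bond_number(rho=1000.0, g=9.81, r=5e-3, gamma=0.072)   # ~3.4
```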

Relevance: 20.00%

Abstract:

Hyperspectral remote sensing exploits the electromagnetic scattering patterns of the different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixing of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions.
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises ICA applicability to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix which minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures.
The MVT-type approaches are computationally complex. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of a lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices. The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
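A stdlib-only sketch of the linear mixing model and the PPI-style extreme-projection idea described above; the two synthetic signatures are invented, and real PPI/VCA/N-FINDR implementations operate on full hyperspectral cubes:

```python
import random

def mix(endmembers, abundances):
    """Linear mixing: each pixel is an abundance-weighted sum of the
    endmember signatures (abundances are nonnegative and sum to one)."""
    bands = len(endmembers[0])
    return [sum(a * e[b] for a, e in zip(abundances, endmembers))
            for b in range(bands)]

def ppi_scores(pixels, n_skewers=200):
    """PPI-style purity scores: project every pixel onto random skewers and
    count how often it is an extreme; pure pixels collect the high scores."""
    random.seed(1)
    bands = len(pixels[0])
    scores = [0] * len(pixels)
    for _ in range(n_skewers):
        skewer = [random.gauss(0, 1) for _ in range(bands)]
        proj = [sum(p[b] * skewer[b] for b in range(bands)) for p in pixels]
        scores[proj.index(max(proj))] += 1
        scores[proj.index(min(proj))] += 1
    return scores

# Two synthetic endmember signatures and three pixels (two pure, one mixed)
e1, e2 = [1.0, 0.2, 0.1], [0.1, 0.3, 1.0]
pixels = [e1, e2, mix([e1, e2], [0.5, 0.5])]
scores = ppi_scores(pixels)   # the mixed pixel scores lowest
```

Because the mixed pixel is a convex combination of the two pure ones, its projection onto any skewer lies strictly between theirs, so it is never an extreme.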

Relevance: 20.00%

Abstract:

The aggregation and management of Distributed Energy Resources (DERs) by a Virtual Power Player (VPP) is an important task in a smart grid context. The Energy Resource Management (ERM) of these DERs can become a hard and complex optimization problem. The large-scale integration of several DERs, including Electric Vehicles (EVs), may lead to a scenario in which the VPP needs several hours to obtain a solution for the ERM problem. This is why metaheuristic methodologies are needed to come up with a good solution in a reasonable amount of time. This paper proposes a Simulated Annealing (SA) approach to the ERM problem considering an intensive use of DERs, mainly EVs. The possibility of applying Demand Response (DR) programs to the EVs is considered, and a trip-reduction DR program is implemented. The SA methodology is tested on a 32-bus distribution network with 2000 EVs, and the SA results are compared with a deterministic technique and with particle swarm optimization.

Relevance: 20.00%

Abstract:

Demand response has gained considerable importance over the years. There are many demand response programs; the one proposed in this paper uses air conditioners to increase power quality and reduce costs in several ways, from infrastructure expenses to customers' energy bills. This paper proposes a method, and presents a study, on how air conditioners can be integrated into demand response programs. The proposed method is modelled as an energy resources management optimization problem. Two case studies are presented, the first with all customers participating and the second with only some of them, and the results obtained for both are analyzed.
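One simple way to realize such a program is greedy curtailment: switch off the air conditioners with the lowest discomfort per kW until the reduction target is met. The units and scores below are hypothetical, and the paper formulates the problem as a full optimization rather than this greedy heuristic:

```python
def select_acs_for_curtailment(acs, target_kw):
    """Greedy selection of air conditioners to switch off until a demand
    reduction target is met, preferring units with the lowest discomfort
    cost per kW curtailed. `acs` is a list of (name, power_kw, discomfort)."""
    ranked = sorted(acs, key=lambda a: a[2] / a[1])   # discomfort per kW
    chosen, shed = [], 0.0
    for name, power, _ in ranked:
        if shed >= target_kw:
            break
        chosen.append(name)
        shed += power
    return chosen, shed

# Hypothetical customer units: (name, rated power in kW, discomfort score)
acs = [("ac1", 2.0, 1.0), ("ac2", 1.5, 3.0), ("ac3", 3.0, 1.5), ("ac4", 1.0, 0.2)]
chosen, shed = select_acs_for_curtailment(acs, target_kw=4.0)
```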