996 results for Classical methods
Abstract:
Understanding a product's end-of-life is important to reduce the environmental impact of its final disposal. When the initial stages of product development consider end-of-life aspects, which can be established through ecodesign (a proactive approach to environmental management that aims to reduce the total environmental impact of products), it becomes easier to close the loop of materials. End-of-life ecodesign methods generally include more than one end-of-life strategy. Since product complexity varies substantially, some components, systems or sub-systems are easier to recycle, reuse or remanufacture than others. Remanufacture is an effective way to keep products in a closed loop, reducing both the environmental impacts and the costs of manufacturing processes. This paper presents some ecodesign methods focused on the integration of different end-of-life strategies, with special attention to remanufacturing, given its increasing importance in the international scenario to reduce the life cycle impacts of products. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
We assess the performance of three unconditionally stable finite-difference time-domain (FDTD) methods for the modeling of doubly dispersive metamaterials: 1) locally one-dimensional FDTD; 2) locally one-dimensional FDTD with Strang splitting; and 3) alternating direction implicit FDTD. We use both double-negative media and zero-index media as benchmarks.
Abstract:
The airflow velocities and pressures are calculated from a three-dimensional model of the human larynx using the finite element method. The laryngeal airflow is assumed to be incompressible, isothermal, steady, and created by fixed pressure drops. The influence of different laryngeal profiles (convergent, parallel, and divergent), glottal area, and dimensions of the false vocal folds on the airflow is investigated. The results indicate that vertical and horizontal phase differences in the laryngeal tissue movements are influenced by the nonlinear pressure distribution across the glottal channel, and that the glottal entrance shape influences the air pressure distribution inside the glottis. Additionally, the false vocal folds increase the glottal duct pressure drop by creating a new constricted channel in the larynx, and alter the airflow vortexes formed after the true vocal folds. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
On-line leak detection is a major concern for the safe operation of pipelines. Acoustic and mass-balance methods are the most important and most widely applied technologies in field problems. The objective of this work is to compare these leak detection methods with respect to a given reference situation, i.e., the same pipeline and monitoring signals acquired at the inlet and outlet ends. Experimental tests were conducted in a 749 m long laboratory pipeline transporting water as the working fluid. The instrumentation included pressure transducers and electromagnetic flowmeters. Leaks were simulated by opening solenoid valves placed at known positions and previously calibrated to produce known average leak flow rates. The results clearly show the limitations and advantages of each method. It is also quite clear that acoustic and mass-balance technologies are, in fact, complementary. In general, an acoustic leak detection system sends out an alarm more rapidly and locates the leak more precisely, provided that the rupture of the pipeline occurs abruptly enough. On the other hand, a mass-balance leak detection method is capable of quantifying the leak flow rate very accurately and of detecting progressive leaks.
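To make the mass-balance idea concrete, here is a minimal Python sketch (an illustration, not the instrumentation-specific algorithm evaluated in the paper): it integrates the difference between inlet and outlet flowmeter readings and raises an alarm once the accumulated imbalance exceeds a threshold. The function name and threshold rule are assumptions, and a real detector would also correct for line pack (inventory changes driven by pressure transients).

```python
import numpy as np

def mass_balance_leak_check(q_in, q_out, dt, volume_threshold):
    """Hypothetical mass-balance leak detector for a liquid pipeline.

    q_in, q_out      : flow-rate series from inlet/outlet flowmeters (m3/s)
    dt               : sampling interval (s)
    volume_threshold : accumulated imbalance that triggers the alarm (m3)

    Note: line pack is ignored here; real systems correct the imbalance
    for pressure-driven inventory changes before integrating it.
    """
    q_in, q_out = np.asarray(q_in, float), np.asarray(q_out, float)
    imbalance = np.cumsum((q_in - q_out) * dt)  # net volume unaccounted for
    alarm = np.abs(imbalance) > volume_threshold
    leak_rate_estimate = q_in[-1] - q_out[-1]   # steady-state leak flow rate
    return alarm, leak_rate_estimate
```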
Abstract:
Neodymium-doped and undoped aluminum oxide samples were obtained using two different techniques: Pechini and sol-gel. Fine-grained powders were produced by both procedures and analyzed using Scanning Electron Microscopy (SEM) and Thermo-Stimulated Luminescence (TSL). Results showed that the incorporation of neodymium ions is responsible for the creation of two new TSL peaks (125 and 265 °C) and also for the enhancement of the intrinsic TSL peak at 190 °C. An explanation is proposed for these observations. SEM gave the dimensions of the clusters produced by each method, showing that those obtained by the Pechini route are smaller than those produced by sol-gel, which may also explain the higher emission from the former. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
This paper deals with the use of simplified methods to predict methane generation in tropical landfills. Methane recovery data obtained on site as part of a research program being carried out at the Metropolitan Landfill, Salvador, Brazil, are analyzed and used to obtain field methane generation over time. Laboratory data from MSW samples of different ages are presented and discussed, and simplified procedures to estimate the methane generation potential, L(o), and the constant related to the biodegradation rate, k, are applied. The first-order decay method is used to fit field and laboratory results. It is demonstrated that, despite the assumptions and the simplicity of the adopted laboratory procedures, the values of L(o) and k obtained are very close to those measured in the field, making this kind of analysis very attractive for first-approach purposes. (C) 2008 Elsevier Ltd. All rights reserved.
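For a first-approach illustration, the first-order decay model mentioned above can be fitted to measured generation rates in a few lines. The sketch below assumes a single waste batch and uses hypothetical data; it is not the paper's field procedure. The model is q(t) = L(o)·k·exp(-kt), where L(o) is the methane generation potential and k the decay constant.

```python
import numpy as np
from scipy.optimize import curve_fit

def methane_rate(t, L0, k):
    # First-order decay generation rate per unit mass of waste:
    # q(t) = L0 * k * exp(-k * t), with L0 in m3 CH4/Mg and k in 1/yr
    return L0 * k * np.exp(-k * t)

# Hypothetical recovery data: time (yr) and rate (m3 CH4 / (Mg yr))
t_obs = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
q_obs = np.array([9.1, 8.3, 6.9, 4.8, 3.3, 2.3])

(L0_fit, k_fit), _ = curve_fit(methane_rate, t_obs, q_obs, p0=(70.0, 0.1))
print(f"L0 = {L0_fit:.1f} m3/Mg, k = {k_fit:.3f} 1/yr")
```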
Abstract:
An updated flow pattern map was developed for CO2 on the basis of the previous Cheng-Ribatski-Wojtan-Thome CO2 flow pattern map [1,2] to extend it to a wider range of conditions. A new annular flow to dryout transition (A-D) and a new dryout to mist flow transition (D-M) are proposed here. In addition, a bubbly flow region, which generally occurs at high mass velocities and low vapor qualities, was added to the updated map. The updated flow pattern map is applicable to a much wider range of conditions: tube diameters from 0.6 to 10 mm, mass velocities from 50 to 1500 kg/(m² s), heat fluxes from 1.8 to 46 kW/m² and saturation temperatures from -28 to +25 °C (reduced pressures from 0.21 to 0.87). The updated flow pattern map was compared to independent experimental data of flow patterns for CO2 in the literature and predicts the flow patterns well. A database of CO2 two-phase flow pressure drop results from the literature was then set up and compared to the leading empirical pressure drop models: the correlations by Chisholm [3], Friedel [4], Grönnerud [5] and Müller-Steinhagen and Heck [6], a modified Chisholm correlation by Yoon et al. [7] and the flow pattern based model of Moreno Quiben and Thome [8-10]. None of these models was able to predict the CO2 pressure drop data well. Therefore, a new flow pattern based phenomenological model of two-phase flow frictional pressure drop for CO2 was developed by modifying the model of Moreno Quiben and Thome using the updated flow pattern map of this study; it predicts the CO2 pressure drop database quite well overall. (C) 2007 Elsevier Ltd. All rights reserved.
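As a concrete instance of the class of empirical correlations compared above, the classical Lockhart-Martinelli approach with Chisholm's C parameter can be sketched as follows. This shows only the general form of such correlations; it is not the new flow pattern based model developed in the paper.

```python
def two_phase_pressure_gradient(dpdz_liquid, dpdz_gas, C=20.0):
    """Lockhart-Martinelli frictional gradient with Chisholm's C parameter.

    X^2 = (dp/dz)_L / (dp/dz)_G is the Martinelli parameter, and
    phi_L^2 = 1 + C/X + 1/X^2 is the two-phase multiplier; C = 20 for
    turbulent-turbulent flow (5, 10 or 12 for other regime pairs).
    """
    X = (dpdz_liquid / dpdz_gas) ** 0.5
    phi_L2 = 1.0 + C / X + 1.0 / X ** 2
    return phi_L2 * dpdz_liquid  # same units as the input gradients
```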
Abstract:
Corresponding to the updated flow pattern map presented in Part I of this study, an updated general flow pattern based flow boiling heat transfer model was developed for CO2 using the Cheng-Ribatski-Wojtan-Thome [L. Cheng, G. Ribatski, L. Wojtan, J.R. Thome, New flow boiling heat transfer model and flow pattern map for carbon dioxide evaporating inside horizontal tubes, Int. J. Heat Mass Transfer 49 (2006) 4082-4094; L. Cheng, G. Ribatski, L. Wojtan, J.R. Thome, Erratum to: "New flow boiling heat transfer model and flow pattern map for carbon dioxide evaporating inside tubes" [Heat Mass Transfer 49 (21-22) (2006) 4082-4094], Int. J. Heat Mass Transfer 50 (2007) 391] flow boiling heat transfer model as the starting basis. The flow boiling heat transfer correlation in the dryout region was updated. In addition, a new mist flow heat transfer correlation for CO2 was developed based on the CO2 data, and a heat transfer method for bubbly flow was proposed for completeness' sake. The updated general flow boiling heat transfer model for CO2 covers all flow regimes and is applicable to a wider range of conditions for horizontal tubes: tube diameters from 0.6 to 10 mm, mass velocities from 50 to 1500 kg/(m² s), heat fluxes from 1.8 to 46 kW/m² and saturation temperatures from -28 to 25 °C (reduced pressures from 0.21 to 0.87). The updated general flow boiling heat transfer model was compared to a new experimental database containing 1124 data points (790 more than in the previous model [Cheng et al., 2006, 2007]). Good agreement between predicted and experimental data was found in general, with 71.4% of the entire database, and 83.2% of the database without the dryout and mist flow data, predicted within ±30%. However, the predictions for the dryout and mist flow regions were less satisfactory due to the limited number of data points, the higher inaccuracy of such data, scatter in some data sets ranging up to 40%, significant discrepancies from one experimental study to another, and the difficulties associated with predicting the inception and completion of dryout around the perimeter of horizontal tubes. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
This paper presents an Adaptive Maximum Entropy (AME) approach for modeling biological species. The Maximum Entropy algorithm (MaxEnt) is one of the most widely used methods for modeling the geographical distribution of biological species. The approach presented here is an alternative to the classical algorithm. Instead of using the same set of features throughout training, the AME approach tries to insert or remove a single feature at each iteration. The aim is to reach convergence faster without affecting the performance of the generated models. The preliminary experiments performed well, showing improvements in both accuracy and execution time. Comparisons with other algorithms are beyond the scope of this paper. Several important research directions are proposed as future work.
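The abstract does not spell out the algorithm, but the core idea (trying to insert or remove a single feature at each iteration instead of training with a fixed feature set) can be sketched generically. The sketch below is an assumption-laden illustration, not the authors' AME: it uses scikit-learn's logistic regression as a stand-in for a MaxEnt model, cross-validated accuracy as the acceptance criterion, and a hypothetical seed set and stopping rule.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def adaptive_feature_loop(X, y, max_iter=20):
    """Greedy add-or-remove-one-feature loop (illustrative only)."""
    def score(cols):
        # Logistic regression stands in for the MaxEnt model here.
        model = LogisticRegression(max_iter=1000)
        return cross_val_score(model, X[:, sorted(cols)], y, cv=3).mean()

    active = set(range(min(3, X.shape[1])))  # small hypothetical seed set
    best = score(active)
    for _ in range(max_iter):
        # Candidate moves: insert one unused feature or remove one active one.
        moves = [active | {j} for j in range(X.shape[1]) if j not in active]
        moves += [active - {j} for j in active if len(active) > 1]
        scored = [(score(m), m) for m in moves]
        top_score, top_set = max(scored, key=lambda sm: sm[0])
        if top_score <= best:
            break  # no single-feature change improves the model
        best, active = top_score, top_set
    return sorted(active), best
```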
Abstract:
Most post-processors for boundary element (BE) analysis use an auxiliary domain mesh to display domain results, working against the main modelling advantage of a pure boundary discretization. This paper introduces a novel visualization technique which preserves the basic properties of boundary element methods. The proposed algorithm does not require any domain discretization and is based on the direct and automatic identification of isolines. Another critical aspect of the visualization of domain results in BE analysis is the effort required to evaluate results at interior points. To tackle this issue, the present article also provides a comparison between the performance of two different BE formulations (conventional and hybrid). In addition, this paper presents an overview of the most common post-processing and visualization techniques in BE analysis, such as the classical scan-line algorithm and interpolation over a domain discretization. The results presented herein show that the proposed algorithm offers very high performance compared with other visualization procedures.
Abstract:
An alternative approach for the analysis of arbitrarily curved shells is developed in this paper based on the idea of initial deformations. By "alternative" we mean that neither differential geometry nor the concept of degeneration is invoked here to describe the shell surface. We begin with a flat reference configuration for the shell mid-surface, after which the initial (curved) geometry is mapped as a stress-free deformation from the plane position. The actual motion of the shell takes place only after this initial mapping. In contrast to classical works in the literature, this strategy enables the use of only orthogonal frames within the theory, and therefore objects such as Christoffel symbols, the second fundamental form or three-dimensional degenerated solids do not enter the formulation. Furthermore, the issue of physical components of tensors does not appear. Another important aspect (but not exclusive to our scheme) is the possibility of describing the initial geometry exactly. The model is kinematically exact, encompasses finite strains in a totally consistent manner and is discretized here in the light of the finite element method (although implementation via mesh-free techniques is also possible). Assessment is made by means of several numerical simulations. Copyright (C) 2009 John Wiley & Sons, Ltd.
Abstract:
In this paper, processing methods of Fourier optics implemented in a digital holographic microscopy system are presented. The proposed methodology is based on the ability of digital holography to reconstruct the whole recorded wave front and, consequently, to determine the phase and intensity distribution in any arbitrary plane located between the object and the recording plane. In this way, in digital holographic microscopy the field produced by the objective lens can be reconstructed along its propagation, allowing the reconstruction of the back focal plane of the lens, so that the complex amplitudes of the Fraunhofer diffraction, or equivalently the Fourier transform, of the light distribution across the object can be known. Manipulation of the Fourier transform plane makes it possible to design digital methods of optical processing and image analysis. The proposed method has great practical utility and represents a powerful tool in image analysis and data processing. The theoretical aspects of the method are presented, and its validity is demonstrated using computer-generated holograms and simulated images of microscopic objects. (c) 2007 Elsevier B.V. All rights reserved.
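Numerical propagation of the reconstructed field to an arbitrary plane, on which the methodology above relies, is commonly implemented with the angular spectrum method. The sketch below is a generic FFT-based propagator under standard scalar diffraction assumptions, not the authors' specific implementation.

```python
import numpy as np

def angular_spectrum_propagate(u0, wavelength, dx, z):
    """Propagate a sampled complex field u0 by a distance z.

    Standard scalar angular-spectrum propagator: FFT the field, multiply
    by the free-space transfer function, inverse FFT. Evanescent
    components (frequencies beyond 1/wavelength) are suppressed.
    wavelength, dx and z must share the same length unit.
    """
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)  # free-space transfer function
    return np.fft.ifft2(np.fft.fft2(u0) * H)
```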
Abstract:
The classical approach for acoustic imaging consists of beamforming, and produces the source distribution of interest convolved with the array point spread function. This convolution smears the image of interest, significantly reducing its effective resolution. Deconvolution methods have been proposed to enhance acoustic images and have produced significant improvements. Other proposals involve covariance fitting techniques, which avoid deconvolution altogether. However, in their traditional presentation, these enhanced reconstruction methods have very high computational costs, mostly because they have no means of efficiently transforming back and forth between a hypothetical image and the measured data. In this paper, we propose the Kronecker Array Transform (KAT), a fast separable transform for array imaging applications. Under the assumption of a separable array, it enables the acceleration of imaging techniques by several orders of magnitude with respect to the fastest previously available methods, and enables the use of state-of-the-art regularized least-squares solvers. Using the KAT, one can reconstruct images with higher resolutions than was previously possible and use more accurate reconstruction techniques, opening new and exciting possibilities for acoustic imaging.
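The KAT itself is not reproduced here, but the separability it exploits rests on the standard Kronecker identity (A ⊗ B) vec(X) = vec(B X Aᵀ): two small factor matrices are applied instead of one large Kronecker product, which is what makes transforming back and forth between image and data cheap. A minimal numpy demonstration with arbitrary illustrative dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical factor matrices of a separable operator; the full operator
# is their Kronecker product, which never needs to be formed explicitly.
A = rng.standard_normal((40, 30))
B = rng.standard_normal((50, 20))
X = rng.standard_normal((20, 30))  # "image", vectorized column-wise below

slow = np.kron(A, B) @ X.flatten(order="F")  # forms the 2000 x 600 operator
fast = (B @ X @ A.T).flatten(order="F")      # two small products instead

assert np.allclose(slow, fast)  # (A kron B) vec(X) == vec(B X A^T)
```

The same identity applied in reverse gives an equally fast adjoint, which is what allows iterative regularized least-squares solvers to run at practical speeds on separable arrays.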
Abstract:
Fungal entomopathogens have been used more frequently than other types of pathogens for classical biological control. Among 136 programs using different groups of arthropod pathogens, 49.3% have introduced fungal pathogens (including both the traditional fungi and microsporidia). The most commonly introduced species was Metarhizium anisopliae (Metschnikoff) Sorokin, with 13 introductions, followed by Entomophaga maimaiga Humber, Shimazu & Soper, which was released seven times. The majority of introduction programs have focused on controlling invasive species of insects or mites (70.7%) rather than native hosts (29.4%). Almost half of the introductions of traditional fungi targeted species of Hemiptera, and 75% of the microsporidia were introduced against lepidopteran species. The United States was the country where most introductions of fungi took place (n = 24). From 1993 to 2007, no arthropod pathogens were released in the US due to the rigorous regulatory structure, but in 2008 two species of microsporidia were introduced against the gypsy moth, Lymantria dispar (L.). The establishment rate was 32.1% in programs introducing traditional fungi and 50.0% in programs introducing microsporidia. In some programs, releases have resulted in permanent successful establishment with no non-target effects. In summary, classical biological control using fungal entomopathogens can provide a successful and environmentally friendly avenue for controlling arthropod pests, including the increasing numbers of invasive non-native species.
Abstract:
Grass reference evapotranspiration (ETo) is an important agrometeorological parameter for climatological and hydrological studies, as well as for irrigation planning and management. There are several methods to estimate ETo, but their performance in different environments is diverse, since all of them have some empirical background. The FAO Penman-Monteith (FAO PM) method has been considered a universal standard for estimating ETo for more than a decade. This method considers many parameters related to the evapotranspiration process: net radiation (Rn), air temperature (T), vapor pressure deficit (Δe), and wind speed (U); and has presented very good results when compared to data from lysimeters populated with short grass or alfalfa. In some conditions, the use of the FAO PM method is restricted by the lack of input variables. In these cases, when data are missing, the option is to calculate ETo by the FAO PM method using estimated input variables, as recommended by FAO Irrigation and Drainage Paper 56. Based on that, the objective of this study was to evaluate the performance of the FAO PM method to estimate ETo when Rn, Δe, and U data are missing, in Southern Ontario, Canada. Other alternative methods were also tested for the region: Priestley-Taylor, Hargreaves, and Thornthwaite. Data from 12 locations across Southern Ontario, Canada, were used to compare ETo estimated by the FAO PM method with a complete data set and with missing data. The alternative ETo equations were also tested and calibrated for each location. When relative humidity (RH) and U data were missing, the FAO PM method was still a very good option for estimating ETo for Southern Ontario, with RMSE smaller than 0.53 mm day⁻¹. For these cases, U data were replaced by the normal values for the region and Δe was estimated from temperature data. The Priestley-Taylor method was also a good option for estimating ETo when U and Δe data were missing, mainly when calibrated locally (RMSE = 0.40 mm day⁻¹). When Rn was missing, the FAO PM method was not good enough for estimating ETo, with RMSE increasing to 0.79 mm day⁻¹. When only T data were available, the adjusted Hargreaves and modified Thornthwaite methods were better options for estimating ETo than the FAO PM method, since the RMSEs from these methods, respectively 0.79 and 0.83 mm day⁻¹, were significantly smaller than that obtained by FAO PM (RMSE = 1.12 mm day⁻¹). (C) 2009 Elsevier B.V. All rights reserved.
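As an example of the temperature-only alternatives evaluated, the textbook (uncalibrated) Hargreaves-Samani equation fits in a few lines; the study applies locally adjusted versions, so the fixed coefficient below is the standard value, not the calibrated one.

```python
def eto_hargreaves(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani reference evapotranspiration (mm/day).

    t_mean, t_max, t_min : daily air temperatures (°C)
    ra : extraterrestrial radiation in evaporation-equivalent mm/day
    Textbook form: ETo = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin)
    """
    return 0.0023 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5

# Illustrative values for a warm summer day in Southern Ontario
print(eto_hargreaves(t_mean=22.0, t_max=28.0, t_min=16.0, ra=16.5))
```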