965 results for TIGHT GAS. Low permeability. Hydraulic fracturing. Reservoir modeling. Numerical simulation
Abstract:
Gas sensing systems based on low-cost chemical sensor arrays are gaining interest for the analysis of multicomponent gas mixtures. These sensors suffer from several problems, e.g., nonlinearities and slow time response, which can be partially overcome by digital signal processing. Our approach is based on building a nonlinear inverse dynamic system. Results for different identification techniques, including artificial neural networks and Wiener series, are compared in terms of measurement accuracy.
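Note: a minimal numpy sketch of the general idea of a nonlinear inverse dynamic model (not the authors' identification procedure): recent sensor outputs are mapped back to the input concentration through a truncated second-order, Volterra/Wiener-style expansion fitted by least squares. The memory length, the synthetic sensor model, and all names are illustrative assumptions.

    import numpy as np

    def lagged_features(y, memory):
        # Rows of [1, y[n], ..., y[n-memory+1]] plus all pairwise products of the
        # lags (a truncated second-order Volterra/Wiener-style expansion).
        rows = []
        for i in range(memory - 1, len(y)):
            lags = y[i - memory + 1 : i + 1][::-1]
            quad = np.outer(lags, lags)[np.triu_indices(memory)]
            rows.append(np.concatenate(([1.0], lags, quad)))
        return np.array(rows)

    def fit_inverse_model(sensor_out, true_conc, memory=5):
        # Least-squares fit of the inverse model: sensor output history -> input concentration.
        X = lagged_features(sensor_out, memory)
        coef, *_ = np.linalg.lstsq(X, true_conc[memory - 1:], rcond=None)
        return coef

    def predict(coef, sensor_out, memory=5):
        return lagged_features(sensor_out, memory) @ coef

    # Synthetic example: a slow, mildly nonlinear sensor response to a piecewise-constant input.
    rng = np.random.default_rng(0)
    conc = np.repeat(rng.uniform(0.0, 1.0, 40), 25)
    resp = np.zeros_like(conc)
    for k in range(1, len(conc)):
        resp[k] = 0.95 * resp[k - 1] + 0.05 * np.tanh(2.0 * conc[k])  # first-order lag + saturation
    resp += 0.002 * rng.standard_normal(len(resp))

    coef = fit_inverse_model(resp, conc)
    est = predict(coef, resp)
    print("RMS reconstruction error:", np.sqrt(np.mean((est - conc[4:]) ** 2)))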
Abstract:
Recent findings suggest an association between exposure to cleaning products and respiratory dysfunctions including asthma. However, little information is available about quantitative airborne exposures of professional cleaners to volatile organic compounds deriving from cleaning products. During the first phases of the study, a systematic review of cleaning products was performed. Safety data sheets were reviewed to assess the most frequently added volatile organic compounds. It was found that professional cleaning products are complex mixtures of different components (compounds per cleaning product: 3.5 ± 2.8), and more than 130 chemical substances listed in the safety data sheets were identified in 105 products. The main groups of chemicals were fragrances, glycol ethers, surfactants, and solvents and, to a lesser extent, phosphates, salts, detergents, pH-stabilizers, acids, and bases. Up to 75% of products contained substances labeled as irritant (Xi), 64% as harmful (Xn), and 28% as corrosive (C). Hazards to the eyes (59%), to the skin (50%), and by ingestion (60%) were the most frequently reported. Monoethanolamine, a strong irritant known to be involved in sensitizing mechanisms as well as allergic reactions, is frequently added to cleaning products. Monoethanolamine determination in air has traditionally been difficult, and the available air sampling and analysis methods were poorly suited to personal occupational air concentration assessments. A convenient method was developed, with air sampling on impregnated glass fiber filters followed by one-step desorption, gas chromatography, and nitrogen-phosphorus selective detection. An exposure assessment was conducted in the cleaning sector to determine airborne concentrations of monoethanolamine, glycol ethers, and benzyl alcohol during different cleaning tasks performed by professional cleaning workers in different companies, and to determine background air concentrations of formaldehyde, a known indoor air contaminant. The occupational exposure study was carried out in 12 cleaning companies, and personal air samples were collected for monoethanolamine (n=68), glycol ethers (n=79), benzyl alcohol (n=15), and formaldehyde (n=45). With the exception of ethylene glycol mono-n-butyl ether, all measured air concentrations were far below (<1/10 of) the Swiss eight-hour occupational exposure limits; for butoxypropanol and benzyl alcohol, no occupational exposure limits were available. Although detected only once, ethylene glycol mono-n-butyl ether air concentrations (n=4) were high (49.5 mg/m3 to 58.7 mg/m3), at or above the Swiss occupational exposure limit (49 mg/m3). Background air concentrations showed no presence of monoethanolamine, while the glycol ethers were often present, and formaldehyde was universally detected. Exposures were influenced by the amount of monoethanolamine in the cleaning product, by cross ventilation, and by spraying. During the last phases of the study, the collected data were used to test an existing exposure modeling tool. The exposure estimates of this so-called Bayesian tool converged toward the measured exposure range as more measured air concentrations were added; this convergence was best described by an inverse second-order equation. The results suggest that the Bayesian tool is not well suited to predicting low exposures, and it should also be tested with other datasets describing higher exposures.
Low exposures to different chemical sensitizers and irritants should be further investigated to better understand the development of respiratory disorders in cleaning workers. Prevention measures should especially focus on incorrect use of cleaning products, to avoid high air concentrations at the exposure limits.
Abstract:
AMADEUS is a dexterous subsea robot hand incorporating force and slip contact sensing, using fluid-filled tentacles for fingers. Hydraulic pressure variations in each of three flexible tubes (bellows) in each finger create a bending moment, and a consequent motion or increase in contact force during grasping. Such fingers have inherent passive compliance, no moving parts, and are naturally depth pressure-compensated, making them ideal for reliable use in the deep ocean. In addition to the mechanical design, development of the hand has also considered closed-loop finger position and force control, coordinated finger motion for grasping, force and slip sensor development/signal processing, and reactive world modeling/planning for supervisory 'blind grasping'. Initially, the application focus is on marine science tasks, but broader roles in offshore oil and gas, salvage, and military use are foreseen. Phase I of the project is complete, with the construction of a first prototype. Phase II is now underway, to deploy the hand from an underwater robot arm and carry out wet trials with users.
Abstract:
BACKGROUND: The goal of this paper is to investigate the respective influence of work characteristics, the effort-reward ratio, and overcommitment on the poor mental health of out-of-hospital care providers. METHODS: 333 out-of-hospital care providers answered a questionnaire that included queries on mental health (GHQ-12), demographics, health-related information and work characteristics, questions from the Effort-Reward Imbalance Questionnaire, and items about overcommitment. A two-level multiple regression was performed between mental health (the dependent variable) and the effort-reward ratio, the overcommitment score, weekly number of interventions, percentage of non-prehospital transport of patients out of total missions, gender, and age. Participants were first-level units, and ambulance services were second-level units. We also shadowed ambulance personnel for a total of 416 hr. RESULTS: With cutoff points of 2/3 and 3/4 positive answers on the GHQ-12, the percentages of potential cases with poor mental health were 20% and 15%, respectively. The effort-reward ratio was associated with poor mental health (P < 0.001), irrespective of age or gender. Overcommitment was associated with poor mental health; this association was stronger in women (β = 0.054) than in men (β = 0.020). The percentage of prehospital missions out of total missions was only associated with poor mental health at the individual level. CONCLUSIONS: Emergency medical services should pay attention to the way employees perceive their efforts and the rewarding aspects of their work: an imbalance of those aspects is associated with poor mental health. Low perceived esteem appeared particularly associated with poor mental health. This suggests that supervisors of emergency medical services should enhance the value of their employees' work. Employees with overcommitment should also receive appropriate consideration. Preventive measures should target individual perceptions of effort and reward in order to improve mental health in prehospital care providers.
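Note: a minimal sketch of the two-level regression structure described above (providers nested within ambulance services), written as a random-intercept mixed model in Python's statsmodels. The file name and column names are hypothetical, and the original analysis may have used different software and covariate coding.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical dataset: one row per out-of-hospital care provider, with the
    # ambulance service identifier as the second-level (grouping) unit.
    df = pd.read_csv("providers.csv")
    # Assumed columns: ghq12, eri (effort-reward ratio), overcommitment,
    # weekly_interventions, pct_non_prehospital, gender, age, service_id.

    # Random-intercept model: providers (level 1) nested within services (level 2).
    model = smf.mixedlm(
        "ghq12 ~ eri + overcommitment + weekly_interventions"
        " + pct_non_prehospital + gender + age",
        data=df,
        groups=df["service_id"],
    )
    result = model.fit()
    print(result.summary())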
Abstract:
Soil organic matter (SOM) plays an important role in the carbon (C) cycle and in soil quality. Considering the complexity of the factors that control SOM cycling and the long time it usually takes to observe changes in SOM stocks, modeling constitutes a very important tool for understanding SOM cycling in forest soils. The following hypotheses were tested: (i) soil organic carbon (SOC) stocks would be higher after several rotations of eucalyptus than in low-productivity pastures; (ii) SOC values simulated by the Century model would describe the data better than the mean of the observations. The aims of the current study were thus: (i) to evaluate SOM dynamics using the Century model to simulate the changes of C stocks for two eucalyptus chronosequences in the Rio Doce Valley, Minas Gerais State, Brazil; and (ii) to compare the C stocks simulated by Century with the C stocks measured in soils of different orders and regions of the Rio Doce Valley growing eucalyptus. In Belo Oriente (BO), short-rotation eucalyptus plantations had been cultivated for 4.0, 13.0, 22.0, 32.0 and 34.0 years, at a lower elevation and in a warmer climate, while in Virginópolis (VG), these time periods were 8.0, 19.0 and 33.0 years, at a higher elevation and in a milder climate. Soil samples were collected from the 0-20 cm layer to estimate C stocks. The results indicate that the C stocks simulated by the Century model decreased after 37 years of poorly managed pastures in areas previously covered by native forest in the BO and VG regions. The substitution of poorly managed pastures by eucalyptus in the early 1970s led to an average C increase of 0.28 and 0.42 t ha-1 year-1 in BO and VG, respectively. The C stocks measured under eucalyptus in distinct soil orders and independent regions with variable edaphoclimatic conditions were not far from the values estimated by the Century model (root mean square error - RMSE = 20.9; model efficiency - EF = 0.29), despite the opposite result obtained with the statistical procedure used to test the identity of analytical methods. Only for lower soil C stocks did the model overestimate the C stock in the 0-20 cm layer. Thus, the Century model is highly promising for detecting changes in C stocks in distinct soil orders under eucalyptus, as well as for indicating the impact of harvest residue management on SOM in future rotations.
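Note: the two goodness-of-fit statistics quoted above are not defined in the abstract; in their standard form, with S_i the simulated and O_i the measured C stocks and \bar{O} the mean of the measurements, they read

    \mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(S_i - O_i\right)^2},
    \qquad
    \mathrm{EF} = 1 - \frac{\sum_{i=1}^{n}\left(O_i - S_i\right)^2}{\sum_{i=1}^{n}\left(O_i - \bar{O}\right)^2},

so EF = 1 indicates a perfect match and EF > 0 indicates that the model describes the data better than the mean of the observations.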
Abstract:
We study the discrepancy between the effective flow permeability and the effective seismic permeability, that is, the effective permeability controlling seismic attenuation due to wave-induced fluid flow, in 2D rock samples having mesoscopic heterogeneities and in the presence of strong permeability fluctuations. In order to do so, we employ a numerical oscillatory compressibility test to determine attenuation and velocity dispersion due to wave-induced fluid flow in these kinds of media and compare the responses with those obtained by replacing the heterogeneous permeability field with constant values, including the average permeability as well as the effective flow permeability of the sample. The latter is estimated in a separate upscaling procedure by solving the steady-state flow equation in the rock sample under study. The numerical experiments verify that attenuation levels are less significant and the attenuation peak becomes broader in the presence of such strong permeability fluctuations. Moreover, we observe that for very low frequencies the effective seismic permeability is similar to the effective flow permeability, while for very high frequencies it approaches the arithmetic average of the permeability field.
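Note: as a sketch of the upscaling step mentioned above (the boundary conditions and flow direction are assumptions, not stated in the abstract), the effective flow permeability follows from solving the steady-state flow equation for the pressure field in the heterogeneous sample and matching the resulting mean Darcy flux \langle q \rangle, for a sample of length L saturated with a fluid of viscosity \eta and subjected to a pressure drop \Delta p, to an equivalent homogeneous medium:

    \nabla \cdot \left[\frac{k(\mathbf{x})}{\eta}\,\nabla p\right] = 0,
    \qquad
    k_{\mathrm{eff}} = \frac{\langle q \rangle\, \eta\, L}{\Delta p}.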
Abstract:
Leptosols and Regosols are soils with a series of use restrictions, mainly related to their limited effective depth, and they have been little studied in Brazil. These soils, when derived from sedimentary rocks, should be treated with particular care to avoid environmental damage such as aquifer contamination. The purpose of this study was to examine the behavior of hydraulic conductivity and water retention capacity in profiles of Leptosols and Regosols derived from sandstone of the Caturrita formation in Rio Grande do Sul state. The morphology, particle size distribution, porosity, soil density (Ds), saturated hydraulic conductivity (Ks), basic water infiltration in the field (BI), and water retention were determined in soil and saprolite samples of six soil profiles. High Ds, low macroporosity, and high microporosity were observed in the profiles, resulting in low Ks and BI, even under conditions of sandy texture and a highly fractured saprolite layer. The coefficients of variation of Ks and BI were high among the studied profiles and between replicates of the same profile. Water retention of the studied soils was higher in the Cr layers than in the A horizons, and the volume of plant-available water was greater and variable among A horizons and Cr layers.
Abstract:
In an attempt to solve the bridge problem faced by many county engineers, this investigation focused on a low cost bridge alternative that consists of using railroad flatcars (RRFC) as the bridge superstructure. The intent of this study was to determine whether these types of bridges are structurally adequate and potentially feasible for use on low volume roads. A questionnaire was sent to the Bridge Committee members of the American Association of State Highway and Transportation Officials (AASHTO) to determine their use of RRFC bridges and to assess the pros and cons of these bridges based on others’ experiences. It was found that these types of bridges are widely used in many states with large rural populations and they are reported to be a viable bridge alternative due to their low cost, quick and easy installation, and low maintenance. A main focus of this investigation was to study an existing RRFC bridge that is located in Tama County, IA. This bridge was analyzed using computer modeling and field load testing. The dimensions of the major structural members of the flatcars in this bridge were measured and their properties calculated and used in an analytical grillage model. The analytical results were compared with those obtained in the field tests, which involved instrumenting the bridge and loading it with a fully loaded rear tandem-axle truck. Both sets of data (experimental and theoretical) show that the Tama County Bridge (TCB) experienced very low strains and deflections when loaded and the RRFCs appeared to be structurally adequate to serve as a bridge superstructure. A calculated load rating of the TCB agrees with this conclusion. Because many different types of flatcars exist, other flatcars were modeled and analyzed. It was very difficult to obtain the structural plans of RRFCs; thus, only two additional flatcars were analyzed. The results of these analyses also yielded very low strains and displacements. Taking into account the experiences of other states, the inspection of several RRFC bridges in Oklahoma, the field test and computer analysis of the TCB, and the computer analysis of two additional flatcars, RRFC bridges appear to provide a safe and feasible bridge alternative for low volume roads.
Abstract:
The thermodynamic functions of a Fermi gas with spin population imbalance are studied in the temperature-asymmetry plane in the BCS limit. The low-temperature domain is characterized by an anomalous enhancement of the entropy and the specific heat above their values in the unpaired state, a decrease of the gap, and an eventual unpairing phase transition as the temperature is lowered. The unpairing phase transition induces a second jump in the specific heat, which can be measured in calorimetric experiments. While the superfluid is unstable against a supercurrent-carrying state, it may sustain a metastable state if cooled adiabatically down from the stable high-temperature domain. In the latter domain, the temperature dependence of the gap and related functions is analogous to the predictions of BCS theory.
Abstract:
The density of states of a Bose-condensed gas confined in a harmonic trap is investigated. The predictions of Bogoliubov theory are compared with those of Hartree-Fock theory and of the hydrodynamic model. We show that the Hartree-Fock scheme provides an excellent description of the excitation spectrum over a wide range of energies, revealing the major role played by single-particle excitations in these confined systems. The crossover from the hydrodynamic regime, holding at low energies, to the independent-particle regime is explicitly explored by studying the frequency of the surface modes as a function of their angular momentum. The applicability of the semiclassical approximation for the excited states is also discussed. We show that the semiclassical approach provides simple and accurate formulas for the density of states and the quantum depletion of the condensate.
Abstract:
The energy and structure of a dilute hard-disk Bose gas are studied in the framework of a variational many-body approach based on a Jastrow correlated ground-state wave function. The asymptotic behaviors of the radial distribution function and the one-body density matrix are analyzed after solving the Euler equation obtained by a free minimization of the hypernetted-chain energy functional. Our results show important deviations from those of the available low-density expansions, already at gas parameter values x~0.001. The condensate fraction in 2D is also computed and found to be generally lower than the 3D one at the same x.
Abstract:
At seismic frequencies, wave-induced fluid flow is a major cause of P-wave attenuation in partially saturated porous rocks. Attenuation is of great importance for the oil industry in the interpretation of seismic field data. Here, the effects on P-wave attenuation resulting from changes in oil saturation are studied for media with coexisting water, oil, and gas. For that purpose, creep experiments are numerically simulated by solving Biot's equations for the consolidation of poroelastic media with the finite-element method. The experiments yield time-dependent stress-strain relations that are used to calculate the complex P-wave modulus, from which frequency-dependent P-wave attenuation is determined. The models are layered media with periodically alternating triplets of layers. Models consisting of triplets of layers with randomly varying layer thicknesses are also considered. The layers in each triplet are fully saturated with water, oil, and gas. The layer saturated with water has lower porosity and permeability than the layers saturated with oil and gas. These models represent hydrocarbon reservoirs in which water is the wetting fluid, preferentially saturating regions of lower porosity. The results from the numerical experiments show that increasing oil saturation, coupled with a decrease in gas saturation, results in a significant increase of attenuation at low frequencies (lower than 2 Hz). Furthermore, replacing the oil with water results in a clearly distinguishable behavior of the frequency-dependent attenuation. These results imply that, according to the physical mechanism of wave-induced fluid flow, frequency-dependent attenuation in media saturated with water, oil, and gas is a potential indicator of oil saturation.
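Note: as a reminder of the standard post-processing step implied above (the abstract does not spell it out), the simulated creep tests yield a complex, frequency-dependent P-wave modulus M(\omega) relating stress and strain, and attenuation is commonly quantified by the inverse quality factor

    \frac{1}{Q(\omega)} = \frac{\operatorname{Im}\, M(\omega)}{\operatorname{Re}\, M(\omega)},

with velocity dispersion following from the real part of the modulus.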
Abstract:
In this paper we present new results on doped μc-Si:H thin films deposited by hot-wire chemical vapour deposition (HWCVD) in the very low temperature range (125-275°C). The doped layers were obtained by the addition of diborane or phosphine to the gas phase during deposition. The incorporation of boron and phosphorus into the films and their influence on the crystalline fraction are studied by secondary ion mass spectrometry and Raman spectroscopy, respectively. Good electrical transport properties were obtained in this deposition regime, with best dark conductivities of 2.6 and 9.8 S cm-1 for the p- and n-doped films, respectively. The effects of hydrogen dilution and layer thickness on the electrical properties are also studied. Some technological conclusions regarding cross-contamination could be drawn from the nominally undoped samples obtained in the same chamber after p- and n-type heavily doped layers.
Abstract:
Modeling of water movement in unsaturated soil usually requires a large number of parameters and variables, such as initial soil water content, saturated water content, and saturated hydraulic conductivity, which can be assessed relatively easily. Water flow in the soil is usually modeled by a nonlinear partial differential equation known as the Richards equation. Since this equation cannot be solved analytically in certain cases, one way to approach its solution is through numerical algorithms. The success of numerical models in describing the dynamics of water in the soil is closely related to the accuracy with which the water-physical parameters are determined. This has been a major challenge in the use of numerical models, because these parameters are generally difficult to determine, since they present great spatial variability in the soil. Therefore, it is necessary to develop and use methods that properly incorporate the uncertainties inherent to water displacement in soils. In this paper, a model based on fuzzy logic is used as an alternative to describe water flow in the vadose zone. This fuzzy model was developed to simulate the displacement of water in a non-vegetated crop soil during the period known as the emergence phase. The principle of this model is a Mamdani fuzzy rule-based system in which the rules are based on the moisture content of adjacent soil layers. The performance of the fuzzy model was evaluated by comparing the evolution of the simulated moisture profiles over time with those obtained in the field. The fuzzy model satisfactorily reproduced the measured soil moisture profiles.
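Note: a minimal Python/numpy sketch of a Mamdani inference step of the kind described above. The membership functions, the rule base, and the choice of the two neighbouring layers as inputs are illustrative assumptions; the paper's actual rule base and calibration are not given in the abstract.

    import numpy as np

    def tri(x, a, b, c):
        # Triangular membership function with vertices a <= b <= c; the flat
        # branches handle shoulder terms where a == b or b == c.
        x = np.asarray(x, dtype=float)
        left  = np.where(b > a, (x - a) / max(b - a, 1e-12), (x >= b).astype(float))
        right = np.where(c > b, (c - x) / max(c - b, 1e-12), (x <= b).astype(float))
        return np.clip(np.minimum(left, right), 0.0, 1.0)

    # Moisture universe: water content normalised between residual and saturation
    # (illustrative bounds, not the paper's calibration).
    u = np.linspace(0.0, 1.0, 201)

    terms = {"dry":   (0.0, 0.0, 0.5),
             "moist": (0.2, 0.5, 0.8),
             "wet":   (0.5, 1.0, 1.0)}

    # Hypothetical rules: (layer above, layer below) -> updated moisture of the current layer.
    rules = [("dry",   "dry",   "dry"),
             ("dry",   "moist", "dry"),
             ("moist", "dry",   "moist"),
             ("moist", "moist", "moist"),
             ("wet",   "moist", "moist"),
             ("wet",   "wet",   "wet")]

    def mamdani_step(theta_above, theta_below):
        # One Mamdani inference step: min for AND, min-implication, max aggregation,
        # centroid defuzzification over the output universe u.
        aggregated = np.zeros_like(u)
        for t_above, t_below, t_out in rules:
            w = min(tri(theta_above, *terms[t_above]),
                    tri(theta_below, *terms[t_below]))
            aggregated = np.maximum(aggregated, np.minimum(w, tri(u, *terms[t_out])))
        if aggregated.sum() == 0.0:
            return 0.5 * (theta_above + theta_below)  # no rule fired: fall back to the average
        return float((u * aggregated).sum() / aggregated.sum())

    # Example: a fairly wet layer above and a dry layer below.
    print(mamdani_step(0.7, 0.2))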
Greenhouse Gas and Nitrogen Fertilizer Scenarios for U.S. Agriculture and Global Biofuels, June 2011
Abstract:
This analysis uses the 2011 FAPRI-CARD (Food and Agricultural Policy Research Institute–Center for Agricultural and Rural Development) baseline to evaluate the impact of four alternative scenarios on U.S. and world agricultural markets, as well as on world fertilizer use and world agricultural greenhouse gas emissions. A key assumption in the 2011 baseline is that ethanol support policies disappear in 2012. The baseline also assumes that existing biofuel mandates remain in place and are binding. Two of the scenarios are adverse supply shocks, the first being a 10% increase in the price of nitrogen fertilizer in the United States, and the second, a reversion of cropland into forestland. The third scenario examines how lower energy prices would impact world agriculture. The fourth scenario reintroduces biofuel tax credits and duties. Given that the baseline excludes these policies, the fourth scenario is an attempt to understand the impact of these policies under the market conditions that prevail in early 2011. A key to understanding the results of this fourth scenario is that, in the absence of tax credits and duties, the mandate drives biofuel use. Therefore, when the tax credits and duties are reintroduced, the impacts are relatively small. In general, the results show that the entire international commodity market system is remarkably robust with respect to policy changes in one country or in one sector. The policy implication is that domestic policy changes implemented by a large agricultural producer like the United States can have fairly significant impacts on the aggregate world commodity markets. A second point that emerges from the results is that the law of unintended consequences is at work in world agriculture. For example, a U.S. nitrogen tax that might presumably be motivated by environmental benefits results in an increase in world greenhouse gas emissions. A similar situation occurs in the afforestation scenario, in which crop production shifts from high-yielding land in the United States to low-yielding land and probably native vegetation in the rest of the world, resulting in an unintended increase in global greenhouse gas emissions.