27 results for Multidimensional Compressible Flows
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Mixing layers are present in very different types of physical situations such as atmospheric flows, aerodynamics, and combustion. It is, therefore, a well-researched subject, but there are aspects that require further study. Here the instability of two- and three-dimensional perturbations in the compressible mixing layer was investigated by numerical simulation. In the numerical code, the derivatives were discretized using high-order compact finite-difference schemes. A stretching in the normal direction was implemented with the dual objectives of reducing the sound waves generated by the shear region and improving the resolution near the center. The compact schemes were modified to work with non-uniform grids. Numerical tests started with an analysis of the growth rate in the linear regime to verify the code implementation. Tests were also performed in the non-linear regime, and it was possible to reproduce the vortex roll-up and pairing in both two- and three-dimensional situations. Amplification-rate analysis was also performed for the secondary instability of this flow. It was found that, for essentially incompressible flow, maximum growth rates occurred for a spanwise wavelength of approximately 2/3 of the streamwise spacing of the vortices. The result demonstrated the applicability of the theory developed by Pierrehumbert and Widnall. Compressibility effects were then considered, and the maximum growth rates obtained for relatively high Mach numbers (typically under 0.8) were also presented.
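The abstract names high-order compact finite-difference schemes without reproducing them; as a hedged illustration, here is a minimal sketch of the classical fourth-order Padé compact first derivative on a uniform grid (the paper's non-uniform-grid modification is not shown, and the boundary closures chosen here are assumptions):

```python
import numpy as np
from scipy.linalg import solve_banded

def compact_first_derivative(f, h):
    """Fourth-order Pade compact scheme for df/dx on a uniform grid.

    Interior stencil: (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1}
                      = (3/2) (f_{i+1} - f_{i-1}) / (2h).
    One-sided third-order closures are used at the boundaries.
    """
    n = f.size
    ab = np.zeros((3, n))            # banded (tridiagonal) left-hand side
    rhs = np.zeros(n)

    # interior rows
    ab[0, 2:] = 0.25                 # superdiagonal
    ab[1, :] = 1.0                   # diagonal
    ab[2, :-2] = 0.25                # subdiagonal
    rhs[1:-1] = 1.5 * (f[2:] - f[:-2]) / (2.0 * h)

    # boundary rows: f'_0 + 2 f'_1 = (-5 f_0 + 4 f_1 + f_2)/(2h), mirrored at n-1
    ab[0, 1] = 2.0
    rhs[0] = (-5.0 * f[0] + 4.0 * f[1] + f[2]) / (2.0 * h)
    ab[2, -2] = 2.0
    rhs[-1] = (5.0 * f[-1] - 4.0 * f[-2] - f[-3]) / (2.0 * h)

    return solve_banded((1, 1), ab, rhs)

# quick check against an analytic derivative
x = np.linspace(0.0, 2.0 * np.pi, 101)
err = np.max(np.abs(compact_first_derivative(np.sin(x), x[1] - x[0]) - np.cos(x)))
print(f"max error: {err:.2e}")
```

The implicit (tridiagonal) left-hand side is what gives compact schemes their spectral-like resolution at low cost, which is why they are favored in simulations of compressible shear flows.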
Abstract:
Given a compact 2-dimensional manifold $M$, we classify all continuous flows $\phi$ without wandering points on $M$. This classification is performed by finding finitely many pairwise disjoint open $\phi$-invariant subsets $U_1, U_2, \dots, U_n$ of $M$ such that $\bigcup_{i=1}^{n} \overline{U_i} = M$ and each $U_i$ is either a suspension of an interval exchange transformation or a maximal open cylinder made up of closed trajectories of $\phi$.
Abstract:
Investigations of chaotic particle transport by drift waves propagating in the edge plasma of tokamaks with poloidal zonal flow are described. For large-aspect-ratio tokamaks, the influence of radial electric field profiles on convective cells and transport barriers, created by the nonlinear interaction between the poloidal flow and resonant waves, is investigated. For equilibria with edge shear flow, particle transport is seen to be reduced when the electric field shear is reversed. The transport reduction is attributed to the robust invariant tori that occur in nontwist Hamiltonian systems. This mechanism is proposed as an explanation for the transport reduction in the Tokamak Chauffage Alfvén Brésilien [R. M. O. Galvão et al., Plasma Phys. Controlled Fusion 43, 1181 (2001)] for discharges with a biased electrode at the plasma edge.
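The robust invariant tori invoked here are a generic feature of nontwist Hamiltonian systems; a minimal sketch using the standard nontwist map of del-Castillo-Negrete and Morrison (a stand-in for the paper's drift-wave model; the parameter values are illustrative):

```python
import numpy as np

def standard_nontwist_map(x, y, a=0.615, b=0.4, steps=2000):
    """Iterate the standard nontwist map:
        y' = y - b * sin(2*pi*x)
        x' = x + a * (1 - y'**2)   (mod 1)
    Its twist dx'/dy changes sign at y' = 0, where robust "shearless"
    invariant tori can survive and act as transport barriers.
    """
    xs, ys = np.empty(steps), np.empty(steps)
    for i in range(steps):
        y = y - b * np.sin(2.0 * np.pi * x)
        x = (x + a * (1.0 - y * y)) % 1.0
        xs[i], ys[i] = x, y
    return xs, ys

# orbits launched on either side of y = 0 stay separated while the
# shearless torus survives, mimicking a reversed-shear transport barrier
for y0 in (-0.5, 0.0, 0.5):
    xs, ys = standard_nontwist_map(0.1, y0)
    print(f"y0={y0:+.1f}: y stays within [{ys.min():.2f}, {ys.max():.2f}]")
```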
Abstract:
Laminar and pulsed flows, typical of multi-commuted and multi-pumping flow systems, were evaluated in relation to analytical procedures carried out at high temperatures. As an application, the spectrophotometric determination of total reducing sugars (TRS, hydrolyzed sucrose plus reducing sugars) in sugar-cane juice and molasses was selected. The method involves in-line hydrolysis of sucrose and alkaline degradation of the reducing sugars at about 98 °C. Better results were obtained with pulsed flows, due to the efficient radial mass transport inherent to the multi-pumping flow system. The proposed system presents favorable characteristics of ruggedness, analytical precision (r.s.d. < 0.013 for typical samples), stability (no measurable baseline drift during 4-h working periods), linearity of the analytical curve (r > 0.992, n = 5, 0.05-0.50% w/v TRS), and sampling rate (65 h⁻¹). Results are in agreement with ion chromatography.
Abstract:
An improved flow-based procedure is proposed for turbidimetric sulphate determination in waters. The flow system was designed with solenoid micro-pumps in order to improve mixing conditions and minimize reagent consumption as well as waste generation. Stable baselines were observed in view of the pulsed flow characteristic of systems designed with solenoid micro-pumps, thus making the use of washing solutions unnecessary. The nucleation process was improved by stopping the flow prior to the measurement, thus avoiding the need for sulphate addition. When a 1-cm optical path flow cell was employed, linear response was achieved within 20-200 mg L⁻¹, described by the equation S = -0.0767 + 0.00438C (mg L⁻¹), r = 0.999. The detection limit was estimated as 3 mg L⁻¹ at the 99.7% confidence level, and the coefficient of variation was 2.4% (n = 20). The sampling rate was estimated as 33 determinations per hour. A long-pathlength (100-cm) flow cell based on a liquid-core waveguide was exploited to increase sensitivity in turbidimetry. Baseline drifts were avoided by a periodical washing step with EDTA in alkaline medium. Linear response was observed within 7-16 mg L⁻¹, described by the equation S = -0.865 + 0.132C (mg L⁻¹), r = 0.999. The detection limit was estimated as 150 µg L⁻¹ at the 99.7% confidence level, and the coefficient of variation was 3.0% (n = 20). The sampling rate was estimated as 25 determinations per hour. The results obtained for freshwater and rainwater samples were in agreement with those achieved by batch turbidimetry at the 95% confidence level.
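As a worked illustration of the two calibration lines reported above, the sketch below inverts S = a + bC to recover concentration from a turbidance reading; the coefficients and linear ranges come from the abstract, while the function and dictionary names are illustrative:

```python
# Inverting the reported linear calibrations S = a + b*C to estimate
# sulphate concentration C (mg/L) from a turbidance reading S.
CALIBRATIONS = {
    # cell: (intercept a, slope b, linear range in mg/L)
    "1 cm":   (-0.0767, 0.00438, (20.0, 200.0)),
    "100 cm": (-0.865,  0.132,   (7.0, 16.0)),
}

def sulphate_mg_per_L(signal: float, cell: str) -> float:
    a, b, (lo, hi) = CALIBRATIONS[cell]
    c = (signal - a) / b          # C = (S - a) / b
    if not lo <= c <= hi:
        raise ValueError(f"{c:.1f} mg/L is outside the {lo}-{hi} mg/L linear range")
    return c

print(f"{sulphate_mg_per_L(0.4, '1 cm'):.1f} mg/L")    # mid-range reading
print(f"{sulphate_mg_per_L(0.5, '100 cm'):.1f} mg/L")
```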
Abstract:
The taxonomy of the N₂-fixing bacteria belonging to the genus Bradyrhizobium is still poorly refined, mainly due to conflicting results obtained by the analysis of phenotypic and genotypic properties. This paper presents an application of a method aimed at the identification of possible new clusters within a Brazilian collection of 119 Bradyrhizobium strains showing phenotypic characteristics of B. japonicum and B. elkanii. The stability was studied as a function of the number of restriction enzymes used in the RFLP-PCR analysis of three ribosomal regions, with three restriction enzymes per region. The method proposed here uses clustering algorithms with distances calculated by average-linkage clustering. The stability analysis is performed by introducing perturbations using sub-sampling techniques. The method proved effective in grouping the species B. japonicum and B. elkanii. Furthermore, two new clusters were clearly defined, indicating possible new species, as well as sub-clusters within each detected cluster.
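A hedged sketch of the kind of sub-sampling stability analysis described, using average-linkage hierarchical clustering from scipy; the co-clustering agreement score and the synthetic strain profiles are illustrative assumptions, not the paper's exact criterion:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
X = rng.normal(size=(119, 9))     # stand-in for 119 strains x 9 enzyme profiles

def average_linkage_labels(X, k):
    return fcluster(linkage(pdist(X), method="average"), k, criterion="maxclust")

def subsample_stability(X, k, trials=100, frac=0.8):
    """Mean pairwise co-clustering agreement between the full clustering
    and clusterings of random sub-samples (a simple stability proxy)."""
    full = average_linkage_labels(X, k)
    n, scores = X.shape[0], []
    for _ in range(trials):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        sub = average_linkage_labels(X[idx], k)
        same_full = full[idx][:, None] == full[idx][None, :]
        same_sub = sub[:, None] == sub[None, :]
        scores.append((same_full == same_sub).mean())
    return np.mean(scores)

# stable values of k keep their pairings under perturbation
for k in range(2, 7):
    print(f"k={k}: stability {subsample_stability(X, k):.3f}")
```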
Abstract:
In this paper a bond graph methodology is used to model incompressible fluid flows with viscous and thermal effects. The distinctive characteristic of these flows is the role of pressure, which does not behave as a state variable but as a function that must act in such a way that the resulting velocity field has zero divergence. Velocity and entropy per unit volume are used as independent variables for a single-phase, single-component flow. Time-dependent nodal values and interpolation functions are introduced to represent the flow field, from which nodal vectors of velocity and entropy are defined as state variables. The system of momentum and continuity equations coincides with the one obtained by applying the Galerkin method to the weak formulation of the problem in finite elements. The integral incompressibility constraint is derived from the integral conservation of mechanical energy. The weak formulation of the thermal energy equation is modeled with true bond graph elements in terms of nodal vectors of temperature and entropy rates, resulting in a Petrov-Galerkin method. The resulting bond graph shows the coupling between the mechanical and thermal energy domains through the viscous dissipation term. All kinds of boundary conditions are handled consistently and can be represented as generalized effort or flow sources. A procedure for causality assignment is derived for the resulting graph, satisfying the second principle of thermodynamics.
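For orientation, a sketch of the generic Galerkin weak form that the abstract's nodal system coincides with (standard incompressible Navier-Stokes notation; this is not the paper's bond-graph symbolism):

```latex
% Generic Galerkin weak form of incompressible flow (illustrative):
% find (\mathbf{u}, p) such that, for all admissible test functions
% \mathbf{w} and q,
\int_\Omega \mathbf{w}\cdot\rho\left(\partial_t\mathbf{u}
      + \mathbf{u}\cdot\nabla\mathbf{u}\right) d\Omega
  + \int_\Omega \mu\,\nabla\mathbf{w}:\nabla\mathbf{u}\, d\Omega
  - \int_\Omega p\,\nabla\cdot\mathbf{w}\, d\Omega
  = \int_\Omega \mathbf{w}\cdot\mathbf{f}\, d\Omega
  + \int_{\Gamma_N} \mathbf{w}\cdot\mathbf{t}\, d\Gamma ,
\qquad
\int_\Omega q\,\nabla\cdot\mathbf{u}\, d\Omega = 0 .
% Expanding \mathbf{u} in nodal interpolation functions yields the
% nodal-vector state equations described in the abstract, with the
% pressure p acting as the Lagrange multiplier enforcing zero divergence
% rather than as a state variable.
```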
Abstract:
A great deal of attention in the supply chain management literature is devoted to studying material and demand information flows and their coordination. But in many situations, supply chains may convey information of a different nature: they may be an important channel companies have to deliver knowledge or, specifically, technical information to the market. This paper studies the technical information flow and highlights its particular requirements. Drawing upon qualitative field research, it studies pharmaceutical companies, since those companies face a very specific challenge: consumers do not have discretion over their choices, as ethical drugs must be prescribed by physicians in order to be bought and used by final consumers. The technical information flow is rich and must be redundant and delivered early at multiple points. Thus, apart from the regular material channel where products and order information flow, those companies build a specialized information channel, developed to communicate with those who need the information to create demand. The conclusions can be extended to supply chains where products and services are complex and decision makers must be clearly informed about technology-related information.
Abstract:
Recent advances in the control of molecular engineering architectures have allowed unprecedented molecular recognition ability in biosensing, with a promising impact on clinical diagnosis and environmental control. The availability of large amounts of data from electrical, optical, or electrochemical measurements requires, however, sophisticated data treatment in order to optimize sensing performance. In this study, we show how an information visualization system based on projections, referred to as Projection Explorer (PEx), can be used to achieve high performance for biosensors made with nanostructured films containing immobilized antigens. As a proof of concept, various visualizations were obtained with impedance spectroscopy data from an array of sensors whose electrical response could be specific toward a given antibody (analyte) owing to molecular recognition processes. In addition to discussing the distinct methods for projection and normalization of the data, we demonstrate that an excellent distinction can be made between real samples that tested positive for Chagas disease and for Leishmaniasis, which could not be achieved with conventional statistical methods. Such high performance probably arose from the possibility of treating the data over the whole frequency range. Through a systematic analysis, it was inferred that Sammon's mapping with standardization to normalize the data gives the best results, with which blood serum samples containing 10⁻⁷ mg/mL of the antibody could be distinguished. The method inherent in PEx and the procedures for analyzing the impedance data are entirely generic and can be extended to optimize any type of sensor or biosensor.
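A minimal sketch of the winning pipeline reported here, standardization followed by Sammon's mapping, implemented from the textbook definition of Sammon stress rather than with PEx itself; the synthetic spectra and the optimizer choice are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 60))        # stand-in for 40 impedance spectra

# standardization: zero mean, unit variance per frequency channel
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
D = pdist(Xs)                        # pairwise distances in feature space
n = Xs.shape[0]

def sammon_stress(y_flat):
    """Sammon stress: sum (D_ij - d_ij)^2 / D_ij, normalized by sum D_ij."""
    d = pdist(y_flat.reshape(n, 2))
    return np.sum((D - d) ** 2 / (D + 1e-12)) / D.sum()

res = minimize(sammon_stress, rng.normal(size=2 * n), method="L-BFGS-B")
Y = res.x.reshape(n, 2)              # 2D projection for visual inspection
print(f"final stress: {res.fun:.4f}")
```

Unlike plain metric MDS, the 1/D_ij weighting emphasizes small distances, which is why Sammon's mapping tends to preserve local cluster structure in sensor-array data.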
Abstract:
The aim of this study was to identify the psycho-musical factors that govern time evaluation in Western music from the baroque, classic, romantic, and modern repertoires. The excerpts had previously been found to represent variability in musical properties and to induce four main categories of emotions. Forty-eight participants (musicians and nonmusicians) freely listened to 16 musical excerpts (lasting 20 sec. each) and grouped those that seemed to have the same duration. Then, participants associated each group of excerpts with one of a set of sine wave tones varying in duration from 16 to 24 sec. Multidimensional scaling analysis generated a two-dimensional solution for these time judgments. Musical excerpts with high arousal produced an overestimation of time, and affective valence had little influence on time perception. Duration was also overestimated when tempo and loudness were higher and, to a lesser extent, with timbre density. In contrast, musical tension had little influence.
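A hedged sketch of how a two-dimensional configuration can be obtained from such grouping data: co-grouping counts are turned into dissimilarities and passed to metric multidimensional scaling (the conversion rule and the random data are illustrative assumptions, not the study's procedure):

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(2)
n_excerpts, n_participants = 16, 48

# stand-in grouping data: each participant assigns every excerpt to a group
groupings = rng.integers(0, 5, size=(n_participants, n_excerpts))

# dissimilarity = fraction of participants who did NOT group i and j together
co_grouped = np.zeros((n_excerpts, n_excerpts))
for g in groupings:
    co_grouped += g[:, None] == g[None, :]
dissim = 1.0 - co_grouped / n_participants
np.fill_diagonal(dissim, 0.0)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)   # 2D configuration of the 16 excerpts
print(coords.shape, f"stress: {mds.stress_:.3f}")
```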
Abstract:
Based on previous observational studies of cold extreme events over southern South America, some recent studies suggest a possible relationship between remotely triggered Rossby wave propagation and the occurrence of frost. Using the linear theory of Rossby wave propagation, this paper analyzes the propagation of such waves in two different basic states that correspond to austral winters with maximum and minimum frequency of occurrence of generalized frosts in the Wet Pampa (central-northwest Argentina). In order to determine the wave trajectories, the ray tracing technique is used in this study; some theoretical discussion of this technique is also presented. The analysis of the basic state, from a theoretical point of view and based on the calculation of ray tracings, corroborates that remotely excited Rossby waves are the mechanism that favors the maximum occurrence of generalized frosts. The basic state in which the waves propagate determines the places where they are excited. The Rossby waves are excited in particular regions of the atmosphere and propagate towards South America along the jet streams, which act as wave guides, favoring the generation of generalized frosts. In summary, this paper presents an overview of the ray tracing technique and shows how it can be used to investigate an important synoptic event, such as frost in a specific region, and its relationship with the propagation of large-scale planetary waves.
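For reference, the standard barotropic ray-tracing relations that such an overview builds on (textbook form for a slowly varying zonal basic state U(y); not the paper's exact notation):

```latex
% Barotropic Rossby waves on a slowly varying zonal flow U(y), with
% zonal and meridional wavenumbers k, l and \beta_M the meridional
% gradient of absolute vorticity of the basic state:
\omega = U k - \frac{\beta_M\, k}{k^2 + l^2}
% Rays follow the local group velocity; k is conserved along a ray
% for a zonally symmetric basic state, while l refracts:
\frac{dx}{dt} = \frac{\partial \omega}{\partial k}, \qquad
\frac{dy}{dt} = \frac{\partial \omega}{\partial l}, \qquad
\frac{dl}{dt} = -\frac{\partial \omega}{\partial y}.
% Stationary waves (\omega = 0) can propagate only where the
% stationary wavenumber K_s = \sqrt{\beta_M / U} is real; maxima of
% K_s along the jets act as wave guides, as the abstract describes.
```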
Abstract:
The diffusion of astrophysical magnetic fields in conducting fluids in the presence of turbulence depends on whether magnetic fields can change their topology via reconnection in highly conducting media. Recent progress in understanding fast magnetic reconnection in the presence of turbulence confirms that the magnetic field behavior in computer simulations and in turbulent astrophysical environments is similar, as far as magnetic reconnection is concerned. This makes it meaningful to perform MHD simulations of turbulent flows in order to understand the diffusion of magnetic fields in astrophysical environments. Our studies of magnetic field diffusion in turbulent media reveal interesting new phenomena. First of all, our three-dimensional MHD simulations, initiated with anti-correlated magnetic field and gaseous density, exhibit at later times a de-correlation of the magnetic field and density, which corresponds well to observations of the interstellar medium. While earlier studies stressed the role of either ambipolar diffusion or time-dependent turbulent fluctuations in de-correlating magnetic field and density, we obtain the effect of permanent de-correlation with a one-fluid code, i.e., without invoking ambipolar diffusion. In addition, in the presence of gravity and turbulence, our three-dimensional simulations show a decrease of the magnetic flux-to-mass ratio as the gaseous density at the center of the gravitational potential increases. We observe this effect both when we start with equilibrium distributions of gas and magnetic field and when we follow the evolution of collapsing, dynamically unstable configurations. Thus, the process of turbulent magnetic field removal should be applicable both to quasi-static subcritical molecular clouds and cores and to violently collapsing supercritical entities. The increase of the gravitational potential as well as of the magnetization of the gas increases the segregation of mass and magnetic flux in the saturated final state of the simulations, supporting the notion that the reconnection-enabled diffusivity relaxes the magnetic field + gas system in the gravitational field to its minimal energy state. This effect is expected to play an important role in star formation, from its initial stages of concentrating interstellar gas to the final stages of accretion onto the forming protostar. In addition, we benchmark our codes by studying heat transfer in magnetized compressible fluids and confirm the high rates of turbulent advection of heat obtained in an earlier study.
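The flux-to-mass diagnostic tracked above can be written down compactly: for a cylinder around the potential minimum, Φ/M is the midplane magnetic flux over the enclosed mass. A generic sketch on a uniform grid follows; the toy snapshot and all array names are illustrative assumptions:

```python
import numpy as np

def flux_to_mass_ratio(rho, bz, dx, radius):
    """Phi/M for a cylinder of given radius centered on the grid center:
    Phi = integral of Bz over the midplane disk, M = mass inside the cylinder.
    rho, bz: 3D arrays on a uniform grid with cell size dx."""
    nx, ny, nz = rho.shape
    x = (np.arange(nx) - nx / 2 + 0.5) * dx
    y = (np.arange(ny) - ny / 2 + 0.5) * dx
    inside = (x[:, None] ** 2 + y[None, :] ** 2) <= radius ** 2
    phi = np.sum(bz[:, :, nz // 2][inside]) * dx ** 2   # midplane flux
    mass = np.sum(rho[inside, :]) * dx ** 3             # mass in the cylinder
    return phi / mass

# toy snapshot: centrally condensed density threaded by a uniform vertical field
n, dx = 64, 1.0 / 64
ax = (np.arange(n) - n / 2 + 0.5) * dx
R2 = ax[:, None, None] ** 2 + ax[None, :, None] ** 2 + ax[None, None, :] ** 2
rho = 1.0 + 10.0 * np.exp(-R2 / 0.02)
bz = np.ones((n, n, n))
print(f"Phi/M = {flux_to_mass_ratio(rho, bz, dx, radius=0.25):.3f}")
```

Tracking this ratio over successive snapshots, as the central density grows, is one way to quantify the turbulent flux removal the abstract reports.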
Abstract:
We study compressible magnetohydrodynamic turbulence, which holds the key to many astrophysical processes, including star formation and cosmic-ray propagation. To account for the variations of the magnetic field in the strongly turbulent fluid, we use wavelet decomposition of the turbulent velocity field into Alfvén, slow, and fast modes, which extends the Cho & Lazarian decomposition approach based on Fourier transforms. The wavelets allow us to follow the variations of the local direction of the magnetic field and therefore improve the quality of the decomposition compared to the Fourier transforms, which are performed in the mean-field reference frame. For each resulting component, we calculate the spectra and two-point statistics such as longitudinal and transverse structure functions, as well as higher-order intermittency statistics. In addition, we perform a Helmholtz-Hodge decomposition of the velocity field into incompressible and compressible parts and analyze these components. We find that the turbulence intermittency is different for different components, and we show that the intermittency statistics depend on whether the phenomenon is studied in the global reference frame related to the mean magnetic field or in the frame defined by the local magnetic field. The dependencies of the measures we obtained differ between velocity components; for instance, we show that while the Alfvén mode intermittency changes marginally with the Mach number, the intermittency of the fast mode is substantially affected by the change.
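As a sketch of the two-point statistics mentioned, here is a generic estimator of longitudinal and transverse structure functions that samples random point pairs at fixed separation; the synthetic velocity field and the axis-aligned sampling are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 64
v = rng.normal(size=(3, n, n, n))   # stand-in for a turbulent velocity field

def structure_functions(v, r, order=2, samples=5000):
    """Longitudinal/transverse structure functions at integer separation r:
    S_L = <|du . rhat|^p>, S_T = <|du - (du . rhat) rhat|^p>, with du the
    velocity increment between points separated by r along a random axis."""
    n = v.shape[1]
    pts = rng.integers(0, n, size=(samples, 3))
    axes = rng.integers(0, 3, size=samples)
    sl, st = [], []
    for (i, j, k), ax in zip(pts, axes):
        rhat = np.zeros(3)
        rhat[ax] = 1.0
        shifted = [(i, j, k)[d] + (r if d == ax else 0) for d in range(3)]
        du = (v[:, shifted[0] % n, shifted[1] % n, shifted[2] % n]
              - v[:, i, j, k])
        dul = du @ rhat                        # longitudinal increment
        dut = np.linalg.norm(du - dul * rhat)  # transverse magnitude
        sl.append(abs(dul) ** order)
        st.append(dut ** order)
    return np.mean(sl), np.mean(st)

for r in (1, 2, 4, 8, 16):
    SL, ST = structure_functions(v, r)
    print(f"r={r:2d}:  S2_L={SL:.3f}  S2_T={ST:.3f}")
```

Higher-order intermittency statistics follow from the same estimator by raising `order`; in a local-frame analysis the separation vector would be taken along and across the local magnetic field instead of the grid axes.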
Abstract:
Visualization of high-dimensional data requires a mapping to a visual space. Whenever the goal is to preserve similarity relations, a frequent strategy is to use 2D projections, which afford intuitive interactive exploration, e.g., by users locating and selecting groups and gradually drilling down to individual objects. In this paper, we propose a framework for projecting high-dimensional data to 3D visual spaces, based on a generalization of the Least-Square Projection (LSP). We compare projections to 2D and 3D visual spaces both quantitatively and through a user study considering certain exploration tasks. The quantitative analysis confirms that 3D projections outperform 2D projections in terms of precision. The user study indicates that certain tasks can be more reliably and confidently answered with 3D projections. Nonetheless, as 3D projections are displayed on 2D screens, interaction is more difficult. Therefore, we incorporate suitable interaction functionalities into a framework that supports 3D transformations, predefined optimal 2D views, coordinated 2D and 3D views, and hierarchical 3D cluster definition and exploration. For visually encoding data clusters in a 3D setup, we employ color coding of projected data points as well as four types of surface renderings. A second user study evaluates the suitability of these visual encodings. Several examples illustrate the framework's applicability to visual exploration of both multidimensional abstract (non-spatial) data and the feature space of multi-variate spatial data.
Abstract:
The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least square approximations. The approximations compute the coordinates of a set of projected points based on the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projection (LSP). From an initial projection of the control points, LSP defines the positioning of their neighboring points through a numerical solution that aims at preserving a similarity relationship between the points, given by a metric in the m-dimensional space. In order to perform the projection, a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability through its application to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly for the application where it was most extensively tested, namely mapping text sets.
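A hedged sketch of the least-squares construction outlined here: Laplacian-style rows express each point as the average of its feature-space neighbors, extra rows pin the pre-positioned control points, and the rectangular system is solved in the least-squares sense, once per output coordinate (the neighborhood size, control-point choice, and all names are illustrative assumptions):

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr
from scipy.spatial.distance import cdist

def lsp_project(X, ctrl_idx, ctrl_pos, k=8):
    """Project n x m data X to 2D, given 2D positions for a few control points."""
    n = X.shape[0]
    D = cdist(X, X)
    A = lil_matrix((n + len(ctrl_idx), n))
    # Laplacian rows: x_i minus the mean of its k nearest neighbors = 0
    for i in range(n):
        A[i, i] = 1.0
        for j in np.argsort(D[i])[1:k + 1]:
            A[i, j] = -1.0 / k
    # constraint rows pinning the control points to their given positions
    for r, i in enumerate(ctrl_idx):
        A[n + r, i] = 1.0
    A = A.tocsr()
    Y = np.zeros((n, 2))
    for dim in range(2):                 # one least-squares solve per axis
        b = np.zeros(n + len(ctrl_idx))
        b[n:] = ctrl_pos[:, dim]
        Y[:, dim] = lsqr(A, b)[0]
    return Y

# toy usage: two blobs in 10D, one control point per blob placed apart in 2D
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (30, 10)), rng.normal(5, 1, (30, 10))])
Y = lsp_project(X, np.array([0, 30]), np.array([[-1.0, 0.0], [1.0, 0.0]]))
print(Y[:3], Y[-3:], sep="\n")
```

Because only neighborhood distances enter the system and the solve is a single sparse least-squares problem, no iterative repositioning of the projected points is needed, which matches the efficiency claim in the abstract.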