69 results for Reference frame
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
In recent years, we have experienced increasing interest in the understanding of the physical properties of collisionless plasmas, mostly because of the large number of astrophysical environments (e.g. the intracluster medium (ICM)) containing magnetic fields that are strong enough to be coupled with the ionized gas and characterized by densities sufficiently low to prevent pressure isotropization with respect to the magnetic field direction. Under these conditions, a new class of kinetic instabilities arises, such as the firehose and mirror instabilities, which have been studied extensively in the literature. Their role in the turbulence evolution and cascade process in the presence of pressure anisotropy, however, is still unclear. In this work, we present the first statistical analysis of turbulence in collisionless plasmas using three-dimensional numerical simulations that solve the double-isothermal magnetohydrodynamic equations with the Chew-Goldberger-Low closure (CGL-MHD). We study models with different initial conditions to account for the firehose and mirror instabilities and to obtain different turbulent regimes. We found that subsonic and supersonic CGL-MHD turbulence shows small differences compared to the MHD models in most cases. However, in the regimes of strong kinetic instabilities, the statistics, i.e. the probability distribution functions (PDFs) of density and velocity, are very different. In subsonic models, the instabilities cause an increase in the dispersion of density, while in some cases the dispersion of velocity is increased by a large factor. Moreover, the spectra of density and velocity show increased power at small scales, explained by the high growth rate of the instabilities. Finally, we calculated the structure functions of velocity and density fluctuations in the local reference frame defined by the direction of the magnetic field lines.
The results indicate that in some cases the instabilities significantly increase the anisotropy of fluctuations. These results, even though preliminary and restricted to very specific conditions, show that the physical properties of turbulence in collisionless plasmas, such as those found in the ICM, may be very different from what has been widely believed.
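The structure-function statistics used in this analysis can be illustrated with a minimal 1D sketch (the signal and lags below are illustrative, not the simulation data):

```python
import numpy as np

def structure_function(field, lags, order=2):
    """S_p(l) = <|f(x + l) - f(x)|^p>, averaged over all positions of a 1D field."""
    return np.array([np.mean(np.abs(field[l:] - field[:-l]) ** order)
                     for l in lags])

# Brownian-like signal: increments are uncorrelated, so S_2(l) grows ~linearly with l
rng = np.random.default_rng(0)
v = np.cumsum(rng.standard_normal(4096))
lags = [1, 2, 4, 8, 16]
s2 = structure_function(v, lags)
```

In the paper's setting the same statistic is computed along and across the local magnetic field direction; the sketch only shows the scaling behaviour itself.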
Abstract:
We present a catalogue of galaxy photometric redshifts and k-corrections for the Sloan Digital Sky Survey Data Release 7 (SDSS-DR7), available on the World Wide Web. The photometric redshifts were estimated with an artificial neural network using five ugriz bands, concentration indices and Petrosian radii in the g and r bands. We have explored our redshift estimates with different training sets, concluding that the best choice for improving redshift accuracy comprises the main galaxy sample (MGS), the luminous red galaxies and the galaxies of active galactic nuclei covering the redshift range 0 < z < 0.3. For the MGS, the photometric redshift estimates agree with the spectroscopic values within rms = 0.0227. The distribution of photometric redshifts derived in the range 0 < z(phot) < 0.6 agrees well with the model predictions. k-corrections were derived by calibrating the k-correct_v4.2 code results for the MGS with the reference-frame (z = 0.1) (g - r) colours. We adopt a linear dependence of k-corrections on redshift and (g - r) colours that provides suitable distributions of luminosity and colours for galaxies up to redshift z(phot) = 0.6, comparable to the results in the literature. Our k-correction estimation procedure is thus a powerful, computationally inexpensive algorithm capable of reproducing suitable results that can be used for testing galaxy properties at intermediate redshifts using the large SDSS data base.
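A linear dependence of k-corrections on redshift and colour can be recovered by ordinary least squares; the sketch below uses invented coefficients and synthetic data, not the catalogue's calibration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
z = rng.uniform(0.0, 0.6, n)           # photometric redshifts
gr = rng.uniform(0.2, 1.2, n)          # (g - r) colours
true = np.array([0.05, 1.4, 0.3])      # invented coefficients: intercept, z, (g - r)
k = true[0] + true[1] * z + true[2] * gr + rng.normal(0, 0.01, n)

# design matrix [1, z, (g - r)] and least-squares fit
A = np.column_stack([np.ones(n), z, gr])
coef, *_ = np.linalg.lstsq(A, k, rcond=None)
```

With small scatter the fit recovers the generating coefficients, which is the property that makes such a low-cost parametrization usable at intermediate redshifts.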
Abstract:
We study compressible magnetohydrodynamic turbulence, which holds the key to many astrophysical processes, including star formation and cosmic-ray propagation. To account for the variations of the magnetic field in the strongly turbulent fluid, we use wavelet decomposition of the turbulent velocity field into Alfven, slow, and fast modes, which presents an extension of the Cho & Lazarian decomposition approach based on Fourier transforms. The wavelets allow us to follow the variations of the local direction of the magnetic field and therefore improve the quality of the decomposition compared to the Fourier transforms, which are done in the mean field reference frame. For each resulting component, we calculate the spectra and two-point statistics such as longitudinal and transverse structure functions as well as higher order intermittency statistics. In addition, we perform a Helmholtz-Hodge decomposition of the velocity field into incompressible and compressible parts and analyze these components. We find that the turbulence intermittency is different for different components, and we show that the intermittency statistics depend on whether the phenomenon was studied in the global reference frame related to the mean magnetic field or in the frame defined by the local magnetic field. The dependencies of the measures we obtained are different for different components of the velocity; for instance, we show that while the Alfven mode intermittency changes marginally with the Mach number, the intermittency of the fast mode is substantially affected by the change.
Concepts and determination of reference values for human biomonitoring of environmental contaminants
Abstract:
Human biomonitoring (HBM) of environmental contaminants plays an important role in estimating exposure and evaluating risk, and thus it has been increasingly applied in the environmental field. The results of HBM must be compared with reference values (RV). The term "reference values" has always been related to the interpretation of clinical laboratory tests. For physicians, RV indicate "normal values" or "limits of normal"; in turn, toxicologists prefer the terms "background values" or "baseline values" to refer to the presence of contaminants in biological fluids. This discrepancy leads to the discussion concerning which population should be selected to determine RV. Whereas clinical chemistry employs an altered health state as the main exclusion criterion to select a reference population (that is, a "healthy" population would be selected), in environmental toxicology the exclusion criterion is abnormal exposure to xenobiotics. Therefore, the choice of population to determine RV is based on the very purpose of the RV to be determined. The present paper discusses the concepts and methodology used to determine RV for biomarkers of chemical environmental contaminants.
Abstract:
Broad-scale phylogenetic analyses of the angiosperms and of the Asteridae have failed to confidently resolve relationships among the major lineages of the campanulid Asteridae (i.e., the euasterid II of APG II, 2003). To address this problem we assembled presently available sequences for a core set of 50 taxa, representing the diversity of the four largest lineages (Apiales, Aquifoliales, Asterales, Dipsacales) as well as the smaller "unplaced" groups (e.g., Bruniaceae, Paracryphiaceae, Columelliaceae). We constructed four data matrices for phylogenetic analysis: a chloroplast coding matrix (atpB, matK, ndhF, rbcL), a chloroplast non-coding matrix (rps16 intron, trnT-F region, trnV-atpE IGS), a combined chloroplast dataset (all seven chloroplast regions), and a combined genome matrix (seven chloroplast regions plus 18S and 26S rDNA). Bayesian analyses of these datasets using mixed substitution models produced often well-resolved and supported trees. Consistent with more weakly supported results from previous studies, our analyses support the monophyly of the four major clades and the relationships among them. Most importantly, Asterales are inferred to be sister to a clade containing Apiales and Dipsacales. Paracryphiaceae is consistently placed sister to the Dipsacales. However, the exact relationships of Bruniaceae, Columelliaceae, and an Escallonia clade depended upon the dataset. Areas of poor resolution in combined analyses may be partly explained by conflict between the coding and non-coding data partitions. We discuss the implications of these results for our understanding of campanulid phylogeny and evolution, paying special attention to how our findings bear on character evolution and biogeography in Dipsacales.
Abstract:
Objectives. A large-scale survey of doses to patients undergoing the most frequent radiological examinations was carried out in health services in São Paulo (347 radiological examinations per 1,000 inhabitants), the most populous Brazilian state. Methods. A postal dosimetric kit with thermoluminescence dosimeters was used to evaluate the entrance surface dose (ESD) to patients. A stratified sampling technique applied to the national health database furnished important data on the distribution of equipment and the annual number of examinations. Chest, head (skull and sinus), and spine (cervical, thoracic, and lumbar) examinations were included in the trial. A total of 83 rooms and 868 patients were included, and 1,415 values of ESD were measured. Results. The data show large coefficients of variation in tube charge, giving rise to large variations in ESD values. Also, a series of high ESD values associated with unnecessary localizing fluoroscopy were detected. Diagnostic reference levels were determined, based on the 75th percentile (third quartile) of the ESD distributions. For adult patients, the diagnostic reference levels achieved are very similar to those obtained in international surveys. However, the situation is different for pediatric patients: the ESD values found in this survey are twice as large as the international recommendations for chest radiographs of children. Conclusions. Despite the reduced number of ESD values and rooms for the pediatric patient group, it is recommended that practices in chest examinations be revised and that specific national reference doses and image quality criteria be established after a broader survey is carried out.
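The diagnostic-reference-level rule described above (75th percentile of the ESD distribution) reduces to a one-line percentile computation; the dose values below are invented:

```python
import numpy as np

# hypothetical entrance surface doses (mGy) for one examination type
esd = np.array([0.8, 1.1, 0.9, 2.5, 1.4, 1.0, 3.2, 1.6, 0.7, 1.2])

# third quartile of the observed distribution -> diagnostic reference level
drl = np.percentile(esd, 75)
```

Rooms whose typical ESD exceeds the DRL are then candidates for a practice review, as the survey recommends for pediatric chest examinations.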
Abstract:
This study aimed to develop a food frequency questionnaire for each of three groups (women, men, and both genders combined), based on dietary data obtained from a population-based study covering different income levels. To build the questionnaire, 24-hour dietary recall data from 1,477 individuals in a probabilistic sample of the municipality of São Paulo, collected in 2003, were used. Food items contributing at least 90% of the daily intake of calories and nutrients were selected. The reference period was the year preceding the interview, and foods could be chosen among four portion sizes.
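The item-selection step (keeping the foods that together contribute at least 90% of daily energy) can be sketched as follows; the foods and energy contributions are invented, not the study's recall data:

```python
# hypothetical mean energy contribution (kcal/day) of food items from 24h recalls
items = {"rice": 320, "beans": 180, "bread": 150, "beef": 140, "sugar": 90,
         "milk": 80, "pasta": 60, "chicken": 55, "oil": 50, "soda": 45,
         "coffee": 10, "lettuce": 5}

total = sum(items.values())
selected, running = [], 0.0
# add items in decreasing order of contribution until 90% of energy is covered
for name, kcal in sorted(items.items(), key=lambda kv: -kv[1]):
    if running / total >= 0.90:
        break
    selected.append(name)
    running += kcal
```

The same ranking would be repeated per nutrient, and the union of the selected items forms the questionnaire's food list.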
Abstract:
The aim of the present study was to assess the prevalence of inadequate nutrient intake in a group of adolescents from São Bernardo do Campo-SP. Energy and nutrient intake data were obtained through 24-hour recalls applied to 89 adolescents. The prevalence of inadequacy was calculated using the EAR cut-point method, after adjustment for within-person variability with the procedure developed by Iowa State University. The Dietary Reference Intakes (DRI) were used as the reference values for intake. For nutrients without an established EAR, the intake distribution was compared with the AI. The highest prevalences of inadequacy in both sexes were observed for magnesium (99.3% for males and 81.8% for females), zinc (44.0% for males and 23.5% for females), vitamin C (57.2% for males and 59.9% for females), and folate (34.8% for females). The proportion of individuals with intake above the AI was negligible (less than 2.0%) in both sexes.
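The EAR cut-point method estimates the prevalence of inadequacy as the share of usual intakes below the EAR; a minimal sketch with a simulated, already variance-adjusted intake distribution (all values invented, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)

# hypothetical usual intake of a nutrient (mg/day) for 89 adolescents,
# assumed already adjusted for within-person variability
usual_intake = rng.normal(250, 60, 89)
EAR = 300   # hypothetical Estimated Average Requirement

# EAR cut-point: prevalence of inadequacy = % of usual intakes below the EAR
prevalence = np.mean(usual_intake < EAR) * 100
```

The within-person adjustment matters because unadjusted single-day recalls overstate the spread of the distribution and bias the tail proportion.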
Abstract:
Background: To estimate the prevalence of and identify factors associated with physical activity in leisure, transportation, occupational, and household settings. Methods: This was a cross-sectional study aimed at investigating living and health conditions among the population of São Paulo, Brazil. Data on 1318 adults aged 18 to 65 years were used. To assess physical activity, the long version of the International Physical Activity Questionnaire was applied. Multivariate analysis was conducted using a hierarchical model. Results: The greatest prevalence of insufficient activity was related to transportation (91.7%), followed by leisure (77.5%), occupational (68.9%), and household settings (56.7%). The variables associated with insufficient levels of physical activity in leisure were female sex, older age, low education level, nonwhite skin color, smoking, and self-reported poor health; in occupational settings, female sex, white skin color, high education level, self-reported poor health, nonsmoking, and obesity; in transportation settings, female sex; and in household settings, male sex, separated or widowed status, and high education level. Conclusion: Physical activity in transportation and leisure settings should be encouraged. This study will serve as a reference point in monitoring different types of physical activities and implementing public physical activity policies in developing countries.
Abstract:
The structural engineering community in Brazil faces new challenges with the recent occurrence of high intensity tornados. Satellite surveillance data show that the area covering the south-east of Brazil, Uruguay and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous tornado alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. Reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability and the corresponding limit state exceedance cost. These costs include the costs of repairing, rebuilding, and paying compensation for injury and loss of life. The total expected failure cost is the sum of the individual expected costs over all failure modes. The steel frame communications tower studied here has become very common in Brazil due to increasing mobile phone coverage. The study shows that the optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations.
Failure consequences are also different for the different parties involved in the design, construction and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g., in residential or office buildings) it is not too costly to over-design; this observation is in agreement with the observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of the design cost (e.g., a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads like tornados, the optimum reliability is strongly dependent on the selected design life.
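The cost-risk trade-off described above can be sketched with a toy model: total expected cost = construction cost + Pf x failure cost, minimized over a grid of safety factors (the cost functions and the reliability-index mapping are invented, not the paper's tower model):

```python
import math

def failure_probability(lam):
    # invented mapping: reliability index beta grows linearly with the
    # partial safety factor lam; Pf = Phi(-beta) via the complementary error function
    beta = 2.0 + 3.0 * (lam - 1.0)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def total_expected_cost(lam, failure_cost):
    construction = 1.0 + 0.2 * (lam - 1.0)   # over-design raises the initial cost
    return construction + failure_probability(lam) * failure_cost

lams = [1.0 + 0.01 * i for i in range(101)]  # candidate safety factors 1.0 .. 2.0
best_cheap_failure = min(lams, key=lambda l: total_expected_cost(l, 50.0))
best_costly_failure = min(lams, key=lambda l: total_expected_cost(l, 5000.0))
```

Consistent with the abstract's conclusion, the optimum safety factor grows as the consequences of failure grow.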
Abstract:
For obtaining accurate and reliable gene expression results it is essential that quantitative real-time RT-PCR (qRT-PCR) data are normalized with appropriate reference genes. The current exponential increase in postgenomic studies on the honey bee, Apis mellifera, makes the standardization of qRT-PCR results an important task for ongoing community efforts. To this end we selected four candidate reference genes (actin, ribosomal protein 49, elongation factor 1-alpha, tbp-association factor) and used three software-based approaches (geNorm, BestKeeper and NormFinder) to evaluate the suitability of these genes as endogenous controls. Their expression was examined during honey bee development, in different tissues, and after juvenile hormone exposure. Furthermore, the importance of choosing an appropriate reference gene was investigated for two developmentally regulated target genes. The results led us to consider all four candidate genes as suitable for normalization in A. mellifera. However, each condition evaluated in this study revealed a specific set of genes as the most appropriate.
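A geNorm-style stability measure can be sketched as the average standard deviation of pairwise log-ratios between candidate genes (expression values below are simulated, not honey bee data, and this is only the core of geNorm's M statistic, not its full ranking procedure):

```python
import numpy as np

def genorm_m(expr):
    """geNorm-style stability measure M for each candidate gene: the average
    standard deviation of log2 expression ratios against every other candidate.
    expr: (n_samples, n_genes) array of raw expression values."""
    log2 = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        m[j] = np.mean([np.std(log2[:, j] - log2[:, k])
                        for k in range(n_genes) if k != j])
    return m

# simulated candidates: two tightly regulated genes and one highly variable gene
rng = np.random.default_rng(3)
stable = 2 ** rng.normal(10, 0.05, 40)
variable = 2 ** rng.normal(10, 1.0, 40)
other = 2 ** rng.normal(8, 0.05, 40)
m = genorm_m(np.column_stack([stable, variable, other]))
```

Lower M means a more stable candidate; the unstable gene stands out with the highest score.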
Abstract:
The dengue virus has a single-stranded positive-sense RNA genome of approximately 10,700 nucleotides with a single open reading frame that encodes three structural (C, prM, and E) and seven nonstructural (NS1, NS2A, NS2B, NS3, NS4A, NS4B, and NS5) proteins. It possesses four antigenically distinct serotypes (DENV 1-4). Many phylogenetic studies address particularities of the different serotypes using convenience samples that are not conducive to a spatio-temporal analysis in a single urban setting. We describe the pattern of spread of distinct lineages of DENV-3 circulating in Sao Jose do Rio Preto, Brazil, during 2006. Blood samples from patients presenting dengue-like symptoms were collected for DENV testing. We performed M-N-PCR using primers based on NS5 for virus detection and identification. The fragments were purified from PCR mixtures and sequenced. The positive dengue cases were geo-coded. To type the sequenced samples, 52 reference sequences were aligned. The dataset generated was used for iterative phylogenetic reconstruction under the maximum likelihood criterion. The best demographic model, the rate of growth, the rate of evolutionary change, and the Time to Most Recent Common Ancestor (TMRCA) were estimated. The basic reproductive rate during the epidemics was also estimated. We obtained sequences from 82 patients among 174 blood samples. We were able to geo-code 46 sequences. The alignment generated a 399-nucleotide-long dataset with 134 taxa. The phylogenetic analysis indicated that all samples were of DENV-3 and related to strains circulating on the isle of Martinique in 2000-2001. Sixty DENV-3 sequences from Sao Jose do Rio Preto formed a monophyletic group (lineage 1), closely related to the remaining 22 isolates (lineage 2). We assumed that these lineages appeared before 2006 on different occasions.
By transforming the inferred exponential growth rates into the basic reproductive rate, we obtained values of R0 = 1.53 for lineage 1 and R0 = 1.13 for lineage 2. Under the exponential model, the TMRCA of lineage 1 dated to 1 year, and that of lineage 2 to 3.4 years, before the last sampling. The possibility of inferring spatio-temporal dynamics from genetic data has generally been little explored, and it may shed light on DENV circulation. The use of both geographically and temporally structured phylogenetic data provided a detailed view of the spread of at least two dengue viral strains in a populated urban area.
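Converting an exponential growth rate into a basic reproductive rate can be done, under a simple SIR-type assumption, as R0 = 1 + r*D, with D the mean infectious (or generation) interval; the abstract does not state which conversion was used, and the numbers below are invented, not the paper's estimates:

```python
def r0_from_growth(r, duration):
    """Basic reproductive rate from an exponential growth rate r (per unit time)
    under a simple SIR-type assumption: R0 = 1 + r * D, where D is the mean
    infectious/generation interval in the same time unit as r."""
    return 1.0 + r * duration

# invented inputs: growth rate per year and a ~20-day generation interval
r0 = r0_from_growth(r=9.7, duration=20 / 365)
```

An epidemic grows only when R0 > 1, so the sign of r alone already decides whether a lineage is expanding.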
Abstract:
The degree of homogeneity is normally assessed from the variability of the results of independent analyses of several (e.g., 15) normal-scale replicates. Large sample instrumental neutron activation analysis (LS-INAA) with a collimated Ge detector allows inspecting the degree of homogeneity of the initial batch material using a kilogram-size sample. The test is based on the spatial distributions of induced radioactivity. This test was applied to samples of Brazilian whole (green) coffee beans (Coffea arabica and Coffea canephora) of approximately 1 kg within the development of a coffee reference material. Results indicated that the material does not contain significant element composition inhomogeneities between batches of approximately 30-50 g, the masses typically forming the starting base of a reference material.
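The replicate-variability check behind such homogeneity assessments reduces to a relative standard deviation over the replicate results; the concentrations below are invented:

```python
import numpy as np

# hypothetical element concentrations (mg/kg) from 15 independent replicate analyses
replicates = np.array([41.2, 40.8, 41.5, 40.9, 41.1, 41.3, 40.7, 41.0,
                       41.4, 40.6, 41.2, 40.9, 41.1, 41.3, 40.8])

# between-replicate relative standard deviation (%), sample std (ddof=1)
rsd = 100 * np.std(replicates, ddof=1) / np.mean(replicates)
```

A small RSD between replicates (here well under a few percent) is the kind of evidence used to declare the batch material sufficiently homogeneous for a reference material.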
Abstract:
Taking as a starting point the acknowledgment that the principles and methods used to build and manage documentary systems are dispersed and lack systematization, this study hypothesizes that the notion of structure, by assuming mutual relationships among its elements, promotes more organic systems and assures better quality and consistency in the retrieval of information relevant to users' needs. Accordingly, it aims to explore the fundamentals of information records and documentary systems, starting from the notion of structure. To that end, it presents basic concepts and issues related to documentary systems and information records. It then surveys the theoretical groundwork on the notion of structure as studied by Benveniste, Ferrater Mora, Levi-Strauss, Lopes, Penalver Simo and Saussure, as well as by Ducrot, Favero and Koch. Appropriations already made within Documentation by Paul Otlet, Garcia Gutierrez and Moreiro Gonzalez come as a further topic. It concludes that the notion of structure, adopted to make explicit a hypothesis of real systematization, yields more organic systems and provides a pedagogical reference for documentary tasks.
Abstract:
The most ordinary finite element formulations for 3D frame analysis do not consider the warping of cross-sections as part of their kinematics. The torsional stiffness must therefore be introduced directly by the user into the computational software, and the bar is treated as if it were working under a no-warping hypothesis. This approach does not give good results for general structural elements applied in engineering. Both displacement and stress calculations reveal significant deficiencies in both linear and non-linear applications. For linear analysis, displacements can be corrected by assuming a stiffness that results in acceptable global displacements of the analyzed structure. However, the stress calculation will be far from reality. For nonlinear analysis the deficiencies are even worse. In the past forty years, special structural matrix analysis and finite element formulations have been proposed in the literature to include warping and bending-torsion effects in 3D general frame analysis, considering both linear and non-linear situations. In this work, using a kinematics improvement technique, the degree of freedom "warping intensity" is introduced following a new approach for 3D frame elements. This degree of freedom is associated with the basic warping mode, a geometric characteristic of the cross-section. It does not have a direct relation with the rate of twist rotation along the longitudinal axis, as in existing formulations. Moreover, a linear strain variation mode is provided for the geometric non-linear approach, for which the complete 3D constitutive relation (Saint-Venant-Kirchhoff) is adopted. The proposed technique allows the consideration of inhomogeneous cross-sections with any geometry. Various examples are shown to demonstrate the accuracy and applicability of the proposed formulation. (C) 2009 Elsevier Inc. All rights reserved.