Abstract:
The semiarid region of northeastern Brazil, the Caatinga, is extremely important due to its biodiversity and endemism. Measurements of plant physiology are crucial to the calibration of Dynamic Global Vegetation Models (DGVMs) that are currently used to simulate the responses of vegetation to global change. In fieldwork carried out in an area of preserved Caatinga forest located in Petrolina, Pernambuco, measurements of carbon assimilation (in response to light and CO2) were performed on 11 individuals of Poincianella microphylla, a native species that is abundant in this region. These data were used to calibrate the maximum carboxylation velocity (Vcmax) used in the INLAND model. The calibration techniques used were Multiple Linear Regression (MLR) and the data mining techniques Classification and Regression Trees (CART) and K-means. The results were compared to those of the uncalibrated model. It was found that simulated Gross Primary Productivity (GPP) reached 72% of observed GPP when using the calibrated Vcmax values, whereas the uncalibrated approach accounted for only 42% of observed GPP. This work thus shows the benefits of calibrating DGVMs using field ecophysiological measurements, especially in areas where field data are scarce or nonexistent, such as the Caatinga.
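For readers unfamiliar with the calibration techniques named here, the sketch below shows, on invented data, how an MLR fit and a K-means grouping of this kind could be set up. It is not the authors' INLAND calibration code; all variable names and values are hypothetical.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Hypothetical leaf-level predictors for 11 individuals: PAR, leaf T, CO2.
X = rng.normal(size=(11, 3))
vcmax = 50 + X @ np.array([5.0, -2.0, 3.0]) + rng.normal(scale=2.0, size=11)

mlr = LinearRegression().fit(X, vcmax)             # Multiple Linear Regression
groups = KMeans(n_clusters=3, n_init=10).fit(X)    # K-means grouping of samples
print("MLR coefficients:", mlr.coef_)
print("group labels:", groups.labels_)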
Abstract:
Current data indicate that the size of high-density lipoprotein (HDL) may be considered an important marker for cardiovascular disease risk. We established reference values of mean HDL size and volume in an asymptomatic, representative Brazilian population sample (n=590) and their associations with metabolic parameters by gender. Size and volume were determined in HDL isolated from plasma by polyethyleneglycol precipitation of apoB-containing lipoproteins and measured using the dynamic light scattering (DLS) technique. Although the gender and age distributions agreed with other studies, the mean HDL size reference value was slightly lower than in some other populations. Both HDL size and volume were influenced by gender and varied according to age. HDL size was associated with age and HDL-C in the total population; inversely with non-white ethnicity and CETP in females; and with HDL-C and PLTP mass in males. HDL volume, on the other hand, was determined only by HDL-C (in the total population and in both genders) and by PLTP mass (in males). The reference values for mean HDL size and volume using the DLS technique were established in an asymptomatic and representative Brazilian population sample, together with their related metabolic factors. HDL-C was a major determinant of HDL size and volume, which were differently modulated in females and in males.
Abstract:
Cancer is a multistep process that begins with the transformation of normal epithelial cells and continues with tumor growth, stromal invasion, and metastasis. The remodeling of the peritumoral environment is decisive for the onset of tumor invasiveness. This event depends on epithelial-stromal interactions, degradation of extracellular matrix components, and reorganization of fibrillar components. Our research group has used a newly proposed rodent model to study the cellular and molecular components of the prostate microenvironment that contribute to cancer progression, adopting the gerbil Meriones unguiculatus as an alternative experimental model for prostate cancer. This model has shown significant responses to hormonal treatments and has developed both spontaneous and induced neoplasias. The data obtained indicate reorganization of type I collagen fibers and reticular fibers, synthesis of new components such as tenascin and proteoglycans, degradation of basement membrane components and elastic fibers, and increased expression of metalloproteinases. Fibroblasts bordering the region apparently participate in the stromal reaction. The roles of each of these events, as well as some signaling molecules involved in neoplastic progression and factors that promote genetic reprogramming during the epithelial-stromal transition, are also discussed.
Abstract:
This work is part of a research effort under way since 2000 whose main objective is to measure small dynamic displacements using L1 GPS receivers. A very sensitive way to detect millimetric periodic displacements is the Phase Residual Method (PRM), which is based on the frequency-domain analysis of the phase residuals resulting from the L1 double-difference static data processing of two satellites at almost orthogonal elevation angles. In this article, it is proposed to obtain the phase residuals directly from the raw phase observable collected over a short baseline during a limited time span, in lieu of obtaining the residual data file from regular GPS processing programs, which do not always allow the choice of the desired satellites. In order to improve the ability to detect millimetric oscillations, two filtering techniques are introduced. One is autocorrelation, which reduces phase noise with random time behavior. The other is the running mean, which separates low-frequency from high-frequency phase sources. Two trials were carried out to verify the proposed method and filtering techniques: one simulates a 2.5-millimeter vertical antenna displacement, and the second uses GPS data collected during a bridge load test. The results show good consistency in detecting millimetric oscillations.
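As an illustration only (not the authors' PRM implementation), the sketch below applies the two filtering techniques named above, a running mean and autocorrelation, to a synthetic phase-residual series; the sampling rate, oscillation frequency, and noise level are assumptions.

import numpy as np

rng = np.random.default_rng(0)
fs = 1.0                           # assumed 1 Hz GPS sampling rate
t = np.arange(0, 600, 1 / fs)      # 10-minute session
# Synthetic residuals: a 2.5 mm oscillation at 0.1 Hz buried in noise.
phase = 2.5 * np.sin(2 * np.pi * 0.1 * t) + rng.normal(scale=2.0, size=t.size)

# Running mean: separates the low-frequency trend from high-frequency content.
window = 51
trend = np.convolve(phase, np.ones(window) / window, mode="same")
high_freq = phase - trend

# Autocorrelation: attenuates noise with random time behavior while
# preserving the periodic component of interest.
acf = np.correlate(high_freq, high_freq, mode="full")[high_freq.size - 1:]
acf /= acf[0]                      # normalize so that lag 0 equals 1
print("first ACF values:", np.round(acf[:5], 3))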
Abstract:
Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where none of them has complete knowledge of their counterpart's field. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even in clinical practice. Although communication has a central role in interdisciplinary collaboration and miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted a qualitative analysis of encounters between clinical researchers and data analysis specialists (an epidemiologist, a clinical epidemiologist, and a data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for the extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining the main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.
Abstract:
Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of this data is critical to understanding the regulatory logic encoded in the genome by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to their coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations yielding insight into genome organization, transcription and its regulation and, ultimately, a better understanding of the mechanisms that enable the cell to dynamically respond to its environment.
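The conclusion's point, that genomic coordinates act as a common key, can be made concrete with a small sketch. The following uses Python's sqlite3 as a stand-in for the browser's in-process database; the tables, names, and values are hypothetical, not the Gaggle Genome Browser's actual schema.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE transcripts (gene TEXT, start INT, stop INT);
CREATE TABLE chip_peaks  (peak_id INT, start INT, stop INT, score REAL);
INSERT INTO transcripts VALUES ('geneA', 100, 500), ('geneB', 900, 1400);
INSERT INTO chip_peaks  VALUES (1, 450, 550, 8.2), (2, 950, 1000, 12.7);
""")
# Overlap join: relate protein-DNA binding peaks to transcripts by position.
rows = con.execute("""
    SELECT t.gene, p.peak_id, p.score
    FROM transcripts t
    JOIN chip_peaks p ON p.start <= t.stop AND p.stop >= t.start
""").fetchall()
print(rows)    # [('geneA', 1, 8.2), ('geneB', 2, 12.7)]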
Abstract:
Background: Detailed analysis of the dynamic interactions among the biological, environmental, social, and economic factors that favour the spread of certain diseases is extremely useful for designing effective control strategies. Diseases like tuberculosis, which kills somebody every 15 seconds in the world, require methods that take into account the disease dynamics to design truly efficient control and surveillance strategies. The usual and well-established statistical approaches provide insights into the cause-effect relationships that favour disease transmission, but they only estimate risk areas and spatial or temporal trends. Here we introduce a novel approach that allows the dynamical behaviour of disease spreading to be characterized. This information can subsequently be used to validate mathematical models of the dissemination process, from which the underlying mechanisms responsible for this spreading could be inferred. Methodology/Principal Findings: The method presented here is based on the analysis of the spread of tuberculosis in a Brazilian endemic city during five consecutive years. The detailed analysis of the spatio-temporal correlation of the yearly geo-referenced data, using different characteristic times of the disease evolution, allowed us to trace the temporal path of the aetiological agent, to locate the sources of infection, and to characterize the dynamics of disease spreading. Consequently, the method also allowed the identification of socio-economic factors that influence the process. Conclusions/Significance: The information obtained can contribute to more effective budget allocation, drug distribution, and recruitment of skilled human resources, as well as guide the design of vaccination programs. We propose that this novel strategy can also be applied to the evaluation of other diseases as well as other social processes.
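A minimal sketch, on simulated data, of the kind of spatio-temporal correlation analysis described: checking whether each year's cases arise near the previous year's cases. The coordinates, case counts, and distance summary are invented and do not reproduce the authors' method.

import numpy as np

rng = np.random.default_rng(1)
# Hypothetical geo-referenced cases: (x km, y km) and year of notification.
xy = rng.uniform(0, 10, size=(200, 2))
year = rng.integers(0, 5, size=200)

# For consecutive years, measure how far each new case lies from the nearest
# case of the previous year; spatial clustering shows up as short distances.
for y in range(4):
    prev, curr = xy[year == y], xy[year == y + 1]
    d = np.linalg.norm(curr[:, None, :] - prev[None, :, :], axis=-1)
    print(f"year {y}->{y + 1}: median nearest-source distance "
          f"{np.median(d.min(axis=1)):.2f} km")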
Abstract:
BACKGROUND: Xylitol bioproduction from lignocellulosic residues comprises hydrolysis of the hemicellulose, detoxification of the hydrolysate, bioconversion of the xylose, and recovery of xylitol from the fermented hydrolysate. There are relatively few reports on xylitol recovery from fermented media. In the present study, ion-exchange resins were used to clarify a fermented wheat straw hemicellulosic hydrolysate, which was then vacuum-concentrated and cooled in the presence of ethanol for xylitol crystallization. RESULTS: Sequential adsorption onto two anion-exchange resins (A-860S and A-500PS) promoted considerable reductions in the content of soluble by-products (up to 97.5%) and in medium coloration (99.5%). Vacuum concentration led to a dark-colored viscous solution that inhibited xylitol crystallization. This inhibition could be overcome by mixing the concentrated medium with a commercial xylitol solution. Such a strategy led to xylitol crystals of up to 95.9% purity. The crystallization yield (43.5%) was close to that observed when using commercial xylitol solution (51.4%). CONCLUSION: The experimental data demonstrate the feasibility of using ion-exchange resins followed by cooling in the presence of ethanol as a strategy to promote the fast recovery and purification of xylitol from hemicellulose-derived fermentation media.
Abstract:
Highly ordered A-B-A block copolymer arrangements at the submicrometric scale, resulting from dewetting and solvent evaporation of thin films, have inspired a variety of new applications in the nanometric world. Despite the progress observed in the control of such structures, the intricate scientific phenomena related to regular pattern formation are still not completely elucidated. SEBS is a standard example of a triblock copolymer that spontaneously forms impressive pattern arrangements. Starting from macroscopic thin liquid films of SEBS solution, several physical effects and phenomena act synergistically to achieve well-arranged patterns of stripes and/or droplets. That is, concomitant with dewetting, solvent evaporation, and the Marangoni effect, Rayleigh instability and phase separation also play an important role in the pattern formation. These last two effects are difficult to follow experimentally at the nanoscale, which makes the whole phenomenon harder to comprehend. In this paper, we use computational methods for image analysis, which provide quantitative morphometric data on the patterns, specifically the fragmentation of stripes into droplets. With the help of these computational techniques, we develop an explanation for the final part of the pattern formation, i.e., the structural dynamics related to stripe fragmentation.
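To give a flavour of the morphometric image analysis mentioned (without reproducing the authors' pipeline), the sketch below counts and sizes droplet-like fragments in a synthetic binary image using scikit-image; the image is generated, not a real micrograph.

import numpy as np
from skimage import measure

rng = np.random.default_rng(2)
img = np.zeros((200, 200), dtype=bool)
rr, cc = np.ogrid[:200, :200]
for _ in range(30):                          # scatter synthetic "droplets"
    r, c = rng.integers(10, 190, size=2)
    img |= (rr - r) ** 2 + (cc - c) ** 2 < rng.integers(3, 8) ** 2

labels = measure.label(img)                  # connected-component labelling
props = measure.regionprops(labels)          # per-droplet morphometry
areas = [p.area for p in props]
print(f"{len(props)} droplets, mean area {np.mean(areas):.1f} px")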
Abstract:
One of the objectives of electrical impedance tomography is to estimate the electrical resistivity distribution in a domain based only on electrical potential measurements at its boundary, generated by an electrical current distribution imposed on the boundary. One of the methods used in dynamic estimation is the Kalman filter. In biomedical applications, the random walk model is frequently used as the evolution model and, under these conditions, the extended Kalman filter (EKF) achieves poor tracking ability. An analytically developed evolution model is not feasible at this moment. This paper investigates identifying the evolution model in parallel with the EKF and updating the evolution model periodically. The evolution model transition matrix is identified using the history of the estimated resistivity distribution obtained by a sensitivity-matrix-based algorithm and a Newton-Raphson algorithm. To numerically identify the linear evolution model, the Ibrahim time-domain method is used. The investigation is performed through numerical simulations of a domain with time-varying resistivity and with experimental data collected from the boundary of a human chest during normal breathing. The obtained dynamic resistivity values lie within the expected range for the tissues of a human chest. The EKF results suggest that the tracking ability is significantly improved with this approach.
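The core idea, re-identifying a linear evolution model from the recent history of state estimates instead of assuming a random walk, can be sketched in a few lines. This toy version uses ordinary least squares in place of the Ibrahim time-domain method, and all dimensions and dynamics are invented.

import numpy as np

rng = np.random.default_rng(3)
n, steps = 4, 200
A = rng.normal(size=(n, n))
true_F = 0.9 * A / np.abs(np.linalg.eigvals(A)).max()   # stable dynamics

x = rng.normal(size=n)
history = [x.copy()]
for _ in range(steps):                       # simulated state-estimate history
    x = true_F @ x + 0.01 * rng.normal(size=n)
    history.append(x.copy())
H = np.array(history)

# Identify the transition matrix from consecutive states: H[1:] ~= H[:-1] F^T.
Ft, *_ = np.linalg.lstsq(H[:-1], H[1:], rcond=None)
F_id = Ft.T
print("identification error:", np.linalg.norm(F_id - true_F))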
Abstract:
Dynamic experiments in a nonadiabatic packed bed were carried out to evaluate the response to disturbances in wall temperature and in inlet airflow rate and temperature. A two-dimensional, pseudo-homogeneous, axially dispersed plug-flow model was numerically solved and used to interpret the results. The model parameters were fitted in distinct stages: the effective radial thermal conductivity (K_r) and the wall heat transfer coefficient (h_w) were estimated from steady-state data, and the characteristic packed-bed time constant (tau) from transient data. A new correlation for K_r in packed beds of cylindrical particles was proposed. It was experimentally shown that temperature measurements using radially inserted thermocouples and a ring-shaped sensor were not distorted by heat conduction across the thermocouple or by the thermal inertia of the temperature sensors.
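The staged fitting described above can be illustrated with the transient step alone: estimating a characteristic time constant tau from a step-response temperature record. The data and parameter values below are synthetic and are not the paper's correlations.

import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
t = np.linspace(0, 300, 150)                         # time, s
tau_true = 60.0
T = 320 - 20 * np.exp(-t / tau_true) + rng.normal(scale=0.2, size=t.size)

def step_response(t, T_inf, dT, tau):
    # First-order response of the bed temperature to an inlet step change.
    return T_inf - dT * np.exp(-t / tau)

popt, _ = curve_fit(step_response, t, T, p0=(315.0, 15.0, 30.0))
print(f"fitted tau = {popt[2]:.1f} s (true value 60.0 s)")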
Abstract:
A simple calorimetric method to estimate both kinetic parameters and heat transfer coefficients from temperature-versus-time data under non-adiabatic conditions is described for the hydrolysis of acetic anhydride. The methodology is applied to three laboratory-scale reactors in a simple experimental setup that can be easily implemented. The quality of the experimental results was verified by comparing them with literature values and with predictions obtained from an energy balance. The comparison shows that the experimental kinetic parameters do not agree exactly with those reported in the literature, but they provide good agreement between predicted and experimental temperature and conversion data. The differences between the activation energy obtained here and the values reported in the literature can be ascribed to differences in anhydride-to-water ratios (anhydride concentrations).
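A minimal sketch of the kind of non-adiabatic energy balance such temperature-time data would be fitted against, here for a first-order exothermic reaction. All parameter values are invented for illustration and are not the paper's estimates.

import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters: pre-exponential factor (1/s), activation energy
# (J/mol), adiabatic temperature rise (K), and heat-loss coefficient (1/s).
k0, Ea, R = 1e6, 5e4, 8.314
dT_ad, h_loss, T_amb = 25.0, 0.002, 298.0

def rhs(t, y):
    X, T = y                                      # conversion, temperature (K)
    r = k0 * np.exp(-Ea / (R * T)) * (1 - X)      # first-order Arrhenius rate
    return [r, dT_ad * r - h_loss * (T - T_amb)]  # mass and energy balances

sol = solve_ivp(rhs, (0, 600), [0.0, 298.0])
print(f"final conversion {sol.y[0, -1]:.2f}, final T {sol.y[1, -1]:.1f} K")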
Abstract:
Computer viruses are an important risk to computational systems, endangering both corporations of all sizes and personal computers used for domestic applications. Here, classical epidemiological models for disease propagation are adapted to computer networks and, by using simple system identification techniques, a model called SAIC (Susceptible, Antidotal, Infectious, Contaminated) is developed. Real data about computer viruses are used to validate the model.
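The compartment structure implied by the acronym can be sketched as a small ODE system. The couplings and rate constants below are assumptions chosen for illustration; the paper's actual SAIC equations and fitted parameters are not reproduced here.

import numpy as np
from scipy.integrate import solve_ivp

beta, alpha, delta, sigma = 0.3, 0.1, 0.05, 0.02   # hypothetical rates

def saic(t, y):
    S, A, I, C = y
    infection = beta * S * I      # susceptible machines become infectious
    cure      = alpha * I         # infectious machines receive the antidote
    crash     = delta * I         # infectious machines become contaminated
    immunize  = sigma * S         # susceptible machines receive the antidote
    return [-infection - immunize,          # S
            cure + immunize,                # A
            infection - cure - crash,       # I
            crash]                          # C

sol = solve_ivp(saic, (0, 100), [0.99, 0.0, 0.01, 0.0])
print("final S, A, I, C:", np.round(sol.y[:, -1], 3))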
Abstract:
Traditional waste stabilisation pond (WSP) models encounter problems predicting pond performance because they cannot account for the influence of pond features, such as inlet structure or pond geometry, on fluid hydrodynamics. In this study, two-dimensional (2-D) computational fluid dynamics (CFD) models were compared to experimental residence time distributions (RTD) from the literature. In one of the three geometries simulated, the 2-D CFD model successfully predicted the experimental RTD. However, flow patterns in the other two geometries were not well described, due to the difficulty of representing the three-dimensional (3-D) experimental inlet in the 2-D CFD model and the sensitivity of the model results to the assumptions used to characterise the inlet. Neither a velocity similarity nor a geometric similarity approach to inlet representation in 2-D gave results correlating with experimental data. However, it was shown that 2-D CFD models were not affected by changes in the values of model parameters that are difficult to predict, particularly the turbulent inlet conditions. This work suggests that 2-D CFD models cannot be used a priori to give an adequate description of the hydrodynamic patterns in WSPs.
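For context, an RTD comparison like the one described is usually quantified by the moments of the measured tracer curve; the sketch below computes them for a synthetic response, not data from the cited experiments.

import numpy as np
from scipy.integrate import trapezoid

t = np.linspace(0, 50, 500)                     # time, h
c = t * np.exp(-t / 8.0)                        # synthetic tracer concentration

E = c / trapezoid(c, t)                         # normalized RTD, E(t)
t_mean = trapezoid(t * E, t)                    # mean residence time
variance = trapezoid((t - t_mean) ** 2 * E, t)  # spread due to dispersion
print(f"mean residence time {t_mean:.1f} h, variance {variance:.1f} h^2")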
Abstract:
The use of computational fluid dynamics simulations for calibrating a flush air data system is described. In particular, the flush air data system of the HYFLEX hypersonic vehicle is used as a case study. The HYFLEX air data system consists of nine pressure ports located flush with the vehicle nose surface, connected to onboard pressure transducers. After appropriate processing, surface pressure measurements can be converted into useful air data parameters. The processing algorithm requires an accurate pressure model, which relates air data parameters to the measured pressures. In the past, such pressure models have been calibrated using combinations of flight data, ground-based experimental results, and numerical simulation. We perform a calibration of the HYFLEX flush air data system using computational fluid dynamics simulations exclusively. The simulations are used to build an empirical pressure model that accurately describes the HYFLEX nose pressure distribution over a range of flight conditions. We believe that computational fluid dynamics provides a quick and inexpensive way to calibrate the air data system and is applicable to a broad range of flight conditions. When tested with HYFLEX flight data, the calibrated system is found to work well. It predicts vehicle angle of attack and angle of sideslip to accuracy levels that generally satisfy flight control requirements. Dynamic pressure is predicted to within the resolution of the onboard inertial measurement unit. We find that wind-tunnel experiments and flight data are not necessary to accurately calibrate the HYFLEX flush air data system for hypersonic flight.
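The calibration-and-inversion idea can be made concrete with a toy model: fit port pressures with a simple modified-Newtonian-style law, then invert it for angle of attack and dynamic pressure. The port angles, flight condition, and the pressure law itself are assumptions for illustration, not the HYFLEX pressure model.

import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)
theta_port = np.deg2rad([0.0, 15.0, 30.0, 45.0, 60.0])   # hypothetical ports

def port_pressures(alpha, qbar):
    # Modified-Newtonian-style surface pressure: p ~ qbar * cos^2(theta - alpha).
    return qbar * np.cos(theta_port - alpha) ** 2

# Simulated measurement at a known condition, with transducer-like noise.
p_meas = port_pressures(np.deg2rad(5.0), 50e3) + rng.normal(scale=50.0, size=5)

# Invert the pressure model: recover alpha and qbar from measured pressures.
fit = least_squares(lambda x: port_pressures(x[0], x[1]) - p_meas,
                    x0=[0.0, 40e3])
print(f"alpha = {np.rad2deg(fit.x[0]):.2f} deg, qbar = {fit.x[1] / 1e3:.1f} kPa")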