875 results for modeling of data sources
Abstract:
When the food supply finishes, or when the larvae of blowflies complete their development and migrate prior to the total removal of the larval substrate, they disperse to find adequate places for pupation, a process known as post-feeding larval dispersal. Based on experimental data of the initial and final configuration of the dispersion, the reproduction of such spatio-temporal behavior is achieved here by means of the evolutionary search for cellular automata with a distinct transition rule associated with each cell, also known as nonuniform cellular automata, and with two states per cell in the lattice. Two-dimensional regular lattices and multivalued states will be considered, and a practical question is the necessity of discovering a proper set of transition rules. Given that the number of rules is related to the number of cells in the lattice, the search space is very large, and an evolution strategy is then considered to optimize the parameters of the transition rules, with two transition rules per cell. As the parameters to be optimized admit a physical interpretation, the obtained computational model can be analyzed to raise some hypothetical explanation of the observed spatio-temporal behavior. © 2006 IEEE.
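A minimal sketch of the kind of model described is given below, assuming a simple neighbour-count rule with one real-valued threshold per cell and a (1+1) evolution strategy; the actual per-cell rules, the two-rule-per-cell scheme and the fitted parameters of the paper are not reproduced here.

```python
# Illustrative sketch (not the authors' model): a 2-D nonuniform cellular
# automaton in which every cell owns its own rule parameter, and a simple
# (1+1) evolution strategy mutates those parameters so that the evolved
# configuration matches a target final configuration.
import numpy as np

rng = np.random.default_rng(0)
N = 20                      # lattice side
STEPS = 15                  # CA iterations per fitness evaluation

def step(state, thresholds):
    """One synchronous update: a cell becomes occupied when the number of
    occupied neighbours reaches its own threshold (per-cell rule)."""
    padded = np.pad(state, 1)
    neigh = sum(padded[1 + dy:N + 1 + dy, 1 + dx:N + 1 + dx]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0))
    return (neigh >= thresholds).astype(int)

def run(initial, thresholds):
    s = initial.copy()
    for _ in range(STEPS):
        s = step(s, thresholds)
    return s

def fitness(thresholds, initial, target):
    return np.sum(run(initial, thresholds) != target)   # mismatched cells

# Toy "experimental" initial and final configurations
initial = (rng.random((N, N)) < 0.3).astype(int)
target = (rng.random((N, N)) < 0.2).astype(int)

# (1+1)-ES over one real-valued parameter per cell
parent = rng.uniform(1, 8, size=(N, N))
best = fitness(parent, initial, target)
for gen in range(500):
    child = parent + rng.normal(0, 0.5, size=(N, N))     # Gaussian mutation
    f = fitness(child, initial, target)
    if f <= best:
        parent, best = child, f
print("mismatched cells:", best)
```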
Abstract:
This paper presents a comparative analysis between the experimental characterization and the numerical simulation results for a three-dimensional FCC photonic crystal (PhC) based on a self-assembly synthesis of monodisperse latex spheres. Specifically, experimental optical characterization, by means of reflectance measurements at variable angles over the [1,1,1] lattice plane family, is compared to theoretical calculations based on the Finite Difference Time Domain (FDTD) method, in order to investigate the correlation between theoretical predictions and experimental data. The goal is to highlight the influence of crystal defects on the achieved performance.
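The paper's calculations are FDTD-based; purely as an illustration of how angle-resolved reflectance peaks over the (111) planes of opal-like FCC crystals are often cross-checked, the sketch below uses the simpler Bragg-Snell estimate, with an assumed sphere diameter and refractive indices rather than the paper's values.

```python
# Illustrative cross-check (not the FDTD calculation used in the paper):
# the Bragg-Snell estimate commonly applied to angle-resolved reflectance
# of opal-like FCC crystals.  Sphere diameter and refractive indices are
# assumed example values, not the paper's data.
import numpy as np

D = 300e-9                          # latex sphere diameter (assumed)
n_sphere, n_medium = 1.59, 1.00     # polystyrene latex in air (assumed)
f = 0.74                            # FCC close-packing fraction

d111 = np.sqrt(2.0 / 3.0) * D                           # (111) interplanar spacing
n_eff = np.sqrt(f * n_sphere**2 + (1 - f) * n_medium**2)

for theta_deg in (0, 15, 30, 45):                       # external incidence angles
    theta = np.radians(theta_deg)
    lam = 2 * d111 * np.sqrt(n_eff**2 - np.sin(theta)**2)
    print(f"theta = {theta_deg:2d} deg -> reflectance peak ~ {lam*1e9:.0f} nm")
```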
Abstract:
Cellobiohydrolases hydrolyze cellulose, releasing cellobiose units. They are very important for a number of biotechnological applications, such as the production of cellulosic ethanol and cotton fiber processing. The Trichoderma cellobiohydrolase I (CBH1 or Cel7A) is an industrially important exocellulase. It exhibits a typical two-domain architecture, with a small C-terminal cellulose-binding domain and a large N-terminal catalytic core domain, connected by an O-glycosylated linker peptide. The mechanism by which the linker mediates the concerted action of the two domains remains a conundrum. Here, we probe the protein shape and domain organization of the CBH1 of Trichoderma harzianum (ThCel7A) by small-angle X-ray scattering (SAXS) and structural modeling. Our SAXS data show that the ThCel7A linker is partially extended in solution. Structural modeling suggests that this linker conformation is stabilized by inter- and intra-molecular interactions involving the linker peptide and its O-glycosylations. © 2013 Springer Science+Business Media Dordrecht.
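The abstract does not detail the SAXS analysis; as an illustration of a routine first step in probing protein shape from scattering data, the sketch below performs a Guinier fit for the radius of gyration on synthetic data. It is not the authors' modelling pipeline.

```python
# Minimal sketch of a routine SAXS step (Guinier fit for the radius of
# gyration); synthetic data, not the ThCel7A measurements.
import numpy as np

rng = np.random.default_rng(1)
Rg_true, I0 = 3.0, 1.0                      # nm, arbitrary intensity units
q = np.linspace(0.02, 0.4, 80)              # scattering vector, 1/nm
I = I0 * np.exp(-(q * Rg_true) ** 2 / 3) * (1 + rng.normal(0, 0.01, q.size))

# Guinier approximation: ln I(q) = ln I0 - (Rg^2/3) q^2, valid for q*Rg < ~1.3
mask = q * Rg_true < 1.3
slope, intercept = np.polyfit(q[mask] ** 2, np.log(I[mask]), 1)
Rg_fit = np.sqrt(-3 * slope)
print(f"fitted Rg ~ {Rg_fit:.2f} nm (true {Rg_true} nm)")
```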
Abstract:
Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, mixing local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. Then, we add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allowed us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of the growth and Gaussians at long times. We interpret these outcomes in the light of cellular division events: in the early stages, the phenomena are dependent on each other in a multiplicative geometric-based process, whereas they are independent at long times. We conclude that the main ingredients for a good minimalist model of tumor growth, at the mesoscopic level, are intrinsic cooperative mechanisms and competitive search for space. © 2013 Elsevier Ltd.
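As an illustration of the stochastic (local) ingredient, the sketch below grows an Eden-like aggregate on a square lattice by random occupation of perimeter sites; the rules and parameters are illustrative, not the exact model fitted to the HT-29 data.

```python
# Toy sketch in the spirit of the described stochastic model: an Eden-like
# aggregate grown on a square lattice, where each Monte Carlo event occupies
# a randomly chosen empty site on the colony perimeter.
import random

random.seed(0)
L = 101
occupied = {(L // 2, L // 2)}                          # single seed cell
perimeter = {(L // 2 + dx, L // 2 + dy)
             for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))}

growth_curve = []
for t in range(4000):
    site = random.choice(tuple(perimeter))             # pick an empty boundary site
    occupied.add(site)
    perimeter.discard(site)
    x, y = site
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nb = (x + dx, y + dy)
        if nb not in occupied and 0 <= nb[0] < L and 0 <= nb[1] < L:
            perimeter.add(nb)
    growth_curve.append(len(occupied))                 # colony size vs. MC time

print("final colony size:", growth_curve[-1])
```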
Abstract:
The objective of this work is to develop a non-stoichiometric equilibrium model to study parameter effects in the gasification process of a feedstock in downdraft gasifiers. The non-stoichiometric equilibrium model is also known as the Gibbs free energy minimization method. Four models were developed and tested. First, a pure non-stoichiometric equilibrium model called M1 was developed; then the methane content was constrained by correlating experimental data, generating model M2. A kinetic constraint that determines the apparent gasification rate was considered for model M3, and finally the two aforementioned constraints were implemented together in model M4. Models M2 and M4 proved to be the most accurate among the four developed models, with mean RMS (root mean square error) values of 1.25 each. The gasification of Brazilian Pinus elliottii in a downdraft gasifier with air as the gasification agent was also studied. The input parameters considered were: (a) equivalence ratio (0.28-0.35); (b) moisture content (5-20%); (c) gasification time (30-120 min); and (d) carbon conversion efficiency (80-100%). (C) 2014 Elsevier Ltd. All rights reserved.
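A minimal sketch of a non-stoichiometric (Gibbs free energy minimisation) equilibrium calculation in the spirit of model M1 is shown below; the species set, feed elemental composition, temperature and formation energies are placeholders, not the paper's inputs.

```python
# Minimal sketch of Gibbs free energy minimisation subject to element
# balances.  The 298 K standard-state formation energies are used only as
# placeholders; a real gasifier model needs temperature-dependent data.
import numpy as np
from scipy.optimize import minimize

R, T = 8.314e-3, 1073.0                        # kJ/(mol K), assumed temperature
species = ["CO", "CO2", "CH4", "H2", "H2O", "N2"]
dGf = np.array([-137.2, -394.4, -50.5, 0.0, -228.6, 0.0])   # kJ/mol (placeholder)

# element balance matrix: rows are C, H, O, N; columns follow `species`
A = np.array([[1, 1, 1, 0, 0, 0],
              [0, 0, 4, 2, 2, 0],
              [1, 2, 0, 0, 1, 0],
              [0, 0, 0, 0, 0, 2]])
b = np.array([1.0, 2.2, 1.1, 1.5])             # feed elemental moles (assumed)

def total_gibbs(n):
    """Total Gibbs energy: sum over species of n_i*(dGf_i + R*T*ln(y_i))."""
    n = np.clip(n, 1e-10, None)
    return float(np.sum(n * (dGf + R * T * np.log(n / n.sum()))))

res = minimize(total_gibbs, x0=np.full(len(species), 0.5),
               bounds=[(1e-10, None)] * len(species),
               constraints={"type": "eq", "fun": lambda n: A @ n - b},
               method="SLSQP")
for s, n in zip(species, res.x):
    print(f"{s:4s} {n:.3f} mol")
```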
Abstract:
The aim of this work is to develop stoichiometric equilibrium models that permit the study of parameter effects in the gasification process of a particular feedstock. In total, four models were tested in order to determine the syngas composition. One of these four models, called M2, was based on the theoretical equilibrium constants modified by two correction factors determined using published experimental data. The other two models, M3 and M4, were based on correlations: model M4 used correlations to determine the equilibrium constants, while model M3 used correlations that relate the H2, CO and CO2 content of the synthesis gas. Model M2 proved to be the most accurate and versatile among these four models, and also showed better results than some previously published models. A case study for the gasification of a blend of hardwood chips and glycerol at 80% and 20%, respectively, was also performed, considering equivalence ratios from 0.3 to 0.5, moisture contents from 0% to 20% and oxygen percentages in the gasification agent of 100%, 60% and 21%. (C) 2013 Elsevier Ltd. All rights reserved.
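The sketch below illustrates the general structure of a stoichiometric equilibrium model of this family: element balances plus water-gas-shift and methanation equilibria, with multiplicative correction factors on the equilibrium constants. The feedstock formula, K values and correction factors f1, f2 are placeholders, not the fitted M2 values.

```python
# Minimal sketch of a stoichiometric equilibrium gasification model
# (element balances + two equilibrium relations with correction factors).
# All numerical values are illustrative placeholders.
from scipy.optimize import least_squares

a, b = 1.44, 0.66          # biomass as CH_aO_b (assumed composition)
w, m = 0.2, 0.35           # moles of moisture and O2 per mole of carbon (assumed)
K_wgs, K_meth = 0.8, 0.4   # equilibrium constants at the assumed temperature
f1, f2 = 1.0, 1.0          # empirical correction factors (placeholder)

def residuals(x):
    nH2, nCO, nCO2, nH2O, nCH4 = x
    nN2 = 3.76 * m                                   # nitrogen carried by air
    ntot = nH2 + nCO + nCO2 + nH2O + nCH4 + nN2
    return [
        nCO + nCO2 + nCH4 - 1.0,                     # C balance
        2*nH2 + 2*nH2O + 4*nCH4 - (a + 2*w),         # H balance
        nCO + 2*nCO2 + nH2O - (b + w + 2*m),         # O balance
        f1*K_wgs - (nCO2*nH2) / (nCO*nH2O),          # CO + H2O <-> CO2 + H2
        f2*K_meth - (nCH4*ntot) / (nH2**2),          # C + 2 H2 <-> CH4
    ]

sol = least_squares(residuals, x0=[0.6, 0.3, 0.3, 0.2, 0.05],
                    bounds=(1e-6, 5.0))
for name, n in zip(["H2", "CO", "CO2", "H2O", "CH4"], sol.x):
    print(f"{name:4s} {n:.3f} mol per mol of carbon")
```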
Abstract:
Experiments of continuous alcoholic fermentation of sugarcane juice with flocculating yeast recycle were conducted in a system of two 0.22-L tower bioreactors in series, operated at a range of dilution rates (D₁ = D₂ = 0.27-0.95 h⁻¹), constant recycle ratio (α = F_R/F = 4.0) and a sugar concentration in the feed stream (S₀) of around 150 g/L. The data obtained under these experimental conditions were used to adjust the parameters of a mathematical model previously developed for the single-stage process. This model considers each of the tower bioreactors as a perfectly mixed continuous reactor, and the kinetics of cell growth and product formation take into account limitation by substrate and inhibition by ethanol and biomass, as well as substrate consumption for cellular maintenance. The model predictions agreed satisfactorily with the measurements taken in both stages of the cascade. The major differences with respect to the kinetic parameters previously estimated for a single-stage system were observed for the maximum specific growth rate, the inhibition constants of cell growth and the specific rate of substrate consumption for cell maintenance. The mathematical models were validated and used to simulate alternative operating conditions, as well as to analyze the performance of the two-stage process against that of the single-stage process.
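A sketch of this kind of two-stage model is given below: each tower is treated as a perfectly mixed CSTR with Monod-type growth limited by substrate and inhibited by ethanol and biomass, plus maintenance consumption. The kinetic expressions and parameter values are illustrative, and the cell recycle is represented only crudely by a fixed inlet biomass concentration.

```python
# Illustrative two-stage CSTR fermentation model (not the fitted model):
# substrate-limited growth with ethanol and biomass inhibition and a
# maintenance term, integrated to an approximate steady state.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, Ks = 0.4, 5.0           # 1/h, g/L
P_max, X_max = 90.0, 100.0      # g/L (ethanol and biomass inhibition limits)
Yxs, Ypx, ms = 0.05, 4.5, 0.2   # yields and maintenance coefficient (assumed)
D, S0 = 0.3, 150.0              # dilution rate (1/h), feed sugar (g/L)

def mu(S, X, P):
    return mu_max * S / (Ks + S) * max(1 - P / P_max, 0) * max(1 - X / X_max, 0)

def rhs(t, y):
    X1, S1, P1, X2, S2, P2 = y
    def stage(Xin, Sin, Pin, X, S, P):
        g = mu(S, X, P)
        dX = D * (Xin - X) + g * X
        dS = D * (Sin - S) - (g / Yxs + ms) * X
        dP = D * (Pin - P) + Ypx * g * X
        return dX, dS, dP
    d1 = stage(30.0, S0, 0.0, X1, S1, P1)   # stage 1: fresh feed + recycled cells
    d2 = stage(X1, S1, P1, X2, S2, P2)      # stage 2: fed by stage-1 outlet
    return [*d1, *d2]

sol = solve_ivp(rhs, (0, 200), [30, 100, 10, 30, 50, 30], rtol=1e-6)
X1, S1, P1, X2, S2, P2 = sol.y[:, -1]
print(f"stage 1: S = {S1:.1f} g/L, P = {P1:.1f} g/L; "
      f"stage 2: S = {S2:.1f} g/L, P = {P2:.1f} g/L")
```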
Abstract:
This study evaluated the influence of light sources and immersion media on the color stability of a nanofilled composite resin. Conventional halogen, high-power-density halogen and high-power-density light-emitting diode (LED) units were used. There were 4 immersion media: coffee, tea, Coke® and artificial saliva. A total of 180 specimens (10 mm x 2 mm) were prepared, immersed in artificial saliva for 24 h at 37±1ºC, and had their initial color measured with a spectrophotometer according to the CIELab system. Then, the specimens were immersed in the 4 media for 60 days. Data on color change and luminosity were collected and subjected to statistical analysis by the Kruskal-Wallis test (p<0.05). For immersion time, the data were subjected to two-way ANOVA and Fisher's test (p<0.05). The high-power-density LED (ΔE=1.91) promoted color stability of the composite resin similar to that of the tested halogen curing units (Jet Lite 4000 plus: ΔE=2.05; XL 3000: ΔE=2.28). Coffee (ΔE=8.40; ΔL=-5.21) showed the greatest influence on the color stability of the studied composite resin. There was no significant difference in color stability among the light sources, and coffee was the immersion medium that promoted the greatest color changes in the tested composite resin.
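The ΔE and ΔL values quoted suggest the classic CIE76 colour-difference formula; the sketch below shows that computation on made-up CIELab readings, not the study's measurements.

```python
# Sketch of the colour-difference computation implied by the reported ΔE
# values (assuming the CIE76 formula); the readings are invented examples.
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference between two CIELab triplets (L*, a*, b*)."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

baseline = (72.0, 1.5, 18.0)      # hypothetical initial reading
after_60d = (66.8, 3.0, 24.3)     # hypothetical reading after immersion
print(f"dE = {delta_e_cie76(baseline, after_60d):.2f}")
print(f"dL = {after_60d[0] - baseline[0]:.2f}")
```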
Abstract:
Adequate polymerization plays an important role in the longevity of composite resin restorations. Objectives: The aim of this study was to evaluate the effect of light-curing units, curing modes and storage media on the sorption, solubility and biaxial flexural strength (BFS) of a composite resin. Material and Methods: Two hundred and forty specimens were made of one composite resin (Esthet-X) in a stainless steel mold (2 mm x 8 mm Ø) and divided into 24 groups (n=10) established according to the 4 study factors: light-curing units: quartz tungsten halogen (QTH) lamp and light-emitting diode (LED); energy densities: 16 J/cm² and 20 J/cm²; curing modes: conventional (CM) and pulse-delay (PD); and permeants: deionized water and 75% ethanol for 28 days. Sorption and solubility tests were performed according to ISO 4049:2000 specifications. All specimens were then tested for BFS according to the ASTM F394-78 specification. Data were analyzed by three-way ANOVA followed by Tukey, Kruskal-Wallis and Mann-Whitney tests (alpha=0.05). Results: In general, no significant differences were found in sorption, solubility or BFS means for the light-curing units and curing modes (p>0.05). Only the LED unit using 16 J/cm² and PD with 10 s produced higher sorption and solubility values than QTH. Conversely, using CM (16 J/cm²), LED produced lower BFS values than QTH (p<0.05). The 75% ethanol permeant produced higher sorption and solubility values and lower BFS values than water (p<0.05). Conclusion: The ethanol storage medium caused more damage to the composite resin than water. In general, the LED and QTH curing units using 16 and 20 J/cm² in CM and PD curing modes had no influence on the sorption, solubility or BFS of the tested resin.
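For reference, the sketch below shows the sorption and solubility computation as defined in ISO 4049 (results in µg/mm³); the masses and dimensions are invented example numbers, not data from the study.

```python
# Sketch of the ISO 4049 sorption/solubility computation; the masses and
# specimen dimensions below are invented example values.
import math

def sorption_solubility(m1_ug, m2_ug, m3_ug, diameter_mm, thickness_mm):
    """m1: conditioned mass before immersion, m2: mass after immersion,
    m3: reconditioned (dried) mass, all in micrograms."""
    volume = math.pi * (diameter_mm / 2) ** 2 * thickness_mm   # mm^3
    sorption = (m2_ug - m3_ug) / volume
    solubility = (m1_ug - m3_ug) / volume
    return sorption, solubility

wsp, wsl = sorption_solubility(252_000, 254_300, 251_600, 8.0, 2.0)
print(f"sorption = {wsp:.1f} ug/mm^3, solubility = {wsl:.1f} ug/mm^3")
```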
Abstract:
The first part of my work consisted of samplings conducted in nine different localities of the Salento peninsula and Apulia (Italy): Costa Merlata (BR), Punta Penne (BR), Santa Cesarea Terme (LE), Santa Caterina (LE), Torre Inserraglio (LE), Torre Guaceto (BR), Porto Cesareo (LE), Otranto (LE) and Isole Tremiti (FG). I collected data on species percentage cover from the infralittoral rocky zone, using 50x50 cm squares. We considered 3 sites per locality and 10 replicates per site, taken randomly. I then combined these data with other data collected at the same places over several years, in order to carry out a spatial analysis. I thus started from a data set of 1896 samples, but decided not to consider time as a factor, because I have reason to think that over this period the anthropogenic stressors and their effects (if present) did not change considerably. The response variable analysed is the percentage cover of 243 species (subsequently merged into 32 functional groups), including seaweeds, invertebrates, sediment and rock.

After the sampling, I spent two months at the Hopkins Marine Station of Stanford University, in Monterey (California, USA), in Fiorenza Micheli's laboratory, where I carried out statistical analyses on my data set using the software PRIMER 6. My explorative analysis started with an nMDS in PRIMER 6, considering the original data matrix without, for the moment, the effect of stressors. The result shows a good separation between localities and confirms the outcome of the ANOSIM analysis conducted on the original data matrix. The separation is not driven by a geographic pattern; something else must be driving the differences. The presence of at least three groups is clear: one composed of Porto Cesareo, Torre Guaceto and Isole Tremiti (the only marine protected areas considered in this work); another of Otranto; and the last of the remaining small, impacted localities. Within the localities that include MPAs (Marine Protected Areas), it is also possible to observe a sort of grouping between protected and control areas. The SIMPER analysis shows that most of the species driving the differences between populations are not rare species, e.g. Cystoseira spp., Mytilus sp. and ECR. Moreover, I assigned discrete values (0, 1, 2) for each stressor to all the sites considered, according to the intensity with which the anthropogenic factor affects each locality.

I then tried to establish whether there were significant interactions between stressors: using Spearman rank correlation and Spearman significance tables, with 17 degrees of freedom, the outcome shows some significant stressor interactions. Next, I built an nMDS considering the stressors as the response variable. The result was positive: localities are well separated by stressors. Consequently, I related the 'localities and species' matrix to the 'localities and stressors' one. The combination of stressors explains, with a good significance level, the variability within my populations. I tried all the possible data transformations (none, square root, fourth root, log(X+1), P/A), and the fourth root proved to be the best one, with the highest level of significance, meaning that rare species can also influence the result.

The challenge will be to better characterize which kinds of stressors (including natural ones) act on the ecosystem, to assign them more accurate quantitative values, and to understand how they interact (in an additive or non-additive way).
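The analyses above were run in PRIMER 6; the sketch below reproduces an analogous minimal workflow in Python (fourth-root transform, Bray-Curtis dissimilarities, non-metric MDS) on a made-up cover matrix, purely to illustrate the steps.

```python
# Analogous workflow sketch (not the PRIMER 6 analysis): fourth-root
# transform, Bray-Curtis dissimilarity matrix, and non-metric MDS
# ordination on a synthetic percentage-cover matrix.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_samples, n_groups = 30, 32                  # replicates x functional groups
cover = rng.gamma(shape=0.5, scale=10, size=(n_samples, n_groups))  # % cover

transformed = cover ** 0.25                   # fourth root down-weights dominant taxa
dissim = squareform(pdist(transformed, metric="braycurtis"))

nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
           random_state=0, n_init=10)
coords = nmds.fit_transform(dissim)           # 2-D ordination of the samples
print("2-D nMDS stress:", round(nmds.stress_, 3))
```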
Abstract:
The Italian radio telescopes are currently undergoing a major upgrade in response to the growing demand for deep radio observations, such as surveys on large sky areas or observations of vast samples of compact radio sources. The optimised employment of the Italian antennas, originally constructed mainly for VLBI activities and provided with a control system (FS – Field System) not tailored to single-dish observations, required important modifications, in particular to the guiding software and data acquisition system. The production of a completely new control system called ESCS (Enhanced Single-dish Control System) for the Medicina dish started in 2007, in synergy with the software development for the forthcoming Sardinia Radio Telescope (SRT). The aim is to produce a system optimised for single-dish observations in continuum, spectrometry and polarimetry. ESCS is also planned to be installed at the Noto site. A substantial part of this thesis work consisted in designing and developing subsystems within ESCS, in order to provide this software with tools to carry out large maps, spanning from the implementation of On-The-Fly fast scans (following both conventional and innovative observing strategies) to the production of single-dish standard output files and the realisation of tools for the quick-look of the acquired data. The test period coincided with the commissioning phase of two devices temporarily installed – while waiting for the SRT to be completed – on the Medicina antenna: an 18-26 GHz 7-feed receiver and the 14-channel analogue backend developed for its use. It is worth stressing that this is the only K-band multi-feed receiver currently available worldwide. The commissioning of the overall hardware/software system constituted a considerable section of the thesis work. Tests were carried out in order to verify the system stability and its capabilities, down to sensitivity levels which had never been reached at Medicina using the previous observing techniques and hardware devices. The aim was also to assess the scientific potential of the multi-feed receiver for the production of wide maps, exploiting its temporary availability on a mid-sized antenna. Dishes like the 32-m antennas at Medicina and Noto, in fact, offer the best conditions for large-area surveys, especially at high frequencies, as they provide a suitable compromise between beam sizes large enough to cover wide areas of the sky quickly (typical of small-sized telescopes) and sensitivity (typical of large-sized telescopes). The KNoWS (K-band Northern Wide Survey) project is aimed at the realisation of a full-northern-sky survey at 21 GHz; its pilot observations, performed using the new ESCS tools and a peculiar observing strategy, constituted an ideal test-bed for ESCS itself and for the multi-feed/backend system. The KNoWS group, which I am part of, supported the commissioning activities, also providing map-making and source-extraction tools, in order to complete the necessary data reduction pipeline and assess the general scientific capabilities of the system. The K-band observations, which were carried out in several sessions over the December 2008-March 2010 period, were accompanied by the realisation of a 5 GHz test survey during the summertime, which is not suitable for high-frequency observations.
This activity was conceived in order to check the new analogue backend separately from the multi-feed receiver, and to simultaneously produce original scientific data (the 6-cm Medicina Survey, 6MS, a polar cap survey to complete PMN-GB6 and provide an all-sky coverage at 5 GHz).