Abstract:
We present parameter-free calculations of the electronic properties of InGaN, InAlN, and AlGaN alloys. The calculations are based on a generalized quasichemical approach, to account for disorder and composition effects, and on first-principles calculations within density functional theory with the LDA-1/2 approach, to accurately determine the band gaps. We provide precise results for the AlGaN, InGaN, and AlInN band gaps over the entire composition range, together with their respective bowing parameters. (C) 2011 American Institute of Physics. [doi:10.1063/1.3576570]
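For readers unfamiliar with the term, the bowing parameter b mentioned above is conventionally defined through the quadratic interpolation of the alloy band gap between its binary endpoints; the relation below is the standard textbook form, not a result quoted from this abstract.

```latex
% Standard bowing relation for an A_xB_{1-x}N alloy (e.g., In_xGa_{1-x}N);
% b is the bowing parameter reported for each alloy system.
E_g\!\left(A_xB_{1-x}\mathrm{N}\right) = x\,E_g(A\mathrm{N}) + (1-x)\,E_g(B\mathrm{N}) - b\,x(1-x)
```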
Abstract:
We consider the problem of interaction neighborhood estimation from the partial observation of a finite number of realizations of a random field. We introduce a model selection rule to choose estimators of conditional probabilities among natural candidates. Our main result is an oracle inequality satisfied by the resulting estimator. We then use this selection rule in a two-step procedure to evaluate the interaction neighborhoods: the selection rule selects a small prior set of possible interacting points, and a cutting step removes the irrelevant points from this prior set. We also prove that Ising models satisfy the assumptions of the main theorems, without restrictions on the temperature, on the structure of the interaction graph, or on the range of the interactions, which therefore provides a large class of applications for our results. We give a computationally efficient procedure for these models and finally show the practical efficiency of our approach in a simulation study.
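As a rough illustration of the two-step idea described above (and only that: the function below is a toy penalized-likelihood selector over candidate neighborhoods, not the authors' estimator or penalty), one can score subsets of candidate points by the empirical conditional log-likelihood of a site minus a complexity term.

```python
import itertools
import math
from collections import Counter, defaultdict

def select_neighborhood(samples, site, candidates, penalty):
    """Toy sketch: pick the subset of `candidates` that best explains the
    spin at `site` under a penalized empirical log-likelihood.
    `samples` is a list of dicts mapping lattice points to +/-1 spins."""
    best, best_score = set(), float("-inf")
    for k in range(len(candidates) + 1):
        for V in itertools.combinations(candidates, k):
            counts = defaultdict(Counter)
            for s in samples:                      # empirical conditional law
                counts[tuple(s[v] for v in V)][s[site]] += 1
            loglik = sum(n * math.log(n / sum(c.values()))
                         for c in counts.values() for n in c.values())
            score = loglik - penalty * len(V)      # complexity penalty
            if score > best_score:
                best, best_score = set(V), score
    return best
```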
Abstract:
Obesity has been recognized as a worldwide public health problem. It significantly increases the chances of developing several diseases, including Type II diabetes. The roles of insulin and leptin in obesity involve reactions that can be better understood when they are presented step by step. The aim of this work was to design software based on data from some of the most recent publications on obesity, especially those concerning the roles of insulin and leptin in this metabolic disturbance. The most notable characteristic of this software is the use of animations representing the cellular response, together with the presentation of recently discovered mechanisms of insulin and leptin participation in processes leading to obesity. The software was field tested in the web-based Biochemistry of Nutrition course. After using the software and discussing its contents in chatrooms, students were asked to answer an evaluation survey about the whole activity and the usefulness of the software within the learning process. The teaching assistants (TAs) evaluated the software as a tool to help in the teaching process. The students' and TAs' satisfaction was very evident and encouraged us to move forward with the software development and to improve the use of this kind of educational tool in biochemistry classes.
Abstract:
In order for solar energy to serve as a primary energy source, it must be paired with energy storage on a massive scale. At this scale, solar fuels and energy storage in chemical bonds are the only practical approach. Solar fuels are produced in massive amounts by photosynthesis through the reduction of CO2 by water to give carbohydrates, but efficiencies are low. In photosystem II (PSII), the oxygen-producing site for photosynthesis, light absorption and sensitization trigger a cascade of coupled electron-proton transfer events with time scales ranging from picoseconds to microseconds. Oxidative equivalents are built up at the oxygen evolving complex (OEC) for water oxidation by the Kok cycle. A systematic approach to artificial photosynthesis is available based on a "modular approach" in which the separate functions of a final device are studied separately, maximized for rates and stability, and used as modules in constructing integrated devices based on molecular assemblies, nanoscale arrays, self-assembled monolayers, etc. Considerable simplification is available by adopting a "dye-sensitized photoelectrosynthesis cell" (DSPEC) approach inspired by dye-sensitized solar cells (DSSCs). Water oxidation catalysis is a key feature, and significant progress has been made in developing single-site solution and surface catalysts based on polypyridyl complexes of Ru. In this series, ligand variations can be used to tune redox potentials and reactivity over a wide range. Water oxidation electrocatalysis has been extended to chromophore-catalyst assemblies for both water oxidation and DSPEC applications.
Abstract:
A new approach for the integration of dual contactless conductivity and amperometric detection with an electrophoresis microchip system is presented. The PDMS layer with the embedded channels was reversibly sealed to a thin glass substrate (400 µm), on top of which a palladium electrode had previously been fabricated, enabling end-channel amperometric detection. The thin glass substrate also served as a physical wall between the separation channel and the sensing copper electrodes for contactless conductivity detection. The latter were not integrated into the microfluidic device but were fabricated on an independent plastic substrate, allowing a simpler and more cost-effective fabrication of the chip. PDMS/glass chips with only contactless conductivity detection were first characterized in terms of sensitivity, efficiency and reproducibility. The separation efficiency of this system was found to be similar or slightly superior to other systems reported in the literature. The simultaneous determination of ionic and electroactive species was illustrated by the separation of peroxynitrite degradation products, i.e. NO3- (non-electroactive) and NO2- (electroactive), using hybrid PDMS/glass chips with dual contactless conductivity and amperometric detection. While both ions were detected by contactless conductivity detection with good efficiency, NO2- was also simultaneously detected amperometrically with a significant enhancement in sensitivity compared to contactless conductivity detection.
Abstract:
The Brazilian Amazon is one of the most rapidly developing agricultural areas in the world and represents a potentially large future source of greenhouse gases from land clearing and subsequent agricultural management. In an integrated approach, we estimate the greenhouse gas dynamics of natural ecosystems and of agricultural ecosystems after clearing in the context of a future climate. We examine scenarios of deforestation and post-clearing land use to estimate the future (2006-2050) impacts on carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) emissions from the agricultural frontier state of Mato Grosso, using a process-based biogeochemistry model, the Terrestrial Ecosystems Model (TEM). We estimate a net emission of greenhouse gases from Mato Grosso ranging from 2.8 to 15.9 Pg CO2-equivalents (CO2-e) from 2006 to 2050. Deforestation is the largest source of greenhouse gas emissions over this period, but land uses following clearing account for a substantial portion (24-49%) of the net greenhouse gas budget. Due to land-cover and land-use change, there is a small foregone carbon sequestration of 0.2-0.4 Pg CO2-e by natural forests and cerrado between 2006 and 2050. Both deforestation and future land-use management play important roles in the net greenhouse gas emissions of this frontier, suggesting that both should be considered in emissions policies. We find that avoided deforestation remains the best strategy for minimizing future greenhouse gas emissions from Mato Grosso.
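As a brief aside on the unit used above, CO2-equivalents aggregate the three gases through global warming potentials. The sketch below uses assumed 100-year AR4 factors (25 for CH4, 298 for N2O) and hypothetical fluxes for illustration only; they are not the study's numbers, nor necessarily the factors applied in TEM.

```python
# Illustrative conversion of non-CO2 fluxes to CO2-equivalents using
# 100-year global warming potentials (assumed AR4 values).
GWP = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2_equivalents(fluxes_pg):
    """fluxes_pg: dict mapping gas -> cumulative emission in Pg of that gas."""
    return sum(GWP[gas] * mass for gas, mass in fluxes_pg.items())

# Hypothetical example: 10 Pg CO2, 0.05 Pg CH4, 0.005 Pg N2O
print(co2_equivalents({"CO2": 10.0, "CH4": 0.05, "N2O": 0.005}))  # 12.74 Pg CO2-e
```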
Abstract:
The aim of this paper was to study a method based on the gas production technique to measure the biological effects of tannins on rumen fermentation. Six feeds were used as fermentation substrates in a semi-automated gas method: feed A - aroeira (Astronium urundeuva); feed B - jurema preta (Mimosa hostilis); feed C - sorghum grains (Sorghum bicolor); feed D - Tifton-85 (Cynodon sp.); and two others prepared by mixing 450 g sorghum leaves, 450 g concentrate (maize and soybean meal) and 100 g of either acacia (Acacia mearnsii) tannin extract (feed E) or quebracho (Schinopsis lorentzii) tannin extract (feed F) per kg (w:w). Three assays were carried out to standardize the bioassay for tannins. The first assay compared two binding agents (polyethylene glycol - PEG - and polyvinyl polypyrrolidone - PVPP) to attenuate the tannin effects. The complex formed by PEG and tannins was shown to be more stable than that formed by PVPP and tannins. In the second assay, PEG was therefore used as the binding agent, and levels of PEG (0, 500, 750, 1000 and 1250 mg/g DM) were evaluated to minimize the tannin effect. All the tested levels of PEG produced a measurable response to the tannin effects, but the best response was obtained at a dose of 1000 mg/g DM. Using this dose of PEG, the final assay was carried out to test three compounds (tannic acid, quebracho extract and acacia extract) in order to establish a curve of biologically equivalent tannin effect. For this, five levels of each compound were added to 1 g of a standard feed (lucerne hay). The equivalent effect was shown not to be directly related to the chemical analysis for tannins, indicating that different sources of tannins have different activities or reactivities. The curves of biological equivalence can provide information about tannin reactivity, and their use seems to be important as an additional factor to chemical analysis. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
A simultaneous optimization strategy based on a neuro-genetic approach is proposed for the selection of laser-induced breakdown spectroscopy operational conditions for the simultaneous determination of macronutrients (Ca, Mg and P), micronutrients (B, Cu, Fe, Mn and Zn), Al and Si in plant samples. A laser-induced breakdown spectroscopy system equipped with a 10 Hz Q-switched Nd:YAG laser (12 ns, 532 nm, 140 mJ) and an Echelle spectrometer with an intensified charge-coupled device was used. Integration time gate, delay time, amplification gain and number of pulses were optimized. Pellets of spinach leaves (NIST 1570a) were employed as laboratory samples. In order to find a model that could correlate laser-induced breakdown spectroscopy operational conditions with compromise conditions of high peak areas for all elements simultaneously, a Bayesian regularized artificial neural network approach was employed. Subsequently, a genetic algorithm was applied to find optimal conditions for the neural network model, in an approach called neuro-genetic. A single laser-induced breakdown spectroscopy working condition that maximizes the peak areas of all elements simultaneously was obtained with the following optimized parameters: 9.0 µs integration time gate, 1.1 µs delay time, 225 (a.u.) amplification gain and 30 accumulated laser pulses. The proposed approach is a useful and suitable tool for the optimization of such a complex analytical problem. (C) 2009 Elsevier B.V. All rights reserved.
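To make the "neuro-genetic" idea concrete, the sketch below couples a simple genetic algorithm to a surrogate response function that stands in for the trained Bayesian regularized neural network. The parameter bounds, GA settings, and the placeholder surrogate are all assumptions for illustration, not the authors' implementation.

```python
import random

# Toy neuro-genetic sketch: a genetic algorithm searches the LIBS operating
# parameters for the setting that maximizes a surrogate response model.
BOUNDS = {"gate_us": (1.0, 10.0), "delay_us": (0.5, 5.0),
          "gain": (50, 255), "pulses": (10, 50)}

def surrogate(params):
    # Placeholder response; in practice this would be the trained ANN.
    return -sum((v - (lo + hi) / 2) ** 2 / (hi - lo) ** 2
                for (lo, hi), v in zip(BOUNDS.values(), params.values()))

def random_individual():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def mutate(ind, rate=0.2):
    out = dict(ind)
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            out[k] = min(hi, max(lo, out[k] + random.gauss(0, (hi - lo) * 0.1)))
    return out

def crossover(a, b):
    return {k: random.choice((a[k], b[k])) for k in a}

population = [random_individual() for _ in range(40)]
for generation in range(100):
    population.sort(key=surrogate, reverse=True)
    parents = population[:10]                      # elitist selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(30)]
    population = parents + children

print(max(population, key=surrogate))              # best compromise condition found
```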
Abstract:
We present here the sequence of the mitochondrial genome of the basidiomycete phytopathogenic hemibiotrophic fungus Moniliophthora perniciosa, causal agent of Witches' Broom Disease in Theobroma cacao. The DNA is a circular molecule of 109,103 base pairs, with 31.9% GC content, and is the largest sequenced so far. This size is due essentially to the presence of numerous non-conserved hypothetical ORFs. It contains the 14 genes coding for proteins involved in oxidative phosphorylation, the two rRNA genes, one ORF coding for a ribosomal protein (rps3), and a set of 26 tRNA genes that recognize codons for all amino acids. Seven homing endonucleases are located inside introns. Except for atp8, all conserved known genes are in the same orientation. Phylogenetic analysis based on the cox genes agrees with the commonly accepted fungal taxonomy. An uncommon feature of this mitochondrial genome is the presence of a region that contains a set of four relatively small, nested, inverted repeats enclosing two genes coding for polymerases with an invertron-type structure and three conserved hypothetical genes, interpreted as the stable integration of a mitochondrial linear plasmid. The integration of this plasmid seems to be a recent evolutionary event that could have implications in fungal biology. This sequence is available under GenBank accession number AY376688. (c) 2008 The British Mycological Society. Published by Elsevier Ltd. All rights reserved.
Abstract:
A novel strategy for accomplishing zone trapping in flow analysis is proposed. The sample and the reagent solutions are simultaneously inserted into convergent carrier streams, and the established zones merge together before reaching the detector, where the most concentrated portion of the entire sample zone is trapped. The main characteristics, potentialities and limitations of the strategy were critically evaluated in relation to an analogous flow system with zone stopping. When applied to the spectrophotometric determination of nitrite in river waters, the main figures of merit were maintained, except for the sampling frequency, which was calculated as 189 h-1, about 32% higher relative to the analogous system with zone stopping. The inserted sample volume can be increased up to 1.0 mL without affecting the sampling frequency, and no problems with pump heating or malfunction were noted after 8 h of operation of the system. In contrast to zone stopping, only a small portion of the sample zone is halted with zone trapping, leading to these beneficial effects. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
The analysis of one-, two-, and three-dimensional coupled map lattices is developed here from a statistical and dynamical perspective. We show that the three-dimensional CML exhibits low-dimensional behavior with long-range correlation, and that its power spectrum follows 1/f noise. This approach leads to an integrated understanding of the most important properties of these universal models of spatiotemporal chaos. We perform a complete time series analysis of the model and investigate how the signal properties depend on the lattice dimension. (c) 2008 Elsevier Ltd. All rights reserved.
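A minimal numerical sketch of the kind of analysis described above is a 1-D diffusively coupled logistic lattice together with the power spectrum of one site's time series. The lattice size, coupling strength, and map parameter below are arbitrary choices for illustration, not those of the study, which also treats 2-D and 3-D lattices.

```python
import numpy as np

# 1-D diffusively coupled logistic map lattice and single-site power spectrum.
L, T, eps, r = 64, 4096, 0.4, 4.0
f = lambda x: r * x * (1.0 - x)          # local logistic map

x = np.random.rand(L)
series = np.empty(T)
for t in range(T):
    fx = f(x)
    # diffusive coupling to nearest neighbors with periodic boundaries
    x = (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
    series[t] = x[0]

spectrum = np.abs(np.fft.rfft(series - series.mean())) ** 2
freqs = np.fft.rfftfreq(T)
# A log-log plot of `spectrum` against `freqs` shows whether the signal
# approaches 1/f behavior.
```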
Abstract:
This article intends to contribute to the reflection on Educational Statistics as a source for research on the History of Education. The main concern was to reveal how the Educational Statistics relating to the period from 1871 to 1931 were produced by the central government. Official reports from the General Statistics Directory and the statistical yearbooks released by that department were analyzed, and in this analysis the recommendations and definitions guiding how the work was performed were sought. By problematizing the documental issues surrounding Educational Statistics and their usual interpretations, the intention was to reduce the ignorance about the origin of the school numbers, which are occasionally used in current research without the appropriate critical examination.
Abstract:
Electrodeposition of a thin copper layer was carried out on titanium wires in an acidic sulphate bath. The influence of titanium surface preparation, cathodic current density, copper sulphate and sulphuric acid concentrations, electrical charge density and stirring of the solution on the adhesion of the electrodeposits was studied using the Taguchi statistical method. An L16 orthogonal array with six control factors at two levels each and three interactions was employed. The analysis of variance of the mean adhesion response and of the signal-to-noise ratio showed the great influence of cathodic current density on adhesion. In contrast, the other factors, as well as the three investigated interactions, revealed low or no significant effect. From this study, optimized electrolysis conditions were defined. The copper electrocoating improved the electrical conductivity of the titanium wire. This shows that copper-electrocoated titanium wires could be employed for both electrical purposes and mechanical reinforcement in superconducting magnets. (C) 2008 Elsevier B.V. All rights reserved.
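For reference, the signal-to-noise ratio used in a Taguchi analysis of an adhesion response (a larger-is-better characteristic) is usually computed as in the sketch below; the formula is the standard one, while the replicate values are made up for illustration.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-is-better S/N ratio: -10*log10(mean(1/y^2))."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical replicated adhesion responses for one row of an L16 array:
print(sn_larger_is_better([8.2, 7.9, 8.5]))
```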
Abstract:
An implementation of a computational tool to generate new summaries from new source texts is presented, by means of the connectionist approach (artificial neural networks). Among other contributions that this work intends to bring to natural language processing research, the use of a more biologically plausible connectionist architecture and training for automatic summarization is emphasized. This choice relies on the expectation that it may bring an increase in computational efficiency when compared to the so-called biologically implausible algorithms.
Abstract:
The conditions for maximization of the enzymatic activity of lipase entrapped in a sol-gel matrix were determined for different vegetable oils using an experimental design. The effects of pH, temperature, and biocatalyst loading on lipase activity were verified using a central composite experimental design, leading to a set of 13 assays, followed by response surface analysis. For canola oil and entrapped lipase, statistical analyses showed significant effects for pH and temperature, as well as for the interactions between pH and temperature and between temperature and biocatalyst loading. For olive oil and entrapped lipase, pH was verified to be the only statistically significant variable. This study demonstrated that response surface analysis is an appropriate methodology for maximizing the percentage of hydrolysis as a function of pH, temperature, and lipase loading.
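As a hedged sketch of how the response-surface step can be carried out, the snippet below fits the usual full quadratic model to a 13-run central composite design by least squares; the design points and responses are random placeholders rather than the study's data.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Design matrix for the full second-order model in (pH, temperature, loading):
    intercept, linear, two-factor interaction, and squared terms."""
    pH, T, E = X.T
    return np.column_stack([np.ones(len(X)), pH, T, E,
                            pH * T, pH * E, T * E, pH**2, T**2, E**2])

X = np.random.rand(13, 3)          # 13 runs of (pH, temperature, biocatalyst loading)
y = np.random.rand(13)             # measured hydrolysis percentages (placeholder)
coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
# `coeffs` holds the fitted surface, whose stationary point indicates the
# factor settings that maximize (or minimize) the predicted hydrolysis.
```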