961 results for "Fast Algorithm"
Abstract:
Eucalyptus is the dominant and most productive planted forest in Brazil, covering around 3.4 million ha for the production of charcoal, pulp, sawtimber, timber plates, wood foils, plywood and for building purposes. At the early establishment of the forest plantations, during the second half of the 1960s, the eucalypt yield was 10 m³ ha⁻¹ y⁻¹. Now, as a result of investments in research and technology, the average productivity is 38 m³ ha⁻¹ y⁻¹. The productivity restrictions are related to the following environmental factors, in order of importance: water deficits > nutrient deficiency > soil depth and strength. Clonal forests have been fundamental on sites with greater water and nutrient restrictions, where they outperform those established from traditional seed-based planting stock. When the environmental limitations are small, the productivities of plantations based on clones or seeds appear to be similar. In the long term there are risks to sustainability, because of the low fertility and low reserves of primary minerals in the soils, which are commonly loamy and clayey oxisols and ultisols. A decline in soil quality is usually caused by management that does not conserve soil and site resources, that damages soil physical and chemical characteristics, or that applies insufficient or unbalanced fertiliser. The problem is more serious when fast-growing genotypes are planted, which have a high nutrient demand and uptake capacity, and therefore a high nutrient output through harvesting. The need to mobilise less soil by providing more cover and protection, reduce nutrient and organic matter losses, preserve crucial physical properties such as permeability (root growth, infiltration and aeration), improve weed control and reduce costs has led to a progressive increase in the use of minimum cultivation practices over the last 20 years, which has been accepted as a good alternative for maintaining or increasing site quality in the long term.
In this paper we provide a synthesis and critical appraisal of the research results and practical implications of early silvicultural management on long-term site productivity of fast-growing eucalypt plantations arising from the Brazilian context.
Abstract:
By allowing the estimation of forest structural and biophysical characteristics at different temporal and spatial scales, remote sensing may contribute to our understanding and monitoring of planted forests. Here, we studied 9-year time-series of the Normalized Difference Vegetation Index (NDVI) from the Moderate Resolution Imaging Spectroradiometer (MODIS) on a network of 16 stands in fast-growing Eucalyptus plantations in Sao Paulo State, Brazil. We aimed to examine the relationships between NDVI time-series spanning entire rotations and stand structural characteristics (volume, dominant height, mean annual increment) in these simple forest ecosystems. Our second objective was to examine spatial and temporal variations of light use efficiency for wood production, by comparing time-series of Absorbed Photosynthetically Active Radiation (APAR) with inventory data. Relationships were calibrated between the NDVI and the fractions of intercepted diffuse and direct radiation, using hemispherical photographs taken on the studied stands at two seasons. APAR was calculated from the NDVI time-series using these relationships. Stem volume and dominant height were strongly correlated with summed NDVI values between planting date and inventory date. Stand productivity was correlated with mean NDVI values. APAR during the first 2 years of growth was variable between stands and was well correlated with stem wood production (r² = 0.78). In contrast, APAR during the following years was less variable and not significantly correlated with stem biomass increments. Production of wood per unit of absorbed light varied with stand age and with site index. In our study, a better site index was accompanied both by increased APAR during the first 2 years of growth and by higher light use efficiency for stem wood production during the whole rotation. Implications for simple process-based modelling are discussed. (C) 2009 Elsevier B.V. All rights reserved.
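The index behind this study is straightforward to compute. As an illustrative sketch only (the composite reflectance values below are hypothetical, not taken from the paper), the summed-NDVI quantity that the study correlates with stem volume and dominant height can be written as:

```python
# Minimal sketch of NDVI and cumulative NDVI over part of a rotation.
# Reflectance pairs (NIR, red) are hypothetical example values.
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical satellite composites between planting and inventory dates
series = [(0.30, 0.12), (0.35, 0.10), (0.42, 0.08)]
ndvi_series = [ndvi(nir, red) for nir, red in series]

# Cumulative NDVI between planting and inventory, the quantity the study
# found strongly correlated with stem volume and dominant height
summed_ndvi = sum(ndvi_series)
```

In the paper itself the NDVI comes directly from MODIS products; this sketch only shows how the summed index is formed once a per-stand time-series is in hand.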
Abstract:
A method was optimized for the analysis of omeprazole (OMZ) by ultra-high-speed LC with diode array detection using a monolithic Chromolith Fast Gradient RP 18 endcapped column (50 × 2.0 mm i.d.). The analyses were performed at 30 °C using a mobile phase consisting of 0.15% (v/v) trifluoroacetic acid (TFA) in water (solvent A) and 0.15% (v/v) TFA in acetonitrile (solvent B) under a linear gradient of 5 to 90% B in 1 min at a flow rate of 1.0 mL/min and detection at 220 nm. Under these conditions, the OMZ retention time was approximately 0.74 min. Validation parameters, such as selectivity, linearity, precision, accuracy, and robustness, showed results within the acceptance criteria. The method developed was successfully applied to OMZ enteric-coated pellets, showing that this assay can be used in the pharmaceutical industry for routine QC analysis. Moreover, the analytical conditions established allow for the simultaneous analysis of the OMZ metabolites 5-hydroxyomeprazole and omeprazole sulfone in the same run, showing that this method can be extended to other matrixes with adequate procedures for sample preparation.
Abstract:
Despite the need to differentiate chemical species of mercury in clinical specimens, there are a limited number of methods for this purpose. This paper therefore describes a simple method for the determination of methylmercury and inorganic mercury in blood by liquid chromatography coupled to inductively coupled plasma mass spectrometry (LC-ICP-MS) with a fast sample preparation procedure. Prior to analysis, blood (250 µL) is accurately weighed into 15-mL conical tubes. Then, an extractant solution containing mercaptoethanol, L-cysteine and HCl is added to the samples, followed by sonication for 15 min. Quantitative mercury extraction was achieved with the proposed procedure. Separation of mercury species was accomplished in less than 5 min on a C18 reverse-phase column with a mobile phase containing 0.05% (v/v) mercaptoethanol, 0.4% (m/v) L-cysteine, 0.06 mol L⁻¹ ammonium acetate and 5% (v/v) methanol. The method detection limits were found to be 0.25 µg L⁻¹ and 0.1 µg L⁻¹ for inorganic mercury and methylmercury, respectively. Method accuracy is traceable to Standard Reference Material (SRM) 966 Toxic Metals in Bovine Blood from the National Institute of Standards and Technology (NIST). The proposed method was also applied to the speciation of mercury in blood samples collected from fish-eating communities and from rats exposed to thimerosal. With the proposed method there is a considerable reduction in the time of sample preparation prior to speciation of Hg by LC-ICP-MS. Finally, applying the proposed method, we demonstrated an interesting in vivo conversion of ethylmercury to inorganic mercury. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
A simple method for mercury speciation in hair samples with a fast sample preparation procedure using high-performance liquid chromatography coupled to inductively coupled plasma mass spectrometry is proposed. Prior to analysis, 50 mg of hair sample was accurately weighed into 15 mL conical tubes. Then, an extractant solution containing mercaptoethanol, L-cysteine and HCl was added to the samples, followed by sonication for 10 min. Quantitative mercury extraction was achieved with the proposed procedure. Separation of inorganic mercury (Ino-Hg), methylmercury (Met-Hg) and ethylmercury (Et-Hg) was accomplished in less than 8 min on a C18 reverse-phase column with a mobile phase containing 0.05% v/v mercaptoethanol, 0.4% m/v L-cysteine, 0.06 mol L⁻¹ ammonium acetate and 5% v/v methanol. The method detection limits were found to be 15 ng g⁻¹, 10 ng g⁻¹ and 38 ng g⁻¹ for inorganic mercury, methylmercury and ethylmercury, respectively. Sample throughput is 4 samples h⁻¹ (in duplicate). A considerable improvement in the time of analysis was achieved compared with other published methods. Method accuracy is traceable to Certified Reference Materials (CRMs) 85 and 86 (human hair) from the International Atomic Energy Agency (IAEA). Finally, the proposed method was successfully applied to the speciation of mercury in hair samples collected from fish-eating communities of the Brazilian Amazon.
Abstract:
A simple and fast method is described for the simultaneous determination of methylmercury (MeHg), ethylmercury (Et-Hg) and inorganic mercury (Ino-Hg) in blood samples by capillary gas chromatography-inductively coupled plasma mass spectrometry (GC-ICP-MS) after derivatization and alkaline digestion. Closed-vessel microwave-assisted digestion conditions with tetramethylammonium hydroxide (TMAH) were optimized. Derivatization by ethylation and by propylation was also evaluated and compared. The absolute detection limits (using a 1 µL injection) obtained by GC-ICP-MS were 40 fg for both MeHg and Ino-Hg with ethylation, and 50, 20 and 50 fg for MeHg, Et-Hg and Ino-Hg, respectively, with propylation. Method accuracy is traceable to Standard Reference Material (SRM) 966 Toxic Metals in Bovine Blood from the National Institute of Standards and Technology (NIST). Additional validation is provided by comparison of the results obtained for mercury speciation in blood samples with the proposed procedure and with a previously reported LC-ICP-MS method. With the new procedure no tedious clean-up steps are required, and a considerable improvement in the time of analysis was achieved compared with other methods using GC separation.
Abstract:
The generalized Gibbs sampler (GGS) is a recently developed Markov chain Monte Carlo (MCMC) technique that enables Gibbs-like sampling of state spaces that lack a convenient representation in terms of a fixed coordinate system. This paper describes a new sampler, called the tree sampler, which uses the GGS to sample from a state space consisting of phylogenetic trees. The tree sampler is useful for a wide range of phylogenetic applications, including Bayesian, maximum likelihood, and maximum parsimony methods. A fast new algorithm to search for a maximum parsimony phylogeny is presented, using the tree sampler in the context of simulated annealing. The mathematics underlying the algorithm is explained and its time complexity is analyzed. The method is tested on two large data sets consisting of 123 sequences and 500 sequences, respectively. The new algorithm is shown to compare very favorably in terms of speed and accuracy to the program DNAPARS from the PHYLIP package.
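The abstract does not reproduce the tree sampler itself, but the simulated-annealing loop it plugs into follows the standard accept/reject rule. A minimal generic sketch (toy integer state and cost function standing in for a tree and its parsimony score, which the paper's method would supply):

```python
import math
import random

# Generic simulated-annealing skeleton: always accept downhill moves,
# accept uphill moves with Boltzmann probability exp(-delta / T), and
# geometrically cool the temperature. In the paper the state would be a
# phylogenetic tree and cost() its parsimony score.
def anneal(state, cost, propose, t0=1.0, cooling=0.995, steps=2000, seed=0):
    rng = random.Random(seed)
    best = cur = state
    best_c = cur_c = cost(cur)
    t = t0
    for _ in range(steps):
        cand = propose(cur, rng)
        c = cost(cand)
        if c <= cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        t *= cooling  # geometric cooling schedule
    return best, best_c

# Toy stand-in problem: minimise a sum of squares over an integer vector
cost = lambda v: sum(x * x for x in v)
propose = lambda v, rng: tuple(x + rng.choice((-1, 0, 1)) for x in v)
best, best_c = anneal((5, -4, 3), cost, propose)
```

The names `anneal`, `cost` and `propose` are illustrative choices, not the paper's; its contribution lies in the GGS-based proposal moves over tree space, which this sketch abstracts away.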
Fast Structure-Based Assignment of 15N HSQC Spectra of Selectively 15N-Labeled Paramagnetic Proteins
Abstract:
A novel strategy for fast NMR resonance assignment of N-15 HSQC spectra of proteins is presented. It requires the structure coordinates of the protein, a paramagnetic center, and one or more residue-selectively N-15-labeled samples. Comparison of sensitive undecoupled N-15 HSQC spectra recorded of paramagnetic and diamagnetic samples yields data for every cross-peak on pseudocontact shift, paramagnetic relaxation enhancement, cross-correlation between Curie-spin and dipole-dipole relaxation, and residual dipolar coupling. Comparison of these four different paramagnetic quantities with predictions from the three-dimensional structure simultaneously yields the resonance assignment and the anisotropy of the susceptibility tensor of the paramagnetic center. The method is demonstrated with the 30 kDa complex between the N-terminal domain of the epsilon subunit and the theta subunit of Escherichia coli DNA polymerase III. The program PLATYPUS was developed to perform the assignment, provide a measure of reliability of the assignment, and determine the susceptibility tensor anisotropy.
Abstract:
We present a fast method for finding optimal parameters for a low-resolution (threading) force field intended to distinguish correct from incorrect folds for a given protein sequence. In contrast to other methods, the parameterization uses information from >10⁷ misfolded structures as well as a set of native sequence-structure pairs. In addition to testing the resulting force field's performance on the protein sequence threading problem, results are shown that characterize the number of parameters necessary for effective structure recognition.
Abstract:
Evolution strategies are a class of general optimisation algorithms which are applicable to functions that are multimodal, nondifferentiable, or even discontinuous. Although recombination operators have been introduced into evolution strategies, the primary search operator is still mutation. Classical evolution strategies rely on Gaussian mutations. A new mutation operator based on the Cauchy distribution is proposed in this paper. It is shown empirically that the new evolution strategy based on Cauchy mutation outperforms the classical evolution strategy on most of the 23 benchmark problems tested in this paper. The paper also shows empirically that changing the order of mutating the objective variables and mutating the strategy parameters does not alter the previous conclusion significantly, and that Cauchy mutations with different scaling parameters still outperform the Gaussian mutation with self-adaptation. However, the advantage of Cauchy mutations disappears when recombination is used in evolution strategies. It is argued that the search step size plays an important role in determining evolution strategies' performance. The large step size of recombination plays a similar role as Cauchy mutation.
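The two mutation operators compared above differ only in the noise distribution added to an objective variable. A hedged sketch (variable names and the single-variable framing are illustrative, not from the paper; a standard Cauchy variate is drawn by inverse transform):

```python
import math
import random

# Classical Gaussian mutation vs. the proposed Cauchy mutation, applied to
# one objective variable x with step size eta.
def gaussian_mutate(x, eta, rng):
    return x + eta * rng.gauss(0.0, 1.0)

def cauchy_mutate(x, eta, rng):
    # Standard Cauchy variate via inverse transform: tan(pi * (u - 1/2))
    u = rng.random()
    return x + eta * math.tan(math.pi * (u - 0.5))

rng = random.Random(42)
# The Cauchy distribution's heavy tails occasionally produce very large
# jumps, the larger search step size the paper argues helps escape local
# optima on multimodal functions.
g_steps = [abs(gaussian_mutate(0.0, 1.0, rng)) for _ in range(1000)]
c_steps = [abs(cauchy_mutate(0.0, 1.0, rng)) for _ in range(1000)]
```

Comparing the largest step in each sample makes the difference concrete: over a thousand draws the Gaussian rarely exceeds a few step sizes, while the Cauchy routinely produces jumps an order of magnitude larger.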
Abstract:
Motivation: Prediction methods for identifying binding peptides could minimize the number of peptides required to be synthesized and assayed, and thereby facilitate the identification of potential T-cell epitopes. We developed a bioinformatic method for the prediction of peptide binding to MHC class II molecules. Results: Experimental binding data and expert knowledge of anchor positions and binding motifs were combined with an evolutionary algorithm (EA) and an artificial neural network (ANN): binding data extraction --> peptide alignment --> ANN training and classification. This method, termed PERUN, was implemented for the prediction of peptides that bind to HLA-DR4(B1*0401). The respective positive predictive values of PERUN predictions of high-, moderate-, low- and zero-affinity binders were assessed as 0.8, 0.7, 0.5 and 0.8 by cross-validation, and 1.0, 0.8, 0.3 and 0.7 by experimental binding. This illustrates the synergy between experimentation and computer modeling, and its application to the identification of potential immunotherapeutic peptides.
Abstract:
To translate and transfer solution data between two totally different meshes (i.e. mesh 1 and mesh 2), a consistent point-searching algorithm for solution interpolation in unstructured meshes consisting of 4-node bilinear quadrilateral elements is presented in this paper. The proposed algorithm has the following significant advantages: (1) The use of a point-searching strategy allows a point in one mesh to be accurately related to an element (containing this point) in another mesh. Thus, to translate/transfer the solution of any particular point from mesh 2 to mesh 1, only one element in mesh 2 needs to be inversely mapped. This certainly minimizes the number of elements to which the inverse mapping is applied. In this regard, the present algorithm is very effective and efficient. (2) Analytical solutions for the local coordinates of any point in a 4-node quadrilateral element, which are derived in a rigorous mathematical manner in the context of this paper, make it possible to carry out the inverse mapping process very effectively and efficiently. (3) The use of consistent interpolation enables the interpolated solution to be compatible with the original solution and therefore guarantees an interpolated solution of extremely high accuracy. After the mathematical formulations of the algorithm are presented, the algorithm is tested and validated through a challenging problem. The related results from the test problem demonstrate the generality, accuracy, effectiveness, efficiency and robustness of the proposed consistent point-searching algorithm. Copyright (C) 1999 John Wiley & Sons, Ltd.
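The inverse mapping step above can be made concrete. The paper derives closed-form expressions for the local coordinates; the sketch below instead uses a short Newton iteration on the standard bilinear shape functions, which recovers the same (xi, eta) for a point inside the element (function and variable names are illustrative, not the paper's):

```python
# Inverse mapping for a 4-node bilinear quadrilateral: given a physical
# point (x, y) and the four corner nodes (counterclockwise, matching local
# corners (-1,-1), (1,-1), (1,1), (-1,1)), recover local coordinates
# (xi, eta) by Newton iteration.
def inverse_map(point, nodes, tol=1e-12, max_iter=50):
    x, y = point
    xi = eta = 0.0  # start at the element centre
    for _ in range(max_iter):
        # Bilinear shape functions and their derivatives at (xi, eta)
        n = [(1 - xi) * (1 - eta) / 4, (1 + xi) * (1 - eta) / 4,
             (1 + xi) * (1 + eta) / 4, (1 - xi) * (1 + eta) / 4]
        dn_dxi = [-(1 - eta) / 4, (1 - eta) / 4, (1 + eta) / 4, -(1 + eta) / 4]
        dn_deta = [-(1 - xi) / 4, -(1 + xi) / 4, (1 + xi) / 4, (1 - xi) / 4]
        # Residual of the isoparametric map minus the target point
        fx = sum(ni * xn for ni, (xn, _) in zip(n, nodes)) - x
        fy = sum(ni * yn for ni, (_, yn) in zip(n, nodes)) - y
        if abs(fx) < tol and abs(fy) < tol:
            break
        # 2x2 Jacobian of the map, inverted by Cramer's rule
        j11 = sum(d * xn for d, (xn, _) in zip(dn_dxi, nodes))
        j12 = sum(d * xn for d, (xn, _) in zip(dn_deta, nodes))
        j21 = sum(d * yn for d, (_, yn) in zip(dn_dxi, nodes))
        j22 = sum(d * yn for d, (_, yn) in zip(dn_deta, nodes))
        det = j11 * j22 - j12 * j21
        xi -= (j22 * fx - j12 * fy) / det
        eta -= (-j21 * fx + j11 * fy) / det
    return xi, eta
```

Once (xi, eta) is known, the consistent interpolation the paper describes is simply the same shape functions evaluated at those coordinates and applied to the nodal solution values of the containing element in mesh 2.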
Abstract:
The amygdala is intimately involved in emotional behavior, and its role in the generation of anxiety and conditioned fear is well known. Benzodiazepines, which are commonly used for the relief of anxiety, are thought to act by enhancing the action of the inhibitory transmitter GABA. We have examined the properties of GABA-mediated inhibition in the amygdala. Whole-cell recordings were made from neurons in the lateral division of the central amygdala. Application of GABA evoked a current that reversed at the chloride equilibrium potential. Application of the GABA antagonists bicuculline or SR95531 inhibited the GABA-evoked current in a manner consistent with two binding sites. Stimulation of afferents to neurons in the central amygdala evoked an IPSC that was mediated by the release of GABA. The GABA(A) receptor antagonists bicuculline and picrotoxin failed to completely block the IPSC. The bicuculline-resistant IPSC was chloride-selective and was unaffected by GABA(B)-receptor antagonists. Furthermore, this current was insensitive to modulation by general anesthetics or barbiturates. In contrast to their actions at GABA(A) receptors, diazepam and flurazepam inhibited the bicuculline-resistant IPSC in a concentration-dependent manner. These effects were fully antagonized by the benzodiazepine site antagonist Ro15-1788. We conclude that a new type of ionotropic GABA receptor mediates fast inhibitory transmission in the central amygdala. This receptor may be a potential target for the development of new therapeutic strategies for anxiety disorders.
Abstract:
In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipeline implementations in signal processing. Here, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound for the highest order of stable filters is given. The minimum-order stable linear predictor is then obtained by solving an optimization problem. The popular genetic algorithm approach is adopted, since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
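The genetic-algorithm machinery the paper adopts can be sketched generically. The following is a hedged toy illustration, not the paper's formulation: the cost function is a stand-in (distance of a coefficient vector from a fixed target) rather than the filter-design objective, and all names and parameter values are illustrative:

```python
import random

# Minimal genetic algorithm: truncation selection with elitism, one-point
# crossover, and occasional Gaussian mutation, minimising a toy cost.
def ga_minimise(cost, dim, pop_size=30, gens=100, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]          # keep the best half (elitism)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim) if dim > 1 else 0
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.2:              # mutation probability 0.2
                child[rng.randrange(dim)] += rng.gauss(0.0, 0.1)
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

# Toy objective: recover a hypothetical target coefficient vector
target = [0.5, -0.25, 0.125]
cost = lambda c: sum((ci - ti) ** 2 for ci, ti in zip(c, target))
best = ga_minimise(cost, dim=3)
```

In the paper the chromosome would encode candidate predictor coefficients and the cost would include the stability constraint; this sketch only shows the search loop itself.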