957 results for root sampling methods


Relevance: 30.00%

Publisher:

Abstract:

The uncertainty of any analytical determination depends on both analysis and sampling. Uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy's sampling theory is currently the most complete theory about sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work, Gy's sampling theory was applied to several cases, including the analysis of chromite concentration estimated on SEM (Scanning Electron Microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy's sampling theory can be utilized in both of the above-mentioned cases and that the uncertainties achieved are reliable. Variographic experiments, introduced in Gy's sampling theory, are beneficially applied in analyzing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of these kinds of processes can be observed by variographic analysis, as well as with fast Fourier transformation and auto-correlation functions. With variographic analysis, the uncertainties are estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, as it is then easy to estimate how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources will be used; on the other hand, if the frequency is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments.
Since spectral data are multivariate, methods such as Principal Component Analysis (PCA) are needed when the data are analyzed. Optimization of a sampling plan increases the reliability of the analytical process, which may ultimately have beneficial effects on the economics of chemical analysis.
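As a sketch of the variographic idea, the empirical variogram of an auto-correlated series estimates the mean squared difference between observations as a function of the lag, V(j) = Σ(x[i+j] − x[i])² / (2(N − j)). The following minimal illustration is not tied to any particular data set from this work:

```python
import numpy as np

def variogram(x, max_lag):
    """Empirical variogram V(j) = sum_i (x[i+j]-x[i])^2 / (2*(N-j)) for lags 1..max_lag."""
    x = np.asarray(x, dtype=float)
    return np.array([np.mean((x[j:] - x[:-j]) ** 2) / 2.0
                     for j in range(1, max_lag + 1)])
```

For an auto-correlated process the variogram rises with the lag and levels off near the process variance; the lag at which it flattens shows how widening the sampling interval inflates the overall uncertainty.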

Relevance: 30.00%

Publisher:

Abstract:

Understanding the factors controlling fine root respiration (FRR) at different temporal scales will help to improve our knowledge about the spatial and temporal variability of soil respiration (SR) and to improve future predictions of CO2 effluxes to the atmosphere. Here we present a comparative study of how FRR responds to variability in soil temperature and moisture in two widespread species, Scots pine (Pinus sylvestris L.) and Holm oak (HO; Quercus ilex L.). These two species show contrasting water-use strategies during the extreme summer-drought conditions that characterize the Mediterranean climate. The study was carried out in a mixed Mediterranean forest where Scots pines affected by drought-induced die-back are slowly being replaced by the more drought-resistant HO. FRR was measured in spring and early fall 2013 in excised roots freshly removed from the soil, collected under HO and under Scots pines at three different health stages: dead (D), defoliated (DP) and non-defoliated (NDP). Variations in soil temperature, soil water content and daily mean assimilation per tree were also recorded to evaluate FRR sensitivity to abiotic and biotic environmental variations. Our results show that values of FRR were substantially lower under HO (1.26 ± 0.16 µg CO2/(g root·min)) than under living pines (1.89 ± 0.19 µg CO2/(g root·min)), which disagrees with the similar rates of soil respiration previously observed under both canopies and suggests that the FRR contribution to total SR varies under different tree species. The similarity of FRR rates under HO and DP furthermore confirms previous studies suggesting a recent Holm oak root colonization of the gaps under dead trees. A linear mixed-effect model approach indicated that seasonal variations in FRR were best explained by soil temperature (p < 0.05), while soil moisture did not exert any direct control over FRR, despite the low soil moisture values during the summer sampling.
Plant assimilation rates were positively related to FRR, explaining part of the observed variability (p < 0.01). However, the positive relation of FRR with plant assimilation occurred mainly during spring, when both soil moisture and plant assimilation rates were higher. Finally, our results suggest that plants might be able to maintain relatively high rates of FRR during the sub-optimal abiotic and biotic summer conditions, probably thanks to their capacity to re-mobilize carbon reserves and to passively move water by hydraulic lift from moister layers to upper layers with lower water potentials (where the fine roots were collected).

Relevance: 30.00%

Publisher:

Abstract:

The etiology and epidemiology of Pythium root rot in hydroponically grown crops are reviewed with emphasis on knowledge and concepts considered important for managing the disease in commercial greenhouses. Pythium root rot continually threatens the productivity of numerous kinds of crops in hydroponic systems around the world, including cucumber, tomato, sweet pepper, spinach, lettuce, nasturtium, arugula, rose, and chrysanthemum. Principal causal agents include Pythium aphanidermatum, Pythium dissotocum, members of Pythium group F, and Pythium ultimum var. ultimum. Perspectives are given on sources of initial inoculum of Pythium spp. in hydroponic systems, infection and colonization of roots by the pathogens, symptom development and inoculum production in host roots, and inoculum dispersal in nutrient solutions. Recent findings that a specific elicitor produced by P. aphanidermatum may trigger necrosis (browning) of the roots and the transition from biotrophic to necrotrophic infection are considered. Effects on root rot epidemics of host factors (disease susceptibility, phenological growth stage, root exudates and phenolic substances), the root environment (rooting media, concentrations of dissolved oxygen and phenolic substances in the nutrient solution, microbial communities and temperature) and human interventions (cropping practices and control measures) are reviewed. Recent findings on the predisposition of roots to Pythium attack by environmental stress factors are highlighted. The commonly minor impact on epidemics of measures to disinfest the nutrient solution as it recirculates outside the crop is contrasted with the impact of treatments that suppress Pythium in the roots and root zone of the crop. New discoveries that infection of roots by P. aphanidermatum markedly slows the increase in leaf area and whole-plant carbon gain, without significant effect on the efficiency of photosynthesis per unit area of leaf, are noted.
The platform of knowledge and understanding of the etiology and epidemiology of root rot, and of its effects on the physiology of the whole plant, is discussed in relation to new research directions and the development of better practices to manage the disease in hydroponic crops. Focus is on methods and technologies for tracking Pythium and root rot, and on developing, integrating, and optimizing treatments to suppress the pathogen in the root zone and the progress of root rot.

Relevance: 30.00%

Publisher:

Abstract:

This study uses several measures derived from the error matrix to compare two thematic maps generated with the same sample set. The reference map was generated with all the sample elements, and the map taken as the model was generated without the two points detected as influential by local influence diagnostics. The data analyzed refer to wheat productivity in an agricultural area of 13.55 ha, sampled on a 50 x 50 m grid comprising 50 georeferenced sample elements. The comparison measures derived from the error matrix indicated that, despite some similarity between the maps, they are different. The difference between the production estimated by the reference map and the actual production was 350 kilograms, whereas the same difference calculated with the model map was 50 kilograms, indicating that the study of influential points is of fundamental importance for obtaining a more reliable estimate, and that measures obtained from the error matrix are a good option for comparing thematic maps.
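For readers unfamiliar with error-matrix measures, two of the standard comparison measures derivable from an error (confusion) matrix, overall agreement and Cohen's kappa, can be computed as below. The matrix values in the test are illustrative, not this study's data:

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall agreement and Cohen's kappa from an error (confusion) matrix."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                         # observed (overall) agreement
    pe = (cm.sum(0) * cm.sum(1)).sum() / n ** 2   # agreement expected by chance
    return po, (po - pe) / (1 - pe)
```

Kappa discounts the agreement expected by chance, which is why it is preferred over raw overall accuracy when comparing two thematic maps class by class.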

Relevance: 30.00%

Publisher:

Abstract:

Subsurface drip irrigation with domestic sewage effluent (DSE) has been recommended because it can achieve high irrigation efficiency and faster use of salts in comparison with other irrigation methods. The study aimed at evaluating the area, the length and the effective depth of the root system of sugarcane irrigated with DSE by a subsurface drip system at different irrigation rates, at depths of 0.00-0.20, 0.20-0.40, 0.40-0.60 and 0.60-0.80 m. The experiment was carried out in the municipality of Piracicaba, in the state of São Paulo (SP), Brazil, in a sugarcane area irrigated with DSE, in randomized complete blocks set up in furrows, with three replications and four treatments: one area without irrigation (AWI) and three irrigated areas meeting 50% (T50%), 100% (T100%) and 200% (T200%) of the crop's water need between irrigation rounds. T100% and T200% produced smaller root areas and lengths in the two deepest layers compared to AWI and T50%, which stimulated the development of deeper roots due to water stress. AWI, T100% and T200% presented 80% of the roots within a depth of 0.40 m, and the T50% treatment presented 76.43% of the total roots.

Relevance: 30.00%

Publisher:

Abstract:

The strongest wish of the customer concerning chemical pulp features is consistent, uniform quality. Variation may be controlled and reduced by using statistical methods. However, studies addressing the application and benefits of statistical methods in the forest product sector are scarce. Thus, this customer wish is the root cause of the motivation behind this dissertation. The research problem addressed by this dissertation is that companies in the chemical forest product sector require new knowledge to improve their utilization of statistical methods. To gain this new knowledge, the research problem is studied from five complementary viewpoints: challenges and success factors, organizational learning, problem solving, economic benefit, and statistical methods as management tools. The five research questions generated on the basis of these viewpoints are answered in four research papers, which are case studies based on empirical data collection. This research as a whole complements the literature dealing with the use of statistical methods in the forest products industry. Practical examples of the application of statistical process control, case-based reasoning, the cross-industry standard process for data mining, and performance measurement methods in the context of chemical forest products manufacturing are brought to the knowledge of the scientific community, and the benefit of applying these methods is estimated or demonstrated. The purpose of this dissertation is to find pragmatic ideas that help companies in the chemical forest product sector improve their utilization of statistical methods. The main practical implications of this doctoral dissertation can be summarized in four points: (1) it is beneficial to reduce variation in chemical forest product manufacturing processes; (2) statistical tools can be used to reduce this variation; (3) problem-solving in chemical forest product manufacturing processes can be intensified through the use of statistical methods; and (4) there are certain success factors and challenges that need to be addressed when implementing statistical methods.
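As a minimal illustration of the statistical process control mentioned above, an individuals control chart flags measurements outside 3-sigma limits estimated from the average moving range. The function name and the pulp-quality data in the test are hypothetical, a sketch rather than anything from the dissertation:

```python
import numpy as np

def shewhart_limits(samples):
    """3-sigma limits for an individuals (Shewhart) control chart.

    Sigma is estimated from the mean moving range divided by the
    d2 constant for subgroups of size 2 (d2 = 1.128).
    """
    x = np.asarray(samples, dtype=float)
    mr = np.abs(np.diff(x))        # moving ranges between consecutive points
    sigma = mr.mean() / 1.128
    center = x.mean()
    return center - 3 * sigma, center, center + 3 * sigma
```

Points falling outside the returned limits signal special-cause variation, the kind of variation the dissertation argues should be hunted down and removed from pulp manufacturing processes.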

Relevance: 30.00%

Publisher:

Abstract:

A simple method was developed for treating corn seeds with oxamyl. It involved soaking the seeds to ensure oxamyl uptake, centrifugation to draw off excess solution, and drying under a stream of air to prevent the formation of fungus. The seeds were found to have an even distribution of oxamyl, and they remained fungus-free even 12 months after treatment. The highest nonphytotoxic treatment level was obtained by using a 4.00 mg/mL oxamyl solution. Extraction methods were developed for the determination of oxamyl (methyl N',N'-dimethyl-N-[(methylcarbamoyl)oxy]-1-thiooxamimidate), its oxime (methyl N',N'-dimethyl-N-hydroxy-1-thiooxamimidate), and DMCF (N,N-dimethyl-1-cyanoformamide) in seed, root, and soil. Seeds were processed by homogenizing, then shaking in methanol; significantly more oxamyl was extracted from hydrated seeds than from dry seeds. Soils were extracted by tumbling in methanol, with recoveries ranging from 86 to 87% for oxamyl. Root was extracted to 93% efficiency for oxamyl by homogenizing the tissue in methanol. Nuchar-Attaclay column cleanup afforded extracts suitable for analysis by RP-HPLC on a C18 column with UV detection at 254 nm. In the degradation study, oxamyl was found to dissipate from the seed down into the soil, and it was also detected in the root. Oxime was detected in both the seed and the soil, but not in the root. DMCF was detected in small amounts only in the seed.

Relevance: 30.00%

Publisher:

Abstract:

A new approach to treating large-Z systems by quantum Monte Carlo has been developed. It naturally leads to the notion of the 'valence energy'. The possibilities of the new approach have been explored by optimizing the wave function for CuH and Cu and computing the dissociation energy and dipole moment of CuH using variational Monte Carlo. The dissociation energy obtained is about 40% smaller than the experimental value; the method is comparable with SCF and simple pseudopotential calculations. The dipole moment differs from the best theoretical estimate by about 50%, which is again comparable with other methods (Complete Active Space SCF and pseudopotential methods).
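To illustrate the variational Monte Carlo technique on the simplest possible system rather than on CuH itself: Metropolis sampling of |ψ|² for a hydrogen atom with trial wave function ψ = e^(−αr), whose local energy in atomic units is E_L = −α²/2 + (α − 1)/r. At α = 1 the trial function is exact and E_L = −0.5 everywhere, so the sampled energy has zero variance. All names here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def local_energy(r, alpha):
    # E_L = -(1/2) (grad^2 psi)/psi - 1/r for psi = exp(-alpha*r)
    return -1.0 / r - 0.5 * alpha * (alpha - 2.0 / r)

def vmc_energy(alpha, n_steps=20000, step=0.5):
    """Metropolis sampling of |psi|^2; returns the mean local energy."""
    pos = np.array([1.0, 0.0, 0.0])
    energies = []
    for i in range(n_steps):
        trial = pos + step * rng.uniform(-1, 1, 3)
        r_old, r_new = np.linalg.norm(pos), np.linalg.norm(trial)
        # acceptance ratio |psi(trial)|^2 / |psi(pos)|^2
        if rng.random() < np.exp(-2 * alpha * (r_new - r_old)):
            pos = trial
        if i > 1000:                      # discard equilibration steps
            energies.append(local_energy(np.linalg.norm(pos), alpha))
    return float(np.mean(energies))
```

Minimizing the sampled energy (or its variance) over α is the wave-function optimization step; the CuH work in this abstract applies the same idea to a far richer trial function.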

Relevance: 30.00%

Publisher:

Abstract:

The prediction of a protein's conformation helps us to understand its exhibited functions and allows for modeling and for the possible synthesis of the studied protein. Our research is focused on a sub-problem of protein folding known as side-chain packing, whose computational complexity has been proven to be NP-hard. The motivation behind our study is to offer the scientific community a means of obtaining faster conformation approximations for small to large proteins than currently available methods. As the size of proteins increases, current techniques become unusable due to the exponential nature of the problem. We investigated the capabilities of a hybrid genetic algorithm / simulated annealing technique to predict the low-energy conformational states of proteins of various sizes and to generate statistical distributions of the studied proteins' molecular ensemble for pKa predictions. Our algorithm produced errors within acceptable margins of experimental results and offered considerable speed-up, depending on the protein and on the rotameric-state resolution used.
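A toy sketch of the simulated-annealing half of such a hybrid: discrete per-site states stand in for rotamer choices, and a simple pairwise "clash" count stands in for the energy function. The function names and the energy model are illustrative inventions, not the algorithm of this study:

```python
import math
import random

def anneal_side_chains(energy, states, n_sites, steps=4000, t0=5.0, seed=1):
    """Toy simulated-annealing search over discrete per-site states."""
    rng = random.Random(seed)
    conf = [rng.choice(states) for _ in range(n_sites)]
    e = energy(conf)
    best_conf, best_e = conf[:], e
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9          # linear cooling schedule
        cand = conf[:]
        cand[rng.randrange(n_sites)] = rng.choice(states)
        e2 = energy(cand)
        # Metropolis criterion: always accept downhill, sometimes uphill
        if e2 <= e or rng.random() < math.exp((e - e2) / t):
            conf, e = cand, e2
            if e < best_e:
                best_conf, best_e = conf[:], e
    return best_conf, best_e

def clash_energy(conf):
    """Toy pairwise energy: adjacent sites in different states 'clash'."""
    return sum(1 for a, b in zip(conf, conf[1:]) if a != b)
```

In the hybrid described by the abstract, a genetic algorithm would evolve a population of such configurations while annealing refines individual members; the NP-hardness of side-chain packing is why heuristics of this kind are used at all.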

Relevance: 30.00%

Publisher:

Abstract:

This work investigates mathematical details and computational aspects of Metropolis-Hastings reptation quantum Monte Carlo and its variants, in addition to the Bounce method and its variants. The issues that concern us include the sensitivity of these algorithms' target densities to the position of the trial electron density along the reptile, the time-reversal symmetry of the propagators, and the length of the reptile. We calculate the ground-state energy and one-electron properties of LiH at its equilibrium geometry for all these algorithms. The importance sampling is performed with a single determinant in a large Slater-type orbital (STO) basis set. The computer codes were written to exploit the efficiencies engineered into modern, high-performance computing software. Using the Bounce method to calculate non-energy-related properties, those represented by operators that do not commute with the Hamiltonian, is novel work. We found that the unmodified Bounce method gives a good ground-state energy and very good one-electron properties. We attribute this to the favourable time-reversal symmetry in its target density's Green's functions; breaking this symmetry gives poorer results. Use of a short reptile in the Bounce method does not alter the quality of the results. This suggests that in future applications one can use a shorter reptile to cut down the computational time dramatically.

Relevance: 30.00%

Publisher:

Abstract:

We extend the class of M-tests for a unit root analyzed by Perron and Ng (1996) and Ng and Perron (1997) to the case where a change in the trend function is allowed to occur at an unknown time. These M(GLS) tests adopt the GLS detrending approach of Dufour and King (1991) and Elliott, Rothenberg and Stock (1996) (ERS). Following Perron (1989), we consider two models: one allowing for a change in slope, and the other for both a change in intercept and slope. We derive the asymptotic distribution of the tests as well as that of the feasible point-optimal tests PT(GLS) suggested by ERS. The asymptotic critical values of the tests are tabulated. Also, we compute the non-centrality parameter used for the local GLS detrending that permits the tests to have 50% asymptotic power at that value. We show that the M(GLS) and PT(GLS) tests have an asymptotic power function close to the power envelope. An extensive simulation study analyzes the size and power in finite samples under various methods for selecting the truncation lag of the autoregressive spectral density estimator. An empirical application is also provided.
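The local GLS detrending step shared by these tests can be sketched as follows: quasi-difference the series and the deterministic regressors with ᾱ = 1 + c̄/T, regress, and subtract the fitted trend. The sketch below uses the no-break linear-trend case with the ERS value c̄ = −13.5; the breaking-trend models of this paper require other non-centrality parameters, which is exactly what the abstract says the authors compute:

```python
import numpy as np

def gls_detrend(y, c_bar=-13.5):
    """Local GLS detrending (ERS style) with an intercept and linear trend."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    a = 1.0 + c_bar / T                                  # local-to-unity alpha
    z = np.column_stack([np.ones(T), np.arange(1, T + 1)])
    # quasi-difference the series and the regressors
    yq = np.concatenate([[y[0]], y[1:] - a * y[:-1]])
    zq = np.vstack([z[0], z[1:] - a * z[:-1]])
    beta, *_ = np.linalg.lstsq(zq, yq, rcond=None)
    return y - z @ beta                                  # GLS-detrended series
```

The M-type unit root statistics are then computed from the detrended series together with an autoregressive spectral density estimate at frequency zero.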

Relevance: 30.00%

Publisher:

Abstract:

We provide a theoretical framework to explain the empirical finding that estimated betas are sensitive to the sampling interval even when continuously compounded returns are used. We suppose that stock prices have both permanent and transitory components: the permanent component is a standard geometric Brownian motion, while the transitory component is a stationary Ornstein-Uhlenbeck process. The discrete-time representation of the beta depends on the sampling interval and on two components labelled "permanent" and "transitory" betas. We show that if no transitory component is present in stock prices, then no sampling-interval effect occurs. However, the presence of a transitory component implies that the beta is an increasing (decreasing) function of the sampling interval for more (less) risky assets. In our framework, assets are labelled risky if their "permanent beta" is greater than their "transitory beta", and vice versa for less risky assets. Simulations show that our theoretical results provide good approximations for the means and standard deviations of estimated betas in small samples. Our results can be perceived as indirect evidence for the presence of a transitory component in stock prices, as proposed by Fama and French (1988) and Poterba and Summers (1988).
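The mechanism can be illustrated with the population beta implied by such a two-component model: over an interval h, a permanent Brownian component contributes increment variance proportional to h, while a stationary OU component's increment variance is bounded, so the interval beta drifts toward the permanent beta as h grows. The parameterization below is an illustrative sketch under these assumptions, not the paper's calibration:

```python
import numpy as np

def beta_h(h, beta_p, beta_t, sigma_p=0.2, sigma_t=0.1, kappa=1.0):
    """Variance-weighted beta at sampling interval h.

    Permanent (random-walk) increments have variance sigma_p^2 * h;
    transitory OU increments have variance 2*sigma_t^2*(1 - exp(-kappa*h)),
    which is bounded, so beta_h -> beta_p as h grows.
    """
    var_p = sigma_p ** 2 * h
    var_t = 2 * sigma_t ** 2 * (1 - np.exp(-kappa * h))
    return (beta_p * var_p + beta_t * var_t) / (var_p + var_t)
```

With beta_p > beta_t the function increases in h (the "risky asset" case of the abstract) and decreases when beta_p < beta_t, reproducing the sign prediction of the theoretical framework.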

Relevance: 30.00%

Publisher:

Abstract:

We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses for which test procedures are commonly proposed are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions, a condition not satisfied by standard Wald-type methods based on standard errors, and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.

Relevance: 30.00%

Publisher:

Abstract:

Context. Case-control studies are very frequently used by epidemiologists to assess the impact of certain exposures on a particular disease. These exposures may be represented by several time-dependent variables, and new methods are needed to estimate their effects accurately. Indeed, logistic regression, the conventional method for analyzing case-control data, does not directly account for changes in covariate values over time. By contrast, survival analysis methods such as the Cox proportional hazards model can directly incorporate time-dependent covariates representing individual exposure histories. However, this requires careful handling of the risk sets because of the over-sampling of cases, relative to controls, in case-control studies. As shown in a previous simulation study, the optimal definition of the risk sets for the analysis of case-control data remains to be elucidated, and to be studied in the case of time-dependent variables. Objective. The general objective is to propose and study new versions of the Cox model for estimating the impact of time-varying exposures in case-control studies, and to apply them to real case-control data on lung cancer and smoking. Methods. I identified new, potentially optimal risk-set definitions (the Weighted Cox model and the Simple weighted Cox model), in which different weights are assigned to cases and controls in order to reflect the proportions of cases and non-cases in the source population. The properties of the exposure-effect estimators were studied by simulation.
Different aspects of exposure were generated (intensity, duration, cumulative exposure). The generated case-control data were then analyzed with different versions of the Cox model, including the old and new risk-set definitions, as well as with conventional logistic regression for comparison. The different regression models were then applied to real case-control data on lung cancer. The estimates of the effects of the different smoking variables obtained with the different methods were compared with each other and with the simulation results. Results. The simulation results show that the estimates from the proposed new weighted Cox models, especially those of the Weighted Cox model, are much less biased than the estimates from existing Cox models that simply include or exclude the future cases from each risk set. Moreover, the estimates of the Weighted Cox model were slightly, but systematically, less biased than those of logistic regression. The application to the real data shows larger differences between the estimates of logistic regression and of the weighted Cox models for some time-dependent smoking variables. Conclusions. The results suggest that the proposed new Weighted Cox model could be an interesting alternative to the logistic regression model for estimating the effects of time-dependent exposures in case-control studies.

Relevance: 30.00%

Publisher:

Abstract:

This study evaluates a method for collecting the soil solution present at the soil-root interface, in the rhizosphere. This interface is the main site of plant uptake of contaminants such as trace metals. Since plants acquire these elements from the liquid phase, the rhizosphere soil solution is a key component for determining the bioavailable fraction of trace metals. Microlysimetry is the most appropriate in situ method for addressing the difficulties associated with the microscopic scale of the rhizosphere. Thus, in studies of trace-metal bioavailability in the rhizosphere, microlysimeters (Rhizon©) are gaining popularity without, however, having been studied exhaustively. The objective of this study is therefore to evaluate the capacity of these microlysimeters to preserve the chemical integrity of the solution, while optimizing their use. To this end, the microlysimeters were subjected to a series of experiments with solutions and soils, in which the volume of solution collected and the behaviour of trace metals (Cd, Cu, Ni, Pb, Zn) were studied. The results show that the microlysimeters perform best when the soil water content is above field capacity and when there is little organic matter and clay. Sandy soils with a low organic-C content reproduce the collected volume better, and solution below field capacity can be collected; the use of microlysimeters in these soils is therefore optimal. In the solution experiments, the microlysimeters reached equilibrium with the solution after 10 h of sampling. When this delay and the previously established optimal conditions (acidic pH and high DOC) are respected, the microlysimeters preserve the chemical composition of the solution.
In the soil experiments, this equilibrium was not reached after ten days and eight sampling rounds. The organic-matter content and the microbial activity appear to be responsible for the changes in metal concentrations over these samplings, notably in the FH horizon, where the microlysimeters perform very poorly. In contrast, in the B horizon, the concentrations tend to stabilize toward the end of the sampling series, approaching the reference values. Although higher values are observed for the microlysimeters, their metal concentrations are comparable to those of the reference methods (water extraction, and field lysimeters with and without tension). In sum, the microlysimeters generally behave better in the B horizon. Even though their use is closer to optimal in sandy soil, this horizon is recommended for future field studies with the microlysimeters.