900 results for Discrete Sampling


Relevance: 70.00%

Abstract:

Weeds tend to aggregate in patches within fields and there is evidence that this is partly owing to variation in soil properties. Because the processes driving soil heterogeneity operate at different scales, the strength of the relationships between soil properties and weed density would also be expected to be scale-dependent. Quantifying these effects of scale on weed patch dynamics is essential to guide the design of discrete sampling protocols for mapping weed distribution. We have developed a general method that uses novel within-field nested sampling and residual maximum likelihood (REML) estimation to explore scale-dependent relationships between weeds and soil properties. We have validated the method using a case study of Alopecurus myosuroides in winter wheat. Using REML, we partitioned the variance and covariance into scale-specific components and estimated the correlations between the weed counts and soil properties at each scale. We used variograms to quantify the spatial structure in the data and to map variables by kriging. Our methodology successfully captured the effect of scale on a number of edaphic drivers of weed patchiness. The overall Pearson correlations between A. myosuroides and soil organic matter and clay content were weak and masked the stronger correlations at >50 m. Knowing how the variance was partitioned across the spatial scales, we optimized the sampling design to focus sampling effort at those scales that contributed most to the total variance. The methods have the potential to guide patch spraying of weeds by identifying areas of the field that are vulnerable to weed establishment.
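The variogram step described in this abstract can be sketched in a few lines. The following is an illustrative empirical (isotropic) variogram on synthetic data, not the study's REML-based analysis; the coordinates, values, lag width and bin count are all assumptions for the example.

```python
import numpy as np

# Illustrative empirical variogram on synthetic data; the coordinates,
# values, lag width and number of bins are assumptions for the sketch.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(200, 2))  # sampling locations (m)
values = rng.normal(size=200)                # e.g. log-transformed weed counts

def empirical_variogram(coords, values, lag_width=10.0, n_lags=8):
    """Mean squared half-difference of value pairs, binned by distance."""
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    semivar = 0.5 * (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)   # count each pair once
    dists, semivar = dists[iu], semivar[iu]
    lags, gammas = [], []
    for k in range(n_lags):
        in_bin = (dists >= k * lag_width) & (dists < (k + 1) * lag_width)
        if in_bin.any():
            lags.append(dists[in_bin].mean())
            gammas.append(semivar[in_bin].mean())
    return np.array(lags), np.array(gammas)

lags, gammas = empirical_variogram(coords, values)
```

A variogram model fitted to these lag estimates is what kriging then uses to derive its spatial interpolation weights.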

Relevance: 60.00%

Abstract:

Most of the wastewater treatment systems in small rural communities of the Cova da Beira region (Portugal) consist of constructed wetlands (CW) with horizontal subsurface flow (HSSF). It is believed that those systems allow compliance with discharge standards as well as the production of final effluents suitable for reuse. Results obtained in a nine-month campaign in an HSSF bed showed that COD and TSS removal were lower than expected. Discrete sampling also showed that removal of TC, FC and HE was not enough to fulfill international irrigation goals. However, the bed responded very well to variation of incoming nitrogen loads, presenting high removal of nitrogen forms. A good correlation between mass load and mass removal rate was observed for BOD5, COD, TN, NH4-N, TP and TSS, which shows a satisfactory response of the bed to the variable incoming loads. The entrance of excessive loads of organic matter and solids contributed to the decrease of the effective volume for pollutant uptake and therefore may have negatively influenced the treatment capability. Primary treatment should be improved in order to decrease the variation of incoming organic and solid loads and to improve the removal of COD, solids and pathogens. The final effluent presented good physical-chemical quality for reuse in irrigation, which is the most likely application in the area.

Relevance: 60.00%

Abstract:

We investigated diurnal nitrate (NO3-) concentration variability in the San Joaquin River using an in situ optical NO3- sensor and discrete sampling during a 5-day summer period characterized by high algal productivity. Dual NO3- isotopes (δ15N-NO3 and δ18O-NO3) and dissolved oxygen isotopes (δ18O-DO) were measured over 2 days to assess NO3- sources and biogeochemical controls over diurnal time-scales. Concerted temporal patterns of dissolved oxygen (DO) concentrations and δ18O-DO were consistent with photosynthesis, respiration and atmospheric O2 exchange, providing evidence of diurnal biological processes independent of river discharge. Surface water NO3- concentrations varied by up to 22% over a single diurnal cycle and up to 31% over the 5-day study, but did not reveal concerted diurnal patterns at a frequency comparable to DO concentrations. The decoupling of the δ15N-NO3 and δ18O-NO3 isotopes suggests that algal assimilation and denitrification are not major processes controlling diurnal NO3- variability in the San Joaquin River during the study. The lack of a clear explanation for NO3- variability likely reflects a combination of riverine biological processes and time-varying physical transport of NO3- from upstream agricultural drains to the mainstem San Joaquin River. The application of an in situ optical NO3- sensor along with discrete samples provides a view into the fine temporal structure of hydrochemical data and may allow for greater accuracy in pollution assessment.

Relevance: 60.00%

Abstract:

In this work, an innovative methodology was developed for continuous, in situ gas monitoring (24 h/day) of fumarolic and soil diffuse emissions, applied to the geothermal and volcanic area of Pisciarelli, near Agnano, inside the Campi Flegrei caldera (CFc). The literature contains only scattered, discrete data on the geochemical gas composition of the fumaroles at Campi Flegrei; only since the early 1980s has there been a systematic record of discrete fumarole sampling at Solfatara (Bocca Grande and Bocca Nuova fumaroles), and only since 1999 at the degassing areas of Pisciarelli. This type of sampling has produced time series of geochemical analyses with discontinuous coverage (on average 2-3 measurements per month), completely inadequate for Civil Defence purposes in such a high-risk, densely populated volcanic area. To remedy this lack of data, this study introduced a new methodology for continuous, in situ sampling, able to record data continuously from the fumaroles and from soil diffuse degassing. Its high sampling density (about one measurement per minute, i.e. 1440 data points per day) and the numerous species detected (CO2, Ar, 36Ar, CH4, He, H2S, N2, O2) allow good statistics and the reconstruction of the evolution of the gas composition of the investigated area. The methodology is based on continuous sampling of fumarole gases and soil degassing through an extraction line in which the water vapour content is removed by a series of condensation steps, after which the gas is analyzed with a quadrupole mass spectrometer.

Relevance: 60.00%

Abstract:

We demonstrate that the process of generating smooth transitions can be viewed as a natural result of the filtering operations implied in the generation of discrete-time series observations from the sampling of data from an underlying continuous-time process that has undergone structural change. In order to focus the discussion, we utilize the problem of estimating the location of abrupt shifts in some simple time series models. This approach permits us to address salient issues relating to distortions induced by the inherent aggregation associated with discrete-time sampling of continuous-time processes experiencing structural change. We also address the issue of how time-irreversible structures may be generated within the smooth transition processes.
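The aggregation mechanism this abstract describes can be illustrated directly. The sketch below (all parameters assumed) samples a noisy continuous-time path with an abrupt mean shift by averaging over non-overlapping blocks; the block straddling the break takes an intermediate value, so the discrete-time series shows a smooth transition where the underlying process jumped.

```python
import numpy as np

# Assumed toy setup: a fine grid stands in for continuous time, with an
# abrupt structural break in the mean at t = 0.5.
rng = np.random.default_rng(1)
n_fine = 1000
t = np.linspace(0.0, 1.0, n_fine)
level = np.where(t < 0.5, 0.0, 5.0)          # abrupt shift
x = level + 0.1 * rng.normal(size=n_fine)

def sample_with_averaging(x, interval):
    """Aggregate the fine path into non-overlapping block means."""
    n = len(x) // interval
    return x[: n * interval].reshape(n, interval).mean(axis=1)

coarse = sample_with_averaging(x, 150)       # 6 discrete observations
# coarse[3] averages pre- and post-break levels, landing between 0 and 5:
# the jump appears as a smooth transition purely through aggregation.
```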

Relevance: 60.00%

Abstract:

Interpolated data are an important part of environmental information exchange, as many variables can only be measured at discrete sampling locations. Spatial interpolation is a complex operation that has traditionally required expert treatment, making automation a serious challenge. This paper presents a few lessons learnt from INTAMAP, a project that is developing an interoperable web processing service (WPS) for the automatic interpolation of environmental data using advanced geostatistics, adopting a Service Oriented Architecture (SOA). The “rainbow box” approach we followed provides access to the functionality at a whole range of different levels. We show here how the integration of open standards, open source and powerful statistical processing capabilities allows us to automate a complex process while offering users a level of access and control that best suits their requirements. This facilitates benchmarking exercises as well as the regular reporting of environmental information without requiring remote users to have specialized skills in geostatistics.
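For readers unfamiliar with the problem being automated, a toy interpolator makes it concrete. The sketch below uses inverse-distance weighting — far simpler than the model-based geostatistics INTAMAP provides, and not its actual method — to predict values at unsampled points from discrete observations; the layout and values are invented.

```python
import numpy as np

# Toy inverse-distance weighting; observation layout and values are
# made up for the example.
def idw(obs_xy, obs_z, query_xy, power=2.0, eps=1e-12):
    """Predict at query points as a distance-weighted mean of observations."""
    d = np.linalg.norm(query_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    w = 1.0 / (d ** power + eps)             # eps guards exact hits
    return (w * obs_z).sum(axis=1) / w.sum(axis=1)

obs_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
obs_z = np.array([1.0, 2.0, 3.0, 4.0])
queries = np.array([[0.5, 0.5], [0.0, 0.0]])
pred = idw(obs_xy, obs_z, queries)           # centre -> mean; corner -> obs
```

An automatic service additionally has to choose and fit the spatial model, quantify prediction uncertainty and validate inputs, which is where the geostatistical machinery comes in.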

Relevance: 60.00%

Abstract:

The determination and monitoring of metallic contaminants in water must be continuous, which makes the development, modification and optimization of analytical methodologies capable of determining the various metal contaminants in natural environments important, because in many cases the available instrumentation does not provide enough sensitivity for the determination of trace-level values. In this study, a method of extraction and pre-concentration using a microemulsion system within the Winsor II equilibrium was tested and optimized for the determination of Co, Cd, Pb, Tl, Cu and Ni by high-resolution continuum-source atomic absorption spectrometry (HR-CS AAS). The graphite-furnace temperature program (HR-CS AAS GF) was optimized through pyrolysis and atomization curves for the analytes Cd, Pb, Co and Tl, with and without different chemical modifiers. Cu and Ni were analyzed by flame atomization (HR-CS F AAS) after pre-concentration, with the sample introduction system optimized for discrete sampling. Salinity and pH were also analyzed as factors influencing extraction efficiency; final values of 6 g L-1 of Na (as NaCl) and 1% HNO3 (v/v) were defined. For the determination of the optimum extraction point, a simplex-centroid statistical design was applied, and the optimum extraction proportions chosen for all analytes were 70% aqueous phase, 10% oil phase and 20% co-surfactant/surfactant (C/S = 4). After extraction, the metals were determined and the figures of merit obtained for the proposed method were: LODs of 0.09, 0.01, 0.06, 0.05, 0.6 and 1.5 μg L-1 for Pb, Cd, Tl, Co, Cu and Ni, respectively. Linear ranges of 0.1-2.0 μg L-1 for Pb, 0.01-2.0 μg L-1 for Cd, 1.0-20 μg L-1 for Tl, 0.1-5.0 μg L-1 for Co, 2-200 μg L-1 for Cu and 5-200 μg L-1 for Ni were obtained.
The enrichment factors obtained ranged between 6 and 19. Recovery tests with the certified sample showed recovery values (n = 3, certified values) after extraction of 105, 101, 100 and 104% for Pb, Cd, Cu and Ni, respectively. Samples of fresh water from Lake Jiqui, saline water from the Potengi River and produced water from the oil industry (PETROBRAS) were spiked, and recoveries (n = 3) for the analytes were between 80 and 112%, confirming that the proposed method can be used for the extraction. The proposed method enabled the separation of metals from complex matrices with a good pre-concentration factor, consistent with the permitted limits (MPV) of CONAMA Resolution No. 357/2005, which regulates the quality of fresh, brackish and saline surface waters in Brazil.

Relevance: 40.00%

Abstract:

Theories on the link between achievement goals and achievement emotions focus on their within-person functional relationship (i.e., intraindividual relations). However, empirical studies have failed to analyze these intraindividual relations and have instead examined between-person covariation of the two constructs (i.e., interindividual relations). Aiming to better connect theory and empirical research, the present study (N = 120 10th grade students) analyzed intraindividual relations by assessing students’ state goals and emotions using experience sampling (N = 1,409 assessments within persons). In order to replicate previous findings on interindividual relations, students’ trait goals and emotions were assessed using self-report questionnaires. Despite being statistically independent, both types of relations were consistent with theoretical expectations, as shown by multi-level modeling: Mastery goals were positive predictors of enjoyment and negative predictors of boredom and anger; performance-approach goals were positive predictors of pride; and performance-avoidance goals were positive predictors of anxiety and shame. Reasons for the convergence of intra- and interindividual findings, directions for future research, and implications for educational practice are discussed.

Relevance: 40.00%

Abstract:

Intestinal dendritic cells (DCs) are believed to sample and present commensal bacteria to the gut-associated immune system to maintain immune homeostasis. How antigen sampling pathways handle intestinal pathogens remains elusive. We present a murine colitogenic Salmonella infection model that is highly dependent on DCs. Conditional DC depletion experiments revealed that intestinal virulence of the S. Typhimurium SL1344 ΔinvG mutant lacking a functional type 3 secretion system-1 (ΔinvG) critically required DCs for invasion across the epithelium. The DC dependency was limited to the early phase of infection, when bacteria colocalized with CD11c(+)CX3CR1(+) mucosal DCs. At later stages, the bacteria became associated with other (CD11c(-)CX3CR1(-)) lamina propria cells, DC depletion no longer attenuated the pathology, and a MyD88-dependent mucosal inflammation was initiated. Using bone marrow chimeric mice, we showed that MyD88 signaling within hematopoietic cells, which are distinct from DCs, was required and sufficient for induction of the colitis. Moreover, MyD88-deficient DCs supported transepithelial uptake of the bacteria and the induction of MyD88-dependent colitis. These results establish that pathogen sampling by DCs is a discrete, MyD88-independent step during the initiation of a mucosal innate immune response to bacterial infection in vivo.

Relevance: 40.00%

Abstract:

As the development of a viable quantum computer nears, existing widely used public-key cryptosystems, such as RSA, will no longer be secure. Thus, significant effort is being invested into post-quantum cryptography (PQC). Lattice-based cryptography (LBC) is one such promising area of PQC, which offers versatile, efficient, and high performance security services. However, the vulnerabilities of these implementations against side-channel attacks (SCA) remain significantly understudied. Most, if not all, lattice-based cryptosystems require noise samples generated from a discrete Gaussian distribution, and a successful timing analysis attack can render the whole cryptosystem broken, making the discrete Gaussian sampler the most vulnerable module to SCA. This research proposes countermeasures against timing information leakage with FPGA-based designs of the CDT-based discrete Gaussian samplers with constant response time, targeting encryption and signature scheme parameters. The proposed designs are compared against the state-of-the-art and are shown to significantly outperform existing implementations. For encryption, the proposed sampler is 9x faster in comparison to the only other existing time-independent CDT sampler design. For signatures, the first time-independent CDT sampler in hardware is proposed. 
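As a rough illustration of the countermeasure's principle (not the paper's FPGA design), the sketch below implements a CDT sampler whose control flow is independent of the sampled value: every draw scans the full cumulative table, so the comparison count never leaks timing information. The σ, tail cut and table precision are assumptions; a real implementation would also need constant-time comparisons at the hardware or machine-code level.

```python
import math
import secrets

SIGMA = 3.2        # assumed standard deviation
TAIL = 40          # assumed tail cut (magnitudes 0..39 before signing)
PREC = 1 << 32     # fixed-point precision of the table

def build_cdt(sigma=SIGMA, tail=TAIL):
    """Cumulative table; zero gets half weight so that a uniform random
    sign afterwards yields the correct two-sided probabilities."""
    w = [math.exp(-x * x / (2.0 * sigma * sigma)) for x in range(tail)]
    w[0] *= 0.5
    z = sum(w)
    cdf, acc = [], 0.0
    for wx in w:
        acc += wx / z
        cdf.append(int(acc * PREC))
    cdf[-1] = PREC                    # close the table exactly
    return cdf

CDT = build_cdt()

def sample_dgauss():
    """One draw; the loop always runs over the whole table."""
    u = secrets.randbelow(PREC)
    x = 0
    for c in CDT:
        x += int(u >= c)              # data-independent control flow
    s = 1 - 2 * secrets.randbelow(2)  # uniform sign
    return s * x
```

A naive binary search over the table is faster on average but terminates after a value-dependent number of steps, which is exactly the timing side channel the constant scan removes.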

Relevance: 30.00%

Abstract:

Summary

Forests are key ecosystems of the earth and associated with a large range of functions. Many of these functions are beneficial to humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services are quantified, managed and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed and monitored in an equal and sustainable manner. The thesis aims were therefore to test this hypothesis, establish an improved conceptual approach and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data and land structure variables from active sensor data from a small footprint laserscanning system. The results confirmed the main hypothesis, as relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for sustainable management of natural resources was described. This concept quantifies the services as a continuous function of the landscape, rather than a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields and the forest becomes a dependent and function-driven management unit. Continuous field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit.
However, monitoring the state of and trends in sustainability of services requires them to be quantified as a continuous function of the landscape. Sustainable natural resource management iteratively combines the ecosystem and gradient approaches.

Résumé

Forests are key ecosystems of the earth and a large number of functions are attributed to them. Many of these functions are beneficial to humans and are called ecosystem services. Sustainable development requires that these ecosystem services all be quantified, managed and monitored equally. Natural resource management therefore targets the services attributed to ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of the services attributed to the forest do not correspond to a discrete ecosystem. Consequently, the services are not quantified, managed and monitored in an equivalent and sustainable manner. The aims of the thesis were to test this hypothesis, to establish a new conceptual approach to the management of natural resources and to prepare spatial applications for the appropriate landscape and structural variables. The study was carried out in western Switzerland, mainly on the basis of a nationwide landscape inventory. This inventory is part of the third Swiss national forest inventory and continuously measures landscape variables based on regular sampling of colour aerial photographs. In addition, land cover variables were derived from the data of a passive sensor, Landsat 5 TM, as well as structural variables derived from laser scanning, an active sensor. The results confirm the main hypothesis, since the scale of the services does not correspond to that of the forest ecosystem. Instead, a new approach was elaborated for the sustainable management of natural resources. This concept represents the services as a continuous function of the landscape rather than a discrete function of the forest ecosystem. Accordingly, the explanatory landscape variables are called continuous fields and the forest becomes a dependent entity, defined by the main function of the landscape. Corresponding methods for land cover and structure were elaborated. In conclusion, the discrete forest ecosystem is an adequate unit for planning and management. By contrast, monitoring the sustainability of the state of services and its evolution requires that the services be quantified as a continuous function of the landscape. Sustainable natural resource management thus iteratively joins the ecosystem approach with the gradient approach.

Relevance: 30.00%

Abstract:

The present study discusses retention criteria for principal components analysis (PCA) applied to Likert scale items typical of psychological questionnaires. The main aim is to recommend that applied researchers refrain from relying only on the eigenvalue-greater-than-one criterion; alternative procedures are suggested for adjusting for sampling error. An additional objective is to add evidence on the consequences of applying this rule when PCA is used with discrete variables. The experimental conditions were studied by means of Monte Carlo sampling including several sample sizes, different numbers of variables and answer alternatives, and four non-normal distributions. The results suggest that even when all the items, and thus the underlying dimensions, are independent, eigenvalues greater than one are frequent and can explain up to 80% of the variance in the data, meeting the empirical criterion. The consequences of using Kaiser's rule are illustrated with a clinical psychology example. The size of the eigenvalues turned out to be a function of the sample size and the number of variables, which is also the case for parallel analysis, as previous research shows. To facilitate the application of alternative criteria, an R package was developed for deciding the number of principal components to retain by means of confidence intervals constructed about the eigenvalues corresponding to lack of relationship between discrete variables.
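The study's central warning is easy to reproduce. Under assumed (arbitrary) settings — 100 respondents, 20 mutually independent 5-point Likert items — the sketch below counts how many eigenvalues of the sample correlation matrix exceed one; Kaiser's rule would retain that many components even though the true number is zero.

```python
import numpy as np

rng = np.random.default_rng(42)

def n_kaiser_components(n=100, p=20):
    """Components Kaiser's rule keeps for independent Likert-type items."""
    data = rng.integers(1, 6, size=(n, p))            # 5-point items
    eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))
    return int((eig > 1.0).sum())

retained = [n_kaiser_components() for _ in range(200)]
# Sampling error alone pushes several eigenvalues past 1, which is why
# adjusted criteria such as parallel analysis compare each eigenvalue
# against its null distribution instead of the fixed threshold of 1.
```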

Relevance: 30.00%

Abstract:

We provide a theoretical framework to explain the empirical finding that the estimated betas are sensitive to the sampling interval even when using continuously compounded returns. We suppose that stock prices have both permanent and transitory components. The permanent component is a standard geometric Brownian motion while the transitory component is a stationary Ornstein-Uhlenbeck process. The discrete-time representation of the beta depends on the sampling interval and two components labelled "permanent and transitory betas". We show that if no transitory component is present in stock prices, then no sampling interval effect occurs. However, the presence of a transitory component implies that the beta is an increasing (decreasing) function of the sampling interval for more (less) risky assets. In our framework, assets are labelled risky if their "permanent beta" is greater than their "transitory beta" and vice versa for less risky assets. Simulations show that our theoretical results provide good approximations for the means and standard deviations of estimated betas in small samples. Our results can be perceived as indirect evidence for the presence of a transitory component in stock prices, as proposed by Fama and French (1988) and Poterba and Summers (1988).
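A small simulation (all parameters assumed, not the paper's calibration) reproduces the qualitative result: with a permanent loading larger than the transitory loading, the beta estimated from continuously compounded returns rises as the sampling interval lengthens.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000                      # fine-grid steps standing in for continuous time

# Market log price: permanent random walk plus AR(1) transitory part
# (a discretized Ornstein-Uhlenbeck process). All parameters are assumed.
perm = np.cumsum(0.01 * rng.normal(size=n))
trans = np.zeros(n)
shocks = 0.01 * rng.normal(size=n)
for t in range(1, n):
    trans[t] = 0.99 * trans[t - 1] + shocks[t]
market = perm + trans

# "Risky" asset: permanent beta 1.5 exceeds transitory beta 0.5;
# a small idiosyncratic random walk is added on top.
asset = 1.5 * perm + 0.5 * trans + np.cumsum(0.005 * rng.normal(size=n))

def est_beta(log_a, log_m, interval):
    """OLS beta from continuously compounded returns at a given interval."""
    ra = np.diff(log_a[::interval])
    rm = np.diff(log_m[::interval])
    cov = np.cov(ra, rm)
    return cov[0, 1] / cov[1, 1]

betas = {h: est_beta(asset, market, h) for h in (1, 10, 100)}
# betas[h] should increase toward the permanent beta as h grows.
```

Intuitively, transitory variation dominates short-interval returns but washes out of long-interval returns, so lengthening the interval tilts the estimate toward the permanent beta.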

Relevance: 30.00%

Abstract:

We consider a class of sampling-based decomposition methods to solve risk-averse multistage stochastic convex programs. We prove a formula for the computation of the cuts necessary to build the outer linearizations of the recourse functions. This formula can be used to obtain an efficient implementation of Stochastic Dual Dynamic Programming applied to convex nonlinear problems. We prove the almost sure convergence of these decomposition methods when the relatively complete recourse assumption holds. We also prove the almost sure convergence of these algorithms when applied to risk-averse multistage stochastic linear programs that do not satisfy the relatively complete recourse assumption. The analysis is first done assuming the underlying stochastic process is interstage independent and discrete, with a finite set of possible realizations at each stage. We then indicate two ways of extending the methods and convergence analysis to the case when the process is interstage dependent.
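The flavour of the cut formula can be shown on a deliberately tiny two-stage linear example (a scalar decision and three equiprobable scenarios, all numbers invented): each cut combines the average recourse cost at a trial point with an average subgradient, giving a supporting lower bound on the expected recourse function — the building block of the outer linearizations the abstract refers to.

```python
import numpy as np

c = 2.0                                   # assumed second-stage unit cost
scenarios = np.array([1.0, 2.0, 4.0])     # assumed equiprobable demands

def recourse(x, d):
    """Optimal second-stage cost Q(x, d) = min{ c*y : y >= d - x, y >= 0 }."""
    return c * max(0.0, d - x)

def subgrad(x, d):
    """A subgradient of Q(., d) at x (the dual price of the recourse LP)."""
    return -c if d - x > 0 else 0.0

def build_cut(xbar):
    """Cut: theta >= q + g * (x - xbar), a valid lower bound on E[Q(x)]."""
    q = np.mean([recourse(xbar, d) for d in scenarios])
    g = np.mean([subgrad(xbar, d) for d in scenarios])
    return q, g

q, g = build_cut(1.5)
# Validity check at another point: the cut must lie below E[Q(x)].
x0 = 0.0
expected_q = np.mean([recourse(x0, d) for d in scenarios])
assert q + g * (x0 - 1.5) <= expected_q + 1e-12
```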

Relevance: 30.00%

Abstract:

A computer-based sliding mode control (SMC) is analysed. The control law is implemented using a computer with A/D and D/A converters. Two SMC designs are presented. The first is a conventional continuous-time SMC design, with a variable structure law, which does not take the sampling period into consideration. The second is a discrete-time SMC design, with a smooth sliding law, which does not have a variable structure and takes the sampling period into consideration. Both techniques are applied to control an inverted pendulum system. The performances of the continuous-time and discrete-time controllers are compared. Simulation and experimental results are shown and the effectiveness of the proposed techniques is analysed.
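As a minimal sketch of the second design's idea — not the paper's inverted-pendulum controller; a double integrator plant and all gains are assumptions here — the discrete-time law below replaces the discontinuous sign function with a saturated, "smooth" reaching term evaluated at a fixed sampling period, which keeps chattering bounded by the boundary-layer width.

```python
import numpy as np

T = 0.01                         # assumed sampling period (s)
LAM, K, PHI = 2.0, 5.0, 0.05     # surface slope, reaching gain, layer width

def plant_step(x, v, u):
    """Zero-order-hold discretization of the double integrator x'' = u."""
    return x + T * v + 0.5 * T * T * u, v + T * u

def smc_control(x, v):
    s = v + LAM * x                           # sliding surface s = v + lam*x
    sat = float(np.clip(s / PHI, -1.0, 1.0))  # smooth saturated reaching term
    return -LAM * v - K * sat                 # equivalent + reaching control

x, v = 1.0, 0.0                   # initial condition
for _ in range(2000):             # 20 s of simulated time
    x, v = plant_step(x, v, smc_control(x, v))
# The state is driven to the surface and then slides to the origin.
```

With the pure sign function, the fixed sampling period would make the state cross the surface every step and chatter; the saturation turns the law into a high-gain linear controller inside the layer, so the discrete-time closed loop settles smoothly.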