995 results for SOIL TEST
Abstract:
The Soil and Water Assessment Tool (SWAT) model is a continuation of nearly 30 years of modeling efforts conducted by the U.S. Department of Agriculture (USDA), Agricultural Research Service. SWAT has gained international acceptance as a robust interdisciplinary watershed modeling tool, as evidenced by international SWAT conferences, hundreds of SWAT-related papers presented at numerous scientific meetings, and dozens of articles published in peer-reviewed journals. The model has also been adopted as part of the U.S. Environmental Protection Agency’s BASINS (Better Assessment Science Integrating Point & Nonpoint Sources) software package and is being used by many U.S. federal and state agencies, including the USDA within the Conservation Effects Assessment Project. At present, over 250 peer-reviewed, published articles have been identified that report SWAT applications, reviews of SWAT components, or other research that includes SWAT. Many of these peer-reviewed articles are summarized here according to relevant application categories such as streamflow calibration and related hydrologic analyses, climate change impacts on hydrology, pollutant load assessments, comparisons with other models, and sensitivity analyses and calibration techniques. Strengths and weaknesses of the model are presented, and recommended research needs for SWAT are provided.
Abstract:
In a weighted spatial network, as specified by an exchange matrix, the variances of the spatial values are inversely proportional to the size of the regions. Spatial values are then no longer exchangeable under independence, weakening the rationale for ordinary permutation and bootstrap tests of spatial autocorrelation. We propose an alternative permutation test for spatial autocorrelation, based upon exchangeable spatial modes, constructed as linear orthogonal combinations of spatial values. The coefficients are obtained as eigenvectors of the standardised exchange matrix appearing in spectral clustering, and generalise to the weighted case the concept of spatial filtering for connectivity matrices. Also, two proposals for transforming an accessibility matrix into an exchange matrix with a priori fixed margins are presented. Two examples (inter-regional migratory flows and binary adjacency networks) illustrate the formalism, which is rooted in the theory of spectral decomposition for reversible Markov chains.
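For context, the ordinary permutation test that the abstract argues breaks down under unequal region weights can be sketched as follows. This is a minimal illustration assuming equal-weight regions (where plain permutation is still justified); the chain-graph matrix `W`, the Moran-type statistic, and the data are hypothetical stand-ins, not the authors' mode-based construction.

```python
import numpy as np

def moran_like(v, W):
    """Moran-type autocorrelation statistic v'Wv / v'v for a centred vector v."""
    return (v @ W @ v) / (v @ v)

def permutation_test(x, W, n_perm=999, seed=0):
    """Ordinary permutation test of spatial autocorrelation: valid when
    values are exchangeable under independence (equal-weight regions)."""
    rng = np.random.default_rng(seed)
    z = x - x.mean()
    W = W / W.sum()                      # normalise total weight to 1
    obs = moran_like(z, W)
    perms = np.array([moran_like(rng.permutation(z), W)
                      for _ in range(n_perm)])
    # one-sided p-value for positive autocorrelation
    p = (1 + np.sum(perms >= obs)) / (1 + n_perm)
    return obs, p

# illustrative chain graph of 30 regions carrying a smooth spatial trend
n = 30
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
obs, p = permutation_test(np.arange(n, dtype=float), W)
```

A strongly trended surface yields a positive statistic and a small p-value; the paper's contribution is to replace the permuted raw values with exchangeable spatial modes when region weights differ.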
Abstract:
Decline in gait stability has been associated with increased fall risk in older adults. Reliable and clinically feasible methods of gait instability assessment are needed. This study evaluated the relative and absolute reliability and concurrent validity of the testing procedure of the clinical version of the Narrow Path Walking Test (NPWT) under single task (ST) and dual task (DT) conditions. Thirty independent community-dwelling older adults (65-87 years) were tested twice. Participants were instructed to walk within the 6-m narrow path without stepping out. Trial time, number of steps, trial velocity, number of step errors, and number of cognitive task errors were determined. Intraclass correlation coefficients (ICCs) were calculated as indices of agreement, and a graphic approach called "mountain plot" was applied to help interpret the direction and magnitude of disagreements between testing procedures. Smallest detectable change and smallest real difference (SRD) were computed to determine clinically relevant improvement at group and individual levels, respectively. Concurrent validity was assessed using the Performance Oriented Mobility Assessment Tool (POMA) and the Short Physical Performance Battery (SPPB). Test-retest agreement (ICC(1,2)) varied from 0.77 to 0.92 in ST and from 0.78 to 0.92 in DT conditions, with no apparent systematic differences between testing procedures demonstrated by the mountain plot graphs. Smallest detectable change and smallest real difference were small for motor task performance and larger for cognitive errors. Significant correlations were observed for trial velocity and trial time with POMA and SPPB. The present results indicate that the NPWT testing procedure is highly reliable and reproducible.
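For reference, a test-retest ICC and the smallest-detectable-change band it implies can be computed as below. The formulas are the standard ones (one-way random-effects ICC(1,1) from an ANOVA decomposition, and SDC95 = 1.96 · √2 · SEM) rather than the exact ICC(1,2) variant used in the study, and the trial times are hypothetical, so treat this as an illustrative sketch only.

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC(1,1) for an (n subjects x k trials) array."""
    n, k = data.shape
    grand = data.mean()
    # between-subject and within-subject mean squares
    msb = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)
    msw = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def sdc95(data, icc):
    """Smallest detectable change at 95% confidence:
    SDC95 = 1.96 * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC)."""
    sem = data.std(ddof=1) * np.sqrt(1 - icc)
    return 1.96 * np.sqrt(2) * sem

# hypothetical test-retest trial times (s) for five participants
times = np.array([[10.0, 10.2], [12.1, 11.9], [15.0, 15.3],
                  [9.5, 9.4], [13.2, 13.0]])
icc = icc_oneway(times)
```

With stable between-subject differences and small within-subject noise, the ICC approaches 1 and the SDC band shrinks, which is the pattern the abstract reports for the motor outcomes.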
Abstract:
This paper reviews the role of alluvial soils in vegetated gravelly river braid plains. When considering decadal time scales of river evolution, we argue that it becomes vital to consider soil development as an emergent property of the developing ecosystem. Soil processes have been relatively overlooked in accounts of the interactions between braided river processes and vegetation, although soils have been observed on vegetated fluvial landforms. We hypothesise that soil development plays a major role in the transition (speed and pathway) from a fresh sediment deposit to a vegetated, soil-covered landform. Disturbance (erosion and/or deposition), vertical sediment structure (process history), vegetation succession, biological activity and water table fluctuation are seen as the main controls on early alluvial soil evolution. Erosion and deposition processes may act not only as soil-disturbing agents, but also as suppliers of ecosystem resources, because of their role in delivering and changing access (e.g. through avulsion) to fluxes of water, fine sediments and organic matter. In turn, the associated initial ecosystem may influence further fluvial landform development, for instance through the trapping of fine-grained sediments (e.g. sand) by the engineering action of vegetation and the stabilisation of deposits by the developing above- and belowground biomass. This may create a strong feedback between geomorphological processes, vegetation succession and soil evolution, which we summarise in a conceptual model. We illustrate this model with an example from the Allondon River (CH) and identify the research questions that follow.
Abstract:
So far, cardiac arrest is still associated with high mortality or severe neurological disability in survivors. At the tissue level, cardiac arrest results in an acute condition of generalized hypoxia. A better understanding of the pathophysiology of ischemia-reperfusion and of the inflammatory response that develops after cardiac arrest could help to design novel therapeutic strategies in the future. It seems unlikely that a single drug, acting as a "magic bullet," might be able to improve survival or neurological prognosis. Lessons learned from pathophysiological mechanisms rather indicate that combined therapies, involving thrombolysis, neuroprotective agents, antioxidants and anti-inflammatory molecules, together with temperature cooling, might represent helpful strategies to improve patients' outcomes after cardiac arrest.
Abstract:
The effectiveness of pre-play communication in achieving efficient outcomes has long been a subject of controversy. In some environments, cheap talk may help to achieve coordination. However, Aumann conjectures that, in a variant of the Stag Hunt game, a signal for efficient play is not self-enforcing and concludes that an "agreement to play [the efficient outcome] conveys no information about what the players will do." Harsanyi and Selten (1988) cite this example as an illustration of risk-dominance vs. payoff-dominance. Farrell and Rabin (1996) agree with the logic, but suspect that cheap talk will nonetheless achieve efficiency. The conjecture is tested with one-way communication. When the sender first chooses a signal and then an action, there is impressive coordination: a 94% probability for the potentially efficient (but risky) play, given a signal for efficient play. Without communication, efforts to achieve efficiency were unsuccessful, as the proportion of B moves is only 35%. I also test a hypothesis that the order of the action and the signal affects the results, finding that the decision order is indeed important. While Aumann's conjecture is behaviorally disconfirmed when the signal is determined initially, the signal's credibility seems to be much more suspect when the sender is known to have first chosen an action, and the results are not statistically distinguishable from those when there is no signal. Some applications and issues in communication and coordination are discussed.
Abstract:
This paper argues that low-stakes test scores, available in surveys, may be partially determined by test-taking motivation, which is associated with personality traits but not with cognitive ability. Therefore, such test score distributions may not be informative regarding cognitive ability distributions. Moreover, correlations, found in survey data, between high test scores and economic success may be partially caused by favorable personality traits. To demonstrate these points, I use the coding speed test that was administered without incentives to National Longitudinal Survey of Youth 1979 (NLSY) participants. I suggest that due to its simplicity its scores may especially depend on individuals' test-taking motivation. I show that, controlling for conventional measures of cognitive skills, the coding speed scores are correlated with future earnings of male NLSY participants. Moreover, the coding speed scores of a highly motivated, though less educated, population (potential enlistees to the armed forces) are higher than NLSY participants' scores. I then use controlled experiments to show that when no performance-based incentives are provided, participants' characteristics, but not their cognitive skills, affect effort invested in the coding speed test. Thus, participants with the same ability (measured by their scores on an incentivized test) have significantly different scores on tests without performance-based incentives.
Abstract:
A test chamber (K&L-Chamber) made of cardboard and acrylic plastic, and consisting of four sections (A, B, C and D), was developed by Klowden & Lea (1978) for Aedes aegypti host-seeking behavior studies. Later, Foster & Lutes (1985) used an identical chamber to successfully evaluate the efficacy of electronic repellers. A modified K&L-Chamber for behavioral studies of Ae. aegypti adults is described here. The chamber was made of polystyrene, consisting of three sections (A, B and C) and using a human hand and a fluorescent lamp as stimuli to attract the mosquitoes. The suitability of the present test chamber was validated over 80 replicates, with 10 Ae. aegypti females released in each replicate. The females were released in section A and allowed to fly to section C. A mean of 96.0% (s.e. 0.213) of Ae. aegypti females successfully reached section C. The present test chamber is cheaper and easier to handle than, and as efficient as, the K&L-Chamber: Foster & Lutes (1985) observed 93.8% of Ae. aegypti reaching the trap section.
Abstract:
The oxalate-carbonate pathway (OCP) leads to a potential carbon sink in terrestrial environments. This process is linked to the activity of oxalotrophic bacteria. Although isolation and molecular characterizations are used to study oxalotrophic bacteria, these approaches do not give information on the active oxalotrophs present in soil undergoing the OCP. The aim of this study was to assess the diversity of active oxalotrophic bacteria in soil microcosms using the Bromodeoxyuridine (BrdU) DNA labeling technique. Soil was collected near an oxalogenic tree (Milicia excelsa). Different concentrations of calcium oxalate (0.5%, 1%, and 4% w/w) were added to the soil microcosms and compared with an untreated control. After 12 days of incubation, a maximal pH of 7.7 was measured for microcosms with oxalate (initial pH 6.4). At this time point, a DGGE profile of the frc gene was performed from BrdU-labeled soil DNA and unlabeled soil DNA. Actinobacteria (Streptomyces- and Kribbella-like sequences), Gammaproteobacteria and Betaproteobacteria were found as the main active oxalotrophic bacterial groups. This study highlights the relevance of Actinobacteria as members of the active bacterial community and the identification of novel uncultured oxalotrophic groups (i.e. Kribbella) active in soils.
Abstract:
This paper studies two important reasons why people violate procedure invariance: loss aversion and scale compatibility. The paper extends previous research on loss aversion and scale compatibility by studying the two simultaneously, by looking at a new decision domain, medical decision analysis, and by examining their effect on "well-contemplated preferences." We find significant evidence of both loss aversion and scale compatibility. However, the sizes of the biases due to loss aversion and scale compatibility vary across trade-offs, and most participants do not behave consistently according to loss aversion or scale compatibility. In particular, the effect of loss aversion in medical trade-offs decreases with duration. These findings are encouraging for utility measurement and prescriptive decision analysis. There appear to exist decision contexts in which the effects of loss aversion and scale compatibility can be minimized and utilities can be measured that do not suffer from these distorting factors.
Abstract:
This paper tests the internal consistency of time trade-off utilities. We find significant violations of consistency in the direction predicted by loss aversion. The violations disappear for higher gauge durations. We show that loss aversion can also explain why, for short gauge durations, time trade-off utilities exceed standard gamble utilities. Our results suggest that time trade-off measurements that use relatively short gauge durations, like the widely used EuroQol algorithm (Dolan 1997), are affected by loss aversion and lead to utilities that are too high.
Abstract:
With extremely diverse morphological and edaphoclimatic characteristics, the island of Santo Antão in Cape Verde shows recognized environmental vulnerability, together with a marked shortage of scientific studies that address this reality and provide a basis for an integrated understanding of its phenomena. Digital cartography and geographic information technologies have brought technological advances to the collection, storage and processing of spatial data. Several tools currently available make it possible to model a multiplicity of factors, to locate and quantify phenomena, and to define the contribution of different factors to the final result. The present study, developed within the postgraduate and master's programme in Geographic Information Systems of the Universidade de Trás-os-Montes e Alto Douro, aims to help reduce the information deficit regarding the biophysical characteristics of the island, applying geographic information and remote sensing technologies combined with multivariate statistical analysis. In this context, thematic maps were produced and analysed, and an integrated data-analysis model was developed. Indeed, the multiplicity of spatial variables produced, among them 29 continuously varying variables liable to influence the biophysical characteristics of the region, and the possible occurrence of mutually antagonistic or synergistic effects, make interpretation from the original data relatively complex. To circumvent this problem, a systematic sampling network totalling 921 points (replicates) was used to extract the values of the 29 variables at the sampling points, followed by multivariate statistical analysis, namely principal component analysis.
The application of these techniques made it possible to simplify and interpret the original variables, standardizing them and summarizing the information contained in the diverse, mutually correlated original variables into a set of orthogonal (uncorrelated) variables of decreasing importance, the principal components. The target was set at 75% of the variance of the original data explained by the first three principal components, and an iterative process was carried out in several stages, successively eliminating the least representative variables. In the final stage, the first three PCs explained 74.54% of the variance of the original data, but they later proved insufficient to portray reality. The fourth PC (PC4) was therefore included, raising the explained variance to 84% and representing eight biophysical variables: altitude, drainage density, geological fracture density, precipitation, vegetation index, temperature, water resources and distance to the drainage network. Subsequent interpolation of the first principal component (PC1), with the main variables associated with PC2, PC3 and PC4 as auxiliary variables, using geostatistical techniques in ArcGIS, produced a map representing 84% of the variation of the biophysical characteristics across the territory. Cluster analysis, validated by Student's t-test, allowed the territory to be reclassified into 6 homogeneous biophysical units.
It is concluded that currently available geographic information technologies, besides enabling interactive and flexible analyses in which themes and criteria can be varied, new information integrated, and improvements introduced into models built on the information available in a given context, make it possible, when combined with multivariate statistical analysis and on the basis of scientific criteria, to carry out an integrated analysis of multiple biophysical variables whose mutual correlation otherwise makes an integrated understanding of the phenomena complex.
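The dimension-reduction step described above can be sketched with standard PCA machinery. The data below are random stand-ins for the study's 921 sampling points and 29 standardized variables, so the component count reached at the 75% variance threshold is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
# stand-in for the 921 sampling points x 29 biophysical variables
X = rng.normal(size=(921, 29))
# standardize each variable, as in the study
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
# principal components from the eigendecomposition of the correlation matrix
C = np.cov(Z, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(C))[::-1]
explained = np.cumsum(eigvals) / eigvals.sum()
# smallest number of components explaining at least 75% of the variance
n_pc = int(np.searchsorted(explained, 0.75) + 1)
```

With strongly correlated variables, as in the real dataset, a few components capture most of the variance; with the uncorrelated stand-in data here, many more are needed, which is exactly why PCA is informative about redundancy among the 29 variables.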
Abstract:
This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states, and next these states were embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models. The optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased. This bias can be avoided by using a power QALY model.
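To make the model classes concrete: under a linear QALY model, a TTO indifference "t years in state q is equivalent to x years in full health" gives u(q) = x/t, while under a power model U(q, s) = u(q) · s^r the same indifference gives u(q) = (x/t)^r. A minimal sketch, with r = 0.65 taken from the abstract and the indifference data hypothetical:

```python
def tto_utility(x, t, r=1.0):
    """Utility of chronic state q from the TTO indifference
    (t years in q ~ x years in full health) under U(q, s) = u(q) * s**r.
    r = 1 recovers the linear QALY model; r = 0.65 is the optimal
    power coefficient reported in the abstract."""
    return (x / t) ** r

linear = tto_utility(5.0, 10.0)          # linear model: 0.5
power = tto_utility(5.0, 10.0, r=0.65)   # power model, roughly 0.637
```

For x < t the power-model utility exceeds the linear one, illustrating how the choice of QALY model shifts the utilities inferred from the same TTO answer.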