44 results for soil data requirements


Relevance: 30.00%

Abstract:

We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that, over a sequence of rounds, an adversary either inserts a node with arbitrary connections or deletes an arbitrary node from the network. The network responds to each such change by quick “repairs,” which consist of adding or deleting a small number of edges. These repairs essentially preserve closeness of nodes after adversarial deletions, without increasing node degrees by too much, in the following sense. At any point in the algorithm, nodes v and w whose distance would have been l in the graph formed by considering only the adversarial insertions (not the adversarial deletions), will be at distance at most l log n in the actual graph, where n is the total number of vertices seen so far. Similarly, at any point, a node v whose degree would have been d in the graph with adversarial insertions only, will have degree at most 3d in the actual graph. Our distributed data structure, which we call the Forgiving Graph, has low latency and bandwidth requirements. The Forgiving Graph improves on the Forgiving Tree distributed data structure from Hayes et al. (2008) in the following ways: 1) it ensures low stretch over all pairs of nodes, while the Forgiving Tree only ensures low diameter increase; 2) it handles both node insertions and deletions, while the Forgiving Tree only handles deletions; 3) it requires only a very simple and minimal initialization phase, while the Forgiving Tree initially requires construction of a spanning tree of the network.
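
The guarantees above can be made concrete with a short, hedged check (a sketch only, using the networkx package; the graphs, and the idea of verifying the bounds after the fact, are illustrative assumptions and not part of the Forgiving Graph repair algorithm itself):

```python
# Illustrative check of the Forgiving Graph guarantees (not the repair
# algorithm itself): for every surviving pair of nodes, the distance in the
# actual (repaired) graph should be at most log(n) times the distance in the
# insertion-only graph, and each degree at most 3x the insertion-only degree.
# Both graph inputs here are hypothetical placeholders.
import math
import networkx as nx

def check_guarantees(insert_only: nx.Graph, actual: nx.Graph) -> bool:
    n = insert_only.number_of_nodes()          # vertices seen so far
    stretch_bound = math.log2(n) if n > 1 else 1.0
    surviving = set(actual.nodes())

    # Degree guarantee: deg_actual(v) <= 3 * deg_insert_only(v)
    for v in surviving:
        if actual.degree(v) > 3 * insert_only.degree(v):
            return False

    # Stretch guarantee: dist_actual(v, w) <= dist_insert_only(v, w) * log n
    base = dict(nx.all_pairs_shortest_path_length(insert_only))
    for v in surviving:
        for w, d in nx.single_source_shortest_path_length(actual, v).items():
            if w in base.get(v, {}) and d > base[v][w] * stretch_bound:
                return False
    return True
```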

Relevance: 30.00%

Abstract:

EPB tunnelling requires the application of soil conditioning to extend its field of applicability, particularly to cohesionless soils. Choosing the most suitable conditioning set for the various soils requires a feasible laboratory test that allows the characteristics of the conditioned soils to be defined and provides measurable data. A series of tests has been carried out using a laboratory screw conveyor device, designed for this purpose, which simulates the extraction of the spoil from a pressure chamber in a similar way to EPB tunnelling. The tested soils were medium-grain sands with varying amounts of silt, and the tested conditioned mixtures were obtained with different water contents and amounts of foam. A simple slump test was also used to analyze the global characteristics of the conditioned soils. The tests have shown that the proposed laboratory procedure permits a quantitative comparison to be made between different conditioning amounts and agents on the basis of measurable parameters. © 2007 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that, over a sequence of rounds, an adversary either inserts a node with arbitrary connections or deletes an arbitrary node from the network. The network responds to each such change by quick "repairs," which consist of adding or deleting a small number of edges. These repairs essentially preserve closeness of nodes after adversarial deletions, without increasing node degrees by too much, in the following sense. At any point in the algorithm, nodes v and w whose distance would have been l in the graph formed by considering only the adversarial insertions (not the adversarial deletions), will be at distance at most l log n in the actual graph, where n is the total number of vertices seen so far. Similarly, at any point, a node v whose degree would have been d in the graph with adversarial insertions only, will have degree at most 3d in the actual graph. Our distributed data structure, which we call the Forgiving Graph, has low latency and bandwidth requirements. The Forgiving Graph improves on the Forgiving Tree distributed data structure from Hayes et al. (2008) in the following ways: 1) it ensures low stretch over all pairs of nodes, while the Forgiving Tree only ensures low diameter increase; 2) it handles both node insertions and deletions, while the Forgiving Tree only handles deletions; 3) it requires only a very simple and minimal initialization phase, while the Forgiving Tree initially requires construction of a spanning tree of the network. © Springer-Verlag 2012.

Relevance: 30.00%

Abstract:

Soil carbon stores are a major component of the annual returns required by EU governments to the Intergovernmental Panel on Climate Change. Peat has a high proportion of soil carbon due to the relatively high carbon density of peat and organic-rich soils. For this reason it has become increasingly important to measure and model soil carbon stores and changes in peat stocks to facilitate the management of carbon changes over time. The approach investigated in this research evaluates the use of airborne geophysical (radiometric) data to estimate peat thickness using the attenuation of bedrock geology radioactivity by superficial peat cover. Remotely sensed radiometric data are validated with ground peat depth measurements combined with non-invasive geophysical surveys. Two field-based case studies exemplify and validate the results. Variography and kriging are used to predict peat thickness from point measurements of peat depth and airborne radiometric data, and to provide an estimate of uncertainty in the predictions. Cokriging, by assessing the degree of spatial correlation between recent remotely sensed geophysical monitoring and previous peat depth models, is used to examine changes in peat stocks over time. The significance of the coregionalisation is that the spatial cross-correlation between the remote and ground-based data can be used to update the model of peat depth. The result is that, by integrating remotely sensed data with ground geophysics, the need is reduced for extensive ground-based monitoring and invasive peat depth measurements. The overall goal is to provide robust estimates of peat thickness to improve estimates of carbon stocks. The implications of the research have a broader significance, promoting a reduction in the need for damaging on-site peat thickness measurement and an increase in the use of remotely sensed data for carbon stock estimation.
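
As a rough illustration of the variography/kriging step, the following sketch interpolates hypothetical point measurements of peat depth with ordinary kriging using the pykrige package; the file name, column names and grid spacing are assumptions, and the study's cokriging with airborne radiometric data is not reproduced here.

```python
# Minimal sketch: ordinary kriging of point peat-depth measurements onto a
# regular grid, in the spirit of the variography/kriging step described above.
# The CSV file and column names are hypothetical; cokriging with airborne
# radiometric data requires a model of coregionalisation and is omitted.
import numpy as np
import pandas as pd
from pykrige.ok import OrdinaryKriging

obs = pd.read_csv("peat_depth_points.csv")   # columns: easting, northing, depth_m

ok = OrdinaryKriging(
    obs["easting"].values,
    obs["northing"].values,
    obs["depth_m"].values,
    variogram_model="spherical",             # fitted variogram model
    verbose=False,
    enable_plotting=False,
)

# Prediction grid (100 m spacing over the bounding box of the observations)
gx = np.arange(obs["easting"].min(), obs["easting"].max(), 100.0)
gy = np.arange(obs["northing"].min(), obs["northing"].max(), 100.0)

depth_est, kriging_variance = ok.execute("grid", gx, gy)
# depth_est: predicted peat thickness; kriging_variance: prediction uncertainty
```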

Relevance: 30.00%

Abstract:

The environmental quality of land can be assessed by calculating relevant threshold values, which differentiate between concentrations of elements resulting from geogenic and diffuse anthropogenic sources and concentrations generated by point sources of elements. A simple process allowing the calculation of these typical threshold values (TTVs) was applied across a region of highly complex geology (Northern Ireland) to six elements of interest: arsenic, chromium, copper, lead, nickel and vanadium. Three methods for identifying domains (areas where a readily identifiable factor can be shown to control the concentration of an element) were used: k-means cluster analysis, boxplots and empirical cumulative distribution functions (ECDF). The ECDF method was most efficient at determining areas of both elevated and reduced concentrations and was used to identify domains in this investigation. Two statistical methods for calculating normal background concentrations (NBCs) and upper limits of geochemical baseline variation (ULBLs), currently used in conjunction with legislative regimes in the UK and Finland respectively, were applied within each domain. The NBC methodology was constructed to run within a specific legislative framework, and its use on this soil geochemical data set was influenced by the presence of skewed distributions and outliers. In contrast, the ULBL methodology was found to calculate more appropriate TTVs that were generally more conservative than the NBCs. TTVs indicate what a "typical" concentration of an element would be within a defined geographical area and should be considered alongside the risk that each of the elements poses in these areas to determine potential risk to receptors.
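
A generic, hedged sketch of the ECDF inspection step is given below; it is not the NBC or ULBL calculation itself, and the data set, domain label and use of a 95th percentile are illustrative assumptions only.

```python
# Generic illustration only: compute the empirical cumulative distribution
# function (ECDF) of an element's soil concentrations within one candidate
# domain and read off an upper percentile as a provisional upper value. This
# is NOT the NBC or ULBL methodology referred to above; those follow specific
# statistical procedures. Data and column names are hypothetical.
import numpy as np
import pandas as pd

soils = pd.read_csv("soil_geochemistry.csv")     # columns: domain, As_mg_kg, ...
domain = soils[soils["domain"] == "basalt"]

conc = np.sort(domain["As_mg_kg"].dropna().values)
ecdf = np.arange(1, len(conc) + 1) / len(conc)   # ECDF: P(X <= x)

# Inspect where the ECDF flattens or breaks to help delineate domains, and
# take e.g. the concentration at which the ECDF reaches 0.95 as indicative.
idx95 = np.searchsorted(ecdf, 0.95)
print(f"n = {len(conc)}, indicative upper value = {conc[idx95]:.1f} mg/kg")
```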

Relevance: 30.00%

Abstract:

Understanding the response of humid mid-latitude forests to changes in precipitation, temperature, nutrient cycling, and disturbance is critical to improving our predictive understanding of changes in the surface-subsurface energy balance due to climate change. Mechanistic understanding of the effects of long-term and transient moisture conditions is needed to quantify linkages between changing redox conditions, microbial activity, and soil mineral and nutrient interactions on C cycling and greenhouse gas releases. To illuminate relationships between soil chemistry, microbial communities and organic C, we established transects across hydraulic and topographic gradients in a small watershed with transient moisture conditions. Valley bottoms tend to be more frequently saturated than ridge tops and side slopes, which are generally only saturated when shallow storm-flow zones are active. Fifty shallow (~36 in.) soil cores were collected during timeframes representative of low soil-CO2 winter conditions and high soil-CO2 summer conditions. Cores were subdivided into 240 samples based on pedology, and geochemical (moisture content, metals, pH, Fe species, N, C, CEC, AEC) and microbial (16S rRNA gene amplification with Illumina MiSeq sequencing) analyses were conducted and correlated with watershed terrain and hydrology. To associate microbial metabolic activity with greenhouse gas emissions, we installed 17 soil gas probes, collected gas samples for 16 months and analyzed them for CO2 and other fixed and greenhouse gases. In parallel to the experimental efforts, our data are being used to support hydrobiogeochemical process modeling by coupling the Community Land Model (CLM) with a subsurface process model (PFLOTRAN) to simulate processes and interactions from the molecular to watershed scales. By including above-ground processes (biogeophysics, hydrology, and vegetation dynamics), CLM provides mechanistic water, energy, and organic matter inputs to the surface/subsurface models, in which coupled biogeochemical reaction networks are used to improve the representation of below-ground processes. Preliminary results suggest that inclusion of above-ground processes from CLM greatly improves the prediction of moisture response and the water cycle at the watershed scale.
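
A hedged sketch of the kind of geochemistry-to-gas correlation screening described above might look as follows; the file names, column names and join key are hypothetical and do not reflect the actual analysis pipeline.

```python
# Hypothetical sketch: relate soil-core geochemistry to co-located soil-gas
# CO2 concentrations. File names, column names and the join key are
# illustrative assumptions only.
import pandas as pd

cores = pd.read_csv("soil_core_geochemistry.csv")   # columns: probe_id, pH, Fe_II, C_pct, N_pct, moisture
gas = pd.read_csv("soil_gas_co2.csv")               # columns: probe_id, co2_ppm

merged = cores.merge(gas.groupby("probe_id", as_index=False)["co2_ppm"].mean(),
                     on="probe_id")

# Spearman rank correlation is robust to the skewed distributions typical of
# geochemical data.
corr = merged[["pH", "Fe_II", "C_pct", "N_pct", "moisture", "co2_ppm"]].corr(method="spearman")
print(corr["co2_ppm"].sort_values(ascending=False))
```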

Relevance: 30.00%

Abstract:

Geogenic nickel (Ni), vanadium (V) and chromium (Cr) are present at elevated levels in soils in Northern Ireland. Whilst Ni, V and Cr total soil concentrations share common geological origins, their respective levels of oral bioaccessibility are influenced by different soil-geochemical factors. Oral bioaccessibility extractions were carried out on 145 soil samples overlying 9 different bedrock types to measure the bioaccessible portions of Ni, V and Cr. Principal component analysis identified two components (PC1 and PC2) accounting for 69% of variance across 13 variables from the Northern Ireland Tellus Survey geochemical data. PC1 was associated with underlying basalt bedrock, higher bioaccessible Cr concentrations and lower Ni bioaccessibility. PC2 was associated with regional variance in soil chemistry and hosted factors accounting for higher Ni and V bioaccessibility. Eight per cent of total V was solubilised by gastric extraction on average across the study area. High median proportions of bioaccessible Ni were observed in soils overlying sedimentary rock types. Whilst Cr bioaccessible fractions were low (max = 5.4%), the highest measured bioaccessible Cr concentration reached 10.0 mg kg⁻¹, explained by factors linked to PC1 including high total Cr concentrations in soils overlying basalt bedrock.
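
For illustration, a principal component analysis of this kind can be sketched with scikit-learn as below; the input file and the particular variables are hypothetical stand-ins for the 13 Tellus Survey variables used in the study.

```python
# Illustrative sketch of a PCA on soil geochemistry variables, in the spirit
# of the two-component analysis described above. The input file and variable
# names are hypothetical placeholders.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

geochem = pd.read_csv("tellus_soil_subset.csv")     # hypothetical extract
variables = ["Ni", "V", "Cr", "Fe", "Mn", "Al", "pH", "LOI"]

X = StandardScaler().fit_transform(geochem[variables])
pca = PCA(n_components=2)
scores = pca.fit_transform(X)

print("variance explained:", pca.explained_variance_ratio_)
loadings = pd.DataFrame(pca.components_.T, index=variables, columns=["PC1", "PC2"])
print(loadings)   # inspect which variables load on each component
```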

Relevance: 30.00%

Abstract:

IMPORTANCE Systematic reviews and meta-analyses of individual participant data (IPD) aim to collect, check, and reanalyze individual-level data from all studies addressing a particular research question and are therefore considered a gold standard approach to evidence synthesis. They are likely to be used with increasing frequency as current initiatives to share clinical trial data gain momentum and may be particularly important in reviewing controversial therapeutic areas.

OBJECTIVE To develop PRISMA-IPD as a stand-alone extension to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Statement, tailored to the specific requirements of reporting systematic reviews and meta-analyses of IPD. Although developed primarily for reviews of randomized trials, many items will apply in other contexts, including reviews of diagnosis and prognosis.

DESIGN Development of PRISMA-IPD followed the EQUATOR Network framework guidance and used the existing standard PRISMA Statement as a starting point to draft additional relevant material. A web-based survey informed discussion at an international workshop that included researchers, clinicians, methodologists experienced in conducting systematic reviews and meta-analyses of IPD, and journal editors. The statement was drafted and iterative refinements were made by the project, advisory, and development groups. The PRISMA-IPD Development Group reached agreement on the PRISMA-IPD checklist and flow diagram by consensus.

FINDINGS Compared with standard PRISMA, the PRISMA-IPD checklist includes 3 new items that address (1) methods of checking the integrity of the IPD (such as pattern of randomization, data consistency, baseline imbalance, and missing data), (2) reporting any important issues that emerge, and (3) exploring variation (such as whether certain types of individual benefit more from the intervention than others). A further additional item was created by reorganization of standard PRISMA items relating to interpreting results. Wording was modified in 23 items to reflect the IPD approach.

CONCLUSIONS AND RELEVANCE PRISMA-IPD provides guidelines for reporting systematic reviews and meta-analyses of IPD.

Relevance: 30.00%

Abstract:

Two common scenarios in geoforensics (definition in text) are considered: the provenance, or localization, of unknown samples, and the question of sample variability at scenes of crime/alibi locations. Both have been discussed in forensic and soil science publications, but mostly within a theoretical or non-forensic context. These previous publications provide context for the two case study scenarios (one actual, one based on a range of criminal casework) that consider provenance and variability. A challenging scientific question in geoforensics is the provenance question, ‘where may this sample have come from?’, which the Tellus data can assist in answering. The question of variation between samples may be less of a challenge, yet assessing the variation between a suspect sample and samples from within a scene of crime requires detailed sampling. Variation on a larger (tens to hundreds of kilometres) scale may provide useful intelligence on where a sample came from. To summarise, databases such as Tellus and TellusBorder may be used as effective tools to assist in the search for the origin of displaced soil and sediment.

Relevance: 30.00%

Abstract:

Interaction of organic xenobiotics with soil water-soluble humic material (WSHM) may influence their environmental fate and bioavailability. We utilized bacterial assays (lux-based toxicity and mineralization by Burkholderia sp. RASC) to assess temporal changes in the bioavailability of [14C]-2,4-dichlorophenol (2,4-DCP) in soil water extracts (29.5 μg mL⁻¹ 2,4-DCP; 840.2 μg mL⁻¹ organic carbon). HPLC-determined and bioavailable concentrations were compared. Gel permeation chromatography (GPC) was used to confirm the association of a fraction (>50%) of [14C]-2,4-DCP with WSHM. Subtle differences in parameters describing 2,4-DCP mineralization curves were recorded for different soil-2,4-DCP contact times. Problems regarding the interpretation of mineralization data when assessing the bioavailability of toxic compounds are discussed. The lux bioassay revealed a time-dependent reduction in 2,4-DCP bioavailability: after 7 d, less than 20% was bioavailable. However, GPC showed no quantitative difference in the amount of WSHM-associated 2,4-DCP over this time. These data suggest qualitative changes in the nature of the 2,4-DCP-WSHM association and that associated 2,4-DCP may exert a toxic effect. Although GPC distinguished between free and WSHM-associated 2,4-DCP, it did not resolve the temporal shift in bioavailability revealed by the lux biosensor. These results stress that the risk posed by chemicals must be assessed using appropriate biological assays.

Relevance: 30.00%

Abstract:

OBJECTIVES: Evaluate current data sharing activities of UK publicly funded Clinical Trial Units (CTUs) and identify good practices and barriers.

STUDY DESIGN AND SETTING: Web-based survey of Directors of 45 UK Clinical Research Collaboration (UKCRC)-registered CTUs.

RESULTS: Twenty-three (51%) CTUs responded: five (22%) of these had an established data sharing policy and eight (35%) specifically requested consent to use patient data beyond the scope of the original trial. Fifteen (65%) CTUs had received requests for data, and seven (30%) had made external requests for data in the previous 12 months. CTUs supported the need for increased data sharing activities, although concerns were raised about patient identification, misuse of data, and financial burden. Custodianship of clinical trial data and requirements for a CTU to align its policy with that of its parent institute were also raised. No CTUs supported the use of an open access model for data sharing.

CONCLUSION: There is support within the publicly funded UKCRC-registered CTUs for data sharing, but many perceived barriers remain. CTUs are currently using a variety of approaches and procedures for sharing data. This survey has informed further work, including development of guidance for publicly funded CTUs, to promote good practice and facilitate data sharing.

Relevance: 30.00%

Abstract:

Background English National Quality Requirements mandate out-of-hours primary care services to routinely audit patient experience, but do not state how it should be done.

Objectives We explored how providers collect patient feedback data and use it to inform service provision. We also explored staff views on the utility of out-of-hours questions from the English General Practice Patient Survey (GPPS).

Methods A qualitative study was conducted with 31 staff (comprising service managers, general practitioners and administrators) from 11 out-of-hours primary care providers in England, UK. Staff responsible for patient experience audits within their service were sampled and data collected via face-to-face semistructured interviews.

Results Although most providers regularly audited their patients’ experiences by using patient surveys, many participants expressed a strong preference for additional qualitative feedback. Staff provided examples of small changes to service delivery resulting from patient feedback, but service-wide changes were not instigated. Perceptions that patients lacked sufficient understanding of the urgent care system in which out-of-hours primary care services operate were common and a barrier to using feedback to enable change. Participants recognised the value of using patient experience feedback to benchmark services, but perceived weaknesses in the out-of-hours items from the GPPS led them to question the validity of using these data for benchmarking in their current form.

Conclusions The lack of clarity around how out-of-hours providers should audit patient experience hinders the utility of the National Quality Requirements. Although surveys were common, patient feedback data had only a limited role in service change. Data derived from the GPPS may be used to benchmark service providers, but refinement of the out-of-hours items is needed.

Relevance: 30.00%

Abstract:

In this research, an agent-based model (ABM) was developed to generate human movement routes between homes and water resources in a rural setting, given commonly available geospatial datasets on population distribution, land cover and landscape resources. ABMs are an object-oriented computational approach to modelling a system, focusing on the interactions of autonomous agents and aiming to assess the impact of these agents and their interactions on the system as a whole. An A* pathfinding algorithm was implemented to produce walking routes, given data on the terrain in the area. A* is an extension of Dijkstra's algorithm with enhanced time performance through the use of heuristics. In this example, it was possible to impute daily activity movement patterns to the water resource for all villages in a 75 km long study transect across the Luangwa Valley, Zambia, and the simulated human movements were statistically similar to empirical observations on travel times to the water resource (Chi-squared, 95% confidence interval). This indicates that it is possible to produce realistic data regarding human movements without the costly measurement commonly achieved, for example, through GPS or retrospective or real-time diaries. The approach is transferable between different geographical locations, and the product can be useful in providing an insight into human movement patterns, and therefore has use in many human exposure-related applications, specifically epidemiological research in rural areas, where spatial heterogeneity in the disease landscape, and space-time proximity of individuals, can play a crucial role in disease spread.
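
A minimal grid-based A* sketch is shown below to illustrate the pathfinding step; the study routes agents over real terrain and land-cover cost surfaces, so the grid, costs and Manhattan heuristic here are simplified assumptions.

```python
# Minimal grid-based A* sketch. A* extends Dijkstra's algorithm by adding a
# heuristic estimate of remaining distance to guide the search.
import heapq

def a_star(grid, start, goal):
    """grid: 2D list of traversal costs (None = impassable cell)."""
    def h(cell):                                  # Manhattan-distance heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0.0, start, [start])]  # (f, g, cell, path)
    best_g = {start: 0.0}

    while open_set:
        f, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] is not None:
                ng = g + grid[nr][nc]             # accumulated walking cost
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_set,
                                   (ng + h((nr, nc)), ng, (nr, nc), path + [(nr, nc)]))
    return None                                   # no route found

# Example: route across a small cost grid from a home cell to a water source.
grid = [[1, 1, 1, 1],
        [1, None, 5, 1],
        [1, 1, 1, 1]]
print(a_star(grid, (0, 0), (2, 3)))
```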

Relevance: 30.00%

Abstract:

Conventional practice in Regional Geochemistry includes, as a final step of any geochemical campaign, the generation of a series of maps showing the spatial distribution of each of the components considered. Such maps, though necessary, do not comply with the compositional, relative nature of the data, which unfortunately makes any conclusion based on them sensitive to spurious-correlation problems. This is one of the reasons why these maps are never interpreted in isolation. This contribution aims at gathering a series of statistical methods to produce individual maps of multiplicative combinations of components (logcontrasts), much in the flavor of equilibrium constants, which are designed on purpose to capture certain aspects of the data. We distinguish between supervised and unsupervised methods, where the former require an external, non-compositional variable (besides the compositional geochemical information) available in an analogous training set. This external variable can be a quantity (soil density, collocated magnetics, collocated ratio of Th/U spectral gamma counts, proportion of clay particle fraction, etc.) or a category (rock type, land use type, etc.). In the supervised methods, a regression-like model between the external variable and the geochemical composition is derived in the training set, and then this model is mapped over the whole region. This case is illustrated with the Tellus dataset, covering Northern Ireland at a density of 1 soil sample per 2 square km, where we map the presence of blanket peat and the underlying geology. The unsupervised methods considered include principal components and principal balances (Pawlowsky-Glahn et al., CoDaWork2013), i.e. logcontrasts of the data that are devised to capture very large variability or else be quasi-constant. Using the Tellus dataset again, it is found that geological features are highlighted by the quasi-constant ratios Hf/Nb and their ratio against SiO2; Rb/K2O and Zr/Na2O and the balance between these two groups of two variables; the balance of Al2O3 and TiO2 vs. MgO; or the balance of Cr, Ni and Co vs. V and Fe2O3. The largest variability appears to be related to the presence/absence of peat.
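
As an illustration of a logcontrast (balance) of the kind mentioned above, the following sketch computes an isometric log-ratio balance between two groups of components with numpy/pandas; the input table and the exact grouping are assumptions, not the authors' code.

```python
# Hedged sketch of an isometric log-ratio (ILR) balance between two groups of
# components, as used in principal-balance style logcontrasts. Column names
# and the grouping (Cr, Ni, Co vs. V, Fe2O3) follow the example above; the
# implementation is a generic formula, not the study's code.
import numpy as np
import pandas as pd

def balance(df: pd.DataFrame, numerator: list, denominator: list) -> pd.Series:
    """ILR-style balance: sqrt(rs/(r+s)) * ln(gm(numerator)/gm(denominator))."""
    r, s = len(numerator), len(denominator)
    gm_num = np.exp(np.log(df[numerator]).mean(axis=1))     # geometric means
    gm_den = np.exp(np.log(df[denominator]).mean(axis=1))
    return np.sqrt(r * s / (r + s)) * np.log(gm_num / gm_den)

# Hypothetical usage on a Tellus-like geochemical table (concentrations > 0):
geochem = pd.read_csv("tellus_geochemistry.csv")
geochem["bal_CrNiCo_vs_VFe2O3"] = balance(geochem,
                                          ["Cr", "Ni", "Co"],
                                          ["V", "Fe2O3"])
# The resulting column can then be mapped like any other variable.
```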