982 results for high-resolution methods
Abstract:
A multi-proxy study of a Holocene sediment core (RF 93-30) from the western flank of the central Adriatic, in 77 m of water, reveals a sequence of changes in terrestrial vegetation, terrigenous sediment input and benthic fauna, as well as evidence for variations in sea surface temperature spanning most of the last 7000 yr. The chronology of sedimentation is based on several lines of evidence, including AMS 14C dates of foraminifera extracted from the core, palaeomagnetic secular variation, pollen indicators and dated tephra. The temporal resolution increases towards the surface and, for some of the properties measured, is sub-decadal for the last few centuries. The main changes recorded in vegetation, sedimentation and benthic foraminiferal assemblages appear to be directly related to human activity in the sediment source area, which includes the Po valley and the eastern flanks of the central and northern Apennines. The most striking episodes of deforestation and expanding human impact begin around 3600 BP (Late Bronze Age) and 700 BP (Medieval), and each leads to an acceleration in mass sedimentation and an increase in the proportion of terrigenous material, reflecting the response of surface processes to widespread forest clearance and cultivation. Although human impact appears to be the proximal cause of these changes, climatic effects may also have been important. During these periods, signs of stress are detectable in the benthic foram morphotype assemblages. Between these two periods of increased terrigenous sedimentation there is a smaller peak in sedimentation rate around 2400 BP which is not associated with evidence for deforestation, shifts in the balance between terrigenous and authigenic sedimentation, or changes in benthic foraminifera.
The mineral magnetic record provides a sensitive indicator of changing sediment sources: during forested periods of reduced terrigenous input it is dominated by authigenic bacterial magnetite, whereas during periods of increased erosion, antiferromagnetic minerals (haematite and/or goethite) become more important, as do both paramagnetic minerals and superparamagnetic magnetite. Analysis of the alkenone (U37K′) record provides an indication of possible changes in sea surface temperature during the period, but it is premature to place too much reliance on these inferred changes until the indirect effects of past changes in the depth of the halocline and in circulation have been more fully evaluated. The combination of methods used and the results obtained illustrate the potential value of such high-resolution near-shore marine sedimentary sequences for recording wide-scale human impact, documenting its effects on marine sedimentation and fauna and, potentially, disentangling evidence for human activities from that for past changes in climate.
Abstract:
As low-carbon technologies become more pervasive, distribution network operators are looking to support the expected changes in demand on low voltage networks through smarter control of storage devices. Accurate forecasts of demand at the single-household level, or for small aggregations of households, can improve the peak demand reduction brought about through such devices by helping to plan the appropriate charging and discharging cycles. However, before such methods can be developed, validation measures are required that can assess the accuracy and usefulness of forecasts of volatile and noisy household-level demand. In this paper we introduce a new forecast verification error measure that reduces the so-called “double penalty” effect, incurred by forecasts whose features are displaced in space or time, compared to traditional point-wise metrics such as Mean Absolute Error and p-norms in general. The measure we propose is based on finding a restricted permutation of the original forecast that minimises the point-wise error, according to a given metric. We illustrate the advantages of our error measure using half-hourly domestic household electrical energy usage data recorded by smart meters and discuss the effect of the permutation restriction.
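The restricted-permutation idea can be viewed as a small assignment problem: each forecast value may move at most w time steps, and the minimum point-wise error over all such rearrangements is reported. The sketch below is a hypothetical illustration under that reading (the function name, the MAE metric and the window form are assumptions, not the authors' code):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def adjusted_error(actual, forecast, w=2):
    """Minimum MAE over permutations of the forecast that move each
    value at most w time steps (a sketch of the restricted-permutation
    error measure; not the paper's implementation)."""
    n = len(actual)
    cost = np.abs(actual[:, None] - forecast[None, :])
    # Forbid moving a forecast value more than w steps in time.
    i, j = np.indices((n, n))
    cost[np.abs(i - j) > w] = 1e12
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].mean()
```

A spiky forecast that is one step late is fully forgiven with w=1 but takes the usual double penalty with w=0, which is the behaviour the abstract describes.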
Abstract:
Background: Targeting Induced Local Lesions IN Genomes (TILLING) is increasingly being used to generate and identify mutations in target genes of crop genomes. TILLING populations of several thousand lines have been generated in a number of crop species, including Brassica rapa. Genetic analysis of mutants identified by TILLING requires an efficient, high-throughput and cost-effective genotyping method to track the mutations through numerous generations. High resolution melt (HRM) analysis has been used in a number of systems to identify single nucleotide polymorphisms (SNPs) and insertions/deletions (InDels), enabling the genotyping of different types of samples. HRM is ideally suited to high-throughput genotyping of multiple TILLING mutants in complex crop genomes. To date it has been used to identify mutants and genotype single mutations. The aim of this study was to determine whether HRM can facilitate downstream analysis of multiple mutant lines identified by TILLING, in order to characterise allelic series of EMS-induced mutations in target genes across a number of generations in complex crop genomes. Results: We demonstrate that HRM can be used to genotype allelic series of mutations in two genes, BraA.CAX1.a and BraA.MET1.a, in Brassica rapa. We analysed 12 mutations in BraA.CAX1.a and five in BraA.MET1.a over two generations, including a back-cross to the wild type. Using a commercially available HRM kit and the Lightscanner™ system, we were able to detect mutations in heterozygous and homozygous states for both genes. Conclusions: Using HRM genotyping on TILLING-derived mutants, it is possible to rapidly generate an allelic series of mutations within multiple target genes. Lines suitable for phenotypic analysis can be isolated approximately 8-9 months (3 generations) from receiving M3 seed of Brassica rapa from the RevGenUK TILLING service.
Abstract:
Tests of the new Rossby wave theories that have been developed over the past decade to account for discrepancies between theoretical wave speeds and those observed by satellite altimeters have focused primarily on the surface signature of such waves. It appears, however, that the surface signature of the waves acts only as a rather weak constraint, and that information on the vertical structure of the waves is required to better discriminate between competing theories. Due to the lack of 3-D observations, this paper uses high-resolution model data to construct realistic vertical structures of Rossby waves and compares these to structures predicted by theory. The meridional velocity of a section at 24° S in the Atlantic Ocean is pre-processed using the Radon transform to select the dominant westward signal. Normalized profiles are then constructed using three complementary methods based respectively on: (1) averaging vertical profiles of velocity, (2) diagnosing the amplitude of the Radon transform of the westward propagating signal at different depths, and (3) EOF analysis. These profiles are compared to profiles calculated using four different Rossby wave theories: standard linear theory (SLT), SLT plus mean flow, SLT plus topographic effects, and theory including mean flow and topographic effects. Our results support the classical theoretical assumption that westward propagating signals have a well-defined vertical modal structure associated with a phase speed independent of depth, in contrast with the conclusions of a recent study using the same model but for different locations in the North Atlantic. The model structures are in general surface intensified, with a sign reversal at depth in some regions, notably occurring at shallower depths in the East Atlantic. SLT provides a good fit to the model structures in the top 300 m, but grossly overestimates the sign reversal at depth. 
The addition of mean flow slightly improves the latter issue, but the resulting structure is too surface intensified. SLT plus topography rectifies the overestimation of the sign reversal, but overestimates the amplitude of the structure for much of the layer above the sign reversal. Combining the effects of mean flow and topography provides the best fit for the mean model profiles, although small errors at the surface and mid-depths are carried over from the individual effects of mean flow and topography respectively. Across the section the best-fitting theory varies between SLT plus topography and the combined mean-flow-and-topography theory, with SLT plus topography in general performing better in the east, where the sign reversal is less pronounced. None of the theories could accurately reproduce the deeper sign reversals in the west, and all performed badly at the boundaries. Generalizing this method to other latitudes, oceans, models and baroclinic modes would provide greater insight into variability in the ocean, while better observational data would allow verification of the model findings.
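The Radon-transform pre-processing step is, in essence, a slant stack: trial propagation speeds are applied to a longitude-time section, and the speed that maximises the variance of the aligned average picks out the dominant westward signal. A minimal sketch with integer pixel shifts (the function name and the synthetic field are assumptions, not the paper's data):

```python
import numpy as np

def dominant_speed(field, candidates):
    """Slant-stack (discrete Radon-like) estimate of the dominant
    propagation speed, in pixels per time step, of a longitude-time
    field: the trial speed whose aligned average has maximum variance."""
    nt, nx = field.shape
    best, best_var = None, -1.0
    for s in candidates:
        # Shift each time slice back by the trial speed, then average.
        stack = np.mean([np.roll(field[t], s * t) for t in range(nt)], axis=0)
        v = stack.var()
        if v > best_var:
            best, best_var = s, v
    return best

# Synthetic westward wave moving 2 pixels per time step.
x = np.arange(64)
base = np.cos(2 * np.pi * x / 16)
field = np.array([np.roll(base, -2 * t) for t in range(40)])
```

Misaligned trial speeds smear the wave to near zero in the average, so the variance criterion is sharp; the paper applies the same diagnostic at different depths to recover vertical structure.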
Abstract:
This paper presents a new method to calculate sky view factors (SVFs) from high-resolution urban digital elevation models using a shadow casting algorithm. By utilizing weighted annuli to derive the SVF from hemispherical images, the distant light-source positions can be predefined and uniformly spread over the whole hemisphere, whereas another method applies a random set of light-source positions with a cosine-weighted distribution of sun altitude angles. The 2 methods give similar results based on a large number of SVF images. However, when variations at pixel level are compared between an image generated using the new method presented in this paper and the image from the random method, anisotropic patterns occur. The absolute mean difference between the 2 methods is 0.002, ranging up to 0.040. The maximum difference can be as much as 0.122. Since the SVF is a geometrically derived parameter, the anisotropic errors created by the random method must be considered significant.
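The weighted-annuli computation can be sketched as follows: the hemisphere is split into zenith-angle annuli, each annulus is weighted by its share of the view-factor integral of cos(z)sin(z), and each weight multiplies the visible-sky fraction found in that annulus of the hemispherical image. The equal-angle binning below is an assumption for illustration, not necessarily the paper's exact annuli:

```python
import numpy as np

def sky_view_factor(visible_fraction):
    """SVF from the visible-sky fraction in n equal zenith-angle annuli
    of a hemispherical image (annulus nearest the zenith first). The
    weight of each annulus follows from integrating cos(z)sin(z) over
    it, i.e. sin^2(z_upper) - sin^2(z_lower)."""
    p = np.asarray(visible_fraction, dtype=float)
    n = len(p)
    edges = np.linspace(0.0, np.pi / 2, n + 1)   # zenith-angle bin edges
    weights = np.sin(edges[1:])**2 - np.sin(edges[:-1])**2
    return float(np.sum(weights * p))
```

A fully open hemisphere gives SVF = 1, and blocking the low-elevation annuli costs far less than blocking those near the zenith, which is exactly why the placement of light sources over the hemisphere matters.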
Abstract:
Sea surface temperature (SST) data are often provided as gridded products, typically at resolutions of order 0.05 degrees from satellite observations, to reduce data volume at the request of data users and to facilitate comparison against other products or models. Sampling uncertainty is introduced in gridded products where the full surface area of the ocean within a grid cell cannot be observed because of cloud cover. In this paper we parameterise uncertainties in SST as a function of the percentage of clear-sky pixels available and the SST variability in that subsample. This parameterisation is developed from Advanced Along-Track Scanning Radiometer (AATSR) data, but is applicable to all gridded L3U SST products at resolutions of 0.05-0.1 degrees, irrespective of instrument and retrieval algorithm, provided that instrument noise propagated into the SST is accounted for. Using related methods, we also calculate a sampling uncertainty of ~0.04 K in Global Area Coverage (GAC) Advanced Very High Resolution Radiometer (AVHRR) products.
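One simple way to see how such a parameterisation behaves is the finite-population standard error of the clear-sky subsample: the uncertainty grows as the clear fraction falls and as the within-cell SST variability rises, and vanishes for full coverage. This is only an illustrative stand-in for the paper's empirically fitted function (the names and the formula are assumptions):

```python
import numpy as np

def sampling_uncertainty(cell_ssts, clear_mask):
    """Sampling uncertainty (K) of a grid-cell mean SST when only
    clear-sky pixels are observed: standard error of the subsample
    mean with a finite-population correction. A hypothetical sketch,
    not the paper's parameterisation."""
    cell = np.asarray(cell_ssts, float)
    mask = np.asarray(clear_mask, bool)
    N, n = cell.size, int(mask.sum())
    if n == 0:
        return np.nan                      # fully cloudy cell
    s = cell[mask].std(ddof=1) if n > 1 else 0.0
    return s / np.sqrt(n) * np.sqrt(1.0 - n / N)
```

The two inputs of the paper's parameterisation appear explicitly: n/N is the clear-sky percentage and s is the SST variability of the subsample.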
Abstract:
An all-in-one version of a capacitively coupled contactless conductivity detector is introduced. The absence of moving parts (potentiometers and connectors) makes it compact (6.5 cm³) and robust. A local oscillator, working at 1.1 MHz, was optimized for capillaries with inner diameters from 20 to 100 µm. Low-noise circuitry and a high-resolution analog-to-digital converter (ADC) (21 bits effective) grant good sensitivities for the capillaries and background electrolytes currently used in capillary electrophoresis. The fixed frequency and amplitude of the signal generator is a drawback that is compensated by the steady calibration curves for conductivity. Another advantage is the possibility of determining the inner diameter of a capillary by reading the ADC when air and subsequently water flow through the capillary. The difference in ADC readings may be converted into the inner diameter by a calibration curve. This feature is granted by the 21-bit ADC, which eliminates the need for baseline compensation by hardware. In a typical application, the limits of detection based on the 3σ criterion (without baseline filtering) were 0.6, 0.4, 0.3, 0.5, 0.6, and 0.8 µmol/L for K⁺, Ba²⁺, Ca²⁺, Na⁺, Mg²⁺, and Li⁺, respectively, which is comparable to other high-quality implementations of a capacitively coupled contactless conductivity detector.
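The inner-diameter reading works because the conductance step between the air-filled and water-filled capillary scales with the liquid cross-section, i.e. roughly with d². A hypothetical calibration sketch under that quadratic assumption (the model and every number below are illustrative, not the paper's calibration data):

```python
import numpy as np

def fit_id_calibration(diameters_um, adc_diffs):
    """Fit ADC(water) - ADC(air) against capillary inner diameter,
    assuming the step scales with the liquid cross-section (a*d**2).
    Returns a function mapping an ADC difference to a diameter.
    Hypothetical model; not the paper's calibration procedure."""
    d = np.asarray(diameters_um, float)
    y = np.asarray(adc_diffs, float)
    a = np.sum(y * d**2) / np.sum(d**4)      # least-squares slope for y = a*d^2
    return lambda adc_diff: np.sqrt(adc_diff / a)

# Hypothetical calibration points for 20-100 um capillaries.
estimate_id = fit_id_calibration([20, 50, 75, 100], [32, 200, 450, 800])
```

With a 21-bit effective ADC, the air/water step is resolved finely enough that a single calibration curve of this kind can be reused, which is the feature the abstract highlights.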
Abstract:
A suitable decision on managing a contaminated-site characterization program depends strongly on the diagnosis process. A detailed diagnosis can be made based on the elaboration of a Conceptual Site Model (CSM) using high-resolution site characterization tools. The piezocone (CPTu) test is a high-resolution tool which allows attaching several specific sensors, such as the resistivity probe; this hybrid device is called the resistivity piezocone (RCPTu). A simulated geo-environmental site characterization program was performed on an erosion site using different tools (direct-push soil samplers, hollow stem auger (HSA) drilling and RCPTu tests) to develop the CSM for a site representative of Brazilian conditions. Good agreement was observed between the site profiles interpreted by the different methods. The resistivity sensor attached to the piezocone improved the interpretation, and the on-site decision-making process was significantly better for the CSM elaboration. The RCPTu test data also allowed the hydrogeological heterogeneities to be identified. The present study shows that the RCPTu test is a useful and powerful tool for developing an accurate CSM under Brazilian conditions, especially in an approach that prioritizes high-resolution geo-environmental investigation. © 2013 Taylor & Francis Group.
Abstract:
A bounded upwinding scheme for the numerical solution of hyperbolic conservation laws and the Navier-Stokes equations is presented. The scheme is based on the convection boundedness criterion and total variation diminishing stability criteria, and is developed by employing continuously differentiable functions. The accuracy of the scheme is verified by assessing the error and observed convergence rate on 1-D benchmark test cases. A comparative study between the new scheme and conventional total variation diminishing/convection boundedness criterion-based upwind schemes for solving standard nonlinear hyperbolic conservation laws is also carried out. The scheme is then examined in simulations of Newtonian and non-Newtonian fluid flows of increasing complexity; satisfactory agreement has been observed in terms of the overall behavior. Finally, the scheme is used to study the hydrodynamics of a gas-solid flow in a bubbling fluidized bed. © 2013 John Wiley & Sons, Ltd.
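A generic scheme of this family can be sketched as first-order upwind plus a limited anti-diffusive correction, where the limiter keeps the update total-variation diminishing. The snippet below uses the classical van Leer limiter for 1-D linear advection; it is a conventional TVD example for context, not the paper's new scheme:

```python
import numpy as np

def tvd_step(u, c):
    """One step of upwind + van Leer limited correction for
    u_t + a u_x = 0 (a > 0) with CFL number 0 < c <= 1, periodic BCs.
    A standard TVD/CBC-style scheme, not the paper's new scheme."""
    up1, um1 = np.roll(u, -1), np.roll(u, 1)
    du = u - um1                                  # upwind slope
    dd = up1 - u                                  # downwind slope
    r = np.where(dd != 0, du / np.where(dd == 0, 1, dd), 0.0)
    phi = (r + np.abs(r)) / (1 + np.abs(r))       # van Leer limiter
    flux = u + 0.5 * phi * (1 - c) * dd           # limited face value F_{i+1/2}
    return u - c * (flux - np.roll(flux, 1))
```

Advecting a step profile with this scheme keeps the total variation from growing and keeps the solution bounded, which is the behaviour the boundedness/TVD criteria in the abstract are designed to guarantee.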
Abstract:
A computational pipeline combining texture analysis and pattern classification algorithms was developed for investigating associations between high-resolution MRI features and histological data. This methodology was tested in the study of dentate gyrus images of sclerotic hippocampi resected from refractory epilepsy patients. Images were acquired using a simple surface coil in a 3.0 T MRI scanner. All specimens were subsequently submitted to semiquantitative histological evaluation. The computational pipeline was applied to classify pixels according to: a) dentate gyrus histological parameters and b) patients' febrile or afebrile initial precipitating insult history. The pipeline results for febrile and afebrile patients achieved 70% classification accuracy, with 78% sensitivity and 80% specificity [area under the receiver operating characteristic (ROC) curve: 0.89]. Analysis of the histological data alone did not achieve significant power to separate the febrile and afebrile groups. Interestingly, the results from our approach did not show significant correlation with the histological parameters (which per se were not enough to classify patient groups). These results show the potential of combining computational texture analysis with classification methods for detecting subtle MRI signal differences, sufficient here to provide good clinical classification. The pipeline can also be applied in a wide range of other areas of medical imaging. Magn Reson Med, 2012. (c) 2012 Wiley Periodicals, Inc.
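In spirit, such a pipeline maps each pixel to texture features and feeds them to a classifier. The minimal numpy sketch below (local mean/variance features plus a nearest-centroid classifier) is a hypothetical stand-in for whatever texture and classification algorithms the paper actually used:

```python
import numpy as np

def local_stats(img, w=3):
    """Per-pixel texture features: local mean and variance over a
    w x w window (edge-padded). A minimal stand-in for the paper's
    texture-analysis stage."""
    pad = w // 2
    p = np.pad(img, pad, mode='edge')
    wins = np.lib.stride_tricks.sliding_window_view(p, (w, w))
    feats = np.stack([wins.mean(axis=(-2, -1)), wins.var(axis=(-2, -1))], -1)
    return feats.reshape(-1, 2)          # one feature row per pixel

def nearest_centroid(train_X, train_y, X):
    """Classify each feature row by the nearest class centroid
    (a minimal stand-in for the paper's pattern classifier)."""
    labels = np.unique(train_y)
    cents = np.array([train_X[train_y == k].mean(0) for k in labels])
    d = ((X[:, None, :] - cents[None]) ** 2).sum(-1)
    return labels[d.argmin(1)]
```

Even these crude features separate smooth from textured tissue-like regions, which illustrates why texture statistics can carry signal that raw intensities (or the histology alone, as the abstract notes) do not.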
Abstract:
Determination of chlorine using the molecular absorption of aluminum monochloride (AlCl) at the 261.418 nm wavelength was accomplished by high-resolution continuum source molecular absorption spectrometry using a transversely heated graphite tube furnace with an integrated platform. For the analysis, 10 µL of the sample followed by 10 µL of a solution containing the Al-Ag-Sr modifier (1 g L⁻¹ each) were directly injected onto the platform. A spectral interference due to the use of Al-Ag-Sr as a mixed modifier was easily corrected by the least-squares algorithm present in the spectrometer software. The pyrolysis and vaporization temperatures were 500 °C and 2200 °C, respectively. To evaluate the feasibility of a simple procedure for the determination of chlorine in everyday food samples, two different digestion methods were applied, namely (A) an acid digestion method using HNO3 only at room temperature, and (B) a digestion method with Ag, HNO3 and H2O2, in which chlorine is precipitated as a low-solubility salt (AgCl) that is then dissolved with ammonia solution. The experimental results obtained with method B were in good agreement with the certified values and demonstrated that the proposed method is more accurate than method A, because the formation of silver chloride prevented analyte losses by volatilization. The limit of detection (LOD, 3σ/s) for Cl was 18 µg g⁻¹ for method A and 9 µg g⁻¹ for method B, 1.7 and 3.3 times lower than in published work using inductively coupled plasma optical emission spectrometry; the absolute LODs were 2.4 and 1.2 ng, respectively. (C) 2012 Elsevier B.V. All rights reserved.
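The 3σ/s criterion quoted here is simply three times the standard deviation of blank readings divided by the calibration sensitivity s. A one-line sketch (the blank values in the example are hypothetical):

```python
import numpy as np

def detection_limit(blank_signals, slope):
    """LOD by the 3-sigma criterion: 3 * sd(blank) / calibration slope s.
    blank_signals in absorbance units, slope in absorbance per
    concentration unit, so the LOD comes out in concentration units."""
    return 3 * np.std(blank_signals, ddof=1) / slope
```

For example, blank readings with a standard deviation of 2 signal units and a sensitivity of 1 unit per µg g⁻¹ give an LOD of 6 µg g⁻¹.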
Abstract:
A three-dimensional air pollution model for the short-term simulation of emission, transport and reaction of pollutants is presented. In the finite element simulation of these environmental processes over complex terrain, a mesh generator capable of adapting itself to the topographic characteristics is essential. A local refinement of tetrahedra is used in order to capture the plume rise. Then a wind field is computed using a mass-consistent model, perturbing its vertical component to introduce the plume-rise effect. Finally, an Eulerian convection-diffusion-reaction model is used to simulate the pollutant dispersion…