75 results for Annular Aperture Array
Abstract:
Affymetrix GeneChip® arrays are used widely to study transcriptional changes in response to developmental and environmental stimuli. GeneChip® arrays comprise multiple 25-mer oligonucleotide probes per gene and retain certain advantages over direct sequencing. For plants, there are several public GeneChip® arrays whose probes are localised primarily in 3′ exons. Plant whole-transcript (WT) GeneChip® arrays are not yet publicly available, although WT resolution is needed to study complex crop genomes such as Brassica, which are typified by segmental duplications containing paralogous genes and/or allopolyploidy. Available sequence data were sampled from the Brassica A and C genomes, and 142,997 gene models identified. The assembled gene models were then used to establish a comprehensive public WT exon array for transcriptomics studies. The Affymetrix GeneChip® Brassica Exon 1.0 ST Array is a 5 µm feature size array, containing 2.4 million 25-base oligonucleotide probes representing 135,201 gene models, with 15 probes per gene distributed among exons. Discrimination of the gene models was based on an E-value cut-off of 1E-5, with ≤98% sequence identity. The 135k Brassica Exon Array was validated by quantifying transcriptome differences between leaf and root tissue from a reference Brassica rapa line (R-o-18), and categorisation by Gene Ontologies (GO) based on gene orthology with Arabidopsis thaliana. Technical validation involved comparison of the exon array with a 60-mer array platform using the same starting RNA samples. The 135k Brassica Exon Array is a robust platform. All data relating to the array design and probe identities are available in the public domain and are curated within the BrassEnsembl genome viewer at http://www.brassica.info/BrassEnsembl/index.html.
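As a rough illustration of the discrimination step mentioned above (E-value cut-off of 1E-5, ≤98% sequence identity), the sketch below filters hypothetical BLAST-style hits and flags near-identical gene models as redundant. The gene-model names, hit values and data layout are invented for illustration; the abstract does not describe the actual pipeline at this level of detail.

```python
# Illustrative sketch (not the published pipeline): discriminate gene models
# using an E-value cut-off of 1e-5 and a 98% identity threshold.
# The hit tuples below are hypothetical stand-ins for BLAST-style output.

E_VALUE_CUTOFF = 1e-5
IDENTITY_CUTOFF = 98.0  # percent identity

# (query_model, subject_model, e_value, percent_identity) -- invented examples
hits = [
    ("Bra000001", "Bra000002", 1e-30, 99.2),
    ("Bra000003", "Bra000004", 1e-12, 91.5),
    ("Bra000005", "Bra000006", 0.02,  88.0),
]

redundant = set()
for query, subject, e_value, identity in hits:
    # Only significant alignments are considered at all.
    if e_value > E_VALUE_CUTOFF:
        continue
    # Models sharing more than 98% identity are treated as the same
    # underlying gene, so one member of the pair is flagged as redundant.
    if identity > IDENTITY_CUTOFF:
        redundant.add(subject)

retained = {m for pair in hits for m in pair[:2]} - redundant
print(sorted(retained))
```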
Abstract:
In this paper we present a compliant neural interface designed to record bladder afferent activity. We developed the implant's microfabrication process using multiple layers of silicone rubber and thin metal so that a gold microelectrode array is embedded within four parallel polydimethylsiloxane (PDMS) microchannels (5 mm long, 100 μm wide, 100 μm deep). Electrode impedance at 1 kHz was optimized using a reactive ion etching (RIE) step, which increased the porosity of the electrode surface. The electrodes did not deteriorate after a 3-month immersion in phosphate buffered saline (PBS) at 37 °C. Due to the unique microscopic topography of the metal film on PDMS, the electrodes are extremely compliant and can withstand handling during implantation (twisting and bending) without electrical failure. The device was implanted acutely in anaesthetized rats, and strands of the dorsal branch of roots L6 and S1 were surgically teased and inserted into three microchannels under saline immersion to allow for simultaneous in vivo recordings in an acute setting. We used a tripole electrode configuration to keep background noise low and improve the signal-to-noise ratio. The device could distinguish two types of afferent nerve activity related to increasing bladder filling and contraction. To our knowledge, this is the first report of multichannel recordings of bladder afferent activity.
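The tripole electrode configuration referred to above suppresses common-mode interference by subtracting the average of the two outer contacts from the centre contact. A minimal numerical sketch of that referencing step follows; the signals are synthetic and no detail of the actual recording chain is taken from the paper.

```python
import numpy as np

# Minimal sketch of tripolar referencing: the centre electrode minus the mean
# of the two flanking electrodes. Interference shared by all three contacts
# (e.g. mains pickup) cancels, leaving the local signal. All data synthetic.
rng = np.random.default_rng(0)
n = 10_000
common_mode = 50e-6 * np.sin(2 * np.pi * 50 * np.linspace(0, 1, n))  # shared 50 Hz pickup
neural = 5e-6 * rng.standard_normal(n)                               # local activity (synthetic)

v_left = common_mode + 1e-6 * rng.standard_normal(n)
v_centre = common_mode + neural
v_right = common_mode + 1e-6 * rng.standard_normal(n)

v_tripole = v_centre - 0.5 * (v_left + v_right)
print("residual after referencing (V, rms):",
      np.sqrt(np.mean((v_tripole - neural) ** 2)))
```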
Abstract:
We report on the fabrication and electrical characterization of a novel elastomer-based micro-cuff neural interface. Electrodes are gold (Au) tracks of sub-100 nm thickness and are thermally evaporated on a 0.5 mm thick polydimethylsiloxane (PDMS) substrate. We investigate how electrode area and immersion in phosphate buffered saline (PBS) at 37 °C influence electrode impedance. A microfluidic channel is bonded to the electrode array to form the cuff. In an acute, in vivo, proof-of-principle recording, the device is capable of detecting light stroking and pinching of a hind leg of an anaesthetized rat.
Abstract:
We have fabricated a compliant neural interface to record afferent nerve activity. Stretchable gold electrodes were evaporated on a polydimethylsiloxane (PDMS) substrate and were encapsulated using photo-patternable PDMS. The built-in microstructure of the gold film on PDMS allows the electrodes to twist and flex repeatedly without loss of electrical conductivity. PDMS microchannels (5 mm long, 100 μm wide, 100 μm deep) were then plasma bonded irreversibly on top of the electrode array to define five parallel-conduit implants. The soft gold microelectrodes have a low impedance of ~200 kΩ at 1 kHz. Teased nerves from the L6 dorsal root of an anaesthetized Sprague Dawley rat were threaded through the microchannels. Acute tripolar recordings of cutaneous activity are demonstrated from multiple nerve rootlets simultaneously. Confinement of the axons within narrow microchannels allows for reliable recordings of low-amplitude afferents. This electrode technology promises exciting applications in neuroprosthetic devices, including bladder fullness monitors and peripheral nervous system implants.
Abstract:
Flooding is a particular hazard in urban areas worldwide due to the increased risks to life and property in these regions. Synthetic Aperture Radar (SAR) sensors are often used to image flooding because of their all-weather day-night capability, and now possess sufficient resolution to image urban flooding. The flood extents extracted from the images may be used for flood relief management and improved urban flood inundation modelling. A difficulty with using SAR for urban flood detection is that, due to its side-looking nature, substantial areas of urban ground surface may not be visible to the SAR due to radar layover and shadow caused by buildings and taller vegetation. This paper investigates whether urban flooding can be detected in layover regions (where flooding may not normally be apparent) using double scattering between the (possibly flooded) ground surface and the walls of adjacent buildings. The method estimates double scattering strengths using a SAR image in conjunction with a high resolution LiDAR (Light Detection and Ranging) height map of the urban area. A SAR simulator is applied to the LiDAR data to generate maps of layover and shadow, and estimate the positions of double scattering curves in the SAR image. Observations of double scattering strengths were compared to the predictions from an electromagnetic scattering model, for both the case of a single image containing flooding, and a change detection case in which the flooded image was compared to an un-flooded image of the same area acquired with the same radar parameters. The method proved successful in detecting double scattering due to flooding in the single-image case, for which flooded double scattering curves were detected with 100% classification accuracy (albeit using a small sample set) and un-flooded curves with 91% classification accuracy. The same measures of success were achieved using change detection between flooded and un-flooded images. Depending on the particular flooding situation, the method could lead to improved detection of flooding in urban areas.
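One simple way to picture the classification reported above is to compare the observed double-scattering strength along each predicted curve with model-predicted strengths for flooded and un-flooded ground, assigning the closer label. The sketch below does exactly that with invented numbers; it is a schematic reading of the approach, not the paper's electromagnetic scattering model or classifier.

```python
# Illustrative sketch only: classify each double-scattering curve as flooded
# or un-flooded by comparing its observed strength (dB) with predictions for
# the two ground states. All values are synthetic, not from the study.

def classify_curve(observed_db, predicted_flooded_db, predicted_dry_db):
    """Assign the label whose predicted strength is closer to the observation."""
    if abs(observed_db - predicted_flooded_db) < abs(observed_db - predicted_dry_db):
        return "flooded"
    return "un-flooded"

curves = [
    {"observed": -2.5, "flooded": -3.0, "dry": -10.0},
    {"observed": -9.0, "flooded": -3.0, "dry": -10.0},
]
for c in curves:
    print(classify_curve(c["observed"], c["flooded"], c["dry"]))
```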
Abstract:
Sensible heat fluxes (QH) are determined using scintillometry and eddy covariance over a suburban area. Two large aperture scintillometers provide spatially integrated fluxes across path lengths of 2.8 km and 5.5 km over Swindon, UK. The shorter scintillometer path spans newly built residential areas and has an approximate source area of 2–4 km², whilst the long path extends from the rural outskirts to the town centre and has a source area of around 5–10 km². These large-scale heat fluxes are compared with local-scale eddy covariance measurements. Clear seasonal trends are revealed by the long duration of this dataset and variability in monthly QH is related to the meteorological conditions. At shorter time scales the response of QH to solar radiation often gives rise to close agreement between the measurements, but during times of rapidly changing cloud cover spatial differences in the net radiation (Q*) coincide with greater differences between heat fluxes. For clear days QH lags Q*, thus the ratio of QH to Q* increases throughout the day. In summer the observed energy partitioning is related to the vegetation fraction through use of a footprint model. The results demonstrate the value of scintillometry for integrating surface heterogeneity and offer improved understanding of the influence of anthropogenic materials on surface-atmosphere interactions.
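The lag of QH behind Q* on clear days, noted above, can be estimated for example by cross-correlating the two half-hourly series. The sketch below does this on synthetic data; the real Swindon flux series are not reproduced here.

```python
import numpy as np

# Minimal sketch: estimate the lag of sensible heat flux QH behind net
# radiation Q* by cross-correlating synthetic half-hourly daytime series.
rng = np.random.default_rng(3)
halfhours = np.arange(48)
q_star = np.maximum(0.0, 400 * np.sin(np.pi * (halfhours - 12) / 24))  # synthetic Q* (W m-2)
qh = 0.4 * np.roll(q_star, 2) + 10 * rng.standard_normal(48)           # QH lagging by ~1 h

lags = range(-6, 7)
corrs = [np.corrcoef(np.roll(q_star, lag), qh)[0, 1] for lag in lags]
best = list(lags)[int(np.argmax(corrs))]
print(f"QH lags Q* by about {best * 0.5:.1f} h")
```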
Abstract:
The use of antibiotics in birds and animals intended for human consumption within the European Union (EU) and elsewhere has been subject to regulation prohibiting the use of antimicrobials as growth promoters and the use of last-resort antibiotics, in an attempt to reduce the spread of multi-resistant Gram-negative bacteria. Given the inexorable spread of antibiotic resistance, there is an increasing need for improved monitoring of our food. Using selective media, Gram-negative bacteria were isolated from retail chicken of UK intensively reared (n = 27), Irish intensively reared (n = 19) and UK free-range (n = 30) origin and subjected to an oligonucleotide-based array system for the detection of 47 clinically relevant antibiotic resistance genes (ARGs) and two integrase genes. High incidences of β-lactamase genes were noted in all sample types: acc (67%), cmy (80%), fox (55%) and tem (40%), while chloramphenicol resistance determinants were detected in bacteria from the UK poultry portions and were absent in bacteria from the Irish samples. Denaturing Gradient Gel Electrophoresis (DGGE) was used to qualitatively analyse the Gram-negative population in the samples and showed the expected diversity based on band stabbing and DNA sequencing. The array system proved to be a quick method for the detection of antibiotic resistance gene (ARG) burden within a mixed Gram-negative bacterial population.
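A convenient way to summarise array results of this kind is to tally, for each resistance gene, the fraction of samples in which it was detected. The sketch below shows that tallying step on hypothetical detection calls; the sample identifiers and calls are invented, and only the gene names are taken from the abstract.

```python
from collections import defaultdict

# Hypothetical detection calls per sample: {sample_id: set of ARGs detected}.
# The real data behind the reported incidences (e.g. cmy 80%, acc 67%) are not
# reproduced here; this only illustrates the tallying step.
detections = {
    "UK_intensive_01": {"cmy", "tem", "acc"},
    "UK_intensive_02": {"cmy", "fox"},
    "Irish_intensive_01": {"acc"},
    "UK_free_range_01": {"cmy", "tem"},
}

counts = defaultdict(int)
for genes in detections.values():
    for gene in genes:
        counts[gene] += 1

n_samples = len(detections)
for gene, count in sorted(counts.items()):
    print(f"{gene}: {100 * count / n_samples:.0f}% of samples")
```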
Abstract:
We propose and analyse a hybrid numerical–asymptotic hp boundary element method (BEM) for time-harmonic scattering of an incident plane wave by an arbitrary collinear array of sound-soft two-dimensional screens. Our method uses an approximation space enriched with oscillatory basis functions, chosen to capture the high-frequency asymptotics of the solution. We provide a rigorous frequency-explicit error analysis which proves that the method converges exponentially as the number of degrees of freedom N increases, and that to achieve any desired accuracy it is sufficient to increase N in proportion to the square of the logarithm of the frequency as the frequency increases (standard BEMs require N to increase at least linearly with frequency to retain accuracy). Our numerical results suggest that fixed accuracy can in fact be achieved at arbitrarily high frequencies with a frequency-independent computational cost, when the oscillatory integrals required for implementation are computed using Filon quadrature. We also show how our method can be applied to the complementary ‘breakwater’ problem of propagation through an aperture in an infinite sound-hard screen.
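The headline complexity claim above can be stated compactly: for the hybrid numerical-asymptotic approximation space, the number of degrees of freedom N needed to maintain accuracy grows only with the square of the logarithm of the wavenumber, whereas a conventional BEM needs N to grow at least linearly. The constants in the schematic statement below are illustrative and are not taken from the error analysis.

```latex
% Schematic degree-of-freedom scaling (constants C_1, C_2 illustrative):
\[
  N_{\mathrm{HNA}}(k) \;\sim\; C_1\,\bigl(\log k\bigr)^{2},
  \qquad
  N_{\mathrm{std}}(k) \;\gtrsim\; C_2\, k,
  \qquad k \to \infty .
\]
```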
Abstract:
The automatic transformation of sequential programs for efficient execution on parallel computers involves a number of analyses and restructurings of the input. Some of these analyses are based on computing array sections, a compact description of a range of array elements. Array sections describe the set of array elements that are either read or written by program statements. These sections can be compactly represented using shape descriptors such as regular sections, simple sections, or generalized convex regions. However, binary operations such as Union performed on these representations do not satisfy a straightforward closure property, e.g., if the operands to Union are convex, the result may be nonconvex. Approximations are therefore used to satisfy this closure property. These approximations introduce imprecision in the analyses and, furthermore, the imprecisions resulting from successive operations have a cumulative effect. Delayed merging is a technique suggested and used in some of the existing analyses to minimize the effects of approximation. However, this technique does not guarantee an exact solution in a general setting. This article presents a generalized technique for computing Union precisely, overcoming these imprecisions.
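To make the closure problem concrete, the sketch below models a one-dimensional regular section as a (lower, upper, stride) triplet and shows that the smallest regular section enclosing the union of two such sections can contain spurious elements. This illustrates the imprecision being discussed; it is not the article's generalized exact-Union technique.

```python
from math import gcd

# Illustrative sketch of why Union on regular sections needs approximation.
# A regular section here is a triplet (lower, upper, stride) describing the
# index set {lower, lower+stride, ...} up to upper.

def elements(section):
    lower, upper, stride = section
    return set(range(lower, upper + 1, stride))

def approximate_union(a, b):
    """Smallest regular section enclosing both operands (may over-approximate)."""
    lower = min(a[0], b[0])
    upper = max(a[1], b[1])
    # The enclosing stride must divide both strides and the offset between lowers.
    stride = gcd(gcd(a[2], b[2]), abs(a[0] - b[0])) or 1
    return (lower, upper, stride)

a = (0, 8, 4)    # {0, 4, 8}
b = (1, 9, 4)    # {1, 5, 9}
exact = elements(a) | elements(b)       # {0, 1, 4, 5, 8, 9}
approx = approximate_union(a, b)        # (0, 9, 1) -> {0, 1, ..., 9}
print("exact:", sorted(exact))
print("approx section:", approx, "->", sorted(elements(approx)))
print("spurious elements:", sorted(elements(approx) - exact))
```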
Abstract:
A basic data requirement of a river flood inundation model is a Digital Terrain Model (DTM) of the reach being studied. The scale at which modeling is required determines the accuracy required of the DTM. For modeling floods in urban areas, a high resolution DTM such as that produced by airborne LiDAR (Light Detection And Ranging) is most useful, and large parts of many developed countries have now been mapped using LiDAR. In remoter areas, it is possible to model flooding on a larger scale using a lower resolution DTM, and in the near future the DTM of choice is likely to be that derived from the TanDEM-X Digital Elevation Model (DEM). A variable-resolution global DTM obtained by combining existing high and low resolution data sets would be useful for modeling flood water dynamics globally, at high resolution wherever possible and at lower resolution over larger rivers in remote areas. A further important data resource used in flood modeling is the flood extent, commonly derived from Synthetic Aperture Radar (SAR) images. Flood extents become more useful if they are intersected with the DTM, when water level observations (WLOs) at the flood boundary can be estimated at various points along the river reach. To illustrate the utility of such a global DTM, two examples of recent research involving WLOs at opposite ends of the spatial scale are discussed. The first requires high resolution spatial data, and involves the assimilation of WLOs from a real sequence of high resolution SAR images into a flood model to update the model state with observations over time, and to estimate river discharge and model parameters, including river bathymetry and friction. The results indicate the feasibility of such an Earth Observation-based flood forecasting system. The second example is at a larger scale, and uses SAR-derived WLOs to improve the lower-resolution TanDEM-X DEM in the area covered by the flood extents. The resulting reduction in random height error is significant.
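The intersection of a SAR-derived flood extent with a DTM to obtain water level observations can be sketched very simply: identify the waterline pixels of the flood extent and sample the DTM height at each of them. The arrays below are synthetic stand-ins chosen only to show the mechanics.

```python
import numpy as np

# Hedged sketch: derive water level observations (WLOs) by sampling a DTM at
# the boundary (waterline) pixels of a SAR-derived flood extent. All synthetic.
dtm = np.array([[10.0, 10.2, 10.5, 11.0],
                [ 9.8, 10.0, 10.3, 10.8],
                [ 9.7,  9.9, 10.1, 10.6]])
flood = np.array([[1, 1, 0, 0],
                  [1, 1, 1, 0],
                  [1, 1, 1, 0]], dtype=bool)

# A waterline pixel is a flooded pixel with at least one non-flooded 4-neighbour.
boundary = np.zeros_like(flood)
rows, cols = flood.shape
for r in range(rows):
    for c in range(cols):
        if not flood[r, c]:
            continue
        neighbours = []
        if r > 0:        neighbours.append(flood[r - 1, c])
        if r < rows - 1: neighbours.append(flood[r + 1, c])
        if c > 0:        neighbours.append(flood[r, c - 1])
        if c < cols - 1: neighbours.append(flood[r, c + 1])
        boundary[r, c] = not all(neighbours)

wlo_heights = dtm[boundary]
print("WLOs (m):", wlo_heights)
```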
Abstract:
We assess the roles of long-lived greenhouse gases and ozone depletion in driving meridional surface pressure gradients in the southern extratropics; these gradients are a defining feature of the Southern Annular Mode. Stratospheric ozone depletion is thought to have caused a strengthening of this mode during summer, with increasing long-lived greenhouse gases playing a secondary role. Using a coupled atmosphere-ocean chemistry-climate model, we show that the direct, radiative effect of increasing greenhouse gases is partly cancelled by the also substantial indirect, chemical and dynamical, feedbacks that greenhouse gases have via their impact on ozone. This sensitivity of the mode to greenhouse gas-induced ozone changes suggests that a consistent implementation of ozone changes due to long-lived greenhouse gases in climate models benefits the simulation of this important aspect of Southern Hemisphere climate.
Abstract:
The topography of many floodplains in the developed world has now been surveyed with high resolution sensors such as airborne LiDAR (Light Detection and Ranging), giving accurate Digital Elevation Models (DEMs) that facilitate accurate flood inundation modelling. This is not always the case for remote rivers in developing countries. However, the accuracy of DEMs produced for modelling studies on such rivers should be enhanced in the near future by the high resolution TanDEM-X WorldDEM. In a parallel development, increasing use is now being made of flood extents derived from high resolution Synthetic Aperture Radar (SAR) images for calibrating, validating and assimilating observations into flood inundation models in order to improve these. This paper discusses an additional use of SAR flood extents, namely to improve the accuracy of the TanDEM-X DEM in the floodplain covered by the flood extents, thereby permanently improving this DEM for future flood modelling and other studies. The method is based on the fact that for larger rivers the water elevation generally changes only slowly along a reach, so that the boundary of the flood extent (the waterline) can be regarded locally as a quasi-contour. As a result, heights of adjacent pixels along a small section of waterline can be regarded as samples with a common population mean. The height of the central pixel in the section can be replaced with the average of these heights, leading to a more accurate estimate. While this will result in a reduction in the height errors along a waterline, the waterline is a linear feature in a two-dimensional space. However, improvements to the DEM heights between adjacent pairs of waterlines can also be made, because DEM heights enclosed by the higher waterline of a pair must be no higher than the corrected heights along the higher waterline, whereas DEM heights not enclosed by the lower waterline must in general be no lower than the corrected heights along the lower waterline. In addition, DEM heights between the higher and lower waterlines can also be assigned smaller errors because of the reduced errors on the corrected waterline heights. The method was tested on a section of the TanDEM-X Intermediate DEM (IDEM) covering an 11 km reach of the Warwickshire Avon, England. Flood extents from four COSMO-SkyMed images were available at various stages of a flood in November 2012, and a LiDAR DEM was available for validation. In the area covered by the flood extents, the original IDEM heights had a mean difference from the corresponding LiDAR heights of 0.5 m with a standard deviation of 2.0 m, while the corrected heights had a mean difference of 0.3 m with standard deviation 1.2 m. These figures show that significant reductions in IDEM height bias and error can be made using the method, with the corrected error being only 60% of the original. Even if only a single SAR image obtained near the peak of the flood was used, the corrected error was only 66% of the original. The method should also be capable of improving the final TanDEM-X DEM and other DEMs, and may also be of use with data from the SWOT (Surface Water and Ocean Topography) satellite.
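The core correction step described above, replacing each waterline pixel's height with the mean over a short section of neighbouring waterline pixels, amounts to a moving average along the waterline. A hedged sketch with synthetic heights follows; the window length is a choice of the sketch, and the between-waterline inequality constraints are indicated only as comments.

```python
import numpy as np

# Hedged sketch of the waterline height correction: heights of adjacent pixels
# along a short section of waterline are treated as samples of a common mean,
# so each pixel's height is replaced by the average over a window centred on it.
# Heights are synthetic; window length and edge handling are choices of this sketch.

def correct_waterline(heights, window=5):
    half = window // 2
    corrected = np.empty_like(heights, dtype=float)
    for i in range(len(heights)):
        lo = max(0, i - half)
        hi = min(len(heights), i + half + 1)
        corrected[i] = heights[lo:hi].mean()
    return corrected

waterline_heights = np.array([31.2, 30.1, 33.0, 30.5, 29.8, 31.0, 30.4])
print(correct_waterline(waterline_heights))

# Between-waterline constraints (not implemented here): DEM pixels enclosed by
# the higher waterline should be no higher than its corrected heights, and
# pixels outside the lower waterline should generally be no lower than its
# corrected heights.
```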
Abstract:
Spatial and temporal fluctuations in the concentration field from an ensemble of continuous point-source releases in a regular building array are analyzed from data generated by direct numerical simulations. The release is of a passive scalar under conditions of neutral stability. Results are related to the underlying flow structure by contrasting data for an imposed wind direction of 0 deg and 45 deg relative to the buildings. Furthermore, the effects of distance from the source and vicinity to the plume centreline on the spatial and temporal variability are documented. The general picture that emerges is that this particular geometry splits the flow domain into segments (e.g. “streets” and “intersections”) in each of which the air is, to a first approximation, well mixed. Notable exceptions to this general rule include regions close to the source, near the plume edge, and in unobstructed channels when the flow is aligned. In the oblique (45 deg) case the strongly three-dimensional nature of the flow enhances mixing of a scalar within the canopy leading to reduced temporal and spatial concentration fluctuations within the plume core. These fluctuations are in general larger for the parallel flow (0 deg) case, especially so in the long unobstructed channels. Due to the more complex flow structure in the canyon-type streets behind buildings, fluctuations are lower than in the open channels, though still substantially larger than for oblique flow. These results are relevant to the formulation of simple models for dispersion in urban areas and to the quantification of the uncertainties in their predictions.
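A common measure of the temporal variability discussed above is the fluctuation intensity, the standard deviation of concentration at a receptor divided by its time mean. The short sketch below computes it for a synthetic time series; it is not the post-processing applied to the DNS data.

```python
import numpy as np

# Minimal sketch: temporal concentration fluctuation intensity sigma_c / mean_c
# at a single receptor, computed from a synthetic concentration time series.
rng = np.random.default_rng(1)
c = np.maximum(0.0, 1.0 + 0.4 * rng.standard_normal(5000))  # synthetic concentrations

mean_c = c.mean()
sigma_c = c.std()
print(f"fluctuation intensity sigma_c / mean_c = {sigma_c / mean_c:.2f}")
```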
Abstract:
Lagged correlation analysis is often used to infer intraseasonal dynamical effects but is known to be affected by non-stationarity. We highlight a pronounced quasi-two-year peak in the anomalous zonal wind and eddy momentum flux convergence power spectra in the Southern Hemisphere, which is prima facie evidence for non-stationarity. We then investigate the consequences of this non-stationarity for the Southern Annular Mode and for eddy momentum flux convergence. We argue that positive lagged correlations previously attributed to the existence of an eddy feedback are more plausibly attributed to non-stationary interannual variability external to any potential feedback process in the mid-latitude troposphere. The findings have implications for the diagnosis of feedbacks in both models and re-analysis data as well as for understanding the mechanisms underlying variations in the zonal wind.
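The two diagnostics referred to above, lagged correlation between the zonal wind anomaly and eddy momentum flux convergence and a power spectrum used to identify the quasi-two-year peak, can be sketched as follows on synthetic daily series. The data, the imposed lag and the spectral peak are all artificial.

```python
import numpy as np

# Hedged sketch on synthetic daily series: (1) lagged correlation between a
# zonal wind anomaly u and an eddy momentum flux convergence m; (2) a power
# spectrum of u to look for low-frequency peaks. Nothing here is re-analysis data.
rng = np.random.default_rng(2)
n_days = 8 * 365
t = np.arange(n_days)
interannual = np.sin(2 * np.pi * t / 730.0)                   # artificial ~2-year signal
u = interannual + 0.5 * rng.standard_normal(n_days)           # zonal wind anomaly (synthetic)
m = 0.3 * np.roll(u, 5) + 0.5 * rng.standard_normal(n_days)   # flux convergence (synthetic)

def lagged_corr(x, y, lag):
    """Correlation of x(t) with y(t + lag)."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    if lag < 0:
        return np.corrcoef(x[-lag:], y[:lag])[0, 1]
    return np.corrcoef(x, y)[0, 1]

for lag in (-10, -5, 0, 5, 10):
    print(f"lag {lag:+3d} days: r = {lagged_corr(u, m, lag):+.2f}")

# Power spectrum of u; a peak near a ~730-day period would mimic the
# quasi-two-year variability noted in the paper.
power = np.abs(np.fft.rfft(u - u.mean())) ** 2
freqs = np.fft.rfftfreq(n_days, d=1.0)  # cycles per day
peak_period_days = 1.0 / freqs[1:][np.argmax(power[1:])]
print(f"dominant period is about {peak_period_days:.0f} days")
```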