13 results for Network Resolution

in CentAUR: Central Archive at the University of Reading - UK


Relevance:

30.00%

Publisher:

Abstract:

Flooding is a major hazard in both rural and urban areas worldwide, but it is in urban areas that the impacts are most severe. An investigation of the ability of high resolution TerraSAR-X data to detect flooded regions in urban areas is described. An important application for this would be the calibration and validation of the flood extent predicted by an urban flood inundation model. To date, research on such models has been hampered by lack of suitable distributed validation data. The study uses a 3m resolution TerraSAR-X image of a 1-in-150 year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SETES SAR simulator was used in conjunction with airborne LiDAR data to estimate regions of the TerraSAR-X image in which water would not be visible due to radar shadow or layover caused by buildings and taller vegetation, and these regions were masked out in the flood detection process. A semi-automatic algorithm for the detection of floodwater was developed, based on a hybrid approach. Flooding in rural areas adjacent to the urban areas was detected using an active contour model (snake) region-growing algorithm seeded using the un-flooded river channel network, which was applied to the TerraSAR-X image fused with the LiDAR DTM to ensure the smooth variation of heights along the reach. A simpler region-growing approach was used in the urban areas, which was initialized using knowledge of the flood waterline in the rural areas. Seed pixels having low backscatter were identified in the urban areas using supervised classification based on training areas for water taken from the rural flood, and non-water taken from the higher urban areas. Seed pixels were required to have heights less than a spatially-varying height threshold determined from nearby rural waterline heights. 
Seed pixels were clustered into urban flood regions based on their close proximity, rather than requiring that all pixels in the region should have low backscatter. This approach was taken because it appeared that urban water backscatter values were corrupted in some pixels, perhaps due to contributions from side-lobes of strong reflectors nearby. The TerraSAR-X urban flood extent was validated using the flood extent visible in the aerial photos. Of the urban water pixels visible to TerraSAR-X, 76% were correctly detected, with an associated false positive rate of 25%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 58% and 19% respectively. These findings indicate that TerraSAR-X is capable of providing useful data for the calibration and validation of urban flood inundation models.
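The proximity-based clustering of seed pixels into flood regions might be sketched as follows. This is a minimal illustration only: the function name `cluster_seeds`, the single-linkage region growth, and the distance threshold are assumptions for the sketch, not the authors' implementation.

```python
from collections import deque

def cluster_seeds(seeds, max_dist):
    """Group seed pixels into regions: a pixel within max_dist of any
    pixel already in a region joins that region (single-linkage growth),
    so a region need not be a solid block of low-backscatter pixels."""
    unvisited = set(seeds)
    regions = []
    while unvisited:
        start = unvisited.pop()
        region = [start]
        queue = deque([start])
        while queue:
            r, c = queue.popleft()
            # Collect all still-unassigned seeds near the current pixel.
            near = [p for p in unvisited
                    if (p[0] - r) ** 2 + (p[1] - c) ** 2 <= max_dist ** 2]
            for p in near:
                unvisited.discard(p)
                region.append(p)
                queue.append(p)
        regions.append(region)
    return regions
```

For example, with seeds `[(0, 0), (0, 1), (10, 10)]` and `max_dist=2`, the first two pixels merge into one region while the distant third forms its own.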

Relevance:

30.00%

Publisher:

Abstract:

Driven by new network and middleware technologies such as mobile broadband, near-field communication, and context awareness, the so-called ambient lifestyle will foster innovative use cases in different domains. In the EU project Hydra, high-level security, trust and privacy concerns such as loss of control, profiling and surveillance are considered at the outset. At the end of this project the Hydra middleware development platform will have been designed so as to enable developers to realise secure ambient scenarios. This paper gives a short introduction to the Hydra project and its approach to ensuring security by design. Based on the results of a focus group analysis of the user domain "building automation", typical threats are evaluated and their risks are assessed. Then, specific security requirements with respect to security, privacy, and trust are derived in order to incorporate them into the Hydra Security Meta-Model. How concepts such as context, semantic resolution of security, and virtualisation support the overall Hydra approach is introduced and illustrated on the basis of a technical building automation scenario.

Relevance:

30.00%

Publisher:

Abstract:

The accurate prediction of the biochemical function of a protein is becoming increasingly important, given the unprecedented growth of both structural and sequence databanks. Consequently, computational methods are required to analyse such data in an automated manner to ensure genomes are annotated accurately. Protein structure prediction methods, for example, are capable of generating approximate structural models on a genome-wide scale. However, the detection of functionally important regions in such crude models, as well as structural genomics targets, remains an extremely important problem. The method described in the current study, MetSite, represents a fully automatic approach for the detection of metal-binding residue clusters applicable to protein models of moderate quality. The method involves using sequence profile information in combination with approximate structural data. Several neural network classifiers are shown to be able to distinguish metal sites from non-sites with a mean accuracy of 94.5%. The method was demonstrated to identify metal-binding sites correctly in LiveBench targets where no obvious metal-binding sequence motifs were detectable using InterPro. Accurate detection of metal sites was shown to be feasible for low-resolution predicted structures generated using mGenTHREADER where no side-chain information was available. High-scoring predictions were observed for a recently solved hypothetical protein from Haemophilus influenzae, indicating a putative metal-binding site.
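As a toy illustration of the kind of binary site/non-site classification described above: MetSite uses neural network classifiers on sequence-profile and structural features, whereas the sketch below substitutes a simple perceptron on two hypothetical features, with all names, features and values being stand-ins rather than the paper's method.

```python
def train_perceptron(X, y, epochs=100, lr=0.1):
    """Train a perceptron (weights plus bias) to separate site (1)
    from non-site (0) feature vectors; converges for separable data."""
    w = [0.0] * (len(X[0]) + 1)  # w[0] is the bias term
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = yi - (1 if z > 0 else 0)
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def predict(w, xi):
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 if z > 0 else 0
```

With hypothetical (conservation, burial) features in which metal sites score high on both, the trained classifier separates the two classes on this toy data.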

Relevance:

30.00%

Publisher:

Abstract:

World-wide structural genomics initiatives are rapidly accumulating structures for which limited functional information is available. Additionally, state-of-the-art structural prediction programs are now capable of generating at least low resolution structural models of target proteins. Accurate detection and classification of functional sites within both solved and modelled protein structures therefore represents an important challenge. We present a fully automatic site detection method, FuncSite, that uses neural network classifiers to predict the location and type of functionally important sites in protein structures. The method is designed primarily to require only backbone residue positions without the need for specific side-chain atoms to be present. In order to highlight effective site detection in low resolution structural models, FuncSite was used to screen model proteins generated using mGenTHREADER on a set of newly released structures. We found effective metal site detection even for moderate quality protein models, illustrating the robustness of the method.

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces new insights into the hydrochemical functioning of lowland river systems using field-based spectrophotometric and electrode technologies. The streamwater concentrations of nitrogen species and phosphorus fractions were measured at hourly intervals on a continuous basis at two contrasting sites on tributaries of the River Thames – one draining a rural catchment, the River Enborne, and one draining a more urban system, The Cut. The measurements complement those from an existing network of multi-parameter water quality sondes maintained across the Thames catchment and weekly monitoring based on grab samples. The results of the sub-daily monitoring show that streamwater phosphorus concentrations display highly complex dynamics under storm conditions dependent on the antecedent catchment wetness, and that diurnal phosphorus and nitrogen cycles occur under low flow conditions. The diurnal patterns highlight the dominance of sewage inputs in controlling the streamwater phosphorus and nitrogen concentrations at low flows, even at a distance of 7 km from the nearest sewage treatment works in the rural River Enborne. The time of sample collection is important when judging water quality against ecological thresholds or standards. An exhaustion of the supply of phosphorus from diffuse and multiple septic tank sources during storm events was evident and load estimation was not improved by sub-daily monitoring beyond that achieved by daily sampling because of the eventual reduction in the phosphorus mass entering the stream during events. The results highlight the utility of sub-daily water quality measurements and the discussion considers the practicalities and challenges of in situ, sub-daily monitoring.

Relevance:

30.00%

Publisher:

Abstract:

The ground-based Atmospheric Radiation Measurement Program (ARM) and NASA Aerosol Robotic Network (AERONET) routinely monitor clouds using zenith radiances at visible and near-infrared wavelengths. Using the transmittance calculated from such measurements, we have developed a new retrieval method for cloud effective droplet size and conducted extensive tests for non-precipitating liquid water clouds. The underlying principle is to combine a liquid-water-absorbing wavelength (i.e., 1640 nm) with a non-water-absorbing wavelength for acquiring information on cloud droplet size and optical depth. For simulated stratocumulus clouds with liquid water path less than 300 g m⁻² and horizontal resolution of 201 m, the retrieval method underestimates the mean effective radius by 0.8 μm, with a root-mean-squared error of 1.7 μm and a relative deviation of 13%. For actual observations with a liquid water path less than 450 g m⁻² at the ARM Oklahoma site during 2007–2008, our 1.5-min-averaged retrievals are generally larger by around 1 μm than those from combined ground-based cloud radar and microwave radiometer at a 5-min temporal resolution. We also compared our retrievals to those from combined shortwave flux and microwave observations for relatively homogeneous clouds, showing that the bias between these two retrieval sets is negligible, but the error of 2.6 μm and the relative deviation of 22% are larger than those found in our simulation case. Finally, the transmittance-based cloud effective droplet radii agree to better than 11% with satellite observations and have a negative bias of 1 μm. Overall, the retrieval method provides reasonable cloud effective radius estimates, which can enhance the cloud products of both ARM and AERONET.
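The two-wavelength principle can be sketched as a lookup-table inversion: simulate both channels over a grid of (optical depth, effective radius) pairs and pick the grid point that best matches the observed transmittances. The forward model below is a deliberately toy stand-in, not a radiative transfer code; its function names and coefficients are assumptions for illustration only.

```python
def forward(tau, r_eff):
    """Toy two-channel forward model (illustrative only): the
    non-absorbing channel transmittance depends on optical depth,
    while the 1640 nm channel adds droplet-size-dependent absorption."""
    t_vis = 1.0 / (1.0 + 0.5 * tau)
    t_swir = t_vis * (1.0 - 0.01 * r_eff)
    return t_vis, t_swir

def retrieve(t_vis_obs, t_swir_obs, taus, radii):
    """Brute-force lookup-table inversion: return the (tau, r_eff)
    grid point whose simulated transmittances best match the
    observation in a least-squares sense."""
    return min(((tau, r) for tau in taus for r in radii),
               key=lambda p: (forward(*p)[0] - t_vis_obs) ** 2
                           + (forward(*p)[1] - t_swir_obs) ** 2)
```

Feeding the forward model's own output back through `retrieve` recovers the original (tau, r_eff) pair exactly, which is the basic consistency check for such an inversion.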

Relevance:

30.00%

Publisher:

Abstract:

As low carbon technologies become more pervasive, distribution network operators are looking to support the expected changes in the demands on the low voltage networks through the smarter control of storage devices. Accurate forecasts of demand at the single household level, or of small aggregations of households, can improve the peak demand reduction brought about through such devices by helping to plan the appropriate charging and discharging cycles. However, before such methods can be developed, validation measures are required which can assess the accuracy and usefulness of forecasts of volatile and noisy household-level demand. In this paper we introduce a new forecast verification error measure that reduces the so-called “double penalty” effect, incurred by forecasts whose features are displaced in space or time, compared to traditional point-wise metrics such as Mean Absolute Error and p-norms in general. The measure that we propose is based on finding a restricted permutation of the original forecast that minimises the point-wise error, according to a given metric. We illustrate the advantages of our error measure using half-hourly domestic household electrical energy usage data recorded by smart meters and discuss the effect of the permutation restriction.
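The idea of minimising the point-wise error over a restricted permutation of the forecast might be sketched as follows. This is a brute-force illustration only: the displacement window `w`, the mean-absolute-error metric, and the exhaustive search are assumed choices, not the paper's actual algorithm.

```python
from itertools import permutations

def adjusted_error(forecast, actual, w):
    """Minimum mean absolute error over all permutations of the
    forecast in which no value moves more than w time steps.
    Brute force, so only practical for very short series."""
    n = len(forecast)
    best = float("inf")
    for perm in permutations(range(n)):
        # Keep only permutations respecting the displacement window.
        if all(abs(p - i) <= w for i, p in enumerate(perm)):
            err = sum(abs(forecast[p] - actual[i])
                      for i, p in enumerate(perm)) / n
            best = min(best, err)
    return best
```

A forecast `[0, 5, 0, 0]` against actual `[5, 0, 0, 0]` has a point-wise MAE of 2.5 (the peak is penalised twice), but with a one-step window the peak is realigned and the adjusted error drops to 0, which is exactly the double-penalty reduction described.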

Relevance:

30.00%

Publisher:

Abstract:

In our previous work we developed a successful protocol to pattern the human hNT neuron (derived from the human teratocarcinoma cell line (hNT)) on parylene-C/SiO2 substrates. This communication reports how we have successfully managed to pattern the supportive cell to the neuron, the hNT astrocyte, on such substrates. Here we disseminate the nanofabrication, cell differentiation and cell culturing protocols necessary to successfully pattern the first human hNT astrocytes to single-cell resolution on parylene-C/SiO2 substrates. This is performed for varying parylene strip widths, providing excellent contrast to the SiO2 substrate and elegant single-cell isolation at 10 μm strip widths. The breakthrough in patterning human cells on a silicon chip has widespread implications and is valuable as a platform technology, as it enables a detailed study of the human brain at the cellular and network level.

Relevance:

30.00%

Publisher:

Abstract:

We perform simulations of several convective events over the southern UK with the Met Office Unified Model (UM) at horizontal grid lengths ranging from 1.5 km to 200 m. Comparing the simulated storms on these days with the Met Office rainfall radar network allows us to apply a statistical approach to evaluate the properties and evolution of the simulated storms over a range of conditions. Here we present results comparing the storm morphology in the model and reality which show that the simulated storms become smaller as grid length decreases and that the grid length that fits the observations best changes with the size of the observed cells. We investigate the sensitivity of storm morphology in the model to the mixing length used in the subgrid turbulence scheme. As the subgrid mixing length is decreased, the number of small storms with high area-averaged rain rates increases. We show that by changing the mixing length we can produce a lower resolution simulation that produces similar morphologies to a higher resolution simulation.

Relevance:

30.00%

Publisher:

Abstract:

Data assimilation (DA) systems are evolving to meet the demands of convection-permitting models in the field of weather forecasting. On 19 April 2013 a special interest group meeting of the Royal Meteorological Society brought together UK researchers looking at different aspects of the data assimilation problem at high resolution, from theory to applications, and researchers creating our future high resolution observational networks. The meeting was chaired by Dr Sarah Dance of the University of Reading and Dr Cristina Charlton-Perez from the MetOffice@Reading. The purpose of the meeting was to help define the current state of high resolution data assimilation in the UK. The workshop assembled three main types of scientists: observational network specialists, operational numerical weather prediction researchers and those developing the fundamental mathematical theory behind data assimilation and the underlying models. These three working areas are intrinsically linked; therefore, a holistic view must be taken when discussing the potential to make advances in high resolution data assimilation.

Relevance:

30.00%

Publisher:

Abstract:

A new frontier in weather forecasting is emerging as operational forecast models are now run at convection-permitting resolutions at many national weather services. However, this is not a panacea; significant systematic errors remain in the character of convective storms and rainfall distributions. The DYMECS project (Dynamical and Microphysical Evolution of Convective Storms) is taking a fundamentally new approach to evaluate and improve such models: rather than relying on a limited number of cases, which may not be representative, we have gathered a large database of 3D storm structures on 40 convective days using the Chilbolton radar in southern England. We have related these structures to storm life-cycles derived by tracking features in the rainfall from the UK radar network, and compared them statistically to storm structures in the Met Office model, which we ran at horizontal grid lengths between 1.5 km and 100 m, including simulations with different subgrid mixing lengths. We also evaluated the scale and intensity of convective updrafts using a new radar technique. We find that the horizontal size of simulated convective storms and the updrafts within them is much too large at 1.5-km resolution, such that the convective mass flux of individual updrafts can be too large by an order of magnitude. The scale of precipitation cores and updrafts decreases steadily with decreasing grid lengths, as does the typical storm lifetime. The 200-m grid-length simulation with standard mixing length performs best over all diagnostics, although a greater mixing length improves the representation of deep convective storms.

Relevance:

30.00%

Publisher:

Abstract:

The urban heat island is a well-known phenomenon that impacts a wide variety of city operations. With greater availability of cheap meteorological sensors, it is possible to measure the spatial patterns of urban atmospheric characteristics with greater resolution. To develop robust and resilient networks, recognizing that sensors may malfunction, it is important to know when measurement points are providing additional information and also the minimum number of sensors needed to provide spatial information for particular applications. Here we consider the example of temperature data, and the urban heat island, through analysis of a network of sensors in the Tokyo metropolitan area (Extended METROS). The effect of reducing observation points from an existing meteorological measurement network is considered, using random sampling and sampling with clustering. The results indicated that sampling with hierarchical clustering can yield similar temperature patterns with up to a 30% reduction in measurement sites in Tokyo. The methods presented have broader utility in evaluating the robustness and resilience of existing urban temperature networks and in how networks can be enhanced by new mobile and open data sources.
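The clustering-based thinning of a sensor network might be sketched as follows: agglomerate sensors whose temperature records are similar and keep one representative per cluster. This is a minimal single-linkage agglomerative clustering in Python; the Euclidean distance on time series and the stopping criterion are assumptions for the sketch, not the Extended METROS methodology.

```python
def agglomerate(series, k):
    """Merge the two closest clusters (single linkage, Euclidean
    distance between sensor time series) until k clusters remain.
    Returns clusters as lists of sensor indices."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    clusters = [[i] for i in range(len(series))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Single linkage: distance between the closest members.
                d = min(dist(series[a], series[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters
```

Retaining one sensor per cluster then gives a reduced network that preserves the distinct temperature regimes; with two warm and two cool hypothetical sensors, the two regimes fall into separate clusters.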

Relevance:

30.00%

Publisher:

Abstract:

A new generation of high-resolution (1 km) forecast models promises to revolutionize the prediction of hazardous weather such as windstorms, flash floods, and poor air quality. To realize this promise, a dense observing network, focusing on the lower few kilometers of the atmosphere, is required to verify these new forecast models with the ultimate goal of assimilating the data. At present there are insufficient systematic observations of the vertical profiles of water vapor, temperature, wind, and aerosols; a major constraint is the absence of funding to install new networks. A recent research program financed by the European Union, tasked with addressing this lack of observations, demonstrated that the assimilation of observations from an existing wind profiler network reduces forecast errors, provided that the individual instruments are strategically located and properly maintained. Additionally, it identified three further existing European networks of instruments that are currently underexploited, but with minimal expense they could deliver quality-controlled data to national weather services in near-real time, so the data could be assimilated into forecast models. Specifically, 1) several hundred automatic lidars and ceilometers can provide backscatter profiles associated with aerosol and cloud properties and structures with 30-m vertical resolution every minute; 2) more than 20 Doppler lidars, a fairly new technology, can measure vertical and horizontal winds in the lower atmosphere with a vertical resolution of 30 m every 5 min; and 3) about 30 microwave profilers can estimate profiles of temperature and humidity in the lower few kilometers every 10 min. Examples of potential benefits from these instruments are presented.