46 results for Progress.

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

Determining how El Niño and its impacts may change over the next 10 to 100 years remains a difficult scientific challenge. Ocean–atmosphere coupled general circulation models (CGCMs) are routinely used both to analyze El Niño mechanisms and teleconnections and to predict its evolution on a broad range of time scales, from seasonal to centennial. The ability to simulate El Niño as an emergent property of these models has largely improved over the last few years. Nevertheless, the diversity of model simulations of present-day El Niño indicates current limitations in our ability to model this climate phenomenon and to anticipate changes in its characteristics. A review of the several factors that contribute to this diversity, as well as potential means to improve the simulation of El Niño, is presented.

Relevance:

20.00%

Publisher:

Abstract:

Pressing global environmental problems highlight the need to develop tools to measure progress towards "sustainability." However, some argue that any such attempt inevitably reflects the views of those creating such tools and only produces highly contested notions of "reality." To explore this tension, we critically assess the Environmental Sustainability Index (ESI), a well-publicized product of the World Economic Forum that is designed to measure 'sustainability' by ranking nations on league tables based on extensive databases of environmental indicators. By recreating this index, and then using statistical tools (principal components analysis) to test relations between various components of the index, we challenge the ways in which countries are ranked in the ESI. Based on this analysis, we suggest (1) that the approach taken to aggregate, interpret and present the ESI creates a misleading impression that Western countries are more sustainable than the developing world; (2) that unaccounted methodological biases allowed the authors of the ESI to over-generalize the relative 'sustainability' of different countries; and (3) that this has resulted in simplistic conclusions on the relation between economic growth and environmental sustainability. This criticism should not be interpreted as a call for the abandonment of efforts to create standardized comparable data. Instead, this paper proposes that indicator selection and data collection should draw on a range of voices, including local stakeholders as well as international experts. We also propose that aggregating data into final league ranking tables is too prone to error and creates the illusion of absolute and categorical interpretations. (c) 2004 Elsevier Ltd. All rights reserved.
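As an illustration of the statistical check described above, the following Python sketch applies principal components analysis to a matrix of environmental indicators to see how much of the variance a single aggregate score could plausibly capture. The country count, indicator values and loadings are hypothetical, not data from the ESI.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_countries = 150

# Hypothetical indicators: partly driven by one shared latent factor
# (e.g. national income), partly by independent variation.
latent = rng.normal(size=(n_countries, 1))
loadings = np.array([[0.8, 0.6, 0.1, 0.0]])
indicators = latent @ loadings + 0.5 * rng.normal(size=(n_countries, 4))

# Standardise so that indicators measured on different scales contribute equally.
scaled = StandardScaler().fit_transform(indicators)

pca = PCA()
pca.fit(scaled)

# If the first component explained nearly all the variance, a single ranked
# index would be easier to defend; variance spread across several components
# suggests that collapsing the indicators into one league table hides structure.
print("Explained variance ratios:", pca.explained_variance_ratio_)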

Relevance:

20.00%

Publisher:

Abstract:

Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5 m or less are possible, with a height accuracy of 0.15 m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art is the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data into a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc.

Most vegetation removal software ignores short vegetation less than, say, 1 m high, yet typically most of a floodplain may be covered in such vegetation. We have attempted to extend vegetation height measurement to short vegetation using local height texture. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying; this obviates the need to calibrate a global floodplain friction coefficient (see the sketch following this abstract). It is not yet clear whether the method is useful, but it is worth testing further.

The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. We are attempting to use digital map data (Mastermap structured topography data) to help distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed, as will the related problem of how best to merge historic river cross-section data with a LiDAR DTM.

LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that, for example, hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc. as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes; however, the mesh generated may be useful in allowing a high-resolution FE model to act as a benchmark for a more practical lower-resolution model.

A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: for a 5 m wide embankment within a raster grid model with a 15 m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment, but how could a 5 m wide ditch be represented? This redundancy has also been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
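As a concrete illustration of the spatially varying friction idea above, the following Python sketch maps a per-cell vegetation-height grid (assumed to be already derived from the LiDAR DSM and DTM) to Manning's n friction values. The height classes and friction values are illustrative assumptions, not calibrated figures from this work.

import numpy as np

def friction_from_vegetation_height(veg_height):
    """Map per-cell vegetation height (m) to an illustrative Manning's n value."""
    n = np.empty_like(veg_height, dtype=float)
    n[veg_height < 0.1] = 0.03                           # bare ground / very short grass
    n[(veg_height >= 0.1) & (veg_height < 1.0)] = 0.05   # short vegetation
    n[(veg_height >= 1.0) & (veg_height < 5.0)] = 0.10   # shrubs and hedges
    n[veg_height >= 5.0] = 0.15                          # trees
    return n

# Hypothetical 3 x 3 vegetation-height grid (metres), e.g. DSM minus DTM
# supplemented by a local height-texture estimate for short vegetation.
veg = np.array([[0.05, 0.40, 2.0],
                [0.30, 1.50, 8.0],
                [0.00, 0.20, 0.9]])
print(friction_from_vegetation_height(veg))

Because every cell receives its own friction value, the flood model no longer needs a single calibrated floodplain friction coefficient; the trade-off is that the height-to-friction mapping itself must be justified.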

Relevance:

20.00%

Publisher:

Abstract:

The task of assessing the likelihood and extent of coastal flooding is hampered by the lack of detailed information on near-shore bathymetry. This is required as an input for coastal inundation models, and in some cases the variability in the bathymetry can affect the prediction of the areas likely to be flooded in a storm. The constant monitoring and data collection that would be required to characterise the near-shore bathymetry over large coastal areas is impractical, leaving the option of running morphodynamic models to predict the likely bathymetry at any given time. However, if the models are inaccurate, the errors may be significant when incorrect bathymetry is used to predict possible flood risks. This project is assessing the use of data assimilation techniques to improve the predictions from a simple model by rigorously incorporating observations of the bathymetry into the model, bringing the model closer to the actual situation. Currently we are concentrating on Morecambe Bay as a primary study site, as it has a highly dynamic inter-tidal zone, where changes in the course of channels affect the likely locations of flooding from storms. We are working with SAR images, LiDAR, and swath bathymetry to provide observations over a 2.5-year period running from May 2003 to November 2005. We have a LiDAR image of the entire inter-tidal zone for November 2005 to use as validation data. We have implemented a 3D-Var data assimilation scheme to investigate its improvement in performance over the previous scheme, which was based on the optimal interpolation method. We are currently evaluating these different data assimilation techniques using 22 SAR observations. We will also include the LiDAR data and swath bathymetry to improve the observational coverage, and investigate the impact of different types of observation on the predictive ability of the model. We are also assessing the ability of the data assimilation scheme to recover the correct bathymetry after storm events, which can dramatically change the bathymetry in a short period of time.
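As an illustration of the 3D-Var idea mentioned above, the following Python sketch minimises the standard variational cost function on a tiny, hypothetical one-dimensional bathymetry grid. The background state, observation operator and error covariances are assumptions made for illustration and do not represent the Morecambe Bay configuration.

import numpy as np
from scipy.optimize import minimize

n = 5                                          # tiny 1-D bathymetry "grid"
xb = np.array([2.0, 2.1, 2.5, 3.0, 3.2])       # background (prior model) depths, m
H = np.eye(n)[[1, 3]]                          # observe grid points 1 and 3 directly
y = np.array([2.4, 2.7])                       # e.g. depths inferred from SAR waterlines, m
B = 0.5 * np.eye(n)                            # background error covariance (assumed)
R = 0.1 * np.eye(len(y))                       # observation error covariance (assumed)

B_inv, R_inv = np.linalg.inv(B), np.linalg.inv(R)

def cost(x):
    """3D-Var cost: J(x) = 1/2 (x-xb)^T B^-1 (x-xb) + 1/2 (y-Hx)^T R^-1 (y-Hx)."""
    d_b = x - xb
    d_o = y - H @ x
    return 0.5 * d_b @ B_inv @ d_b + 0.5 * d_o @ R_inv @ d_o

analysis = minimize(cost, xb).x                # analysis (updated) bathymetry
print("Background:", xb)
print("Analysis:  ", analysis)

The analysis pulls the observed grid points towards the observation values while leaving unobserved points close to the background, with the relative sizes of B and R controlling how much weight the observations receive.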

Relevance:

20.00%

Publisher:

Abstract:

The alphaviruses were amongst the first arboviruses to be isolated, characterized and assigned a taxonomic status. They are globally very widespread, infecting a large variety of terrestrial animals, insects and even fish, and circulate in both the sylvatic and the urban/peri-urban environment, causing considerable human morbidity and mortality. Nevertheless, despite their obvious importance as pathogens, there are currently no effective antiviral drugs with which to treat humans or animals infected by any of these viruses. The EU-supported project VIZIER (Comparative Structural Genomics of Viral Enzymes Involved in Replication, FP6 Project: 2004-511960) was instigated with an ultimate view of contributing to the development of antiviral therapies for RNA viruses, including the alphaviruses [Coutard, B., Gorbalenya, A.E., Snijder, E.J., Leontovich, A.M., Poupon, A., De Lamballerie, X., Charrel, R., Gould, E.A., Gunther, S., Norder, H., Klempa, B., Bourhy, H., Rohayem, J., L'hermite, E., Nordlund, P., Stuart, D.I., Owens, R.J., Grimes, J.M., Tucker, P.A., Bolognesi, M., Mattevi, A., Coll, M., Jones, T.A., Åqvist, J., Unger, T., Hilgenfeld, R., Bricogne, G., Neyts, J., La Colla, P., Puerstinger, G., Gonzalez, J.P., Leroy, E., Cambillau, C., Romette, J.L., Canard, B., 2008. The VIZIER project: preparedness against pathogenic RNA viruses. Antiviral Res. 78, 37–46]. This review highlights some of the major features of alphaviruses that have been investigated during recent years. After describing their classification, epidemiology and evolutionary history, and the expanding geographic distribution of Chikungunya virus, we review progress made under the VIZIER programme in understanding the structure and function of alphavirus replicative enzymes, and the development of new disease control strategies.

Relevance:

20.00%

Publisher:

Abstract:

The assembly of HIV is relatively poorly investigated when compared with the process of virus entry. Yet a detailed understanding of the mechanism of assembly is fundamental to our knowledge of the complete life cycle of this virus and also has the potential to inform the development of new antiviral strategies. The repeated multiple interaction of the basic structural unit, Gag, might at first appear to be little more than concentration-dependent self-assembly, but the precise mechanisms emerging for HIV are far from simple. Gag interacts not only with itself but also with host cell lipids and proteins in an ordered and stepwise manner. It binds both the genomic RNA and the virus envelope protein and must do this at an appropriate time and place within the infected cell. The assembled virus particle must successfully release from the cell surface and, whilst being robust enough for transmission between hosts, must nonetheless be primed for rapid disassembly when infection occurs. Our current understanding of these processes and the domains of Gag involved at each stage is the subject of this review. Copyright (C) 2004 John Wiley & Sons, Ltd.