918 results for Amazon Floodplain
Abstract:
The control of fishing mortality via fishing effort remains fundamental to most fisheries management strategies even at the local community or co-management level. Decisions to support such strategies require knowledge of the underlying response of the catch to changes in effort. Even under adaptive management strategies, imprecise knowledge of the response is likely to help accelerate the adaptive learning process. Data and institutional capacity requirements to employ multi-species biomass dynamics and age-structured models invariably render their use impractical, particularly in less developed regions of the world. Surplus production models fitted to catch and effort data aggregated across all species offer viable alternatives. The current paper seeks models of this type that best describe the multi-species catch–effort responses in floodplain-rivers, lakes and reservoirs, and reef-based fisheries based upon among-fishery comparisons, building on earlier work. Three alternative surplus production models were fitted to estimates of catch per unit area (CPUA) and fisher density for 258 fisheries in Africa, Asia and South America. In all cases examined, the best or equal best fitting model was the Fox type, explaining up to 90% of the variation in CPUA. For lake and reservoir fisheries in Africa and Asia, the Schaefer and an asymptotic model fitted equally well. The Fox model estimates of fisher density (fishers km−2) at maximum yield (iMY) for floodplain-rivers, African lakes and reservoirs, and reef-based fisheries are 13.7 (95% CI [11.8, 16.4]), 27.8 (95% CI [17.5, 66.7]) and 643 (95% CI [459, 1075]), respectively, and compare well with earlier estimates. Corresponding estimates of maximum yield are also given. The significantly higher value of iMY for reef-based fisheries compared to estimates for rivers and lakes reflects the use of a different measure of fisher density based upon human population size estimates. The models predict that maximum yield is achieved at a higher fishing intensity in Asian lakes compared to those in Africa. This may reflect the common practice in Asia of stocking lakes to augment natural recruitment. Because of the equilibrium assumptions underlying the models, all the estimates of maximum yield and corresponding levels of effort should be treated with caution.
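As a hedged illustration of the Fox-type fit described above (not the authors' code, and with purely hypothetical data), the sketch below fits CPUA = a·exp(−b·i) to catch-per-unit-area and fisher-density observations; under this model the fisher density at maximum yield is i_MY = 1/b and the maximum yield per unit area is a/(b·e).

# Minimal sketch (not the authors' code): fitting a Fox-type surplus
# production model to hypothetical CPUA vs. fisher-density data.
import numpy as np
from scipy.optimize import curve_fit

def fox_cpua(i, a, b):
    """Fox model: CPUA declines log-linearly with effort i (fishers km^-2)."""
    return a * np.exp(-b * i)

# Hypothetical observations (fisher density, catch per unit area).
i_obs = np.array([2.0, 5.0, 10.0, 15.0, 25.0, 40.0])
cpua_obs = np.array([0.95, 0.80, 0.55, 0.40, 0.22, 0.10])

(a_hat, b_hat), _ = curve_fit(fox_cpua, i_obs, cpua_obs, p0=(1.0, 0.05))

i_my = 1.0 / b_hat                      # effort at maximum yield
y_max = a_hat / (b_hat * np.e)          # maximum yield per unit area
print(f"i_MY = {i_my:.1f} fishers km^-2, maximum yield = {y_max:.2f}")

Confidence intervals such as those quoted in the abstract would come from the fitted parameter covariance or a bootstrap, which this sketch omits.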
Abstract:
Two-dimensional flood inundation modelling is a widely used tool to aid flood risk management. In urban areas, the model spatial resolution required to represent flows through a typical street network often results in an impractical computational cost at the city scale. This paper presents the calibration and evaluation of a recently developed formulation of the LISFLOOD-FP model, which is more computationally efficient at these resolutions. Aerial photography was available for model evaluation on three days between 24 and 31 July. The new formulation was benchmarked against the original version of the model at 20 and 40 m resolutions, demonstrating equally accurate simulation given the evaluation data, but with computation times 67 times faster. The July event was then simulated at the 2 m resolution of the available airborne LiDAR DEM. This resulted in more accurate simulation of the floodplain drying dynamics compared with the coarse-resolution models, although maximum inundation levels were simulated equally well at all resolutions tested.
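Flood-extent benchmarking of this kind is often summarised with a binary pattern-match score between simulated and observed wet areas; the following sketch is a generic illustration of such a measure (an assumption about the style of evaluation, not the paper's actual metric or data).

# Illustrative sketch (assumed, not from the paper): a binary pattern-match
# score often used to compare simulated and observed flood extents.
import numpy as np

def fit_score(sim_wet, obs_wet):
    """F = wet-in-both / wet-in-either, on boolean inundation rasters."""
    sim_wet = np.asarray(sim_wet, dtype=bool)
    obs_wet = np.asarray(obs_wet, dtype=bool)
    both = np.logical_and(sim_wet, obs_wet).sum()
    either = np.logical_or(sim_wet, obs_wet).sum()
    return both / either if either else np.nan

# Hypothetical model outputs resampled to a common observed grid:
# f_coarse = fit_score(sim20_wet, obs_wet)
# f_fine = fit_score(sim02_wet, obs_wet)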
Abstract:
The Seille Valley in eastern France was home to one of Europe’s largest Iron Age salt industries. Sedimentology, palynology and geochronology have been integrated within ongoing archaeological investigations to reconstruct the Holocene palaeoenvironmental history of the Seille Valley and to elucidate the human–environment relationship of salt production. A sedimentary model of the valley has been constructed from a borehole survey of the floodplain and pollen analyses have been undertaken to reconstruct the vegetation history. Alluvial records have been successfully dated using optically stimulated luminescence and radiocarbon techniques, thereby providing a robust chronological framework. The results have provided an insight into the development of favourable conditions for salt production and there is evidence in the sedimentary record to suggest that salt production may have taken place during the mid-to-late Bronze Age. The latter has yet to be identified in the archaeological record and targeted excavation is therefore underway to test this finding. The development of the Iron Age industry had a major impact on the hydrological regime of the valley and its sedimentological history, with evidence for accelerated alluviation arising from floodplain erosion at salt production sites and modification of the local fluvial regime due to briquetage accumulation on the floodplain. This research provides an important insight into the environmental implications of early industrial activities, in addition to advancing knowledge about the Holocene palaeoenvironmental and social history of this previously poorly studied region of France.
Abstract:
Flood extents caused by fluvial floods in urban and rural areas may be predicted by hydraulic models. Assimilation may be used to correct the model state and improve the estimates of the model parameters or external forcing. One common observation assimilated is the water level at various points along the modelled reach. Distributed water levels may be estimated indirectly along the flood extents in Synthetic Aperture Radar (SAR) images by intersecting the extents with the floodplain topography. It is necessary to select a subset of levels for assimilation because adjacent levels along the flood extent will be strongly correlated. A method for selecting such a subset automatically and in near real-time is described, which would allow the SAR water levels to be used in a forecasting model. The method first selects candidate waterline points in flooded rural areas having low slope. The waterline levels and positions are corrected for the effects of double reflections between the water surface and emergent vegetation at the flood edge. Waterline points are also selected in flooded urban areas away from radar shadow and layover caused by buildings, with levels similar to those in adjacent rural areas. The resulting points are thinned to reduce spatial autocorrelation using a top-down clustering approach. The method was developed using a TerraSAR-X image from a particular case study involving urban and rural flooding. The waterline points extracted proved to be spatially uncorrelated, with levels reasonably similar to those determined manually from aerial photographs, and in good agreement with those of nearby gauges.
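The thinning step can be pictured as a top-down spatial subdivision that keeps one representative level per cluster; the sketch below is a minimal illustration of that idea with assumed coordinates in metres and an assumed decorrelation length, and is not the published algorithm.

# Minimal sketch of the thinning idea (assumptions: 2-D point coordinates in
# metres, one representative level kept per cluster; not the paper's code).
import numpy as np

def thin_points(xy, levels, max_extent=250.0):
    """Top-down split of candidate waterline points until each cluster spans
    less than max_extent metres; return one median-level point per cluster."""
    xy = np.asarray(xy, float)
    levels = np.asarray(levels, float)
    keep = []

    def split(idx):
        pts = xy[idx]
        spans = pts.max(axis=0) - pts.min(axis=0)
        if spans.max() <= max_extent or len(idx) == 1:
            keep.append(idx[np.argsort(levels[idx])[len(idx) // 2]])
            return
        axis = int(np.argmax(spans))          # split along the longest axis
        cut = np.median(pts[:, axis])
        left = idx[pts[:, axis] <= cut]
        right = idx[pts[:, axis] > cut]
        if len(left) == 0 or len(right) == 0: # degenerate split, stop here
            keep.append(idx[np.argsort(levels[idx])[len(idx) // 2]])
            return
        split(left)
        split(right)

    split(np.arange(len(xy)))
    return np.array(keep)

In practice the maximum cluster extent would be chosen from the observed spatial autocorrelation of the candidate levels.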
Abstract:
The discourse surrounding the virtual has moved away from the utopian thinking accompanying the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has more recently been replaced with a more complex model in which the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or the semi-private space (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organize the chaos of the material world along (post-)Fordist rationality.

Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but a lot of these were reluctant to let go of the fantasy of digital freedom. Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc.

About the works and artists:

Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com

John Russell, a founding member of the 1990s art group Bank, is an artist, curator and writer who explores in his work the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org

The work of Bristol-based artist Jem Noble opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art, around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com

Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com

Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk

FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Chaney and Hernly developed together a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops that are found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk

Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists and run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies, but gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk
Abstract:
A favoured method of assimilating information from state-of-the-art climate models into integrated assessment models of climate impacts is to use the transient climate response (TCR) of the climate models as an input, sometimes accompanied by a pattern matching approach to provide spatial information. More recent approaches to the problem use TCR with another independent piece of climate model output: the land-sea surface warming ratio (φ). In this paper we show why the use of φ in addition to TCR has such utility. Multiple linear regressions of surface temperature change onto TCR and φ in 22 climate models from the CMIP3 multi-model database show that the inclusion of φ explains a much greater fraction of the inter-model variance than using TCR alone. The improvement is particularly pronounced in North America and Eurasia in the boreal summer season, and in the Amazon all year round. The use of φ as the second metric is beneficial for three reasons: firstly it is uncorrelated with TCR in state-of-the-art climate models and can therefore be considered as an independent metric; secondly, because of its projected time-invariance, the magnitude of φ is better constrained than TCR in the immediate future; thirdly, the use of two variables is much simpler than approaches such as pattern scaling from climate models. Finally we show how using the latest estimates of φ from climate models with a mean value of 1.6—as opposed to previously reported values of 1.4—can significantly increase the mean time-integrated discounted damage projections in a state-of-the-art integrated assessment model by about 15 %. When compared to damages calculated without the inclusion of the land-sea warming ratio, this figure rises to 65 %, equivalent to almost 200 trillion dollars over 200 years.
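The regression step can be illustrated with a small sketch: for a given region, warming in each model is regressed onto that model's TCR and φ. The data below are synthetic stand-ins, not CMIP3 output.

# Illustrative sketch (hypothetical data): regressing regional warming from
# an ensemble of climate models onto TCR and the land-sea warming ratio phi.
import numpy as np

n_models = 22
rng = np.random.default_rng(0)
tcr = rng.uniform(1.2, 2.6, n_models)        # K, hypothetical TCR values
phi = rng.uniform(1.3, 1.9, n_models)        # hypothetical land-sea ratios
dT_region = 0.5 + 1.1 * tcr + 0.8 * phi + rng.normal(0, 0.1, n_models)

X = np.column_stack([np.ones(n_models), tcr, phi])
beta, res, *_ = np.linalg.lstsq(X, dT_region, rcond=None)
r2 = 1 - res[0] / ((dT_region - dT_region.mean()) ** 2).sum()
print(f"intercept={beta[0]:.2f}, b_TCR={beta[1]:.2f}, b_phi={beta[2]:.2f}, R2={r2:.2f}")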
Abstract:
Satellite-based Synthetic Aperture Radar (SAR) has proved useful for obtaining information on flood extent, which, when intersected with a Digital Elevation Model (DEM) of the floodplain, provides water level observations that can be assimilated into a hydrodynamic model to decrease forecast uncertainty. With an increasing number of operational satellites with SAR capability, information on the relationship between satellite first visit and revisit times and forecast performance is required to optimise the operational scheduling of satellite imagery. By using an Ensemble Transform Kalman Filter (ETKF) and a synthetic analysis with the 2D hydrodynamic model LISFLOOD-FP based on a real flooding case affecting an urban area (summer 2007, Tewkesbury, Southwest UK), we evaluate the sensitivity of the forecast performance to visit parameters. We emulate a generic hydrologic-hydrodynamic modelling cascade by imposing a bias and spatiotemporal correlations on the inflow error ensemble entering the hydrodynamic domain. First, in agreement with previous research, estimating and correcting for this bias leads to a clear improvement in keeping the forecast on track. Second, imagery obtained early in the flood is shown to have a large influence on forecast statistics. Revisit interval is most influential for early observations. The results are promising for the future of remote sensing-based water level observations for real-time flood forecasting in complex scenarios.
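The error-emulation step can be sketched as follows, assuming a multiplicative bias plus first-order autoregressive noise on the inflow hydrograph; the parameter values and function names are illustrative, not those used in the study.

# Minimal sketch (assumed setup, not the paper's code): perturbing an inflow
# hydrograph with a multiplicative bias plus temporally correlated (AR(1))
# noise to emulate errors from an upstream hydrologic model.
import numpy as np

def inflow_ensemble(q_true, n_members=30, bias=0.15, sigma=0.10, rho=0.9, seed=1):
    """Return an (n_members, n_times) array of perturbed inflows."""
    rng = np.random.default_rng(seed)
    n_t = len(q_true)
    members = np.empty((n_members, n_t))
    for m in range(n_members):
        eps = np.empty(n_t)
        eps[0] = rng.normal(0.0, sigma)
        for t in range(1, n_t):                      # AR(1) in time
            eps[t] = rho * eps[t - 1] + rng.normal(0.0, sigma * np.sqrt(1 - rho**2))
        members[m] = q_true * (1.0 + bias + eps)     # biased, correlated errors
    return members

# q_obs = np.loadtxt("inflow_m3s.txt")   # hypothetical boundary hydrograph
# ens = inflow_ensemble(q_obs)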
Abstract:
Middle Pleistocene deposits at Hackney, north London comprise a thick unit of organic sands and silts occupying a channel near the confluence of the River Thames in south-eastern England and its left-bank tributary the River Lea. They represent a short time interval, perhaps no more than a few years, within a late Middle Pleistocene interglacial. The organic sediments are overlain by unfossiliferous sands and gravels indicating deposition on the floodplain of a braided river under cool or cold climatic conditions. The fossil plant, insect, mollusc and vertebrate remains from the interglacial deposits all indicate climatic conditions with summers warmer than the present in SE England, and winters with a similar thermal climate. The biostratigraphic evidence suggests that the time period represented by the organic unit is part of MIS 9, although the geochronological evidence for such an age is inconclusive. The palaeontological evidence strongly suggests that this temperate stage was warmer than the succeeding temperate stage MIS 7 or the Holocene, and approaching the Ipswichian (MIS 5e) in its warmth. The multidisciplinary description of the Hackney deposits is one of the first to reconstruct terrestrial conditions in Marine Isotope Stage 9 in Western Europe.
Abstract:
The interpretation of Neotropical fossil phytolith assemblages for palaeoenvironmental and archaeological reconstructions relies on the development of appropriate modern analogues. We analyzed modern phytolith assemblages from the soils of ten distinctive tropical vegetation communities in eastern lowland Bolivia, ranging from terra firme humid evergreen forest to seasonally-inundated savannah. Results show that broad ecosystems – evergreen tropical forest, semi-deciduous dry tropical forest, and savannah – can be clearly differentiated by examination of their phytolith spectra and the application of Principal Component Analysis (PCA). Differences in phytolith assemblages between particular vegetation communities within each of these ecosystems are more subtle, but can still be identified. Comparison of phytolith assemblages with pollen rain data and stable carbon isotope analyses from the same vegetation plots show that these proxies are not only complementary, but significantly improve taxonomic and ecosystem resolution, and therefore our ability to interpret palaeoenvironmental and archaeological records. Our data underline the utility of phytolith analyses for reconstructing Amazon Holocene vegetation histories and pre-Columbian land use, particularly the high spatial resolution possible with terrestrial soil-based phytolith studies.
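As a minimal illustration of the ordination step (with invented morphotype percentages rather than the Bolivian data), phytolith assemblages can be centred and decomposed so that the leading components separate forest from savannah samples:

# Illustrative sketch (hypothetical counts): ordinating soil phytolith
# assemblages with PCA to separate forest and savannah samples.
import numpy as np

# rows = soil samples, columns = phytolith morphotype percentages (hypothetical)
assemblages = np.array([
    [40.0, 35.0, 10.0, 15.0],   # humid evergreen forest
    [38.0, 30.0, 12.0, 20.0],   # evergreen forest
    [20.0, 25.0, 30.0, 25.0],   # dry forest
    [5.0, 10.0, 45.0, 40.0],    # savannah
    [8.0, 12.0, 40.0, 40.0],    # seasonally inundated savannah
])

centred = assemblages - assemblages.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
scores = u * s                                  # sample scores on the PCs
explained = s**2 / (s**2).sum()
print("variance explained by PC1, PC2:", explained[:2].round(2))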
Abstract:
This paper uses a palaeoecological approach to examine the impact of drier climatic conditions of the Early-Mid-Holocene (ca 8000-4000 years ago) upon Amazonia's forests and their fire regimes. Palaeovegetation (pollen data) and palaeofire (charcoal) records are synthesized from 20 sites within the present tropical forest biome, and the underlying causes of any emergent patterns or changes are explored by reference to independent palaeoclimate data and present-day patterns of precipitation, forest cover and fire activity across Amazonia. During the Early-Mid-Holocene, Andean cloud forest taxa were replaced by lowland tree taxa as the cloud base rose while lowland ecotonal areas, which are presently covered by evergreen rainforest, were instead dominated by savannahs and/or semi-deciduous dry forests. Elsewhere in the Amazon Basin there is considerable spatial and temporal variation in patterns of vegetation disturbance and fire, which probably reflects the complex heterogeneous patterns in precipitation and seasonality across the basin, and the interactions between climate change, drought- and fire susceptibility of the forests, and Palaeo-Indian land use. Our analysis shows that the forest biome in most parts of Amazonia appears to have been remarkably resilient to climatic conditions significantly drier than those of today, despite widespread evidence of forest burning. Only in ecotonal areas is there evidence of biome replacement in the Holocene. From this palaeoecological perspective, we argue against the Amazon forest 'dieback' scenario simulated for the future.
Abstract:
We present a multiproxy study of land use by a pre-Columbian earth mounds culture in the Bolivian Amazon. The Monumental Mounds Region (MMR) is an archaeological sub-region characterized by hundreds of pre-Columbian habitation mounds associated with a complex network of canals and causeways, and situated in the forest–savanna mosaic of the Llanos de Moxos. Pollen, phytolith, and charcoal analyses were performed on a sediment core from a large lake (14 km2), Laguna San José (14°56.97′S, 64°29.70′W). We found evidence of high levels of anthropogenic burning from AD 400 to AD 1280, corroborating dated occupation layers in two nearby excavated habitation mounds. The charcoal decline pre-dates the arrival of Europeans by at least 100 yr, and challenges the notion that the mounds culture declined because of European colonization. We show that the surrounding savanna soils were sufficiently fertile to support crops, and the presence of maize throughout the record shows that the area was continuously cultivated despite land-use change at the end of the earth mounds culture. We suggest that burning was largely confined to the savannas, rather than forests, and that pre-Columbian deforestation was localized to the vicinity of individual habitation mounds, whereas the inter-mound areas remained largely forested.
Abstract:
Structured abstract: Purpose: LibraryThing is a Web 2.0 tool allowing users to catalogue books using data drawn from sources such as Amazon and the Library of Congress, and has facilities such as tagging and interest groups. This study evaluates whether LibraryThing is a valuable tool for libraries to use for promotional and user engagement purposes. Methodology: This study used a sequential mixed-methods, three-phase design: (1) identification of LibraryThing features for user engagement or promotional purposes; (2) exploratory semi-structured interviews; and (3) a questionnaire. Findings: Several uses of LibraryThing for promotional and user engagement purposes were identified. The most popular reason libraries used LibraryThing was to promote the library or library stock, with most respondents using it specifically to highlight collections of books. Monitoring of patron usage was low and many respondents had not received any feedback. LibraryThing was commonly reported as being easy to use, remotely accessible, and having low cost, whilst its main drawbacks were the 200-book limit for free accounts and it being a third-party site. The majority of respondents felt LibraryThing was a useful tool for libraries. Practical implications: LibraryThing has most value as a promotional tool for libraries. Libraries should actively monitor patron usage of their LibraryThing account or request user feedback to ensure that LibraryThing provides a truly valuable service for their library. Originality: There is little research on the value of LibraryThing for libraries, or on librarians' perceptions of LibraryThing as a Web 2.0 tool.
Abstract:
Flood simulation models and hazard maps are only as good as the underlying data against which they are calibrated and tested. However, extreme flood events are by definition rare, so the observational data of flood inundation extent are limited in both quality and quantity. The relative importance of these observational uncertainties has increased now that computing power and accurate lidar scans make it possible to run high-resolution 2D models to simulate floods in urban areas. However, the value of these simulations is limited by the uncertainty in the true extent of the flood. This paper addresses that challenge by analyzing a point dataset of maximum water extent from a flood event on the River Eden at Carlisle, United Kingdom, in January 2005. The observation dataset is based on a collection of wrack and water marks from two postevent surveys. A smoothing algorithm for identifying, quantifying, and reducing localized inconsistencies in the dataset is proposed and evaluated showing positive results. The proposed smoothing algorithm can be applied in order to improve flood inundation modeling assessment and the determination of risk zones on the floodplain.
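The published smoothing algorithm is not reproduced here, but its purpose can be illustrated with a simple neighbourhood-consistency check: a mark whose level departs strongly from the median of nearby marks is treated as a localized inconsistency. The radius, tolerance and replacement rule below are assumptions for illustration only.

# Minimal sketch of a neighbourhood-consistency check (an assumption, not the
# published algorithm): flag and smooth wrack/water-mark levels that deviate
# strongly from the median of nearby marks.
import numpy as np

def smooth_marks(xy, z, radius=200.0, tol=0.5):
    """Replace levels deviating more than tol (m) from the median level of
    marks within radius (m) by that local median; return adjusted levels."""
    xy = np.asarray(xy, float)
    z = np.asarray(z, float)
    z_out = z.copy()
    for i in range(len(z)):
        d = np.hypot(xy[:, 0] - xy[i, 0], xy[:, 1] - xy[i, 1])
        nbrs = (d <= radius) & (d > 0)
        if nbrs.sum() < 3:
            continue                       # too few neighbours to judge
        local_med = np.median(z[nbrs])
        if abs(z[i] - local_med) > tol:
            z_out[i] = local_med           # treat as a localized inconsistency
    return z_out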
Abstract:
Deposit modelling based on archived borehole logs supplemented by a small number of dedicated boreholes is used to reconstruct the main boundary surfaces and the thickness of the main sediment units within the succession of Holocene alluvial deposits underlying the floodplain in the Barking Reach of the Lower Thames Valley. The basis of the modelling exercise is discussed and the models are used to assess the significance of floodplain relief in determining patterns of sedimentation. This evidence is combined with the results of biostratigraphical and geochronological investigations to reconstruct the environmental conditions associated with each successive stage of floodplain aggradation. The two main factors affecting the history and spatial pattern of Holocene sedimentation are shown to be the regional behaviour of relative sea level and the pattern of relief on the surface of the sub-alluvial, Late Devensian Shepperton Gravel. As is generally the case in the Lower Thames Valley, three main stratigraphic units are recognised, the Lower Alluvium, a peat bed broadly equivalent to the Tilbury III peat of Devoy (1979) and an Upper Alluvium. There is no evidence to suggest that the floodplain was substantially re-shaped by erosion during the Holocene. Instead, the relief inherited from the Shepperton Gravel surface was gradually buried either by the accumulation of peat or by deposition of fine-grained sediment from suspension in standing or slow-moving water. The palaeoenvironmental record from Barking confirms important details of the Holocene record observed elsewhere in the Lower Thames Valley, including the presence of Taxus in the valley-floor fen carr woodland between about 5000 and 4000 cal BP, and the subsequent growth of Ulmus on the peat surface.
Abstract:
Recent research into flood modelling has primarily concentrated on the simulation of inundation flow without considering the influence of channel morphology. River channels are often represented by a simplified geometry that is implicitly assumed to remain unchanged during flood simulations. However, field evidence demonstrates that significant morphological changes can occur during floods that mobilise the boundary sediments. Despite this, the effect of channel morphology on model results has been largely unexplored. To address this issue, the impact of channel cross-section geometry and channel long-profile variability on flood dynamics is examined using an ensemble of 1D-2D hydraulic model (LISFLOOD-FP) simulations of the 1:2102 year recurrence interval floods in Cockermouth, UK, within an uncertainty framework. A series of hypothetical scenarios of channel morphology was constructed based on a simple velocity-based model of critical entrainment. A Monte Carlo simulation framework was used to quantify the effects of channel morphology together with variations in the channel and floodplain roughness coefficients, grain size characteristics, and critical shear stress on measures of flood inundation. The results showed that the bed elevation modifications generated by the simplistic equations provided a good approximation of the observed patterns of spatial erosion, despite overestimating erosion depths. Uncertainty in channel long-profile variability affected only the local flood dynamics and did not significantly affect the friction sensitivity or the flood inundation mapping. The results imply that hydraulic models generally do not need to account for within-event morphodynamic changes of the type and magnitude modelled, as these have a negligible impact that is smaller than other uncertainties, e.g. boundary conditions. Instead, morphodynamic change needs to accumulate over a series of events to become large enough to alter the hydrodynamics of floods in supply-limited gravel-bed rivers like the one used in this research.
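One way to picture a velocity-based critical entrainment rule of the kind mentioned above (the specific relations and parameter values here are assumptions, not those of the study) is to lower bed elevations wherever the simulated depth-averaged velocity exceeds a grain-size-dependent threshold:

# Illustrative sketch (assumed relations, not the study's code): lowering bed
# elevations where a depth-averaged velocity exceeds a critical entrainment
# velocity estimated from grain size via a Shields-type threshold.
import numpy as np

RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81

def critical_velocity(depth, d50, theta_c=0.047, manning_n=0.035):
    """Critical depth-averaged velocity for entrainment of grains of size d50 (m),
    from a Shields criterion converted through Manning's law."""
    tau_c = theta_c * (RHO_S - RHO_W) * G * d50          # critical shear stress (Pa)
    return np.sqrt(tau_c / RHO_W) * depth**(1 / 6) / (manning_n * np.sqrt(G))

def erode_bed(bed, depth, velocity, d50, scour=0.1):
    """Return a scenario bed lowered by 'scour' metres wherever the velocity
    exceeds the critical entrainment velocity."""
    v_c = critical_velocity(depth, d50)
    return np.where(velocity > v_c, bed - scour, bed)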