887 results for flood extent mapping
Abstract:
An ability to quantify the reliability of probabilistic flood inundation predictions is a requirement not only for guiding model development but also for their successful application. Probabilistic flood inundation predictions are usually produced by choosing a method of weighting the model parameter space, but previous studies suggest that this choice leads to clear differences in inundation probabilities. This study aims to address the evaluation of the reliability of these probabilistic predictions. However, the lack of an adequate number of observations of flood inundation for a catchment limits the application of conventional methods of evaluating predictive reliability. Consequently, attempts have been made to assess the reliability of probabilistic predictions using multiple observations from a single flood event. Here, a LISFLOOD-FP hydraulic model of an extreme (>1 in 1000 years) flood event in Cockermouth, UK, is constructed and calibrated using multiple performance measures from both peak flood wrack mark data and aerial photography captured post-peak. These measures are used in weighting the parameter space to produce multiple probabilistic predictions for the event. Two methods of assessing the reliability of these probabilistic predictions using limited observations are utilized: an existing method assessing the binary pattern of flooding, and a method developed in this paper to assess predictions of water surface elevation. This study finds that the water surface elevation method has both a better diagnostic and discriminatory ability, but this result is likely to be sensitive to the unknown uncertainties in the upstream boundary condition.
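As a hedged illustration of how weighting the parameter space can yield probabilistic inundation maps, the sketch below shows a minimal GLUE-style weighted average of binary wet/dry maps; the array shapes, weights and function name are illustrative assumptions, not the LISFLOOD-FP workflow described above.

```python
import numpy as np

# Minimal sketch: each behavioural parameter set i carries a likelihood
# weight derived from a performance measure and produces a binary
# inundation map sim_maps[i] (1 = wet, 0 = dry).
def inundation_probability(sim_maps, weights):
    """Per-cell inundation probability as the likelihood-weighted fraction
    of simulations predicting the cell wet."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                           # normalise weights to sum to 1
    sims = np.asarray(sim_maps, dtype=float)  # shape (n_sims, ny, nx)
    return np.tensordot(w, sims, axes=1)      # shape (ny, nx)

# Toy example: three parameter sets on a 2x2 domain
sims = [[[1, 0], [1, 1]],
        [[1, 0], [0, 1]],
        [[0, 0], [1, 1]]]
print(inundation_probability(sims, weights=[0.5, 0.3, 0.2]))
```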
Abstract:
We have performed systematic Monte Carlo studies of the influence of shifting the walls in slit-like systems constructed from folded graphene sheets on their adsorption properties. Specifically, we have analysed the effect on the mechanism of argon adsorption (T = 87 K) and on the adsorption and separation of three binary gas mixtures: CO2/N2, CO2/CH4 and CH4/N2 (T = 298 K). The effects of changes in the interlayer distance were also determined. We show that folding of the walls significantly improves the adsorption and separation properties in comparison to ideal slit-like systems. Moreover, we demonstrate that a mutual shift of the sheets (for small interlayer distances) causes the appearance of small pores between opposite bulges. This causes an increase in vapour adsorption at low pressures. Due to the overlap of interactions with the opposite walls, which increases the adsorption energy, the mutual shift of the sheets is also connected with a rise in the efficiency of mixture separation. The effects connected with sheet orientation vanish as the interlayer distance increases.
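For context, the efficiency of mixture separation in adsorption simulations of this kind is commonly quantified by the equilibrium selectivity; the standard definition below is given only as background and is not necessarily the exact metric used by the authors.

```latex
S_{1/2} = \frac{x_1 / x_2}{y_1 / y_2}
```

where x_i are the mole fractions of the two components in the adsorbed phase and y_i their mole fractions in the bulk gas; S_{1/2} > 1 indicates preferential adsorption of component 1.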
Abstract:
The emergence and development of digital imaging technologies and their impact on mainstream filmmaking is perhaps the most familiar special effects narrative associated with the years 1981-1999. This is in part because some of the questions raised by the rise of the digital still concern us now, but also because key milestone films showcasing advancements in digital imaging technologies appear in this period, including Tron (1982) and its computer-generated image elements, the digital morphing in The Abyss (1989) and Terminator 2: Judgment Day (1991), computer animation in Jurassic Park (1993) and Toy Story (1995), digital extras in Titanic (1997), and ‘bullet time’ in The Matrix (1999). As a result, it is tempting to characterize 1981-1999 as a ‘transitional period’ in which digital imaging processes grow in prominence and technical sophistication, and what we might call ‘analogue’ special effects processes correspondingly become less common. But such a narrative risks eliding the other practices that also shape effects sequences in this period. Indeed, the 1980s and 1990s are striking for the diverse range of effects practices in evidence in both big-budget films and lower-budget productions, and for the extent to which analogue practices persist independently of or alongside digital effects work in a range of production and genre contexts. The chapter seeks to document and celebrate this diversity and plurality, this sustaining of earlier traditions of effects practice alongside newer processes, this experimentation with materials and technologies old and new in the service of aesthetic aspirations alongside budgetary and technical constraints. The common characterization of the period as a series of rapid transformations in production workflows, practices and technologies will be interrogated in relation to the persistence of certain key figures such as Douglas Trumbull, John Dykstra, and James Cameron, but also through a consideration of the contexts for and influences on creative decision-making. Comparative analyses of the processes used to articulate bodies, space and scale in effects sequences drawn from different generic sites of special effects work, including science fiction, fantasy, and horror, will provide a further frame for the chapter’s mapping of the commonalities and specificities, continuities and variations in effects practices across the period. In the process, the chapter seeks to reclaim analogue processes’ contribution both to moments of explicit spectacle, and to diegetic verisimilitude, in the decades most often associated with the digital’s ‘arrival’.
Abstract:
Heterosis refers to the phenomenon in which an F1 hybrid exhibits enhanced growth or agronomic performance. However, previous theoretical studies on heterosis have been based on bi-parental segregating populations instead of F1 hybrids. To understand the genetic basis of heterosis, here we used a subset of F1 hybrids, named a partial North Carolina II design, to perform association mapping for four dependent variables: the original trait value, general combining ability (GCA), specific combining ability (SCA) and mid-parental heterosis (MPH). Our models jointly fitted all the additive, dominance and epistatic effects. The analyses resulted in several important findings: 1) the main components are additive and additive-by-additive effects for GCA, and dominance-related effects for SCA and MPH, while the additive-by-dominance effect for MPH was partly identified as an additive effect; 2) the ranking of factors affecting heterosis was dominance > dominance-by-dominance > over-dominance > complete dominance; and 3) increasing the proportion of F1 hybrids in the population could significantly increase the power to detect dominance-related effects, and slightly reduce the power to detect additive and additive-by-additive effects. Analyses of cotton and rapeseed datasets showed that more additive-by-additive QTL were detected from GCA than from the trait phenotype, and fewer QTL were detected from MPH than from the other dependent variables.
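For readers unfamiliar with these dependent variables, the standard quantitative-genetics definitions are sketched below (the usual combining-ability decomposition; details may differ from the paper's exact formulation).

```latex
\mathrm{MPH}_{ij} = F_{1,ij} - \tfrac{1}{2}\left(P_i + P_j\right), \qquad
y_{ij} = \mu + \mathrm{GCA}_i + \mathrm{GCA}_j + \mathrm{SCA}_{ij} + \varepsilon_{ij}
```

where F_{1,ij} is the hybrid phenotype, P_i and P_j are the parental phenotypes, and the cross value decomposes into the two parents' general combining abilities plus a cross-specific SCA term.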
Abstract:
This study investigates flash flood forecast and warning communication, interpretation, and decision making, using data from a survey of 418 members of the public in Boulder, Colorado, USA. Respondents to the public survey varied in their perceptions and understandings of flash flood risks in Boulder, and some had misconceptions about flash flood risks, such as the safety of crossing fast-flowing water. About 6% of respondents indicated consistent reversals of US watch-warning alert terminology. However, more in-depth analysis illustrates the multi-dimensional, situationally dependent meanings of flash flood alerts, as well as the importance of evaluating interpretation and use of warning information along with alert terminology. Some public respondents estimated low likelihoods of flash flooding given a flash flood warning; these were associated with lower anticipated likelihood of taking protective action given a warning. Protective action intentions were also lower among respondents who had less trust in flash flood warnings, those who had not made prior preparations for flash flooding, and those who believed themselves to be safer from flash flooding. Additional analysis, using open-ended survey questions about responses to warnings, elucidates the complex, contextual nature of protective decision making during flash flood threats. These findings suggest that warnings can play an important role not only by notifying people that there is a threat and helping motivate people to take protective action, but also by helping people evaluate what actions to take given their situation.
Abstract:
Floods are the most frequent of natural disasters, affecting millions of people across the globe every year. The anticipation and forecasting of floods at the global scale is crucial for preparing for severe events and providing early awareness where local flood models and warning services may not exist. As numerical weather prediction models continue to improve, operational centres are increasingly using their meteorological output to drive hydrological models, creating hydrometeorological systems capable of forecasting river flow and flood events at much longer lead times than has previously been possible. Furthermore, recent developments in modelling capabilities, data and resources have made it possible to produce global-scale flood forecasting systems. In this paper, the current state of operational large-scale flood forecasting is discussed, including probabilistic forecasting of floods using ensemble prediction systems. Six state-of-the-art operational large-scale flood forecasting systems are reviewed, describing similarities and differences in their approaches to forecasting floods at the global and continental scale. Currently, operational systems have the capability to produce coarse-scale discharge forecasts in the medium range and to disseminate forecasts and, in some cases, early warning products in real time across the globe, in support of national forecasting capabilities. With improvements in seasonal weather forecasting, future advances may include more seamless hydrological forecasting at the global scale, alongside a move towards multi-model forecasts and grand ensemble techniques, responding to the requirement of developing multi-hazard early warning systems for disaster risk reduction.
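As a hedged sketch of the probabilistic element mentioned above, an ensemble flood forecast is often summarised as the fraction of members exceeding a flood threshold; the shapes, threshold and values below are illustrative assumptions rather than any specific operational system's method.

```python
import numpy as np

# Sketch: probability of a flood event from an ensemble forecast, taken as
# the fraction of members whose forecast discharge exceeds a threshold
# (e.g. a return-period discharge), per lead time.
def exceedance_probability(ensemble_discharge, threshold):
    """ensemble_discharge: array (n_members, n_leadtimes) of forecast flow.
    Returns the per-lead-time probability of exceeding `threshold`."""
    q = np.asarray(ensemble_discharge, dtype=float)
    return (q > threshold).mean(axis=0)

# Example: 5 ensemble members, 3 lead times, threshold of 100 m3/s
q = [[80, 120, 150],
     [90, 95, 160],
     [70, 130, 140],
     [85, 110, 90],
     [60, 105, 155]]
print(exceedance_probability(q, 100.0))   # -> [0.  0.8 0.8]
```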
Abstract:
Heavy precipitation affected Central Europe in May/June 2013, triggering damaging floods both on the Danube and the Elbe rivers. Based on a modelling approach with COSMO-CLM, moisture fluxes, backward trajectories, cyclone tracks and precipitation fields are evaluated for the relevant time period 30 May–2 June 2013. We identify potential moisture sources and quantify their contribution to the flood event focusing on the Danube basin through sensitivity experiments: Control simulations are performed with undisturbed ERA-Interim boundary conditions, while multiple sensitivity experiments are driven with modified evaporation characteristics over selected marine and land areas. Two relevant cyclones are identified both in reanalysis and in our simulations, which moved counter-clockwise in a retrograde path from Southeastern Europe over Eastern Europe towards the northern slopes of the Alps. The control simulations represent the synoptic evolution of the event reasonably well. The evolution of the precipitation event in the control simulations shows some differences in terms of its spatial and temporal characteristics compared to observations. The main precipitation event can be separated into two phases concerning the moisture sources. Our modelling results provide evidence that the two main sources contributing to the event were the continental evapotranspiration (moisture recycling; both phases) and the North Atlantic Ocean (first phase only). The Mediterranean Sea played only a minor role as a moisture source. This study confirms the importance of continental moisture recycling for heavy precipitation events over Central Europe during the summer half year.
Abstract:
Searching for and mapping the physical extent of unmarked graves using geophysical techniques has proven difficult in many cases. The success of individual geophysical techniques for detecting graves varies on a site-by-site basis. Significantly, detection of graves often results from measured contrasts that are linked to the background soils rather than to the type of archaeological feature associated with the grave. It is evident that investigation of buried remains should be considered within a 3D space, as the burial environment can vary considerably through the grave. Within this paper, we demonstrate the need for a multi-method survey strategy to investigate unmarked graves, as applied at a “planned” but unmarked pauper’s cemetery. The outcome from this case study provides new insights into the strategy that is required at such sites. Perhaps the most significant conclusion is that unmarked graves are best understood in terms of characterization rather than identification. In this paper, we argue for a methodological approach that, while following the current trend of using multiple techniques, is fundamentally dependent on a structured approach to the analysis of the data. The ramifications of this case study illustrate the necessity of an integrated strategy to provide a more holistic understanding of unmarked graves, which may aid in the management of these unseen but important aspects of our heritage. It is concluded that the search for graves is still a current debate and one that will be resolved by methodological rather than technique-based arguments.
Abstract:
Rootstock-induced dwarfing of apple scions revolutionized global apple production during the twentieth century, leading to the development of modern intensive orchards. A high root bark percentage (the percentage of the whole root area constituted by root cortex) has previously been associated with rootstock-induced dwarfing in apple. In this study, the root bark percentage was measured in a full-sib family of ungrafted apple rootstocks and found to be under the control of three loci. Two QTL for root bark percentage were found to co-localise to the same genomic regions on chromosome 5 and chromosome 11 previously identified as controlling dwarfing, Dw1 and Dw2, respectively. A third QTL was identified on chromosome 13 in a region that has not previously been associated with dwarfing. The development of three closely linked sequence-tagged site (STS) markers improved the resolution of allelic classes, thereby allowing the detection of dominance and epistatic interactions between loci, with a high root bark percentage only occurring in specific allelic combinations. In addition, we report a significant negative correlation between root bark percentage and stem diameter (an indicator of tree vigour), measured on a clonally propagated grafted subset of the mapping population. The demonstrated link between root bark percentage and rootstock-induced dwarfing of the scion leads us to propose a three-locus model that is able to explain levels of dwarfing from the dwarf ‘M.27’ to the semi-invigorating rootstock ‘M.116’. Moreover, we suggest that the QTL on chromosome 13 (Rb3) might be analogous to a third dwarfing QTL, Dw3, that has not previously been identified.
Abstract:
This paper describes new advances in the exploitation of oxygen A-band measurements from the POLDER3 sensor onboard PARASOL, a satellite platform within the A-Train. These developments result not only from an account of the dependence of the POLDER oxygen parameters on cloud optical thickness τ and on the scene's geometrical conditions but also, and more importantly, from a finer understanding of the sensitivity of these parameters to cloud vertical extent. This sensitivity is made possible by the multidirectional character of POLDER measurements. In the case of monolayer clouds, which represent most cloudy conditions, new oxygen parameters are obtained and calibrated from POLDER3 data collocated with the measurements of the two active sensors of the A-Train: CALIOP/CALIPSO and CPR/CloudSat. From a parameterization that is (μs, τ) dependent, with μs the cosine of the solar zenith angle, a cloud top oxygen pressure (CTOP) and a cloud middle oxygen pressure (CMOP) are obtained, which are estimates of the actual cloud top and middle pressures (CTP and CMP). Performances of CTOP and CMOP are presented by cloud class following the ISCCP classification. In 2008, the correlation coefficient between CMOP and CMP is 0.81 for cirrostratus, 0.79 for stratocumulus, and 0.75 for deep convective clouds. The correlation coefficient between CTOP and CTP is 0.75, 0.73, and 0.79 for the same cloud types. The score obtained by CTOP, defined as the confidence in the retrieval for a particular range of inferred values and for a given error, is higher than that of the MODIS CTP estimate. CTOP scores are highest for the most populated CTP bins. For liquid (ice) clouds and an error of 30 hPa (50 hPa), the score of CTOP reaches 50% (70%). From the difference between CTOP and CMOP, a first estimate of the cloud vertical extent h is possible. A second estimate of h comes from the correlation between the angular standard deviation of the POLDER oxygen pressure σPO2 and the cloud vertical extent. This correlation is studied in detail for liquid clouds. It is shown to be spatially and temporally robust, except for clouds above land during winter months. The analysis of the correlation's dependence on the scene's characteristics leads to a parameterization providing h from σPO2. For liquid water clouds above ocean in 2008, the mean difference between the actual cloud vertical extent and the one retrieved from σPO2 (from the pressure difference) is 5 m (−12 m). The standard deviation of the mean difference is close to 1000 m for the two methods. POLDER estimates of the cloud geometrical thickness obtain a global score of 50% confidence for a relative error of 20% (40%) of the estimate for ice (liquid) clouds over ocean. These results need to be validated outside of the CALIPSO/CloudSat track.
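One plausible first-order way to relate the CTOP-CMOP pressure difference to a geometric thickness h, offered only as context and under strong assumptions (hydrostatic balance, and the cloud middle lying roughly halfway between top and base in altitude; this is not necessarily the parameterization calibrated in the paper), is the hypsometric relation:

```latex
h \;\approx\; 2\,\frac{R_d\,\overline{T}}{g}\,\ln\!\left(\frac{\mathrm{CMOP}}{\mathrm{CTOP}}\right)
```

with R_d the gas constant for dry air, \overline{T} the mean layer temperature and g the gravitational acceleration.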
Abstract:
On 23 November 1981, a strong cold front swept across the U.K., producing tornadoes from the west to the east coasts. An extensive campaign to collect tornado reports by the Tornado and Storm Research Organisation (TORRO) resulted in 104 reports, the largest U.K. outbreak. The front was simulated with a convection-permitting numerical model down to 200-m horizontal grid spacing to better understand its evolution and meteorological environment. The event was typical of tornadoes in the U.K., with convective available potential energy (CAPE) less than 150 J kg-1, 0-1-km wind shear of 10-20 m s-1, and a narrow cold-frontal rainband forming precipitation cores and gaps. A line of cyclonic absolute vorticity existed along the front, with maxima as large as 0.04 s-1. Some hook-shaped misovortices bore kinematic similarity to supercells. The narrow swath along which the line was tornadic was bounded on the equatorward side by weak vorticity along the line and on the poleward side by zero CAPE, enclosing a region where the environment was otherwise favorable for tornadogenesis. To determine whether the 104 tornado reports were plausible, possible duplicate reports were first eliminated, leaving as few as 58 and as many as 90 tornadoes. Second, the number of possible parent misovortices that may have spawned tornadoes was estimated from model output. The number of plausible tornado reports within the 200-m grid-spacing domain was between 22 and 44, whereas the model simulation was used to estimate 30 possible parent misovortices within this domain. These results suggest that 90 reports was plausible.
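For readers unfamiliar with the quantity, the cyclonic absolute vorticity quoted above (maxima of order 0.04 s-1) is the vertical component ζ = ∂v/∂x − ∂u/∂y + f. A minimal sketch of computing it from gridded model winds follows; the grid layout, wind field and Coriolis value are illustrative assumptions, not the study's model output.

```python
import numpy as np

# Sketch: vertical component of absolute vorticity on a regular x/y grid,
# zeta_abs = dv/dx - du/dy + f, with f the Coriolis parameter.
def absolute_vorticity(u, v, dx, dy, f):
    """u, v: 2-D wind components (m/s); rows along y, columns along x."""
    dvdx = np.gradient(v, dx, axis=1)   # d(v)/dx
    dudy = np.gradient(u, dy, axis=0)   # d(u)/dy
    return dvdx - dudy + f

# Example: idealised cyclonic shear across a 200-m grid at roughly 52 deg N
u = np.tile(np.linspace(10.0, -10.0, 50)[:, None], (1, 50))  # u decreasing with y
v = np.zeros((50, 50))
f = 1.15e-4                                                   # s^-1
print(absolute_vorticity(u, v, 200.0, 200.0, f).mean())
```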
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging due to reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach showed great potential to support decision-making for risk management. It helped identify key forcing variables and generated insights into potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The ‘Hands-off’ scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The ‘Fire management’ scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the ‘Fire suppression’ scenario. The findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a ‘boundary object’ to facilitate collaboration and the integration of different forms of knowledge and perceptions of fire in the region. This approach also has the potential to support decisions in other dynamic frontier landscapes around the world that are facing an increased risk of large wildfires.
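A minimal sketch of how a fuzzy cognitive map is iterated to explore such scenarios is given below; it uses a generic FCM update with a sigmoid transfer function, and the concepts, weights and initial activations are illustrative assumptions rather than the study's calibrated model.

```python
import numpy as np

# Generic FCM update: concept activations x are iterated through the signed
# weight matrix W (W[i, j] = influence of concept i on concept j) and
# squashed with a sigmoid until they stabilise.
def run_fcm(W, x0, n_iter=50, lam=1.0):
    x = np.asarray(x0, dtype=float)
    W = np.asarray(W, dtype=float)
    for _ in range(n_iter):
        x = 1.0 / (1.0 + np.exp(-lam * (x + x @ W)))  # self-memory + incoming influence
    return x

# Toy example: drought (0) increases fire risk (1), fire management (2) reduces it
W = [[0.0,  0.8, 0.0],
     [0.0,  0.0, 0.0],
     [0.0, -0.6, 0.0]]
print(run_fcm(W, x0=[1.0, 0.5, 0.0]))   # scenario without fire management
print(run_fcm(W, x0=[1.0, 0.5, 1.0]))   # scenario with fire management active
```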
Abstract:
Several biotic crises during the past 300 million years have been linked to episodes of continental flood basalt volcanism, and in particular to the release of massive quantities of magmatic sulphur gas species. Flood basalt provinces were typically formed by numerous individual eruptions, each lasting years to decades. However, the environmental impact of these eruptions may have been limited by the occurrence of quiescent periods that lasted hundreds to thousands of years. Here we use a global aerosol model to quantify the sulphur-induced environmental effects of individual, decade-long flood basalt eruptions representative of the Columbia River Basalt Group, 16.5–14.5 million years ago, and the Deccan Traps, 65 million years ago. For a decade-long eruption of Deccan scale, we calculate a decadal-mean reduction in global surface temperature of 4.5 K, which would recover within 50 years after an eruption ceased unless climate feedbacks were very different in deep-time climates. Acid mists and fogs could have caused immediate damage to vegetation in some regions, but acid-sensitive land and marine ecosystems were well-buffered against volcanic sulphur deposition effects even during century-long eruptions. We conclude that magmatic sulphur from flood basalt eruptions would have caused a biotic crisis only if eruption frequencies and lava discharge rates had been high and sustained for several centuries at a time.
Abstract:
Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding depends strongly on the modelling approach and the accuracy of topographic data. Here, the areas at risk of seawater flooding in London boroughs were quantified based on the projected SLR scenarios reported in the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5) and the UK Climate Projections 2009 (UKCP09), using a tidally adjusted bathtub modelling approach. Medium- to very high-resolution digital elevation models (DEMs) are used to evaluate inundation extents as well as uncertainties. Depending on the SLR scenario and DEMs used, it is estimated that 3%–8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root mean square error differences observed between the two data types, which may be attributed to processing levels. Flood models based on SRTM data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and the uncertainties in DEM-based bathtub-type flood inundation modelling for London boroughs.
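A hedged sketch of the bathtub-type inundation modelling referred to above follows, with a simple connectivity constraint so that low-lying cells are flooded only if connected to the sea; the DEM values, water level and the assumption that the sea lies at the map edge are illustrative, not the study's implementation.

```python
import numpy as np
from scipy import ndimage

# Connectivity-constrained ("bathtub") inundation: cells below the projected
# water level are flooded only if hydraulically connected to the sea
# (here, assumed to touch the map edge).
def bathtub_inundation(dem, water_level):
    """dem: 2-D elevation array (m); water_level: sea level + tidal surge (m).
    Returns a boolean mask of inundated cells."""
    below = dem <= water_level
    labels, _ = ndimage.label(below)                 # connected low-lying regions
    edge_labels = np.unique(np.concatenate([labels[0, :], labels[-1, :],
                                            labels[:, 0], labels[:, -1]]))
    edge_labels = edge_labels[edge_labels != 0]      # 0 = not below water level
    return np.isin(labels, edge_labels)

# Example: 4x4 DEM, 1.0 m water level; the interior low cell (0.5 m) stays dry
dem = np.array([[0.2, 1.5, 2.0, 0.4],
                [1.8, 1.6, 1.7, 0.6],
                [1.9, 0.5, 1.8, 0.8],
                [2.0, 1.9, 1.7, 0.9]])
print(bathtub_inundation(dem, 1.0))
```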