700 results for Urban flood
Abstract:
Toxic chemical pollutants such as heavy metals (HMs) are commonly present in urban stormwater. These pollutants can pose a significant risk to human health and hence a significant barrier to urban stormwater reuse. The primary aim of this study was to develop an approach for quantitatively assessing the risk to human health due to the presence of HMs in stormwater. This approach will lead to informed decision making in relation to risk management of urban stormwater reuse, enabling efficient implementation of appropriate treatment strategies. In this study, risks to human health from heavy metals were assessed as a hazard index (HI) and quantified as a function of traffic and land use related parameters, traffic and land use being the primary factors influencing heavy metal loads in the urban environment. The risks posed by heavy metals associated with total solids and fine solids (<150 µm) were considered to represent the maximum and minimum risk levels, respectively. The study outcomes confirmed that Cr, Mn and Pb pose the highest risks, although these elements are generally present in low concentrations. The study also found that even though the presence of a single heavy metal may not pose a significant risk, the presence of multiple heavy metals could be detrimental to human health. These findings suggest that stormwater guidelines should consider the combined risk from multiple heavy metals rather than the threshold concentration of an individual species. Furthermore, it was found that the risk to human health from heavy metals in stormwater is significantly influenced by traffic volume, and that the risk associated with stormwater from industrial areas is generally higher than that from commercial and residential areas.
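Below is a minimal sketch of how an additive hazard index of the kind referred to in this abstract is conventionally computed: each metal contributes a hazard quotient (an estimated intake divided by a reference dose), and the hazard index is the sum of the quotients. The concentrations, intake assumptions and reference doses in the example are hypothetical placeholders, not values from the study.

```python
# Illustrative sketch of an additive hazard index (HI) for multiple heavy
# metals, following the standard hazard-quotient approach
# (HQ = exposure dose / reference dose; HI = sum of HQs).
# All numbers below are hypothetical placeholders, not study values.

# Ingestion reference doses (mg/kg/day), placeholder values
RFD = {"Cr": 0.003, "Mn": 0.14, "Pb": 0.0035, "Zn": 0.3}

# Hypothetical dissolved metal concentrations in stormwater (mg/L)
concentrations = {"Cr": 0.015, "Mn": 0.12, "Pb": 0.008, "Zn": 0.25}

# Simple chronic daily intake assumptions for incidental ingestion
INTAKE_L_PER_DAY = 0.05   # litres ingested per exposure day
EXPOSURE_FRACTION = 0.1   # fraction of days with exposure
BODY_WEIGHT_KG = 70.0

def hazard_index(conc_mg_per_l):
    """Return per-metal hazard quotients and their sum (the hazard index)."""
    hq = {}
    for metal, c in conc_mg_per_l.items():
        cdi = c * INTAKE_L_PER_DAY * EXPOSURE_FRACTION / BODY_WEIGHT_KG
        hq[metal] = cdi / RFD[metal]
    return hq, sum(hq.values())

hq, hi = hazard_index(concentrations)
print(hq)
print(f"Hazard index: {hi:.3f} ({'potential concern' if hi > 1 else 'below threshold'})")
```

The case relevant to the abstract's argument is when every individual hazard quotient stays below 1 but the summed index exceeds it, which is why a combined-risk criterion can be stricter than per-metal thresholds.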
Abstract:
The current approach for protecting the receiving water environment from urban stormwater pollution is the adoption of structural measures commonly referred to as Water Sensitive Urban Design (WSUD). The treatment efficiency of WSUD measures closely depends on the design of the specific treatment units. As stormwater quality is influenced by rainfall characteristics, the selection of appropriate rainfall events for treatment design is essential to ensure the effectiveness of WSUD systems. Based on extensive field investigations in four urban residential catchments on the Gold Coast, Australia, and computer modelling, this paper details a technically robust approach for the selection of rainfall events for stormwater treatment design using a three-component model. The modelling results confirmed that high-intensity, short-duration events produced 58.0% of the TS load while generating only 29.1% of the total runoff volume. Additionally, rainfall events smaller than the 6-month average recurrence interval (ARI) generated a greater cumulative runoff volume (68.4% of the total annual runoff volume) and TS load (68.6% of the TS load exported) than rainfall events larger than the 6-month ARI. The results suggest that for the study catchments, stormwater treatment design could be based on rainfall events with a mean average intensity of 31 mm/h and a mean duration of 0.4 h. These outcomes also confirmed that selecting smaller ARI rainfall events with high intensity and short duration as the threshold for treatment system design is the most feasible approach, since these events cumulatively generate a major portion of the annual pollutant load compared to other types of events, despite producing a relatively smaller runoff volume. This implies that designs based on small, more frequent rainfall events rather than larger rainfall events would be appropriate in the context of treatment performance, cost-effectiveness and possible savings in the land area needed.
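As an illustration of the event-selection idea, the sketch below classifies hypothetical rainfall events by intensity, duration and ARI and reports the cumulative share of runoff volume and total solids (TS) load contributed by a chosen event class. The event records, field names and thresholds are invented for illustration; they are not the study's data or model.

```python
# Hedged sketch: classify rainfall events by average intensity, duration and
# ARI, then compute the cumulative share of runoff volume and TS load that a
# given class of events contributes. All values are hypothetical.

from dataclasses import dataclass

@dataclass
class RainEvent:
    intensity_mm_per_h: float  # mean rainfall intensity
    duration_h: float          # event duration
    ari_years: float           # average recurrence interval
    runoff_m3: float           # runoff volume generated
    ts_load_kg: float          # total solids load exported

events = [
    RainEvent(35.0, 0.4, 0.25, 120.0, 45.0),
    RainEvent(8.0, 3.0, 0.25, 200.0, 20.0),
    RainEvent(15.0, 1.5, 2.0, 600.0, 60.0),
]

def class_share(events, predicate):
    """Share of total runoff and TS load from events matching `predicate`."""
    total_runoff = sum(e.runoff_m3 for e in events)
    total_ts = sum(e.ts_load_kg for e in events)
    sel = [e for e in events if predicate(e)]
    return (sum(e.runoff_m3 for e in sel) / total_runoff,
            sum(e.ts_load_kg for e in sel) / total_ts)

# e.g. high-intensity, short-duration events below a 6-month (0.5-year) ARI
runoff_frac, ts_frac = class_share(
    events, lambda e: e.ari_years < 0.5 and e.intensity_mm_per_h > 20)
print(f"runoff share: {runoff_frac:.1%}, TS load share: {ts_frac:.1%}")
```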
Abstract:
Hedonic property price analysis tells us that property prices can be affected by natural hazards such as floods. This paper examines the impact of flood-related variables (among other factors) on property values, and compares the effect on property values of releasing flood risk map information with the effect of an actual flood event. An examination of the temporal variation of flood impacts on property values is also made. The study is the first of its kind in which the impact of releasing flood risk map information to the public is compared with that of an actual flood event. In this study, we adopt a spatial quasi-experimental analysis using the release of flood risk maps by Brisbane City Council in Queensland, Australia, in 2009 and the actual floods of 2011. The results suggest that property buyers are more responsive to the actual incidence of floods than to the public disclosure of information on flood risk.
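A hedged sketch of the kind of hedonic, quasi-experimental specification the abstract describes is given below: log price regressed on structural attributes plus interactions of a flood-prone indicator with post-map-release and post-flood indicators. The DataFrame, column names and file name are hypothetical; this is not the paper's estimated model.

```python
# Hedged sketch of a hedonic price regression in a quasi-experimental spirit:
# log price on structural attributes plus interactions of a flood-prone
# indicator with post-map-release and post-flood indicators.
# The data file and column names are hypothetical placeholders.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# sales: one row per transaction with (hypothetical) columns:
#   price, bedrooms, land_area, dist_cbd_km, flood_prone (0/1),
#   post_map (0/1: sold after the 2009 flood-map release),
#   post_flood (0/1: sold after the 2011 flood)
sales = pd.read_csv("brisbane_sales.csv")  # placeholder file name

model = smf.ols(
    "np.log(price) ~ bedrooms + land_area + dist_cbd_km"
    " + flood_prone + post_map + post_flood"
    " + flood_prone:post_map"     # effect of risk-map disclosure
    " + flood_prone:post_flood",  # effect of the actual flood
    data=sales,
).fit(cov_type="HC1")             # heteroskedasticity-robust errors
print(model.summary())
```

Comparing the two interaction coefficients is one way to express the abstract's finding that buyers respond more strongly to an actual flood than to the disclosure of risk information.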
Abstract:
Architecture focuses on designing built environments in response to society's needs, reflecting culture through materials and forms. The physical boundaries of the city have become blurred through the integration of digital media, connecting the physical environment with the digital. In the recent past the future was imagined as highly technological; Ridley Scott's 1982 film Blade Runner is set in 2019 and introduces a world where supersized screens inject advertisements into a cluttered urban space. Now, in 2015, screens are central to everyday life, but in a completely different way from what had been imagined. Through ubiquitous computing and social media, information is abundant. Digital technologies have changed the way people relate to cities, supporting discussion on multiple levels and allowing citizens to be more vocal than ever before. We question how architects can use the affordances of urban informatics to obtain and navigate useful social information to inform design. This chapter investigates different approaches to engaging communities in the debate on cities; in particular, it aims to capture citizens' opinions on the use and design of public places. To this end, both physical and digital discussions have been initiated. In addition to traditional consultation methods, Web 2.0 platforms, urban screens, and mobile apps are used in the context of Brisbane, Australia to explore contemporary strategies of engagement (Gray 2014).
Abstract:
Open biomass burning from wildfires and the prescribed burning of forests and farmland is a frequent occurrence in South-East Queensland (SEQ), Australia. This work reports on data collected from 10-30 September 2011, covering the days before (10-14 September), during (15-20 September) and after (21-30 September) a period of biomass burning in SEQ. The aim of this project was to comprehensively quantify the impact of the biomass burning on air quality in Brisbane, the capital city of Queensland. A multi-parameter field measurement campaign was conducted and ambient air quality data from 13 monitoring stations across SEQ were analysed. During the burning period, the average concentrations of all measured pollutants except total xylenes increased (by 20% to 430%) compared to the non-burning period (both before and after burning). The average concentrations of O3, NO2, SO2, benzene, formaldehyde, PM10, PM2.5 and visibility-reducing particles reached their highest levels for the year, up to 10 times higher than annual average levels, while PM10, PM2.5 and SO2 concentrations exceeded the WHO 24-hour guidelines and the O3 concentration exceeded the WHO maximum 8-hour average threshold during the burning period. Overall spatial variations showed that all measured pollutants, with the exception of O3, were closer to spatial homogeneity during the burning period than during the non-burning period. In addition, elevated concentrations of three biomass burning organic tracers (levoglucosan, mannosan and galactosan), together with the amount of non-refractory organic particles (PM1) and the average value of f60 (attributed to levoglucosan), reinforce the conclusion that the elevated pollutant concentrations were due to emissions from open biomass burning events, 70% of which were prescribed burns. This study, the first and most comprehensive of its kind in Australia, provides quantitative evidence of the significant impact of open biomass burning events, especially prescribed burning, on urban air quality. The results provide a solid platform for more detailed health and modelling investigations in the future.
Abstract:
This paper addresses the challenges of flood mapping using multispectral images. Quantitative flood mapping is critical for flood damage assessment and management. Remote sensing images obtained from various satellite or airborne sensors provide valuable data for this application, from which information on the extent of flooding can be extracted. However, the great challenge in data interpretation is to achieve more reliable flood extent mapping, including both the fully inundated areas and the 'wet' areas where trees and houses are partly covered by water. This is a typical combined pure-pixel and mixed-pixel problem. In this paper, a recently developed extended Support Vector Machine method for spectral unmixing has been applied to generate an integrated map showing both pure pixels (fully inundated areas) and mixed pixels (trees and houses partly covered by water). The outputs were compared with the conventional mean-based linear spectral mixture model, and better performance was demonstrated on a subset of Landsat ETM+ data recorded at the Daly River Basin, NT, Australia, on 3 March 2008, after a flood event.
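For context, the sketch below implements the conventional linear spectral mixture model used here as the comparison baseline: each pixel spectrum is approximated as a non-negative, approximately sum-to-one combination of endmember spectra. The endmember values are hypothetical, and the extended Support Vector Machine method itself is not reproduced.

```python
# Hedged sketch of conventional linear spectral unmixing: a pixel spectrum is
# modelled as a non-negative, sum-to-one combination of endmember spectra
# (e.g. water, flooded vegetation, dry land). Spectra are hypothetical.

import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel, endmembers, sum_to_one_weight=100.0):
    """Estimate fractional abundances for one pixel.

    pixel:      (bands,) reflectance vector
    endmembers: (bands, n_endmembers) matrix of endmember spectra
    The sum-to-one constraint is imposed softly by appending a heavily
    weighted row of ones, a common trick with non-negative least squares.
    """
    bands, n_em = endmembers.shape
    A = np.vstack([endmembers, sum_to_one_weight * np.ones((1, n_em))])
    b = np.concatenate([pixel, [sum_to_one_weight]])
    fractions, _ = nnls(A, b)
    return fractions

# Hypothetical 4-band endmember spectra: water, flooded vegetation, dry land
E = np.array([[0.05, 0.20, 0.30],
              [0.04, 0.25, 0.35],
              [0.03, 0.40, 0.30],
              [0.02, 0.35, 0.45]])
pixel = 0.6 * E[:, 0] + 0.4 * E[:, 1]   # a mixed water/vegetation pixel
print(unmix_pixel(pixel, E))            # approx. [0.6, 0.4, 0.0]
```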
Abstract:
The most difficult operation in flood inundation mapping using optical flood images is to separate fully inundated areas from the ‘wet’ areas where trees and houses are partly covered by water. This can be referred to as a typical mixed-pixel problem. A number of automatic information-extraction image classification algorithms have been developed over the years for flood mapping using optical remote sensing images. Most classification algorithms assign each pixel to the class label with the greatest likelihood. However, these hard classification methods often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To solve the mixed-pixel problem, advanced image processing techniques are adopted, and linear spectral unmixing is one of the most popular soft classification techniques used for mixed-pixel analysis. The performance of linear spectral unmixing depends on two important issues: the method of selecting endmembers and the method of modelling the endmembers for unmixing. This paper presents an improvement in the adaptive selection of an endmember subset for each pixel in spectral unmixing for reliable flood mapping. Using a fixed set of endmembers to unmix all pixels in an entire image can overestimate the endmember spectra residing in a mixed pixel and hence reduce the performance of spectral unmixing. In contrast, applying an estimated adaptive subset of endmembers for each pixel can decrease the residual error in the unmixing results and provide a reliable output. This paper also shows that the proposed method improves the accuracy of conventional linear unmixing methods and is easy to apply. Three different linear spectral unmixing methods were applied to test the improvement in unmixing results. Experiments were conducted on three different sets of Landsat-5 TM images of three different flood events in Australia, to examine the method under different flooding conditions, and achieved satisfactory flood mapping outcomes.
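One simple way to convey the adaptive, pixel-dependent endmember idea is sketched below: unmix a pixel with the full endmember set, discard endmembers whose estimated fraction is negligible, and re-unmix with the reduced subset. The selection rule, threshold and spectra are illustrative assumptions, not the algorithm proposed in the paper.

```python
# Illustrative sketch of per-pixel adaptive endmember selection: unmix with
# the full set, drop endmembers below a fraction threshold, re-unmix with the
# pixel-dependent subset. The rule and all values are hypothetical.

import numpy as np
from scipy.optimize import nnls

def adaptive_unmix(pixel, endmembers, min_fraction=0.05, w=100.0):
    """Return (fractions, selected_indices) using a pixel-dependent subset."""
    def constrained_nnls(E):
        # Soft sum-to-one constraint via a heavily weighted row of ones.
        A = np.vstack([E, w * np.ones((1, E.shape[1]))])
        b = np.concatenate([pixel, [w]])
        return nnls(A, b)[0]

    full = constrained_nnls(endmembers)
    keep = np.where(full >= min_fraction)[0]          # pixel-dependent subset
    subset_fractions = constrained_nnls(endmembers[:, keep])
    fractions = np.zeros(endmembers.shape[1])
    fractions[keep] = subset_fractions
    return fractions, keep

# Hypothetical 4-band spectra: water, wet vegetation, soil, roof
E = np.array([[0.05, 0.20, 0.30, 0.60],
              [0.04, 0.25, 0.35, 0.55],
              [0.03, 0.40, 0.30, 0.50],
              [0.02, 0.35, 0.45, 0.65]])
pixel = 0.7 * E[:, 0] + 0.3 * E[:, 1]
fractions, used = adaptive_unmix(pixel, E)
print(fractions, used)   # weight concentrates on endmembers 0 and 1
```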
Abstract:
China’s urbanization and industrialization are consuming farmland in large amounts, a process strongly driven by the land finance regime. Intensified regional and local competition for manufacturing investment opportunities pushes local governments to expropriate farmland at low prices while leasing land at high market value to property developers. The additional revenue obtained in this way, termed the financial increment in land values, can drive local economic growth and provide associated infrastructure and other public services. At the same time, however, a large floating population of inadequately compensated land-lost farmers, although unable to become urban citizens, has to migrate into urban areas for work, overheating employment and housing markets and pushing housing prices to unaffordable levels. This, together with various micro factors relating to the party/state’s promotion and evaluation system, plays an essential role in producing serious economic, environmental and social consequences, e.g. for migrant welfare, the displacement of peasants and the loss of land resources, that require immediate attention. Our questions are: is this type of urbanization sustainable, and what mechanisms lie behind such a phenomenal urbanization process? From the perspective of institutionalism, this paper aims to investigate the institutional background of the urban growth dilemma and its solutions in urban China, and to introduce an inter-regional game-theoretical framework to indicate why the present urbanization pattern is unsustainable. Looking forward to 2030, paradigm policy changes are proposed based on the triple consideration of the floating population, social security and urban environmental pressures. These involve: (1) changing the land-increment-based finance regime into a land-stock finance system; (2) the citizenization of migrant workers with affordable housing; and (3) creating a more enlightened local government officer appraisal system that better takes into account societal issues such as welfare and beyond.
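To illustrate the kind of reasoning an inter-regional game framework supports (this is not the paper's actual model), the sketch below sets up a two-region game in which aggressive land expropriation is individually dominant yet mutually worse than restraint, so the only Nash equilibrium is the unsustainable outcome. All payoffs are invented.

```python
# Illustrative two-region game (not the paper's model): each local government
# chooses aggressive land expropriation ("expand") or restrained conversion
# ("restrain"). Payoffs are made-up numbers arranged as a prisoner's dilemma:
# "expand" dominates individually, but mutual expansion is jointly worse.

ACTIONS = ("expand", "restrain")

# payoffs[(a1, a2)] = (region 1 payoff, region 2 payoff)
payoffs = {
    ("expand", "expand"):     (2, 2),
    ("expand", "restrain"):   (5, 1),
    ("restrain", "expand"):   (1, 5),
    ("restrain", "restrain"): (4, 4),
}

def is_nash(a1, a2):
    """True if neither region gains by unilaterally switching its action."""
    u1, u2 = payoffs[(a1, a2)]
    best1 = all(payoffs[(alt, a2)][0] <= u1 for alt in ACTIONS)
    best2 = all(payoffs[(a1, alt)][1] <= u2 for alt in ACTIONS)
    return best1 and best2

equilibria = [(a1, a2) for a1 in ACTIONS for a2 in ACTIONS if is_nash(a1, a2)]
print(equilibria)   # [('expand', 'expand')]: individually rational, jointly worse
```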
Abstract:
The most difficult operation in flood inundation mapping using optical flood images is to map the ‘wet’ areas where trees and houses are partly covered by water. This can be referred to as a typical problem of the presence of mixed pixels in the images. A number of automatic information-extraction image classification algorithms have been developed over the years for flood mapping using optical remote sensing images, most of which label each pixel as a single class. However, they often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To solve this problem, spectral unmixing methods have been developed. In this thesis, the two most important issues in spectral unmixing are investigated: the method of selecting endmembers and the method of modelling the primary classes for unmixing. We conduct comparative studies of three typical spectral unmixing algorithms: Partial Constrained Linear Spectral Unmixing, Multiple Endmember Selection Mixture Analysis and spectral unmixing using the Extended Support Vector Machine method. They are analysed and assessed through error analysis in flood mapping using MODIS, Landsat and WorldView-2 images. The conventional Root Mean Square Error assessment is applied to obtain errors for the estimated fractions of each primary class. Moreover, a newly developed Fuzzy Error Matrix is used to obtain a clear picture of error distributions at the pixel level. This thesis shows that the Extended Support Vector Machine method is able to provide a more reliable estimation of fractional abundances and allows the use of a complete set of training samples to model a defined pure class. Furthermore, it can be applied to the analysis of both pure and mixed pixels to provide integrated hard-soft classification results. Our research also identifies and explores a serious drawback of endmember selection in current spectral unmixing methods, which apply a fixed set of endmember classes or pure classes to the mixture analysis of every pixel in an entire image. Because it is not accurate to assume that every pixel in an image contains all endmember classes, these methods usually cause an over-estimation of the fractional abundances in a particular pixel. In this thesis, a subset of adaptive endmembers for every pixel is derived using the proposed methods to form an endmember index matrix. The experimental results show that using pixel-dependent endmembers in unmixing significantly improves performance.
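The conventional per-class RMSE assessment mentioned above can be expressed compactly as in the sketch below; the fraction arrays are hypothetical, and the Fuzzy Error Matrix itself is not reproduced here.

```python
# Minimal sketch of per-class RMSE for estimated fractional abundances
# against reference fractions. The arrays are hypothetical.

import numpy as np

def per_class_rmse(estimated, reference):
    """estimated, reference: (n_pixels, n_classes) fraction arrays."""
    return np.sqrt(np.mean((estimated - reference) ** 2, axis=0))

# Hypothetical fractions for three classes (water, mixed 'wet', dry land)
reference = np.array([[1.0, 0.0, 0.0],
                      [0.6, 0.4, 0.0],
                      [0.2, 0.5, 0.3]])
estimated = np.array([[0.9, 0.1, 0.0],
                      [0.5, 0.4, 0.1],
                      [0.3, 0.4, 0.3]])
print(per_class_rmse(estimated, reference))   # RMSE per primary class
```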
Abstract:
The future of civic engagement is characterised by both technological innovation and new technological user practices, fuelled by trends towards mobile, personal devices; broadband connectivity; open data; urban interfaces; and cloud computing. These technology trends are progressing at a rapid pace, and have led global technology vendors to package and sell the “Smart City” as a centralised service delivery platform predicted to optimise and enhance cities’ key performance indicators and to generate a profitable market. The top-down deployment of these large and proprietary technology platforms has helped sectors such as energy, transport, and healthcare to increase efficiencies. However, an increasing number of scholars and commentators warn of another “IT bubble” emerging. Along with some city leaders, they argue that the top-down approach does not fit the governance dynamics and values of a liberal democracy when applied across sectors. A thorough understanding is required of the socio-cultural nuances of how people work, live, and play across different environments, and how they employ social media and mobile devices to interact with, engage in, and constitute public realms. Although the term “slacktivism” is sometimes used to denote a watered-down version of civic engagement and activism that is reduced to clicking a “Like” button and signing online petitions, we believe that we are far from witnessing another Biedermeier period in which people focus on the domestic and the non-political. There is plenty of evidence to the contrary, such as the post-election violence in Kenya in 2008, the Occupy movements in New York, Hong Kong and elsewhere, the Arab Spring, Stuttgart 21, Fukushima, the Taksim Gezi Park protests in Istanbul, and the Vinegar Movement in Brazil in 2013. These examples of civic action shape the dynamics of governments and, in turn, call for new processes to be incorporated into governance structures. Participatory research into these new processes across the triad of people, place and technology is a significant and timely investment to foster productive, sustainable, and liveable human habitats. With this article, we want to reframe the current debates in academia and the priorities in industry and government to allow citizens and civic actors to take their rightful place at the centre of civic movements. This calls for new participatory approaches to co-inquiry and co-design. It is an evolving process with an explicit agenda to facilitate change, and we propose participatory action research (PAR) as an indispensable component in the journey to develop new governance infrastructures and practices for civic engagement. We do not limit our definition of civic technologies to tools specifically designed to enhance government and governance, such as renewing your car registration online or casting your vote electronically on election day. Rather, we are interested in civic media and technologies that foster citizen engagement in the widest sense, and particularly in the participatory design of civic technologies that strive to involve citizens in political debate and action and to question conventional approaches to political issues. The rationale for this approach is to offer an alternative to smart cities located in a “perpetual tomorrow,” grounded in the many weak and strong signals of technology-centred civic action seen today.
It seeks to emphasise and direct attention to active citizenry over passive consumerism, human actors over human factors, culture over infrastructure, and prosperity over efficiency. First, we look at some fundamental issues arising from applying simplistic smart city visions to the kind of problem a city poses. We focus on the touch points between “the city” and its civic body, the citizens. In order to provide for meaningful civic engagement, the city must provide appropriate interfaces.