6 results for spatial data analysis
in CUNY Academic Works
Abstract:
Researchers analyzing spatiotemporal or panel data, which vary both across locations and over time, often find that their data contain holes or gaps. This thesis explores alternative methods for filling those gaps and also proposes a set of techniques for evaluating the gap-filling methods to determine which works best.
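The abstract does not name the specific gap-filling or evaluation methods; as a hedged illustration of the general idea, the sketch below fills gaps in a hypothetical panel dataset by linear interpolation in time within each location and scores the fill against withheld values. All names and data are placeholders, not material from the thesis.

```python
import numpy as np
import pandas as pd

# Hypothetical panel: monthly values for two sites, with a few gaps (NaN).
dates = pd.date_range("2020-01-01", periods=12, freq="MS")
panel = pd.DataFrame({
    "site": np.repeat(["A", "B"], len(dates)),
    "date": np.tile(dates, 2),
    "value": np.random.default_rng(0).normal(size=2 * len(dates)),
}).set_index(["site", "date"]).sort_index()
truth = panel["value"].copy()
panel.loc[[("A", dates[3]), ("A", dates[4]), ("B", dates[5])], "value"] = np.nan

# One simple gap-filling baseline: linear interpolation in time within each site.
filled = panel["value"].groupby(level="site").transform(
    lambda s: s.interpolate(method="linear")
)

# One simple way to evaluate a gap-filler: compare filled values to withheld truth.
gaps = panel["value"].isna()
rmse = np.sqrt(((filled[gaps] - truth[gaps]) ** 2).mean())
print(f"RMSE at the artificial gaps: {rmse:.3f}")
```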
Abstract:
Lakes Enriquillo and Azuei are saltwater lakes located in a closed water basin in the southwestern region of the island of La Hispaniola, and they have been experiencing dramatic changes in total lake-surface area during the period 1980-2012. Lake Enriquillo had a surface area of approximately 276 km2 in 1984, which gradually decreased to 172 km2 in 1996. The surface area of the lake reached its lowest point in the satellite observation record in 2004, at 165 km2; the lake then began growing again, reaching its 1984 size by 2006. Based on surface area measurements for June and July 2013, Lake Enriquillo has a surface area of ~358 km2. Lake Azuei's sizes at both ends of the record are 116 km2 in 1984 and 134 km2 in 2013, an overall 15.8% increase in 30 years. Determining the causes of lake surface area changes is of extreme importance due to their environmental, social, and economic impacts. The overall goal of this study is to quantify the changing water balance in these lakes and their catchment area using satellite and ground observations and a regional atmospheric-hydrologic modeling approach. Data analyses of environmental variables in the region reflect a hydrological imbalance of the lakes due to changing regional hydro-climatic conditions. Historical data show precipitation, land surface temperature and humidity, and sea surface temperature (SST) increasing over the region during the past decades. Salinity levels have also decreased by more than 30% from previously reported baseline levels. Here we present a summary of the historical data obtained, the new sensors deployed in the surrounding sierras and in the lakes, and the integrated modeling exercises, as well as the challenges of gathering, storing, sharing, and analyzing this large volume of data, drawn from a diverse number of sources, in a remote location.
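The abstract does not spell out the water-balance formulation; as a hedged illustration of the underlying bookkeeping, the sketch below steps a minimal lumped balance for a closed-basin lake, S[t+1] = S[t] + P[t] + R[t] - E[t], with purely hypothetical inputs rather than values from the study.

```python
import numpy as np

# Minimal lumped water balance for a closed-basin lake (no river outflow):
# storage changes only through precipitation onto the lake (P), catchment
# runoff reaching the lake (R), and evaporation (E). All inputs are placeholders.
months = 12
rng = np.random.default_rng(42)
precip = rng.uniform(0.02, 0.15, months)   # km3/month onto the lake surface
runoff = rng.uniform(0.01, 0.10, months)   # km3/month delivered by the catchment
evap = rng.uniform(0.03, 0.12, months)     # km3/month lost to evaporation

storage = np.empty(months + 1)
storage[0] = 5.0                           # initial storage, km3 (placeholder)
for t in range(months):
    storage[t + 1] = storage[t] + precip[t] + runoff[t] - evap[t]

# A positive cumulative balance implies a growing lake surface area.
print(f"net storage change over the year: {storage[-1] - storage[0]:+.2f} km3")
```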
Abstract:
World Heritage sites provide a glimpse into the stories and civilizations of the past. There are currently 1007 unique World Heritage properties, with 779 classified as cultural sites, 197 as natural sites, and 31 falling into both the cultural and natural categories (UNESCO & World Heritage Centre, 1992-2015). However, of these 1007 World Heritage sites, at least 46 are categorized as in danger, and this number continues to grow. These unique and irreplaceable sites are exceptional because of their universality. Consequently, since World Heritage sites belong to all the people of the world and provide inspiration and admiration to all who visit them, it is our responsibility to help preserve them. The key form of preservation involves the individual monitoring of each site over time. While traditional methods are still extremely valuable, more recent advances in geographic and spatial technologies, including geographic information systems (GIS), laser scanning, and remote sensing, are becoming increasingly beneficial for the monitoring and overall safeguarding of World Heritage sites. Through the use and analysis of more accurate and detailed spatial data, World Heritage sites can be better managed. There is a strong urgency to protect these sites. The purpose of this thesis is to describe the importance of caring for World Heritage sites and to show how spatial technologies, in particular remote sensing imagery, can be used to monitor and in effect preserve them. The research conducted in this thesis centers on Everglades National Park, a World Heritage site that is continually affected by changes in vegetation. Data used include Landsat satellite imagery dating from 2001-2003, the Everglades' boundary shapefile, and Google Earth imagery. To conduct the in-depth analysis of vegetation change within the selected World Heritage site, three main techniques were used to study changes found within the imagery: conducting a supervised classification for each image, incorporating a vegetation index known as the Normalized Difference Vegetation Index (NDVI), and utilizing the change detection tool available in the Environment for Visualizing Images (ENVI) software. The research and analysis conducted throughout this thesis show that within the three-year time span (2001-2003) there was an overall increase in both areas of barren soil (5.760%) and areas of vegetation (1.263%), with a decrease in the percentage of areas classified as sparsely vegetated (-6.987%). These results were obtained through the maximum likelihood classification process available in the ENVI software. The results produced by the change detection tool, which further analyzed vegetation change, correlate with the results produced by the classification method. In addition, by utilizing the NDVI method, one is able to locate changes by selecting a specific area and comparing the vegetation index generated for each date. It has been found that through the utilization of remote sensing technology, it is possible to monitor and observe changes within a World Heritage site. Remote sensing is an extraordinary tool that can and should be used by all site managers and organizations whose goal is to preserve and protect World Heritage sites. Remote sensing can be used not only to observe changes over time but also to pinpoint threats within a World Heritage site.
World Heritage sites are irreplaceable sources of beauty, culture, and inspiration. It is our responsibility, as citizens of this world, to guard these treasures.
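The thesis computes NDVI and change detection inside ENVI; as a hedged, tool-agnostic illustration of the same index, the sketch below computes NDVI = (NIR - Red) / (NIR + Red) for two hypothetical band arrays and differences the two dates. The arrays, threshold, and function names are assumptions for illustration, not the thesis data or workflow.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # Guard against division by zero on empty pixels.
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

# Hypothetical reflectance rasters for two dates (stand-ins for Landsat bands).
rng = np.random.default_rng(7)
red_2001, nir_2001 = rng.uniform(0.02, 0.3, (100, 100)), rng.uniform(0.1, 0.6, (100, 100))
red_2003, nir_2003 = rng.uniform(0.02, 0.3, (100, 100)), rng.uniform(0.1, 0.6, (100, 100))

ndvi_2001 = ndvi(nir_2001, red_2001)
ndvi_2003 = ndvi(nir_2003, red_2003)

# A simple change map: pixels whose NDVI changed by more than an arbitrary threshold.
change = ndvi_2003 - ndvi_2001
print("mean NDVI 2001:", ndvi_2001.mean().round(3))
print("mean NDVI 2003:", ndvi_2003.mean().round(3))
print("pixels with NDVI change > 0.2 in magnitude:", int((np.abs(change) > 0.2).sum()))
```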
Abstract:
The reliable evaluation of flood forecasts is a crucial problem for assessing flood risk and the consequent damages. Different hydrological models (distributed, semi-distributed, or lumped) have been proposed to deal with this issue. The choice of the proper model structure has been investigated by many authors and is one of the main sources of uncertainty in the correct evaluation of the outflow hydrograph. In addition, the recent increase in data availability makes it possible to update hydrological models in response to real-time observations. For these reasons, the aim of this work is to evaluate the effect of different structures of a semi-distributed hydrological model on the assimilation of distributed, uncertain discharge observations. The study was applied to the Bacchiglione catchment, located in Italy. The first methodological step was to divide the basin into sub-basins according to topographic characteristics. Secondly, two different structures of the semi-distributed hydrological model were implemented in order to estimate the outflow hydrograph. Then, synthetic, uncertain discharge observations were generated as a function of the observed and simulated flow at the basin outlet and assimilated into the semi-distributed models using a Kalman filter. Finally, different spatial patterns of sensor locations were assumed, and the model state was updated in response to the uncertain discharge observations. The results of this work show that, overall, assimilating uncertain observations can improve hydrologic model performance. In particular, the model structure was found to be an important, and difficult to characterize, factor, since it can induce different forecasts of outflow discharge. This study is partly supported by the FP7 EU Project WeSenseIt.
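The abstract does not give the filter equations; as a hedged illustration, the sketch below applies a textbook Kalman filter update to a toy one-dimensional state given a single noisy discharge observation. The state definition, observation operator, and noise covariances are hypothetical and far simpler than a semi-distributed hydrological model.

```python
import numpy as np

# Textbook Kalman filter update for a toy 1-D state (e.g., a single storage value
# whose observable is simulated discharge). All numbers are illustrative only.
x = np.array([10.0])          # prior model state (forecast)
P = np.array([[4.0]])         # prior state error covariance
H = np.array([[0.8]])         # observation operator: state -> discharge
R = np.array([[1.0]])         # observation error covariance (uncertain sensor)
z = np.array([9.5])           # noisy discharge observation

# Update step: K = P H^T (H P H^T + R)^-1 ; x_new = x + K (z - H x)
S = H @ P @ H.T + R                       # innovation covariance
K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
x_updated = x + K @ (z - H @ x)
P_updated = (np.eye(1) - K @ H) @ P

print("updated state:", x_updated)        # pulled toward the observation
print("updated covariance:", P_updated)   # reduced uncertainty
```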
Abstract:
Instrumentation and automation play a vital role in managing the water industry. These systems generate vast amounts of data that must be effectively managed in order to enable intelligent decision making. Time series data management software, commonly known as data historians, is used for collecting and managing real-time (time series) information. More advanced software solutions provide a data infrastructure, or utility-wide Operations Data Management System (ODMS), that stores, manages, calculates, displays, shares, and integrates data from the multiple disparate automation and business systems used daily in water utilities. These ODMS solutions are proven and can manage data ranging from smart water meters to data shared across third-party corporations. This paper focuses on practical utility successes in the water industry, where utility managers are leveraging instantaneous access to data from proven, commercial off-the-shelf ODMS solutions to enable better real-time decision making. Successes include saving $650,000 per year in water loss control, safeguarding water quality, and saving millions of dollars in energy management and asset management. Immediate opportunities exist to integrate the research being done in academia with these ODMS solutions in the field and to extend these successes to utilities around the world.
Abstract:
Existing distributed hydrologic models are complex and computationally demanding to use as rapid-forecasting policy-decision tools, or even as classroom educational tools. In addition, platform dependence, specific input/output data structures, and non-dynamic data interaction with pluggable software components inside existing proprietary frameworks restrict these models to specialized user groups. RWater is a web-based hydrologic analysis and modeling framework that utilizes the widely used R software within the HUBzero cyberinfrastructure of Purdue University. RWater is designed as an integrated framework for distributed hydrologic simulation, along with subsequent parameter optimization and visualization schemes. RWater provides a platform-independent web-based interface, flexible data integration capacity, grid-based simulations, and user extensibility. RWater uses RStudio to simulate hydrologic processes on raster-based data obtained through conventional GIS pre-processing. The program integrates the Shuffled Complex Evolution (SCE) algorithm for parameter optimization. Moreover, RWater enables users to produce descriptive statistics and visualizations of the outputs at different temporal resolutions. The applicability of RWater will be demonstrated through application to two watersheds in Indiana for multiple rainfall events.
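RWater couples R-based simulation with SCE parameter optimization; as a rough illustration of the same calibration idea in a different language, the sketch below fits the two parameters of a toy linear-reservoir rainfall-runoff model by minimizing RMSE, using SciPy's differential_evolution as a stand-in for SCE. The model, synthetic data, and optimizer choice are all assumptions, not RWater's implementation.

```python
import numpy as np
from scipy.optimize import differential_evolution

def linear_reservoir(rain, k, c):
    """Toy rainfall-runoff model: c scales rainfall into storage, k drains storage as flow."""
    storage, flow = 0.0, []
    for r in rain:
        storage += c * r
        q = k * storage
        storage -= q
        flow.append(q)
    return np.array(flow)

# Hypothetical rainfall series and "observed" flow generated from known parameters.
rng = np.random.default_rng(3)
rain = rng.uniform(0.0, 10.0, 100)
observed = linear_reservoir(rain, k=0.3, c=0.8) + rng.normal(0.0, 0.1, 100)

def objective(params):
    k, c = params
    simulated = linear_reservoir(rain, k, c)
    return np.sqrt(np.mean((simulated - observed) ** 2))   # RMSE

# Global optimizer as a stand-in for an SCE-style parameter search.
result = differential_evolution(objective, bounds=[(0.01, 1.0), (0.1, 2.0)], seed=0)
print("calibrated k, c:", result.x.round(3), "RMSE:", round(result.fun, 3))
```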