4 results for interactive web site
in Digital Commons at Florida International University
Abstract:
The primary purpose of this thesis was to design and create an Interactive Audit to conduct Environmental Site Assessments according to the American Society for Testing and Materials (ASTM) Phase I standards at the Wagner Creek study area. ArcPad and ArcIMS were the major software packages used to create the model, while ArcGIS Desktop was used for data analysis and to export shapefile symbology to ArcPad. Geographic Information Systems (GIS) technology is an effective tool for these purposes; it was used to carry out data collection and data analysis and to display data interactively on the Internet. Electronic forms, customized for mobile devices, were used to survey sites, providing an easy and fast way to collect and modify field data. New data such as land use, recognized environmental conditions, and underground storage tanks can be added to existing datasets. An updated map is then generated and uploaded to the Internet using ArcIMS technology. The field investigator has the option to generate and view the Inspection Form at the end of the survey on site, or to print a hardcopy at base. The mobile device also automatically generates preliminary, editable Executive Reports for any inspected site.
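The on-site workflow the abstract describes — capture field observations on a mobile electronic form, then generate a preliminary, editable report for the inspected site — can be sketched in a few lines. This is a minimal illustration, not the thesis's ArcPad implementation; all class, field, and site names below are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SiteSurvey:
    """One inspected site, as captured on a mobile electronic form (hypothetical model)."""
    site_id: str
    land_use: str
    recognized_env_conditions: list = field(default_factory=list)
    underground_storage_tanks: int = 0
    survey_date: date = field(default_factory=date.today)

    def executive_report(self) -> str:
        """Generate a preliminary, editable Executive Report for this site."""
        lines = [
            f"Executive Report - Site {self.site_id} ({self.survey_date})",
            f"Land use: {self.land_use}",
            f"Underground storage tanks: {self.underground_storage_tanks}",
            "Recognized environmental conditions:",
        ]
        # List each observed condition, or note that none were observed.
        lines += [f"  - {rec}" for rec in self.recognized_env_conditions] or ["  (none observed)"]
        return "\n".join(lines)

survey = SiteSurvey("WC-017", "industrial",
                    recognized_env_conditions=["stained soil near loading dock"],
                    underground_storage_tanks=2)
print(survey.executive_report())
```

In the thesis workflow, a record like this would also feed the updated map published through ArcIMS; here it only renders the report text.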
Abstract:
Methods for accessing data on the Web have been the focus of active research over the past few years. In this thesis we propose a method for representing Web sites as data sources. We designed a Data Extractor data retrieval solution that allows us to define queries to Web sites and process the resulting data sets. Data Extractor is being integrated into the MSemODB heterogeneous database management system. With its help, database queries can be distributed over both local and Web data sources within the MSemODB framework. Data Extractor treats Web sites as data sources, controlling query execution and data retrieval. It works as an intermediary between the applications and the sites. Data Extractor utilizes a twofold "custom wrapper" approach for information retrieval. Wrappers for the majority of sites are easily built using a powerful and expressive scripting language, while complex cases are processed using Java-based wrappers that utilize a specially designed library of data retrieval, parsing, and Web access routines. In addition to wrapper development, we thoroughly investigate issues associated with Web site selection, analysis, and processing. Data Extractor is designed to act as a data retrieval server, as well as an embedded data retrieval solution. We also use it to create mobile agents that are shipped over the Internet to the client's computer to perform data retrieval on behalf of the user. This approach allows Data Extractor to distribute and scale well. This study confirms the feasibility of building custom wrappers for Web sites. This approach provides accuracy of data retrieval, as well as power and flexibility in handling complex cases.
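The "custom wrapper" idea — treating a Web site as a queryable data source, with a wrapper mediating between the query and the page — can be illustrated with a minimal, self-contained sketch. The class and function below are hypothetical stand-ins for the thesis's scripting-language and Java-based wrappers; a real wrapper would fetch live pages rather than parse an inline HTML string.

```python
from html.parser import HTMLParser

class TableWrapper(HTMLParser):
    """A minimal 'custom wrapper': exposes an HTML page's table rows as tuples,
    so the page can be queried like a relational data source (illustrative only)."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []              # start a new record
        elif tag in ("td", "th"):
            self._in_cell = True        # cell text belongs to the current record

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(tuple(self._row))
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

def query(page_html, predicate):
    """Run a query against the 'site as data source': wrap, extract, filter."""
    wrapper = TableWrapper()
    wrapper.feed(page_html)
    return [row for row in wrapper.rows if predicate(row)]

page = ("<table><tr><th>city</th><th>temp</th></tr>"
        "<tr><td>Miami</td><td>31</td></tr>"
        "<tr><td>Tampa</td><td>29</td></tr></table>")
hot = query(page, lambda r: r[-1].isdigit() and int(r[-1]) > 30)
print(hot)  # [('Miami', '31')]
```

In the thesis architecture, such a wrapper would sit between the MSemODB query processor and the site, returning result sets instead of printing them.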
Abstract:
The Everglades R-EMAP project for year 2005 produced large quantities of data collected at 232 sampling sites. Data collection and analysis is an ongoing, long-term activity conducted by scientists of different disciplines at irregular intervals of several years. The data sets collected for 2005 include bio-geo-chemical (including mercury and hydroperiod), fish, invertebrate, periphyton, and plant data. Each sampling site is associated with a location, a description of the site to provide a general overview, and photographs to provide a pictorial impression. The Geographic Information Systems and Remote Sensing Center (GISRSC) at Florida International University (FIU) has designed and implemented an enterprise database for long-term storage of the project's data in a central repository, providing the framework of data storage for the continuity of future sampling campaigns and allowing integration of new sample data as it becomes available. In addition, GISRSC provides this interactive web application for easy, quick, and effective retrieval and visualization of that data.
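The central-repository design the abstract describes — sampling sites with location and description, and discipline-specific measurements that new campaigns append to — can be sketched with an in-memory SQLite database. The schema, table names, and sample values below are hypothetical, not GISRSC's actual enterprise schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- One row per sampling site: location plus a general-overview description.
    CREATE TABLE site (id TEXT PRIMARY KEY, lat REAL, lon REAL, description TEXT);
    -- Discipline-specific measurements reference sites, so future
    -- sampling campaigns simply append rows against existing sites.
    CREATE TABLE measurement (site_id TEXT REFERENCES site(id),
                              discipline TEXT, analyte TEXT,
                              value REAL, year INTEGER);
""")
conn.execute("INSERT INTO site VALUES ('S-101', 25.9, -80.8, 'sawgrass marsh')")
conn.execute("INSERT INTO measurement VALUES "
             "('S-101', 'bio-geo-chemical', 'mercury', 0.12, 2005)")

# The kind of retrieval a web application would issue:
# all 2005 mercury values joined with their site context.
rows = conn.execute("""
    SELECT s.id, s.description, m.value
    FROM site s JOIN measurement m ON m.site_id = s.id
    WHERE m.analyte = 'mercury' AND m.year = 2005
""").fetchall()
print(rows)  # [('S-101', 'sawgrass marsh', 0.12)]
```

Keeping measurements in a separate table keyed by site id is what lets irregular, multi-year campaigns integrate new data without restructuring the repository.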