10 results for web management
in Digital Commons at Florida International University
Abstract:
With the exponential growth in the use of web-based map services, web GIS applications have become increasingly popular. Spatial data indexing, search, analysis, visualization, and the resource management of such services are increasingly important to delivering user-desired Quality of Service. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand.
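The "Top-k Spatial Boolean Query" named above combines a Boolean keyword predicate with distance ranking. A rough illustration only: the data, function name, and brute-force linear scan below are hypothetical stand-ins, whereas sksOpen itself uses an efficient disk-resident spatial index.

```python
import math

def top_k_spatial_boolean(points, query_xy, required, excluded, k):
    """Rank points by distance to query_xy, keeping only those whose
    keyword sets contain all of `required` and none of `excluded`.
    points: list of (x, y, set_of_keywords)."""
    qx, qy = query_xy
    matches = [
        (math.hypot(x - qx, y - qy), kws)
        for x, y, kws in points
        if required <= kws and not (excluded & kws)
    ]
    matches.sort(key=lambda m: m[0])  # nearest first
    return matches[:k]

# Hypothetical points of interest with keyword tags.
pois = [
    (1.0, 1.0, {"cafe", "wifi"}),
    (2.0, 2.0, {"cafe"}),
    (0.5, 0.5, {"bar", "wifi"}),
]
# Two nearest POIs tagged "cafe" but not "bar", measured from the origin.
print(top_k_spatial_boolean(pois, (0, 0), {"cafe"}, {"bar"}, 2))
```

A production engine would answer the same query without scanning every point, by traversing a spatial index and pruning subtrees that cannot contain a better match.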
v-TerraFly is a set of techniques to predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly predicts workload demands 18.91% more accurately and allocates resources efficiently to meet the QoS target, improving QoS by 26.19% and reducing resource usage by 20.83% compared to traditional peak-load-based resource allocation.
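The contrast between online demand prediction and static peak-load provisioning can be sketched as follows. The exponentially weighted predictor and the 20% headroom factor are illustrative assumptions, not the actual models used in v-TerraFly:

```python
def ema_predict(history, alpha=0.5):
    """Exponentially weighted moving average as a one-step demand forecast."""
    forecast = history[0]
    for demand in history[1:]:
        forecast = alpha * demand + (1 - alpha) * forecast
    return forecast

def allocate(history, headroom=1.2):
    """Provision resources for the predicted demand plus a safety margin."""
    return ema_predict(history) * headroom

# Hypothetical recent workload samples (requests per minute).
requests_per_min = [100, 120, 90, 110, 105]

on_demand = allocate(requests_per_min)
peak_based = max(requests_per_min) * 1.2  # static peak-load provisioning
print(f"on-demand: {on_demand:.1f}, peak-based: {peak_based:.1f}")
```

The on-demand allocation tracks the current workload and so stays below the peak-based figure whenever demand is off its peak, which is the source of the resource savings the abstract reports.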
Abstract:
Methods for accessing data on the Web have been the focus of active research over the past few years. In this thesis we propose a method for representing Web sites as data sources. We designed Data Extractor, a data retrieval solution that allows us to define queries to Web sites and process the resulting data sets. Data Extractor is being integrated into the MSemODB heterogeneous database management system. With its help, database queries can be distributed over both local and Web data sources within the MSemODB framework. Data Extractor treats Web sites as data sources, controlling query execution and data retrieval, and works as an intermediary between applications and sites. It utilizes a twofold "custom wrapper" approach to information retrieval: wrappers for the majority of sites are easily built using a powerful and expressive scripting language, while complex cases are handled by Java-based wrappers that draw on a specially designed library of data retrieval, parsing, and Web access routines. In addition to wrapper development, we thoroughly investigate issues associated with Web site selection, analysis, and processing. Data Extractor is designed to act as a data retrieval server as well as an embedded data retrieval solution. We also use it to create mobile agents that are shipped over the Internet to the client's computer to perform data retrieval on behalf of the user, which allows Data Extractor to distribute and scale well. This study confirms the feasibility of building custom wrappers for Web sites; the approach provides accurate data retrieval along with power and flexibility in handling complex cases.
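The "custom wrapper" idea can be shown in miniature: a wrapper knows a site's page layout and turns matching fragments into tuples that a mediator such as MSemODB could query like any other relation. The HTML snippet, `wrap` function, and regular-expression approach below are hypothetical stand-ins; Data Extractor's own scripting language is not reproduced here.

```python
import re

# A toy page of known layout, standing in for a live Web site.
PAGE = """
<tr><td>TerraFly</td><td>GIS</td></tr>
<tr><td>sksOpen</td><td>Indexing</td></tr>
"""

# Non-greedy groups capture the two cell values of each table row.
ROW = re.compile(r"<tr><td>(.*?)</td><td>(.*?)</td></tr>")

def wrap(html):
    """Extract (name, category) tuples from a page of known layout."""
    return ROW.findall(html)

print(wrap(PAGE))  # → [('TerraFly', 'GIS'), ('sksOpen', 'Indexing')]
```

Real wrappers must also fetch the page, navigate forms and pagination, and survive layout drift, which is why the thesis reserves a full scripting language and a Java library for the hard cases.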
Abstract:
This study investigated how harvest and water management affect the ecology of the Pig Frog, Rana grylio, and examined how mercury levels in leg muscle tissue vary spatially across the Everglades. Rana grylio is an intermediate link in the Everglades food web. Although common, this inconspicuous species can be affected by three forms of anthropogenic disturbance: harvest, water management, and mercury contamination. The frog is harvested both commercially and recreationally for its legs, is aquatic and thus may be susceptible to water management practices, and can transfer mercury throughout the Everglades food web. This two-year study took place in three major regions: Everglades National Park (ENP), Water Conservation Area 3A (A), and Water Conservation Area 3B (B). The study categorized the three sites by their relative harvest level and hydroperiod. During the spring of 2001, areas of the Everglades dried completely. On both regional and local scales, Pig Frog abundance was highest in Site A, the longest-hydroperiod, heavily harvested site, followed by ENP and B. More frogs were found along survey transects and in capture-recapture plots before the dry-down than after it in Sites ENP and B. Individual growth patterns were similar across all sites, suggesting that differences in body size may be due to selective harvest. Frogs from Site A, the flooded and harvested site, showed no difference in survival rates between adults and juveniles. Site B populations shifted from a juvenile-dominated to an adult-dominated population after the dry-down. Dry-downs appeared to affect survival rates more than harvest did. Total mercury in frog leg tissue was highest in protected areas of Everglades National Park, where harvesting is prohibited, with a maximum concentration of 2.3 mg/kg wet mass. Similar spatial patterns in mercury levels were found among Pig Frogs and other wildlife throughout parts of the Everglades. Pig Frogs may be transferring substantial levels of mercury to other wildlife species in ENP. In summary, although abundance and survival were reduced by dry-down, the lack of adult size classes in Site A suggests that harvest also plays a role in regulating population structure.
Abstract:
This research presents several components encompassing the objective of data partitioning and replication management in a distributed GIS database. Modern Geographic Information Systems (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research studies the patterns of geographic raster data processing and proposes algorithms to improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases, to achieve high data availability and Quality of Service (QoS) in distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach is proposed for mosaicking digital images of different temporal and spatial characteristics into tiles. This dynamic approach reuses digital images on demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range, and target bands, to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for efficiently acquiring GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation, and 3D virtual-reality presentation. Vast numbers of computing, network, and storage resources on the Internet sit idle or underutilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to the GIS database. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach developed in this dissertation resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
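The on-demand mosaicking step described above — generating tiles only for the requested region under the user's resolution and temporal constraints — can be sketched as a catalogue filter. The field names and bounding-box scheme below are illustrative assumptions, not the dissertation's actual schema:

```python
def overlaps(a, b):
    """True if two axis-aligned boxes (xmin, ymin, xmax, ymax) intersect."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def select_sources(catalogue, region, year_range, max_res_m):
    """Keep only images that cover the requested region and satisfy the
    user's temporal-range and resolution constraints; only these would
    be mosaicked into the output tile."""
    return [
        img for img in catalogue
        if overlaps(img["bbox"], region)
        and year_range[0] <= img["year"] <= year_range[1]
        and img["res_m"] <= max_res_m
    ]

# Hypothetical image catalogue entries.
catalogue = [
    {"id": "a", "bbox": (0, 0, 10, 10), "year": 2001, "res_m": 1.0},
    {"id": "b", "bbox": (20, 20, 30, 30), "year": 2001, "res_m": 1.0},  # off-region
    {"id": "c", "bbox": (5, 5, 15, 15), "year": 1995, "res_m": 1.0},   # too old
]
print([img["id"] for img in select_sources(catalogue, (4, 4, 12, 12), (2000, 2005), 2.0)])
```

Because only the selected sources are read and composited, and only for the requested extent, redundant storage and computation for unrequested regions are avoided — the efficiency argument the abstract makes.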
Abstract:
The web has emerged as a potent business channel. Yet many hospitality websites are irrelevant in a new and cluttered technical world. Knowing how to promote and advertise a website and capitalizing on available resources are the keys to success. The authors lay out a marketing plan for increasing hospitality website traffic.
Abstract:
The authors report the generally poor results attained when the NAACP assessed the diversity management performance of 16 major hotel companies. Then, as an alternative means of assessing the same hotel companies’ commitment to diversity, they report the results of an analysis of the world-wide web pages the companies use to represent themselves in the electronic marketplace. Analysis of the web sites found virtually no evidence of corporate concern for diversity.
Abstract:
The authors describe a project undertaken at the School of Hotel and Restaurant Management at Northern Arizona University in which the internet is used to present Native American tribes in Arizona with customer service training. The article discusses why the project was instigated, looks at its development and funding, and highlights the educational and technological challenges that had to be overcome. This is the second in a series of articles on the uses of the internet in educating non-university student constituencies interested in hospitality management.
Abstract:
Menu analysis is the gathering and processing of key pieces of information to make it more manageable and understandable. Ultimately, menu analysis allows managers to make more informed decisions about prices, costs, and items to be included on a menu. The author discusses whether labor as well as food costs need to be included in menu analysis, and whether managers need to categorize menu items differently when doing menu analysis based on customer eating patterns.
Abstract:
Context: The internet is gaining popularity as a means of delivering employee-based cardiovascular (CV) wellness interventions, though little is known about the cardiovascular health outcomes of these programs. In this review, we examined the effectiveness of internet-based employee cardiovascular wellness and prevention programs. Evidence Acquisition: We conducted a systematic review by searching PubMed, Web of Science, and the Cochrane Library for all published studies on internet-based programs aimed at improving CV health among employees up to November 2012. We grouped the outcomes according to the American Heart Association (AHA) indicators of cardiovascular wellbeing: weight, BP, lipids, smoking, physical activity, diet, and blood glucose. Evidence Synthesis: A total of 18 randomized trials and 11 follow-up studies met our inclusion/exclusion criteria. Follow-up duration ranged from 6 to 24 months. There were significant differences in intervention types and in the number of components in each intervention. Modest improvements were observed in more than half of the studies with weight-related outcomes, while no improvement was seen in virtually all the studies with physical activity outcomes. In general, internet-based programs were more successful if the interventions also included some physical contact and environmental modification, and if they were targeted at specific disease entities such as hypertension. Only a few of the studies were conducted in persons at risk for CVD, and none in blue-collar workers or low-income earners. Conclusion: Internet-based programs hold promise for improving cardiovascular wellness among employees; however, much work is required to fully understand their utility and long-term impact, especially in special/at-risk populations.