928 results for Web-Management Blog


Relevance:

30.00%

Publisher:

Abstract:

The manufacturing industry faces many challenges, such as reducing time-to-market and cutting costs. In order to meet these increasing demands, effective methods are needed to support the early product development stages by bridging the gap between communicating early design ideas and evaluating manufacturing performance. This paper introduces methods of linking the design and manufacturing domains using disparate technologies. The combined technologies include knowledge management support for product lifecycle management systems, Enterprise Resource Planning (ERP) systems, aggregate process planning systems, workflow management and data exchange formats. A case study has been used to demonstrate the use of these technologies, illustrated by adding manufacturing knowledge to generate alternative early process plans, which are in turn used by an ERP system to obtain and optimise a rough-cut capacity plan. Copyright © 2010 Inderscience Enterprises Ltd.
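The rough-cut capacity planning step described in the abstract can be sketched in miniature: given alternative process plans expressed as hours per unit at each work centre, check which plans fit within available capacity. All work-centre names and figures below are invented for illustration; this is not the paper's system.

```python
# Hypothetical sketch of rough-cut capacity planning: compare the load each
# alternative process plan places on work centres against available capacity.
# Work centres, hours, and demand are illustrative, not taken from the paper.

def rough_cut_load(plan, demand):
    """Total hours required per work centre for a given demand (units)."""
    return {wc: hours * demand for wc, hours in plan.items()}

def feasible(plan, demand, capacity):
    """A plan is feasible if no work centre exceeds its available hours."""
    load = rough_cut_load(plan, demand)
    return all(load[wc] <= capacity.get(wc, 0) for wc in load)

# Two alternative early process plans (hours per unit at each work centre).
plan_a = {"milling": 0.5, "assembly": 0.3}
plan_b = {"turning": 0.4, "assembly": 0.6}
capacity = {"milling": 120, "turning": 100, "assembly": 90}

demand = 200
best = [p for p in (("A", plan_a), ("B", plan_b)) if feasible(p[1], demand, capacity)]
```

With these invented numbers, plan B overloads assembly (120 h required against 90 h available), so only plan A survives the rough cut.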

Relevance:

30.00%

Publisher:

Abstract:

Risk management and knowledge management have so far been studied almost independently. The evolution of risk management to the holistic view of Enterprise Risk Management requires the destruction of barriers between organizational silos and the exchange and application of knowledge from different risk management areas. However, knowledge management has received little or no attention in risk management. This paper examines possible relationships between knowledge management constructs related to knowledge sharing, and two risk management concepts: perceived quality of risk control and perceived value of enterprise risk management. From a literature review, relationships with eight knowledge management variables covering people, process and technology aspects were hypothesised. A survey was administered to risk management employees in financial institutions. The results showed that the perceived quality of risk control is significantly associated with four knowledge management variables: perceived quality of risk knowledge sharing, perceived quality of communication among people, web channel functionality, and risk management information system functionality. However, the relationships of the knowledge management variables to the perceived value of enterprise risk management are not significant. We conclude that better knowledge management is associated with better risk control, but that more effort needs to be made to break down organizational silos in order to support true Enterprise Risk Management.
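The kind of association behind findings like these can be illustrated with a Pearson correlation on survey responses. The data below are invented Likert-scale ratings, not the paper's; the paper's actual statistical method is not specified in the abstract.

```python
# Illustrative sketch (not the paper's analysis or data): the association
# between a knowledge-management variable and perceived quality of risk
# control, as a Pearson correlation over hypothetical 1-5 Likert ratings.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical ratings from eight respondents.
knowledge_sharing = [4, 5, 3, 4, 2, 5, 3, 4]
risk_control      = [4, 5, 3, 3, 2, 4, 3, 5]

r = pearson(knowledge_sharing, risk_control)
```

A value of `r` near 1 would indicate the positive association the paper reports between risk knowledge sharing and perceived quality of risk control.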

Relevance:

30.00%

Publisher:

Abstract:

Introduction: There is increasing evidence that electronic prescribing (ePrescribing) or computerised provider/physician order entry (CPOE) systems can improve the quality and safety of healthcare services. However, it has also become clear that their implementation is not straightforward and may create unintended or undesired consequences once in use. In this context, qualitative approaches have been particularly useful and their interpretative synthesis could make an important and timely contribution to the field. This review will aim to identify, appraise and synthesise qualitative studies on ePrescribing/CPOE in hospital settings, with or without clinical decision support. Methods and analysis: Data sources will include the following bibliographic databases: MEDLINE, MEDLINE In Process, EMBASE, PsycINFO, Social Policy and Practice via Ovid, CINAHL via EBSCO, The Cochrane Library (CDSR, DARE and CENTRAL databases), Nursing and Allied Health Sources, Applied Social Sciences Index and Abstracts via ProQuest and SCOPUS. In addition, other sources will be searched for ongoing studies (ClinicalTrials.gov) and grey literature: Healthcare Management Information Consortium, Conference Proceedings Citation Index (Web of Science) and Sociological Abstracts. Studies will be independently screened for eligibility by two reviewers. Qualitative studies, either standalone or in the context of mixed-methods designs, reporting the perspectives of any actors involved in the implementation, management and use of ePrescribing/CPOE systems in hospital-based care settings will be included. Data extraction will be conducted by two reviewers using a piloted form. Quality appraisal will be based on criteria from the Critical Appraisal Skills Programme checklist and Standards for Reporting Qualitative Research. Studies will not be excluded based on quality assessment. A postsynthesis sensitivity analysis will be undertaken. Data analysis will follow the thematic synthesis method.
Ethics and dissemination: The study does not require ethical approval as primary data will not be collected. The results of the study will be published in a peer-reviewed journal and presented at relevant conferences.

Relevance:

30.00%

Publisher:

Abstract:

Methods for accessing data on the Web have been the focus of active research over the past few years. In this thesis we propose a method for representing Web sites as data sources. We designed a Data Extractor data retrieval solution that allows us to define queries to Web sites and process the resulting data sets. Data Extractor is being integrated into the MSemODB heterogeneous database management system. With its help, database queries can be distributed over both local and Web data sources within the MSemODB framework. Data Extractor treats Web sites as data sources, controlling query execution and data retrieval. It works as an intermediary between the applications and the sites. Data Extractor utilizes a twofold “custom wrapper” approach for information retrieval. Wrappers for the majority of sites are easily built using a powerful and expressive scripting language, while complex cases are processed using Java-based wrappers that utilize a specially designed library of data retrieval, parsing and Web access routines. In addition to wrapper development, we thoroughly investigate issues associated with Web site selection, analysis and processing. Data Extractor is designed to act as a data retrieval server as well as an embedded data retrieval solution. We also use it to create mobile agents that are shipped over the Internet to the client's computer to perform data retrieval on behalf of the user. This approach allows Data Extractor to distribute and scale well. This study confirms the feasibility of building custom wrappers for Web sites. The approach provides accurate data retrieval, along with power and flexibility in handling complex cases.
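The "custom wrapper" idea can be sketched as a small declarative wrapper that turns a page's HTML into structured rows. The page layout, patterns, and field names below are invented; the thesis's own wrapper scripting language is not reproduced here.

```python
# Minimal illustration of a declarative Web-site wrapper: a row pattern
# locates records, and per-field patterns extract named values from each row.
# The HTML and patterns are invented examples, not the thesis's language.
import re

def run_wrapper(html, row_pattern, field_patterns):
    """Apply a wrapper spec: find rows, then extract named fields from each."""
    rows = []
    for chunk in re.findall(row_pattern, html, re.S):
        record = {}
        for name, pat in field_patterns.items():
            m = re.search(pat, chunk, re.S)
            record[name] = m.group(1).strip() if m else None
        rows.append(record)
    return rows

html = """
<tr><td class="t">Widget</td><td class="p">9.99</td></tr>
<tr><td class="t">Gadget</td><td class="p">4.50</td></tr>
"""
wrapper = {
    "title": r'<td class="t">(.*?)</td>',
    "price": r'<td class="p">(.*?)</td>',
}
data = run_wrapper(html, r"<tr>.*?</tr>", wrapper)
```

Real wrappers would also handle pagination, sessions, and malformed markup — the "complex cases" the thesis delegates to Java-based wrappers.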

Relevance:

30.00%

Publisher:

Abstract:

This study investigated how harvest and water management affected the ecology of the Pig Frog, Rana grylio. It also examined how mercury levels in leg muscle tissue vary spatially across the Everglades. Rana grylio is an intermediate link in the Everglades food web. Although common, this inconspicuous species can be affected by three forms of anthropogenic disturbance: harvest, water management and mercury contamination. The frog is harvested both commercially and recreationally for its legs, is aquatic and thus may be susceptible to water management practices, and can transfer mercury throughout the Everglades food web. This two-year study took place in three major regions: Everglades National Park (ENP), Water Conservation Area 3A (A) and Water Conservation Area 3B (B). The study categorized the three sites by their relative harvest level and hydroperiod. During the spring of 2001, areas of the Everglades dried completely. On both regional and local scales, Pig Frog abundance was highest in Site A, the longest-hydroperiod, heavily harvested site, followed by ENP and B. More frogs were found along survey transects and in capture-recapture plots before the dry-down than after it in Sites ENP and B. Individual growth patterns were similar across all sites, suggesting that differences in body size may be due to selective harvest. Frogs from Site A, the flooded and harvested site, showed no differences in survival rates between adults and juveniles. Site B populations shifted from a juvenile-dominated to an adult-dominated population after the dry-down. Dry-downs appeared to affect survival rates more than harvest did.
Total mercury in frog leg tissue was highest in protected areas of Everglades National Park, where harvesting is prohibited, with a maximum concentration of 2.3 mg/kg wet mass. Similar spatial patterns in mercury levels were found among Pig Frogs and other wildlife throughout parts of the Everglades. Pig Frogs may be transferring substantial levels of mercury to other wildlife species in ENP. In summary, although abundance and survival were found to be reduced by the dry-down, the lack of adult size classes in Site A suggests that harvest also plays a role in regulating population structure.
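Capture-recapture plots like those used here are commonly analysed with the two-sample Lincoln-Petersen estimator. The sketch below uses Chapman's bias-corrected form with invented counts; the study's own estimator and data are not given in the abstract.

```python
# Chapman's bias-corrected Lincoln-Petersen estimator for a two-sample
# capture-recapture survey. The frog counts below are invented examples.

def chapman_estimate(marked_first, caught_second, recaptured):
    """Estimated population size: ((M+1)(C+1)/(R+1)) - 1."""
    return ((marked_first + 1) * (caught_second + 1)) / (recaptured + 1) - 1

# e.g. 50 frogs marked in the first visit, 40 caught on the second visit,
# 10 of which already carried marks:
n_hat = chapman_estimate(50, 40, 10)
```

The estimator rests on the usual closed-population assumptions (no births, deaths, or migration between samples, and equal catchability), which a dry-down would plainly violate — one reason repeated surveys before and after the event matter.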

Relevance:

30.00%

Publisher:

Abstract:

This research presents several components encompassing the scope of Data Partitioning and Replication Management in a Distributed GIS Database. Modern Geographic Information Systems (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographical raster data processing and to propose algorithms to improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases, in order to achieve high data availability and Quality of Service (QoS) under distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach for mosaicking digital images of different temporal and spatial characteristics into tiles is proposed. This dynamic approach reuses digital images upon demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range and target bands, in order to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for the efficient acquisition of GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation and 3D virtual-reality presentation. Vast numbers of computing, network and storage resources on the Internet sit idle or underutilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network and storage resources to be used in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database.
The approach developed in this dissertation has resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers and the general public to facilitate Web access to remotely sensed imagery and GIS vector information.
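The on-demand mosaicking step can be sketched as a tile-selection filter: keep only source images that intersect the requested region and fall inside the requested temporal range. The metadata fields and values below are invented for illustration; the dissertation's actual data model is not shown in the abstract.

```python
# Illustrative sketch of on-demand tile selection for mosaicking: filter
# source images by spatial intersection and temporal range. The image
# metadata below is invented, not TerraFly's actual schema.

def intersects(a, b):
    """Axis-aligned bounding boxes given as (min_x, min_y, max_x, max_y)."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def select_tiles(images, bbox, year_range):
    """Ids of images overlapping bbox and acquired within year_range."""
    lo, hi = year_range
    return [img["id"] for img in images
            if intersects(img["bbox"], bbox) and lo <= img["year"] <= hi]

images = [
    {"id": "img-1", "bbox": (0, 0, 10, 10), "year": 2001},
    {"id": "img-2", "bbox": (20, 20, 30, 30), "year": 2001},
    {"id": "img-3", "bbox": (5, 5, 15, 15), "year": 1995},
]
tiles = select_tiles(images, bbox=(8, 8, 12, 12), year_range=(2000, 2005))
```

Only the selected images would then be resampled and composited into the requested tile, which is what lets the system avoid storing pre-built mosaics for every region and epoch.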

Relevance:

30.00%

Publisher:

Abstract:

The web has emerged as a potent business channel. Yet many hospitality websites are irrelevant in a new and cluttered technical world. Knowing how to promote and advertise a website and capitalizing on available resources are the keys to success. The authors lay out a marketing plan for increasing hospitality website traffic.

Relevance:

30.00%

Publisher:

Abstract:

The authors report the generally poor results attained when the NAACP assessed the diversity management performance of 16 major hotel companies. Then, as an alternative means of assessing the same hotel companies’ commitment to diversity, they report the results of an analysis of the world-wide web pages the companies use to represent themselves in the electronic marketplace. Analysis of the web sites found virtually no evidence of corporate concern for diversity.

Relevance:

30.00%

Publisher:

Abstract:

The authors describe a project undertaken at the School of Hotel and Restaurant Management at Northern Arizona University in which the internet is used to present Native American tribes in Arizona with customer service training. It discusses why the project was instigated, looks at its development and funding, and highlights the educational and technological challenges that had to be overcome. This is the second in a series of articles on the uses of the internet in educating non-university student constituencies interested in hospitality management.

Relevance:

30.00%

Publisher:

Abstract:

Menu analysis is the gathering and processing of key pieces of information to make it more manageable and understandable. Ultimately, menu analysis allows managers to make more informed decisions about prices, costs, and the items to be included on a menu. The author discusses whether labor costs as well as food costs need to be included in menu analysis, and whether managers need to categorize menu items differently when doing menu analysis based on customer eating patterns.
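The cost question the author raises can be made concrete with a contribution-margin calculation: first with food cost only, then with an allocated labor cost included. Prices and costs below are invented for illustration.

```python
# Sketch of the cost side of menu analysis: contribution margin per item,
# computed with food cost only and then with an allocated labor cost.
# Menu items, prices, and costs are invented examples.

def contribution_margin(price, food_cost, labor_cost=0.0):
    """What each sale contributes after the included costs."""
    return price - food_cost - labor_cost

menu = {
    "pasta": {"price": 14.00, "food": 3.50, "labor": 2.00},
    "steak": {"price": 28.00, "food": 11.00, "labor": 2.50},
}

food_only  = {k: contribution_margin(v["price"], v["food"])
              for k, v in menu.items()}
with_labor = {k: contribution_margin(v["price"], v["food"], v["labor"])
              for k, v in menu.items()}
```

Including labor narrows each item's margin but need not change the ranking of items, which is one reason the question of what to include in the analysis matters.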

Relevance:

30.00%

Publisher:

Abstract:

Context: The internet is gaining popularity as a means of delivering employee-based cardiovascular (CV) wellness interventions, though little is known about the cardiovascular health outcomes of these programs. In this review, we examined the effectiveness of internet-based employee cardiovascular wellness and prevention programs. Evidence Acquisition: We conducted a systematic review by searching PubMed, Web of Science and the Cochrane Library for all published studies on internet-based programs aimed at improving CV health among employees, up to November 2012. We grouped the outcomes according to the American Heart Association (AHA) indicators of cardiovascular wellbeing: weight, BP, lipids, smoking, physical activity, diet, and blood glucose. Evidence Synthesis: A total of 18 randomized trials and 11 follow-up studies met our inclusion/exclusion criteria. Follow-up duration ranged from 6 to 24 months. There were significant differences in intervention types and in the number of components in each intervention. Modest improvements were observed in more than half of the studies with weight-related outcomes, while no improvement was seen in virtually all the studies with physical activity outcomes. In general, internet-based programs were more successful if the interventions also included some physical contact and environmental modification, and if they were targeted at specific disease entities such as hypertension. Only a few of the studies were conducted in persons at risk for CVD, and none in blue-collar workers or low-income earners. Conclusion: Internet-based programs hold promise for improving cardiovascular wellness among employees; however, much work is required to fully understand their utility and long-term impact, especially in special/at-risk populations.

Relevance:

30.00%

Publisher:

Abstract:

Cloud computing can be defined as a distributed computational model through which resources (hardware, storage, development platforms and communication) are shared as paid services accessible with minimal management effort and interaction. A great benefit of this model is that it enables the use of multiple providers (i.e. a multi-cloud architecture) to compose a set of services in order to obtain an optimal configuration for performance and cost. However, multi-cloud use is hindered by the problem of cloud lock-in: the dependency between an application and a cloud platform. Lock-in is commonly addressed by three strategies: (i) use of an intermediate layer between the consumers of cloud services and the provider; (ii) use of standardized interfaces to access the cloud; or (iii) use of models with open specifications. This paper outlines an approach to evaluate these strategies. The evaluation was performed, and it was found that despite the advances made by these strategies, none of them actually solves the problem of cloud lock-in. In this light, this work proposes the use of the Semantic Web to avoid cloud lock-in, where RDF models are used to specify the features of a cloud and are managed through SPARQL queries. In this direction, this work: (i) presents an evaluation model that quantifies the problem of cloud lock-in; (ii) evaluates cloud lock-in in three multi-cloud solutions and three cloud platforms; (iii) proposes the use of RDF and SPARQL for the management of cloud resources; (iv) presents the Cloud Query Manager (CQM), a SPARQL server that implements the proposal; and (v) compares three multi-cloud solutions with CQM in terms of response time and effectiveness in resolving cloud lock-in.
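The RDF/SPARQL idea can be illustrated with a simplified stand-in: cloud features as subject-predicate-object triples, queried with a tiny pattern matcher in place of a full SPARQL engine. The triples and feature names below are invented; the paper's CQM uses an actual SPARQL server.

```python
# Simplified stand-in for the paper's RDF/SPARQL approach: cloud features
# as subject-predicate-object triples, with None playing the role of a
# SPARQL variable. Providers and features are invented examples.

triples = {
    ("cloudA", "offers", "vm"),
    ("cloudA", "region", "us-east"),
    ("cloudB", "offers", "vm"),
    ("cloudB", "region", "eu-west"),
    ("cloudB", "offers", "object-storage"),
}

def match(pattern):
    """Return all triples matching a (subject, predicate, object) pattern."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which clouds offer object storage?" — the analogue of
# SELECT ?cloud WHERE { ?cloud :offers :object-storage }
providers = sorted(t[0] for t in match((None, "offers", "object-storage")))
```

The point of the RDF representation is that an application queries features rather than provider-specific APIs, so swapping providers means changing triples, not application code.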

Relevance:

30.00%

Publisher:

Abstract:

Over the last twenty years, with the development of the Internet, the way people communicate has completely changed. The Internet has shortened distances and, above all, through their websites companies now have a permanently accessible showcase to the world. All of this has led to new consumer behaviours: consumers have become ever more demanding amid the vast amount of information on the Web. Web companies therefore need to produce efficient and usable websites that foster interaction with the user. The Web has also expanded rapidly in terms of methodologies for development and for analysing consumer behaviour. New approaches are constantly sought to capture the path a user follows to complete a given action within a domain. For this reason, beyond established tools such as questionnaires or tracking through platforms like Google Analytics, this work goes further and analyses the "consum-actor" in greater depth. An eye-tracker makes it possible to identify the cognitive models underlying the search, evaluation and purchase of a product or a call to action, and to see how the contents of a web application influence attention and user experience. The aim of this study is therefore to measure user engagement while navigating a web application and, where necessary, to optimise its contents. To collect the information needed during the experiment, I used a decision-support tool, namely an eye-tracker, followed by the administration of questionnaires.
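One common engagement measure derived from eye-tracking data is dwell time per area of interest (AOI): how long the gaze rests on, say, a call-to-action button versus a banner. The sketch below computes it from timestamped gaze samples; the sample data, AOI names, and boxes are invented and are not the thesis's actual measurements.

```python
# Illustrative sketch of one eye-tracking engagement metric: total dwell
# time per area of interest (AOI), computed from gaze samples taken at a
# fixed sampling interval. AOIs and gaze points are invented examples.

def in_box(x, y, box):
    """Is the gaze point inside the AOI rectangle (x0, y0, x1, y1)?"""
    x0, y0, x1, y1 = box
    return x0 <= x <= x1 and y0 <= y <= y1

def dwell_times(samples, aois, dt_ms):
    """samples: (x, y) gaze points recorded every dt_ms milliseconds."""
    totals = {name: 0 for name in aois}
    for x, y in samples:
        for name, box in aois.items():
            if in_box(x, y, box):
                totals[name] += dt_ms
    return totals

aois = {"cta_button": (100, 100, 200, 140), "banner": (0, 0, 800, 80)}
samples = [(120, 110), (130, 115), (400, 40), (500, 300)]
dwell = dwell_times(samples, aois, dt_ms=16)
```

Real eye-tracking pipelines would first group raw samples into fixations and discard saccades, but dwell time per AOI is the kind of quantity an engagement analysis like this one compares across page designs.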

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this report is to document my experience as an intern at The Silver Factory – Marketing, Design e Gestão Comercial, Lda, within the Internship course unit of the Master's in Computer Engineering – Mobile Computing (MEI-CM). As the internship is an integral part of obtaining the Master's degree, knowledge acquired throughout my university studies (Bachelor's and the first year of the Master's) was put into practice in order to successfully complete all the projects proposed to me. During the internship I worked in online communication, developing websites and assisting with the management of client hosting. The main goal was to develop a backoffice and its components. Of the various components developed, the most important were the Content Management System, which manages the content presented on the frontend, and the Project Management Tool, which allows a company to manage its clients and the projects it develops for them, with the option of visualising project progress through an interactive Gantt chart.
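The progress figure behind a Gantt view like the one described can be sketched as a duration-weighted completion percentage over a project's tasks. The task list below is invented; this is not the internship's actual code.

```python
# Sketch of the data behind a project-progress Gantt view: percent complete
# for a project, weighted by task duration. Tasks and figures are invented.

def project_progress(tasks):
    """tasks: list of (duration_days, fraction_done); returns percent done."""
    total = sum(d for d, _ in tasks)
    done = sum(d * f for d, f in tasks)
    return round(100 * done / total, 1) if total else 0.0

# A hypothetical client project: (duration in days, fraction complete).
website_redesign = [(10, 1.0), (5, 0.5), (5, 0.0)]
progress = project_progress(website_redesign)
```

A backoffice tool would persist such task records per client and feed both the overall percentage and the per-task bars of the interactive Gantt chart.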