900 results for Web services. Service Composition. PEWS. Runtime systems
Abstract:
The ontological analysis of conceptual modelling techniques has become increasingly popular. Related research has explored not only the ontological deficiencies of classical techniques such as ER or UML, but also business process modelling techniques such as ARIS and even Web services standards such as BPEL4WS. While the selected ontologies are reasonably mature, it is the actual process of an ontological analysis that still lacks rigor. The current procedure leaves significant room for individual interpretation and is one reason the entire ontological analysis approach draws criticism. This paper proposes a procedural model for ontological analysis based on the use of meta models, the involvement of more than one coder, and metrics. The model is explained with examples from various ontological analyses.
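Since the proposed procedural model involves more than one coder, an inter-coder agreement metric is a natural companion. Below is a minimal sketch in Python, assuming two coders have mapped the same six modelling constructs to ontological categories; the category labels are hypothetical, and Cohen's kappa is used here as one plausible metric, not necessarily the one the paper prescribes.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' category assignments."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Fraction of items on which the two coders agree.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Agreement expected if both coders assigned categories independently.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical mappings of six modelling constructs to ontological categories.
coder_a = ["thing", "property", "thing", "state", "event", "thing"]
coder_b = ["thing", "property", "state", "state", "event", "thing"]
kappa = cohens_kappa(coder_a, coder_b)
```

A kappa near 1 indicates the coders' ontological mappings agree far beyond chance; values near 0 suggest the analysis procedure still leaves too much room for individual interpretation.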
Abstract:
This work analyses Communication Studies in the context of the internet and outlines new research methodologies for the field, aimed at filtering scientific meaning from the information flows of social networks, news media, and any other device that allows storage of and access to structured and unstructured information. Reflecting on the paths along which these information flows develop, and especially on the volume produced, the project maps the fields of meaning that this relationship configures in research theory and practice. The general objective of this work is to situate Communication Studies within the mutable, dynamic reality of the internet environment and to draw parallels with applications already achieved in other fields. Using the case-study method, three cases were analysed through two conceptual lenses, Web Sphere Analysis and Web Science, contrasting information systems in their discursive and structural aspects. The aim is to observe what Communication Studies gains from viewing its objects of study in the internet environment through these perspectives. The research shows that seeking new forms of learning is a challenge for the Communication Studies researcher, but the feedback of information in the collaborative environment of the internet is fertile ground for research: data modelling gains an analytical corpus when the set of tools promoted and driven by technology makes it possible to isolate content and to deepen the analysis of meanings and their relations.
Abstract:
This thesis provides an interoperable language for quantifying uncertainty using probability theory. A general introduction to interoperability and uncertainty is given, with particular emphasis on the geospatial domain. Existing interoperable standards used within the geospatial sciences are reviewed, including Geography Markup Language (GML), Observations and Measurements (O&M) and the Web Processing Service (WPS) specifications. The importance of uncertainty in geospatial data is identified and probability theory is examined as a mechanism for quantifying these uncertainties. The Uncertainty Markup Language (UncertML) is presented as a solution to the lack of an interoperable standard for quantifying uncertainty. UncertML is capable of describing uncertainty using statistics, probability distributions or a series of realisations. The capabilities of UncertML are demonstrated through a series of XML examples. This thesis then provides a series of example use cases where UncertML is integrated with existing standards in a variety of applications. The Sensor Observation Service - a service for querying and retrieving sensor-observed data - is extended to provide a standardised method for quantifying the inherent uncertainties in sensor observations. The INTAMAP project demonstrates how UncertML can be used to aid uncertainty propagation using a WPS by allowing UncertML as input and output data. The flexibility of UncertML is demonstrated with an extension to the GML geometry schemas to allow positional uncertainty to be quantified. Further applications and developments of UncertML are discussed.
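As a rough illustration of the kind of XML document UncertML enables, the sketch below serialises a Gaussian uncertainty statement. The element names are illustrative stand-ins, not the normative UncertML schema, which should be consulted for the real vocabulary and namespaces.

```python
import xml.etree.ElementTree as ET

def gaussian_xml(mean, variance):
    """Serialise a Gaussian uncertainty statement in an UncertML-style
    vocabulary. Element names here are hypothetical placeholders."""
    dist = ET.Element("GaussianDistribution")
    ET.SubElement(dist, "mean").text = str(mean)
    ET.SubElement(dist, "variance").text = str(variance)
    return ET.tostring(dist, encoding="unicode")

# E.g. an uncertain sensor observation: 12.4 units with variance 2.25.
doc = gaussian_xml(12.4, 2.25)
```

A service such as the extended Sensor Observation Service could attach a fragment like this to each observation, letting downstream consumers propagate the stated distribution rather than treating the value as exact.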
Abstract:
Overlaying maps using a desktop GIS is often the first step of a multivariate spatial analysis. The potential of this operation has increased considerably as data sources, and Web services to manipulate them, become widely available via the Internet. Standards from the OGC enable such geospatial mashups to be seamless and user driven, involving discovery of thematic data. The user is naturally inclined to look for spatial clusters and correlation of outcomes. Using classical cluster detection scan methods to identify multivariate associations can be problematic in this context because of a lack of control on, or knowledge about, background populations. For public health and epidemiological mapping this limiting factor can be critical, but often the focus is on spatial identification of risk factors associated with health or clinical status. In this article we point out that this association itself can ensure some control on underlying populations, and develop an exploratory scan statistic framework for multivariate associations. Inference using statistical map methodologies can be used to test the clustered associations. The approach is illustrated with a hypothetical data example and an epidemiological study on community MRSA. Scenarios of potential use for online mashups are introduced, but full implementation is left for further research. [Figure caption: Spatial entropy index HSu for the ScankOO analysis of the hypothetical dataset, using a vicinity fixed by the number of points without distinction between their labels; label size is proportional to the inverse of the index.]
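The abstract does not give the formula for the spatial entropy index HSu, but its description (a vicinity fixed by the number of points, without distinction between their labels) suggests something like the Shannon entropy of case/control labels among the k nearest neighbours of each point. A hypothetical sketch under that reading, with made-up points and labels:

```python
import math

def label_entropy(labels):
    """Shannon entropy (in nats) of a multiset of labels."""
    n = len(labels)
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def vicinity_entropy(points, labels, idx, k):
    """Entropy of labels among the k nearest neighbours of point idx;
    the vicinity is fixed by point count, not by label."""
    px, py = points[idx]
    order = sorted(range(len(points)),
                   key=lambda j: (points[j][0] - px) ** 2 + (points[j][1] - py) ** 2)
    neighbours = [j for j in order if j != idx][:k]
    return label_entropy([labels[j] for j in neighbours])

# Hypothetical point pattern: a tight cluster of cases away from controls.
points = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]
labels = ["case", "case", "case", "control", "control", "control"]
h_cluster = vicinity_entropy(points, labels, 0, 2)  # both neighbours are cases
h_mixed = vicinity_entropy(points, labels, 0, 4)    # two cases, two controls
```

Low entropy in a vicinity flags label homogeneity (a candidate cluster); high entropy indicates a well-mixed neighbourhood.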
Abstract:
The American Academy of Optometry (AAO) held its annual meeting in San Diego in December 2005, and the BCLA and CLAE were well represented there. The BCLA has a reasonable number of non-UK based members and hopefully in the future will attract more. This will certainly be beneficial to the society as a whole and may draw more delegates to the BCLA annual conference. To increase awareness of the BCLA at the AAO, a special evening seminar was arranged at which BCLA president Dr. James Wolffsohn gave his presidential address. Dr. Wolffsohn has given the presidential address in the UK, Ireland, Hong Kong and Japan – making it the most travelled presidential address for the BCLA to date. Aside from the BCLA activity at the AAO there were numerous lectures of interest to all, truly a “something for everyone” meeting. All the sessions were multi-track (often up to 10 things occurring at the same time) and the biggest dilemma was often deciding what to attend and, more importantly, what you would miss! Nearly 200 new AAO Fellows from many countries were inducted at the Gala Dinner, including 3 new fellows from the UK (this year they all just happened to be from Aston University!). It is certainly one of the highlights of the AAO to see fellows from different schools of training around the world fulfilling the same criteria and being duly rewarded for their commitment to the profession. BCLA members will be aware that 2006 sees the introduction of the new fellowship scheme of the BCLA, and by the time you read this the first set of fellowship examinations will have taken place. For more details of the FBCLA scheme see the BCLA web site http://www.bcla.org.uk. Since many of CLAE's editorial panel were at the AAO, an informal meeting and dinner was arranged for them where ideas were exchanged about the future of the journal. It is envisaged that the panel will meet twice a year – the next meeting will be at the BCLA conference.
The biggest excitement by far was the fact that CLAE is now Medline/PubMed indexed. You may ask why this is significant to CLAE. PubMed is the free web-based service from the US National Library of Medicine. It holds over 15 million biomedical citations and abstracts from the Medline database. Medline is the largest component of PubMed and covers over 4800 journals published in more than 70 countries. The impact of this is that CLAE is starting to attract more submissions, as researchers and authors no longer need to worry that their work will be hidden from colleagues in the field; instead it is available to view on the World Wide Web. CLAE is one of a very small number of contact lens journals indexed this way. Amongst the other CL journals listed you will note that the International Contact Lens Clinic has now merged with CLAE and the journal CLAO has been renamed Eye and Contact Lenses – making the list of indexed CL journals even smaller than it appears. The on-line submission and reviewing system introduced in 2005 has also made it easier for authors to submit their work and easier for reviewers to check the content. This ease of use has led to quicker times from submission to publication. Looking back at the articles published in CLAE in 2005 reveals some interesting facts. The majority of the material still tends to be from UK groups in the field of Optometry, although we hope in the future to attract more work from non-UK groups and from non-Optometric areas such as refractive surgery or anterior eye pathology. Interestingly, in 2005 the most downloaded article from CLAE was “Wavefront technology: Past, present and future” by Professor W. Neil Charman, who was also the recipient of the Charles F. Prentice award at the AAO – one of the highest honours that the AAO can bestow. Professor Charman was also the keynote speaker at the BCLA's first Pioneer's Day meeting in 2004.
In 2006, readers of CLAE will notice more changes. Firstly, we are moving to 5 issues per year. It is hoped that in the future, depending on increased submissions, a move to 6 issues may be feasible. Secondly, CLAE will aim to have one article per issue that carries CL CET points. You will see in this issue there is an article from Professor Mark Wilcox (who was a keynote speaker at the BCLA conference in 2005). In future, articles that carry CET points will be reviews from BCLA conference keynote speakers or members of the editorial panel, or material from other invited persons that will be of interest to the readership of CLAE. Finally, in 2006, you will notice a change to the Editorial Panel: some of the distinguished panel felt that it was a good time to step down, and new members have been invited to join the remaining panel. The panel represents some of the most eminent names in the fields of contact lenses and/or anterior eye, with varying backgrounds and interests, from many of the prominent institutions around the world. One of the tasks that the Editorial Panel undertakes is to seek out possible submissions to the journal, either from conferences they attend (posters and papers that they see and hear) or from their own research teams. However, on behalf of CLAE I would like to extend that invitation to seek original articles to all readers – if you hear a talk and think it could make a suitable publication for CLAE, please ask the presenters to submit the work via the on-line submission system. If you found the work interesting then the chances are so will others. CLAE invites submissions that are original research, full length articles, short case reports, full review articles, technical reports and letters to the editor. The on-line submission web page is http://www.ees.elsevier.com/clae/.
Abstract:
OpenMI is a widely used standard allowing exchange of data between integrated models, which has mostly been applied to dynamic, deterministic models. Within the FP7 UncertWeb project we are developing mechanisms and tools to support the management of uncertainty in environmental models. In this paper we explore the integration of the UncertWeb framework with OpenMI, to assess the issues that arise when propagating uncertainty in OpenMI model compositions, and the degree of integration possible with UncertWeb tools. In particular we develop an uncertainty-enabled model for a simple Lotka-Volterra system with an interface conforming to the OpenMI standard, exploring uncertainty in the initial predator and prey levels, and the parameters of the model equations. We use the Elicitator tool developed within UncertWeb to identify the initial condition uncertainties, and show how these can be integrated, using UncertML, with simple Monte Carlo propagation mechanisms. The mediators we develop for OpenMI models are generic and produce standard Web services that expose the OpenMI models to a Web based framework. We discuss what further work is needed to allow a more complete system to be developed and show how this might be used practically.
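A minimal sketch of the Monte Carlo propagation step: Gaussian distributions stand in for the elicited uncertainties on the initial prey and predator levels, and each draw is pushed through a forward-Euler Lotka-Volterra integration. All parameter values below are placeholders, not those used in the paper, and the real system would exchange samples via OpenMI/UncertML-mediated Web services rather than in-process.

```python
import random
import statistics

def lotka_volterra(prey0, pred0, alpha=1.0, beta=0.1, delta=0.075, gamma=1.5,
                   dt=0.01, steps=500):
    """Forward-Euler integration of the Lotka-Volterra equations."""
    x, y = prey0, pred0
    for _ in range(steps):
        dx = (alpha * x - beta * x * y) * dt
        dy = (delta * x * y - gamma * y) * dt
        x, y = x + dx, y + dy
    return x, y

random.seed(42)
n_sims = 200
# Hypothetical elicited uncertainties on the initial prey and predator levels.
finals = [lotka_volterra(random.gauss(10.0, 1.0), random.gauss(5.0, 0.5))
          for _ in range(n_sims)]
prey_mean = statistics.mean(f[0] for f in finals)
prey_sd = statistics.stdev(f[0] for f in finals)
```

The ensemble of final states (summarised here by a mean and standard deviation) is exactly the kind of result that could be encoded in UncertML as a set of realisations or as summary statistics for downstream OpenMI components.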
Abstract:
The paper has been presented at the International Conference Pioneers of Bulgarian Mathematics, dedicated to Nikola Obreshkoff and Lubomir Tschakaloff, Sofia, July 2006.
Abstract:
Florida State University and University of Helsinki. Information technology has the potential to deliver education to everybody through high quality online courses and associated services, and to enhance traditional face-to-face instruction by, e.g., web services offering virtually unlimited practice and step-by-step solutions to practice problems. Despite this, the tools of information technology have not yet penetrated mathematics education in any meaningful way. This is mostly due to the inertia of academia: instructors are slow to change their working habits. This paper reports on an experiment in which all the instructors of a large calculus course (seven instructors and six teaching assistants) were required to base their instruction on online content. The paper analyzes the effectiveness of the various solutions used and finishes with recommendations regarding best practices.
Abstract:
In recent years East-Christian iconographical art works have been digitized, providing a large volume of data. The need for effective classification, indexing and retrieval of iconography repositories motivated the design and development of a systemized ontological structure for describing iconographical art objects. This paper presents the ontology of East-Christian iconographical art, developed to provide content annotation in the Virtual encyclopedia of Bulgarian iconography multimedia digital library. The ontology’s main classes, relations, facts and rules are described, along with problems that appeared during design and development. The paper also presents an application of the ontology for learning analysis in the iconography domain, implemented during the SINUS project “Semantic Technologies for Web Services and Technology Enhanced Learning”.
Abstract:
ACM Computing Classification System (1998): D.0, D.2.11.
Abstract:
Purpose – The paper challenges the focal-firm perspective of much resource/capability research, identifying how a dyadic perspective facilitates identification of the capabilities required for servitization. Design/methodology/approach – Exploratory study consisting of seven dyadic relationships in five sectors. Findings – An additional dimension of capabilities should be recognised: whether they are developed independently or interactively (with another actor). The following examples of interactively developed capabilities are identified: knowledge development, where partners communicate interactively to understand capabilities; service enablement, where manufacturers work with suppliers and customers to support delivery of new services; service development, where partners interact to optimise performance of existing services; and risk management, where customers work with manufacturers to manage the risks of product acquisition/operation. Six propositions articulate these findings. Research implications/limitations – Interactively developed capabilities are created when two or more actors interact to create value. They do not reside within one firm alone and therefore cannot be a source of competitive advantage for one firm alone. Many of the capabilities required for servitization are interactive, yet they have received little research attention. The study does not provide an exhaustive list of interactively developed capabilities, but demonstrates their existence in manufacturer/supplier and manufacturer/customer dyads. Practical implications – Manufacturers need to understand how to develop capabilities interactively to create competitive advantage and value, and to identify other actors with whom these capabilities can be developed. Originality/value – Previous research has focused on relational capabilities within a focal firm. This study extends existing theories to include interactively developed capabilities, and proposes that interactivity is a key dimension of actors’ complementary capabilities.
Abstract:
Extensive data sets on water quality and seagrass distributions in Florida Bay have been assembled under complementary, but independent, monitoring programs. This paper presents the landscape-scale results from these monitoring programs and outlines a method for exploring the relationships between two such data sets. Seagrass species occurrence and abundance data were used to define eight benthic habitat classes from 677 sampling locations in Florida Bay. Water quality data from 28 monitoring stations spread across the Bay were used to construct a discriminant function model that assigned a probability of a given benthic habitat class occurring for a given combination of water quality variables. Mean salinity, salinity variability, the amount of light reaching the benthos, sediment depth, and mean nutrient concentrations were important predictor variables in the discriminant function model. Using a cross-validated classification scheme, this discriminant function identified the most likely benthic habitat type as the actual habitat type in most cases. The model predicted that the distribution of benthic habitat types in Florida Bay would likely change if water quality and water delivery were changed by human engineering of freshwater discharge from the Everglades. Specifically, an increase in the seasonal delivery of freshwater to Florida Bay should cause an expansion of seagrass beds dominated by Ruppia maritima and Halodule wrightii at the expense of the Thalassia testudinum-dominated community that now occurs in northeast Florida Bay. These statistical techniques should prove useful for predicting landscape-scale changes in community composition in diverse systems where communities are in quasi-equilibrium with environmental drivers.
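As a toy illustration of the discriminant-function idea, the sketch below uses a simplified nearest-centroid classifier (a stand-in for the paper's discriminant function model, not its actual method) on made-up water-quality observations; the variable values and class labels are hypothetical.

```python
def centroids(samples, classes):
    """Per-class mean vectors of the training samples."""
    sums, counts = {}, {}
    for vec, cls in zip(samples, classes):
        acc = sums.setdefault(cls, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[cls] = counts.get(cls, 0) + 1
    return {cls: [s / counts[cls] for s in acc] for cls, acc in sums.items()}

def classify(vec, cents):
    """Assign the class whose centroid is nearest in squared Euclidean distance."""
    return min(cents, key=lambda c: sum((v - m) ** 2
                                        for v, m in zip(vec, cents[c])))

# Hypothetical (mean salinity, fraction of light reaching the benthos) pairs.
samples = [(35.0, 0.60), (34.0, 0.55), (20.0, 0.30), (22.0, 0.25)]
classes = ["Thalassia", "Thalassia", "Halodule", "Halodule"]
cents = centroids(samples, classes)
pred = classify((21.0, 0.28), cents)
```

In the paper's setting the same shape of model, trained on the 28 monitoring stations, yields a probability for each benthic habitat class given a combination of water quality variables; cross-validation then checks how often the most likely class matches the observed habitat.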
Abstract:
In this study we analyse the capabilities of the cloud, focusing on computation (understood as processing capacity) and geolocated data storage (content delivery networks). These capabilities, combined with a business model of zero provisioning cost and pay-per-use, plus cost reductions through synergies and data-centre optimisation, make an interesting scenario for SMEs dedicated to audiovisual production. Using cloud services, a small company can, in a matter of minutes, reach the processing and distribution power of the giants of audiovisual production, without occupying its own premises or paying equipment rental up front. We describe the AWS (Amazon Web Services) services that would be useful to such a company, how it would use them and at what cost, and compare this with the budget for building the same installation physically in-house, where the equipment must be bought and installed.
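The cloud-versus-on-premises comparison described above reduces to simple arithmetic once rates are fixed. In the sketch below every figure is hypothetical; real AWS prices vary by region, service and time, so this only illustrates the shape of the calculation, not the study's actual budget.

```python
# Back-of-the-envelope comparison of pay-per-use cloud processing versus an
# up-front on-premises purchase. All figures are hypothetical placeholders.
def cloud_cost(hours, hourly_rate, storage_gb, storage_rate_month, months):
    """Total pay-per-use cost: compute hours plus monthly storage."""
    return hours * hourly_rate + storage_gb * storage_rate_month * months

def onprem_cost(hardware_price, power_per_month, months):
    """Total in-house cost: up-front hardware plus running costs."""
    return hardware_price + power_per_month * months

cloud = cloud_cost(hours=300, hourly_rate=1.2, storage_gb=500,
                   storage_rate_month=0.023, months=12)
onprem = onprem_cost(hardware_price=15000, power_per_month=120, months=12)
```

With bursty workloads, the pay-per-use total scales with actual usage, while the on-premises total is dominated by the up-front purchase regardless of utilisation; that asymmetry is the core of the argument for SMEs.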
Abstract:
PURPOSE: Radiation therapy treats cancer using carefully designed plans that maximize the radiation dose delivered to the target and minimize damage to healthy tissue, with the dose administered over multiple sessions. Creating treatment plans is a laborious process and presents an obstacle to more frequent replanning, which remains an unsolved problem. Between new plans being created, however, the patient's anatomy can change due to factors including reduction in tumor size and loss of weight, which results in poorer patient outcomes. Cloud computing is a newer technology that is slowly being adopted for medical applications, with promising results. The objective of this work was to design and build a system that analyzes a database of previously created treatment plans, stored with their associated anatomical information in studies, to find the one with the anatomy most similar to a new patient's. The analyses are performed in parallel on the cloud to decrease the time needed to find this plan. METHODS: The system used SlicerRT, a radiation therapy toolkit for the open-source platform 3D Slicer, for its tools to perform the similarity analysis algorithm. Amazon Web Services was used for the cloud instances on which the analyses were performed, as well as for storage of the radiation therapy studies and messaging between the instances and a master local computer. A module was built in SlicerRT to provide the user with an interface to direct the system on the cloud and to perform other related tasks. RESULTS: The cloud-based system outperformed previous methods of conducting the similarity analyses in terms of time, analyzing 100 studies in approximately 13 minutes while producing the same similarity values as those methods. It also scaled to larger numbers of studies in the database with a small increase in computation time of just over 2 minutes.
CONCLUSION: This system successfully analyzes a large database of radiation therapy studies and finds the one that is most similar to a new patient, which represents a potential step forward in achieving feasible adaptive radiation therapy replanning.
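The parallel scatter-and-reduce pattern behind this system can be sketched locally with Python's concurrent.futures standing in for the cloud instances; the similarity function and feature vectors below are hypothetical illustrations, not the SlicerRT algorithm.

```python
from concurrent.futures import ThreadPoolExecutor

def similarity(study, new_patient):
    """Hypothetical anatomy-similarity score: inverse of the summed absolute
    differences between a few numeric features (higher = more similar)."""
    diff = sum(abs(a - b) for a, b in zip(study["features"], new_patient))
    return 1.0 / (1.0 + diff)

def most_similar(studies, new_patient, workers=4):
    """Score every archived study in parallel and return the best match."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        scores = list(pool.map(lambda s: similarity(s, new_patient), studies))
    best = max(range(len(studies)), key=scores.__getitem__)
    return studies[best]["id"], scores[best]

# Hypothetical database: feature vectors standing in for anatomical contours.
studies = [{"id": "study-%02d" % i, "features": [i * 1.0, 100.0 - i]}
           for i in range(100)]
best_id, best_score = most_similar(studies, new_patient=[40.0, 60.0])
```

In the real system, each worker would be a cloud instance pulling a study from shared storage and reporting its score back to the master computer, which performs the final argmax exactly as above.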