923 results for Negotiation Support Environment


Relevance: 80.00%

Abstract:

"This column is distinguished from previous Impact columns in that it concerns the development tightrope between research and commercial take-up and the role of the LGPL in an open source workflow toolkit produced in a University environment. Many ubiquitous systems have followed this route, (Apache, BSD Unix, ...), and the lessons this Service Oriented Architecture produces cast yet more light on how software diffuses out to impact us all." Michiel van Genuchten and Les Hatton Workflow management systems support the design, execution and analysis of business processes. A workflow management system needs to guarantee that work is conducted at the right time, by the right person or software application, through the execution of a workflow process model. Traditionally, there has been a lack of broad support for a workflow modeling standard. Standardization efforts proposed by the Workflow Management Coalition in the late nineties suffered from limited support for routing constructs. In fact, as later demonstrated by the Workflow Patterns Initiative (www.workflowpatterns.com), a much wider range of constructs is required when modeling realistic workflows in practice. YAWL (Yet Another Workflow Language) is a workflow language that was developed to show that comprehensive support for the workflow patterns is achievable. Soon after its inception in 2002, a prototype system was built to demonstrate that it was possible to have a system support such a complex language. From that initial prototype, YAWL has grown into a fully-fledged, open source workflow management system and support environment

Relevance: 80.00%

Abstract:

The Business Process Management domain has evolved at a dramatic pace over the past two decades, and the notion of the business process has become a ubiquitous part of the modern business enterprise. Most organizations now view their operations in terms of business processes and manage these business processes in the same way as other corporate assets. In recent years, an increasingly broad range of generic technology has become available for automating business processes. This is part of a growing trend in the software engineering field over the past 40 years, in which aspects of functionality that are potentially reusable on a widespread basis have coalesced into generic software components. Figure 2.1 illustrates this trend and shows how software systems have evolved from the monolithic applications of the 1960s, often developed in their entirety by a single development team, to today's offerings, which are based on the integration of a range of generic technologies, with only a small component of the application actually being developed from scratch. In the 1990s, generic functionality for the automation of business processes first became commercially available in the form of workflow technology and subsequently evolved into the broader field of business process management systems (BPMS). This technology removed the need to develop process support within applications from scratch and provided a variety of off-the-shelf options on which these requirements could be based. The demand for this technology was significant, and it is estimated that by 2000 there were well over 200 distinct workflow offerings in the market, each with a distinct conceptual foundation. Anticipating the difficulties that would be experienced by organizations seeking to utilize and integrate distinct workflow offerings, the Workflow Management Coalition (WfMC), an industry group formed to advance technology in this area, proposed a standard reference model for workflow technology with an express desire to establish a common platform for achieving workflow interoperation.

Relevance: 80.00%

Abstract:

Tracking and serving high-technology intelligence needs is one of the important tasks of scientific and technical information work. Drawing on many years of practice in carrying out high-technology information research projects for the State Science and Technology Commission, the Chinese Academy of Sciences, and the national 863 Program, this paper describes several approaches to creating a sound document and information support environment for China's high-technology research and to proactively serving that research.

Relevance: 80.00%

Abstract:

This research is based on consumer complaints with respect to recently purchased consumer electronics. This research document investigates development and device management as tools used to aid consumers and manage their mobile products in order to resolve issues before, or as soon as, the consumer is aware one exists. The problem at the present time is that mobile devices are becoming very advanced pieces of technology, and not all manufacturers and network providers have kept up with the support needs of end users. As such, the subject of the research is to investigate how device management could be used as a method to promote research and development of mobile devices and to provide a better experience for the consumer. The wireless world is becoming increasingly complex as revenue opportunities are driven by new and innovative data services. We can no longer expect the customer to have the knowledge or ability to configure their own device. Device Management (DM) platforms can address the challenges of device configuration and support through new enabling technologies. Leveraging these technologies will allow a network operator to reduce the cost of subscriber ownership, drive increased ARPU (Average Revenue per User) by removing barriers to adoption, reduce churn by improving the customer experience, and increase customer loyalty. DM technologies provide a flexible and powerful management method, but they manage the same device features that have historically been configured manually through call centres or by the end user making changes directly on the device. For this reason DM technologies must be treated as part of a wider support solution. The traditional requirements for discovery, fault finding, troubleshooting and diagnosis are still as relevant with DM as they are in the current human support environment, yet the current generation of solutions does little to address this problem. In deploying an effective device management solution, the network operator must consider the integration of the DM platform, interfacing with many areas of the business, supported by knowledge of the relationships between devices, applications, solutions and services that is maintained on an ongoing basis. Complementing the DM solution with published device information, setup guides, training material and web-based tools will ensure the quality of the customer experience, ensuring that problems are completely resolved and driving data usage by focusing customer education on the use of the wireless service. In this way, device management becomes a tool used both internally within the network operator or device vendor and by customers themselves, with each user empowered to effectively manage the device without any prior knowledge or experience, confident that the changes they apply will be relevant, accurate, stable and compatible. The value offered by an effective DM solution with an expert knowledge service will become a significant differentiator for the network operator in an ever more competitive wireless market. This research document is intended to highlight some of the issues the industry faces as device management technologies become more prevalent, and it offers some potential solutions to simplify the increasingly complex task of managing devices on the network, where device management can be used as a tool to aid customer relations and manage customers' mobile products in order to resolve issues before the user is aware one exists.
The research is broken down into the following areas: customer relationship management (CRM), device management, the role of knowledge within DM, companies that have successfully implemented device management, and the future of device management and CRM. It also includes questionnaires aimed at technical support agents and mobile device users, as well as interviews carried out with CRM managers within support centres to corroborate the evidence gathered. To conclude, the document considers the advantages and disadvantages of device management, attempts to determine the influence it will have on customer support centres, and discusses what methods could be used to implement it.
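
As a rough illustration of the device management concept discussed above (not any specific DM standard or the platform studied in this research), the sketch below imagines an operator-side reference profile that is compared against a handset's settings and pushed over the air when a mismatch is found. All setting names, values and method names are invented for illustration.

# Hypothetical over-the-air device management sketch; not a real DM API.
REFERENCE_PROFILE = {"apn": "internet.example", "mms_proxy": "10.0.0.1:8080"}

class ManagedDevice:
    def __init__(self, device_id, settings):
        self.device_id = device_id
        self.settings = dict(settings)

    def diagnose(self):
        """Return the settings that differ from the operator's reference profile."""
        return {key: value for key, value in REFERENCE_PROFILE.items()
                if self.settings.get(key) != value}

    def repair(self):
        """Push the reference values for every faulty setting (the 'OTA fix')."""
        faults = self.diagnose()
        self.settings.update(faults)
        return faults

device = ManagedDevice("device-001", {"apn": "wap.old", "mms_proxy": None})
print("detected faults:", device.diagnose())
print("settings pushed:", device.repair())
print("faults after repair:", device.diagnose())

The point of the sketch is the abstract's central claim: the fault is diagnosed and corrected from the operator side, so the subscriber never has to configure anything manually.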

Relevance: 80.00%

Abstract:

This article describes research into a new theory of decision support for negotiation in family law mediation. AssetDivider is based on the principles of Family_Winner. As a negotiation decision support system, Family_Winner takes ratings assigned to items by the parties involved and develops a list of allocations to each party, based on trade-offs inherently present in the dispute. Following advice provided by our industry partner, Relationships Australia (Queensland) (RAQ), AssetDivider uses an ideal "percentage split" to guide the development of an allocation list for the parties. The system has been tested informally by our contacts at RAQ, and we look forward to extensive testing and evaluation by mediators at RAQ in the near future. We hope to report on a comprehensive evaluation of the effectiveness of this program in practice.
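
The published papers, not this listing, define the actual Family_Winner/AssetDivider trade-off algorithm; the Python sketch below is only a simplified illustration of the idea described above: each party rates the disputed items, and items are allocated to the party that values them more while keeping the overall monetary division close to an ideal percentage split. All items, ratings and values are invented.

def allocate(items, ratings_a, ratings_b, values, target_a=0.5):
    """Greedy illustrative allocation: give each contested item to the party
    that rates it higher, unless that would push party A past its target
    share of the total monetary value."""
    total = sum(values.values())
    allocation = {"A": [], "B": []}
    value_to_a = 0.0
    # Handle the most contested items (highest combined rating) first.
    for item in sorted(items, key=lambda i: -(ratings_a[i] + ratings_b[i])):
        prefers_a = ratings_a[item] >= ratings_b[item]
        stays_under_target = (value_to_a + values[item]) / total <= target_a + 0.05
        if prefers_a and stays_under_target:
            allocation["A"].append(item)
            value_to_a += values[item]
        else:
            allocation["B"].append(item)
    return allocation, value_to_a / total

items = ["house", "car", "savings", "boat"]
ratings_a = {"house": 9, "car": 3, "savings": 6, "boat": 2}
ratings_b = {"house": 7, "car": 8, "savings": 5, "boat": 4}
values = {"house": 300_000, "car": 30_000, "savings": 120_000, "boat": 20_000}
print(allocate(items, ratings_a, ratings_b, values, target_a=0.6))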

Relevance: 80.00%

Abstract:

Objective: To provide statistician end users with a visual language environment for complex statistical survey design and implementation. Methods: We have developed, in conjunction with professional statisticians, the Statistical Design Language (SDL), an integrated suite of visual languages aimed at supporting the process of designing statistical surveys, and its support environment, SDLTool. SDL comprises five diagrammatic notations: survey diagrams, data diagrams, technique diagrams, task diagrams and process diagrams. SDLTool provides an integrated environment supporting the design, coordination, execution, sharing and publication of complex statistical survey techniques as web services. SDLTool allows the association of model components with survey artefacts, including data sets, metadata, and statistical package analysis scripts, with the ability to execute elements of the survey design model to implement survey analysis. Results: We describe three evaluations of SDL and SDLTool: use of the notation by expert statisticians to design and execute surveys; a usability evaluation of the environment; and an assessment of several generated statistical analysis web services. Conclusion: We have shown the effectiveness of SDLTool for supporting statistical survey design and implementation. Practice implications: We have developed a more effective approach to supporting statisticians in their survey design work.

Relevance: 80.00%

Abstract:

Critics have emerged in recent times as a specific tool feature to support users in computer-mediated tasks. These computer-supported critics provide proactive guidelines or suggestions for improvement to designs, code, and other digital artifacts. The concept of a critic has been adopted in various domains, including medicine, programming, software engineering, design sketching, and others. Critics have been shown to be an effective mechanism for providing feedback to users. We propose a new critic taxonomy based on an extensive review of the critic literature. The groups and elements of our critic taxonomy are presented and explained collectively with examples, including the mapping of 13 existing critic tools, predominantly for software engineering and programming education tasks, to the taxonomy. We believe this critic taxonomy will assist others in identifying, categorizing, developing, and deploying computer-supported critics in a range of domains.
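
To make the notion of a computer-supported critic concrete, here is a small, self-contained Python sketch (our own example, not one of the 13 tools mapped to the taxonomy) of a programming-education-style critic that proactively suggests improvements to a piece of code instead of blocking the user.

import ast

def critique(source):
    """Return proactive suggestions for a piece of Python source code."""
    suggestions = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            if ast.get_docstring(node) is None:
                suggestions.append(f"'{node.name}': consider adding a docstring")
            if len(node.args.args) > 5:
                suggestions.append(f"'{node.name}': many parameters; consider grouping them")
    return suggestions

example = "def total(a, b, c, d, e, f):\n    return a + b + c + d + e + f\n"
for suggestion in critique(example):
    print("critic:", suggestion)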

Relevance: 80.00%

Abstract:

This thesis aims to contribute to the construction of software development environments by proposing a reflective architecture for process support environments, named WRAPPER (Web-based Reflective Architecture for Process suPport EnviRonment). The goal of this architecture is to provide an infrastructure for a software process support environment, integrating World Wide Web technologies, distributed objects and computational reflection. The main motivation for this architecture comes from the need for greater flexibility in software process management. This flexibility is obtained through the use of reflective objects that allow a process manager to obtain information and also to change the software process dynamically. To obtain an integrated environment, the architecture provides facilities for aggregating CASE tools from diverse platforms and vendors, even when they are made available at remote locations. The integration of heterogeneous and distributed tools is achieved through the use of Web technologies and distributed objects. Computational reflection is used in the environment both to extract data from process execution and to allow the process to be adapted. This is done through the introduction and control of meta-objects, at the meta-level of the architecture, which can monitor and even alter base-level objects. As a result, the architecture provides the following features: flexibility in process management, allowing the process to be controlled and adapted; distribution of the environment over the Web, allowing the distribution of software process tasks and the integration of tools at remote locations; and heterogeneity in aggregating components into the environment, allowing the use of tools from diverse platforms and vendors. In this context, this work presents the structure of the reflective architecture, as well as the mechanisms used (and their interactions) for modeling and executing processes within the software process support environment.
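
The following short Python sketch illustrates the computational reflection mechanism the abstract describes, i.e., meta-objects that monitor (and could adapt) base-level process objects. It is an illustration of the concept only, not WRAPPER's actual implementation, and the class and task names are invented.

class MetaObject:
    """Meta-level wrapper that observes every operation on a base-level object."""
    def __init__(self, base):
        object.__setattr__(self, "_base", base)
        object.__setattr__(self, "trace", [])

    def __getattr__(self, name):
        attribute = getattr(self._base, name)
        if callable(attribute):
            def intercepted(*args, **kwargs):
                self.trace.append(name)          # monitoring happens at the meta level
                return attribute(*args, **kwargs)
            return intercepted
        return attribute

class ProcessTask:                               # base-level process object
    def __init__(self, name):
        self.name = name
        self.assignee = None

    def assign(self, person):
        self.assignee = person

    def complete(self):
        return f"{self.name} done by {self.assignee}"

task = MetaObject(ProcessTask("code review"))
task.assign("alice")
print(task.complete())
print("meta-level trace:", task.trace)

In the same spirit, a process manager could inspect the meta-level trace at run time and redirect or reassign base-level tasks without changing the tools that operate on them.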

Relevance: 80.00%

Abstract:

Postgraduate Program in Computer Science - IBILCE

Relevance: 80.00%

Abstract:

This report presents an innovative satellite-based monitoring approach, applied to the Iraqi Marshlands, to survey the extent and distribution of marshland re-flooding and to assess the development of wetland vegetation cover. The study, conducted in collaboration with MEEO Srl, makes use of images collected by the (A)ATSR sensor onboard the ESA ENVISAT satellite to gather data at multi-temporal scales, and an analysis was adopted to observe the evolution of marshland re-flooding. The methodology uses a multi-temporal, pixel-based approach built on classification maps produced by the classification tool SOIL MAPPER®. The catalogue of classification maps is available as a web service through the Service Support Environment Portal (SSE, supported by ESA). The inundation of the Iraqi marshlands, which has been continuous since April 2003, is characterized by a high degree of variability, ad-hoc interventions and uncertainty. Given the security constraints and vastness of the Iraqi marshlands, as well as cost-effectiveness considerations, satellite remote sensing was the only viable tool to observe the changes taking place on a continuous basis. The proposed system (ALCS, the AATSR Land Classification System) avoids the direct use of the (A)ATSR images and foresees the application of LULCC evolution models directly to the "stock" of classified maps. This approach is made possible by the availability of a 13-year classified image database, conceived and implemented in the CARD project (http://earth.esa.int/rtd/Projects/#CARD). The approach presented here evolves toward an innovative, efficient and fast method of exploiting the potential of multi-temporal LULCC analysis of (A)ATSR images. The two main objectives of this work are both forms of assessment: the first is to assess the ability to model with the ALCS web application using AATSR images classified with SOIL MAPPER®, and the second is to evaluate the magnitude, character and extent of wetland rehabilitation.
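
A toy example of the multi-temporal, pixel-based idea described above: instead of processing raw (A)ATSR imagery, the analysis operates on a stack of already-classified maps and flags pixels whose class changes in a way consistent with re-flooding. The class codes, array sizes and the reflooded_fraction helper are assumptions made for illustration, not part of ALCS or SOIL MAPPER®.

import numpy as np

BARE, WATER, VEGETATION = 0, 1, 2        # hypothetical class codes

def reflooded_fraction(stack):
    """stack: (dates, rows, cols) array of class labels from the classified maps."""
    first, last = stack[0], stack[-1]
    reflooded = (first == BARE) & np.isin(last, [WATER, VEGETATION])
    return reflooded.mean(), reflooded

# Two toy 3x3 classification maps for two acquisition dates.
t0 = np.array([[BARE, BARE, BARE],
               [BARE, WATER, BARE],
               [BARE, BARE, BARE]])
t1 = np.array([[WATER, BARE, VEGETATION],
               [BARE, WATER, BARE],
               [WATER, BARE, BARE]])
fraction, mask = reflooded_fraction(np.stack([t0, t1]))
print(f"re-flooded pixels: {fraction:.0%}")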

Relevance: 80.00%

Abstract:

Many industries and academic institutions share the vision that an appropriate use of information originating from the environment may add value to services in multiple domains and may help humans deal with the growing information overload that often seems to jeopardize our lives. It is also clear that information sharing and mutual understanding between software agents may impact complex processes in which many actors (humans and machines) are involved, leading to relevant socio-economic benefits. Starting from these two inputs, architectural and technological solutions to enable "environment-related cooperative digital services" are explored here. The proposed analysis starts from the consideration that our environment is a physical space in which diversity is a major value. On the other hand, diversity is detrimental to common technological solutions and is an obstacle to mutual understanding. An appropriate environment abstraction and a shared information model are needed to provide the required levels of interoperability in our heterogeneous habitat. This thesis reviews several approaches to supporting environment-related applications and intends to demonstrate that smart-space-based, ontology-driven, information-sharing platforms may become a flexible and powerful solution to support interoperable services in virtually any domain and even in cross-domain scenarios. It also shows that semantic technologies can be fruitfully applied beyond the representation of application domain knowledge. For example, semantic modeling of human-computer interaction may support interaction interoperability and the transformation of interaction primitives into actions, and the thesis shows how smart-space-based platforms driven by an interaction ontology may enable natural and flexible ways of accessing resources and services, e.g., with gestures. An ontology for computational flow execution has also been built to represent abstract computation, with the goal of exploring new ways of scheduling computation flows with smart-space-based semantic platforms.
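
As a minimal illustration of the smart-space information-sharing idea (not the platform developed in the thesis), the Python sketch below lets heterogeneous agents cooperate by publishing and subscribing to subject-predicate-object triples in a shared space. The agents, predicates and gesture names are invented.

class SmartSpace:
    """Shared triple store: agents insert and subscribe to (s, p, o) facts."""
    def __init__(self):
        self.triples = set()
        self.subscriptions = []                  # (pattern, callback) pairs

    def insert(self, triple):
        self.triples.add(triple)
        for pattern, callback in self.subscriptions:
            if self._matches(pattern, triple):
                callback(triple)

    def subscribe(self, pattern, callback):
        self.subscriptions.append((pattern, callback))

    @staticmethod
    def _matches(pattern, triple):
        # None acts as a wildcard in any position of the pattern.
        return all(p is None or p == t for p, t in zip(pattern, triple))

space = SmartSpace()
# The lighting service reacts to any fact stating that a "turn_on" gesture occurred.
space.subscribe((None, "performs", "gesture:turn_on"),
                lambda triple: print("lighting service: switching lights on"))
# The gesture-recognition agent publishes what it observed.
space.insert(("user:alice", "performs", "gesture:turn_on"))

Because both agents agree only on the shared vocabulary (the triples), neither needs to know the other's platform or API, which is the interoperability argument made in the abstract.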

Relevance: 80.00%

Abstract:

This research is concerned with the experimental software engineering area, specifically experiment replication. Replication has traditionally been viewed as a complex task in software engineering. This is possibly due to the present immaturity of the experimental paradigm as applied to software development. Researchers usually use replication packages to replicate an experiment. However, replication packages are not the solution to all the information management problems that crop up when successive replications of an experiment accumulate. This research borrows ideas from the software configuration management and software product line paradigms to support the replication process. We believe that configuration management can help to manage and administer information from one replication to another: hypotheses, designs, data analysis, etc. The software product line paradigm can help to organize and manage any changes introduced into the experiment by each replication. We expect the union of the two paradigms in replication to improve the planning, design and execution of further replications and their alignment with existing replications. Additionally, this research will contribute a web support environment for archiving information related to different experiment replications. It will also provide information management support flexible enough for running replications with different numbers and types of changes, and it will afford massive storage of data from different replications. Experimenters working collaboratively on the same experiment must all have access to the different experiments.

Relevance: 80.00%

Abstract:

Background: This research addresses first and foremost the replication, and secondarily the synthesis, of software engineering (SE) experiments. Replication is impossible without access to all the details of the original experiment. But the description of experiments is usually incomplete because knowledge is tacit, there is no standard reporting format, and there are hardly any tools to support the generation of experimental reports. This means that the original experiment cannot be reproduced exactly. These issues place considerable constraints on experimenters' options for carrying out replications and, ultimately, synthesizing experiments. Aim: The aim of the research is to formalize the SE experimental process in order to facilitate information communication among experimenters. Context: This PhD research was developed within the Empirical Software Engineering Research Group (GrISE) at the Universidad Politécnica de Madrid (UPM)'s School of Computer Engineering (ETSIINF) as part of project TIN2011-23216, entitled "Technologies for Software Engineering Experiment Replication and Synthesis", funded by the Spanish Government. The GrISE research group fulfils all the requirements (an established family of experiments, with at least three experimental lines and lengthy replication experience: 16 replications prior to 2011 in the software testing techniques line) and provides favourable conditions for the research to be conducted in the best possible way, such as full access to its information. Research Method: We opted for action research (AR) as the research method best suited to the characteristics of the investigation. Results were generated through successive rounds of AR addressing specific communication problems among experimenters. Results: The conceptual model of the experimental cycle was formalized from the viewpoint of the three key roles representing experimenters in the experimental process: research manager, experiment manager and senior experimenter. The model of the experimental cycle was formalized by means of a workflow and a process diagram. In tandem with the formalization of the SE experimental process, ISRE (Infrastructure for Sharing and Replicating Experiments), a proof of concept of an SE experimentation support environment, was developed. Finally, guidelines for developing SE experimentation support environments were designed based on the study of the key features that the models of experimentation support tools for different experimental disciplines have in common. Conclusions: The key contribution of this research is the formalization of the SE experimental process. GrISE experimenters were satisfied with both the models representing the formalization of the experimental cycle and the ISRE tool built in order to evaluate the models. To further validate the formalization, this study should be replicated at other research groups representative of the experimental SE community. Future Research Lines: The achievement of the aims and the resulting findings have led to new research lines, which are as follows: (1) assess the feasibility of building a mechanism to help experimenters collaboratively specify their tacit knowledge based on debate and consensus, (2) continue empirical research at the same research group in order to cover the remainder of the experimental cycle (for example, new experiments, results synthesis, etc.), (3) replicate the research process at other empirical SE research groups, and (4) update the technology of the proof of concept so that it meets the constraints and needs of a real research environment.

Relevance: 80.00%

Abstract:

The work described was carried out as part of a collaborative Alvey software engineering project (project number SE057). The project collaborators were the Inter-Disciplinary Higher Degrees Scheme of the University of Aston in Birmingham, BIS Applied Systems Ltd. (BIS) and the British Steel Corporation. The aim of the project was to investigate the potential application of knowledge-based systems (KBSs) to the design of commercial data processing (DP) systems. The work was primarily concerned with BIS's Structured Systems Design (SSD) methodology for DP systems development and how users of this methodology could be supported using KBS tools. The problems encountered by users of SSD are discussed and potential forms of computer-based support for inexpert designers are identified. An architecture for a support environment for SSD, the Intellipse system, is proposed, based on the integration of KBS and non-KBS tools for individual design tasks within SSD. The Intellipse system has two modes of operation: Advisor and Designer. The design, implementation and user evaluation of Advisor are discussed. The results of a Designer feasibility study, the aim of which was to analyse major design tasks in SSD to assess their suitability for KBS support, are reported. The potential role of KBS tools in the domain of database design is discussed. The project involved extensive knowledge engineering sessions with expert DP systems designers, and some practical lessons in relation to KBS development are derived from this experience. The nature of the expertise possessed by expert designers is discussed. The need for operational KBSs to be built to the same standards as other commercial and industrial software is identified. A comparison between current KBS and conventional DP systems development is made. On the basis of this analysis, a structured development method for KBSs is proposed: the POLITE model. Some initial results of applying this method to KBS development are discussed. Several areas for further research and development are identified.