60 results for computer science, information systems

at University of Queensland eSpace - Australia



Abstract:

Purpose: The aim of this project was to design and evaluate a system that would produce tailored information for stroke patients and their carers, customised according to their informational needs, and facilitate communication between the patient and health professional. Method: A human factors development approach was used to develop a computer system, which dynamically compiles stroke education booklets for patients and carers. Patients and carers are able to select the topics about which they wish to receive information, the amount of information they want, and the font size of the printed booklet. The system is designed so that the health professional interacts with it, thereby providing opportunities for communication between the health professional and patient/carer at a number of points in time. Results: Preliminary evaluation of the system by health professionals, patients and carers was positive. A randomised controlled trial that examines the effect of the system on patient and carer outcomes is underway. (C) 2004 Elsevier Ireland Ltd. All rights reserved.


Abstract:

For many years in the area of business systems analysis and design, practitioners and researchers alike have been searching for some comprehensive basis on which to evaluate, compare, and engineer techniques that are promoted for use in the modelling of systems' requirements. To date, while many frameworks, factors, and facets have been forthcoming, none appear to be based on a sound theory. In light of this dilemma, over the last 10 years researchers have devoted attention to the use of ontology to provide a theoretical basis for the advancement of the business systems modelling discipline. This paper outlines how we have used a particular ontology for this purpose over the last five years. In particular, we have learned that the understandability and applicability of the selected ontology must be clear to IS professionals, that the results of any ontological evaluation must be tempered by the economic efficiency considerations of the stakeholders involved, and that ontologies may have to be focused for the business purpose and the type of user involved in the modelling situation.


Abstract:

Enterprise systems interoperability (ESI) is currently an important topic for business. This is evidenced, at least in part, by the number and extent of potential candidate protocols for such process interoperation, viz., ebXML, BPML, BPEL, and WSCI. Wide-ranging support for each of these candidate standards already exists. However, despite broad acceptance, a sound theoretical evaluation of these approaches has not yet been provided. We use the Bunge-Wand-Weber (BWW) models, in particular the representation model, to provide the basis for such a theoretical evaluation. We, and other researchers, have shown the usefulness of the representation model for analyzing, evaluating, and engineering techniques in the areas of traditional and structured systems analysis, object-oriented modeling, and process modeling. In this work, we address the question: what are the potential semantic weaknesses of using ebXML alone for process interoperation between enterprise systems? We find that, because of representational deficiencies, users will lack important implementation information; that ontological redundancy unnecessarily increases the complexity of the specification; and that, due to instances of ontological excess, users will have to bring extra-model knowledge to bear to understand certain constructs in the specification.
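
The style of ontological analysis the abstract describes can be made concrete. The sketch below uses an entirely hypothetical construct mapping (the names are illustrative, not drawn from the paper) to show how deficit, redundancy, and excess fall out of a mapping between grammar constructs and BWW constructs:

```python
# Hypothetical mapping from grammar constructs (e.g., ebXML elements) to the
# BWW ontological constructs they represent; None marks a construct with no
# ontological interpretation. All names here are illustrative.
from collections import Counter

mapping = {
    "BusinessTransaction": "event",
    "DocumentEnvelope":    "thing",
    "Attachment":          "thing",   # a second construct mapped to "thing"
    "Signal":              None,      # no BWW counterpart
}
bww_constructs = {"thing", "property", "state", "event", "transformation"}

represented = {c for c in mapping.values() if c is not None}

# Deficit: ontological constructs no grammar construct can express, so users
# lack a way to model those aspects of the domain (implementation information).
deficit = bww_constructs - represented

# Redundancy: one ontological construct represented by several grammar
# constructs, unnecessarily increasing specification complexity.
counts = Counter(c for c in mapping.values() if c is not None)
redundancy = {c for c, n in counts.items() if n > 1}

# Excess: grammar constructs with no ontological meaning; users must bring
# extra-model knowledge to understand them.
excess = {g for g, c in mapping.items() if c is None}

print("deficit:", deficit)        # {'property', 'state', 'transformation'}
print("redundancy:", redundancy)  # {'thing'}
print("excess:", excess)          # {'Signal'}
```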


Abstract:

One of the obstacles to improved security of the Internet is the ad hoc development of technologies with different design goals and different security goals. This paper proposes reconceptualizing the Internet as a secure distributed system, focusing specifically on the application layer. The notion is to redesign specific functionality based on principles discovered in research on distributed systems in the decades since the initial development of the Internet. Because of the problems in retrofitting new technology across millions of clients and servers, any option with prospects of success must support backward compatibility. This paper outlines a possible new architecture for Internet-based mail that would replace existing protocols with a more secure framework. To maintain backward compatibility, an initial implementation could offer a web browser-based front end, but the longer-term approach would be to implement the system using appropriate models of replication. (C) 2005 Elsevier Ltd. All rights reserved.


Abstract:

Information security devices must preserve security properties even in the presence of faults. This in turn requires a rigorous evaluation of the system behaviours resulting from component failures, especially how such failures affect information flow. We introduce a compositional method of static analysis for fail-secure behaviour. Our method uses reachability matrices to identify potentially undesirable information flows based on the fault modes of the system's components.
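
As a rough illustration of the reachability-matrix idea (our own simplification, not the paper's formalism), the sketch below models plaintext flow between components as a Boolean adjacency matrix, substitutes a component's fault-mode connectivity, and checks whether a sensitive source can reach an untrusted sink:

```python
# Simplified illustration: edges carry *plaintext* flow only. The cipher's
# normal output is ciphertext, so it contributes no plaintext edge; in the
# fault mode, the cipher degrades to a pass-through. Component names are ours.

def transitive_closure(adj):
    """Boolean reachability matrix via Floyd-Warshall."""
    n = len(adj)
    reach = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach

RED, CIPHER, BLACK = 0, 1, 2   # sensitive input, encryptor, untrusted output

normal = [
    [0, 1, 0],   # RED plaintext flows into the cipher
    [0, 0, 0],   # cipher output is encrypted: no plaintext edge to BLACK
    [0, 0, 0],
]
bypass = [row[:] for row in normal]
bypass[CIPHER][BLACK] = 1   # fault mode: cipher passes plaintext through

for label, adj in (("normal", normal), ("cipher bypass", bypass)):
    reach = transitive_closure(adj)
    print(f"{label}: plaintext RED -> BLACK = {bool(reach[RED][BLACK])}")
# normal: False; cipher bypass: True -> a fail-insecure behaviour to flag
```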


Abstract:

Electronic communications devices intended for government or military applications must be rigorously evaluated to ensure that they maintain data confidentiality. High-grade information security evaluations require a detailed analysis of the device's design, to determine how it achieves necessary security functions. In practice, such evaluations are labour-intensive and costly, so there is a strong incentive to find ways to make the process more efficient. In this paper we show how well-known concepts from graph theory can be applied to a device's design to optimise information security evaluations. In particular, we use end-to-end graph traversals to eliminate components that do not need to be evaluated at all, and minimal cutsets to identify the smallest group of components that needs to be evaluated in depth.
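
Both optimisations map directly onto standard graph algorithms. The sketch below, using an assumed component graph and the networkx library (the component names are illustrative, not from the paper), discards components that lie on no input-to-output path and computes a minimum node cut as the set to evaluate in depth:

```python
# Assumed component graph for illustration; names are not from the paper.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("in", "filter"), ("filter", "crypto"), ("crypto", "out"),
    ("in", "bypass_ctrl"), ("bypass_ctrl", "crypto"),
    ("status_led", "ui"),          # sits on no input-to-output path
])

# End-to-end traversal: components on no path from "in" to "out" need not be
# evaluated at all.
on_path = ({"in"} | nx.descendants(g, "in")) & ({"out"} | nx.ancestors(g, "out"))
print("skip entirely:", set(g) - on_path)        # {'status_led', 'ui'}

# Minimal cutset: the smallest group of components whose correct behaviour
# blocks every uncontrolled flow -- the ones to evaluate in depth.
print("evaluate in depth:", nx.minimum_node_cut(g, "in", "out"))  # {'crypto'}
```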


Abstract:

The problem of distributed compression for correlated quantum sources is considered. The classical version of this problem was solved by Slepian and Wolf, who showed that distributed compression could take full advantage of redundancy in the local sources created by the presence of correlations. Here it is shown that, in general, this is not the case for quantum sources, by proving a lower bound on the rate sum for irreducible sources of product states which is stronger than the one given by a naive application of Slepian-Wolf. Nonetheless, strategies taking advantage of correlation do exist for some special classes of quantum sources. For example, Devetak and Winter demonstrated the existence of such a strategy when one of the sources is classical. Optimal nontrivial strategies for a different extreme, sources of Bell states, are presented here. In addition, it is explained how distributed compression is connected to other problems in quantum information theory, including information-disturbance questions, entanglement distillation and quantum error correction.
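
For orientation, the classical benchmark referred to above is the Slepian-Wolf rate region (a standard result, stated here for reference rather than taken from the paper): separately encoding correlated classical sources X and Y at rates (R_X, R_Y), with joint decoding, succeeds if and only if

```latex
\begin{align}
  R_X       &\ge H(X \mid Y), \\
  R_Y       &\ge H(Y \mid X), \\
  R_X + R_Y &\ge H(X, Y).
\end{align}
```

The paper's lower bound shows that, for irreducible sources of product states, quantum distributed compression cannot in general achieve the rate sum a naive translation of this region would suggest.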


Abstract:

Online geographic information systems provide the means to extract a subset of desired spatial information from a larger remote repository. Data retrieved representing real-world geographic phenomena are then manipulated to suit the specific needs of an end-user. Often this extraction requires the derivation of representations of objects specific to a particular resolution or scale from a single original stored version. Currently, standard spatial data handling techniques cannot support the multi-resolution representation of such features in a database. In this paper a methodology to store and retrieve versions of spatial objects at different resolutions with respect to scale, using standard database primitives and SQL, is presented. The technique involves heavy fragmentation of spatial features, which allows dynamic simplification into scale-specific object representations customised to the display resolution of the end-user's device. Experimental results comparing the new approach to traditional R-Tree indexing and external object simplification reveal that the new approach performs notably better for mobile and WWW applications, where client-side resources are limited and retrieved data loads are kept relatively small.
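
A minimal sketch of the fragmentation idea follows; the schema, column names, and scale convention are our assumptions for illustration, not the paper's. Each feature is stored as many small fragments tagged with the coarsest scale (largest denominator) at which they are still drawn, so a single SQL range query assembles a representation matched to the client's display resolution:

```python
# Schema, column names, and scale convention are illustrative assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE fragment (
    feature_id    INTEGER,
    seq           INTEGER,   -- drawing order within the feature
    coarsest_den  REAL,      -- largest scale denominator at which it is drawn
    x REAL, y REAL)""")
db.executemany("INSERT INTO fragment VALUES (?,?,?,?,?)", [
    (1, 0, 1e6, 0.0,  0.0),   # backbone vertex, survives down to 1:1,000,000
    (1, 1, 5e4, 5.0,  6.0),   # detail vertex, drawn only at 1:50,000 or finer
    (1, 2, 1e6, 10.0, 10.0),  # backbone vertex
])

def representation(feature_id, display_den):
    """Assemble the version of a feature matching the display scale."""
    rows = db.execute(
        "SELECT x, y FROM fragment "
        "WHERE feature_id = ? AND coarsest_den >= ? ORDER BY seq",
        (feature_id, display_den))
    return rows.fetchall()

print(representation(1, 1e6))  # coarse: [(0.0, 0.0), (10.0, 10.0)]
print(representation(1, 5e4))  # fine: detail vertex included
```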


Abstract:

The notion of compensation is widely used in advanced transaction models as a means of recovery from failure. Similar concepts are adopted to provide transaction-like behaviour for long-running business processes supported by workflow technology. In general, it is not trivial to design compensating tasks for tasks in the context of a workflow. In fact, a task in a workflow process need not be compensatable, in the sense that the application semantics do not always guarantee that the task's operations can be reversed. In addition, the isolation requirement on data resources may make a task difficult to compensate. In this paper, we first look into the requirements that a compensating task has to satisfy. Then we introduce a new concept called confirmation. With the help of confirmation, we are able to modify most non-compensatable tasks so that they become compensatable. This can substantially increase the availability of shared resources and greatly improve backward recovery for workflow applications in case of failures. To effectively incorporate confirmation and compensation into a workflow management environment, a three-level bottom-up workflow design method is introduced. The implementation issues of this design are also discussed. (C) 2003 Elsevier Science Inc. All rights reserved.
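
A minimal sketch of the confirmation idea, with names and semantics of our own choosing rather than the paper's: a task first takes a tentative, reversible action (for example, holding funds instead of debiting them); a later confirmation makes it permanent, while compensation releases the hold on failure:

```python
# Illustrative sketch: a non-compensatable "debit" rewritten as
# reserve -> confirm / compensate. Names and semantics are ours.
class ReserveCredit:
    def __init__(self, account, amount):
        self.account, self.amount = account, amount

    def execute(self):       # tentative: hold funds instead of debiting
        self.account["held"] += self.amount

    def confirm(self):       # forward recovery: make the debit permanent
        self.account["held"] -= self.amount
        self.account["balance"] -= self.amount

    def compensate(self):    # backward recovery: release the hold
        self.account["held"] -= self.amount

def run_workflow(tasks):
    done = []
    try:
        for t in tasks:
            t.execute()
            done.append(t)
        for t in done:             # whole process succeeded: confirm all
            t.confirm()
    except Exception:
        for t in reversed(done):   # failure: compensate in reverse order
            t.compensate()
        raise

acct = {"balance": 100, "held": 0}
run_workflow([ReserveCredit(acct, 30)])
print(acct)   # {'balance': 70, 'held': 0}
```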


Abstract:

The data structure of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. This research develops a methodology for evaluating, ex ante, the relative desirability of alternative data structures for end user queries. This research theorizes that the data structure that yields the lowest weighted average complexity for a representative sample of information requests is the most desirable data structure for end user queries. The theory was tested in an experiment that compared queries from two different relational database schemas. As theorized, end users querying the data structure associated with the less complex queries performed better. Complexity was measured using three different Halstead metrics, each of which provided excellent predictions of end user performance. This research supplies strong evidence that organizations can use complexity metrics to evaluate, ex ante, the desirability of alternative data structures. Organizations can use these evaluations to enhance the efficient and effective retrieval of information by creating data structures that minimize end user query complexity.
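
The Halstead formulas themselves are standard; the sketch below computes them for a SQL query, where the crude tokenizer and the operator/operand split are simplifying assumptions rather than the paper's instrument:

```python
# Standard Halstead metrics applied to a SQL query. The tokenizer and the
# operator/operand classification are deliberately simplistic.
import math
import re

OPERATORS = {"select", "from", "where", "join", "on", "and", "or", "=", ",", "."}

def halstead(query):
    tokens = re.findall(r"\w+|[=,.]", query.lower())
    ops  = [t for t in tokens if t in OPERATORS]      # operators
    opds = [t for t in tokens if t not in OPERATORS]  # operands (names, values)
    n1, n2 = len(set(ops)), len(set(opds))            # distinct counts
    N1, N2 = len(ops), len(opds)                      # total counts
    vocabulary = n1 + n2
    length     = N1 + N2
    volume     = length * math.log2(vocabulary)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"volume": volume, "difficulty": difficulty,
            "effort": volume * difficulty}

q = "SELECT name FROM employee JOIN dept ON employee.dept_id = dept.id"
print(halstead(q))
```

A lower weighted average of such scores over a representative query sample would, on the paper's theory, mark the more desirable schema.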


Abstract:

Fault diagnosis has become an important component in intelligent systems, such as intelligent control systems and intelligent eLearning systems. Reiter's diagnosis theory, described by first-order sentences, has attracted much attention in this field. However, descriptions and observations of most real-world situations involve fuzziness because of the incompleteness and uncertainty of knowledge; consider, e.g., the fault diagnosis of student behaviours in eLearning processes. In this paper, an extension of Reiter's consistency-based diagnosis methodology, Fuzzy Diagnosis, is proposed, which is able to deal with incomplete or fuzzy knowledge. A number of important properties of the Fuzzy Diagnosis scheme are also established. Computing fuzzy diagnoses is mapped to solving a system of inequalities. Some special cases, abstracted from real-world situations, are discussed. In particular, the fuzzy diagnosis problem in which fuzzy observations are represented by clause-style fuzzy theories is presented, together with a method for solving it. A student fault diagnostic problem, abstracted from a simplified real-world eLearning case, is described to demonstrate the application of our diagnostic framework.
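
For orientation, the crisp core of Reiter's approach, which the paper generalises to fuzzy knowledge, characterises diagnoses as minimal hitting sets of the conflict sets produced by consistency checking. A brute-force sketch follows; the eLearning example data is hypothetical:

```python
# Crisp consistency-based diagnosis: a diagnosis is a minimal set of
# components (here, student skills) that intersects every conflict set.
from itertools import combinations

def minimal_diagnoses(components, conflicts):
    """Enumerate minimal hitting sets of the given conflict sets."""
    diagnoses = []
    for size in range(len(components) + 1):
        for cand in combinations(sorted(components), size):
            s = set(cand)
            if all(s & c for c in conflicts):            # hits every conflict
                if not any(d <= s for d in diagnoses):   # keep minimal only
                    diagnoses.append(s)
    return diagnoses

# Hypothetical example: each conflict is a set of skills that cannot all be
# intact given an observed wrong answer.
conflicts = [{"algebra", "logic"}, {"logic", "recursion"}]
print(minimal_diagnoses({"algebra", "logic", "recursion"}, conflicts))
# -> [{'logic'}, {'algebra', 'recursion'}]
```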


Abstract:

This paper presents a new low-complexity multicarrier modulation (MCM) technique based on lattices which achieves a peak-to-average power ratio (PAR) as low as three. The scheme can be viewed as a drop-in replacement for the discrete multitone (DMT) modulation of an asymmetric digital subscriber line modem. We show that the lattice-MCM retains many of the attractive features of sinusoidal MCM, and does so with lower implementation complexity, O(N), compared with DMT, which requires O(N log N) operations. We also present techniques for narrowband interference rejection and power profiling. Simulation studies confirm that the performance of the lattice-MCM is superior, even compared with recent techniques for PAR reduction in DMT.
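
The sketch below is not the lattice scheme itself; it merely measures the quantity at stake, the PAR of conventional IFFT-based DMT symbols, which typically lands well above the value of three that the lattice-MCM is designed to achieve:

```python
# Measure the peak-to-average power ratio of conventional IFFT-based DMT
# symbols with random 4-QAM loading (illustrative baseline, not the paper's
# lattice scheme).
import numpy as np

rng = np.random.default_rng(0)
N = 512                                   # number of subcarriers
worst = 0.0
for _ in range(1000):
    # random 4-QAM symbol on each subcarrier
    X = rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)
    x = np.fft.ifft(X)                    # time-domain DMT symbol
    par = np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2)
    worst = max(worst, par)

print(f"worst-case PAR over 1000 DMT symbols: {worst:.1f}")
# Typically an order of magnitude above the lattice-MCM's figure of 3.
```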


Abstract:

A variety of current and future wired and wireless networking technologies can be transformed into a seamless communication environment through the application of context-based vertical handovers. Such seamless communication environments are needed for future pervasive/ubiquitous systems. Pervasive systems are context aware and need to adapt to context changes, including network disconnections and changes in network Quality of Service (QoS). Vertical handover is one of many possible adaptation methods. It allows users to roam freely between heterogeneous networks while maintaining the continuity of their applications. This paper proposes a vertical handover mechanism suitable for multimedia applications in pervasive systems. The paper focuses on the handover decision-making process, which uses context information regarding user devices, user location, network environment, and requested QoS. (C) 2004 Elsevier B.V. All rights reserved.
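
A deliberately simple sketch of the decision step follows; the attributes, weights, and scoring rule are illustrative assumptions, not the paper's mechanism. Candidate networks are scored against the requested QoS, and a handover is triggered only when a candidate beats the current network:

```python
# Toy context-based handover decision: weighted scoring of candidate networks
# against requested QoS. Attributes and weights are illustrative.
WEIGHTS = {"bandwidth": 0.5, "latency": 0.3, "cost": 0.2}

def score(net, required_bw):
    if net["bandwidth"] < required_bw:    # hard QoS constraint
        return float("-inf")
    return (WEIGHTS["bandwidth"] * net["bandwidth"] / 100.0
            - WEIGHTS["latency"] * net["latency"] / 100.0
            - WEIGHTS["cost"] * net["cost"])

def handover_target(candidates, current, required_bw):
    best = max(candidates, key=lambda n: score(n, required_bw))
    # hand over only if the best candidate genuinely beats the current network
    if score(best, required_bw) > score(current, required_bw):
        return best
    return current

wlan = {"name": "wlan", "bandwidth": 54.0, "latency": 20.0, "cost": 0.0}
umts = {"name": "umts", "bandwidth": 2.0, "latency": 80.0, "cost": 0.5}
print(handover_target([wlan, umts], current=umts, required_bw=1.0)["name"])
# -> wlan
```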


Abstract:

Interconnecting business processes across systems and organisations is considered to provide significant benefits, such as greater process transparency, higher degrees of integration, facilitation of communication, and consequently higher throughput in a given time interval. However, achieving these benefits requires tackling certain constraints; in the context of this paper, these are the privacy requirements of the involved workflows and their mutual dependencies. Workflow views are a promising conceptual approach to addressing the issue of privacy; however, this approach requires addressing the interdependencies between a workflow view and its adjacent private workflow. In this paper we focus on three aspects concerning support for the execution of cross-organisational workflows that have been modelled with a workflow view approach: (i) communication between the entities of a view-based workflow model, (ii) their impact on an extended workflow engine, and (iii) the design of a cross-organisational workflow architecture (CWA). We consider communication aspects in terms of state dependencies and control flow dependencies. We propose to tightly couple the private workflow and its workflow view with state dependencies, while loosely coupling workflow views with control flow dependencies. We introduce a Petri-Net-based state transition approach that binds the states of private workflow tasks to their adjacent workflow view-task. On the basis of these communication aspects, we develop a CWA for view-based cross-organisational workflow execution. Its concepts are valid for mediated and unmediated interactions and do not prescribe a particular technology. The concepts are demonstrated by a scenario run by two extended workflow management systems. (C) 2004 Elsevier B.V. All rights reserved.
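
As a rough illustration of the tight state coupling (the aggregation rule below is our simplification, not the paper's Petri-Net-based binding), a view task can derive its externally visible state from the states of the private tasks it abstracts, so partners see consistent progress without seeing internals:

```python
# Toy state dependency: one workflow-view task aggregates the states of the
# private tasks it hides. State names follow common workflow usage.
def view_state(private_states):
    """Derive a view task's state from its private tasks' states."""
    if all(s == "initial" for s in private_states):
        return "initial"
    if all(s == "completed" for s in private_states):
        return "completed"
    return "running"   # any mixture means the view task is in progress

# A view task "handle order" hiding three private tasks:
print(view_state(["initial", "initial", "initial"]))        # initial
print(view_state(["completed", "running", "initial"]))      # running
print(view_state(["completed", "completed", "completed"]))  # completed
```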


Abstract:

Much research has been devoted over the years to investigating and advancing the techniques and tools used by analysts when they model. In contrast to what academics, software providers, and their resellers promote as best practice, the aim of this research was to determine whether practitioners still embrace conceptual modeling seriously, which techniques and tools are most popular for conceptual modeling, and the major purposes for which conceptual modeling is used. The study found that the six most frequently used modeling techniques and methods were ER diagramming, data flow diagramming, systems flowcharting, workflow modeling, UML, and structured charts. Modeling technique use was found to decrease significantly from smaller to medium-sized organizations, but then to increase significantly in larger organizations (proxying for large, complex projects). Technique use was also found to follow an inverted U-shaped curve, contrary to some prior explanations. Additionally, an important contribution of this study was the identification of the factors that uniquely influence the decision of analysts to continue to use modeling, viz., communication (using diagrams) to/from stakeholders, internal knowledge (lack of) of techniques, user expectations management, understanding models' integration into the business, and tool/software deficiencies. The highest-ranked purposes for which modeling was undertaken were database design and management, business process documentation, business process improvement, and software development. (c) 2005 Elsevier B.V. All rights reserved.