215 results for Computing Classification Systems
Abstract:
Clinical information systems have become important tools in contemporary clinical patient care. However, it is questionable whether current clinical information systems effectively support clinicians in decision-making processes. We conducted a survey to identify decision-making issues related to the use of existing clinical information systems. The survey was conducted among end users of the cardiac surgery unit, quality and safety unit, intensive care unit and clinical costing unit at The Prince Charles Hospital (TPCH). Based on the survey results and the reviewed literature, we identified that support from the current information systems for decision making is limited. Survey results also showed that the majority of respondents considered lack of data integration to be one of the major issues, followed by limited access to various databases, lack of time, and lack of efficient reporting and analysis tools. Furthermore, respondents pointed out that data quality is an issue; the three major data quality issues faced are lack of data completeness, lack of consistency and lack of data accuracy. Conclusion: Current clinical information systems' support for decision-making processes in cardiac surgery at this institution is limited, and this could be addressed by integrating isolated clinical information systems.
Abstract:
In this paper, we present the outcomes of a project on the exploration of the use of Field Programmable Gate Arrays (FPGAs) as co-processors for scientific computation. We designed a custom circuit for the pipelined solving of multiple tri-diagonal linear systems. The design is well suited for applications that require many independent tri-diagonal system solves, such as finite difference methods for solving PDEs or applications utilising cubic spline interpolation. The selected solver algorithm was the Tri-Diagonal Matrix Algorithm (TDMA or Thomas Algorithm). Our solver supports user-specified precision through the use of a custom floating point VHDL library supporting addition, subtraction, multiplication and division. The variable precision TDMA solver was tested for correctness in simulation mode. The TDMA pipeline was tested successfully in hardware using a simplified solver model. The details of implementation, the limitations, and future work are also discussed.
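For readers unfamiliar with the Thomas Algorithm named above, the following is a minimal reference sketch in plain Python. The function name and pure-software form are illustrative only; the work described implements the algorithm as a pipelined, variable-precision VHDL circuit, not as software.

```python
def thomas_solve(a, b, c, d):
    """Solve a tri-diagonal system A x = d via the Thomas Algorithm (TDMA).

    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side.
    All are lists of length n. Returns the solution as a list x.
    """
    n = len(d)
    cp = [0.0] * n  # modified super-diagonal (forward sweep)
    dp = [0.0] * n  # modified right-hand side (forward sweep)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward elimination: reduce the system to upper bi-diagonal form.
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    # Back substitution.
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Each solve uses only O(n) operations and the per-row recurrences are independent across systems, which is what makes batches of such solves amenable to the pipelined FPGA treatment the abstract describes.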
Abstract:
EHealth systems promise enviable benefits and capabilities for healthcare. However, the technologies that make these capabilities possible bring with them undesirable drawbacks, such as information security threats, which need to be appropriately addressed. Lurking in these threats are patient privacy concerns. Addressing these privacy concerns has proven difficult, since they often conflict with the information requirements of care providers. It is important to achieve a proper balance between these requirements. We believe that information accountability can achieve this balance. In this paper we introduce accountable-eHealth (AeH) systems. We discuss how our designed protocols can successfully address the aforementioned requirements. We also compare the characteristics of AeH systems with Australia’s PCEHR system, identify similarities, and highlight the differences and the impact those differences would have on the eHealth domain.
Abstract:
Information privacy requirements of patients and the information requirements of healthcare providers (HCP) are competing concerns. Reaching a balance between these requirements has proven difficult but is crucial for the success of eHealth systems. The traditional approaches to information management have been preventive measures which either allow or deny access to information. We believe that this approach is inappropriate for a domain such as healthcare. We contend that introducing information accountability (IA) to eHealth systems can reach the aforementioned balance without the need for rigid information control. IA is a fairly new concept in computer science; hence, there are no unambiguously accepted principles as yet, but the concept delivers promising advantages for information management in a robust manner. Accountable-eHealth (AeH) systems are eHealth systems which use IA principles as the measure for privacy and information management. AeH systems face three main impediments: technological, social and ethical, and legal. In this paper, we present the AeH model and focus on the legal aspects of AeH systems in Australia. We investigate the legislation currently available in Australia regarding health information management and identify future legal requirements if AeH systems are to be implemented in Australia.
Abstract:
In recent years there has been a large emphasis placed on the need to use Learning Management Systems (LMS) in the field of higher education, with many universities mandating their use. An important aspect of these systems is their ability to offer collaboration tools to build a community of learners. This paper reports on a study of the effectiveness of an LMS (Blackboard©) in a higher education setting and whether both lecturers and students voluntarily use collaborative tools for teaching and learning. Interviews were conducted with participants (N=67) from the faculties of Science and Technology, Business, Health and Law. Results from this study indicated that participants often use Blackboard© as an online repository of learning materials and that the collaboration tools of Blackboard© are often not utilised. The study also found that several factors have inhibited the use and uptake of the collaboration tools within Blackboard©. These have included structure and user experience, pedagogical practice, response time and a preference for other tools.
Abstract:
The Cross-Entropy (CE) method is an efficient approach to the estimation of rare-event probabilities and to combinatorial optimization. This work presents a novel application of CE to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision-avoidance task. The only sensor used to accomplish this task was a forward-facing camera. CE is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was carried out using the ROS-Gazebo simulation system. In order to evaluate the optimization, a large number of tests was carried out with a real quadcopter.
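The CE optimization loop referred to above can be sketched as follows. This is a generic illustration under stated assumptions: the quadratic cost function is a stand-in placeholder, since in the described work each candidate set of scaling factors is scored by running the fuzzy controller in ROS-Gazebo; all names and hyperparameters here are illustrative.

```python
import random
import statistics

def cost(scales):
    # Placeholder cost with a known optimum at scales = (1.0, 2.0).
    # In the described work, this would be a simulated collision-avoidance run.
    return (scales[0] - 1.0) ** 2 + (scales[1] - 2.0) ** 2

def cross_entropy(n_iter=50, n_samples=100, elite_frac=0.2, seed=0):
    """Cross-Entropy optimization of two controller scaling factors."""
    rng = random.Random(seed)
    mu = [0.0, 0.0]      # mean of the Gaussian sampling distribution
    sigma = [2.0, 2.0]   # standard deviation per scaling factor
    n_elite = int(n_samples * elite_frac)
    for _ in range(n_iter):
        # Sample candidate scaling-factor vectors.
        samples = [[rng.gauss(m, s) for m, s in zip(mu, sigma)]
                   for _ in range(n_samples)]
        # Keep the elite fraction with the lowest cost.
        samples.sort(key=cost)
        elite = samples[:n_elite]
        # Refit the sampling distribution to the elite set.
        mu = [statistics.mean(e[i] for e in elite) for i in range(2)]
        sigma = [statistics.stdev(e[i] for e in elite) + 1e-6
                 for i in range(2)]
    return mu
```

The loop narrows the sampling distribution around low-cost candidates each iteration, which is why CE needs only a cost ranking (not gradients) and suits simulation-in-the-loop controller tuning.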
Abstract:
Substantial research efforts have been expended to deal with the complexity inherent in the analysis of concurrent systems, e.g., works that tackle the well-known state space explosion problem. Approaches differ in the classes of properties that they are able to suitably check, and this is largely a result of the way they balance the trade-off between analysis time and the space employed to describe a concurrent system. One interesting class of properties is concerned with behavioral characteristics. These properties are conveniently expressed in terms of computations, or runs, in concurrent systems. This article introduces the theory of untanglings, which exploits a particular representation of a collection of runs in a concurrent system. It is shown that a representative untangling of a bounded concurrent system can be constructed that captures all and only the behavior of the system. Representative untanglings strike a unique balance between time and space, yet provide a single model for the convenient extraction of various behavioral properties. Performance measurements in terms of construction time and size of representative untanglings with respect to the original specifications of concurrent systems, conducted on a collection of models from practice, confirm the scalability of the approach. Finally, this article demonstrates practical benefits of using representative untanglings when checking various behavioral properties of concurrent systems.
Abstract:
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The average classification accuracies of the participants on the three Neo-Piagetian stages were 85%, 71% and 78%. Participants also rated their agreement with the expert classifications, and indicated high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions before and after the tutorial was 29% and 75% respectively. Our key contribution is the demonstration of the feasibility of the Neo-Piagetian approach to classifying assessment materials, by demonstrating that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Abstract:
Research on Enterprise Resource Planning (ERP) Systems has become a well-established theme in Information Systems (IS) research. ERP Systems, given their unique differentiation from other IS applications, have provided an interesting backdrop to test and re-test some of the key and fundamental concepts in IS. While some researchers have tested well-established concepts of technology acceptance, system usage and system success in the context of ERP Systems, others have researched how new paradigms like cloud computing and social media integrate with ERP Systems. Moreover, ERP Systems have provided the context for cross-disciplinary research such as knowledge management, project management and business process management research. Almost two decades after its inception in IS research, this paper provides a critique of 198 papers published on ERP Systems between 2006 and 2012. We observe patterns in ES research, provide comparisons to past studies and propose future research directions.
Abstract:
The availability of health information is rapidly increasing; its expansion and proliferation is inevitable. At the same time, the breeding of health information silos is an unstoppable and relentless exercise. Information security and privacy concerns are therefore major barriers in the eHealth socio-ecosystem. We proposed Information Accountability as a measurable human factor that should eliminate and mitigate security concerns. Information accountability measures would be practicable and feasible if legislative requirements were also embedded. In this context, information accountability constitutes a key component for the development of effective information technology requirements for health information systems. Our conceptual approach to measuring human factors related to information accountability in eHealth is presented in this paper, along with its limitations.
Abstract:
Existing distinctions between macro and micro approaches have been hindering the advance of Information Systems (IS) research. Both approaches have been criticized for explaining one level while neglecting the other; the current situation therefore calls for multilevel research to address these deficiencies. Instead of studying a single level (macro or micro), multilevel research entails more than one level of conceptualization and analysis simultaneously. As the notion of multilevel is borrowed from reference disciplines, there tend to be confusions and inconsistencies within the IS discipline, which hinder the adoption of multilevel research. This paper speaks for the potential value of multilevel research by investigating its current application status within the IS domain. A content analysis of multilevel research articles from major IS conferences and journals is presented. The analysis results suggest that IS scholars have applied multilevel research to produce high-quality work across a variety of topics. However, researchers have not yet been defining “multilevel” consistently, leading to idiosyncratic meanings of multilevel research resting, most often, on authors’ own interpretations. We argue that a rigorous definition of “multilevel research” needs to be explicated for consistency in the research community.
Abstract:
Qualitative research methods are widely accepted in Information Systems, and multiple approaches have been successfully used in IS qualitative studies over the years. These approaches include narrative analysis, discourse analysis, grounded theory, case study, ethnography and phenomenological analysis. Guided by critical, interpretive and positivist epistemologies (Myers 1997), qualitative methods are continuously growing in importance in our research community. In this special issue, we adopt Van Maanen's (1979: 520) definition of qualitative research as an umbrella term to cover an “array of interpretive techniques that can describe, decode, translate, and otherwise come to terms with the meaning, not the frequency, of certain more or less naturally occurring phenomena in the social world”. In the call for papers, we stated that the aim of the special issue was to provide a forum within which we can present and debate the significant number of issues, results and questions arising from the pluralistic approach to qualitative research in Information Systems. We recognise both the potential and the challenges that qualitative approaches offer for accessing the different layers and dimensions of a complex and constructed social reality (Orlikowski, 1993). The special issue is also a response to the need to showcase the current state of the art in IS qualitative research and highlight advances and issues encountered in the process of continuous learning, including questions about its ontology, epistemological tenets, theoretical contributions and practical applications.
Abstract:
An environmentally sustainable and thus green business process is one that delivers organizational value whilst also exerting a minimal impact on the natural environment. Recent works from the field of Information Systems (IS) have argued that information systems can contribute to the design and implementation of sustainable business processes. While prior research has investigated how information systems can be used in order to support sustainable business practices, there is still a void as to the actual changes that business processes have to undergo in order to become environmentally sustainable, and the specific role that information systems play in enabling this change. In this paper, we provide a conceptualization of environmentally sustainable business processes, and discuss the role of functional affordances of information systems in enabling both incremental and radical changes in order to make processes environmentally sustainable. Our conceptualization is based on (a) a fundamental definition of the concept of environmental sustainability, grounded in two basic components: the environmental source and sink functions of any project or activity, and (b) the concept of functional affordances, which describe the potential uses originating in the material properties of information systems in relation to their use context. In order to illustrate the application of our framework and provide a first evaluation, we analyse two examples from prior research where information systems impacted on the sustainability of business processes.
Abstract:
Enterprise Systems (ES) can be understood as the de facto standard for holistic operational and managerial support within an organization. Most commonly, ES are offered as commercial off-the-shelf packages, requiring customization in the user organization. This process is a complex and resource-intensive task, which often prevents small and midsize enterprises (SMEs) from undertaking configuration projects. Especially in the SME market, independent software vendors provide pre-configured ES for a small customer base. The problem of ES configuration is thus shifted from the customer to the vendor, but remains critical. We argue that the as-yet unexplored link between process configuration and business document configuration must be examined more closely, as both types of configuration are closely tied to one another.