824 results for information systems theory
Abstract:
Increasingly, national and international governments have a strong mandate to develop national e-health systems to enable delivery of much-needed healthcare services. Research is therefore needed into appropriate security and reliance structures for the development of health information systems that must comply with governmental and similar obligations. The protection of e-health information security is critical to the successful implementation of any e-health initiative. To address this, this paper proposes a security architecture for index-based e-health environments, following the broad outline of Australia's National E-health Strategy and the National E-health Transition Authority (NEHTA)'s Connectivity Architecture. The proposal, however, could equally be applied to any distributed, index-based health information system that references disparate health information systems. The practicality of the proposed security architecture is supported through an experimental demonstration. The successful completion of this prototype demonstrates the comprehensibility of the proposed architecture, and the clarity and feasibility of its system specifications, in enabling ready development of such a system. The test vehicle has also indicated a number of parameters that need to be considered in the design of any national index-based e-health system with reasonable levels of system security. The paper also identifies the need to evaluate the levels of education, training, and expertise required to create such a system.
Abstract:
Introduction: Why we need to base children's sport and physical education on the principles of dynamical systems theory and ecological psychology. As the childhood years are crucial for developing many physical skills as well as establishing the groundwork leading to lifelong participation in sport and physical activities (Orlick & Botterill, 1977, p. 11), it is essential to examine current practice to make sure it is meeting the needs of children. In recent papers (e.g. Renshaw, Davids, Chow & Shuttleworth, in press; Renshaw, Davids, Chow & Hammond, in review; Chow et al., 2009) we have highlighted that a guiding theoretical framework is needed to provide a principled approach to teaching and coaching, and that the approach must be evidence-based and focused on mechanism, not just on operational issues such as practice, competition and programme management (Lyle, 2002). There is a need to demonstrate how nonlinear pedagogy underpins teaching and coaching practice for children, given that some of the current approaches underpinning children's sport and P.E. may not be leading to optimal results. For example, little time is spent undertaking physical activities (Tinning, 2006), and much of this practice is not representative of the competition demands of the performance environment (Kirk & McPhail, 2002; Renshaw et al., 2008). Proponents of a nonlinear pedagogy advocate the design of practice by applying key concepts such as the mutuality of the performer and environment, the tight coupling of perception and action, and the emergence of movement solutions through self-organisation under constraints (see Renshaw et al., in press). As skills are shaped by the unique interacting individual, task and environmental constraints in these learning environments, small changes to individual structural constraints (e.g. height or limb length), functional constraints (e.g. motivation, perceptual skills, or strength, which can be acquired), task rules, equipment, or environmental constraints can lead to dramatic changes in the movement patterns learners adopt to solve performance problems. The aim of this chapter is to provide real-life examples for teachers and coaches who wish to adopt the ideas of nonlinear pedagogy in their practice. Specifically, I will provide examples of the unique challenges that individual constraints pose for coaches of children, particularly when those constraints are changing due to growth and development. Part two focuses on understanding how cultural environmental constraints impact on children's sport; this area has received very little attention but plays a very important part in the long-term development of sporting expertise. Finally, I will look at how coaches can manipulate task constraints to create effective learning environments for young children.
Abstract:
Bounded parameter Markov Decision Processes (BMDPs) address the issue of dealing with uncertainty in the parameters of a Markov Decision Process (MDP). Unlike the case of an MDP, the notion of an optimal policy for a BMDP is not entirely straightforward. We consider two notions of optimality based on optimistic and pessimistic criteria. These have been analyzed for discounted BMDPs. Here we provide results for average reward BMDPs. We establish a fundamental relationship between the discounted and the average reward problems, prove the existence of Blackwell optimal policies and, for both notions of optimality, derive algorithms that converge to the optimal value function.
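The interval value iteration underlying such algorithms can be sketched as follows. This is a minimal illustration of the discounted case only (the paper's average-reward algorithms are more involved), and the helper name `order_maximizing_distribution` is mine, not the paper's:

```python
import numpy as np

def order_maximizing_distribution(p_lo, p_hi, order):
    """Pick a transition distribution within [p_lo, p_hi] that places as much
    mass as possible on states ranked earliest in `order` (a standard BMDP
    subroutine: start from the lower bounds, then greedily top up)."""
    p = p_lo.copy()
    remaining = 1.0 - p.sum()
    for s in order:
        add = min(p_hi[s] - p_lo[s], remaining)
        p[s] += add
        remaining -= add
    return p

def interval_value_iteration(R, P_lo, P_hi, gamma=0.9, optimistic=True, iters=500):
    """Discounted interval value iteration.
    R[s, a] is the reward; P_lo/P_hi[s, a, s'] bound the transition probabilities."""
    n_states, n_actions = R.shape
    V = np.zeros(n_states)
    for _ in range(iters):
        # Rank states best-first (optimistic) or worst-first (pessimistic).
        order = np.argsort(-V) if optimistic else np.argsort(V)
        Q = np.empty((n_states, n_actions))
        for s in range(n_states):
            for a in range(n_actions):
                p = order_maximizing_distribution(P_lo[s, a], P_hi[s, a], order)
                Q[s, a] = R[s, a] + gamma * p @ V
        V = Q.max(axis=1)
    return V
```

With tight bounds (`P_lo == P_hi`) this reduces to ordinary value iteration; loosening the bounds drives the optimistic and pessimistic value functions apart.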
Abstract:
The paper "The importance of convexity in learning with squared loss" gave a lower bound on the sample complexity of learning with the quadratic loss using a nonconvex function class. The proof contains an error. We show that the lower bound still holds under a stronger condition that is satisfied in many cases of interest.
Abstract:
This paper presents a framework for evaluating information retrieval of medical records. We use the BLULab corpus, a large collection of real-world de-identified medical records. The collection has been hand-coded by clinical terminologists using the ICD-9 medical classification system. The ICD codes are used to devise queries and relevance judgements for this collection. Results of initial test runs using a baseline IR system are provided. The queries and relevance judgements are available online to aid further research in medical IR. Please visit: http://koopman.id.au/med_eval.
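A minimal sketch of how relevance judgements derived from ICD codes can be used to score a retrieval run. The function names and the in-memory run/qrels representation are assumptions for illustration, not the authors' actual tooling:

```python
def precision_at_k(ranked_docs, relevant, k=10):
    """Fraction of the top-k retrieved documents that are judged relevant."""
    top = ranked_docs[:k]
    return sum(1 for d in top if d in relevant) / k

def evaluate_run(run, qrels, k=10):
    """Mean precision@k over all queries.
    run:   query id -> ranked list of document ids (system output)
    qrels: query id -> set of relevant document ids (ICD-derived judgements)"""
    scores = [precision_at_k(docs, qrels.get(q, set()), k)
              for q, docs in run.items()]
    return sum(scores) / len(scores)
```

The same run/qrels split mirrors the usual TREC-style evaluation workflow, so a baseline system's output can be scored against the hand-coded collection directly.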
Abstract:
The adoption of IT Governance (ITG) continues to be an important topic for research. Many researchers have focused their attention on how these practices are currently implemented across diverse areas and industries. The literature shows that the majority of these studies are based on industries and organizations in developed countries; very few look specifically at the context of a developing country. Furthermore, there seems to be a lack of research identifying the barriers or inhibitors to IT Governance adoption within the context of an emerging yet still developing Asian country. This research sets out to justify, substantiate and improve an a priori model developed to study the barriers to the adoption of ITG practice, using qualitative data obtained through a series of semi-structured interviews conducted with organizations in Malaysia.
Abstract:
The School of Electrical and Electronic Systems Engineering of Queensland University of Technology (like many other universities around the world) has recognised the importance of complementing the teaching of signal processing with computer-based experiments. A laboratory has been developed to provide a "hands-on" approach to the teaching of signal processing techniques. The motivation for the development of this laboratory was the cliche "What I hear I remember, but what I do I understand." The laboratory has been named the "Signal Computing and Real-time DSP Laboratory" and provides practical training to approximately 150 final-year undergraduate students each year. The paper describes the novel features of the laboratory, techniques used in the laboratory-based teaching, interesting aspects of the experiments that have been developed, and student evaluation of the teaching techniques.
Abstract:
We present a modification of the algorithm of Dani et al. [8] for the online linear optimization problem in the bandit setting, which with high probability has regret at most O*(√T) against an adaptive adversary. This improves on the previous algorithm [8], whose regret is bounded in expectation against an oblivious adversary. We obtain the same dependence on the dimension (n^(3/2)) as that exhibited by Dani et al. The results of this paper rest firmly on those of [8] and the remarkable technique of Auer et al. [2] for obtaining high-probability bounds via optimistic estimates. This paper answers an open question: it eliminates the gap between the high-probability bounds obtained in the full-information and bandit settings.
Abstract:
Conceptual modeling continues to be an important means for graphically capturing the requirements of an information system. Observations of modeling practice suggest that modelers often use multiple modeling grammars in combination to articulate various aspects of real-world domains. We extend an ontological theory of representation to suggest why and how users employ multiple conceptual modeling grammars in combination. We provide an empirical test of the extended theory using survey data and structured interviews about the use of traditional and structured analysis grammars within an automated tool environment. We find that users of the analyzed tool combine grammars to overcome the ontological incompleteness of each individual grammar. Users also selected their starting grammar only from a predicted subset of grammars. The qualitative data provide insights into why some of the predicted deficiencies manifest differently in practice than predicted.
Abstract:
Modelling how a word is activated in human memory is an important requirement for determining the probability of recall of a word in an extra-list cueing experiment. The spreading activation, spooky-action-at-a-distance and entanglement models have all been used to model the activation of a word. Recently a hypothesis was put forward that the mean activation levels of the respective models are ordered as follows: Spreading ≤ Entanglement ≤ Spooky-action-at-a-distance. This article investigates this hypothesis by means of a substantial empirical analysis of each model using the University of South Florida free association, rhyme and word fragment norms.
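As a toy illustration of how an activation level can be computed from free-association norms, the sketch below uses a simple two-step spreading rule (direct cue-to-target strength plus strength routed through mediating associates). This rule and the function name are assumptions for illustration; the article's exact formulations of the three models differ:

```python
import numpy as np

def spreading_activation(W, cue, target):
    """Toy two-step spreading rule over a free-association matrix.
    W[i, j] = probability that word j is produced as an associate of word i.
    Activation = direct cue->target strength plus strength flowing through
    every mediating associate m (cue -> m -> target)."""
    indirect = sum(W[cue, m] * W[m, target] for m in range(W.shape[0]))
    return W[cue, target] + indirect
```

With the USF norms, `W` would be the (sparse) matrix of forward associative strengths estimated from the free-association task; here any row-substochastic matrix works as input.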
Abstract:
Dashboards are expected to improve decision making by amplifying cognition and capitalizing on human perceptual capabilities. Hence, interest in dashboards has increased recently, which is also evident from the proliferation of dashboard solution providers in the market. Despite dashboards' popularity, little is known about the extent of their effectiveness, i.e. what types of dashboards work best for different users or tasks. In this paper, we conduct a comprehensive multidisciplinary literature review with the aim of identifying the critical issues organizations might need to consider when implementing dashboards. Dashboards are likely to succeed and solve the problems of presentation format and information load when certain visualization principles and features are present (e.g. a high data-ink ratio and drill-down features). We recommend that dashboards come with some level of flexibility, i.e. allowing users to switch between alternative presentation formats. Some theory-driven guidance through pop-ups and warnings can also help users select an appropriate presentation format. Given the dearth of research on dashboards, we conclude the paper with a research agenda that could guide future studies in this area.
Abstract:
Intermediaries have introduced electronic services with varying success. One of the problems an intermediary faces is deciding what kind of exchange service it should offer to its customers and suppliers. For example, should it only provide a catalogue, or should it also enable customers to order products? Developing the right exchange design is a complex undertaking because of the many design options on the one hand and the interests of multiple actors to be considered on the other. This is far more difficult than simple prescriptions like 'creating a win-win situation' suggest. We address this problem by developing design patterns for the exchanges between customers, intermediary, and suppliers related to role, linkage, transparency, and novelty choices. To develop these design patterns, we studied four distinct electronic intermediaries and identified exchange design choices that require trade-offs relating to the interests of customers, intermediary, and suppliers. The exchange design patterns contribute to the development of design theory for electronic intermediaries by filling a gap between basic business models and detailed business process designs.
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture this fundamentally discrete and stochastic nature of cellular biology, most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time steps, so faster approximations such as the Poisson and binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of the Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
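The contrast between the exact SSA and a Poisson τ-leap can be sketched for the single decay reaction A → ∅ with propensity c·n. This is a minimal illustrative example, not the paper's higher-order methods:

```python
import numpy as np

def ssa_decay(n0, c, t_end, rng):
    """Exact Gillespie SSA for A -> 0: one reaction fires at a time,
    with exponentially distributed waiting times 1/(c*n)."""
    n, t = n0, 0.0
    while n > 0:
        a = c * n                        # current propensity
        t += rng.exponential(1.0 / a)    # time to the next reaction
        if t > t_end:
            break
        n -= 1
    return n

def tau_leap_decay(n0, c, t_end, tau, rng):
    """Poisson tau-leap: over each interval of length tau, fire a Poisson
    number of reactions with mean c*n*tau (propensity held fixed)."""
    n, t = n0, 0.0
    while t < t_end and n > 0:
        k = rng.poisson(c * n * tau)
        n = max(n - k, 0)                # copy numbers must stay non-negative
        t += tau
    return n
```

For this reaction the exact mean at time t is n0·exp(−c·t), so averaging many runs of either simulator should approach that value, with the τ-leap incurring a small bias that shrinks with τ.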
Abstract:
There is increasing attention to the importance of Enterprise Systems (ES) and Information Systems (IS) for Small and Medium Enterprises (SMEs). The same attention must be reflected in the IS graduate curriculum. Studies reveal that, despite healthy demand from industry for IS management expertise, most IS graduates are ill-equipped to meet the challenges of modern organizations. The majority of contemporary firms, represented by SMEs, seek employees with a balance of business process knowledge and ES software skills. This article describes a curriculum that teaches Information Technology (IT) and IS management concepts in an SME context. The curriculum conceptualises a 'learn-by-doing' approach to provide business process and ES software-specific knowledge for its students. The approach recommends coverage of traditional content related to SMEs' operations, strategies, IT investment and management issues, while providing an increased focus on the strategic use of enterprise IT. The study addresses, to an extent, the perennial challenge of updating the IS curriculum given the rapid pace of technological change.