861 results for Curricular Support Data Analysis
Abstract:
Many initiatives to improve business processes are emerging. The essential roles and contributions of Business Analyst (BA) and Business Process Management (BPM) professionals to such initiatives have been recognized in both literature and practice. The roles and responsibilities of a BA or BPM practitioner typically require different skill sets; however, these differences are often vaguely defined. This vagueness creates much confusion in practice and academia. While both the BA and BPM communities have attempted to describe their domains through capability-defining empirical research and the development of bodies of knowledge, there has not yet been any attempt to identify the skills the two professions have in common and the points at which they are unique. This study aims to address this gap and presents the findings of a detailed content mapping exercise (using NVivo as a qualitative data analysis tool) of the International Institute of Business Analysis (IIBA®) Guide to the Business Analysis Body of Knowledge (BABOK® Guide) against core BPM competency and capability frameworks.
Abstract:
Background: Nurses face significant ongoing learning needs as a direct result of the continuous introduction of technological innovations and research developments in the healthcare environment. Despite an increased worldwide emphasis on the importance of continuing education, there continues to be an absence of empirical evidence of program and session effectiveness. Few studies determine whether continuing education enhances or develops practice, or establish the relative cost benefits of health professionals' participation in professional development. The implications for future clinical practice and associated educational approaches to meet the needs of an increasingly diverse, multigenerational and multicultural workforce are also not well documented. There is minimal research confirming that continuing education programs contribute to improved patient outcomes or to nurses' earlier detection of patient deterioration, or that standards of continuing competence are maintained, evidence-based practice is demonstrated, and international quality and safety benchmarks are adhered to. An integrated clinical learning model was developed to inform ongoing education for acute care nurses. Educational strategies included the use of integrated learning approaches, interactive teaching concepts and learner-centred pedagogies. A Respiratory Skills Update education (ReSKU) program was used as the content for the educational intervention to inform surgical nurses' clinical practice in the area of respiratory assessment. The aim of the research was to evaluate the effectiveness of implementing the ReSKU program, using these teaching and learning strategies and in the context of organisational utility, in improving surgical nurses' practice in the area of respiratory assessment. The education program aimed to facilitate better awareness, knowledge and understanding of respiratory dysfunction in the postoperative clinical environment. This research was guided by the work of Forneris (2004), who developed a theoretical framework to operationalise a critical thinking process incorporating the complexities of the clinical context. The framework used educational strategies that are learner-centred and participatory. These strategies aimed to engage the clinician in dynamic thinking processes in clinical practice situations guided by coaches and educators. Methods: A quasi-experimental pre-test, post-test, non-equivalent control group design was used to evaluate the impact of the ReSKU program on the clinical practice of surgical nurses. The research tested the hypothesis that participation in the ReSKU program improves the reported beliefs and attitudes of surgical nurses and increases their knowledge and reported use of respiratory assessment skills. The study was conducted in a 400-bed regional referral public hospital, the central hub of three smaller hospitals, in a health district servicing the coastal and hinterland areas north of Brisbane. The sample included 90 nurses working in the three surgical wards eligible for inclusion in the study. The experimental group consisted of 36 surgical nurses who had chosen to attend the ReSKU program and consented to be part of the study intervention group. The comparison group included the 39 surgical nurses who elected not to attend the ReSKU program but agreed to participate in the study. Findings: One of the most notable findings was that nurses choosing not to participate were older, more experienced and less well educated.
The data demonstrated a barrier to training that has implications for educational strategies, as this mature-aged cohort was less likely to take up educational opportunities. The study demonstrated statistically significant differences between groups in the reported use of respiratory assessment skills three months after ReSKU program attendance. Between-group data analysis indicated that the intervention group's reported beliefs and attitudes pertaining to the subscale descriptors showed statistically significant differences in three of the six subscales following attendance at the ReSKU program. These subscales included influence on nursing care, educational preparation and clinical development. The findings suggest that the use of an integrated educational model underpinned by a robust theoretical framework is a strong factor in shaping perceptions of the ReSKU program relating to attitudes and behaviour. There were minimal differences in knowledge between groups across time. Conclusions: This study was consistent with contemporary educational approaches, using multi-modal, interactive teaching strategies and a robust overarching theoretical framework to support the study concepts. The construct of critical thinking in the clinical context, combined with clinical reasoning and purposeful, collective reflection, was a powerful educational strategy to enhance competency and capability in clinicians.
Abstract:
Compared with viewing videos on PCs or TVs, users have a different experience when viewing videos on a mobile phone because of device features such as screen size and because of distinct usage contexts. To understand how mobile users' viewing experience is affected, we conducted a field user study with 42 participants in two typical usage contexts using a custom-designed iPhone application. With users' acceptance of mobile video quality as the index, the study addresses four aspects that influence user experience: context, content type, encoding parameters and user profile. Accompanying the quantitative method (acceptance assessment), we used a qualitative interview method to obtain a deeper understanding of users' assessment criteria and to support the quantitative results from the users' perspective. Based on the results of the data analysis, we advocate two user-driven strategies to adaptively provide an acceptable quality and to predict a good user experience, respectively. There are two main contributions in this paper. First, the field user study brings more influencing factors into research on the user experience of mobile video, and these influences are further supported by users' opinions. Second, the proposed strategies, user-driven acceptance threshold adaptation and user experience prediction, will be valuable in mobile video delivery for optimizing user experience.
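The abstract names the acceptance-threshold strategy but does not detail it, so the following is only a hypothetical sketch: a per-user bitrate threshold nudged down after an accepted clip and up after a rejected one. The function name, step sizes and units are assumptions, not the paper's method.

```python
# Hypothetical sketch of user-driven acceptance threshold adaptation:
# a per-user encoding-bitrate threshold is lowered after an accepted clip
# (to probe whether a cheaper encoding is still acceptable) and raised
# after a rejection. All names and values here are illustrative.

def update_threshold(threshold_kbps, accepted, step_kbps=50, floor_kbps=100):
    """Adapt the acceptance threshold from one binary acceptance judgement."""
    if accepted:
        return max(floor_kbps, threshold_kbps - step_kbps)
    return threshold_kbps + step_kbps

threshold = 600  # initial acceptance threshold in kbps (assumed)
for accepted in [True, True, False, True]:  # simulated user feedback
    threshold = update_threshold(threshold, accepted)
print("Adapted acceptance threshold (kbps):", threshold)
```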
Abstract:
The pervasiveness of technology in the 21st century has meant that adults and children live in a society where digital devices are integral to everyday life and to participation in society. How we communicate, learn, work, entertain ourselves, and even shop is influenced by technology. Therefore, before children begin school they are potentially exposed to a range of learning opportunities mediated by digital devices. These devices include microwaves, mobile phones, computers, game consoles such as PlayStations®, and iPods®. In Queensland preparatory classrooms and in the homes of these children, teachers and parents support and scaffold young children's experiences, providing them with access to a range of tools that promote learning and provide entertainment. This paper examines teachers' and parents' perspectives and considers whether they are techno-optimists, who advocate for and promote the inclusion of digital technology, or techno-pessimists, who prefer to exclude digital devices from young children's everyday experiences. An exploratory, single case study design was used to gather data from three teachers and ten parents of children in the preparatory year. Teacher data were collected through interviews and email correspondence. Parent data were collected from questionnaires and focus groups. All parents who responded to the research invitation were mothers. The results of the data analysis identified a misalignment among the adults' perspectives. Teachers were identified as techno-optimists and parents were identified as techno-pessimists, with further emergent themes particular to each category being established. This is concerning because both teachers and mothers influence young children's experiences and numeracy knowledge; thus, a shared understanding and a common commitment to supporting young children's use of technology would be beneficial. Further research must investigate fathers' perspectives of digital devices and the beneficial and detrimental roles that a range of digital devices, tools, and entertainment gadgets play in 21st century children's lives.
Abstract:
This thesis investigates profiling and differentiating customers through the use of statistical data mining techniques. The business application of our work centres on examining individuals' seldom studied yet critical consumption behaviour over an extensive time period within the context of the wireless telecommunication industry; consumption behaviour (as opposed to purchasing behaviour) is behaviour that has been performed so frequently that it becomes habitual and involves minimal intention or decision making. The key variables investigated are the timestamp at which an activity is initiated and the cell tower location, as well as the activity type and usage quantity (e.g., a voice call with its duration in seconds); the research focuses on customers' spatial and temporal usage behaviour. The main methodological emphasis is on the development of clustering models based on Gaussian mixture models (GMMs), which are fitted using the recently developed variational Bayesian (VB) method. VB is an efficient deterministic alternative to the popular but computationally demanding Markov chain Monte Carlo (MCMC) methods. The standard VB-GMM algorithm is extended by allowing component splitting, so that it is robust to initial parameter choices and can automatically and efficiently determine the number of components. The new algorithm we propose allows more effective modelling of individuals' highly heterogeneous and spiky spatial usage behaviour, or, more generally, human mobility patterns; the term spiky describes data patterns with large areas of low probability mixed with small areas of high probability. Customers are then characterised and segmented based on the fitted GMM, which captures how each of them uses the products/services spatially in their daily lives; this essentially reflects their likely lifestyle and occupational traits. Other significant research contributions include fitting GMMs to circular data (i.e., the temporal usage behaviour) using VB, and developing clustering algorithms suitable for high-dimensional data based on the use of VB-GMM.
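As a minimal sketch of the general approach (not the thesis's split-VB extension), the snippet below fits a variational Bayesian GMM to simulated two-dimensional location data using scikit-learn's BayesianGaussianMixture, which prunes surplus components through its Dirichlet prior rather than by component splitting. The data, priors and thresholds are illustrative assumptions.

```python
# Minimal sketch: fitting a variational Bayesian GMM to 2-D location data
# (simulated stand-in for cell-tower coordinates). scikit-learn's
# BayesianGaussianMixture is a standard VB implementation; unused components
# are shrunk towards zero weight by the Dirichlet prior.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated "spiky" usage locations: a few tight clusters of activity
locations = np.vstack([
    rng.normal(loc=c, scale=0.01, size=(200, 2))
    for c in [(153.02, -27.47), (153.06, -27.50), (152.98, -27.53)]
])

vb_gmm = BayesianGaussianMixture(
    n_components=10,                 # generous upper bound; VB shrinks unused weights
    weight_concentration_prior=1e-2,
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(locations)

# Components with non-negligible weight characterise a customer's habitual
# spatial usage; the weights and means can feed the segmentation step.
active = vb_gmm.weights_ > 0.01
print("Estimated number of spatial components:", active.sum())
print("Component means:\n", vb_gmm.means_[active])
```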
Abstract:
This study is conducted within the IS-Impact Research Track at Queensland University of Technology (QUT). The goal of the IS-Impact Track is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable et al., 2006). IS-Impact is defined as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (Gable, Sedera and Chan, 2008). Track efforts have yielded the bicameral IS-Impact measurement model: the "impact" half includes the Organizational-Impact and Individual-Impact dimensions, and the "quality" half includes the System-Quality and Information-Quality dimensions. The IS-Impact model, by design, is intended to be robust, simple and generalizable, and to yield results that are comparable across time, stakeholders, different systems and system contexts. The model and measurement approach employ perceptual measures and an instrument that is relevant to key stakeholder groups, thereby enabling the combination or comparison of stakeholder perspectives. Such a validated and widely accepted IS-Impact measurement model has both academic and practical value. It facilitates systematic operationalization of a main dependent variable in research (IS-Impact), which can also serve as an important independent variable. For IS management practice, it provides a means to benchmark and track the performance of information systems in use. The objective of this study is to develop a Mandarin-version IS-Impact model, encompassing a list of China-specific IS-Impact measures, to aid a better understanding of the IS-Impact phenomenon in a Chinese organizational context. The IS-Impact model provides much-needed theoretical guidance for this investigation of Enterprise Systems (ES) and ES impacts in a Chinese context. The appropriateness and soundness of employing the IS-Impact model as a theoretical foundation are evident: the model originated from a sound theory of IS success (1992), was developed through rigorous validation, and was derived in the context of Enterprise Systems. Based on the IS-Impact model, this study investigates a number of research questions (RQs). Firstly, the research investigates what essential impacts have been derived from ES by Chinese users and organizations [RQ1]. Secondly, we investigate which salient quality features of ES are perceived by Chinese users [RQ2]. Thirdly, we seek to answer whether the quality and impact measures are sufficient to assess ES success in general [RQ3]. Lastly, the study attempts to address whether the IS-Impact measurement model is appropriate for Chinese organizations in terms of evaluating their ES [RQ4]. An open-ended, qualitative identification survey was employed in the study. A large body of short text data was gathered from 144 Chinese users, and 633 valid IS-Impact statements were generated from the data set. A general inductive approach was applied in the qualitative data analysis. Rigorous qualitative data coding resulted in 50 first-order categories and 6 second-order categories grounded in the context of Chinese organizations. The six second-order categories are: 1) System Quality; 2) Information Quality; 3) Individual Impacts; 4) Organizational Impacts; 5) User Quality; and 6) IS Support Quality.
The final research finding of the study is a contextualized Mandarin-version IS-Impact measurement model that includes 38 measures organized into 4 dimensions: System Quality, Information Quality, Individual Impacts and Organizational Impacts. The study also proposes two conceptual models to harmonize the IS-Impact model with the two emergent constructs, User Quality and IS Support Quality, by drawing on the prior IS effectiveness literature and on the Work System theory proposed by Alter (1999), respectively. The study is significant as it is the first effort to empirically and comprehensively investigate IS-Impact in China. Specifically, the research contributions can be classified into theoretical contributions and practical contributions. From the theoretical perspective, the study uses qualitative evidence to test and consolidate the IS-Impact measurement model in terms of its robustness, completeness and generalizability. The research design is unconventional: the theoretical model is not imposed top-down as an a priori framework seeking evidence of its credibility; rather, the study allows a competing model to emerge from the bottom-up, open-coding analysis. The study is also an example of extending and localizing a pre-existing theory developed in a Western context when that theory is introduced to a different context. From the practical perspective, this is the first time that prominent research findings from the field of IS success have been introduced to Chinese academia and practitioners. The study provides a guideline for Chinese organizations to assess their Enterprise Systems and to leverage their IT investments in the future. As a research effort in the ITPS track, this study provides the research team with an alternative operationalization of the dependent variable. Future research can adopt the contextualized Mandarin-version IS-Impact framework as an a priori theoretical model and further test its validity quantitatively and empirically.
Abstract:
This paper provides a fundamental understanding of the use of cumulative plots for travel time estimation on signalized urban networks. Analytical modeling is performed to generate cumulative plots based on the availability of data: a) Case-D, for detector data only; b) Case-DS, for detector data and signal timings; and c) Case-DSS, for detector data, signal timings and saturation flow rate. The empirical study and a sensitivity analysis based on simulation experiments show consistent performance for Case-DS and Case-DSS, whereas the performance of Case-D is inconsistent. Case-D is sensitive to the detection interval and to the signal timings within the interval: when the detection interval is an integral multiple of the signal cycle, accuracy and reliability are low, whereas for a detection interval of around 1.5 times the signal cycle, both accuracy and reliability are high.
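As a minimal illustration of the underlying idea (not the paper's Case-D/DS/DSS models), the sketch below builds cumulative count curves from hypothetical upstream and downstream detector counts and reads travel times off the horizontal gap between the curves, assuming first-in-first-out flow. The counts and interval length are made up for illustration.

```python
# Minimal sketch: travel time as the horizontal gap between cumulative count
# curves built from hypothetical upstream and downstream detector counts.
import numpy as np

interval = 30.0  # detection interval in seconds (assumed)
upstream_counts = np.array([5, 8, 12, 10, 7, 6, 9, 11])     # vehicles in
downstream_counts = np.array([0, 4, 7, 11, 11, 8, 7, 10])   # vehicles out

t = np.arange(1, len(upstream_counts) + 1) * interval
cum_up = np.cumsum(upstream_counts)
cum_down = np.cumsum(downstream_counts)

def travel_times(t, cum_up, cum_down):
    """For each downstream cumulative count N, find when count N crossed the
    upstream detector; the time difference approximates travel time (FIFO)."""
    times = []
    for ti, n in zip(t, cum_down):
        if n == 0 or n > cum_up[-1]:
            continue
        t_up = np.interp(n, cum_up, t)  # time the n-th vehicle passed upstream
        times.append(ti - t_up)
    return np.array(times)

print("Estimated travel times (s):", np.round(travel_times(t, cum_up, cum_down), 1))
```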
Abstract:
Unstructured text data, such as emails, blogs, contracts, academic publications, organizational documents, transcribed interviews, and even tweets, are important sources of data in Information Systems research. Various forms of qualitative analysis of the content of these data exist and have revealed important insights. Yet, to date, these analyses have been hampered by the limitations of human coding of large data sets and by bias due to human interpretation. In this paper, we compare and combine two quantitative analysis techniques to demonstrate the capabilities of computational analysis for content analysis of unstructured text. Specifically, we seek to demonstrate how two quantitative analytic methods, viz., Latent Semantic Analysis and data mining, can aid researchers in revealing core content topic areas in large (or small) data sets and in visualizing how these concepts evolve, migrate, converge or diverge over time. We exemplify the complementary application of these techniques through an examination of a 25-year sample of abstracts from selected journals in the Information Systems, Management, and Accounting disciplines. Through this work, we explore the capabilities of the two computational techniques and show how they can be used to gather insights from a large corpus of unstructured text.
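As a minimal sketch of the Latent Semantic Analysis step (using scikit-learn, which the paper does not necessarily use), the snippet below builds a TF-IDF matrix over a few placeholder abstracts and applies a truncated SVD to expose latent concept dimensions. The corpus, the number of components and the printed term lists are purely illustrative.

```python
# Minimal sketch of LSA: TF-IDF followed by a truncated SVD that exposes
# latent concept dimensions in a (toy) set of abstracts. The paper's 25-year
# corpus of journal abstracts is replaced here with placeholder strings.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

abstracts = [
    "enterprise systems success measurement in organisations",
    "information quality and system quality drive user impact",
    "management accounting practices and organisational performance",
    "data mining of unstructured text for content analysis",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(abstracts)       # documents x terms

lsa = TruncatedSVD(n_components=2, random_state=0)
doc_topics = lsa.fit_transform(X)        # documents x latent concepts

terms = tfidf.get_feature_names_out()
for i, comp in enumerate(lsa.components_):
    top = comp.argsort()[-5:][::-1]      # highest-loading terms per concept
    print(f"Concept {i}:", [terms[j] for j in top])
```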
Abstract:
In this paper, we present a sequential Monte Carlo algorithm for Bayesian sequential experimental design applied to generalised non-linear models for discrete data. The approach is computationally convenient in that the information from newly observed data can be incorporated through a simple re-weighting step. We also consider a flexible parametric model for the stimulus-response relationship, together with a newly developed hybrid design utility that can produce more robust estimates of the target stimulus in the presence of substantial model and parameter uncertainty. The algorithm is applied to hypothetical clinical trial or bioassay scenarios. In the discussion, potential generalisations of the algorithm are suggested to possibly extend its applicability to a wide variety of scenarios.
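The re-weighting step is the computational core, so the following minimal sketch shows the idea for a simple logistic stimulus-response model with binary outcomes: particles drawn from the prior are importance re-weighted as each new observation arrives. The prior, the model and all numerical values are assumptions for illustration; the paper's flexible parametric model and hybrid design utility are not reproduced here.

```python
# Minimal sketch of the SMC re-weighting step for a sequential design with
# binary responses under a simple logistic stimulus-response model.
import numpy as np

rng = np.random.default_rng(1)
N = 5000
# Particles approximate the prior over (alpha, beta) of P(y=1|x) = logistic(alpha + beta*x)
particles = np.column_stack([rng.normal(0, 2, N), rng.normal(1, 1, N)])
weights = np.full(N, 1.0 / N)

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

def reweight(particles, weights, x, y):
    """Incorporate a new (stimulus x, binary response y) by importance re-weighting."""
    p = logistic(particles[:, 0] + particles[:, 1] * x)
    lik = p if y == 1 else 1.0 - p
    w = weights * lik
    return w / w.sum()

# A newly observed trial outcome updates the posterior approximation cheaply.
weights = reweight(particles, weights, x=0.5, y=1)
post_mean = weights @ particles
print("Posterior mean of (alpha, beta):", np.round(post_mean, 3))
```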
Abstract:
Modern technology can now generate large datasets over space and time. Such data typically exhibit high autocorrelation across all dimensions. The field trial data motivating the methods of this paper were collected to examine the behaviour of traditional cropping and to determine a cropping system that could maximise water use for grain production while minimising leakage below the crop root zone. They consist of moisture measurements made at 15 depths across 3 rows and 18 columns, in the lattice framework of an agricultural field. Bayesian conditional autoregressive (CAR) models are used to account for local site correlations. Conditional autoregressive models have not been widely used in analyses of agricultural data. This paper serves to illustrate the usefulness of these models in this field, along with the ease of implementation in WinBUGS, a freely available software package. The innovation is the fitting of separate conditional autoregressive models for each depth layer, the 'layered CAR model', while simultaneously estimating depth profile functions for each site treatment. Modelling interest also lay in how best to model the treatment effect depth profiles, and in the choice of neighbourhood structure for the spatial autocorrelation model. The favoured model fitted the treatment effects as splines over depth, treated depth, the basis for the regression model, as measured with error, and fitted CAR neighbourhood models by depth layer. It is hierarchical, with separate conditional autoregressive spatial variance components at each depth, and the fixed terms, which involve an errors-in-measurement model, treat depth errors as interval-censored measurement error. The Bayesian framework permits transparent specification and easy comparison of the various complex models considered.
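As a minimal sketch of the spatial building block (not the paper's WinBUGS implementation), the snippet below constructs a proper-CAR precision matrix for a single depth layer of the 3 x 18 lattice with first-order (rook) neighbours; the layered model fits a term like this, with its own variance component, at each of the 15 depths. The precision and dependence parameters are assumed values.

```python
# Minimal sketch: a proper CAR precision matrix Q = tau * (D - rho * W) for
# one depth layer of a 3 x 18 lattice with rook neighbours.
import numpy as np

rows, cols = 3, 18
n = rows * cols

def idx(r, c):
    return r * cols + c

W = np.zeros((n, n))
for r in range(rows):
    for c in range(cols):
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                W[idx(r, c), idx(rr, cc)] = 1.0

D = np.diag(W.sum(axis=1))      # number of neighbours per site
tau, rho = 1.0, 0.95            # precision and spatial dependence (assumed values)
Q = tau * (D - rho * W)         # proper CAR precision for this depth layer

# Q is positive definite for |rho| < 1, so one layer of spatial effects can be
# drawn from N(0, Q^{-1}) via the Cholesky factor.
L = np.linalg.cholesky(Q)
z = np.random.default_rng(0).normal(size=n)
spatial_effects = np.linalg.solve(L.T, z)
print("Lattice sites:", n, "| sampled effects range:",
      np.round(spatial_effects.min(), 2), "to", np.round(spatial_effects.max(), 2))
```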
Abstract:
Discrete Markov random field models provide a natural framework for representing images or spatial datasets. They model the spatial association present while providing a convenient Markovian dependency structure and strong edge-preservation properties. However, parameter estimation for discrete Markov random field models is difficult because of the complex form of the associated normalizing constant for the likelihood function. For large lattices, the reduced dependence approximation to the normalizing constant is based on the concept of performing computationally efficient and feasible forward recursions on smaller sublattices, which are then suitably combined to estimate the constant for the whole lattice. We present an efficient computational extension of the forward recursion approach for the autologistic model to lattices that have an irregularly shaped boundary and that may contain regions with no data; such lattices are typical in applications. Consequently, we also extend the reduced dependence approximation to these scenarios, enabling us to implement a practical and efficient non-simulation-based approach for spatial data analysis within the variational Bayesian framework. The methodology is illustrated through application to simulated data and example images. The supplemental materials include our C++ source code for computing the approximate normalizing constant and the simulation studies.
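To make the forward-recursion idea concrete, the sketch below computes the exact normalizing constant of a small autologistic (Ising-type) lattice by summing out one row at a time over all row configurations, in a transfer-matrix style. The reduced dependence approximation combines such recursions on sublattices of a large, irregular lattice, which this toy example does not attempt; the parameter values and lattice size are illustrative.

```python
# Minimal sketch of the forward recursion behind the normalizing constant:
# sum out one row of a small autologistic lattice at a time by enumerating
# its 2**cols configurations.
import itertools
import numpy as np

rows, cols = 5, 4          # keep cols small: 2**cols row states are enumerated
alpha, beta = 0.0, 0.4     # autologistic parameters (assumed values)

states = [np.array(s) for s in itertools.product([-1, 1], repeat=cols)]

def within_row_energy(s):
    return alpha * s.sum() + beta * np.sum(s[:-1] * s[1:])

def between_row_energy(s_prev, s):
    return beta * np.sum(s_prev * s)

# f[k] holds the partial sums over all configurations of rows processed so
# far that end with row configuration states[k].
f = np.array([np.exp(within_row_energy(s)) for s in states])
for _ in range(rows - 1):
    g = np.zeros_like(f)
    for k, s in enumerate(states):
        g[k] = np.exp(within_row_energy(s)) * np.sum(
            f * np.array([np.exp(between_row_energy(sp, s)) for sp in states])
        )
    f = g

log_normalizing_constant = np.log(f.sum())
print("log Z for the", rows, "x", cols, "lattice:", round(log_normalizing_constant, 3))
```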
Abstract:
Wavelet transforms (WTs) are a powerful tool for extracting localized variations in non-stationary signals, and their applications in traffic engineering have been introduced; however, these applications lack some important theoretical fundamentals. In particular, little guidance is provided on selecting an appropriate WT across potential transport applications. The research described in this paper contributes uniquely to the literature, first by describing a numerical experiment that demonstrates the shortcomings of commonly used data-processing techniques in traffic engineering (i.e., averaging, moving averaging, second-order differencing, the oblique cumulative curve, and the short-time Fourier transform). It then mathematically describes the WT's ability to detect singularities in traffic data. Next, the selection of a suitable WT for a particular research topic in traffic engineering is discussed in detail by objectively and quantitatively comparing candidate wavelets' performance using a numerical experiment. Finally, based on several case studies using both loop detector data and vehicle trajectories, it is shown that selecting a suitable wavelet largely depends on the specific research topic, and that the Mexican hat wavelet generally gives satisfactory performance in detecting singularities in traffic and vehicular data.
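As a minimal sketch of singularity detection with the Mexican hat wavelet (using PyWavelets and a synthetic signal rather than the paper's loop detector or trajectory data), the snippet below applies a continuous wavelet transform to an occupancy-like series containing one abrupt jump and locates the jump from the fine-scale coefficient magnitudes. The signal, scales and interval length are assumptions.

```python
# Minimal sketch: Mexican hat CWT applied to a synthetic "occupancy" series
# with an abrupt jump (e.g., onset of congestion); fine-scale coefficient
# magnitudes localise the singularity.
import numpy as np
import pywt

t = np.arange(600)                      # 600 x 20-s intervals (assumed)
occupancy = 8 + 0.5 * np.random.default_rng(2).normal(size=t.size)
occupancy[300:] += 25                   # abrupt change in the series

scales = np.arange(1, 33)
coeffs, _ = pywt.cwt(occupancy, scales, "mexh")

# Fine scales give the sharpest localisation of the jump.
energy = np.abs(coeffs[:5]).mean(axis=0)
print("Detected change near interval:", int(energy.argmax()))
```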
Abstract:
Monitoring environmental health is becoming increasingly important as human activity and climate change place greater pressure on global biodiversity. Acoustic sensors provide the ability to collect data passively, objectively and continuously across large areas for extended periods. While these factors make acoustic sensors attractive as autonomous data collectors, there are significant issues associated with large-scale data manipulation and analysis. We present our current research into techniques for analysing large volumes of acoustic data efficiently. We provide an overview of a novel online acoustic environmental workbench and discuss a number of approaches to scaling the analysis of acoustic data: online collaboration, manual analysis, automatic analysis, and human-in-the-loop analysis.