896 results for pacs: information technology application
Abstract:
This paper describes the development and use of personas, a Human Computer Interaction (HCI) research methodology, within the STIMulate peer learning program, in order to better understand student behaviour patterns and motivations. STIMulate is a support-for-learning program at the Queensland University of Technology (QUT) in Brisbane, Australia. The program provides assistance in mathematics, science and information technology (IT) for coursework students. A STIMulate space is provided for students to study and obtain one-on-one assistance from Peer Learning Facilitators (PLFs), who are experienced students who have excelled in relevant subject areas. This paper describes personas, archetypal users that represent the motivations and behavioural patterns of students who utilise STIMulate (particularly the IT stream). The personas were developed based on interviews with PLFs, and subsequently validated by a PLF focus group. Seven different personas were developed. The personas enable us to better understand the characteristics of the students utilising the STIMulate program. The research provides a clearer picture of visiting students' motivations and behavioural patterns. This has helped us identify gaps in the services provided and become more aware of our assumptions about students. The personas have been deployed in PLF training programs to help PLFs provide a better service to students. The findings suggest further study of the resonances between some students and PLFs, which we would like to elicit more fully.
Abstract:
Organisations use Enterprise Architecture (EA) to reduce organisational complexity, improve communication, align business and information technology (IT), and drive organisational change. Due to the dynamic nature of environmental and organisational factors, EA descriptions need to change over time to keep providing value for their stakeholders. Emerging business and IT trends, such as Service-Oriented Architecture (SOA), may impact EA frameworks, methodologies, governance and tools. However, the phenomenon of EA evolution is still poorly understood. Using Archer's morphogenetic theory as a foundation, this research conceptualises three analytical phases of EA evolution in organisations, namely conditioning, interaction and elaboration. Based on a case study with a government agency, this paper provides new empirically and theoretically grounded insights into EA evolution, in particular in relation to the introduction of SOA, and describes relevant generative mechanisms affecting EA evolution. In doing so, it builds a foundation for further examining the impact of other IT trends, such as mobile or cloud-based solutions, on EA evolution. At a practical level, the research delivers a model that can guide professionals in managing EA and continually evolving it.
Abstract:
This poster presents key features of how QUT's integrated research data storage and management services work with researchers through their own individual or team research life cycle. By understanding the characteristics of research data, and the long-term need to store this data, QUT has provided resources and tools that support QUT's goal of being a research-intensive institution. Key to successful delivery and operation has been the focus upon researchers' individual needs and the collaboration between providers, in particular Information Technology Services, High Performance Computing and Research Support, and QUT Library. QUT's Research Data Storage service provides all QUT researchers (staff and Higher Degree Research students (HDRs)) with a secure data repository throughout the research data lifecycle. Three distinct storage areas provide for raw research data to be acquired, project data to be worked on, and published data to be archived. Since the service was launched in late 2014, it has provided research project teams from all QUT faculties with acquisition, working or archival data space. Feedback indicates that the storage suits the unique needs of researchers and their data. As part of the workflow to establish storage space for researchers, Research Support Specialists and Research Data Librarians consult with researchers and HDRs to identify data storage requirements for projects and individual researchers, and to select and implement the most suitable data storage services and facilities. While research can be a journey into the unknown [1], a plan can help navigate the uncertainty. Intertwined with the storage provision is QUT's Research Data Management Planning tool. Launched in March 2015, it has already attracted 273 QUT staff and 352 HDR student registrations, and over 620 plans have been created (as of 2/10/2015). Developed in collaboration with the Office of Research Ethics and Integrity (OREI), uptake of the plan has exceeded expectations.
Abstract:
In this paper, a new high-precision, focused word sense disambiguation (WSD) approach is proposed, which not only attempts to identify the proper sense for a word but also provides a probabilistic evaluation of the identification confidence at the same time. A novel Instance Knowledge Network (IKN) is built to generate and maintain semantic knowledge at the word, type synonym set and instance levels. Related algorithms based on graph matching are developed to train IKN with probabilistic knowledge and to use IKN for probabilistic word sense disambiguation. Based on the Senseval-3 all-words task, we run extensive experiments to show the performance enhancements in different precision ranges and the soundness of probability-based automatic confidence evaluation of disambiguation. We combine our WSD algorithm with each of the five best-performing WSD algorithms in the Senseval-3 all-words task. The results show that every combined algorithm outperforms the corresponding original algorithm.
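One plausible reading of the combination step is a confidence-gated fallback: use the IKN-based sense when its probabilistic confidence is high, and defer to the other system otherwise. The sketch below is a hypothetical illustration of that idea only; the function names and the threshold are assumptions, not the paper's algorithm.

```python
# Hypothetical sketch: combining a confidence-aware WSD system with a
# baseline system. The IKN-style disambiguator is assumed to return
# (sense, confidence); when confidence is low, we defer to the baseline.
# Names and the threshold value are illustrative, not from the paper.

def combine_wsd(word, context, probabilistic_wsd, baseline_wsd, threshold=0.8):
    sense, confidence = probabilistic_wsd(word, context)
    if sense is not None and confidence >= threshold:
        return sense                         # trust the high-precision, focused system
    return baseline_wsd(word, context)       # otherwise fall back to the baseline
```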
Abstract:
Rigid security boundaries hinder the proliferation of eHealth. Through active audit logs, accountable-eHealth systems alleviate privacy concerns and enhance information availability.
Abstract:
Statistical reports of SMEs' Internet usage from various countries indicate steady growth. However, deeper investigation of SMEs' e-commerce adoption and usage reveals that a number of SMEs fail to realize the full potential of e-commerce. Factors such as a lack of Information Systems and Information Technology tools and models suited to SMEs, and a lack of technical expertise and specialized knowledge within and outside the SME, have the greatest effect. This study addresses these two factors in two steps. First, it introduces a conceptual tool for intuitive interaction. Second, it explains the implementation process of the conceptual tool through a case study. The subject chosen for the case study is a real estate SME from India; the study captured the design and development process of the SME's website over a period of four months. Results indicated specific benefits for web designers and SME business owners. Results also indicated that the conceptual tool is easy to use without the need for technical expertise or specialized knowledge.
Abstract:
Background: Prescribing is a complex task, requiring specific knowledge and skills and the execution of effective, context-specific clinical reasoning. Systematic reviews indicate medical prescribing errors have a median rate of 7% [IQR 2%-14%] of medication orders [1-3]. For podiatrists pursuing prescribing rights, a clear need exists to ensure practitioners develop a well-defined set of prescribing skills, which will contribute to competent, safe and appropriate practice. Aim: To investigate the methods employed to teach and assess the principles of effective prescribing in the undergraduate podiatry program, and to compare and contrast these findings with those of four other non-medical professions that undertake prescribing after training at Queensland University of Technology. Method: The NPS National Prescribing Competency Standards were employed as the prescribing standard. A curriculum mapping exercise was undertaken to determine whether the prescribing principles articulated in the competency standards were addressed by each profession. Results: A range of methods is currently utilised to teach prescribing across disciplines. Application of prescribing competencies to the context of each profession appears to influence the teaching methods used. Most competencies were taught using a multimodal format, including interactive lectures, self-directed learning, tutorial sessions and clinical placement. In particular, clinical training was identified as the most consistent form of educating safe prescribers across all five disciplines. Assessment of prescribing competency utilised multiple techniques, including written and oral examinations, research tasks, case studies, objective structured clinical examination exercises and the assessment of clinical practice. Effective and reliable assessment of prescribing undertaken by students in diverse settings, e.g. the clinical practice environment, remains challenging. Conclusion: Recommendations were made to refine curricula and to promote efficient cross-discipline teaching by staff from the disciplines of podiatry, pharmacy, nurse practitioner, optometry and paramedic science. Students now experience a sophisticated level of multidisciplinary learning in the clinical setting, which integrates the expertise and skills of experienced prescribers with innovative information technology platforms (CCTV and live patient assessments). Further work is required to establish a practical, effective approach to the assessment of prescribing competence, especially between the university and clinical settings.
Abstract:
This thesis evaluates the effectiveness of the prescribed design and distribution requirements of the Australian Government's home loan key facts sheets (KFS), which are aimed at helping borrowers compare loan costs. The findings show that, although KFS effectively improve borrower decision-making, few borrowers were aware of their existence and function. It was also demonstrated that KFS have had limited market impact over the four-year window since their introduction, likely because lenders need not provide a KFS unless a borrower formally requests one. Recommendations include transferring the burden of disclosure to lenders in the first instance to address this information gap.
Abstract:
Glaucoma is the second leading cause of blindness worldwide. Often, glaucomatous damage to the optic nerve head (ONH) and ONH changes occur prior to visual field loss and are observable in vivo. Thus, digital image analysis is a promising choice for detecting the onset and/or progression of glaucoma. In this paper, we present a new framework for detecting glaucomatous changes in the ONH of an eye using the method of proper orthogonal decomposition (POD). A baseline topograph subspace was constructed for each eye to describe the structure of the ONH of the eye at a reference/baseline condition using POD. Any glaucomatous changes in the ONH of the eye present during a follow-up exam were estimated by comparing the follow-up ONH topography with its baseline topograph subspace representation. Image correspondence measures of L1-norm and L2-norm, correlation, and image Euclidean distance (IMED) were used to quantify the ONH changes. An ONH topographic library built from the Louisiana State University Experimental Glaucoma study was used to evaluate the performance of the proposed method. The area under the receiver operating characteristic curve (AUC) was used to compare the diagnostic performance of the POD-induced parameters with the parameters of the topographic change analysis (TCA) method. The IMED and L2-norm parameters in the POD framework provided the highest AUCs of 0.94 at the 10° field of imaging and 0.91 at the 15° field of imaging, compared to the TCA parameters with AUCs of 0.86 and 0.88, respectively. The proposed POD framework captures instrument measurement variability and inherent structure variability, and shows promise for improving our ability to detect glaucomatous change over time in glaucoma management.
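A minimal sketch, in NumPy, of the POD-style change detection described above: a baseline subspace is spanned by the leading singular vectors of the baseline topographies, and change at follow-up is quantified by the residual of the projection onto that subspace. The function names, the use of SVD, and the L2-norm residual are illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the authors' code): build a baseline ONH subspace via SVD and
# quantify follow-up change with an L2-norm image-correspondence measure.
import numpy as np

def baseline_subspace(baseline_maps, n_modes=None):
    # baseline_maps: (n_exams, n_pixels) matrix of flattened baseline topographies
    X = np.asarray(baseline_maps, dtype=float)
    U, s, Vt = np.linalg.svd(X.T, full_matrices=False)  # columns of U span the baseline subspace
    k = n_modes or len(s)
    return U[:, :k]

def l2_change(followup_map, basis):
    f = np.asarray(followup_map, dtype=float).ravel()
    f_hat = basis @ (basis.T @ f)        # projection onto the baseline subspace
    return np.linalg.norm(f - f_hat)     # residual L2-norm as a change measure
```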
Abstract:
Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used, because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to proper subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
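As an illustration of the covering relation around which such routing tables are organised, the sketch below models filters as per-attribute value intervals and checks whether one filter covers another (i.e. matches everything the other matches). This simplified model is an assumption for illustration, not the thesis implementation.

```python
# Illustrative sketch of the covering relation for interval-valued content
# filters: filter A covers filter B if every event matching B also matches A.
# Filters are dicts mapping attribute names to (low, high) intervals; an
# unconstrained attribute matches anything. Simplified model, not thesis code.

def covers(a, b):
    # every attribute constrained by A must be constrained at least as
    # tightly by B, with B's interval contained in A's interval
    for attr, (a_lo, a_hi) in a.items():
        if attr not in b:
            return False                  # B admits values that A would reject
        b_lo, b_hi = b[attr]
        if b_lo < a_lo or b_hi > a_hi:
            return False
    return True

# A covered filter need not be forwarded upstream, which keeps routing tables
# small; dynamic filter merging generalises the same idea.
print(covers({"price": (0, 100)}, {"price": (10, 50)}))  # True
```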
Abstract:
Sensor networks represent an attractive tool to observe the physical world. Networks of tiny sensors can be used to detect a fire in a forest, to monitor the level of pollution in a river, or to check on the structural integrity of a bridge. Application-specific deployments of static-sensor networks have been widely investigated. Commonly, these networks involve a centralized data-collection point and no sharing of data outside the organization that owns it. Although this approach can accommodate many application scenarios, it significantly deviates from the pervasive computing vision of ubiquitous sensing where user applications seamlessly access anytime, anywhere data produced by sensors embedded in the surroundings. With the ubiquity and ever-increasing capabilities of mobile devices, urban environments can help give substance to the ubiquitous sensing vision through Urbanets, spontaneously created urban networks. Urbanets consist of mobile multi-sensor devices, such as smart phones and vehicular systems, public sensor networks deployed by municipalities, and individual sensors incorporated in buildings, roads, or daily artifacts. My thesis is that "multi-sensor mobile devices can be successfully programmed to become the underpinning elements of an open, infrastructure-less, distributed sensing platform that can bring sensor data out of their traditional closed-loop networks into everyday urban applications". Urbanets can support a variety of services ranging from emergency and surveillance to tourist guidance and entertainment. For instance, cars can be used to provide traffic information services to alert drivers to upcoming traffic jams, and phones to provide shopping recommender services to inform users of special offers at the mall. Urbanets cannot be programmed using traditional distributed computing models, which assume underlying networks with functionally homogeneous nodes, stable configurations, and known delays. In contrast, Urbanets have functionally heterogeneous nodes, volatile configurations, and unknown delays. Instead, solutions developed for sensor networks and mobile ad hoc networks can be leveraged to provide novel architectures that address Urbanet-specific requirements, while providing useful abstractions that hide the network complexity from the programmer. This dissertation presents two middleware architectures that can support mobile sensing applications in Urbanets. Contory offers a declarative programming model that views Urbanets as a distributed sensor database and exposes an SQL-like interface to developers. Context-aware Migratory Services provides a client-server paradigm, where services are capable of migrating to different nodes in the network in order to maintain a continuous and semantically correct interaction with clients. Compared to previous approaches to supporting mobile sensing urban applications, our architectures are entirely distributed and do not assume constant availability of Internet connectivity. In addition, they allow on-demand collection of sensor data with the accuracy and at the frequency required by every application. These architectures have been implemented in Java and tested on smart phones. They have proved successful in supporting several prototype applications, and experimental results obtained in ad hoc networks of phones have demonstrated their feasibility with reasonable performance in terms of latency, memory, and energy consumption.
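To make the SQL-like, declarative style concrete, the sketch below shows the kind of query an application might issue against an Urbanet viewed as a distributed sensor database. The query syntax and the helper function are hypothetical stand-ins, not Contory's actual API.

```python
# Purely illustrative sketch of a declarative, SQL-like sensor query in the
# spirit of the programming model described above; syntax and helper are
# hypothetical, not Contory's real interface.

QUERY = """
    SELECT temperature, location
    FROM urban_sensors
    WHERE accuracy > 0.8
    SAMPLE PERIOD 30s FOR 10min
"""

def submit(query, middleware):
    # Hypothetical dispatcher: the middleware would decide whether the query
    # is answered by local sensors, nearby devices reached over the ad hoc
    # network, or remote nodes, hiding that choice from the application.
    return middleware.execute(query)
```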
Abstract:
Ubiquitous computing is about making computers and computerized artefacts a pervasive part of our everyday lives, bringing more and more activities into the realm of information. The computationalization and informationalization of everyday activities increase not only our reach, efficiency and capabilities but also the amount and kinds of data gathered about us and our activities. In this thesis, I explore how information systems can be constructed so that they handle this personal data in a reasonable manner. The thesis provides two kinds of results: on the one hand, tools and methods for both the construction and the evaluation of ubiquitous and mobile systems; on the other hand, an evaluation of the privacy aspects of a ubiquitous social awareness system. The work emphasises real-world experiments as the most important way to study privacy. Additionally, the state of current information systems as regards data protection is studied. The tools and methods in this thesis consist of three distinct contributions. An algorithm for locationing in cellular networks is proposed that does not require the location information to be revealed beyond the user's terminal. A prototyping platform for the creation of context-aware ubiquitous applications, called ContextPhone, is described and released as open source. Finally, a set of methodological findings for the use of smartphones in social scientific field research is reported. A central contribution of this thesis is the pragmatic tools that allow other researchers to carry out experiments. The evaluation of the ubiquitous social awareness application ContextContacts covers both the usage of the system in general and an analysis of its privacy implications. Based on several long-term field studies, the usage of the system is analyzed in the light of how users make inferences about others from real-time contextual cues mediated by the system. The analysis of privacy implications draws together the social psychological theory of self-presentation and research on privacy for ubiquitous computing, deriving a set of design guidelines for such systems. The main findings from these studies can be summarized as follows. The fact that ubiquitous computing systems gather more data about users can be used not only to study the use of such systems in an effort to create better systems, but also to study previously unstudied phenomena, such as the dynamic change of social networks. Systems that let people create new ways of presenting themselves to others can be fun for the users, but self-presentation requires several thoughtful design decisions that allow the manipulation of the image mediated by the system. Finally, the growing amount of computational resources available to users can be used to allow them to use the data themselves, rather than just being passive subjects of data gathering.
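As a hedged illustration of the privacy property claimed for the locationing algorithm (the location estimate never leaves the terminal), the sketch below computes a weighted-centroid position entirely on the device from observed cells and a locally stored cell database. This stand-in technique and its names are assumptions, not the thesis's actual algorithm.

```python
# Illustrative only: terminal-side localization in which no measurement or
# position estimate is sent off the device. Weighted centroid is a stand-in
# for the thesis's (unspecified here) locationing method.

def locate_on_terminal(observed_cells, local_cell_db):
    """observed_cells: {cell_id: weight}, weight being a nonnegative
    signal-quality measure; local_cell_db: {cell_id: (lat, lon)} stored
    locally on the device."""
    lat = lon = total = 0.0
    for cell_id, weight in observed_cells.items():
        if cell_id not in local_cell_db or weight <= 0:
            continue
        c_lat, c_lon = local_cell_db[cell_id]
        lat += weight * c_lat
        lon += weight * c_lon
        total += weight
    return (lat / total, lon / total) if total else None
```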
Abstract:
This thesis studies optimisation problems related to modern large-scale distributed systems, such as wireless sensor networks and wireless ad-hoc networks. The concrete tasks that we use as motivating examples are the following: (i) maximising the lifetime of a battery-powered wireless sensor network, (ii) maximising the capacity of a wireless communication network, and (iii) minimising the number of sensors in a surveillance application. A sensor node consumes energy both when it is transmitting or forwarding data, and when it is performing measurements. Hence task (i), lifetime maximisation, can be approached from two different perspectives. First, we can seek optimal data flows that make the most out of the energy resources available in the network; such optimisation problems are examples of so-called max-min linear programs. Second, we can conserve energy by putting redundant sensors into sleep mode; we arrive at the sleep scheduling problem, in which the objective is to find an optimal schedule that determines when each sensor node is asleep and when it is awake. In a wireless network simultaneous radio transmissions may interfere with each other. Task (ii), capacity maximisation, therefore gives rise to another scheduling problem, the activity scheduling problem, in which the objective is to find a minimum-length conflict-free schedule that satisfies the data transmission requirements of all wireless communication links. Task (iii), minimising the number of sensors, is related to the classical graph problem of finding a minimum dominating set. However, if we are not only interested in detecting an intruder but also in locating the intruder, it is not sufficient to solve the dominating set problem; formulations such as minimum-size identifying codes and locating dominating codes are more appropriate. This thesis presents approximation algorithms for each of these optimisation problems, i.e., for max-min linear programs, sleep scheduling, activity scheduling, identifying codes, and locating dominating codes. Two complementary approaches are taken. The main focus is on local algorithms, which are constant-time distributed algorithms. The contributions include local approximation algorithms for max-min linear programs, sleep scheduling, and activity scheduling. In the case of max-min linear programs, tight upper and lower bounds are proved for the best possible approximation ratio that can be achieved by any local algorithm. The second approach is the study of centralised polynomial-time algorithms in local graphs; these are geometric graphs whose structure exhibits spatial locality. Among other contributions, it is shown that while identifying codes and locating dominating codes are hard to approximate in general graphs, they admit a polynomial-time approximation scheme in local graphs.
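For concreteness, the generic form of a max-min linear program (maximise the worst-off objective subject to packing constraints, all coefficients nonnegative) can be written as below. The instantiation for lifetime maximisation, with data flows as variables and battery capacities behind the constraints, is a standard reading offered here as an illustration rather than the thesis's exact notation.

```latex
\begin{align*}
\text{maximise}\quad & \omega \\
\text{subject to}\quad & \sum_{v} c_{kv}\, x_v \;\ge\; \omega && \text{for each objective } k,\\
& \sum_{v} a_{iv}\, x_v \;\le\; 1 && \text{for each constraint } i,\\
& x_v \ge 0, \qquad a_{iv},\, c_{kv} \ge 0 .
\end{align*}
```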
Abstract:
In this paper we give the performance of an MQAM OFDM based WLAN in the presence of single- and multiple-channel Zigbee interference. An analytical model is given for obtaining the symbol error rate (SER) of the MQAM OFDM system in the presence of single- and multiple-channel Zigbee interference in AWGN and Rayleigh fading channels. Simulation results are compared with the analytical symbol error rate (SER) of the MQAM-OFDM system. For the analysis we have modeled the Zigbee interference using the power spectral density (PSD) of OQPSK modulation, finding the average interference power for each sub-carrier of the OFDM system and then averaging the SER over all WLAN sub-carriers. Simulations closely match the analytical models. It is seen from the simulation and analytical results that the performance of WLAN is severely affected by Zigbee interference. The symbol error rate (SER) for the 16QAM and 64QAM OFDM systems is of the order of 10^-2 at signal-to-interference ratios (SIR) of 20 dB and 30 dB, respectively, in the presence of a single Zigbee interferer inside the WLAN frequency band for the Rayleigh fading channel. For SIR values above 30 dB and 40 dB, the SER approaches the interference-free SER for the 16QAM and 64QAM OFDM systems, respectively.
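A sketch of the per-sub-carrier analysis outlined above, using the standard square M-QAM SER approximation in AWGN and treating the Zigbee interference on each sub-carrier as additional Gaussian noise. The notation (sub-carrier spacing \(\Delta f\), symbol energy \(E_s\), noise density \(N_0\), \(N\) sub-carriers) is introduced here for illustration and is not necessarily the paper's.

```latex
\begin{align*}
I_k &= \int_{f_k - \Delta f/2}^{\,f_k + \Delta f/2} S_{\mathrm{OQPSK}}(f)\,\mathrm{d}f,
      & \gamma_k &= \frac{E_s}{N_0 + I_k},\\[4pt]
P_{s,k} &\approx 4\Bigl(1 - \tfrac{1}{\sqrt{M}}\Bigr)\,
          Q\!\Bigl(\sqrt{\tfrac{3\,\gamma_k}{M-1}}\Bigr),
      & \bar{P}_s &= \frac{1}{N}\sum_{k=1}^{N} P_{s,k}.
\end{align*}
```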
Abstract:
Purpose: Knowledge management (KM) is important to the knowledge-intensive construction industry. The diversified and changing nature of work in this field warrants taking stock, identifying changes and mapping out a KM research framework for future exploration. Design/methodology/approach: The study involves three aspects. First, three stages of KM research in construction were distinguished in terms of the time distribution of 217 target publications. Major topics in these stages were extracted to understand the changes in research emphasis from an evolutionary perspective. Second, the past works were summed up in a three-dimensional research framework in terms of management organization, managerial methodology and approach, and managerial objective. Finally, potential future research orientations were predicted to expand the existing research framework. Findings: It was found that (1) KM research has blossomed significantly in the last two decades and has great potential; (2) the major topics of KM were changing in terms of technology, technique, organization, attributes of knowledge and research objectives; (3) past KM studies centred around management organization, managerial methodology and approach, and managerial objective, and thus a three-dimensional research framework was proposed; (4) within the research framework, team-level, project-level and firm-level KM were studied to achieve project, organizational and competitive objectives through integrated methodologies of information technology, social techniques and KM process tools; and (5) nine potential research orientations were predicted corresponding to the three dimensions. Finally, an expanded research framework was proposed to encourage and guide future research in this field. Research limitations/implications: The paper focused only on the construction industry. The findings need further exploration in order to discover any important research works that may have been missed because they were not published in English or fell outside the time period. Originality/value: The paper forms a systematic framework of KM research in construction and predicts potential research orientations. It provides much value for researchers who want to understand the past and the future of global KM research in the construction industry.