Abstract:
Information Technology (IT) is successfully applied in a diverse range of fields. Yet, although the field of Medical Informatics is more than three decades old, its progress has been slow compared to many other fields in which the application of IT is growing rapidly. Spending on IT in health care is rising sharply, but the road to successful use of IT in health care has not been easy. This paper discusses the barriers to the successful adoption of information technology in clinical environments and outlines the approaches used by various countries and organisations to tackle these issues. Investing financial and other resources to overcome the barriers to successful adoption of Health Information Technology (HIT) is essential to realise the vision of a future healthcare system in which every consumer has a secure, private Electronic Health Record (EHR) that is available whenever and wherever needed, enabling the highest degree of coordinated medical care based on the latest medical knowledge and evidence. The paper also reviews barriers to HIT arising from the alignment of organisations' leadership and stated values with their willingness to treat HIT as a determinant factor in their decision-making processes. The review concludes that organisational accountability and readiness are central to agreement on technology implementation.
Abstract:
Health Informatics is an intersection of information technology and several disciplines of medicine and health care. It sits at the common frontiers of health care services, including patient-centric, process-driven and procedure-centric care. From the information technology perspective it can be viewed as the application of computing to medical and health processes to deliver better health care solutions. In spite of exaggerated hype, this field is having a major impact on health care solutions, in particular health care delivery, decision making, medical devices and allied health care industries. It also affords enormous research opportunities for new methodological development. Despite the obvious connections between Medical Informatics, Nursing Informatics and Health Informatics, most of the methodologies and approaches used in Health Informatics have so far originated from health system management, care aspects and medical diagnostics. This paper explores the reasoning for a domain knowledge analysis that would establish Health Informatics as a domain and see it recognised as an intellectual discipline in its own right.
Abstract:
Availability of health information is rapidly increasing, and the expansion and proliferation of health information is inevitable. The Electronic Healthcare Record, Electronic Medical Record and Personal Health Record are at the core of this trend and are required for appropriate and practicable exchange and sharing of health information. However, it is becoming increasingly recognized that it is essential to preserve patient privacy and information security when utilising sensitive information for clinical, management and administrative processes. Furthermore, the usability of emerging healthcare applications is also becoming a growing concern. This paper proposes a novel approach for integrating consideration of information accountability with a perspective from usability engineering that can be applied when developing healthcare information technology applications. A social networking use case in healthcare information exchange is presented in the context of our approach.
Abstract:
These proceedings focus on various aspects of advances in future information and communication technology and its applications, present the latest issues and progress in the area, and are applicable to both researchers and professionals. They are based on the 2013 International Conference on Future Information & Communication Engineering (ICFICE 2013), held in Shenyang, China, from June 24-26, 2013. The conference is open to participants from all over the world, and participation from the Asia-Pacific region is particularly encouraged. The focus of this conference is on all technical aspects of electronics, information, and communications. ICFICE-13 will provide an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of FICE. In addition, the conference will publish high-quality papers which are closely related to the various theories and practical applications in FICE. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject. "This work was supported by the NIPA (National IT Industry Promotion Agency) of Korea Grant funded by the Korean Government (Ministry of Science, ICT & Future Planning)."
Abstract:
This paper is a work in progress that examines current consumer engagement with eHealth information through smartphones or tablets. We focus on three activity types: seeking, posting and 'other' engagement activity, and compare two age groups, 25-40s and 40-55s. Findings show that around 30% of the younger age group engage with Government and other health providers' websites, receive eHealth emails, and read other people's comments about health-related issues in online discussion groups, websites and blogs. Approximately 20% engage with Government and other health providers' social media and watch or listen to audio or video podcasts. For the older age group, the most active engagement with eHealth information is in the seeking category through Government or other health websites (approximately 15%), and less than 10% for social media sites. Their posting activity is less than 5%. For other activities, less than 15% of the older age group engage through receiving emails and reading blogs, less than 10% watch or listen to podcasts, and their online consulting activity is less than 7%. We note that scores are low for both groups in terms of engaging with eHealth information through Twitter.
Abstract:
Information accountability is seen as a mode of usage control on the Web. Due to its many dimensions, information accountability has been expressed in various ways by computer scientists to address security and privacy in recent times. Information accountability is focused on how users participate in a system and the underlying policies that govern the participation. Healthcare is a domain in which the principles of information accountability can be utilised well. Modern health information systems are Internet based and the discipline is called eHealth. In this paper, we identify and discuss the goals of accountability systems and present the principles of information accountability. We characterise those principles in eHealth and discuss them contextually. We identify the current impediments to eHealth in terms of information privacy issues of eHealth consumers together with information usage requirements of healthcare providers and show how information accountability can be used in a healthcare context to address these needs. The challenges of implementing information accountability in eHealth are also discussed in terms of our efforts thus far.
Abstract:
Effective management of chronic diseases is a global health priority. A healthcare information system offers opportunities to address the challenges of chronic disease management. However, the requirements of health information systems are often not well understood. The accuracy of requirements has a direct impact on the successful design and implementation of a health information system. Our research describes methods used to understand the requirements of health information systems for advanced prostate cancer management. We conducted a survey to identify heterogeneous sources of clinical records. Our research showed that the General Practitioner was the most common source of patients' clinical records (41%), followed by the Urologist (14%) and other clinicians (14%). Our research describes a method to identify diverse data sources and proposes a novel patient journey browser prototype that integrates disparate data sources.
Abstract:
A fear of imminent information overload predates the World Wide Web by decades. Yet, that fear has never abated. Worse, as the World Wide Web today takes the lion’s share of the information we deal with, both in amount and in time spent gathering it, the situation has only become more precarious. This chapter analyses new issues in information overload that have emerged with the advent of the Web, which emphasizes written communication, defined in this context as the exchange of ideas expressed informally, often casually, as in verbal language. The chapter focuses on three ways to mitigate these issues. First, it helps us, the users, to be more specific in what we ask for. Second, it helps us amend our request when we don't get what we think we asked for. And third, since only we, the human users, can judge whether the information received is what we want, it makes retrieval techniques more effective by basing them on how humans structure information. This chapter reports on extensive experiments we conducted in all three areas. First, to let users be more specific in describing an information need, they were allowed to express themselves in an unrestricted conversational style. This way, they could convey their information need as if they were talking to a fellow human instead of using the two or three words typically supplied to a search engine. Second, users were provided with effective ways to zoom in on the desired information once potentially relevant information became available. Third, a variety of experiments focused on the search engine itself as the mediator between request and delivery of information. All examples that are explained in detail have actually been implemented. The results of our experiments demonstrate how a human-centered approach can reduce information overload in an area that grows in importance with each day that passes. By actually having built these applications, I present an operational, not just aspirational approach.
Abstract:
This chapter introduces the reader to the relational approach to information literacy, its evolution and use in contemporary research, and emerging directions. It presents the relational approach, first introduced by Australian information literacy researchers, as an integration of experiential, contextual and transformational perspectives. The chapter opens with a reflection on the wider information literacy domain. It then addresses the development of the approach, its fundamental elements and characteristics, and explores the adoption of the approach in key contexts including education, workplace and community settings. The chapter explores significant studies that have contributed to its evolution and reflects on the impact of the development of the relational framework and related research. The chapter concludes with a focus on directions emerging from the relational understanding of information literacy and potential implications.
Abstract:
With the implementation of the Personally Controlled eHealth Records system (PCEHR) in Australia, shared Electronic Health Records (EHRs) are now a reality. However, the characteristic implicit in the PCEHR that puts the consumer (i.e. the patient) in control of managing his or her health information prevents healthcare professionals (HCPs) from utilising it as a one-stop shop for information in point-of-care decision making, as they cannot trust that a complete record of the consumer's health history is available to them through it. As a result, whilst it marks a major milestone in Australia's eHealth journey, the PCEHR does not reap the full benefits that such a shared EHR system can offer.
Abstract:
Many mature term-based or pattern-based approaches have been used in the field of information filtering to generate users' information needs from a collection of documents. A fundamental assumption of these approaches is that the documents in the collection are all about one topic. However, in reality users' interests can be diverse, and the documents in the collection often involve multiple topics. Topic modelling, such as Latent Dirichlet Allocation (LDA), was proposed to generate statistical models representing multiple topics in a collection of documents, and has been widely utilised in fields such as machine learning and information retrieval. But its effectiveness in information filtering has not been well explored. Patterns are generally thought to be more discriminative than single terms for describing documents. However, the enormous number of discovered patterns hinders them from being effectively and efficiently used in real applications; selecting the most discriminative and representative patterns from the huge number discovered therefore becomes crucial. To deal with the above-mentioned limitations and problems, this paper proposes a novel information filtering model, the Maximum matched Pattern-based Topic Model (MPBTM). The main distinctive features of the proposed model are: (1) user information needs are generated in terms of multiple topics; (2) each topic is represented by patterns; (3) patterns are generated from topic models and organised in terms of their statistical and taxonomic features; and (4) the most discriminative and representative patterns, called Maximum Matched Patterns, are used to estimate the relevance of a document to the user's information needs in order to filter out irrelevant documents. Extensive experiments using the TREC data collection Reuters Corpus Volume 1 evaluate the effectiveness of the proposed model. The results show that the proposed model significantly outperforms both state-of-the-art term-based models and pattern-based models.
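For concreteness, the term-based baseline that the abstract contrasts with topic modelling can be sketched in a few lines: build a term-weight profile from documents the user found relevant, then score incoming documents against it. This is an illustrative toy (whitespace tokenisation, term-frequency weights), not the MPBTM or the paper's actual baselines.

```python
# Toy term-based information filtering: a single term-weight profile
# stands in for the user's information need (illustrative only).
from collections import Counter

def build_profile(relevant_docs):
    """Term-frequency profile over the user's relevant documents."""
    counts = Counter(tok for doc in relevant_docs for tok in doc.lower().split())
    total = sum(counts.values())
    return {term: c / total for term, c in counts.items()}

def score(profile, doc):
    """Sum of profile weights of the document's tokens, length-normalised."""
    toks = doc.lower().split()
    return sum(profile.get(t, 0.0) for t in toks) / max(len(toks), 1)

profile = build_profile(["prostate cancer treatment", "cancer screening trial"])
relevant = score(profile, "new cancer treatment results")    # on-topic
irrelevant = score(profile, "football match report")         # off-topic
```

The single flat profile is exactly the "one topic" assumption the abstract criticises: a user interested in several distinct topics gets them blurred into one weight vector, which is the limitation topic models such as LDA address.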
Abstract:
The integration of separate, yet complementary, cortical pathways appears to play a role in visual perception and action when intercepting objects. The ventral system is responsible for object recognition and identification, while the dorsal system facilitates continuous regulation of action. This dual-system model implies that empirically manipulating different visual information sources during performance of an interceptive action might lead to the emergence of distinct gaze and movement pattern profiles. To test this idea, we recorded hand kinematics and eye movements of participants as they attempted to catch balls projected from a novel apparatus that synchronised or de-synchronised accompanying video images of a throwing action and ball trajectory. Results revealed that ball catching performance was less successful when patterns of hand movements and gaze behaviours were constrained by the absence of advanced perceptual information from the thrower's actions. Under these task constraints, participants began tracking the ball later, followed less of its trajectory, and adapted their actions by initiating movements later and moving the hand faster. There were no performance differences when the throwing action image and ball speed were synchronised or de-synchronised, since hand movements were closely linked to information from the ball trajectory. Results are interpreted relative to the two-visual-system hypothesis, demonstrating that accurate interception requires integration of advanced visual information from the kinematics of the throwing action and from the ball flight trajectory.
Abstract:
This study aimed to determine if systematic variation of the diagnostic terminology embedded within written discharge information (i.e., concussion or mild traumatic brain injury, mTBI) would produce different expected symptoms and illness perceptions. We hypothesized that compared to concussion advice, mTBI advice would be associated with worse outcomes. Sixty-two volunteers with no history of brain injury or neurological disease were randomly allocated to one of two conditions in which they read an mTBI vignette followed by information that varied only by use of the embedded term concussion (n = 28) or mTBI (n = 34). Both groups reported illness perceptions (timeline and consequences subscales of the Illness Perception Questionnaire-Revised) and expected Postconcussion Syndrome (PCS) symptoms 6 months post injury (Neurobehavioral Symptom Inventory, NSI). Statistically significant group differences due to terminology were found on selected NSI scores (i.e., total, cognitive and sensory symptom cluster scores; concussion > mTBI), but there was no effect of terminology on illness perception. When embedded in discharge advice, diagnostic terminology affects some but not all expected outcomes. Given that such expectations are a known contributor to poor mTBI outcome, clinicians should consider the potential impact of varied terminology on their patients.
Abstract:
Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth and from femtoseconds to the 4.5 Ga history of our planet. In this review we neglect electromagnetic modelling of processes in the Earth's core and focus on four types of couplings that underpin fundamental instabilities in the Earth. These are thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes, which are driven and controlled by the transfer of heat to the Earth's surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different lengths of the THMC diffusion yardsticks, spanning micrometres to tens of kilometres, compounded by the additional necessity to consider microstructure information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation).
Another challenge is the important factor of time, which implies that the geomaterial is often very far from initial yield and flows on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. Unlike consistency plasticity, this framework allows the description of both solid-mechanical and fluid-dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, in which ductile compaction bands appear out of the fluid-dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and much lower driving forces to emerge. These low-stress solutions cannot be obtained on short laboratory time scales and are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a m-wide ductile shear zone, while the centre of the thrust shows a mm-cm wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The m-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid-dynamic instability.
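For reference, the convective-instability criterion and the diffusion-length "yardstick" invoked above can be written in their textbook forms. These are the standard Rayleigh-Bénard expressions, not formulas specific to this review:

```latex
% Standard forms (assumptions: fluid layer of thickness h heated from below,
% thermal diffusivity \kappa, dynamic viscosity \mu; not specific to the review):
\[
  \mathrm{Ra} \;=\; \frac{\rho\, g\, \alpha\, \Delta T\, h^{3}}{\kappa\, \mu},
  \qquad
  \ell_{\mathrm{diff}} \;\sim\; \sqrt{\kappa\, \tau}
\]
% Convection sets in when Ra exceeds a critical value (about 1708 for
% rigid-rigid boundaries); \ell_{diff} is the diffusion length scale that
% sets the size of the emerging steady-state dissipative pattern.
```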
Abstract:
Moving cell fronts are an essential feature of wound healing, development and disease. The rate at which a cell front moves is driven, in part, by the cell motility, quantified in terms of the cell diffusivity $D$, and the cell proliferation rate $\lambda$. Scratch assays are a commonly-reported procedure used to investigate the motion of cell fronts, where an initial cell monolayer is scratched and the motion of the front is monitored over a short period of time, often less than 24 hours. The simplest way of quantifying a scratch assay is to monitor the progression of the leading edge. Leading edge data are very convenient since, unlike other methods, collecting them is nondestructive and does not require labeling, tracking or counting individual cells amongst the population. In this work we study short time leading edge data in a scratch assay using a discrete mathematical model and automated image analysis, with the aim of investigating whether such data allow us to reliably identify $D$ and $\lambda$. Using a naïve calibration approach where we simply scan the relevant region of the ($D$, $\lambda$) parameter space, we show that there are many choices of $D$ and $\lambda$ for which our model produces indistinguishable short time leading edge data. Therefore, without due care, it is impossible to estimate $D$ and $\lambda$ from this kind of data. To address this, we present a modified approach accounting for the fact that cell motility occurs over a much shorter time scale than proliferation. Using this information we divide the duration of the experiment into two periods: we estimate $D$ using data from the first period, and $\lambda$ using data from the second period.
We confirm the accuracy of our approach using in silico data and a new set of in vitro data, which shows that our method recovers estimates of $D$ and $\lambda$ that are consistent with previously-reported values, while being fast, inexpensive, nondestructive and avoiding the need for cell labeling and cell counting.
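The identifiability problem described in the abstract can be illustrated with a toy version of the naïve grid-scan calibration. The forward model below is a placeholder, the long-time Fisher-KPP travelling-wave speed $c = 2\sqrt{D\lambda}$, not the authors' discrete model; under this heuristic, leading-edge data constrain only the product $D\lambda$, so many ($D$, $\lambda$) pairs fit the data equally well:

```python
# Toy grid-scan calibration of (D, lambda) from leading-edge positions.
# Placeholder forward model: Fisher-KPP wave speed c = 2*sqrt(D*lambda).
import math

def edge_position(D, lam, t, x0=0.0):
    """Leading-edge position under the travelling-wave heuristic."""
    return x0 + 2.0 * math.sqrt(D * lam) * t

def sse(D, lam, times, observed):
    """Sum of squared errors between model and observed edge positions."""
    return sum((edge_position(D, lam, t) - x) ** 2
               for t, x in zip(times, observed))

# Synthetic "observed" data generated with D = 1000, lambda = 0.05
# (units arbitrary; hypothetical values for illustration).
times = [0.0, 6.0, 12.0, 18.0, 24.0]
observed = [edge_position(1000.0, 0.05, t) for t in times]

# Naive scan of the (D, lambda) grid: every pair with the same product
# D*lambda fits the leading-edge data equally well.
fits = {(D, lam): sse(D, lam, times, observed)
        for D in [250.0, 500.0, 1000.0, 2000.0]
        for lam in [0.025, 0.05, 0.1, 0.2]}
```

Here (250, 0.2), (500, 0.1) and (1000, 0.05) all give essentially zero error, which is the degeneracy that motivates the paper's two-period approach: estimate $D$ from the early, motility-dominated data and $\lambda$ from the later data.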