139 results for Healthcare architecture
Abstract:
Smart healthcare is a complex domain for systems integration, owing to the human and technical factors and the heterogeneous data sources involved. As part of a smart city, it is an area where clinical functions require smart multi-system collaboration for effective communication among departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many integration and interoperability challenges, such as information collision, heterogeneous data sources, policy obstacles, and procedure mismanagement. The purpose of this study is to analyse the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework to guide the integration. We selected an ongoing project at a local hospital for our case study. The project aims to achieve data sharing and interoperability among the Radiology Information System (RIS), the Electronic Patient Record (EPR), and the Picture Archiving and Communication System (PACS). Qualitative data collection and analysis methods were used. The data sources consisted of documentation (including publications and internal working papers), one year of non-participant observation, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists, and a secretary. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also develop a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding integration in the radiology context.
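To make the interoperability levels concrete, the minimal sketch below (with invented field names and concept codes) shows the kind of translation step such RIS-to-EPR integration involves: data interoperability aligns field names and types, while semantic interoperability maps local exam codes onto a shared vocabulary. The pragmatic level, which the study covers qualitatively, concerns whether the receiving department can act on the result.

```python
# A minimal sketch (hypothetical field names and codes) of a data- and
# semantic-level mapping between RIS and EPR records. The paper's pragmatic
# framework is qualitative; this only illustrates the kind of translation
# step such integration requires.

# Hypothetical local RIS procedure codes mapped to a shared vocabulary;
# the concept IDs here are invented for illustration.
RIS_TO_SHARED_CODE = {
    "XR-CHEST": "399208008",   # plain chest X-ray
    "CT-HEAD": "303653007",    # CT of head
}

def ris_to_epr(ris_record: dict) -> dict:
    """Translate a RIS exam record into the EPR's shared schema.

    Data interoperability: field names/types are aligned.
    Semantic interoperability: local codes map to shared concepts.
    """
    return {
        "patient_id": ris_record["pid"],                 # common identifier
        "procedure_code": RIS_TO_SHARED_CODE[ris_record["exam_code"]],
        "performed_at": ris_record["exam_datetime"],
        "report_status": ris_record["status"].lower(),   # normalise vocabulary
    }

if __name__ == "__main__":
    ris_record = {"pid": "P001", "exam_code": "XR-CHEST",
                  "exam_datetime": "2024-05-01T09:30", "status": "FINAL"}
    print(ris_to_epr(ris_record))
```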
Abstract:
The concept of being ‘patient-centric’ challenges many existing healthcare service provision practices. This paper focuses on the issue of referrals, where multiple stakeholders, i.e. general practitioners (GPs) and patients, are encouraged to make a consensual decision based on patient needs. We present an ontology-enabled healthcare service provision model, which facilitates patients and GPs in jointly deciding upon the referral. In this model, we define three types of profile, which represent different stakeholders’ requirements. The model also comprises a set of healthcare service discovery processes: articulating a service need, matching the need with the healthcare service offerings, and deciding on a best-fit service for acceptance. As a result, the healthcare service provision can carry out coherent analysis using personalised information and iterative processes that deal with requirements that change over time.
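As an illustration of the three-step discovery process described above, the following minimal sketch uses plain Python dictionaries in place of ontology concepts; all profile fields, service names, and scoring weights are invented for the example.

```python
# A minimal sketch of the three-step discovery process: articulate a need,
# match it against service offerings, and select a best-fit service. The
# paper's approach is ontology-based; dictionaries stand in for ontology
# concepts here, and the scoring rule is invented.

def match_score(need: dict, service: dict) -> float:
    """Score how well a service offering satisfies an articulated need."""
    score = 0.0
    if need["specialty"] == service["specialty"]:
        score += 1.0                                # hard requirement
    if service["max_wait_weeks"] <= need["max_wait_weeks"]:
        score += 0.5                                # patient preference
    if need["location"] in service["locations"]:
        score += 0.5                                # accessibility
    return score

def best_fit(need: dict, offerings: list[dict]) -> dict:
    """Step 3: decide on the best-fit service for acceptance."""
    return max(offerings, key=lambda s: match_score(need, s))

# Step 1: the patient and GP jointly articulate the referral need.
need = {"specialty": "cardiology", "max_wait_weeks": 4, "location": "north"}

# Step 2: candidate healthcare service offerings (invented examples).
offerings = [
    {"name": "Clinic A", "specialty": "cardiology",
     "max_wait_weeks": 6, "locations": ["north"]},
    {"name": "Clinic B", "specialty": "cardiology",
     "max_wait_weeks": 3, "locations": ["north", "south"]},
]

print(best_fit(need, offerings)["name"])  # Clinic B
```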
Abstract:
Changes in users’ requirements drive information system evolution. Such evolution in turn affects the atomic services that move a service composition from one state to another. A challenging issue in this evolution is ensuring that the resulting service composition remains rational. This paper presents the Service Composition Atomic-Operation Set (SCAOS) method. SCAOS defines two classes of atomic operations and 13 kinds of basic service compositions, and supports the state change process using Workflow Nets. The workflow net has algorithmic capabilities to compose the required services rationally and to keep any changed composition rational as well. The method can improve the adaptability of information systems to ever-changing business requirements in dynamic environments.
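The sketch below illustrates the underlying idea with a deliberately simplified stand-in: the composition is modelled as a directed graph rather than a Workflow Net, one atomic operation inserts a service between two existing ones, and a basic reachability check (every reachable service can still reach the end) stands in for the paper's rationality criterion. All names are illustrative.

```python
# A minimal sketch of the SCAOS idea: atomic operations change a service
# composition while a check keeps the result "rational". The real method
# uses Workflow Nets; this stand-in uses a plain directed graph and an
# invented soundness condition.

from collections import defaultdict

class Composition:
    def __init__(self):
        self.edges = defaultdict(set)  # service -> successor services

    def add_service(self, pred: str, service: str, succ: str) -> None:
        """Atomic operation: insert a service between two existing ones."""
        self.edges[pred].discard(succ)
        self.edges[pred].add(service)
        self.edges[service].add(succ)

    def reachable(self, start: str) -> set:
        """All services reachable from 'start' by following edges."""
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(self.edges[node])
        return seen

    def is_rational(self) -> bool:
        """Every service reachable from 'start' must still reach 'end'."""
        fwd = self.reachable("start")
        return all(n == "end" or "end" in self.reachable(n) for n in fwd)

c = Composition()
c.edges["start"].add("end")
c.add_service("start", "book_appointment", "end")      # evolve the composition
c.add_service("book_appointment", "notify_gp", "end")  # evolve it again
print(c.is_rational())  # True: the evolved composition is still well-formed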
Abstract:
Accident and Emergency (A&E) units provide a route for patients requiring urgent admission to acute hospitals. Public concern over long waiting times for admissions motivated this study, whose aim is to explore the factors which contribute to such delays. The paper discusses the formulation and calibration of a system dynamics model of the interaction of demand pattern, A&E resource deployment, other hospital processes and bed numbers; and the outputs of policy analysis runs of the model which vary a number of the key parameters. Two significant findings have policy implications. One is that while some delays to patients are unavoidable, reductions can be achieved by selective augmentation of resources within, and relating to, the A&E unit. The second is that reductions in bed numbers do not increase waiting times for emergency admissions, their effect instead being to increase sharply the number of cancellations of admissions for elective surgery. This suggests that basing A&E policy solely on any single criterion will merely succeed in transferring the effects of a resource deficit to a different patient group.
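A minimal simulation can illustrate the mechanism behind the second finding: when emergency admissions take priority for beds, cutting bed numbers shows up as cancelled elective operations rather than longer emergency waits. The sketch below uses invented parameters and a far cruder stock-flow structure than the paper's calibrated system dynamics model.

```python
# A minimal system-dynamics-style sketch (invented parameters) of the
# bed-allocation mechanism: emergencies are admitted first, electives take
# whatever capacity remains. This illustrates the mechanism only, not the
# paper's calibrated model.

def simulate(beds: int, days: int = 100, emergency_rate: int = 20,
             elective_rate: int = 10, stay_days: int = 5) -> dict:
    occupied = 0
    cancelled_electives = 0
    waiting_emergencies = 0
    for _ in range(days):
        occupied -= occupied // stay_days          # discharges free beds
        free = beds - occupied
        admitted_em = min(emergency_rate + waiting_emergencies, free)
        waiting_emergencies += emergency_rate - admitted_em
        free -= admitted_em
        admitted_el = min(elective_rate, free)     # electives get what's left
        cancelled_electives += elective_rate - admitted_el
        occupied += admitted_em + admitted_el
    return {"waiting_emergencies": waiting_emergencies,
            "cancelled_electives": cancelled_electives}

# Reducing beds leaves emergency waits flat but cancellations climb.
for beds in (160, 140, 120):
    print(beds, simulate(beds))
```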
Abstract:
Postnatal depression (PND) is associated with impairments in the mother–child relationship, and these impairments are themselves associated with adverse child outcomes. Thus, compared to the children of non-depressed mothers, children of mothers with PND are more likely to be insecurely attached, and to have externalising behaviour problems and poor cognitive development. Each of these three child outcomes is predicted by a particular pattern of difficulty in parenting: insecure attachment is related to maternal insensitivity, particularly in relation to infant distress and emotional vulnerability; externalising problems are particularly common in the context of hostile parenting; and poor cognitive development is related to parental difficulties in noticing infant signs of interest and supporting their engagement with the environment. This article sets out procedures for how parenting could be assessed in ways that are sensitive to the domain-specific associations between parenting and child outcome, while remaining sensitive to the child's developmental stage. This set of assessments requires field testing.
Abstract:
The idea of buildings in harmony with nature can be traced back to ancient times. Increasing concern for sustainability-oriented buildings has added new challenges to architectural design and called for new design responses. Sustainable design integrates and balances human geometries with natural ones. Since fractal geometry is the language of nature, it is natural to assume that it could play a role in developing new forms of aesthetics and sustainable architectural design. This paper gives a brief description of fractal geometry theory and presents its current status and recent developments through an illustrative review of fractal case studies in architectural design, providing a bridge between fractal geometry and architectural design.
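As a concrete taste of the fractal geometry being surveyed, the sketch below generates the classic Koch curve: each refinement replaces a segment with four segments one third as long, giving a self-similarity dimension D = log 4 / log 3 ≈ 1.26. Architectural applications use such generative rules to produce detail that recurs across scales; the example is illustrative and not drawn from the paper's case studies.

```python
# A minimal sketch of a fractal generative rule: the Koch curve. Each
# refinement replaces a segment with four segments a third of the length.

import math

def koch(p1: complex, p2: complex, depth: int) -> list[complex]:
    """Recursively subdivide the segment p1->p2 into the Koch motif."""
    if depth == 0:
        return [p1, p2]
    d = (p2 - p1) / 3
    a, b = p1 + d, p1 + 2 * d
    # Apex of the equilateral bump: the middle third rotated by 60 degrees.
    apex = a + d * complex(math.cos(math.pi / 3), math.sin(math.pi / 3))
    pts = []
    for q1, q2 in [(p1, a), (a, apex), (apex, b), (b, p2)]:
        pts.extend(koch(q1, q2, depth - 1)[:-1])   # drop shared endpoints
    return pts + [p2]

points = koch(0 + 0j, 1 + 0j, depth=3)
print(len(points))                 # 4**3 segments + 1 = 65 points
print(math.log(4) / math.log(3))   # self-similarity dimension ~ 1.262
```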
Abstract:
Renshaw and Donszelmann lecture on their projects for the collaborative group 'Outside Architecture'. This was part of a series of papers on the subject of architecture and art curated by The British School at Rome.
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Discovering this, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
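The following sketch, with invented benchmark timings, illustrates the modelling approach: the two kinds of work are benchmarked separately, and piecewise-linear interpolation between measured points predicts the per-step time of an unmeasured problem size for a given deployment scenario.

```python
# A minimal sketch (invented timings) of the benchmark-driven approach:
# benchmark loop-based array updates and halo exchanges separately, then
# interpolate between measured points to predict an unmeasured case.

import bisect

def interpolate(x: float, points: list[tuple[float, float]]) -> float:
    """Piecewise-linear interpolation between measured (size, time) points."""
    xs = [p[0] for p in points]
    i = bisect.bisect_left(xs, x)
    i = min(max(i, 1), len(points) - 1)      # clamp to a valid segment
    (x0, y0), (x1, y1) = points[i - 1], points[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Hypothetical per-step benchmark results for one deployment scenario
# (e.g. fully populated nodes, row-wise domain decomposition).
compute_bench = [(128**2, 0.8e-3), (256**2, 3.1e-3), (512**2, 12.5e-3)]
halo_bench    = [(128,    0.10e-3), (256,   0.15e-3), (512,    0.26e-3)]

def predict_step_time(local_n: int) -> float:
    """Predicted time per step for an n x n local subdomain."""
    return interpolate(local_n**2, compute_bench) + \
           4 * interpolate(local_n, halo_bench)   # one exchange per edge

print(f"{predict_step_time(384):.2e} s per step")  # unmeasured problem size
```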