870 results for information flow model


Relevance:

90.00%

Publisher:

Abstract:

Building with Building Information Modelling (BIM) changes design and production processes. But can BIM be used to support process changes designed according to lean production and lean construction principles? To begin to answer this question we provide a conceptual analysis of the interaction of lean construction and BIM for improving construction. This was investigated by compiling a detailed listing of lean construction principles and BIM functionalities which are relevant from this perspective. These were drawn from a detailed literature survey. A research framework for analysis of the interaction between lean and BIM was then compiled. The goal of the framework is to both guide and stimulate research; as such, the approach adopted up to this point is constructive. Ongoing research has identified 55 such interactions, the majority of which show positive synergy between the two.

Relevance:

90.00%

Publisher:

Abstract:

This research applies a multidimensional model of publicness to the analysis of organisational change and in so doing enriches understanding of the public nature of organisations and how public characteristics facilitate change. Much of the prior literature describes public organisations as bureaucratic, with characteristics that are resistant to change, hierarchical structures that impede information flow, goals that are imposed and scrutinised by political authority and red tape that constrains decision-making. This dissertation instead reports a more complex picture and explains how public characteristics can also work in ways that enable organisational change.

Relevance:

90.00%

Publisher:

Abstract:

In some delay-tolerant communication systems, such as vehicular ad-hoc networks, information flow can be represented as an infectious process, where each entity that has already received the information tries to share it with its neighbours. The random walk and random waypoint models are popular analysis tools for these epidemic broadcasts and represent two types of random mobility. In this paper, we introduce a simulation framework investigating the impact of a gradual increase of bias in path selection (i.e. a reduction of randomness) when moving from the former to the latter. Randomness in path selection can significantly alter system performance, in both regular and irregular network structures. The implications of these results for real systems are discussed in detail.
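The framework's central knob can be illustrated with a minimal sketch: a bias parameter interpolates between an unbiased random walk (bias = 0) and waypoint-directed motion (bias = 1), and co-located agents exchange the information epidemically. The grid-based mobility and all parameter values below are our assumptions for illustration, not the paper's setup.

```python
import random

def simulate_broadcast(n_agents=30, grid=20, bias=0.0, max_steps=5000, seed=1):
    """Epidemic broadcast among mobile agents on a 2-D grid.

    bias=0.0 -> pure random walk; bias=1.0 -> agents move straight
    toward a randomly chosen waypoint (waypoint-like mobility).
    Returns the step at which all agents are informed (or max_steps).
    """
    rng = random.Random(seed)
    pos = [(rng.randrange(grid), rng.randrange(grid)) for _ in range(n_agents)]
    target = [(rng.randrange(grid), rng.randrange(grid)) for _ in range(n_agents)]
    informed = [False] * n_agents
    informed[0] = True  # seed the information at one agent

    for step in range(1, max_steps + 1):
        for i in range(n_agents):
            x, y = pos[i]
            if rng.random() < bias:
                # biased move: one step toward the current waypoint
                tx, ty = target[i]
                x += (tx > x) - (tx < x)
                y += (ty > y) - (ty < y)
                if (x, y) == (tx, ty):  # reached waypoint, pick a new one
                    target[i] = (rng.randrange(grid), rng.randrange(grid))
            else:
                # unbiased move: one random step, clamped to the grid
                dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                x = min(max(x + dx, 0), grid - 1)
                y = min(max(y + dy, 0), grid - 1)
            pos[i] = (x, y)
        # co-located agents exchange the information epidemically
        cells = {}
        for i, p in enumerate(pos):
            cells.setdefault(p, []).append(i)
        for group in cells.values():
            if any(informed[i] for i in group):
                for i in group:
                    informed[i] = True
        if all(informed):
            return step
    return max_steps
```

Sweeping `bias` from 0 to 1 and recording the broadcast completion time gives one way to study the gradual transition between the two mobility regimes.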

Relevance:

90.00%

Publisher:

Abstract:

With the introduction of the PCEHR (Personally Controlled Electronic Health Record), the Australian public is being asked to accept greater responsibility for the management of their health information. However, the implementation of the PCEHR has seen poor adoption rates, underscored by criticism from stakeholders with concerns about transparency, accountability, privacy, confidentiality, governance, and limited capabilities. This study adopts an ethnographic lens to observe how information is created and used during the patient journey, and the social factors impacting the adoption of the PCEHR at the micro-level, in order to develop a conceptual model that will encourage the sharing of patient information within the cycle of care. Objective: This study aims, firstly, to establish a basic understanding of healthcare professional attitudes toward a national platform for sharing patient summary information in the form of a PCEHR. Secondly, the study aims to map the flow of patient-related information as it traverses a patient's personal cycle of care. An ethnographic approach was therefore used to bring a "real world" lens to information flow in a series of case studies in the Australian healthcare system, to discover themes and issues that are important from the patient's perspective. Design: Qualitative study utilising ethnographic case studies. Setting: Case studies were conducted with primary and allied healthcare professionals located in Brisbane, Queensland, between October 2013 and July 2014. Results: In the first dimension, healthcare professionals' concerns about trust, medico-legal issues related to patient control and information quality, and the lack of clinical value offered by the PCEHR emerged as significant barriers to use. The second dimension of the study, which mapped patient information flow, identified information quality issues, clinical workflow inefficiencies and interoperability misconceptions, resulting in duplication of effort, unnecessary manual processes, data quality and integrity issues, and an over-reliance on the understanding and communication skills of the patient. Conclusion: Opportunities for process efficiencies, improved data quality and increased patient safety emerge with the adoption of an appropriate information-sharing platform. More importantly, large-scale eHealth initiatives must be aligned with the value proposition of individual stakeholders in order to achieve widespread adoption. Leveraging the Australian national eHealth infrastructure and the PCEHR, we offer a practical example of a service-driven digital ecosystem suitable for co-creating value in healthcare.

Relevance:

90.00%

Publisher:

Abstract:

Spatial data analysis has become increasingly important in studies of ecology and economics during the last decade. One focus of spatial data analysis is how to select predictors, variance functions and correlation functions. In general, however, the true covariance function is unknown and the working covariance structure is often misspecified. In this paper, our target is to find a good strategy for identifying the best model from the candidate set using model selection criteria. This paper evaluates the ability of several information criteria (the corrected Akaike information criterion, the Bayesian information criterion (BIC) and the residual information criterion (RIC)) to choose the optimal model when the working correlation function, the working variance function and the working mean function are correct or misspecified. Simulations are carried out for small to moderate sample sizes. Four candidate covariance functions (exponential, Gaussian, Matérn and rational quadratic) are used in the simulation studies. From the simulation results, we find that a misspecified working correlation structure can still capture some spatial correlation information in model fitting. When the sample size is large enough, BIC and RIC perform well even if the working covariance is misspecified. Moreover, the performance of these information criteria is related to the average level of model fit, as indicated by the average adjusted R-squared, and overall RIC performs well.
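The selection procedure can be sketched as follows: simulate spatial data from one covariance function, evaluate the Gaussian likelihood under each candidate, and rank the candidates by an information criterion. This is a simplified one-dimensional illustration with assumed parameter names (`sigma2`, `rho`) and a grid search in place of full maximum-likelihood estimation, not the paper's simulation design.

```python
import numpy as np

def cov_matrix(d, kind, sigma2=1.0, rho=1.0):
    """Candidate covariance functions evaluated on a distance matrix d."""
    if kind == "exponential":
        return sigma2 * np.exp(-d / rho)
    if kind == "gaussian":
        return sigma2 * np.exp(-(d / rho) ** 2)
    if kind == "rational_quadratic":
        return sigma2 / (1.0 + (d / rho) ** 2)
    raise ValueError(kind)

def gaussian_loglik(y, K):
    """Log-likelihood of y under a zero-mean Gaussian with covariance K."""
    n = len(y)
    K = K + 1e-6 * np.eye(n)  # jitter for numerical stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.sum(np.log(np.diag(L)))
            - 0.5 * n * np.log(2 * np.pi))

def select_by_bic(y, d, kinds, rhos=np.linspace(0.2, 3.0, 15)):
    """Profile rho on a grid for each candidate and rank by BIC."""
    n = len(y)
    scores = {}
    for kind in kinds:
        best_ll = max(gaussian_loglik(y, cov_matrix(d, kind, rho=r)) for r in rhos)
        # k = 2 (sigma2, rho); constant across candidates, so it does
        # not affect the ranking, only the absolute BIC values
        scores[kind] = -2.0 * best_ll + 2 * np.log(n)
    return min(scores, key=scores.get), scores

# simulate from an exponential covariance, then ask BIC to pick a model
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 80))
d = np.abs(x[:, None] - x[None, :])
K_true = cov_matrix(d, "exponential", rho=1.0)
y = np.linalg.cholesky(K_true + 1e-6 * np.eye(80)) @ rng.standard_normal(80)
best, scores = select_by_bic(y, d, ["exponential", "gaussian", "rational_quadratic"])
```

Repeating this over many replicates and sample sizes, and tabulating how often each criterion recovers the generating covariance, mirrors the kind of comparison the paper reports.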

Relevance:

90.00%

Publisher:

Abstract:

Theoretical optimization studies of the performance of a combustion driven premixed two-phase flow gasdynamic laser are presented. The steady inviscid nonreacting quasi-one-dimensional two-phase flow model including appropriate finite rate vibrational kinetic rates has been used in the analysis. The analysis shows that the effect of the particles on the optimum performance of the two-phase laser is very small. The results are presented in graphical form. Applied Physics Letters is copyrighted by The American Institute of Physics.

Relevance:

90.00%

Publisher:

Abstract:

Many novel computer architectures, such as array processors and multiprocessors, which achieve high performance through the use of concurrency, exploit variations of the von Neumann model of computation. The effective utilization of these machines makes special demands on programmers and their programming languages, such as the structuring of data into vectors or the partitioning of programs into concurrent processes. In comparison, the data flow model of computation demands only that the principle of structured programming be followed. A data flow program, often represented as a data flow graph, is a program that expresses a computation by indicating the data dependencies among operators. A data flow computer is a machine designed to take advantage of concurrency in data flow graphs by executing data-independent operations in parallel. In this paper, we discuss the design of a high-level language (DFL: Data Flow Language) suitable for data flow computers. Some sample procedures in DFL are presented. The implementation aspects are not discussed in detail since no new problems were encountered. The language DFL embodies the concepts of functional programming, but in appearance closely resembles Pascal. The language is a better vehicle than the data flow graph for expressing a parallel algorithm. The compiler has been implemented on a DEC 1090 system in Pascal.
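The firing rule described above, where an operator executes as soon as all of its operand values are available so that execution order is fixed only by data dependencies, can be sketched with a toy interpreter. This illustrates the general data flow model, not DFL itself; the graph representation below is our own.

```python
import operator

def run_dataflow(graph, inputs):
    """Evaluate a dataflow graph: graph maps node -> (op, [operand names]),
    inputs maps input names to values. A node fires once all operands exist."""
    values = dict(inputs)
    pending = dict(graph)
    while pending:
        ready = [n for n, (_, deps) in pending.items()
                 if all(d in values for d in deps)]
        if not ready:
            raise ValueError("cyclic or unsatisfiable dependencies")
        for n in ready:  # every ready node could fire in parallel
            op, deps = pending.pop(n)
            values[n] = op(*(values[d] for d in deps))
    return values

# (a + b) * (a - b): the add and subtract nodes are data-independent,
# so a data flow machine could execute them concurrently
g = {
    "sum":  (operator.add, ["a", "b"]),
    "diff": (operator.sub, ["a", "b"]),
    "prod": (operator.mul, ["sum", "diff"]),
}
result = run_dataflow(g, {"a": 7, "b": 3})["prod"]  # -> 40
```

The `ready` list makes the available parallelism explicit: on each pass, every node in it depends only on values already computed.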

Relevance:

90.00%

Publisher:

Abstract:

Simultaneous recordings of spike trains from multiple single neurons are becoming commonplace. Understanding the interaction patterns among these spike trains remains a key research area. A question of interest is the evaluation of information flow between neurons through the analysis of whether one spike train exerts causal influence on another. For continuous-valued time series data, Granger causality has proven an effective method for this purpose. However, the basis for Granger causality estimation is autoregressive data modeling, which is not directly applicable to spike trains. Various filtering options distort the properties of spike trains as point processes. Here we propose a new nonparametric approach to estimate Granger causality directly from the Fourier transforms of spike train data. We validate the method on synthetic spike trains generated by model networks of neurons with known connectivity patterns and then apply it to neurons simultaneously recorded from the thalamus and the primary somatosensory cortex of a squirrel monkey undergoing tactile stimulation.
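The nonparametric approach starts from the Fourier transforms of the spike trains themselves. A minimal sketch of that first step, estimating the trial-averaged spectral matrix, is shown below; function and parameter names are ours, not the paper's. The Granger decomposition additionally requires spectral factorization of this matrix (e.g. by Wilson's algorithm), which is omitted here.

```python
import numpy as np

def spectral_matrix(spikes, fs=1000.0):
    """Trial-averaged spectral matrix from binary spike trains.

    spikes: array of shape (trials, channels, time bins), entries 0/1.
    Returns (freqs, S) with S[f] the channels-by-channels
    cross-spectral matrix at frequency freqs[f].
    """
    trials, channels, n = spikes.shape
    # remove each channel's mean rate so spectra reflect fluctuations
    x = spikes - spikes.mean(axis=2, keepdims=True)
    X = np.fft.rfft(x, axis=2)  # shape (trials, channels, freqs)
    # cross-spectra averaged over trials: S_ij(f) = <X_i(f) X_j(f)*>
    S = np.einsum('tcf,tdf->fcd', X, np.conj(X)) / trials
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, S
```

Each S[f] is Hermitian with real, nonnegative auto-spectra on the diagonal, which is the structure the subsequent factorization step relies on.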

Relevance:

90.00%

Publisher:

Abstract:

Background: The aging population is placing increasing demands on surgical services, simultaneously with a decreasing supply of professional labor and a worsening economic situation. Under growing financial constraints, successful operating room management will be one of the key issues in the struggle for technical efficiency. This study focused on several issues affecting operating room efficiency. Materials and methods: The current formal operating room management in Finland and the use of performance metrics and information systems used to support this management were explored using a postal survey. We also studied the feasibility of a wireless patient tracking system as a tool for managing the process. The reliability of the system as well as the accuracy and precision of its automatically recorded time stamps were analyzed. The benefits of a separate anesthesia induction room in a prospective setting were compared with the traditional way of working, where anesthesia is induced in the operating room. Using computer simulation, several models of parallel processing for the operating room were compared with the traditional model with respect to cost-efficiency. Moreover, international differences in operating room times for two common procedures, laparoscopic cholecystectomy and open lung lobectomy, were investigated. Results: The managerial structure of Finnish operating units was not clearly defined. Operating room management information systems were found to be out-of-date, offering little support to online evaluation of the care process. Only about half of the information systems provided information in real time. Operating room performance was most often measured by the number of procedures in a time unit, operating room utilization, and turnover time. The wireless patient tracking system was found to be feasible for hospital use. 
Automatic documentation of the system facilitated patient flow management by increasing process transparency via more available and accurate data, while lessening work for staff. Any parallel work flow model was more cost-efficient than the traditional way of performing anesthesia induction in the operating room. Mean operating times for two common procedures differed by 50% among eight hospitals in different countries. Conclusions: The structure of daily operative management of an operating room warrants redefinition. Performance measures as well as information systems require updating. Parallel work flows are more cost-efficient than the traditional induction-in-room model.
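The intuition behind the parallel-workflow result can be sketched with a back-of-the-envelope throughput comparison: when the next patient is induced in a separate room during the previous case, induction time no longer occupies the operating room. All durations below are assumed for illustration and are not taken from the study.

```python
def cases_per_day(induction, surgery, turnover, day_minutes=480, parallel=False):
    """Number of cases an OR completes in a day (all times in minutes).

    Traditional model: the OR is blocked for induction + surgery + turnover.
    Parallel model: induction overlaps the previous case in a separate room,
    so each case occupies the OR only for surgery + turnover.
    """
    if parallel:
        per_case_or_time = surgery + turnover
    else:
        per_case_or_time = induction + surgery + turnover
    return day_minutes // per_case_or_time

# illustrative durations (assumed): 25 min induction, 70 min surgery,
# 15 min turnover, in an 8-hour OR day
trad = cases_per_day(induction=25, surgery=70, turnover=15)                  # -> 4
para = cases_per_day(induction=25, surgery=70, turnover=15, parallel=True)   # -> 5
```

A full cost-efficiency comparison, as in the study, would also weigh the extra staff and space the parallel model requires against the value of the additional cases.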

Relevance:

90.00%

Publisher:

Abstract:

Autonomous mission control, unlike automatic mission control, which is generally pre-programmed to execute an intended mission, is guided by the philosophy of carrying out a complete mission on its own through online sensing, information processing, and control reconfiguration. A crucial cornerstone of this philosophy is the capability of intelligence and of information sharing between unmanned aerial vehicles (UAVs), or with a central controller, through secured communication links. Though several mission control algorithms, for single and multiple UAVs, have been discussed in the literature, they lack a clear definition of the various autonomous mission control levels. In the conventional system, the ground pilot issues flight and mission control commands to a UAV through a command data link, and the UAV transmits intelligence information back to the ground pilot through a communication link. Thus, the success of the mission depends entirely on the information flow through a secured communication link between the ground pilot and the UAV. In the past, mission success depended on the continuous interaction of a ground pilot with a single UAV, while present-day applications attempt to define mission success through efficient interaction of a ground pilot with multiple UAVs. The current trend in UAV applications is expected to lead to a futuristic scenario where mission success would depend only on interaction among UAV groups, with no interaction with any ground entity. To reach this capability level, however, it is necessary to first understand the various levels of autonomy and the crucial role that information and communication play in making these autonomy levels possible. This article presents a detailed framework of UAV autonomous mission control levels in the context of information flow and communication between UAVs and UAV groups for each level of autonomy.

Relevance:

90.00%

Publisher:

Abstract:

This paper discusses implementation details of efficient schemes for lenient execution and for concurrent execution of re-entrant routines in a data flow model. The proposed schemes require no extra hardware support and effectively utilise existing hardware resources, such as the Matching Unit and the Memory Network Interface, to achieve these goals.

Relevance:

90.00%

Publisher:

Abstract:

The effect of surface mass transfer on buoyancy-induced flow in a variable-porosity medium adjacent to a heated vertical plate is studied for high Rayleigh numbers. Similarity solutions are obtained within the framework of boundary layer theory for a power-law variation in surface temperature, T_w ∝ x^λ, and surface injection, v_w ∝ x^((λ−1)/2). The analysis incorporates the expression connecting porosity and permeability, as well as the expression connecting porosity and effective thermal diffusivity. The influence of thermal dispersion on the flow and heat transfer characteristics is also analysed in detail. The results of the present analysis document the fact that variable porosity enhances the heat transfer rate and the magnitude of the velocity near the wall. The governing equations are solved using an implicit finite difference scheme for both the Darcy flow model and the Forchheimer flow model, the latter analysis being confined to an isothermal surface and an impermeable vertical plate. The influence of the inertial terms in the Forchheimer model is to decrease the heat transfer and flow rates, and the influence of thermal dispersion is to increase the heat transfer rate.

Relevance:

90.00%

Publisher:

Abstract:

The current study analyzes the leachate distribution in the Orchard Hills Landfill, Davis Junction, Illinois, using a two-phase flow model to assess the influence of variability in hydraulic conductivity on the effectiveness of the existing leachate recirculation system and its operations through reliability analysis. Numerical modeling, using finite-difference code, is performed with due consideration to the spatial variation of hydraulic conductivity of the municipal solid waste (MSW). The inhomogeneous and anisotropic waste condition is assumed because it is a more realistic representation of the MSW. For the reliability analysis, the landfill is divided into 10 MSW layers with different mean values of vertical and horizontal hydraulic conductivities (decreasing from top to bottom), and the parametric study is performed by taking the coefficients of variation (COVs) as 50, 100, 150, and 200%. Monte Carlo simulations are performed to obtain statistical information (mean and COV) of output parameters of the (1) wetted area of the MSW, (2) maximum induced pore pressure, and (3) leachate outflow. The results of the reliability analysis are used to determine the influence of hydraulic conductivity on the effectiveness of the leachate recirculation and are discussed in the light of a deterministic approach. The study is useful in understanding the efficiency of the leachate recirculation system. (C) 2013 American Society of Civil Engineers.
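The Monte Carlo setup described above can be sketched as follows. The lognormal distribution for hydraulic conductivity and the simple Darcy-flux response used here are our stand-ins for the paper's finite-difference two-phase flow model, and all numerical values are illustrative; only the COV range (50 to 200%) follows the study's parametric design.

```python
import numpy as np

def lognormal_samples(mean, cov_percent, size, rng):
    """Draw hydraulic conductivities with a target mean and COV (in %).
    A lognormal distribution is a common choice for K; assumed here."""
    cv = cov_percent / 100.0
    sigma2 = np.log(1.0 + cv ** 2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mu, np.sqrt(sigma2), size)

def monte_carlo_outflow(k_mean=1e-5, cov_percent=100, gradient=1.0,
                        area=100.0, n_sims=20000, seed=0):
    """Stand-in response: Darcy outflow Q = K * i * A per realization,
    summarized by the mean and COV (%) of the output."""
    rng = np.random.default_rng(seed)
    k = lognormal_samples(k_mean, cov_percent, n_sims, rng)
    q = k * gradient * area
    return q.mean(), 100.0 * q.std() / q.mean()

mean_q, cov_q = monte_carlo_outflow()
```

Replacing the one-line Darcy response with a call to the numerical flow model, and repeating over the layered K-field, gives the full reliability analysis; the sampling and summary statistics stay the same.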

Relevance:

90.00%

Publisher:

Abstract:

This paper provides a new model of network formation that bridges the gap between the two benchmark models of Bala and Goyal, the one-way flow model and the two-way flow model, and includes both as extreme cases. As in both benchmark models, in what we call an "asymmetric flow" network a link can be initiated unilaterally by any player with any other, and the flow through a link towards the player who supports it is perfect. Unlike in those models, in the opposite direction there is friction or decay. When this decay is complete there is no flow, which corresponds to the one-way flow model. The limit case, when the decay in the opposite direction (and the asymmetry) disappears, corresponds to the two-way flow model. We characterize stable and strictly stable architectures for the whole range of parameters of this intermediate and more general model. We also prove the convergence of Bala and Goyal's dynamic model in this context.
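One illustrative reading of the flow structure (our simplification, not the paper's formal definitions) treats each player's information benefit as the best multiplicative flow received from every other player: perfect flow toward a link's supporter, and a decay factor delta against it. Setting delta = 0 recovers the one-way flow model and delta = 1 the two-way flow model.

```python
import heapq

def information_received(links, n, delta):
    """links: set of pairs (i, j) meaning player i supports a link to j.
    Returns v[i] = sum over j != i of the best multiplicative flow from j to i."""
    # edge weight = fraction of information surviving one hop in that direction
    out = {i: [] for i in range(n)}
    for i, j in links:
        out[j].append((i, 1.0))    # toward the supporter i: perfect flow
        out[i].append((j, delta))  # away from the supporter: decay
    totals = []
    for src in range(n):
        # Dijkstra-style search for the best (max-product) flow from src
        best = {src: 1.0}
        heap = [(-1.0, src)]
        while heap:
            negf, u = heapq.heappop(heap)
            if -negf < best.get(u, 0.0):
                continue  # stale entry
            for v, w in out[u]:
                f = -negf * w
                if f > best.get(v, 0.0):
                    best[v] = f
                    heapq.heappush(heap, (-f, v))
        totals.append(best)
    # v[i] = total information player i receives from all other players
    return [sum(totals[j].get(i, 0.0) for j in range(n) if j != i)
            for i in range(n)]
```

For a single link supported by player 0 toward player 1, `information_received({(0, 1)}, 2, 0.0)` gives `[1.0, 0.0]` (only the supporter receives, as in one-way flow), while delta = 1.0 gives `[1.0, 1.0]` (both receive, as in two-way flow).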

Relevance:

90.00%

Publisher:

Abstract:

Hydrocyclones are widely used in industry, and their geometrical design using CFD techniques has gained popularity in recent years. In this study, the Euler-Euler approach and the Reynolds stress model are applied to simulate the liquid-solid flow field in a hydrocyclone. The methodology is validated by good agreement between experimental data and numerical results. Within the research range, the simulation indicates that liquid-solid separation mainly occurs in the conical segment, and that increasing the conical height or decreasing the cylindrical height helps to improve the grade efficiencies of solid particles. Based on these results, two identical hydrocyclones are designed and installed in series to establish a liquid-solid separation system. Experiments are then conducted under different conditions, in which the effects of the water cut and of the second hydrocyclone on the separation are investigated. The results also confirm that smaller solid particles are more susceptible to the inlet conditions, and that the second hydrocyclone plays a more important role as the water cut decreases.