911 results for information flow properties
Abstract:
Understanding the dynamics of blood cells is crucial for discovering biological mechanisms, developing new efficient drugs, designing sophisticated microfluidic devices, and improving diagnostics. In this work, we focus on the dynamics of red blood cells in microvascular flow. Microvascular blood flow resistance has a strong impact on cardiovascular function and tissue perfusion. The flow resistance in microcirculation is governed by the flow behavior of blood through a complex network of vessels, where the distribution of red blood cells across vessel cross-sections may be significantly distorted at vessel bifurcations and junctions. We investigate the development of blood flow and its resistance, starting from a dispersed configuration of red blood cells, in simulations for different hematocrits, flow rates, vessel diameters, and aggregation interactions between red blood cells. Initially dispersed red blood cells migrate toward the vessel center, leading to the formation of a cell-free layer near the wall and to a decrease of the flow resistance. The development of the cell-free layer appears to be nearly universal when scaled with a characteristic shear rate of the flow, which allows an estimation of the vessel length required for full flow development, $l_c \approx 25D$, with vessel diameter $D$. Thus, the potential effect of red blood cell dispersion at vessel bifurcations and junctions on the flow resistance may be significant in vessels which are shorter than or comparable to the length $l_c$. The presence of aggregation interactions between red blood cells leads in general to a reduction of blood flow resistance. The development of the cell-free layer thickness is similar with and without aggregation interactions, although attractive interactions result in larger cell-free layer plateau values. However, because the aggregation forces are short-ranged, at high enough shear rates ($\bar{\dot{\gamma}} \gtrsim 50~\text{s}^{-1}$) aggregation of red blood cells does not bring a significant change to the blood flow properties. We also develop a simple theoretical model which describes the converged cell-free layer thickness as a function of flow rate, assuming steady-state flow. The model is based on the balance between a lift force on red blood cells due to cell-wall hydrodynamic interactions and a shear-induced effective pressure due to cell-cell interactions in flow. We expect that these results can also be used to better understand the flow behavior of other suspensions of deformable particles such as vesicles, capsules, and cells. Finally, we investigate segregation phenomena in blood as a two-component suspension under Poiseuille flow, consisting of red blood cells and target cells. The spatial distribution of particles in blood flow is of great practical importance: in nanoparticle drug delivery, for example, the particles need to come close to microvessel walls in order to adhere and deliver the drug to a target position within the microvasculature. Here we describe segregation as a competition between shear-induced diffusion and the lift force that pushes every soft particle in a flow away from the wall. To investigate segregation, we perform 2D DPD simulations of red blood cells and target cells of different sizes on the one hand, and solve the steady-state Fokker-Planck equation on the other. For the latter we measure the force profile, particle distribution, and diffusion constant across the channel.
We compare simulation results with those from the Fokker-Planck equation and find very good agreement between the two approaches. Moreover, we investigate the diffusion behavior of target particles for different hematocrit values and shear rates. Our simulation results indicate that the diffusion constant increases with increasing hematocrit and depends linearly on the shear rate. The third part of the study describes the development of a simulation model for complex vascular geometries. Such a model is important for reproducing the vascular systems of small pieces of tissue, which may be obtained from MRI or microscope images. The simulation model of complex vascular systems can be divided into three parts: modeling the geometry, developing in- and outflow boundary conditions, and decomposing the simulation domain for efficient computation. We have found that for the in- and outflow boundary conditions it is better to use an SDPD fluid than a DPD one, because of the density fluctuations of the latter along the channel. Even for flow in a straight channel, it is difficult to control the density of a DPD fluid. The SDPD fluid does not have this shortcoming, even in more complex channels with many branches and in- and outflows, because the force acting on the particles also depends on the local density of the fluid.
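As a rough illustration of the segregation picture described above (illustrative notation, not necessarily the thesis's exact formulation), the steady-state competition between the wall lift force and shear-induced diffusion can be written as a zero-flux condition on the particle distribution $P(y)$ across the channel, with $f_{\mathrm{lift}}(y)$ a lift force, $\xi$ a friction coefficient, and $D(y)$ a shear-induced diffusion coefficient:
\[
  J(y) \;=\; \frac{f_{\mathrm{lift}}(y)}{\xi}\,P(y) \;-\; D(y)\,\frac{\partial P}{\partial y} \;=\; 0
  \quad\Longrightarrow\quad
  P(y) \;\propto\; \exp\!\left(\int^{y} \frac{f_{\mathrm{lift}}(y')}{\xi\,D(y')}\,\mathrm{d}y'\right),
\]
so the stronger the lift is relative to the shear-induced diffusion for a given particle species, the more that species is depleted near the wall.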
Abstract:
The brain is a network spanning multiple scales, from subcellular to macroscopic. In this thesis I present four projects studying brain networks at different levels of abstraction. The first involves determining a functional connectivity network based on neural spike trains and using a graph-theoretical method to cluster groups of neurons into putative cell assemblies. In the second project I model neural networks at a microscopic level. Using different clustered wiring schemes, I show that almost identical spatiotemporal activity patterns can be observed, demonstrating that there is a broad neuro-architectural basis for attaining structured spatiotemporal dynamics. Remarkably, irrespective of the precise topological mechanism, this behavior can be predicted by examining the spectral properties of the synaptic weight matrix. The third project introduces, via two circuit architectures, a new paradigm for feedforward processing in which inhibitory neurons play a complex and pivotal role in governing information flow in cortical network models. Finally, I analyze axonal projections in sleep-deprived mice using data collected as part of the Allen Institute's Mesoscopic Connectivity Atlas. After normalizing for experimental variability, the results indicate that there is no single explanatory difference in the mesoscale network between control and sleep-deprived mice. Using machine learning techniques, however, animals could be classified at levels significantly above chance. This reveals that intricate changes in connectivity do occur due to chronic sleep deprivation.
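As a minimal sketch of the kind of spectral inspection mentioned above (not the thesis's actual model; cluster sizes, connection probabilities, and weights are arbitrary assumptions), one can build a block-clustered weight matrix and look at its leading eigenvalues, which typically separate from the bulk when wiring is clustered:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy clustered weight matrix: within-cluster connections are denser and
# stronger than between-cluster connections (illustrative values).
n_clusters, cluster_size = 4, 50
n = n_clusters * cluster_size
w_in, w_out = 0.4, 0.05
p_in, p_out = 0.3, 0.05

W = (rng.random((n, n)) < p_out) * w_out
for c in range(n_clusters):
    idx = slice(c * cluster_size, (c + 1) * cluster_size)
    W[idx, idx] = (rng.random((cluster_size, cluster_size)) < p_in) * w_in
np.fill_diagonal(W, 0.0)  # no self-connections

# Clustered wiring tends to produce a group of outlier eigenvalues
# (roughly one per cluster) separated from the bulk of the spectrum.
eigvals = np.linalg.eigvals(W)
print(np.sort(eigvals.real)[-(n_clusters + 1):])
```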
Abstract:
Measuring quality attributes of object-oriented designs (e.g. maintainability and performance) has been covered by a number of studies. However, these studies have not considered security as much as other quality attributes. Also, most security studies focus on the level of individual program statements. This approach makes it hard and expensive to discover and fix vulnerabilities caused by design errors. In this work, we focus on the security design of an object-oriented application and define a number of security metrics. These metrics allow designers to discover and fix security vulnerabilities at an early stage, and help compare the security of various alternative designs. In particular, we propose seven security metrics to measure Data Encapsulation (accessibility) and Cohesion (interactions) of a given object-oriented class from the point of view of potential information flow.
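The abstract does not spell out the seven metrics, so the following is only a hypothetical illustration of an accessibility-style measure in the spirit of Data Encapsulation: the fraction of a class's attributes that are publicly reachable, computed from a toy design description. The class representation and the metric name are assumptions, not the paper's definitions.

```python
from dataclasses import dataclass, field

@dataclass
class ClassDesign:
    """Toy description of one class in a design (hypothetical format)."""
    name: str
    attributes: dict = field(default_factory=dict)  # attribute name -> visibility

def attribute_accessibility(cls: ClassDesign) -> float:
    """Hypothetical data-encapsulation metric: share of attributes that are
    publicly accessible (lower values suggest tighter encapsulation)."""
    if not cls.attributes:
        return 0.0
    public = sum(1 for vis in cls.attributes.values() if vis == "public")
    return public / len(cls.attributes)

account = ClassDesign("Account", {"balance": "private",
                                  "owner": "public",
                                  "pin": "private"})
print(attribute_accessibility(account))  # 0.333... for this toy design
```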
Abstract:
Life-cycle management (LCM) has been employed in the management of construction projects for many years in order to reduce whole-life cost, time and risk, and to improve the service to owners. However, owing to the lack of an effective information-sharing platform, LCM of construction projects is not currently used effectively in the construction industry. Based upon an analysis of the information flow of LCM, a virtual prototyping (VP)-based communication and collaboration information platform is proposed. Following this, the platform is customized using DASSAULT software. The whole process of implementing the VP-based LCM is also discussed and, through a simple case study, it is demonstrated that the VP-based communication and collaboration information platform is an effective tool to support the LCM of construction projects.
Abstract:
Purpose - Building project management (BPM) requires effective coordination and collaboration between multiple project team organisations, which can be achieved by real-time information flow between all participants. In the present scenario, this can be achieved by the use of information and communication technologies (ICT). The purpose of this paper is to present part of a research project conducted to study the causal relationships between factors affecting ICT adoption for BPM by small and medium enterprises. Design/methodology/approach - This paper discusses structural equation modelling (SEM) analysis conducted to test the causal relationships between quantitative factors. Data for the quantitative analysis were gathered through a questionnaire survey conducted in the Indian construction industry. Findings - SEM analysis results demonstrate that an increased and mature use of ICT for general administration within the organisation would lead to: an improved ICT infrastructure within the organisation; development of electronic databases; and a staff that is confident in using information technology (IT) tools. In such a scenario, staff would use advanced software and IT technologies for project management (PM) processes, which would lead to an increased adoption of ICT for PM processes. In turn, ICT adoption for general administration would be enhanced if the organisation interacts more with geographically separated agencies and senior management perceives that significant benefits would accrue from the adoption of ICT. All the factors are inter-related, and their effect cannot be maximized in isolation. Originality/value - The results provide direction to building project management organisations for strategically adopting the effective use of ICT, both within their organisations and for BPM in general.
Abstract:
Several studies have developed metrics for software quality attributes of object-oriented designs such as reusability and functionality. However, metrics which measure the quality attribute of information security have received little attention. Moreover, existing security metrics measure the system either at a high level (i.e. the whole-system level) or at a low level (i.e. the program-code level). These approaches make it hard and expensive to discover and fix vulnerabilities caused by software design errors. In this work, we focus on the design of an object-oriented application and define a number of information security metrics derivable from a program’s design artifacts. These metrics allow software designers to discover and fix security vulnerabilities at an early stage, and help compare the potential security of various alternative designs. In particular, we present security metrics based on composition, coupling, extensibility, inheritance, and the design size of a given object-oriented, multi-class program from the point of view of potential information flow.
Abstract:
Refactoring focuses on improving the reusability, maintainability and performance of programs. However, the impact of refactoring on the security of a given program has received little attention. In this work, we focus on the design of object-oriented applications and use metrics to assess the impact of a number of standard refactoring rules on their security by evaluating the metrics before and after refactoring. This assessment tells us which refactoring steps can increase the security level of a given program from the point of view of potential information flow, allowing application designers to improve their system’s security at an early stage.
Abstract:
The natural convection boundary layer adjacent to an inclined plate subject to a sudden cooling boundary condition has been studied. It is found that the cold boundary layer adjacent to the plate is potentially unstable to Rayleigh-Bénard instability if the Rayleigh number exceeds a certain critical value. A scaling relation for the onset of instability of the boundary layer is obtained. The scaling relations have been developed by equating the important terms of the governing equations, based on the development of the boundary layer with time. The flow adjacent to the plate can be classified broadly into a conductive, a stable convective or an unstable convective regime, determined by the Rayleigh number. Proper scales have been established to quantify the flow properties in each of these flow regimes. An appropriate identification of the time at which the instability may set in is discussed. A numerical verification of the time for the onset of instability is also presented in this study. Different flow regimes based on the stability of the boundary layer are discussed with numerical results.
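In generic notation (not necessarily the symbols or exact balance used in the paper), this kind of scaling argument can be sketched as follows: the cooled layer grows diffusively, and the Rayleigh-Bénard instability sets in once the Rayleigh number based on the instantaneous layer thickness exceeds a critical value,
\[
  \delta_T(t) \sim \sqrt{\kappa t},
  \qquad
  Ra_{\delta}(t) = \frac{g\beta\,\Delta T\,\delta_T^{3}(t)}{\nu\kappa} > Ra_c
  \quad\Longrightarrow\quad
  t_{\mathrm{onset}} \sim \frac{1}{\kappa}\left(\frac{\nu\kappa\,Ra_c}{g\beta\,\Delta T}\right)^{2/3},
\]
where $\kappa$ is the thermal diffusivity, $\nu$ the kinematic viscosity, $\beta$ the thermal expansion coefficient and $\Delta T$ the imposed cooling; for an inclined plate, the buoyancy component normal to the plate would replace $g$ in driving the Rayleigh-Bénard mode.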
Abstract:
Since scalar dispersion in small estuaries can rarely be predicted accurately, new field measurements were conducted continuously at relatively high frequency for up to 50 h (per investigation) in a small subtropical estuary with semidiurnal tides. The bulk flow parameters varied in time with periods comparable to tidal cycles and other large-scale processes. The turbulence properties depended upon the instantaneous local flow properties. They were little affected by the flow history, but their structure and temporal variability were influenced by a variety of parameters including the tidal conditions and bathymetry. A striking feature of the data sets was the large fluctuations in all turbulence characteristics during the tidal cycle, and the basic differences between neap and spring tide turbulence.
Abstract:
This article presents a novel approach to confidentiality violation detection based on taint marking. Information flows are dynamically tracked between applications and objects of the operating system such as files, processes and sockets. A confidentiality policy is defined by labelling sensitive information and defining which information may leave the local system through network exchanges. Furthermore, per-application profiles can be defined to restrict the sets of information each application may access and/or send through the network. In previous work, we focused on the use of mandatory access control mechanisms for information flow tracking. In the current work, we have extended the previous information flow model to track network exchanges, and we are able to define a policy attached to network sockets. We show an example application of this extension in the context of a compromised web browser: our implementation detects a confidentiality violation when the browser attempts to leak private information to a remote host over the network.
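The authors' implementation works at the operating-system level on top of mandatory access control; as a heavily simplified, user-level sketch of the idea only (all names are illustrative), one can model objects carrying taint labels, propagate labels along information flows, and attach a policy to a socket that blocks sends of labels not allowed to leave the host:

```python
# Minimal illustration of taint-marking confidentiality checks (not the paper's system).

class TaintedObject:
    def __init__(self, name, labels=()):
        self.name = name
        self.labels = set(labels)           # e.g. {"private"}

def flow(src: TaintedObject, dst: TaintedObject):
    """Propagate taint when information flows from src into dst."""
    dst.labels |= src.labels

class PolicySocket:
    def __init__(self, remote, allowed_labels=()):
        self.remote = remote
        self.allowed = set(allowed_labels)  # labels permitted to leave the host

    def send(self, data: TaintedObject):
        leaked = data.labels - self.allowed
        if leaked:
            raise PermissionError(
                f"confidentiality violation: {leaked} -> {self.remote}")
        print(f"sent {data.name} to {self.remote}")

secret = TaintedObject("address_book", labels={"private"})
page = TaintedObject("rendered_page")       # starts untainted
flow(secret, page)                          # browser mixes secret into page state

sock = PolicySocket("evil.example.org", allowed_labels={"public"})
try:
    sock.send(page)                         # blocked: "private" may not leave
except PermissionError as err:
    print(err)
```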
Abstract:
In the current economy, knowledge has been recognized as a valuable organisational asset, a crucial factor that helps organisations succeed in highly competitive environments. Many organisations have begun projects and special initiatives aimed at fostering better knowledge sharing amongst their employees. Not surprisingly, information technology (IT) has been a central element of many of these projects and initiatives, as the potential of emerging information technologies such as Web 2.0 for enabling the process of managing organisational knowledge is recognised. This technology can be used as a collaborative system for knowledge management (KM) within enterprises. Enterprise 2.0 is the application of Web 2.0 in an organisational context. Enterprise 2.0 technologies are web-based social software that facilitate collaboration, communication and information flow in a bidirectional manner: an essential aspect of organisational knowledge management. This chapter explains how Enterprise 2.0 technologies (Web 2.0 technologies within organisations) can support knowledge management. The chapter also explores how such technologies support the codifying (technology-centred) and social network (people-centred) approaches of KM, towards bridging the current gap between these two approaches.
Abstract:
In information retrieval (IR) research, more and more focus has been placed on optimizing a query language model by detecting and estimating the dependencies between the query and the observed terms occurring in the selected relevance feedback documents. In this paper, we propose a novel Aspect Language Modeling framework featuring term association acquisition, document segmentation, query decomposition, and an Aspect Model (AM) for parameter optimization. Through the proposed framework, we advance the theory and practice of applying high-order and context-sensitive term relationships to IR. We first decompose a query into subsets of query terms. Then we segment the relevance feedback documents into chunks using multiple sliding windows. Finally, we discover the higher-order term associations, that is, the terms in these chunks with a high degree of association to the subsets of the query. In this process, we adopt an approach combining the AM with Association Rule (AR) mining. In our approach, the AM not only considers the subsets of a query as “hidden” states and estimates their prior distributions, but also evaluates the dependencies between the subsets of a query and the observed terms extracted from the chunks of feedback documents. The AR mining provides a reasonable initial estimation of the high-order term associations by discovering association rules from the document chunks. Experimental results on various TREC collections verify the effectiveness of our approach, which significantly outperforms a baseline language model and two state-of-the-art query language models, namely the Relevance Model and the Information Flow model.
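The full framework (Aspect Model plus Association Rule mining) is beyond a short snippet, but the segmentation step described above — chunking a feedback document with multiple sliding windows and counting how candidate terms co-occur with subsets of the query — can be sketched as follows; the window sizes, subset sizes, and scoring are illustrative assumptions, not the paper's parameters.

```python
from itertools import combinations
from collections import Counter

def sliding_windows(tokens, size, step=1):
    """Segment a document into overlapping chunks for one window size."""
    return [tokens[i:i + size]
            for i in range(0, max(1, len(tokens) - size + 1), step)]

def cooccurrence_counts(doc_tokens, query_terms, window_sizes=(8, 16)):
    """Count how often each non-query term co-occurs with each subset of the
    query inside a chunk, accumulated over multiple sliding-window sizes."""
    counts = Counter()
    subsets = [frozenset(s) for r in (1, 2)
               for s in combinations(sorted(query_terms), r)]
    for size in window_sizes:
        for chunk in sliding_windows(doc_tokens, size):
            chunk_set = set(chunk)
            for subset in subsets:
                if subset <= chunk_set:
                    for term in chunk_set - set(query_terms):
                        counts[(subset, term)] += 1
    return counts

doc = "solar energy panels convert sunlight into electricity using cells".split()
print(cooccurrence_counts(doc, {"solar", "energy"}).most_common(3))
```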
Abstract:
The Social Web is a torrent of real-time information, and an emerging discipline is now focussed on harnessing this information flow for analysis of themes, opinions and sentiment. This short paper reports on early work on designing better user interfaces that help end users manipulate the outcomes of these analysis engines.
Abstract:
The complex design process of an airport terminal needs to support a wide range of changes in operational facilities for both usual and unusual/emergency events. A process model describes how activities within a process are connected and also states the logical information flow between the various activities. The traditional design process overlooks the necessity of carrying information from the process model through to the actual building design, which needs to be considered an integral part of building design. The current research introduces a generic method to obtain design-related information from the process model and incorporate it into the design process. Appropriate integration of the process model prior to the design process uncovers the relationships that exist between spaces and their relevant functions, which could be missed in the traditional design approach. The current paper examines the available Business Process Model (BPM) and generates a modified Business Process Model (mBPM) of the check-in facilities of Brisbane International Airport. The information adopted from the mBPM is then transformed into a possible physical layout using graph theory.
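The mBPM-to-layout mapping is described only at a high level; purely as an illustration (the check-in activities and spaces below are invented, not taken from the paper), one could read the activity flow as a graph and take the functional spaces hosting consecutive activities as required adjacencies for the physical layout.

```python
# Toy illustration: derive required space adjacencies from a process model.
# Activities, spaces, and the flow are hypothetical examples.

activity_space = {
    "arrive": "entry hall",
    "check_in": "check-in counters",
    "bag_drop": "bag-drop area",
    "security": "security screening",
}

# Ordered activity flow read from a (hypothetical) process model.
process_flow = [("arrive", "check_in"),
                ("check_in", "bag_drop"),
                ("bag_drop", "security")]

# Spaces hosting consecutive activities should be adjacent in the layout graph.
adjacency = set()
for a, b in process_flow:
    s1, s2 = activity_space[a], activity_space[b]
    if s1 != s2:
        adjacency.add(frozenset((s1, s2)))

for pair in adjacency:
    print(" <-> ".join(sorted(pair)))
```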
Abstract:
Objective: To describe the reported impact of Pandemic (H1N1) 2009 on EDs, so as to inform future pandemic policy, planning and response management. Methods: This study comprised an issue and theme analysis of publicly accessible literature, data from jurisdictional health departments, and data obtained from two electronic surveys of ED directors and ED staff. The issues identified formed the basis of policy analysis and evaluation. Results: Pandemic (H1N1) 2009 had a significant impact on EDs, with presentations of patients with ‘influenza-like illness’ up to three times those of the same period in previous years. Staff reported a range of issues, including poor awareness of pandemic plans, patient and family aggression, chaotic information flow to themselves and the public, and heightened stress related to increased workloads and lower levels of staffing due to illness, family care duties and redeployment of staff to flu clinics. Staff identified considerable discomfort associated with prolonged periods wearing personal protective equipment. Staff believed that the care of non-flu patients was compromised during the pandemic as a result of overwork, distraction from core business and the difficulties associated with accommodating infectious patients in an environment not conducive to doing so. Conclusions: This paper describes the breadth of the impact of pandemics on ED operations. It identifies a need to address a range of industrial, management and procedural issues. In particular, there is a need for a single authoritative source of information, the re-engineering of EDs to accommodate infectious patients, and organizational changes to enable rapid deployment of alternative sources of care.