958 results for Information Flows


Relevance: 40.00%

Abstract:

This paper first reviews methods for treating low-speed rarefied gas flows: the linearised Boltzmann equation, the Lattice Boltzmann method (LBM), the Navier-Stokes equation with slip boundary conditions, and the DSMC method, and discusses the difficulties in simulating low-speed transitional MEMS flows, especially internal flows. In particular, the present version of the LBM is shown to be unfeasible for simulating MEMS flows in the transitional regime. The information preservation (IP) method overcomes the difficulty of statistical simulation caused by the small information-to-noise ratio of low-speed flows by preserving the average information of the enormous number of molecules that each simulated molecule represents. A validation of the method is given in this paper. The specificities of internal flows in MEMS, i.e. the low speed and the large length-to-width ratio, result in a problem of elliptic nature: the necessity to regulate inlet and outlet boundary conditions that influence each other. Through the example of the IP calculation of a microchannel flow (thousands of micrometres long) it is shown that adopting a conservative scheme for the mass conservation equation together with a super-relaxation method resolves this problem successfully. Using the same measures, the IP method solves the thin-film air bearing problem in the transitional regime for an authentic hard disc write/read head length (L = 1000 μm) and provides a pressure distribution in full agreement with the generalized Reynolds equation, whereas previously the DSMC check of the validity of the Reynolds equation had been done only for a short (L = 5 μm) drive head. The author suggests degenerating the Reynolds equation to solve the microchannel flow problem in the transitional regime, thus providing a means, with the merit of strict kinetic theory, for testing various methods intended to treat internal MEMS flows.
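
As a minimal illustration of the information-preservation idea described above (each simulated molecule carries, besides its thermal molecular velocity, a preserved macroscopic "information velocity", so cell averages are not swamped by thermal noise at low speeds), the following toy sketch contrasts the two kinds of cell average. The one-cell set-up and the numbers are hypothetical and are not taken from the paper.

```python
import numpy as np

# Illustrative toy: at a low flow speed u0, averaging molecular velocities
# (DSMC-style sampling) is swamped by thermal noise, while averaging the
# preserved "information velocities" carried by each simulated particle
# (the IP idea) recovers u0 with a tiny sample.  Names and numbers here
# are hypothetical.
rng = np.random.default_rng(0)

u0 = 0.2           # bulk flow speed, m/s (low-speed MEMS regime)
v_thermal = 340.0  # thermal velocity scale, m/s
n = 2000           # simulated particles in one cell

molecular_v = u0 + v_thermal * rng.standard_normal(n)  # thermal + drift
information_v = np.full(n, u0)                         # preserved macroscopic info

print("DSMC-style cell average:", molecular_v.mean())   # noisy estimate of u0
print("IP-style cell average:  ", information_v.mean()) # exactly u0 here
```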

Relevance: 40.00%

Abstract:

Along with the growing complexity of logistics chains, the demand for transparency of information has increased. The use of intelligent RFID technology offers the possibility to optimize and control all capacities in use, since it enables the identification and tracking of goods along the entire supply chain. Every single product can be located at any given time, and a multitude of current and historical data can be transferred. The interaction of the flow of material and the flow of information between the various process steps can be optimized by using RFID technology, since it guarantees that all required data are available at the right time and at the right place. The local accessibility and convertibility of data allows a flexible, decentralised control of logistic systems. Additional advantages of RFID components are that they are individually writable and that they can be identified over considerable distances even if there is no line of sight between tag and reader. The use of RFID transponders opens up new potentials regarding process security, reduction of logistic costs and availability of products. These advantages depend on the reliability of the identification processes. The undisputed potentials made accessible by the use of RFID elements can only be beneficial when the information that is decentralised and attached to goods and loading equipment can be reliably retrieved at the required points. The communication between tag and reader can be influenced by different materials, such as metal, that can disturb or complicate the radio contact. The reliability of communication is the subject of various tests and experiments that analyse the effects of different filling materials as well as different alignments of tags on the loading equipment.

Relevance: 30.00%

Abstract:

This paper presents a new theorization of the role of location-based games (LBGs) as potentially playing specific roles in people's access to the culture of cities [22]. An LBG is a game that employs mobile technologies as tools for game play in real-world environments. We argue that, because LBGs are a new genre in the field of mobile entertainment, research in this area tends to be preoccupied with the newness of the technology and its commercial possibilities. However, this overlooks their potential to contribute to cultural production. We argue that this potential lies in the capacity of these experiences to enhance relationships between specific groups and new urban spaces. Given that developers can design LBGs to be played with everyday devices in everyday environments, what new creative opportunities are available to everyday people?

Relevance: 30.00%

Abstract:

In the scope of this study, ‘performance measurement’ includes the collection and presentation of relevant information that reflects progress in achieving organisational strategic aims and meeting the needs of stakeholders such as merchants, importers, exporters and other clients. Evidence shows that utilising information technology (IT) in customs matters supports import and export practices and ensures that supply chain management flows seamlessly. This paper briefly reviews some practical techniques for measuring performance. Its aim is to recommend a model for measuring the performance of information systems (IS): in this case, the Customs Information System (CIS) used by the Royal Malaysian Customs Department (RMCD). The study evaluates the effectiveness of CIS implementation measures in Malaysia from an IT perspective. A model based on IS theories will be used to assess the impact of CIS. The findings of this study recommend measures for evaluating the performance of CIS and its organisational impacts in Malaysia. It is also hoped that the results of the study will assist other Customs administrations in evaluating the performance of their information systems.

Relevance: 30.00%

Abstract:

Complex flow datasets are often difficult to represent in detail using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are novel techniques for visualising such complex, time-dependent flows. In this paper, we review two popular texture-based techniques and their application to flow datasets sourced from real research projects. The texture-based techniques investigated were Line Integral Convolution (LIC) and Image-Based Flow Visualisation (IBFV). We evaluated these techniques and report on their visualisation effectiveness (compared with traditional techniques), their ease of implementation, and their computational overhead.
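
As a rough illustration of how LIC works (each output pixel is the average of an input noise texture sampled along a short streamline traced through the vector field at that pixel), here is a minimal sketch. The Euler streamline tracing, box filter, and toy circular field are simplifying assumptions, not the implementation evaluated in the paper.

```python
import numpy as np

def lic(vx, vy, noise, length=20, step=0.5):
    """Minimal Line Integral Convolution sketch: average a noise texture
    along short streamlines of the field (vx, vy).  Euler tracing and a
    box filter are used for brevity; production LIC uses better kernels."""
    h, w = noise.shape
    out = np.zeros_like(noise)
    for y in range(h):
        for x in range(w):
            acc, cnt = 0.0, 0
            for sgn in (+1.0, -1.0):              # trace both directions
                px, py = float(x), float(y)
                for _ in range(length):
                    i, j = int(round(py)) % h, int(round(px)) % w
                    u, v = vx[i, j], vy[i, j]
                    norm = np.hypot(u, v) or 1.0
                    px += sgn * step * u / norm
                    py += sgn * step * v / norm
                    acc += noise[int(round(py)) % h, int(round(px)) % w]
                    cnt += 1
            out[y, x] = acc / cnt
    return out

# toy circular flow over white noise: streaks follow concentric circles
h = w = 64
yy, xx = np.mgrid[0:h, 0:w]
vx, vy = -(yy - h / 2.0), (xx - w / 2.0)
img = lic(vx, vy, np.random.default_rng(1).random((h, w)))
```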

Relevance: 30.00%

Abstract:

Detailed representations of complex flow datasets are often difficult to generate using traditional vector visualisation techniques such as arrow plots and streamlines. This is particularly true when the flow regime changes in time. Texture-based techniques, which are based on the advection of dense textures, are novel techniques for visualising such flows. We review two popular texture-based techniques and their application to flow datasets sourced from active research projects. The techniques investigated were Line Integral Convolution (LIC) [1] and Image-Based Flow Visualisation (IBFV) [18]. We evaluated these and report on their effectiveness from a visualisation perspective. We also report on their ease of implementation and computational overheads.

Relevance: 30.00%

Abstract:

The aim of the study is to explain how paradise beliefs are born from the viewpoint of the mental functions of the human mind. The focus is on the observation that paradise beliefs across the world are mutually more similar than dissimilar. Using recent theories and results from the cognitive and evolutionary study of religion, as well as from studies of environmental preferences, I suggest that this is because pan-human unconscious motivations, the architecture of the mind, and the way the human mind processes information constrain the possible repertoire of paradise beliefs. The study is divided into two parts, theoretical and empirical. The arguments of the theoretical part are tested in the empirical part with two data sets. The first data set was collected using an Internet survey; the second was derived from literary sources. The first data set tests the assumption that intuitive conceptions of a dream environment generally follow the outlines set by evolved environmental preferences, but that they can be tweaked by modifying the presence of desirable elements. The second data set tests the assumption that familiarity is a dominant factor determining the content of paradise beliefs. The results of the study show that, in addition to the widely studied belief in supernatural agents, belief in supernatural environments wells from the natural functioning of the human mind, attesting to the view that religious thinking and ideas are natural to the human species and are produced by the same mental mechanisms as other cultural information. The results also help us understand that the mental structures behind belief in the supernatural have a wider scope than has previously been acknowledged.

Relevance: 30.00%

Abstract:

Existing process mining techniques provide summary views of the overall process performance over a period of time, allowing analysts to identify bottlenecks and associated performance issues. However, these tools are not designed to help analysts understand how bottlenecks form and dissolve over time, nor how the formation and dissolution of bottlenecks – and the associated fluctuations in demand and capacity – affect the overall process performance. This paper presents an approach to analyze the evolution of process performance via a notion of Staged Process Flow (SPF). An SPF abstracts a business process as a series of queues corresponding to stages. The paper defines a number of stage characteristics and visualizations that collectively allow process performance evolution to be analyzed from multiple perspectives. The approach has been implemented in the ProM process mining framework. The paper demonstrates the advantages of the SPF approach over state-of-the-art process performance mining tools using two publicly available real-life event logs.
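
To make the staged-queue abstraction concrete, the sketch below counts how many cases occupy each stage at a given instant, given an event log with per-stage enter/leave timestamps. The log shape, field names, and the single "queue length per stage" characteristic are illustrative assumptions; the paper's SPF defines a richer set of stage characteristics and visualizations in ProM.

```python
from datetime import datetime

# Hypothetical mini event log: each record is one case passing through one
# stage, with the times it entered and left that stage.
log = [
    {"case": "c1", "stage": "triage", "enter": "2015-01-01 09:00", "leave": "2015-01-01 09:30"},
    {"case": "c1", "stage": "handle", "enter": "2015-01-01 09:30", "leave": "2015-01-01 11:00"},
    {"case": "c2", "stage": "triage", "enter": "2015-01-01 09:10", "leave": "2015-01-01 10:00"},
    {"case": "c3", "stage": "handle", "enter": "2015-01-01 09:45", "leave": "2015-01-01 10:15"},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

def stage_queue_lengths(log, t):
    """Number of cases occupying each stage at instant t."""
    counts = {}
    for rec in log:
        if parse(rec["enter"]) <= t < parse(rec["leave"]):
            counts[rec["stage"]] = counts.get(rec["stage"], 0) + 1
    return counts

print(stage_queue_lengths(log, parse("2015-01-01 09:50")))
# -> {'handle': 2, 'triage': 1}
```

Evaluating this at successive time points gives the per-stage queue-length series whose evolution the SPF visualisations are meant to expose.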

Relevance: 30.00%

Abstract:

Quartz fibre anemometers have been used (as described in subsequent papers) to survey the velocity field of turbulent free-convective air flows. This paper discusses the reasons for the choice of this instrument and provides the background information for its use in this way. Some practical points concerning fibre anemometers are mentioned. The rest of the paper is a theoretical study of the response of a fibre to a turbulent flow. An approximate representation of the force on the fibre due to the velocity field, and the equation for a bending beam representing the response to this force, form the basis of a consideration of the mean and fluctuating displacement of the fibre. Emphasis is placed on the behaviour when the spectrum of the turbulence lies largely at frequencies low enough for the fibre to respond effectively instantaneously (as this corresponds to the practical situation). Incomplete correlation of the turbulence along the length of the fibre is taken into account. Brief mention is made of the theory of the higher-frequency (resonant) response in the context of an experimental check on the applicability of the low-frequency theory.
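
The "equation for a bending beam" referred to above is presumably of the standard Euler-Bernoulli form, with the aerodynamic force entering as a distributed load per unit length; the form below is given as an assumption about what is meant, not as the paper's exact formulation.

```latex
% Euler-Bernoulli beam under a distributed aerodynamic load f(x,t):
% E  - Young's modulus,  I - second moment of area of the fibre cross-section,
% \rho A - mass per unit length,  y(x,t) - transverse fibre displacement.
\[
  E I \,\frac{\partial^{4} y}{\partial x^{4}}
  + \rho A \,\frac{\partial^{2} y}{\partial t^{2}}
  = f(x,t)
\]
% In the quasi-static (low-frequency) limit emphasised in the abstract, the
% inertial term is negligible and  E I\, \partial^{4} y/\partial x^{4} \approx f(x,t).
```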

Relevance: 30.00%

Abstract:

A simple method employing an optical probe is presented to measure density variations in a hypersonic flow obstructed by a test model in a typical shock tunnel. The probe uses a plane light wave that trans-illuminates the flow and casts a shadow of a random dot pattern. Local slopes of the distorted wavefront are obtained from shifts of the dots in the pattern. The local shifts are accurately measured by cross-correlating the shifted local shadows with the corresponding unshifted originals. The measured slopes are unwrapped using a discrete cosine transform based phase unwrapping procedure as well as iterative procedures. The unwrapped phase information is then used in an iterative scheme for full quantitative recovery of the density distribution in the shock around the model through refraction tomographic inversion. Hypersonic flow field parameters around a missile-shaped body at a free-stream Mach number of 5.8 measured using this technique are compared with numerically estimated values. (C) 2014 Society of Photo-Optical Instrumentation Engineers (SPIE)
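
The dot-shift measurement described above can be illustrated with a standard FFT-based cross-correlation of a distorted sub-window against its undistorted reference: the location of the correlation peak gives the local shift. The window size, sparse-dot model, and integer-pixel peak search below are illustrative simplifications, not the paper's processing chain.

```python
import numpy as np

def window_shift(ref, shifted):
    """Estimate the integer-pixel shift between two equally sized windows
    by locating the peak of their FFT-based cross-correlation."""
    # correlate: conj(FFT(ref)) * FFT(shifted), back-transform, find peak
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(shifted)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak coordinates into the signed shift range
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))   # (dy, dx)

# synthetic check: a random dot pattern rolled by (3, -2) pixels
rng = np.random.default_rng(2)
ref = (rng.random((64, 64)) > 0.9).astype(float)     # sparse random dots
shifted = np.roll(ref, (3, -2), axis=(0, 1))
print(window_shift(ref, shifted))                     # expected (3, -2)
```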

Relevance: 30.00%

Abstract:

We consider a server serving a time-slotted queued system of multiple packet-based flows, where at most one flow can be serviced in a single time slot. The flows have exogenous packet arrivals and time-varying service rates. At each time, the server can observe instantaneous service rates for only a subset of flows (selected from a fixed collection of observable subsets) before scheduling a flow in the subset for service. We are interested in queue-length-aware scheduling to keep the queues short. The limited availability of instantaneous service rate information requires the scheduler to make a careful choice of which subset of service rates to sample. We develop scheduling algorithms that use only partial service rate information from subsets of channels, and that minimize the likelihood of queue overflow in the system. Specifically, we present a new joint subset-sampling and scheduling algorithm called Max-Exp that uses only the current queue lengths to pick a subset of flows, and subsequently schedules a flow using the Exponential rule. When the collection of observable subsets is disjoint, we show that Max-Exp achieves the best exponential decay rate of the tail of the longest queue among all scheduling algorithms that base their decision on the current (or any finite past history of the) system state. To accomplish this, we employ novel analytical techniques for studying the performance of scheduling algorithms using partial state, which may be of independent interest. These include new sample-path large deviations results for processes obtained by non-random, predictable sampling of sequences of independent and identically distributed random variables. A consequence of these results is that scheduling with partial state information yields a rate function significantly different from scheduling with full channel information. In the special case when the observable subsets are singleton flows, i.e., when there is effectively no a priori channel state information, Max-Exp reduces to simply serving the flow with the longest queue; thus, our results show that always serving the longest queue, in the absence of any channel state information, is large-deviations optimal.
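
For reference, one common statement of the Exponential rule mentioned above (in the spirit of Shakkottai and Stolyar's scheduling rule) weights each flow's instantaneous service rate by an exponential of how far its weighted queue exceeds the average. The sketch below applies that rule to the observed subset only; the weights and normalisation vary across the literature and the Max-Exp subset-selection step is not reproduced, so treat the exact form as an assumption rather than the paper's definition.

```python
import math

def exponential_rule(queues, rates, a=None, gamma=None):
    """Pick the flow index to serve under (one common form of) the
    Exponential rule: weight the instantaneous service rate by an
    exponential of how far the flow's queue is above the average.
    `queues` and `rates` are per-flow lists for the observed subset."""
    n = len(queues)
    a = a or [1.0] * n          # per-flow queue weights (assumed)
    gamma = gamma or [1.0] * n  # per-flow rate weights (assumed)
    avg = sum(ai * qi for ai, qi in zip(a, queues)) / n
    scores = [
        gamma[i] * rates[i]
        * math.exp((a[i] * queues[i] - avg) / (1.0 + math.sqrt(avg)))
        for i in range(n)
    ]
    return max(range(n), key=lambda i: scores[i])

# toy example: flow 1 has a much longer queue, so it wins despite a lower rate
print(exponential_rule(queues=[5, 40, 10], rates=[3.0, 1.5, 2.5]))  # -> 1
```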

Relevance: 30.00%

Abstract:

Rarefied gas flows through micro-channels are simulated using two particle approaches, namely the information preservation (IP) method and the direct simulation Monte Carlo (DSMC) method. In simulating low-speed flows in long micro-channels, the DSMC method encounters the problem of a large sample size demand and the difficulty of regulating boundary conditions at the inlet and outlet. Some important computational issues in the calculation of long micro-channel flows with the IP method are addressed, such as the use of the conservative form of the mass conservation equation to guarantee the adjustment of the inlet and outlet boundary conditions, and a super-relaxation scheme to accelerate the convergence process. Stream-wise pressure distributions and mass fluxes through micro-channels given by the IP method agree well with experimental data measured in long micro-channels by Pong et al. (with a height-to-length ratio of 1.2:3000), Shih et al. (1.2:4800), and Arkilic et al. and Arkilic (1.3:7500), respectively. The famous Knudsen minimum of the normalized mass flux is observed in IP and DSMC calculations of a short micro-channel over the entire flow regime from continuum to free molecular, whereas the slip Navier-Stokes solution fails to predict it.
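
For context, the slip Navier-Stokes solution referred to above is commonly written, for isothermal flow in a long shallow channel with first-order slip and full accommodation (e.g., in the form popularised by Arkilic et al.), as the mass flow below; the exact coefficients should be treated as an assumption here rather than as the expression used in the paper.

```latex
% First-order slip mass flow through a long shallow channel
% (height H, width W, length L), isothermal gas at temperature T,
% viscosity \mu, specific gas constant R, outlet pressure p_o,
% pressure ratio \Pi = p_i/p_o, outlet Knudsen number Kn_o:
\[
  \dot{m} \;=\; \frac{H^{3} W p_{o}^{2}}{24\,\mu L R T}
  \left[\,\Pi^{2} - 1 \;+\; 12\,Kn_{o}\,\bigl(\Pi - 1\bigr)\right]
\]
```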

Relevance: 30.00%

Abstract:

This paper presents a measurement of flow patterns and flow velocities of gas-water two-phase flows based on the technique of electrical resistance tomography (ERT) in a 40 m horizontal flow loop. A single-plane and a dual-plane ERT sensor based on the conductive ring technique were used to gather sufficient information for the determination of flow characteristics, particularly flow pattern recognition and air cavity velocity measurement. A fast data collection strategy was applied to the dual-plane ERT sensor and an iterative algorithm was used for image reconstruction. Results, in respect of flow patterns and velocity maps, are reported.
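
A common way to turn dual-plane signals like these into an axial velocity is transit-time cross-correlation: the downstream plane sees a delayed copy of the upstream conductivity signal, and the lag of the correlation peak divided into the plane spacing gives the velocity. The sketch below illustrates this with synthetic signals; the sampling rate, plane spacing, and signal model are hypothetical and are not taken from the paper.

```python
import numpy as np

def transit_time_velocity(upstream, downstream, dt, spacing):
    """Estimate axial velocity from two plane-averaged tomography signals:
    find the lag (in samples) that maximises their cross-correlation and
    convert it to a velocity using the sensor-plane spacing."""
    up = upstream - upstream.mean()
    down = downstream - downstream.mean()
    corr = np.correlate(down, up, mode="full")   # lags -(N-1) .. (N-1)
    lag = np.argmax(corr) - (len(up) - 1)        # samples of delay
    return spacing / (lag * dt) if lag > 0 else float("nan")

# synthetic check: downstream signal is the upstream one delayed by 25 samples
rng = np.random.default_rng(3)
up = rng.random(1000)
delay = 25
down = np.concatenate([np.zeros(delay), up[:-delay]])
dt, spacing = 0.001, 0.05                        # 1 kHz sampling, planes 50 mm apart
print(transit_time_velocity(up, down, dt, spacing))  # ~ 0.05 / 0.025 = 2.0 m/s
```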