134 results for Religions (Proposed, universal, etc.)


Relevance: 20.00%

Publisher:

Abstract:

A new immobilized flat-plate photocatalytic reactor for wastewater treatment is proposed in this study to avoid subsequent catalyst removal from the treated water. The reactor consists of an inlet section, a reactive section coated with catalyst, and an outlet section. To optimize fluid mixing and reactor design, this study investigates the influence of baffles and their arrangement on the hydrodynamics of the flat-plate reactor using computational fluid dynamics (CFD) simulation. For the simulation, an array of baffles acting as turbulence promoters is inserted in the reactive zone of the reactor. Results from simulating the hydrodynamics of a baffled flat-plate photoreactor for different baffle positions, heights, and intervals are presented using the RNG k-ε turbulence model. Under the conditions simulated, qualitative flow features, such as the development and separation of boundary layers, vortex formation, the presence of high-shear regions and recirculation zones, and the underlying mechanisms, are examined. The influence of various baffle sizes on the distribution of pollutant concentration is also highlighted. The results indicate that the spanning of recirculation increases the degree of interfacial distortion, producing a larger interfacial area between fluids and thereby substantially enhancing fluid mixing. The simulation results suggest that both qualitative and quantitative properties of the fluid dynamics in a baffled reactor can be obtained, providing valuable insight into the effect of baffles and their arrangement on the flow pattern and behaviour.
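Before applying a turbulence model such as RNG k-ε, one would normally confirm that the channel flow is actually turbulent. A minimal sketch of that check, with entirely hypothetical reactor dimensions and operating conditions (the abstract gives none):

```python
# Quick check (illustrative, hypothetical dimensions) that flow in a
# flat-plate reactor channel is turbulent enough for a RANS closure
# such as RNG k-epsilon to be appropriate.

def hydraulic_diameter(width_m: float, height_m: float) -> float:
    """Hydraulic diameter D_h = 4A/P for a rectangular duct."""
    area = width_m * height_m
    perimeter = 2.0 * (width_m + height_m)
    return 4.0 * area / perimeter

def reynolds_number(rho: float, velocity: float, d_h: float, mu: float) -> float:
    """Re = rho * U * D_h / mu."""
    return rho * velocity * d_h / mu

# Hypothetical channel: 100 mm wide, 10 mm deep, water at 20 C, 0.5 m/s.
d_h = hydraulic_diameter(0.100, 0.010)
re = reynolds_number(rho=998.0, velocity=0.5, d_h=d_h, mu=1.0e-3)
regime = "turbulent" if re > 4000 else "laminar/transitional"
print(f"D_h = {d_h * 1000:.1f} mm, Re = {re:.0f} ({regime})")
```

The baffles act on top of this: even at a given Re, they trigger the boundary-layer separation and recirculation zones the abstract describes.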

Relevance: 20.00%

Publisher:

Abstract:

Bana et al. proposed the formal indistinguishability relation (FIR), i.e. an equivalence between two terms built from an abstract algebra. Later, Ene et al. extended it to cover active adversaries and random oracles. This notion enables a framework for verifying computational indistinguishability while still offering the simplicity and formality of symbolic methods. We are in the process of building an automated tool for checking FIR between two terms. First, we extend the work of Ene et al. further, covering ordered sorts and simplifying the treatment of random oracles. Second, we investigate the possibility of combining algebras, since this makes the tool scalable and able to cover a wide class of cryptographic schemes. In particular, we show that the combined algebra remains computationally sound as long as each component algebra is sound. Third, we design proving strategies and implement the tool. Essentially, the strategies allow us to find a sequence of intermediate terms, each formally indistinguishable from the next, between two given terms; FIR between the two given terms is then guaranteed by the transitivity of FIR. Finally, we show applications of the work, e.g. to key exchanges and encryption schemes. In the future, the tool should extend easily to cover many schemes. This work continues our previous research on the use of compilers to aid automated proofs for key exchange.
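The "sequence of intermediate terms" strategy can be sketched as a graph search: each rewrite step replaces a subterm with an indistinguishable one, and transitivity of FIR justifies the whole chain. In this toy sketch terms are plain strings and the two rules are hypothetical stand-ins for soundness lemmas (e.g. a ciphertext under a fresh key is indistinguishable from a random value); it is not the actual tool described above.

```python
from collections import deque

# Hypothetical one-step indistinguishability rules (illustrative only):
RULES = [
    ("enc(m,k)", "r1"),   # IND-CPA-style step: ciphertext ~ random value
    ("h(r1)", "r2"),      # random-oracle output on a fresh input ~ random
]

def neighbours(term):
    """All terms reachable by applying one rule (in either direction) once."""
    out = []
    for lhs, rhs in RULES:
        for a, b in ((lhs, rhs), (rhs, lhs)):
            idx = term.find(a)
            while idx != -1:
                out.append(term[:idx] + b + term[idx + len(a):])
                idx = term.find(a, idx + 1)
    return out

def fir_chain(start, goal):
    """BFS for a chain of intermediate terms; FIR then follows by transitivity."""
    queue, parent = deque([start]), {start: None}
    while queue:
        term = queue.popleft()
        if term == goal:
            chain = []
            while term is not None:
                chain.append(term)
                term = parent[term]
            return chain[::-1]
        for nxt in neighbours(term):
            if nxt not in parent:
                parent[nxt] = term
                queue.append(nxt)
    return None

print(fir_chain("pair(enc(m,k),h(enc(m,k)))", "pair(r1,r2)"))
```

Each adjacent pair in the returned chain differs by one sound rewrite, so the endpoints are formally indistinguishable by transitivity.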

Relevance: 20.00%

Publisher:

Abstract:

While recent research has provided valuable information on the composition of laser printer particles and their formation mechanisms, and has explained why some printers are emitters whilst others are low emitters, fundamental questions relating to the potential exposure of office workers remain unanswered. In particular: (i) what impact does the operation of laser printers have on the background particle number concentration (PNC) of an office environment over the duration of a typical working day? (ii) what is the airborne particle exposure of office workers in the vicinity of laser printers? (iii) what influence does the office ventilation have upon the transport and concentration of particles? (iv) is there a need to control the generation and/or transport of particles arising from the operation of laser printers within an office environment? (v) what instrumentation and methodology are relevant for characterising such particles within an office location? We present experimental evidence on temporal and spatial printer PNC during the operation of 107 laser printers within open-plan offices of five buildings. We show for the first time that the eight-hour time-weighted average printer particle exposure is significantly less than the eight-hour time-weighted local background particle exposure, but that peak printer particle exposure can be more than two orders of magnitude higher than local background particle exposure. The particle size range is predominantly ultrafine (<100 nm diameter). In addition, we have established that office workers are constantly exposed to non-printer-derived particle concentrations, with up to an order of magnitude difference in such exposure amongst offices, and we propose that this exposure be controlled along with exposure to printer-derived particles.
We also propose, for the first time, that peak particle reference values be calculated for each office area, analogous to the criteria used in Australia and elsewhere for evaluating exposure excursions above occupational hazardous chemical exposure standards. A universal peak particle reference value of 2.0 × 10⁴ particles cm⁻³ is proposed.
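The two exposure metrics in this abstract, the eight-hour time-weighted average and the peak excursion against the proposed reference value, are simple to compute. A sketch with entirely synthetic numbers (not the study's data):

```python
# Illustrative calculation (synthetic numbers) of an eight-hour
# time-weighted average PNC and a check against the proposed universal
# peak particle reference value of 2.0e4 particles/cm^3.

PEAK_REFERENCE = 2.0e4  # particles cm^-3, as proposed in the abstract

def time_weighted_average(samples):
    """samples: list of (duration_hours, concentration) pairs."""
    total_time = sum(d for d, _ in samples)
    return sum(d * c for d, c in samples) / total_time

# Synthetic 8-hour day: low background with two short printing bursts.
day = [
    (3.0, 4.0e3),    # morning background
    (0.1, 8.0e4),    # printing burst
    (2.9, 5.0e3),    # midday background
    (0.1, 1.5e5),    # second burst
    (1.9, 4.5e3),    # afternoon background
]

twa = time_weighted_average(day)
peak = max(c for _, c in day)
print(f"8-h TWA = {twa:.0f} cm^-3, peak = {peak:.0f} cm^-3, "
      f"peak exceeds reference: {peak > PEAK_REFERENCE}")
```

This illustrates the abstract's key observation: short printing bursts can dominate the peak (here well above the reference value) while barely moving the eight-hour average.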

Relevance: 20.00%

Publisher:

Abstract:

In team sports such as rugby union, a myriad of decisions and actions occur within the boundaries that compose the performance perceptual-motor workspace. The way that these performance boundaries constrain decision making and action has recently interested researchers and has involved developing an understanding of the concept of constraints. Considering team sports as complex dynamical systems signifies that they are composed of multiple independent agents (i.e. individual players) whose interactions are highly integrated. This level of complexity is characterized by the multiple ways in which players on a rugby field can interact. It affords the emergence of rich patterns of behaviour, such as rucks, mauls, and collective tactical actions, which emerge from players' adjustments to dynamically varying competition environments. During performance, the decisions and actions of each player are constrained by multiple causes (e.g. technical and tactical skills, emotional states, plans, thoughts, etc.) that generate multiple effects (e.g. to run or pass, to move forward to tackle or to hold position and drive the opponent to the line), a prime feature of a complex systems approach to team games performance (Bar-Yam, 2004). To build a bridge between the complexity sciences and learning design in team sports like rugby union, the aim of practice sessions is to prepare players to pick up and explore the information available in the multiple constraints (i.e. the causes) that influence performance. Therefore, learning design in training sessions should be soundly based on the interactions amongst players (i.e. teammates and opponents) that will occur in rugby matches. To improve individual and collective decision making in rugby union, Passos and colleagues have proposed a performer-environment interaction-based approach rather than a traditional performer-based approach (Passos, Araújo, Davids & Shuttleworth, 2008).

Relevance: 20.00%

Publisher:

Abstract:

Texture analysis and textural cues have been applied to image classification, segmentation, and pattern recognition. Dominant texture descriptors include directionality, coarseness, line-likeness, etc. In this dissertation a class of textures known as particulate textures is defined: textures that are predominantly coarse or blob-like. The set of features that characterise particulate textures differs from that characterising classical textures; these features are micro-texture, macro-texture, size, shape, and compaction. Classical texture analysis techniques do not adequately capture particulate texture features. This gap is identified and new methods for analysing particulate textures are proposed. The levels of complexity in particulate textures are also presented, ranging from the simplest images, where blob-like particles are easily isolated from their background, to more complex images, where the particles and the background are not easily separable or the particles are occluded. Simple particulate images can be analysed for particle shapes and sizes. Complex particulate texture images, on the other hand, often permit only the estimation of particle dimensions. Real-life applications of particulate textures are reviewed, including applications to sedimentology, granulometry, and road surface texture analysis. A new framework for the computation of particulate shape is proposed. A granulometric approach for particle size estimation based on edge detection is developed, which can be adapted to the gray level of the images by varying its parameters. This study binds visual texture analysis and road surface macrotexture in a theoretical framework, making it possible to apply monocular imaging techniques to road surface texture analysis.
Results from applying the developed algorithm to road surface macrotexture are compared with results based on Fourier spectra, the autocorrelation function, and wavelet decomposition, indicating the superior performance of the proposed technique. The influence of image acquisition conditions such as illumination and camera angle on the results was systematically analysed. Experimental data were collected from over 5 km of road in Brisbane, and the estimated coarseness along the road was compared with laser profilometer measurements. A coefficient of determination R² exceeding 0.9 was obtained when correlating the proposed imaging technique with the state-of-the-art Sensor Measured Texture Depth (SMTD) obtained using laser profilometers.
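The final validation step, correlating image-based coarseness with profilometer SMTD via R², is a standard least-squares computation. A sketch with hypothetical paired measurements (the dissertation's data are not reproduced here):

```python
# Coefficient of determination R^2 for a least-squares linear fit,
# as used to validate an imaging estimate against profilometer SMTD.

def r_squared(x, y):
    """R^2 of the least-squares line y ~ a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical paired measurements (image estimate vs. SMTD, arbitrary units).
image_estimate = [0.41, 0.55, 0.62, 0.78, 0.90, 1.05]
smtd = [0.45, 0.52, 0.66, 0.75, 0.93, 1.02]

r2 = r_squared(image_estimate, smtd)
print(f"R^2 = {r2:.3f}")
```

An R² above 0.9, as reported in the abstract, indicates that the imaging estimate explains most of the variance in the laser-measured texture depth.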

Relevance: 20.00%

Publisher:

Abstract:

As the service-oriented architecture (SOA) paradigm has become ever more popular, different standardization efforts have been proposed by various consortia to enable interaction among heterogeneous environments through this paradigm. This chapter overviews the most prevalent of these SOA efforts. It first shows how technical services can be described, how they can interact with each other, and how they can be discovered by users. Next, the chapter presents different standards that facilitate service composition and the design of service-oriented environments in light of a universal understanding of service orientation. The chapter concludes with a summary and a discussion of the limitations of the reviewed standards with respect to their ability to describe service properties. This paves the way for the following chapters, where the USDL standard, which aims to lift such limitations, is presented.

Relevance: 20.00%

Publisher:

Abstract:

In many applications, e.g. bioinformatics, web access traces, and system utilisation logs, data naturally take the form of sequences. There is great interest in analysing sequential data to find its inherent characteristics and relationships. Sequential association rule mining is one method for analysing such data. Because conventional sequential association rule mining very often generates a huge number of association rules, many of which are redundant, it is desirable to eliminate those unnecessary rules. Owing to the complexity and temporally ordered nature of sequential data, current research on sequential association rule mining is limited. Although several sequential association rule prediction models using either sequence constraints or temporal constraints have been proposed, none of them considers the redundancy problem in rule mining. The main contribution of this research is a non-redundant association rule mining method based on closed frequent sequences and minimal sequential generators. We also give a definition of non-redundant sequential rules: sequential rules with minimal antecedents but maximal consequents. A new algorithm called CSGM (closed sequential and generator mining) for generating closed sequences and minimal sequential generators is also introduced. A further experiment compares the performance of generating non-redundant sequential rules with that of generating full sequential rules, and the performance of CSGM is evaluated against other closed sequential pattern mining and generator mining algorithms. Finally, we use the generated non-redundant sequential rules for query expansion to improve recommendations for infrequently purchased products.
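The redundancy criterion, keep only rules with minimal antecedents and maximal consequents, can be sketched as a simple filter over candidate rules. This toy sketch is not the CSGM algorithm itself; it assumes the candidate rules have comparable support and confidence, and represents sequences as tuples of items:

```python
# Toy filter for non-redundant sequential rules: a rule is redundant if
# another rule has a smaller-or-equal antecedent and a larger-or-equal
# consequent (assuming comparable support/confidence).

def is_subsequence(small, big):
    """True if `small` occurs in `big` preserving order (not contiguity)."""
    it = iter(big)
    return all(item in it for item in small)

def non_redundant(rules):
    """rules: list of (antecedent, consequent) pairs of item tuples."""
    keep = []
    for ant, cons in rules:
        redundant = any(
            (ant2, cons2) != (ant, cons)
            and is_subsequence(ant2, ant)      # smaller (or equal) antecedent
            and is_subsequence(cons, cons2)    # larger (or equal) consequent
            for ant2, cons2 in rules
        )
        if not redundant:
            keep.append((ant, cons))
    return keep

rules = [
    (("a",), ("b", "c")),       # minimal antecedent, maximal consequent
    (("a", "d"), ("b", "c")),   # redundant: larger antecedent, same consequent
    (("a",), ("b",)),           # redundant: same antecedent, smaller consequent
]
print(non_redundant(rules))
```

Only the first rule survives: it predicts at least as much from no more information than the other two.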

Relevance: 20.00%

Publisher:

Abstract:

Travel time is an important network performance measure, quantifying congestion in a manner easily understood by all transport users. In urban networks, travel time estimation is challenging for a number of reasons, such as fluctuations in traffic flow due to traffic signals and significant flow to/from mid-link sinks/sources. The classical analytical procedure utilizes cumulative plots at upstream and downstream locations to estimate travel time between the two locations. In this paper, we discuss the issues and challenges of the classical analytical procedure, such as its vulnerability to non-conservation of flow between the two locations, and the complexity of estimating exit-movement-specific travel time. Recently, we developed a methodology utilising the classical procedure to estimate average travel time and its statistics on urban links, in which detector, signal, and probe vehicle data are fused (Bhaskar, Chung et al. 2010). In this paper we extend the methodology to route travel time estimation and test its performance using simulation. The originality lies in defining cumulative plots for each exit turning movement utilising a historical database that is self-updated after each estimation. The performance is also compared with a method based solely on probes (Probe-only). The performance of the proposed methodology is insensitive to different route flows, with an average accuracy of more than 94% given one probe per estimation interval, which is more than a 5% improvement in accuracy with respect to the Probe-only method.
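The classical cumulative-plot procedure mentioned above can be sketched directly: the travel time of the n-th vehicle is the horizontal distance between the upstream and downstream cumulative count curves at count n. A minimal sketch with synthetic data (assuming first-in-first-out and conservation of flow, exactly the assumptions whose violation the paper discusses):

```python
# Classical cumulative-plot travel time: time of n-th vehicle past the
# downstream detector minus its time past the upstream detector.

def crossing_time(times, counts, n):
    """Time at which the cumulative count reaches n (linear interpolation)."""
    for i in range(1, len(times)):
        if counts[i] >= n:
            frac = (n - counts[i - 1]) / (counts[i] - counts[i - 1])
            return times[i - 1] + frac * (times[i] - times[i - 1])
    raise ValueError("count n never reached")

# Synthetic link: vehicles arrive steadily upstream; every vehicle takes
# exactly 60 s to reach the downstream detector.
t_up = [0, 30, 60, 90, 120]
n_up = [0, 10, 20, 30, 40]
t_down = [60, 90, 120, 150, 180]
n_down = [0, 10, 20, 30, 40]

tts = [crossing_time(t_down, n_down, n) - crossing_time(t_up, n_up, n)
       for n in range(1, 40)]
avg_tt = sum(tts) / len(tts)
print(f"average travel time = {avg_tt:.1f} s")
```

When flow is not conserved between the detectors (mid-link sinks/sources), the two curves drift apart and this estimate degrades, which is precisely why the paper fuses detector, signal, and probe data.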

Relevance: 20.00%

Publisher:

Abstract:

Overall, computer models and simulations have a rather disappointing record within the management sciences as a tool for predicting the future. Social and market environments can be influenced by an overwhelming number of variables, and it is therefore difficult to use computer models to make forecasts or to test hypotheses concerning the relationship between individual behaviours and macroscopic outcomes. At the same time, however, advocates of computer models argue that they can be used to overcome the human mind's inability to cope with several complex variables simultaneously or to understand concepts that are highly counterintuitive. This paper seeks to bridge the gap between these two perspectives by suggesting that management research can indeed benefit from computer models by using them to formulate fruitful hypotheses.

Relevance: 20.00%

Publisher:

Abstract:

Many existing schemes for malware detection are signature-based. Although they can effectively detect known malware, they cannot detect variants of known malware or new malware. Most network servers, such as online shopping malls, Picasa, YouTube, and Blogger, do not expect executable code in their inbound network traffic. Such network applications can therefore be protected from malware infection by monitoring their ports to see whether incoming packets contain any executable content. This paper proposes a content-classification scheme that identifies executable content in incoming packets. The proposed scheme analyzes the packet payload in two steps: it first checks whether the payload contains multimedia-type data; if not, it classifies the payload as either text-type or executable. Although in our experiments the proposed scheme shows low rates of false negatives and false positives (4.69% and 2.53%, respectively), the remaining inaccuracies still require further inspection to efficiently detect the occurrence of malware. In this paper, we also propose simple statistical and combinatorial analyses to deal with false positives and negatives.
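The two-step payload check can be sketched with well-known file magic bytes. This is a minimal illustration of the idea, not the paper's classifier; a real system would inspect far more formats and deeper structure than a prefix match:

```python
# Minimal two-step payload classifier using well-known magic bytes.

MULTIMEDIA_MAGIC = (
    b"\xff\xd8\xff",          # JPEG
    b"\x89PNG\r\n\x1a\n",     # PNG
    b"GIF8",                  # GIF87a / GIF89a
)
EXECUTABLE_MAGIC = (
    b"MZ",                    # Windows PE/DOS executable
    b"\x7fELF",               # Linux/Unix ELF binary
)

def classify_payload(payload: bytes) -> str:
    # Step 1: multimedia-type data?
    if payload.startswith(MULTIMEDIA_MAGIC):
        return "multimedia"
    # Step 2: executable or text?
    if payload.startswith(EXECUTABLE_MAGIC):
        return "executable"
    if all(32 <= b < 127 or b in (9, 10, 13) for b in payload):
        return "text"
    return "unknown"

print(classify_payload(b"MZ\x90\x00\x03"))            # PE header fragment
print(classify_payload(b"GET /index.html HTTP/1.1"))  # plain-text request
```

Prefix checks alone miss executables embedded mid-payload or obfuscated, which is one source of the false negatives the abstract reports and a reason further statistical analysis is proposed.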

Relevance: 20.00%

Publisher:

Abstract:

Review of Coping with Choices to Die, by C. G. Prado. Cambridge: Cambridge University Press, 2011.

Relevance: 20.00%

Publisher:

Abstract:

Thin solid films are extensively used in solar cells, cutting tools, magnetic recording devices, etc. As a result, accurate measurement of the mechanical properties of thin films, such as hardness and elastic modulus, is required. The thickness of thin films normally varies from tens of nanometers to several micrometers, which makes measuring their mechanical properties challenging. In this study, a nanoscratch method is proposed for hardness measurement. A three-dimensional finite element method (3-D FEM) model was developed to validate the nanoscratch method and to understand the substrate effect during nanoscratching. Nanoindentation was also used for comparison. The nanoscratch method was demonstrated to be valuable for measuring the hardness of thin solid films.
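For illustration of scratch-based hardness, one common definition (e.g. in ASTM G171-style scratch testing) takes the load-bearing area as the front half of the contact circle, giving H_s = 8P/(πw²), where P is the normal load and w the residual scratch width. Whether this matches the definition used in the study above is an assumption here; the numbers below are hypothetical.

```python
import math

def scratch_hardness_gpa(normal_load_n: float, scratch_width_m: float) -> float:
    """Scratch hardness H_s = 8P / (pi * w^2), returned in GPa.

    Assumes the ASTM G171-style half-circle load-bearing area; this is
    an illustrative convention, not necessarily the study's definition.
    """
    h_pa = 8.0 * normal_load_n / (math.pi * scratch_width_m ** 2)
    return h_pa / 1e9

# Hypothetical nanoscratch on a thin film: 10 mN load, 2 um scratch width.
hs = scratch_hardness_gpa(10e-3, 2e-6)
print(f"H_s = {hs:.2f} GPa")
```

For very thin films, the measured value also reflects the substrate (the "substrate effect" the 3-D FEM model investigates), so scratch depth is normally kept to a small fraction of the film thickness.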