703 results for patchy environments
Abstract:
The phase diagram of a simple model with two patches of type A and ten patches of type B (2A10B) on the face centred cubic lattice has been calculated by simulations and theory. Assuming that there is no interaction between the B patches, the behavior of the system can be described in terms of the ratio of the AB and AA interactions, r. Our results show that, similarly to what happens for related off-lattice and two-dimensional lattice models, the liquid-vapor phase equilibria exhibit reentrant behavior for some values of the interaction parameters. However, for the model studied here the liquid-vapor phase equilibria occur for values of r lower than 1/3, a threshold value which was previously thought to be universal for 2AnB models. In addition, the theory predicts that below r = 1/3 (and above a new condensation threshold which is < 1/3) the reentrant liquid-vapor equilibria are so extreme that they exhibit a closed loop with a lower critical point, a very unusual behavior in single-component systems. An order-disorder transition is also observed at higher densities than the liquid-vapor equilibria, which shows that the liquid-vapor reentrancy occurs in an equilibrium region of the phase diagram. These findings may have implications for the understanding of the condensation of dipolar hard spheres, given the analogy between that system and the 2AnB models considered here. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4771591]
Abstract:
We investigate the phase behaviour of 2D mixtures of bi-functional and tri-functional patchy particles and of 3D mixtures of bi-functional and tetra-functional patchy particles by means of Monte Carlo simulations and Wertheim theory. We start by computing the critical points of the pure systems and then investigate how the critical parameters change upon lowering the temperature. We extend the successive umbrella sampling method to mixtures, making it possible to extract information about the phase behaviour of the system at a fixed temperature over the whole range of densities and compositions of interest. (C) 2013 AIP Publishing LLC.
Abstract:
II European Conference on Curriculum Studies. "Curriculum studies: Policies, perspectives and practices". Porto, FPCEUP, October 16th–17th.
Abstract:
Doctoral thesis, Marine Sciences, specialty in Marine Biology, 18 December 2015, Universidade dos Açores.
Abstract:
Nanotechnology is an important emerging industry with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although there are advantages to the use of these nano-scale materials, questions related to their impact on the environment and human health must also be addressed, so that potential risks can be limited at early stages of development. At this time, occupational health risks associated with the manufacturing and use of nanoparticles are not yet clearly understood. However, workers may be exposed to nanoparticles through inhalation at levels that can greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may not be adequate here, as nanoparticles are characterized by a very large surface area, which has been pointed out as the distinctive characteristic that could even turn an inert substance into one exhibiting very different interactions with biological fluids and cells. Therefore, assessing human exposure based on the mass concentration of particles, an approach widely adopted for particles over 1 μm, would not work in this particular case. In fact, nanoparticles have far more surface area than the equivalent mass of larger particles, which increases the chance that they may react with body tissues. Thus, it has been claimed that surface area should be used as the metric for nanoparticle exposure and dosing. As a result, assessing exposure based on the measurement of particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects. If nanoparticles can deposit in the lung and remain there, have an active surface chemistry and interact with the body, then there is potential for exposure.
It has been shown that surface area plays an important role in the toxicity of nanoparticles and that this is the metric that best correlates with particle-induced adverse health effects. The potential for adverse health effects appears to be directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation and use of nanomaterials.
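The surface-area argument above can be made concrete with a back-of-the-envelope calculation: for monodisperse spheres of density ρ, the surface area per unit mass is 6/(ρd), so it grows as the diameter shrinks. A minimal sketch, with hypothetical density and diameter values chosen purely for illustration:

```python
# Specific surface area of spherical particles: area/mass = 6/(rho*d).
# Density and diameters below are hypothetical, for illustration only.

def specific_surface_area(d_m, rho=2000.0):
    """Surface area per unit mass (m^2/kg) for monodisperse spheres of diameter d_m."""
    return 6.0 / (rho * d_m)

nano = specific_surface_area(10e-9)   # a 10 nm particle
micro = specific_surface_area(1e-6)   # a 1 um particle
print(nano / micro)                   # -> 100.0: 100x more area per unit mass
```

This factor-of-100 difference for the same airborne mass is exactly why mass-based exposure limits are argued to be inadequate for nanoparticles.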
Abstract:
We investigate the influence of strong directional, or bonding, interactions on the phase diagram of complex fluids, and in particular on the liquid-vapour critical point. To this end we revisit a simple model and theory for associating fluids which consist of spherical particles having a hard-core repulsion, complemented by three short-ranged attractive sites on the surface (sticky spots). Two of the spots are of type A and one is of type B; the interactions between each pair of spots have strengths εAA, εAB and εBB. The theory is applied over the whole range of bonding strengths and the results are interpreted in terms of the equilibrium cluster structures of the coexisting phases. In systems where unlike sites do not interact (i.e. where εAB = 0), the critical point exists all the way down to εBB → 0. By contrast, when εBB = 0, there is no critical point below a certain finite value of εAB. These somewhat surprising results are rationalised in terms of the different network structures of the two systems: two long AA chains are linked by one BB bond (X-junction) in the former case, and by one AB bond (Y-junction) in the latter. The vapour-liquid transition may then be viewed as the condensation of these junctions and we find that X-junctions condense for any attractive εBB (i.e. for any fraction of BB bonds), whereas condensation of the Y-junctions requires that εAB be above a finite threshold (i.e. there must be a finite fraction of AB bonds).
Abstract:
A classical application of biosignal analysis has been the psychophysiological detection of deception, also known as the polygraph test, which is currently a part of standard practices of law enforcement agencies and several other institutions worldwide. Although its validity is far from gathering consensus, the underlying psychophysiological principles are still an interesting add-on for more informal applications. In this paper we present an experimental off-the-person hardware setup, propose a set of feature extraction criteria and provide a comparison of two classification approaches, targeting the detection of deception in the context of a role-playing interactive multimedia environment. Our work is primarily targeted at recreational use in the context of a science exhibition, where the main goal is to present basic concepts related to knowledge discovery, biosignal analysis and psychophysiology in an educational way, using techniques that are simple enough to be understood by children of different ages. Nonetheless, this setting will also allow us to build a significant data corpus, annotated with ground-truth information and collected with non-intrusive sensors, enabling more advanced research on the topic. Experimental results have shown interesting findings and provided useful guidelines for future work. Pattern Recognition
Abstract:
As polycyclic aromatic hydrocarbons (PAHs) have a negative impact on human health due to their mutagenic and/or carcinogenic properties, the objective of this work was to study the influence of tobacco smoke on the levels and phase distribution of PAHs and to evaluate the associated health risks. The air samples were collected at two homes; 18 PAHs (the 16 PAHs considered by U.S. EPA as priority pollutants, dibenzo[a,l]pyrene and benzo[j]fluoranthene) were determined in the gas phase and associated with thoracic (PM10) and respirable (PM2.5) particles. At the home influenced by tobacco smoke the total concentrations of 18 PAHs in air ranged from 28.3 to 106 ng m⁻³ (mean of 66.7 ± 25.4 ng m⁻³), ∑PAHs being 95% higher than at the non-smoking one, where the values ranged from 17.9 to 62.0 ng m⁻³ (mean of 34.5 ± 16.5 ng m⁻³). On average 74% and 78% of ∑PAHs were present in the gas phase at the smoking and non-smoking homes, respectively, demonstrating that adequate assessment of PAHs in air requires evaluation of PAHs in both gas and particulate phases. When influenced by tobacco smoke, the health risk values were 3.5–3.6 times higher due to the exposure to PM10. The values of lifetime lung cancer risks were 4.1 × 10⁻³ and 1.7 × 10⁻³ for the smoking and non-smoking homes, considerably exceeding the health-based guideline level at both homes, also due to the contribution of outdoor traffic emissions. The results showed that evaluation of benzo[a]pyrene alone would probably underestimate the carcinogenic potential of the studied PAH mixtures; in total, ten carcinogenic PAHs represented 36% and 32% of the gaseous ∑PAHs, and in the particulate phase they accounted for 75% and 71% of ∑PAHs at the smoking and non-smoking homes, respectively.
Abstract:
This paper focuses on evaluating the usability of an Intelligent Wheelchair (IW) in both real and simulated environments. The wheelchair is controlled at a high level by a flexible multimodal interface, using voice commands, facial expressions, head movements and a joystick as its main inputs. A quasi-experimental design was applied, including a deterministic sample with a questionnaire that enabled the application of the System Usability Scale. The subjects were divided into two independent samples: 46 individuals performing the experiment with an Intelligent Wheelchair in a simulated environment (28 using different commands in a sequential way and 18 free to choose the command); and 12 individuals performing the experiment with a real IW. The main conclusion of this study is that the usability of the Intelligent Wheelchair in a real environment is higher than in the simulated environment. However, there was no statistical evidence that the real and simulated wheelchairs differ in terms of safety and control. Also, most users considered the multimodal way of driving the wheelchair very practical and satisfactory. Thus, it may be concluded that the multimodal interface enables very easy and safe control of the IW in both simulated and real environments.
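The System Usability Scale referred to above has a standard scoring rule: ten Likert items answered on a 1–5 scale, where odd items contribute (response − 1), even items contribute (5 − response), and the sum is scaled by 2.5 to give a 0–100 score. A minimal sketch of that scoring; the responses in the example are made up:

```python
# Standard SUS scoring: odd-numbered items score (r - 1), even-numbered
# items score (5 - r); the total is multiplied by 2.5 -> 0..100.
# Example responses below are hypothetical.

def sus_score(responses):
    """Compute the SUS score from ten Likert responses (each 1..5)."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # index 0,2,4,... = items 1,3,5,...
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # -> 75.0
```

Scores above roughly 68 are conventionally read as above-average usability, which is how per-group comparisons like the one in the study are typically interpreted.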
Abstract:
Resource constraints are becoming a problem as many wireless mobile devices have increased in generality. Our work tries to address this growing demand on resources and performance by proposing the dynamic selection of neighbour nodes for cooperative service execution. This selection is influenced by the user's quality-of-service requirements expressed in his request, tailoring the provided service to the user's specific needs. In this paper we improve our proposal's formulation algorithm with the ability to trade off time for the quality of the solution. At any given time, a complete solution for service execution exists, and the quality of that solution is expected to improve over time.
Abstract:
The current work can be seen as a starting point for discussion of the problem of risk acceptance criteria in occupational environments. Some obstacles to the formulation and use of quantitative acceptance criteria were analyzed. The long history of major hazard accidents was also reviewed. This work shows that organizations can face several difficulties in the formulation of acceptance criteria and that the use of pre-defined acceptance criteria in risk assessment methodologies can be inadequate in some cases. It is urgent to define guidelines that can help organizations formulate risk acceptance criteria for occupational environments.
Abstract:
Virtual Reality (VR) has grown to become state-of-the-art technology in many business- and consumer-oriented E-Commerce applications. One of the major design challenges of VR environments is the placement of the rendering process. The rendering process converts the abstract description of a scene, as contained in an object database, to an image. This process is usually done at the client side, as in VRML [1], a technology that relies on the client's computational power for smooth rendering. The vision of VR is also strongly connected to the issue of Quality of Service (QoS), as the perceived realism depends on an interactive frame rate ranging from 10 to 30 frames per second (fps), real-time feedback mechanisms and realistic image quality. These requirements push traditional home computers, and even highly sophisticated graphical workstations, beyond their limits. Our work therefore introduces an approach for a distributed rendering architecture that gracefully balances the workload between the client and a cluster-based server. We believe that a distributed rendering approach as described in this paper has three major benefits: it reduces the client's workload, it decreases the network traffic, and it allows already rendered scenes to be re-used.
Abstract:
In this paper we survey the most relevant results for the priority-based schedulability analysis of real-time tasks, for both fixed and dynamic priority assignment schemes. We give emphasis to the worst-case response time analysis in non-preemptive contexts, which is fundamental for communication schedulability analysis. We define an architecture to support priority-based scheduling of messages at the application process level of a specific fieldbus communication network, the PROFIBUS. The proposed architecture improves the worst-case message response time, overcoming the limitation of first-come-first-served (FCFS) PROFIBUS queue implementations.
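The worst-case response time analysis surveyed above rests, in its simplest fixed-priority preemptive form, on iterating the recurrence R = C + Σⱼ ⌈R/Tⱼ⌉·Cⱼ over all higher-priority tasks j to a fixed point; the non-preemptive message analysis in the paper extends this idea with blocking terms. A minimal sketch of the basic recurrence, with hypothetical task parameters:

```python
# Fixed-priority worst-case response-time recurrence (basic preemptive form):
# R = C + sum over higher-priority tasks of ceil(R/T_j) * C_j, iterated to a
# fixed point. Task set below is hypothetical; deadlines are taken equal to periods.

import math

def response_time(task, higher_prio):
    """Worst-case response time of `task` = (C, T), or None if it misses its deadline."""
    C, T = task
    R = C
    while True:
        interference = sum(math.ceil(R / Tj) * Cj for (Cj, Tj) in higher_prio)
        R_next = C + interference
        if R_next == R:      # fixed point: R is the worst-case response time
            return R
        if R_next > T:       # exceeds deadline (= period here): unschedulable
            return None
        R = R_next

# Hypothetical task set (C, T), listed highest priority first
tasks = [(1, 4), (2, 6), (3, 12)]
for i, t in enumerate(tasks):
    print(i, response_time(t, tasks[:i]))  # -> 0 1 / 1 3 / 2 10
```

The iteration converges because interference is a non-decreasing step function of R; it either reaches a fixed point or overshoots the deadline.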
Abstract:
The scarcity and diversity of resources among the devices of heterogeneous computing environments may affect their ability to perform services with specific Quality of Service constraints, particularly in dynamic distributed environments where the characteristics of the computational load cannot always be predicted in advance. Our work addresses this problem by allowing resource-constrained devices to cooperate with more powerful neighbour nodes, opportunistically taking advantage of global distributed resources and processing power. Rather than assuming that the dynamic configuration of this cooperative service executes until it computes its optimal output, the paper proposes an anytime approach that has the ability to trade off deliberation time for the quality of the solution. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and effectively optimise the rate at which the quality of the current solution improves at each iteration, with an overhead that can be considered negligible.
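The defining property of an anytime algorithm described above is that a valid solution exists at every instant and its quality is non-decreasing, so deliberation can be cut off at any deadline. A minimal sketch of such a loop, assuming a generic quality function; the actual service-configuration algorithm in the paper is more elaborate, and all names here are illustrative:

```python
# Minimal anytime optimisation loop: keep a current-best solution at all times,
# improve it while the time budget lasts, and return whatever is best at cutoff.
# The quality function and neighbour move below are hypothetical.

import random
import time

def anytime_optimise(quality, initial, neighbour, budget_s):
    """Hill-climb until the budget expires; the best-so-far solution is always available."""
    best, best_q = initial, quality(initial)
    deadline = time.monotonic() + budget_s
    while time.monotonic() < deadline:
        cand = neighbour(best)
        q = quality(cand)
        if q > best_q:          # accept only improvements -> quality never decreases
            best, best_q = cand, q
    return best, best_q

# Toy usage: maximise -(x - 3)^2 starting from x = 0 (hypothetical problem)
quality = lambda x: -(x - 3.0) ** 2
neighbour = lambda x: x + random.uniform(-0.5, 0.5)
sol, q = anytime_optimise(quality, 0.0, neighbour, budget_s=0.05)
```

Because only improving candidates are accepted, interrupting the loop earlier can only return a solution at least as good as the initial one, which is the time/quality trade-off the abstract refers to.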
Abstract:
This paper presents recent research results on the development of an Observed Time Difference (OTD) geolocation algorithm based on network trace data, for a real Universal Mobile Telecommunications System (UMTS) network. The initial results were published in [1]; the current paper focuses on increasing the sample convergence rate and on introducing a new filtering approach, based on a moving-average spatial filter, to increase accuracy. Field tests were carried out for two radio environments (urban and suburban) in the Lisbon area, Portugal. The new enhancements produced a geopositioning success rate of 47% and 31%, and a median accuracy of 151 m and 337 m, for the urban and suburban environments, respectively. The implemented filter produced a 16% and 20% increase in accuracy when compared with the geopositioned raw data. The obtained results are rather promising in terms of accuracy and geolocation success rate. OTD positioning smoothed by moving-average spatial filtering is a strong approach for positioning trace-extracted events, vital for boosting Self-Organizing Networks (SON) over a 3G network.
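The moving-average spatial filter mentioned above can be illustrated with a simple sliding-window mean over a sequence of raw position fixes, which damps the scatter of individual estimates. The paper's actual filter operates on geopositioned trace data; the window size and coordinates below are hypothetical:

```python
# Illustrative moving-average smoothing of raw (x, y) position fixes:
# each fix is replaced by the mean of the last `window` fixes.
# Coordinates and window size are hypothetical.

def moving_average_positions(fixes, window=3):
    """Smooth a sequence of (x, y) fixes with a trailing moving average."""
    smoothed = []
    for i in range(len(fixes)):
        w = fixes[max(0, i - window + 1): i + 1]   # trailing window, shorter at the start
        xs = sum(p[0] for p in w) / len(w)
        ys = sum(p[1] for p in w) / len(w)
        smoothed.append((xs, ys))
    return smoothed

raw = [(0.0, 0.0), (10.0, 0.0), (2.0, 3.0), (4.0, 3.0)]
print(moving_average_positions(raw, window=2))
# -> [(0.0, 0.0), (5.0, 0.0), (6.0, 1.5), (3.0, 3.0)]
```

The outlier at (10.0, 0.0) is pulled back towards its neighbours, which is the mechanism by which such a filter improves median positioning accuracy.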