919 results for multi-environments experiments


Relevance: 20.00%

Abstract:

Doctoral Thesis, Marine Sciences, specialty in Marine Biology, 18 December 2015, Universidade dos Açores.

Relevance: 20.00%

Abstract:

Nanotechnology is an important emerging industry, with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although these nano-scale materials offer clear advantages, questions about their impact on the environment and human health must also be addressed, so that potential risks can be limited at early stages of development. At present, the occupational health risks associated with manufacturing and using nanoparticles are not yet clearly understood; however, workers may be exposed to nanoparticles through inhalation at levels that greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may be inadequate here: nanoparticles are characterized by a very large surface area, a distinctive property that can turn an otherwise inert substance into one exhibiting very different interactions with biological fluids and cells. Assessing human exposure through the mass concentration of particles, the approach widely adopted for particles over 1 μm, is therefore unlikely to work in this particular case. In fact, nanoparticles have far more surface area than the equivalent mass of larger particles, which increases the chance that they react with body tissues; it has accordingly been argued that surface area should be the metric used for nanoparticle exposure and dosing. As a result, assessing exposure by measuring particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects: if nanoparticles can deposit in the lung and remain there, with an active surface chemistry that interacts with the body, there is potential for exposure. Surface area has been shown to play an important role in nanoparticle toxicity and is the metric that best correlates with particle-induced adverse health effects; the potential for such effects appears directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation and use of nanomaterials.
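
To make the surface-area argument concrete, consider a back-of-envelope sketch (ours, not from the study) for idealized spherical particles: the surface area per unit mass is 6/(ρd), so at equal mass each tenfold reduction in diameter exposes ten times more surface. The material and density below are assumed examples.

```python
# Minimal illustration (assumed density; not data from the study):
# specific surface area of monodisperse spherical particles, showing why
# a mass-based exposure metric under-represents nanoparticles.

def specific_surface_area(diameter_m: float, density_kg_m3: float) -> float:
    """Surface area per unit mass (m^2/kg) of a sphere: 6 / (rho * d)."""
    return 6.0 / (density_kg_m3 * diameter_m)

RHO_TIO2 = 4230.0  # kg/m^3, approximate density of titanium dioxide

for d in (1e-6, 100e-9, 10e-9):  # 1 um, 100 nm, 10 nm
    print(f"d = {d * 1e9:6.0f} nm -> {specific_surface_area(d, RHO_TIO2):9.0f} m^2/kg")
# The same mass of 10 nm particles exposes 100x the surface of 1 um particles.
```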

Relevance: 20.00%

Abstract:

A new high-performance architecture for the computation of all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is based on a completely configurable, scalable and unified structure that is able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16 and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrate the superior performance and hardware efficiency of the proposed structure, which outperforms the most prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, this architecture allows the real-time computation of all the transforms mentioned above for resolutions as high as 8k Ultra High Definition Television (UHDTV) (7680×4320 @ 30 fps).
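
As a software point of reference for what such a unified core computes (a minimal model of ours, not the paper's hardware design), the forward 4×4 integer DCT defined in H.264/AVC is Y = Cf·X·Cfᵀ, with normalization folded into the quantization stage:

```python
# Reference model of the H.264/AVC forward 4x4 integer DCT (software
# sketch only; the paper's contribution is the unified hardware core).
import numpy as np

Cf = np.array([[1,  1,  1,  1],
               [2,  1, -1, -2],
               [1, -1, -1,  1],
               [1, -2,  2, -1]], dtype=np.int32)

def forward_4x4(block: np.ndarray) -> np.ndarray:
    """Integer-only forward transform of a 4x4 residual block."""
    return Cf @ block @ Cf.T

residual = np.arange(16, dtype=np.int32).reshape(4, 4)  # toy residual
print(forward_4x4(residual))
```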

Relevance: 20.00%

Abstract:

This chapter aims to demonstrate how PAOL - Unit for Innovation in Education, a project from ISCAP - School of Accounting and Administration of Oporto ....

Relevance: 20.00%

Abstract:

A classical application of biosignal analysis has been the psychophysiological detection of deception, also known as the polygraph test, which is currently part of the standard practices of law enforcement agencies and several other institutions worldwide. Although its validity is far from gathering consensus, the underlying psychophysiological principles are still an interesting add-on for more informal applications. In this paper we present an experimental off-the-person hardware setup, propose a set of feature extraction criteria and provide a comparison of two classification approaches, targeting the detection of deception in the context of a role-playing interactive multimedia environment. Our work is primarily targeted at recreational use in the context of a science exhibition, where the main goal is to present basic concepts related to knowledge discovery, biosignal analysis and psychophysiology in an educational way, using techniques that are simple enough to be understood by children of different ages. Nonetheless, this setting will also allow us to build a significant data corpus, annotated with ground-truth information and collected with non-intrusive sensors, enabling more advanced research on the topic. Experimental results have shown interesting findings and provided useful guidelines for future work.
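
By way of illustration only (the concrete features and classifiers below are our assumptions, not necessarily the ones compared in the paper), such a pipeline extracts simple statistics from each answer window of a biosignal and compares two classifiers on annotated data:

```python
# Hypothetical sketch: toy electrodermal-activity features per answer
# window, compared across two off-the-shelf classifiers.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

def eda_features(window: np.ndarray) -> list:
    """Toy features: mean level, range, mean absolute derivative."""
    return [window.mean(), np.ptp(window), np.abs(np.diff(window)).mean()]

rng = np.random.default_rng(0)
windows = rng.normal(size=(60, 256))     # 60 simulated answer windows
X = np.array([eda_features(w) for w in windows])
y = rng.integers(0, 2, size=60)          # simulated ground-truth labels

for clf in (KNeighborsClassifier(3), SVC(kernel="rbf")):
    print(type(clf).__name__, cross_val_score(clf, X, y, cv=5).mean())
```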

Relevance: 20.00%

Abstract:

Solvent extraction is treated as a multi-criteria optimization problem, since several chemical species with similar extraction kinetic properties are frequently present in the aqueous phase and selective extraction is not practicable. This optimization, applied to mixer–settler units, considers the best parameters and operating conditions as well as the best structure or process flow-sheet. Global process optimization is performed for a specific flow-sheet, and Pareto curves for different flow-sheets are compared. The positive weighted-sum approach, linked to a sequential quadratic programming method, is used to obtain the Pareto set. In all investigated structures, recovery increases with hold-up, residence time and agitation speed, while purity shows the opposite behaviour. For the same treatment capacity, counter-current arrangements are shown to promote recovery without significant impairment of purity. Recycling the aqueous phase is shown to be irrelevant, but organic recycling with as many stages as economically feasible clearly improves the design criteria and reduces the most efficient organic flow-rate.
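
The weighted-sum/SQP procedure can be sketched schematically as follows (the objective surrogates below are invented placeholders, not the mixer–settler models from the paper): each strictly positive weight yields one scalarized problem solved by SLSQP, and the solutions trace the Pareto set.

```python
# Schematic weighted-sum Pareto sweep solved with SQP (scipy's SLSQP).
# recovery() and purity() are toy surrogates, not the chemical models.
import numpy as np
from scipy.optimize import minimize

def recovery(t):  # toy surrogate: improves with residence time t
    return 1.0 - np.exp(-2.0 * t)

def purity(t):    # toy surrogate: degrades as recovery is pushed up
    return 1.0 / (1.0 + t ** 2)

pareto = []
for w in np.linspace(0.05, 0.95, 10):          # strictly positive weights
    obj = lambda x, w=w: -(w * recovery(x[0]) + (1 - w) * purity(x[0]))
    res = minimize(obj, x0=[0.5], method="SLSQP", bounds=[(0.0, 5.0)])
    t = res.x[0]
    pareto.append((recovery(t), purity(t)))

for r, p in pareto:
    print(f"recovery={r:.3f}  purity={p:.3f}")
```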

Relevance: 20.00%

Abstract:

As polycyclic aromatic hydrocarbons (PAHs) have a negative impact on human health due to their mutagenic and/or carcinogenic properties, the objective of this work was to study the influence of tobacco smoke on the levels and phase distribution of PAHs and to evaluate the associated health risks. Air samples were collected at two homes; 18 PAHs (the 16 PAHs considered priority pollutants by the U.S. EPA, plus dibenzo[a,l]pyrene and benzo[j]fluoranthene) were determined in the gas phase and associated with thoracic (PM10) and respirable (PM2.5) particles. At the home influenced by tobacco smoke, the total concentration of the 18 PAHs in air ranged from 28.3 to 106 ng m⁻³ (mean of 66.7 ± 25.4 ng m⁻³), ∑PAHs being 95% higher than at the non-smoking home, where values ranged from 17.9 to 62.0 ng m⁻³ (mean of 34.5 ± 16.5 ng m⁻³). On average, 74% and 78% of ∑PAHs were present in the gas phase at the smoking and non-smoking homes, respectively, demonstrating that adequate assessment of PAHs in air requires evaluation of both the gas and particulate phases. When influenced by tobacco smoke, the health risk values due to PM10 exposure were 3.5 to 3.6 times higher. The lifetime lung cancer risks were 4.1 × 10⁻³ and 1.7 × 10⁻³ for the smoking and non-smoking homes, considerably exceeding the health-based guideline level at both homes, also due to the contribution of outdoor traffic emissions. The results showed that evaluating benzo[a]pyrene alone would probably underestimate the carcinogenic potential of the studied PAH mixtures; in total, ten carcinogenic PAHs represented 36% and 32% of gaseous ∑PAHs, and in the particulate phase they accounted for 75% and 71% of ∑PAHs, at the smoking and non-smoking homes, respectively.
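
For orientation, lifetime risk figures of this kind are typically obtained by converting the PAH mixture to a benzo[a]pyrene-equivalent concentration via toxic equivalency factors (TEFs) and applying the WHO unit risk for BaP; the sketch below uses invented concentrations and a small illustrative TEF subset, not the measured values from this work.

```python
# Back-of-envelope BaP-equivalent risk estimate (illustrative numbers).
WHO_UNIT_RISK = 8.7e-5   # lifetime lung cancer risk per ng/m3 of BaP (WHO)

tef  = {"BaP": 1.0, "BaA": 0.1, "BbF": 0.1, "Chry": 0.01}   # subset only
conc = {"BaP": 1.2, "BaA": 0.9, "BbF": 1.1, "Chry": 2.0}    # ng/m3, invented

bap_equiv = sum(tef[p] * conc[p] for p in tef)   # ng/m3 BaP-equivalent
print(f"BaP-eq = {bap_equiv:.2f} ng/m3 -> "
      f"lifetime risk = {WHO_UNIT_RISK * bap_equiv:.1e}")
```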

Relevance: 20.00%

Abstract:

Multi-objective particle swarm optimization (MOPSO) is a search algorithm based on social behavior. Most existing multi-objective particle swarm optimization schemes are based on Pareto optimality and aim to obtain a representative non-dominated Pareto front for a given problem. Several approaches have been proposed to study the convergence and performance of the algorithm, particularly by assessing the final results. In the present paper a different approach is proposed: Shannon entropy is used to analyze the MOPSO dynamics along the algorithm's execution. The results indicate that Shannon entropy can be used as an indicator of diversity and convergence for MOPSO problems.
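
A minimal sketch of the entropy-as-diversity idea (the binning scheme is our assumption, not necessarily the paper's): the Shannon entropy of the swarm's cell-occupancy histogram is high while particles are spread out and drops as they converge.

```python
# Shannon entropy of a 2-D swarm's occupancy histogram as a diversity
# indicator (illustrative binning; swarms here are synthetic).
import numpy as np

def swarm_entropy(pos: np.ndarray, bins: int = 10) -> float:
    """Entropy (nats) of the cell-occupancy distribution in [0,1]^2."""
    hist, _, _ = np.histogram2d(pos[:, 0], pos[:, 1],
                                bins=bins, range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(1)
spread  = rng.uniform(0, 1, size=(50, 2))               # diverse swarm
clumped = 0.5 + 0.02 * rng.standard_normal((50, 2))     # converged swarm
print(swarm_entropy(spread), swarm_entropy(clumped))    # high vs low
```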

Relevance: 20.00%

Abstract:

This study focused on the development of a sensitive enzymatic biosensor for the determination of the pesticide pirimicarb, based on the immobilization of laccase on composite carbon paste electrodes. A multi-walled carbon nanotube (MWCNT) paste electrode modified by dispersion of laccase (3%, w/w) within the optimum composite matrix (60:40%, w/w, MWCNTs and paraffin binder) showed the best performance, with excellent electron-transfer kinetics and catalytic effects related to the redox process of the substrate 4-aminophenol. No metal or anti-interference membrane was added. Based on the inhibition of laccase activity, pirimicarb can be determined in the range 9.90 × 10⁻⁷ to 1.15 × 10⁻⁵ mol L⁻¹ using 4-aminophenol as substrate at the optimum pH of 5.0, with acceptable repeatability and reproducibility (relative standard deviations lower than 5%). The limit of detection obtained was 1.8 × 10⁻⁷ mol L⁻¹ (0.04 mg kg⁻¹ on a fresh-weight vegetable basis). The high activity and catalytic properties of the laccase-based biosensor are retained for ca. one month. The optimized electroanalytical protocol coupled to the QuEChERS methodology was applied to tomato and lettuce samples spiked at three levels; recoveries ranging from 91.0 ± 0.1% to 101.0 ± 0.3% were attained. No significant effects on the pirimicarb electroanalysis were observed from the presence of pro-vitamin A, vitamins B1 and C, and glucose in the vegetable extracts. The proposed biosensor-based pesticide residue methodology fulfills all requisites for use in the implementation of food safety programs.
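
The inhibition-based readout behind such biosensors can be sketched as follows (all numbers invented for illustration): the pesticide concentration is inferred from the relative drop in the amperometric current of the laccase/4-aminophenol reaction, via a calibration curve.

```python
# Illustrative inhibition calibration (invented currents/concentrations).
import numpy as np

def inhibition_pct(i0: float, i: float) -> float:
    """Percent inhibition from currents without (i0) and with pesticide."""
    return 100.0 * (i0 - i) / i0

conc  = np.array([1e-6, 3e-6, 6e-6, 1e-5])              # mol/L, invented
inhib = np.array([inhibition_pct(10.0, i) for i in (9.1, 7.6, 5.9, 3.8)])

slope, intercept = np.polyfit(np.log10(conc), inhib, 1)  # linear vs log C
unknown_inhib = 35.0                                     # measured sample
est = 10 ** ((unknown_inhib - intercept) / slope)
print(f"estimated [pirimicarb] ~ {est:.2e} mol/L")
```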

Relevance: 20.00%

Abstract:

Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic scene. Game theory has been widely used to support decisions in competitive environments, so its application to electricity markets can prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which includes the application of game theory, to evaluate and preview different scenarios and provide players with the ability to react strategically, exhibiting the behavior that best fits their objectives. The model includes forecasts of competitor players' actions, used to build models of their behavior and define the most probable scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent, not to achieve market equilibrium. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings with a case study based on real data from the Iberian Electricity Market are presented and discussed.
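
In outline (payoffs and probabilities below are invented, and this is our simplification of the idea, not MASCEM's actual implementation), the supported player evaluates each candidate action against the forecast scenarios and picks the one with the best expected payoff:

```python
# Toy scenario-analysis step: choose the bid maximizing expected payoff
# over forecast competitor scenarios (all values invented).
payoff = {   # payoff[action][scenario] -> profit for the supported player
    "bid_low":  {"rivals_aggressive": 120.0, "rivals_passive": 180.0},
    "bid_mid":  {"rivals_aggressive":  60.0, "rivals_passive": 260.0},
    "bid_high": {"rivals_aggressive": -40.0, "rivals_passive": 340.0},
}
scenario_prob = {"rivals_aggressive": 0.6, "rivals_passive": 0.4}

def expected_payoff(action: str) -> float:
    return sum(p * payoff[action][s] for s, p in scenario_prob.items())

best = max(payoff, key=expected_payoff)
print(best, expected_payoff(best))   # -> bid_low 144.0
```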

Relevance: 20.00%

Abstract:

This paper focuses on evaluating the usability of an Intelligent Wheelchair (IW) in both real and simulated environments. The wheelchair is controlled at a high level by a flexible multimodal interface, using voice commands, facial expressions, head movements and a joystick as its main inputs. A quasi-experimental design was applied, including a deterministic sample with a questionnaire that enabled application of the System Usability Scale. The subjects were divided into two independent samples: 46 individuals performed the experiment with the Intelligent Wheelchair in a simulated environment (28 using the different commands in a sequential way and 18 free to choose the command), and 12 individuals performed the experiment with the real IW. The main conclusion of this study is that the usability of the Intelligent Wheelchair is higher in the real environment than in the simulated one. However, there was no statistical evidence of differences between the real and simulated wheelchairs in terms of safety and control. Moreover, most users considered the multimodal way of driving the wheelchair very practical and satisfactory. It may thus be concluded that the multimodal interface enables very easy and safe control of the IW in both simulated and real environments.
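
For reference, the System Usability Scale applied in the questionnaire is scored with the standard formula (the example responses below are invented): odd-numbered items contribute their score minus 1, even-numbered items contribute 5 minus their score, and the sum is scaled to a 0-100 range.

```python
# Standard SUS scoring: ten 1-5 Likert items; odd items add (r - 1),
# even items add (5 - r); the total is scaled by 2.5 to 0-100.
def sus_score(responses):
    """responses: ten 1-5 answers for SUS items 1..10, in order."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i=0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0 (example)
```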

Relevance: 20.00%

Abstract:

Resource constraints are becoming a problem as wireless mobile devices grow increasingly general-purpose. Our work addresses this growing demand on resources and performance by proposing the dynamic selection of neighbor nodes for cooperative service execution. This selection is influenced by the quality-of-service requirements expressed in the user's request, tailoring the provided service to the user's specific needs. In this paper we improve our proposal's formulation algorithm with the ability to trade off time for the quality of the solution: at any given time a complete solution for service execution exists, and the quality of that solution is expected to improve over time.
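
The anytime behavior can be sketched as follows (a minimal sketch under our own assumptions; the real algorithm selects neighbors against QoS requirements, for which the random assignments and fixed scores here are stand-ins):

```python
# Anytime selection sketch: a complete assignment always exists, and its
# quality only improves until the deadline (quality() is a stand-in).
import random
import time

SCORE = {("decode", "nodeA"): 3, ("decode", "nodeB"): 7,   # stand-in QoS
         ("render", "nodeA"): 9, ("render", "nodeB"): 2}

def random_assignment(tasks, neighbors):
    return {t: random.choice(neighbors) for t in tasks}

def quality(assignment):
    return sum(SCORE[(t, n)] for t, n in assignment.items())

def anytime_select(tasks, neighbors, deadline_s):
    best = random_assignment(tasks, neighbors)   # immediate valid solution
    t_end = time.monotonic() + deadline_s
    while time.monotonic() < t_end:              # refine while time remains
        cand = random_assignment(tasks, neighbors)
        if quality(cand) > quality(best):
            best = cand
    return best

print(anytime_select(["decode", "render"], ["nodeA", "nodeB"], 0.05))
# Converges to {'decode': 'nodeB', 'render': 'nodeA'} given enough time.
```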

Relevance: 20.00%

Abstract:

The current work can be seen as a starting point for discussing the problem of risk acceptance criteria in occupational environments. Some obstacles to the formulation and use of quantitative acceptance criteria were analyzed, and the long record of major-hazard accidents was also examined. This work shows that organizations can face several difficulties in formulating acceptance criteria and that the use of pre-defined acceptance criteria in risk assessment methodologies can be inadequate in some cases. It is urgent to define guidelines that can help organizations formulate risk acceptance criteria for occupational environments.

Relevance: 20.00%

Abstract:

Virtual Reality (VR) has grown to become state-of-the-art technology in many business- and consumer-oriented E-Commerce applications. One of the major design challenges of VR environments is the placement of the rendering process, which converts the abstract description of a scene, as contained in an object database, into an image. This process is usually done at the client side, as in VRML [1], a technology that relies on the client's computational power for smooth rendering. The vision of VR is also strongly connected to Quality of Service (QoS), as the perceived realism depends on an interactive frame rate ranging from 10 to 30 frames per second (fps), real-time feedback mechanisms and realistic image quality. These requirements push traditional home computers, and even sophisticated graphical workstations, beyond their limits. Our work therefore introduces a distributed rendering architecture that gracefully balances the workload between the client and a cluster-based server. We believe that the distributed rendering approach described in this paper has three major benefits: it reduces the client's workload, it decreases network traffic, and it allows the re-use of already rendered scenes.
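
One way to picture the balancing decision (the cost model below is our invention, not the paper's architecture): render an object locally only while its estimated per-frame cost still fits the client's frame budget, and otherwise fetch it pre-rendered from the cluster-based server, re-using cached results.

```python
# Toy client/server rendering split under a frame budget (invented costs).
FRAME_BUDGET_MS = 1000.0 / 15.0      # target ~15 fps on a weak client

def split_rendering(cost_ms, server_cache):
    client, server, spent = [], [], 0.0
    for obj in sorted(cost_ms, key=cost_ms.get):          # cheapest first
        if obj in server_cache or spent + cost_ms[obj] > FRAME_BUDGET_MS:
            server.append(obj)       # cached remotely, or too costly locally
        else:
            client.append(obj)
            spent += cost_ms[obj]
    return client, server

costs = {"terrain": 40.0, "avatar": 8.0, "hud": 2.0, "skybox": 5.0}
print(split_rendering(costs, server_cache={"terrain"}))
# -> (['hud', 'skybox', 'avatar'], ['terrain'])
```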

Relevance: 20.00%

Abstract:

Fieldbus communication networks aim to interconnect sensors, actuators and controllers within process control applications. Therefore, they constitute the foundation upon which real-time distributed computer-controlled systems can be implemented. P-NET is a fieldbus communication standard, which uses a virtual token-passing medium-access-control mechanism. In this paper pre-run-time schedulability conditions for supporting real-time traffic with P-NET networks are established. Essentially, formulae to evaluate the upper bound of the end-to-end communication delay in P-NET messages are provided. Using this upper bound, a feasibility test is then provided to check the timing requirements for accessing remote process variables. This paper also shows how P-NET network segmentation can significantly reduce the end-to-end communication delays for messages with stringent timing requirements.
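
The exact formulae are developed in the paper; as a loose illustration of the style of bound involved (our simplified model, not the paper's: each token visit lets a master complete at most one message cycle, with known per-master worst-case cycle times and an assumed token-pass overhead), the queuing delay of one pending message is bounded by one full token rotation at worst-case cost:

```python
# Crude virtual-token-passing delay bound (simplified model, not the
# paper's formulae): one full rotation in which every master uses its
# longest message cycle plus the token-pass overhead.
def worst_case_queuing_delay_ms(longest_cycle_ms, token_pass_ms):
    """Upper bound (ms) on the queuing delay of one pending message."""
    return sum(c + token_pass_ms for c in longest_cycle_ms)

cycles = [2.4, 1.8, 3.1, 2.0]   # assumed longest cycle per master (ms)
print(worst_case_queuing_delay_ms(cycles, token_pass_ms=0.2))  # ~10.1 ms
```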