49 results for Link characteristics


Relevance:

20.00%

Publisher:

Abstract:

Purpose: The sorption of sulfamethoxazole, a pharmaceutical compound frequently detected in the environment, onto walnut shells was evaluated. Methods: The raw sorbent was chemically modified, yielding two additional samples, HCl-treated and NaOH-treated. Scanning electron microscopy, Fourier transform infrared spectroscopy, X-ray photoelectron spectroscopy, and thermogravimetric (TG/DTG) techniques were applied to investigate the effect of the chemical treatments on the shell surface morphology and chemistry. Sorption experiments investigating the effect of pH on the process were carried out between pH 2 and 8. Results: The chemical treatments did not substantially alter the structure of the sorbent (physical and textural characteristics) but modified its surface chemistry (acid–base properties, point of zero charge, pHpzc). The solution pH influences both the sorbent's surface charge and sulfamethoxazole speciation. The best removal efficiencies were obtained at lower pH values, where the neutral and cationic forms of sulfamethoxazole are present in solution. Langmuir and Freundlich isotherms were fitted to the experimental adsorption data for sulfamethoxazole sorption at pH 2, 4, and 7 onto raw walnut shell. No statistical difference was found between the two models, except for the pH 2 data, which the Freundlich model fitted better. Conclusion: Sorption of sulfamethoxazole was found to be highly pH dependent over the entire pH range studied, for both raw and treated sorbents.
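To make the model comparison concrete, here is a minimal sketch of fitting both isotherms with SciPy; the function forms are the standard Langmuir and Freundlich equations, while the data points and starting guesses are hypothetical placeholders, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    # Langmuir isotherm (monolayer sorption): qe = qmax*KL*Ce / (1 + KL*Ce)
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    # Freundlich isotherm (heterogeneous surface): qe = KF * Ce**(1/n)
    return KF * Ce ** (1.0 / n)

# Hypothetical equilibrium data: Ce (mg/L), qe (mg/g)
Ce = np.array([1.0, 2.5, 5.0, 10.0, 20.0])
qe = np.array([0.8, 1.6, 2.6, 3.8, 4.9])
p_lang, _ = curve_fit(langmuir, Ce, qe, p0=[5.0, 0.1])
p_freu, _ = curve_fit(freundlich, Ce, qe, p0=[1.0, 2.0])
```

Comparing the residuals of the two fits (e.g. via R² or a chi-square statistic) reproduces the kind of per-pH model comparison reported in the abstract.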

Relevance:

20.00%

Publisher:

Abstract:

The Ponte Redonda paper mill (Fábrica de Papel da Ponte Redonda) produces multi-ply paper sacks and recycled Kraft-type paper. With regard to the first activity, it is of great interest to optimise the papermaking process so as to incorporate the highest possible proportion of internally produced paper in the different plies of the paper sacks. The papers of greatest interest are the Fluting and Liner grades: a total of 4.9 thousand tonnes was produced in 2010, i.e. 90% of all the paper manufactured that year, corresponding to 4 thousand tonnes of Liner grades and 0.9 thousand tonnes of Fluting grades. The Liner grades include Test-Liner and Kraft-Liner papers, which account for similar production volumes. In this work, which aimed to control the process waters and optimise paper production, a flotation unit and a system to regulate the consistency of the fibrous suspension at the inlet of the paper machine were introduced; the addition of chemical products to improve the characteristics of the pulp was also studied, as well as a more effective microbiological treatment for the whole process. To assess whether the implemented measures had a positive impact on the quality of these two paper grades, the work was developed in two phases: the first involved the introduction of the flotation system and of the consistency control system, as well as the selection of the chemical products to be added to the process. The second phase consisted of evaluating the effect of these measures on the characteristics of the manufactured paper. For this purpose, two paper grades of different grammages were chosen, namely 80 g/m2 Test-Liner and 110 g/m2 Fluting. A flotation unit was introduced to treat part of the process water so that it could be reused in applications compatible with the quality of the water obtained (washing and process water), thereby saving water, and also to recover the resulting sludge, rich in cellulose fibre, for use as raw material. A consistency regulator was introduced into the Ponte Redonda process to feed the pulp to the paper machine at a constant consistency, providing better sheet formation, owing to the bonding between fibres, in both the machine and cross directions. This innovative system is a consistency regulator that gives the paper machine a more constant fibre feed. Making paper from cellulose fibres alone does not yield a paper with the characteristics required for its end use. To correct these deficiencies, chemical products are added to impart or improve paper properties. It was therefore considered worthwhile to introduce a retention agent into the process, after stock preparation and before the paper machine, to improve the characteristics of the fibrous suspension. Such a system was implemented and its effectiveness evaluated. It was concluded that, with its implementation, the paper machine showed better drainage and lower turbidity of the removed water, meaning water with less suspended and dissolved matter, owing to the better aggregation of the fibres dispersed in the suspension, leading to increased drainage and consequently better efficiency of the presses and dryer section.
A study was also carried out on the introduction of a microbiological treatment system for the whole papermaking process, owing to the presence of microorganisms harmful to paper manufacture. It was concluded that the clarified water from the flotation unit was of acceptable quality for the intended purposes. However, considering the 26.5% suspended-solids removal efficiency, a longer period of use of the clarified water, about one year, will be needed to assess whether it has any harmful effect on the equipment. It was found that, owing to the presence of microorganisms throughout the papermaking process, the vats, tanks and circuits will have to be washed fairly regularly, taking advantage of process shutdowns, and a more effective microbiological treatment system will have to be implemented. As a result of the implemented measures, it was concluded that the papers produced showed improvements, with better results in all strength tests. For the Test-Liner paper, the good results in the surface tests, Cobb60 and burst, stand out. The Cobb60 result was surprising, since recycled papers of this type normally do not pass this test. It was also concluded that the implemented measures provided better aggregation and bonding between fibres and better sheet formation on the paper machine, giving the papers more attractive physical and mechanical properties.

Relevance:

20.00%

Publisher:

Abstract:

In the business world, issues such as globalisation, environmental awareness, and the rising expectations of public opinion play a specific role in what is required from companies as providers of information to the market. This chapter reviews the current state of corporate reporting (financial reporting and sustainability reporting) and demonstrates the need to evolve towards a more integrated method of reporting that meets stakeholders' needs. This research offers a reflection on how this development can be achieved, noting the ongoing efforts by international organisations to promote its diffusion and adoption, and examining the characteristics that this type of reporting requires. It also links the discussion to the actual case of a company that is one of the world references in sustainable development and integrated reporting. Whether or not integrated reporting is the natural evolution of financial and sustainability reporting, it cannot yet claim to be infallible. However, it may definitely be concluded that a new approach is necessary to meet the continuously developing needs of a network of stakeholders.

Relevance:

20.00%

Publisher:

Abstract:

This paper is a contribution to the assessment and comparison of magnet properties based on magnetic field characteristics, particularly the uniformity of the magnetic induction in the air gaps. To this end, a solver was developed and implemented to determine the magnetic field of a magnetic core to be used in Fast Field Cycling (FFC) Nuclear Magnetic Resonance (NMR) relaxometry. The electromagnetic field computation is based on a 2D finite-element method (FEM) using both the scalar and the vector potential formulations. Results for the magnetic field lines and the magnetic induction vector in the air gap are presented. The target magnetic induction is 0.2 T, a typical requirement of the FFC NMR technique, which can be achieved with a magnetic core based on permanent magnets or coils. In addition, this application requires high magnetic induction uniformity. To achieve this goal, a solution including superconducting pieces is analyzed. Results are compared with those of a different FEM program.
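The FEM solver itself is not reproduced here; as a rough plausibility check of the 0.2 T target, the sketch below applies the textbook series-magnetic-circuit approximation for a permanent-magnet-driven gap. The geometry and material values are illustrative assumptions, not taken from the paper.

```python
def airgap_induction(Br, mu_rec, l_m, g):
    # Series magnetic circuit with equal magnet and gap cross-sections,
    # infinitely permeable core and no fringing: Bg = Br / (1 + mu_rec*g/l_m)
    return Br / (1.0 + mu_rec * g / l_m)

# Illustrative numbers only (not from the paper): NdFeB with Br = 1.2 T,
# recoil permeability 1.05, 40 mm magnet length, 20 mm air gap.
print(airgap_induction(1.2, 1.05, 0.040, 0.020))  # ~0.79 T, above the 0.2 T target
```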

Relevance:

20.00%

Publisher:

Abstract:

Geostatistics has been successfully used to analyze and characterize the spatial variability of environmental properties. Besides giving estimated values at unsampled locations, it provides a measure of the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. In this work, universal block kriging is used for the first time to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge, and estimating dilution. The results demonstrate that the geostatistical methodology can provide good estimates of the dispersion of effluents, which are very valuable in assessing environmental impact and managing sea outfalls. Moreover, since accurate measurements of a plume's dilution are rare, such studies might be very helpful in the future to validate dispersion models.
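The study applies universal block kriging; as a minimal illustration of the underlying idea, the sketch below implements ordinary point kriging in plain NumPy with an assumed exponential variogram, returning both the estimate and the kriging variance (the accuracy measure referred to above).

```python
import numpy as np

def exp_variogram(h, nugget=0.0, sill=1.0, rng=50.0):
    # Exponential model: gamma(h) = nugget + (sill - nugget)*(1 - exp(-3h/rng))
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xy, z, target):
    # xy: (n, 2) sample coordinates, z: (n,) measured values (e.g. salinity),
    # target: (2,) unsampled location. Solves the ordinary kriging system in
    # variogram form, with a Lagrange multiplier enforcing unit-sum weights.
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - target, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z, w @ b  # estimate and kriging variance
```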

Relevance:

20.00%

Publisher:

Abstract:

LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with a smaller laxity, is known to be an optimal preemptive scheduling algorithm on a single-processor platform. However, little work has been done to illuminate its characteristics on multiprocessor platforms. In this paper, we identify the dynamics of laxity from the system's viewpoint and translate these dynamics into an LLF multiprocessor schedulability analysis. More specifically, we first characterize laxity properties under LLF scheduling, focusing on the laxity dynamics associated with a deadline miss. These laxity dynamics describe a lower bound, leading to the deadline miss, on the number of tasks with certain laxity values at certain time instants. This lower bound is significant because it represents invariants for highly dynamic system parameters (laxity values). Since the laxity of a task depends on the amount of interference from higher-priority tasks, we can then derive a set of conditions to check whether a given task system can enter the laxity dynamics leading towards a deadline miss. In this way, to the authors' best knowledge, we propose the first LLF multiprocessor schedulability test based on LLF's own laxity properties. We also develop an improved schedulability test that exploits slack values. We mathematically prove that the proposed LLF tests dominate the state-of-the-art EDZL tests. We also present simulation results to quantitatively evaluate the schedulability performance of both the original and the improved LLF tests.
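For concreteness, the basic LLF dispatching rule on m processors can be sketched as follows; the `Job` structure and function names are illustrative, not the paper's notation.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    deadline: float   # absolute deadline
    remaining: float  # remaining execution time

def laxity(job, t):
    # Laxity at time t: how long the job can still wait and meet its deadline
    return job.deadline - t - job.remaining

def llf_pick(jobs, t, m):
    # LLF on m processors: dispatch the m jobs with the least laxity
    return sorted(jobs, key=lambda j: laxity(j, t))[:m]
```

A job whose laxity reaches zero must run immediately; the paper's analysis counts how many such low-laxity jobs can coexist before a deadline miss becomes unavoidable.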

Relevance:

20.00%

Publisher:

Abstract:

Wireless sensor networks (WSNs) are emerging as underlying infrastructures for new classes of large-scale networked embedded systems. However, WSN system designers must fulfill the quality-of-service (QoS) requirements imposed by the applications (and users). Very harsh and dynamic physical environments and extremely limited energy/computing/memory/communication node resources are major obstacles to satisfying QoS metrics such as reliability, timeliness, and system lifetime. The limited communication range of WSN nodes, link asymmetry, and the characteristics of the physical environment lead to a major source of QoS degradation in WSNs: the "hidden-node problem". In wireless contention-based medium access control (MAC) protocols, when two nodes that are not visible to each other transmit to a third node that is visible to both, there will be a collision, called a hidden-node or blind collision. This problem greatly impacts network throughput, energy efficiency, and message transfer delays, and it dramatically worsens as the number of nodes increases. This paper proposes H-NAMe, a very simple yet extremely efficient hidden-node avoidance mechanism for WSNs. H-NAMe relies on a grouping strategy that splits each cluster of a WSN into disjoint groups of non-hidden nodes and scales to multiple clusters via a cluster grouping strategy that guarantees no interference between overlapping clusters. Importantly, H-NAMe is instantiated in IEEE 802.15.4/ZigBee, currently the most widespread communication technologies for WSNs, with only minor add-ons and ensuring backward compatibility with their protocol standards. H-NAMe was implemented and exhaustively tested using an experimental test-bed based on off-the-shelf technology, showing that it increases network throughput and transmission success probability up to twice the values obtained without H-NAMe. H-NAMe's effectiveness was also demonstrated in a target tracking application with mobile robots over a WSN deployment.
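As a rough illustration of the kind of grouping H-NAMe's strategy performs, the sketch below greedily partitions a cluster into groups of mutually visible nodes, given a boolean visibility predicate; the actual in-protocol group-formation procedure is defined in the paper, so treat this as a toy analogue.

```python
def group_non_hidden(nodes, visible):
    # visible(u, v) -> True when u and v hear each other directly.
    # Greedily partition a cluster into groups in which every pair is
    # mutually visible, so intra-group traffic avoids blind collisions.
    groups = []
    for node in nodes:
        for group in groups:
            if all(visible(node, member) for member in group):
                group.append(node)
                break
        else:  # no compatible group found: start a new one
            groups.append([node])
    return groups
```

Each resulting group can then be given its own access window, so contention only ever happens among nodes that can sense each other.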

Relevance:

20.00%

Publisher:

Abstract:

Radio Link Quality Estimation (LQE) is a fundamental building block for Wireless Sensor Networks, namely for reliable deployment, resource management, and routing. Existing LQEs (e.g. PRR, ETX, Fourbit, and LQI) are based on a single link property, thus leading to inaccurate estimation. In this paper, we propose F-LQE, which estimates link quality on the basis of four link quality properties: packet delivery, asymmetry, stability, and channel quality. Each of these properties is defined in linguistic terms, the natural language of Fuzzy Logic. The overall quality of the link is specified as a fuzzy rule whose evaluation returns the membership of the link in the fuzzy subset of good links. Values of the membership function are smoothed using an EWMA filter to improve stability. An extensive experimental analysis shows that F-LQE outperforms existing estimators.
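A toy sketch of the estimator's flavor, assuming the four property memberships have already been computed in [0, 1]: the min t-norm below is one standard fuzzy-AND choice (the paper defines its own combination rule), and the EWMA step mirrors the smoothing described above.

```python
def fuzzy_link_quality(delivery, asymmetry, stability, channel):
    # Combine the four property memberships (each in [0, 1]) with the
    # min t-norm, a standard fuzzy AND; result is the link's membership
    # in the fuzzy subset of "good" links.
    return min(delivery, asymmetry, stability, channel)

def ewma(previous, sample, alpha=0.6):
    # Smooth successive membership values to improve estimator stability
    return alpha * previous + (1.0 - alpha) * sample
```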

Relevance:

20.00%

Publisher:

Abstract:

The hidden-node problem has been shown to be a major source of Quality-of-Service (QoS) degradation in Wireless Sensor Networks (WSNs) due to factors such as the limited communication range of sensor nodes, link asymmetry, and the characteristics of the physical environment. In wireless contention-based Medium Access Control protocols, if two nodes that are not visible to each other transmit to a third node that is visible to both, there will be a collision – usually called a hidden-node or blind collision. This problem greatly affects network throughput, energy efficiency, and message transfer delays, which might be particularly dramatic in large-scale WSNs. This technical report tackles the hidden-node problem in WSNs and proposes H-NAMe, a simple yet efficient distributed mechanism to overcome it. H-NAMe relies on a grouping strategy that splits each cluster of a WSN into disjoint groups of non-hidden nodes and then scales to multiple clusters via a cluster grouping strategy that guarantees no transmission interference between overlapping clusters. We also show that the H-NAMe mechanism can be easily applied to the IEEE 802.15.4/ZigBee protocols with only minor add-ons, ensuring backward compatibility with the standard specifications. We demonstrate the feasibility of H-NAMe via an experimental test-bed, showing that it increases network throughput and transmission success probability up to twice the values obtained without H-NAMe. We believe that the results in this technical report will be quite useful in efficiently enabling IEEE 802.15.4/ZigBee as a WSN protocol.

Relevance:

20.00%

Publisher:

Abstract:

The hidden-node problem has been shown to be a major source of Quality-of-Service (QoS) degradation in Wireless Sensor Networks (WSNs) due to factors such as the limited communication range of sensor nodes, link asymmetry, and the characteristics of the physical environment. In wireless contention-based Medium Access Control protocols, if two nodes that are not visible to each other transmit to a third node that is visible to both, there will be a collision – usually called a hidden-node or blind collision. This problem greatly affects network throughput, energy efficiency, and message transfer delays, which might be particularly dramatic in large-scale WSNs. This paper tackles the hidden-node problem in WSNs and proposes H-NAMe, a simple yet efficient distributed mechanism to overcome it. H-NAMe relies on a grouping strategy that splits each cluster of a WSN into disjoint groups of non-hidden nodes and then scales to multiple clusters via a cluster grouping strategy that guarantees no transmission interference between overlapping clusters. We also show that the H-NAMe mechanism can be easily applied to the IEEE 802.15.4/ZigBee protocols with only minor add-ons, ensuring backward compatibility with the standard specifications. We demonstrate the feasibility of H-NAMe via an experimental test-bed, showing that it increases network throughput and transmission success probability up to twice the values obtained without H-NAMe. We believe that the results in this paper will be quite useful in efficiently enabling IEEE 802.15.4/ZigBee as a WSN protocol.

Relevance:

20.00%

Publisher:

Abstract:

Consider the problem of scheduling sporadically arriving tasks with implicit deadlines using Earliest-Deadline-First (EDF) on a single processor. The system may undergo changes in its operational modes, and therefore the characteristics of the task set may change at run-time. We consider a well-established, previously published mode-change protocol, and we show that if every mode utilizes at most 50% of the processing capacity, then all deadlines are met. We also show that there exists a task set that misses a deadline although its utilization exceeds 50% by just an arbitrarily small amount. Finally, we present, for a relevant special case, an exact schedulability test for EDF with mode changes.
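The sufficient condition stated above translates directly into a one-line check; representing each mode as a list of (C, T) pairs is an illustrative encoding.

```python
def modes_schedulable(modes):
    # modes: one list of (C, T) pairs per operational mode.
    # Sufficient condition from the abstract: all deadlines are met under
    # the considered mode-change protocol if every mode's utilization
    # sum(C_i / T_i) is at most 50% of the processing capacity.
    return all(sum(c / t for c, t in mode) <= 0.5 for mode in modes)

# Example: two modes, each with utilization 0.45 -> schedulable
print(modes_schedulable([[(1, 4), (2, 10)], [(1, 5), (1, 4)]]))
```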

Relevance:

20.00%

Publisher:

Abstract:

Due to the growing complexity and adaptability requirements of real-time embedded systems, which often exhibit unrestricted inter-dependencies among supported services and user-imposed quality constraints, it is increasingly difficult to optimise the level of service of a dynamic task set within a useful and bounded time. This is even more difficult when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand. This paper proposes an iterative refinement approach for a service's QoS configuration that takes into account services' inter-dependencies and quality constraints, trading off the quality of the achieved solution against the cost of computation. Extensive simulations demonstrate that the proposed anytime algorithm is able to quickly find a good initial solution and effectively optimises the rate at which the quality of the current solution improves as the algorithm is given more time to run. The added benefits of the proposed approach clearly surpass its reduced overhead.
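The abstract does not detail the algorithm, so the sketch below only illustrates the generic anytime-refinement pattern it builds on, assuming user-supplied `neighbors` and `quality` functions; it is not the paper's QoS optimisation procedure.

```python
import random
import time

def anytime_optimize(initial, neighbors, quality, budget_s):
    # Generic anytime refinement: always hold the best configuration found
    # so far, so the search can be interrupted at any time with a valid
    # (and monotonically improving) answer.
    best, best_q = initial, quality(initial)
    deadline = time.monotonic() + budget_s
    while time.monotonic() < deadline:
        candidate = random.choice(neighbors(best))
        candidate_q = quality(candidate)
        if candidate_q > best_q:
            best, best_q = candidate, candidate_q
    return best, best_q
```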

Relevance:

20.00%

Publisher:

Abstract:

Optimization problems arise in science, engineering, economics, and elsewhere, and we need to find the best solution for each situation. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the available algorithms for solving them, and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the functions involved are nonlinear and their derivatives are unknown or very difficult to calculate, such methods are scarcer. Functions of this kind are frequently called black-box functions. To solve such problems without constraints (unconstrained optimization), we can use direct search methods, which do not require any derivatives or approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black-box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used. They transform the original problem into a sequence of other problems, derived from the initial one, all without constraints. This sequence of unconstrained problems can then be solved using the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow the solution of optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. Thus, penalty methods can be used as the first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjust the penalty parameter dynamically.
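As one concrete instance of the scheme just described, here is a minimal quadratic-penalty sketch for inequality constraints g(x) <= 0. It uses a derivative-free inner solver (Nelder-Mead), in line with the black-box setting; the penalty-update schedule is an illustrative choice, not the chapter's new dynamic method.

```python
import numpy as np
from scipy.optimize import minimize

def quadratic_penalty(f, gs, x0, mu=1.0, growth=10.0, tol=1e-6, outer=20):
    # Quadratic penalty for constraints g(x) <= 0: solve a sequence of
    # unconstrained problems F(x) = f(x) + mu * sum(max(0, g(x))**2),
    # increasing mu while the iterate is still infeasible.
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        F = lambda x, mu=mu: f(x) + mu * sum(max(0.0, g(x)) ** 2 for g in gs)
        x = minimize(F, x, method="Nelder-Mead").x  # derivative-free inner solve
        if all(g(x) <= tol for g in gs):
            break
        mu *= growth  # tighten the penalty when constraints are still violated
    return x
```

Each inner solve is an unconstrained black-box problem, which is exactly why direct search methods pair naturally with penalty schemes.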

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses several complex systems from the perspective of fractional dynamics. As prototype systems, the cases of deoxyribonucleic acid decoding, financial evolution, earthquake events, the global warming trend, and musical rhythms are considered. The application of the Fourier transform and of power-law trendlines leads to an assertive representation of the dynamics and to a simple comparison of their characteristics. Moreover, the gallery of different systems, both natural and man-made, demonstrates the richness of phenomena that can be described and studied with the tools of fractional calculus.
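The power-law trendline analysis can be sketched as follows: compute the amplitude spectrum and fit a line in log-log coordinates, whose slope summarizes the fractional dynamics; the preprocessing details (mean removal, DC-bin handling) are assumptions of this sketch.

```python
import numpy as np

def spectral_powerlaw_slope(x, dt=1.0):
    # Amplitude spectrum via FFT, then a log-log linear fit: the slope of
    # the power-law trendline |X(f)| ~ f**alpha summarizes the dynamics.
    X = np.abs(np.fft.rfft(x - np.mean(x)))
    f = np.fft.rfftfreq(len(x), d=dt)
    keep = f > 0  # drop the DC bin before taking logarithms
    alpha, _ = np.polyfit(np.log(f[keep]), np.log(X[keep] + 1e-12), 1)
    return alpha
```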

Relevance:

20.00%

Publisher:

Abstract:

This paper applies MDS and the Fourier transform to analyze different periods of the business cycle. For this purpose, four important stock market indexes (Dow Jones, Nasdaq, NYSE, S&P500) were studied over time. The analysis under the lens of the Fourier transform showed that the indexes have characteristics similar to those of fractional noise. On the other hand, the analysis under the MDS lens identified patterns in the stock markets specific to each economic expansion period. Although the identification of patterns characteristic of each expansion period is interesting to practitioners (even if only in an a posteriori fashion), further research should explore the meaning of such regularities and aim to find a method for anticipating future crises.
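For reference, the classical (Torgerson) MDS embedding used in this kind of analysis fits in a few lines of NumPy, given a precomputed distance matrix D between the index signals; how those distances are defined is the paper's choice, and this sketch just assumes a symmetric matrix.

```python
import numpy as np

def classical_mds(D, k=2):
    # Classical (Torgerson) MDS: double-center the squared distances and
    # embed items via the top-k eigenpairs of the resulting Gram matrix.
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]  # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))
```

Plotting the resulting 2D coordinates for windows of each index is what reveals the per-expansion-period clusters described above.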