871 results for Anisotropic Analytical Algorithm


Relevance:

20.00%

Publisher:

Abstract:

Trihalomethanes (THMs) are widely reported and studied as disinfection by-products (DBPs). The most commonly detected THMs are chloroform (TCM), bromodichloromethane (BDCM), chlorodibromomethane (CDBM), and bromoform (TBM). Several studies on the determination of THMs in swimming pool water and air samples have been published. This paper reviews the most recent work in this field, with a special focus on water and air sampling, sample preparation, and analytical determination methods. An experimental study was developed in order to optimize the headspace solid-phase microextraction (HS-SPME) conditions for TCM, BDCM, CDBM and TBM in water samples using a 2³ factorial design. An extraction temperature of 45 °C, for 25 min, and a desorption time of 5 min were found to be the best conditions. Analysis was performed by gas chromatography with an electron capture detector (GC-ECD). The method was successfully applied to a set of 27 swimming pool water samples collected in the Oporto area (Portugal). TCM was the only THM detected, with levels between 4.5 and 406.5 μg L−1. Four of the samples exceeded the guideline value for total THMs in swimming pool water (100 μg L−1) indicated by the Portuguese Health Authority.
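As a hedged illustration of the 2³ factorial design mentioned above, the following Python sketch computes main effects for three coded factors (extraction temperature, extraction time, desorption time); the response values are hypothetical, not the paper's measurements.

```python
import numpy as np
from itertools import product

# Coded factor levels (-1/+1) for a 2^3 full factorial design:
# extraction temperature, extraction time, desorption time.
# Response values are hypothetical peak areas, for illustration only.
design = np.array(list(product([-1, 1], repeat=3)))  # 8 runs
response = np.array([10.2, 11.5, 12.1, 14.0, 11.0, 12.8, 13.5, 16.2])

factors = ["temperature", "extraction time", "desorption time"]
for j, name in enumerate(factors):
    # Main effect: mean response at +1 minus mean response at -1.
    effect = (response[design[:, j] == 1].mean()
              - response[design[:, j] == -1].mean())
    print(f"Main effect of {name}: {effect:+.2f}")
```

The factor with the largest absolute effect would be the one to tune first when optimizing the extraction conditions.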

Relevance:

20.00%

Publisher:

Abstract:

The knowledge of the anisotropic properties beneath the Iberian Peninsula and Northern Morocco has been dramatically improved since late 2007 with the analysis of the data provided by the dense TopoIberia broadband seismic network, the increasing number of permanent stations operating in Morocco, Portugal and Spain, and the contribution of smaller scale/higher resolution experiments. Results from the first two TopoIberia deployments have evidenced a spectacular rotation of the fast polarization direction (FPD) along the Gibraltar Arc, interpreted as evidence of mantle flow deflected around the high velocity slab beneath the Alboran Sea, and a rather uniform N100 degrees E FPD beneath the central Iberian Variscan Massif, consistent with global mantle flow models taking into account contributions of surface plate motion, density variations and net lithosphere rotation. The results from the last Iberarray deployment presented here, covering the northern part of the Iberian Peninsula, also show a rather uniform FPD orientation close to N100 degrees E, thus confirming the previous interpretation globally relating the anisotropic parameters to the lattice-preferred orientation (LPO) of mantle minerals generated by mantle flow at asthenospheric depths. However, the degree of anisotropy varies significantly, from delay time values of around 0.5 s beneath NW Iberia to values reaching 2.0 s in its NE corner. The anisotropic parameters retrieved from single events providing high quality data also show significant differences for stations located in the Variscan units of NW Iberia, suggesting that the region includes multiple anisotropic layers or complex anisotropy systems. These results complete the map of the anisotropic properties of the westernmost Mediterranean region, which can now be considered one of the best-constrained regions worldwide, with more than 300 sites investigated over an area extending from the Bay of Biscay to the Sahara platform. (C) 2015 Elsevier B.V. All rights reserved.
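For readers unfamiliar with how splitting parameters such as the FPD and delay time are estimated, here is a minimal, simplified Python sketch of the classic grid search that linearizes corrected particle motion (in the spirit of Silver & Chan, 1991); the function name, parameter ranges and input traces are assumptions, not from the paper.

```python
import numpy as np

def splitting_grid_search(north, east, dt,
                          phis_deg=np.arange(0, 180, 2),
                          lags_s=np.arange(0.0, 2.05, 0.05)):
    """Grid search over fast direction (phi) and delay time, picking
    the pair that best linearizes the corrected particle motion,
    i.e. minimizes the smaller eigenvalue of the horizontal
    covariance matrix. Simplified sketch, not production code."""
    best_phi, best_lag, best_ev = None, None, np.inf
    for phi_deg in phis_deg:
        phi = np.deg2rad(phi_deg)
        # Rotate horizontals into the trial fast/slow frame.
        fast = north * np.cos(phi) + east * np.sin(phi)
        slow = -north * np.sin(phi) + east * np.cos(phi)
        for lag in lags_s:
            n = int(round(lag / dt))
            if n >= len(slow):
                continue
            if n == 0:
                fast_c, slow_c = fast, slow
            else:
                fast_c, slow_c = fast[:-n], slow[n:]  # undo trial delay
            cov = np.cov(np.vstack([fast_c, slow_c]))
            ev = np.linalg.eigvalsh(cov)[0]  # smaller eigenvalue
            if ev < best_ev:
                best_phi, best_lag, best_ev = phi_deg, lag, ev
    return best_phi, best_lag
```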

Relevance:

20.00%

Publisher:

Abstract:

The container loading problem (CLP) is a combinatorial optimization problem for the spatial arrangement of cargo inside containers so as to maximize the usage of space. The algorithms for this problem are of limited practical applicability if real-world constraints are not considered, one of the most important of which is deemed to be stability. This paper addresses static stability, as opposed to dynamic stability, looking at the stability of the cargo during container loading. This paper proposes two algorithms. The first is a static stability algorithm based on static mechanical equilibrium conditions that can be used as a stability evaluation function embedded in CLP algorithms (e.g. constructive heuristics, metaheuristics). The second proposed algorithm is a physical packing sequence algorithm that, given a container loading arrangement, generates the actual sequence by which each box is placed inside the container, considering static stability and loading operation efficiency constraints.
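As a rough illustration of a static stability evaluation function of the kind that could be embedded in a CLP heuristic, the sketch below implements the common full-base-support check; this is a conservative proxy, simpler than the mechanical-equilibrium criterion the paper proposes, and all names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Box:
    x: int; y: int; z: int   # rear-bottom-left corner position
    w: int; d: int; h: int   # width, depth, height

def supported_fraction(box, placed):
    """Fraction of a box's base area resting on the floor or on the
    top faces of already-placed boxes. Requiring fraction == 1.0
    (full support) is a common conservative stability proxy; the
    equilibrium-based check in the abstract is more permissive."""
    if box.z == 0:
        return 1.0  # sits on the container floor
    base_area = box.w * box.d
    supported = 0
    for other in placed:
        if other.z + other.h != box.z:
            continue  # top face not level with this box's base
        overlap_w = max(0, min(box.x + box.w, other.x + other.w)
                        - max(box.x, other.x))
        overlap_d = max(0, min(box.y + box.d, other.y + other.d)
                        - max(box.y, other.y))
        supported += overlap_w * overlap_d
    return supported / base_area
```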

Relevance:

20.00%

Publisher:

Abstract:

“Many-core” systems based on a Network-on-Chip (NoC) architecture offer various opportunities in terms of performance and computing capabilities, but at the same time they pose many challenges for the deployment of real-time systems, which must fulfill specific timing requirements at runtime. It is therefore essential to identify, at design time, the parameters that have an impact on the execution time of the tasks deployed on these systems, and to derive upper bounds on the other key parameters. The focus of this work is to determine an upper bound on the traversal time of a packet when it is transmitted over the NoC infrastructure. Towards this aim, we first identify and explore some limitations in the existing recursive-calculus-based approaches to computing the Worst-Case Traversal Time (WCTT) of a packet. Then, we extend the existing model by integrating the characteristics of the tasks that generate the packets. For this extended model, we propose an algorithm called “Branch and Prune” (BP). Our proposed method provides tighter, yet still safe, estimates than the existing recursive-calculus-based approaches. Finally, we introduce a more general approach, namely “Branch, Prune and Collapse” (BPC), which offers a configurable parameter providing a flexible trade-off between computational complexity and the tightness of the computed estimate. The recursive-calculus methods and BP are two special cases of BPC, corresponding to a trade-off parameter of 1 or ∞, respectively. Through simulations, we analyze this trade-off, reason about the implications of certain choices, and also provide some case studies to observe the impact of task parameters on the WCTT estimates.
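The following Python sketch illustrates, under strong simplifying assumptions, the branch-and-prune idea for a worst-case search: branches whose optimistic bound cannot beat the incumbent are cut, and a BPC-like parameter k collapses all but the k most pessimistic choices per level. The contention model used here (each interfering flow delays the packet at most once along the route) is an illustrative assumption, not the paper's NoC model.

```python
def branch_and_prune(hops, k=None):
    """Sketch of a BP/BPC-style worst-case traversal-time search.
    `hops` is a list where each entry holds (flow_id, delay)
    candidates: flows that could block the packet at that hop.
    We assume each flow can delay the packet at most once, which is
    what makes enumeration with pruning non-trivial. With `k` set,
    only the k largest candidates per hop are branched on (a
    BPC-like collapse); k=None is plain BP. Purely illustrative."""
    best = 0

    def explore(i, acc, used, opt):
        nonlocal best
        if acc + opt <= best:
            return  # prune: optimistic bound cannot beat incumbent
        if i == len(hops):
            best = max(best, acc)
            return
        rest_opt = opt - max((d for _, d in hops[i]), default=0)
        candidates = sorted(hops[i], key=lambda fd: -fd[1])
        if k is not None:
            candidates = candidates[:k]
        for flow, delay in candidates:
            if flow not in used:
                explore(i + 1, acc + delay, used | {flow}, rest_opt)
        explore(i + 1, acc, used, rest_opt)  # no blocking at this hop

    explore(0, 0, frozenset(),
            sum(max((d for _, d in h), default=0) for h in hops))
    return best

# e.g. branch_and_prune([[(1, 5), (2, 3)], [(1, 4), (3, 2)]]) -> 7,
# since flow 1 cannot block the packet at both hops.
```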

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a new parallel implementation of a previously developed hyperspectral coded aperture (HYCA) algorithm for compressive sensing on graphics processing units (GPUs). The HYCA method combines the ideas of spectral unmixing and compressive sensing, exploiting the high spatial correlation that can be observed in the data and the generally low number of endmembers needed to explain the data. The proposed implementation exploits the GPU architecture at a low level, thus taking full advantage of the computational power of GPUs through shared memory and coalesced memory accesses. The proposed algorithm is evaluated not only in terms of reconstruction error but also in terms of computational performance, using two different GPU architectures from NVIDIA: the GeForce GTX 590 and the GeForce GTX TITAN. Experimental results using real data reveal significant speedups with respect to the serial implementation.
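To make the underlying compressive-sensing idea concrete, here is a hedged numpy sketch (serial, not the GPU implementation): because hyperspectral pixels lie near a low-dimensional simplex spanned by a few endmembers, the abundances can be recovered from far fewer measurements than spectral bands. The sizes, the unconstrained least-squares solver and all names are illustrative simplifications of HYCA, which additionally uses spatial regularization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: L spectral bands, p endmembers, N pixels,
# m < L compressive measurements per pixel.
L, p, N, m = 200, 5, 1000, 40

E = rng.random((L, p))                        # endmember signatures (known)
A_true = rng.dirichlet(np.ones(p), size=N).T  # abundances, one column/pixel
X = E @ A_true                                # data cube, band x pixel

H = rng.standard_normal((m, L)) / np.sqrt(m)  # random measurement matrix
Y = H @ X                                     # compressive measurements

# Key idea (simplified): since m >= p, recover the low-dimensional
# abundances rather than the full spectra, via least squares on H @ E.
A_hat, *_ = np.linalg.lstsq(H @ E, Y, rcond=None)
X_hat = E @ A_hat                             # reconstructed data

err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.2e}")
```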

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a step count algorithm designed to work in real time using low computational power. This proposal is our first step toward the development of an indoor navigation system based on Pedestrian Dead Reckoning (PDR). We present two approaches to solve this problem and compare them based on their step-counting error as well as their suitability for use in a real-time system.
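Below is a minimal sketch of a low-cost, real-time-friendly step counter of the kind described, using peak detection on the acceleration magnitude with a refractory period; the threshold and timing constants are illustrative, not the paper's.

```python
import numpy as np

def count_steps(accel, fs, threshold=1.2, min_interval=0.3):
    """Count steps from 3-axis accelerometer samples (N x 3, in g),
    sampled at fs Hz. A deliberately lightweight peak detector:
    a step is a local maximum of the acceleration magnitude above
    `threshold`, at least `min_interval` seconds after the last one."""
    mag = np.linalg.norm(accel, axis=1)   # total acceleration magnitude
    refractory = int(min_interval * fs)   # min samples between steps
    steps, last = 0, -refractory
    for i in range(1, len(mag) - 1):
        is_peak = (mag[i] > threshold
                   and mag[i] >= mag[i - 1] and mag[i] > mag[i + 1])
        if is_peak and i - last >= refractory:
            steps += 1
            last = i
    return steps
```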

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an ankle-mounted Inertial Navigation System (INS) used to estimate the distance traveled by a pedestrian. This distance is estimated from the number of steps taken by the user. The proposed method uses force sensors to enhance the results obtained from the INS. Experimental results have shown that, depending on the step frequency, the traveled-distance error varies between 2.7% and 5.6%.
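As a hedged sketch of how a force sensor can drive step counting, the function below detects heel strikes with a hysteresis threshold and converts steps to distance using an assumed step length; thresholds, units and the fixed step length are illustrative, and the paper additionally fuses this information with the INS.

```python
def traveled_distance(force, contact_threshold=20.0, step_length=0.75):
    """Estimate traveled distance from a foot-mounted force sensor:
    each rising edge through `contact_threshold` marks a heel strike
    (one step), and distance is steps times an assumed step length.
    Hysteresis (release at half the threshold) avoids double counts."""
    steps = 0
    in_contact = False
    for f in force:
        if not in_contact and f > contact_threshold:
            steps += 1          # heel strike: foot just hit the ground
            in_contact = True
        elif in_contact and f < contact_threshold * 0.5:
            in_contact = False  # foot lifted
    return steps * step_length
```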

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Mechanical Engineering, specialization in Design and Production.

Relevance:

20.00%

Publisher:

Abstract:

This paper introduces a new method to blindly unmix hyperspectral data, termed dependent component analysis (DECA). This method decomposes a hyperspectral image into a collection of reflectance (or radiance) spectra of the materials present in the scene (endmember signatures) and the corresponding abundance fractions at each pixel. DECA assumes that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. These abundances are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. This method overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical approaches. The effectiveness of the proposed method is illustrated using simulated data based on U.S.G.S. laboratory spectra and real hyperspectral data collected by the AVIRIS sensor over Cuprite, Nevada.
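The generative model that DECA assumes can be written down compactly; the following numpy sketch simulates it (linear mixing with abundances drawn from a mixture of Dirichlet densities), with all sizes and Dirichlet parameters chosen arbitrarily for illustration. A GEM algorithm, as in the paper, would infer the mixing matrix from data like X.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear mixing model: each pixel x = M @ a + noise, with abundances a
# drawn from a mixture of Dirichlet densities, so that they are
# non-negative and sum to one. Sizes/parameters are illustrative.
L, p, N = 100, 3, 500                 # bands, endmembers, pixels
M = rng.random((L, p))                # endmember signature matrix

# Two-component Dirichlet mixture over the abundance simplex.
weights = [0.6, 0.4]
alphas = [np.array([8.0, 1.0, 1.0]), np.array([1.0, 4.0, 4.0])]
comp = rng.choice(len(weights), size=N, p=weights)
A = np.stack([rng.dirichlet(alphas[c]) for c in comp], axis=1)  # p x N

X = M @ A + 0.01 * rng.standard_normal((L, N))   # observed pixels

assert np.allclose(A.sum(axis=0), 1.0)           # constant-sum constraint
assert (A >= 0).all()                            # non-negativity
```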

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Electrical and Computer Engineering.

Relevance:

20.00%

Publisher:

Abstract:

Dissertation submitted to Faculdade de Ciências e Tecnologia - Universidade Nova de Lisboa in fulfilment of the requirements for the degree of Doctor of Philosophy (Biochemistry - Biotechnology)

Relevance:

20.00%

Publisher:

Abstract:

The recent changes concerning consumers' active participation in the efficient management of load devices, in their own interest and in that of the network operator, namely in the context of demand response, lead to the need for improved algorithms and tools. A continuous consumption optimization algorithm has been improved in order to better manage shifted demand. It has been implemented in a simulation and user-interaction tool capable of being integrated into a previously developed multi-agent smart grid simulator, and also capable of integrating several optimization algorithms to manage real and simulated loads. The case study in this paper highlights the advantages of the proposed algorithm and the benefits of using the developed simulation and user-interaction tool.
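As a toy stand-in for the consumption optimization described above, the following sketch greedily defers the flexible share of any demand exceeding a power limit to the next period; the limit, the flexibility factor and the greedy rule are assumptions for illustration, not the paper's algorithm.

```python
def shift_loads(demand, limit, flexible):
    """Greedy demand shifting: consumption above `limit` in a period
    is deferred to the next period, but only up to the flexible
    (shiftable) share of that period's original demand."""
    shifted = list(demand)
    carry = 0.0
    for t in range(len(shifted)):
        shifted[t] += carry
        carry = 0.0
        excess = shifted[t] - limit
        movable = min(max(excess, 0.0), flexible * demand[t])
        shifted[t] -= movable
        carry = movable
    return shifted, carry  # carry: demand left past the horizon

# e.g. shift_loads([3.0, 5.0, 2.0, 1.0], limit=4.0, flexible=0.5)
# -> ([3.0, 4.0, 3.0, 1.0], 0.0): the peak at t=1 is shifted to t=2.
```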

Relevance:

20.00%

Publisher:

Abstract:

The integration of the Smart Grid concept into the electric grid creates the need for the active participation of small and medium players. This active participation can be achieved through decentralized decisions, in which the end consumer manages loads according to the Smart Grid's needs. The management of loads must handle the users' preferences, wishes and needs; however, these can change when the users are faced with exceptional events. This paper proposes the integration of exceptional events into the SCADA House Intelligent Management (SHIM) system developed by the authors, to handle machine learning issues in the domestic consumption context. An illustrative application and learning case study is provided in this paper.
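Purely as an illustration of how an exceptional event might override learned preferences in a SHIM-like manager, consider the following sketch; the event names, load names and decision rule are all hypothetical.

```python
def allowed_to_shed(load, learned_priority, exceptional_events):
    """Toy rule: during a flagged exceptional event (e.g. 'guests',
    'illness') certain loads become untouchable regardless of their
    learned shedding priority; otherwise the learned priority decides."""
    protected = {
        "guests": {"oven", "lighting"},
        "illness": {"heating", "ventilation"},
    }
    for event in exceptional_events:
        if load in protected.get(event, set()):
            return False           # event overrides learned behaviour
    return learned_priority > 0.5  # normal case: follow learned priority
```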

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented in fulfilment of the requirements for the Master’s degree in Conservation and Restoration

Relevance:

20.00%

Publisher:

Abstract:

Multi-standard mobile devices are allowing users to enjoy higher data rates with ubiquitous connectivity. However, the benefits gained from multiple interfaces come at a cost, namely higher energy consumption in an era where mobile devices need to be energy compliant. One promising solution is the use of short-range cooperative communication as an overlay for infrastructure-based networks, taking advantage of its context information. However, the node discovery mechanism, which is pivotal to the bearer establishment process, still represents a major burden in terms of the total energy budget. In this paper, we propose a technology-agnostic approach towards enhancing the MAC energy ratings by presenting a context-aware node discovery (CANDi) algorithm, which provides a priori knowledge to the node discovery mechanism, allowing it to search for nodes in the near vicinity at the 'right time and at the right place'. We describe the different beacons required for establishing the cooperation, as well as the context information required, including battery level, modes, location and so on. CANDi uses the long-range network (WiMAX and WiFi) to distribute context information about cooperative (ultra-wideband-based) clusters in the vicinity. Searching nodes can use this context to locate the cooperative clusters/nodes, which facilitates the establishment of short-range connections. Analytical and simulation results are obtained, and the energy saving gains are further demonstrated in the laboratory using a customised testbed. CANDi saves up to 50% energy during the node discovery process, while the demonstrative testbed shows up to 75% savings in the total energy budget, thus validating the algorithm, as well as providing viable evidence to support the usage of short-range cooperative communications for energy savings.
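A hedged sketch of the CANDi decision logic: context delivered over the long-range link determines whether powering the short-range radio for discovery is likely to pay off. The context fields and thresholds below are hypothetical, chosen only to illustrate the 'right time, right place' idea.

```python
def should_scan(context):
    """Decide whether waking the short-range radio for node discovery
    is worthwhile, based on context received over the long-range
    (WiMAX/WiFi) link. Field names and thresholds are illustrative."""
    near_cluster = context["distance_to_cluster_m"] < 30    # right place
    peer_available = context["cluster_battery_pct"] > 20    # peer can help
    worth_it = context["pending_traffic_kb"] > 500          # enough to gain
    return near_cluster and peer_available and worth_it

# e.g. should_scan({"distance_to_cluster_m": 12,
#                   "cluster_battery_pct": 64,
#                   "pending_traffic_kb": 2048})  -> True
```

The design point is that the expensive periodic scan is replaced by an event-driven one, which is where the reported node-discovery energy savings come from.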