910 results for complexity metrics
Abstract:
Aquatic humic substances (AHS) isolated from the Negro river in two characteristic seasons, winter and summer, corresponding to the flood and dry periods, were structurally characterized by (13)C nuclear magnetic resonance. Subsequently, AHS aqueous solutions were irradiated with a polychromatic lamp (290-475 nm) and monitored by their total organic carbon (TOC) content, ultraviolet-visible (UV-vis) absorbance, fluorescence and Fourier transform infrared (FTIR) spectroscopy. As a result, photobleaching of up to 80% was observed after 48 h of irradiation. Conformational rearrangements and structures of low molecular complexity were formed during irradiation, as deduced from the pH decrease and the shift of fluorescence to lower wavelengths. Additionally, significant mineralization with the formation of CO(2), CO, and inorganic carbon compounds was registered, as indicated by TOC losses of up to 70%. The differences in photodegradation between the samples, expressed by photobleaching efficiency, showed enhanced degradation of the summer sample, which was related to its elevated aromatic content. Aromatic structures are assumed to have a high autosensitization capacity, mediated by free radical generation from quinone and phenolic moieties.
Abstract:
Chemical admixtures increase the rheological complexity of cement pastes owing to their chemical and physical interactions with particles, which affect cement hydration and agglomeration kinetics. Using oscillatory rheometry and isothermal calorimetry, this article shows that the cellulose ether HMEC (hydroxymethyl ethylcellulose), widely used as a viscosity-modifying agent in self-compacting concretes and dry-set mortars, displayed a steric dispersant barrier effect during the first 2 h of hydration, associated with a cement-retarding nature and consequently a reduced setting speed. However, despite this stabilization effect, the polymer increased the cohesion strength when comparing cement particles at the same degree of hydration. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Wireless Sensor Networks (WSNs) have a vast field of applications, including deployment in hostile environments. Thus, the adoption of security mechanisms is fundamental. However, the extremely constrained nature of sensors and the potentially dynamic behavior of WSNs hinder the use of the key management mechanisms commonly applied in modern networks. For this reason, many lightweight key management solutions have been proposed to overcome these constraints. In this paper, we review the state of the art of these solutions and evaluate them based on metrics appropriate for WSNs. We focus on pre-distribution schemes well suited to homogeneous networks (the more general network organization), identifying generic features that can improve some of these metrics. We also discuss some challenges in the area and future research directions. (C) 2010 Elsevier B.V. All rights reserved.
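As an illustration of the kind of metric such surveys evaluate (the abstract itself names no individual scheme), the classic random key pre-distribution approach of Eschenauer and Gligor gives each node a ring of k keys drawn from a pool of P keys, and two neighbors can secure a link only if their rings intersect. A minimal sketch of that local-connectivity metric in Python, with the pool and ring sizes chosen purely for illustration:

```python
from math import comb

def share_probability(pool_size: int, ring_size: int) -> float:
    """Probability that two nodes, each holding ring_size keys drawn without
    replacement from a pool of pool_size keys, share at least one key."""
    # P(no common key) = C(P - k, k) / C(P, k)
    p_disjoint = comb(pool_size - ring_size, ring_size) / comb(pool_size, ring_size)
    return 1.0 - p_disjoint

# Illustrative values only: a 10,000-key pool and 75 keys per sensor.
print(f"local connectivity: {share_probability(10_000, 75):.3f}")
```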
Abstract:
In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. These changes are governed by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, which relies on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the "absolute" and "relative" safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689 [14], uses Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the Automatic Dependent Surveillance-Broadcast (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess CNS/ATM system safety properties, with the FSPN formalism providing important modeling capabilities and discrete event simulation allowing the estimation of the desired safety metrics. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
This paper proposes a new definition of the fundamental query operation under the Adaptive Formalism, one capable of locating functional nuclei from descriptions of their semantics. To demonstrate the method's applicability, an implementation of the query procedure constrained to a specific class of devices is shown, and its asymptotic computational complexity is discussed.
Abstract:
The roots of swarm intelligence are deeply embedded in the biological study of self-organized behaviors in social insects. Particle swarm optimization (PSO) is one of the modern metaheuristics of swarm intelligence, which can be effectively used to solve nonlinear and non-continuous optimization problems. The basic principle of the PSO algorithm rests on the assumption that potential solutions (particles) are flown through hyperspace with acceleration towards better solutions. Each particle adjusts its flight according to the flying experiences of both itself and its companions, using equations for position and velocity. During the process, the coordinates in hyperspace associated with its previous best fitness solution and the overall best value attained so far by any particle in the group are tracked and recorded in memory. In recent years, PSO approaches have been successfully applied to different problem domains with multiple objectives. In this paper, a multiobjective PSO approach, based on the concepts of Pareto optimality, dominance, an external archive of elite particles and the truncated Cauchy distribution, is proposed and applied to the constrained design of a brushless DC (direct current) wheel motor. Promising results in terms of convergence and spacing performance metrics indicate that the proposed multiobjective PSO scheme is capable of producing good solutions.
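The multiobjective machinery described above (Pareto dominance, the external archive of elite particles, the truncated Cauchy distribution) is specific to the paper, but the position and velocity equations it refers to are the canonical PSO update. A minimal single-objective sketch of that update in Python, with a simple quadratic standing in for an arbitrary fitness function:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Canonical PSO: each particle accelerates toward its own best position
    (pbest) and the swarm's best position (gbest)."""
    x = rng.uniform(-5, 5, (n_particles, dim))       # positions
    v = np.zeros_like(x)                              # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

print(pso_minimize(lambda p: np.sum(p**2)))  # converges near the origin
```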
Abstract:
The volumetric reconstruction technique presented in this paper employs a two-camera stereoscopic particle image velocimetry (SPIV) system in order to reconstruct the mean flow behind a fixed cylinder fitted with helical strakes, which are commonly used to suppress vortex-induced vibrations (VIV). The technique is based on the measurement of velocity fields at equivalent adjacent planes that results in pseudo volumetric fields. The main advantage over proper volumetric techniques is the avoidance of additional equipment and complexity. The averaged velocity fields behind the straked cylinders and the geometrical periodicity of the three-start configuration are used to further simplify the reconstruction process. Two straked cylindrical models with the same pitch (p = 10d) and two different heights (h = 0.1 and 0.2d) are tested. The reconstructed flow shows that the strakes introduce in the wake flow a well-defined wavelength of one-third of the pitch. Measurements of hydrodynamic forces, fluctuating velocity, vortex formation length, and vortex shedding frequency show the interdependence of the wake parameters. The vortex formation length is increased by the strakes, which is an important effect for the suppression of vortex-induced vibrations. The results presented complement previous investigations concerning the effectiveness of strakes as VIV suppressors and provide a basis of comparison to numerical simulations.
Abstract:
This paper investigates probabilistic logics endowed with independence relations. We review propositional probabilistic languages without and with independence. We then consider graph-theoretic representations for propositional probabilistic logic with independence; complexity is analyzed, algorithms are derived, and examples are discussed. Finally, we examine a restricted first-order probabilistic logic that generalizes relational Bayesian networks. (c) 2007 Elsevier Inc. All rights reserved.
Abstract:
This paper presents a family of algorithms for approximate inference in credal networks (that is, models based on directed acyclic graphs and set-valued probabilities) that contain only binary variables. Such networks can represent incomplete or vague beliefs, lack of data, and disagreements among experts; they can also encode models based on belief functions and possibilistic measures. All algorithms for approximate inference in this paper rely on exact inferences in credal networks based on polytrees with binary variables, as these inferences have polynomial complexity. We are inspired by approximate algorithms for Bayesian networks; thus the Loopy 2U algorithm resembles Loopy Belief Propagation, while the Iterated Partial Evaluation and Structured Variational 2U algorithms are, respectively, based on Localized Partial Evaluation and variational techniques. (C) 2007 Elsevier Inc. All rights reserved.
Abstract:
The thermal performance of a cooling tower and its cooling water system is critical for industrial plants, and small deviations from the design conditions may cause severe instability in the operation and economics of the process. External disturbances such as variation in the thermal demand of the process or oscillations in atmospheric conditions may be suppressed in multiple ways. Nevertheless, such alternatives are hardly ever implemented in the industrial operation due to the poor coordination between the utility and process sectors. The complexity of the operation increases because of the strong interaction among the process variables. In the present work, an integrated model for the minimization of the operating costs of a cooling water system is developed. The system is composed of a cooling tower as well as a network of heat exchangers. After the model is verified, several cases are studied with the objective of determining the optimal operation. It is observed that the most important operational resources to mitigate disturbances in the thermal demand of the process are, in this order: the increase in recycle water flow rate, the increase in air flow rate and finally the forced removal of a portion of the water flow rate that enters the cooling tower with the corresponding make-up flow rate. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
We present a novel array RLS algorithm with forgetting factor that circumvents the problem of fading regularization, inherent to the standard exponentially-weighted RLS, by allowing for time-varying regularization matrices with generic structure. Simulations in finite precision show the algorithm's superiority as compared to alternative algorithms in the context of adaptive beamforming.
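For context, the standard exponentially-weighted RLS mentioned above regularizes only through the initial value of its inverse correlation matrix (I/delta), whose influence fades geometrically with the forgetting factor lambda; that fading is the behavior the proposed array algorithm avoids, and the array form itself is not reproduced here. A minimal sketch of that baseline recursion in Python, applied to a toy identification problem with arbitrary parameters:

```python
import numpy as np

def ewrls(x, d, order=4, lam=0.99, delta=1e-2):
    """Standard exponentially-weighted RLS. The only regularization is the
    initial P = I/delta, whose effect decays roughly as lam**n -- the
    'fading regularization' discussed in the abstract."""
    w = np.zeros(order)
    P = np.eye(order) / delta                  # inverse correlation matrix
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]       # regressor [x[n], ..., x[n-order+1]]
        k = P @ u / (lam + u @ P @ u)          # gain vector
        e = d[n] - w @ u                       # a priori error
        w = w + k * e
        P = (P - np.outer(k, u @ P)) / lam
    return w

# Toy example: recover a short FIR response from noisy data.
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
h = np.array([0.8, -0.4, 0.2, 0.1])
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
print(np.round(ewrls(x, d), 3))                # should approach h
```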
Abstract:
We propose a robust, low-complexity scheme to estimate and track carrier frequency from signals traveling under low signal-to-noise ratio (SNR) conditions in highly nonstationary channels. Such scenarios arise in planetary exploration missions subject to high dynamics, such as the Mars exploration rover missions. The method comprises a bank of adaptive linear predictors (ALP) supervised by a convex combiner that dynamically aggregates the individual predictors. The adaptive combination is able to outperform the best individual estimator in the set, which leads to a universal scheme for frequency estimation and tracking. A simple technique for bias compensation considerably improves the ALP performance. It is also shown that retrieving the frequency content by a fast Fourier transform (FFT) search, instead of only inspecting the angle of a particular root of the error predictor filter, enhances performance, particularly at very low SNR levels. Simple techniques that enforce frequency continuity further improve the overall performance. In summary, we illustrate by extensive simulations that adaptive linear prediction methods render a robust and competitive frequency tracking technique.
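The supervising combiner described above follows the standard convex combination of adaptive filters: the two predictor outputs are mixed as y = lam*y1 + (1 - lam)*y2 with lam = sigmoid(a), and the auxiliary parameter a is adapted by stochastic gradient descent on the combined prediction error. A minimal sketch in Python with two LMS linear predictors of different step sizes; the bias compensation, FFT-based frequency retrieval and frequency-continuity steps of the paper are omitted, and all signal parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Noisy sinusoid with slowly drifting frequency (a crude stand-in for a
# high-dynamics carrier; the real scenario and parameters are the paper's).
N, order = 4000, 6
freq = 0.05 + 0.02 * np.linspace(0, 1, N)
x = np.cos(2 * np.pi * np.cumsum(freq)) + 0.3 * rng.standard_normal(N)

def lms_bank_with_convex_combiner(x, order, mus=(0.005, 0.05), mu_a=10.0):
    """Two LMS linear predictors (slow/accurate vs. fast/noisy) mixed by a
    convex combiner lam = sigmoid(a), adapted on the combined error."""
    w = [np.zeros(order) for _ in mus]
    a = 0.0
    y_comb = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]                 # past samples predict x[n]
        y = [wi @ u for wi in w]
        lam = 1.0 / (1.0 + np.exp(-a))           # mixing weight in (0, 1)
        y_comb[n] = lam * y[0] + (1 - lam) * y[1]
        e_comb = x[n] - y_comb[n]
        # Each predictor adapts on its own prediction error.
        for i, mu in enumerate(mus):
            w[i] = w[i] + mu * (x[n] - y[i]) * u
        # The combiner adapts a by stochastic gradient on the combined error.
        a += mu_a * e_comb * (y[0] - y[1]) * lam * (1 - lam)
        a = np.clip(a, -4.0, 4.0)                # keep lam away from 0 and 1
    return y_comb

y_hat = lms_bank_with_convex_combiner(x, order)
print("prediction MSE:", np.mean((x[order:] - y_hat[order:]) ** 2))
```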
Abstract:
In Part I [""Fast Transforms for Acoustic Imaging-Part I: Theory,"" IEEE TRANSACTIONS ON IMAGE PROCESSING], we introduced the Kronecker array transform (KAT), a fast transform for imaging with separable arrays. Given a source distribution, the KAT produces the spectral matrix which would be measured by a separable sensor array. In Part II, we establish connections between the KAT, beamforming and 2-D convolutions, and show how these results can be used to accelerate classical and state of the art array imaging algorithms. We also propose using the KAT to accelerate general purpose regularized least-squares solvers. Using this approach, we avoid ill-conditioned deconvolution steps and obtain more accurate reconstructions than previously possible, while maintaining low computational costs. We also show how the KAT performs when imaging near-field source distributions, and illustrate the trade-off between accuracy and computational complexity. Finally, we show that separable designs can deliver accuracy competitive with multi-arm logarithmic spiral geometries, while having the computational advantages of the KAT.
Abstract:
In this paper the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was originally designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numerical integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity aspects, are analyzed through simulations. The simulation results are compared with two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation error conditions, the proposed DPCA exhibits a smaller discrepancy from the optimum power vector solution and better convergence (under both fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs. (C) 2010 Elsevier GmbH. All rights reserved.
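To illustrate the construction described above: the Verhulst (logistic) equation dp/dt = alpha * p * (1 - p/p_star) can be discretized by one Euler step, and if the carrying capacity p_star of user i is read as the power that would meet the target SINR at the current interference level, the result is a distributed multiplicative power update. The exact recursion, constants and convergence conditions are those derived in the paper; the mapping below is only an assumed, simplified reading of it, run on a small synthetic link-gain matrix in Python:

```python
import numpy as np

rng = np.random.default_rng(4)

n_users, sigma2, gamma_t = 5, 1e-3, 2.0              # noise power, target SINR (illustrative)
G = rng.uniform(0.01, 0.05, (n_users, n_users))      # cross-link gains
np.fill_diagonal(G, rng.uniform(0.7, 1.0, n_users))  # own-link gains dominate

def sinr(p):
    interference = G @ p - np.diag(G) * p            # power received from other users
    return np.diag(G) * p / (interference + sigma2)

# Verhulst-inspired update: Euler step of dp/dt = alpha*p*(1 - p/p_star) with
# p_star = (gamma_t / sinr(p)) * p, i.e. the power reaching the target at the
# current interference, giving p <- p * (1 + alpha*(1 - sinr(p)/gamma_t)).
alpha = 0.1
p = np.full(n_users, 1e-2)
for _ in range(500):
    p = p * (1.0 + alpha * (1.0 - sinr(p) / gamma_t))

print(np.round(sinr(p), 3))                          # each entry should approach gamma_t
```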
Abstract:
The behavior of normal individuals and psychiatric patients varies in a similar way, following power laws. The presence of identical patterns of behavioral variation in individuals with different levels of activity is suggestive of self-similarity phenomena. Based on these findings, we propose that human behavior in a social context can constitute a system exhibiting self-organized criticality (SOC). The introduction of the SOC concept into psychological theories can help to approach the question of behavior predictability by taking into consideration its intrinsic stochastic character. Also, the ceteris paribus generalizations characteristic of psychological laws can be seen as a consequence of the individual-level description of a more complex collective phenomenon. Although limited, this study suggests that, if an adequate level of description is adopted, the complexity of human behavior can be more easily approached and its individual and social components can be more realistically modeled. (C) 2009 Elsevier Ltd. All rights reserved.