939 results for SYSTEMS ANALYSIS
Abstract:
This paper studies forest fires from the perspective of dynamical systems. Burnt area, precipitation and atmospheric temperature are interpreted as state variables of a complex system, and the correlations between them are investigated by means of different mathematical tools. First, we use mutual information to reveal potential relationships in the data. Second, we adopt the state space portrait to characterize the system's behavior. Third, we compare the annual state space curves and apply clustering and visualization tools to unveil long-range patterns. We use forest fire data for Portugal covering the years 1980–2003. The territory is divided into two regions (North and South), characterized by different climates and vegetation. The adopted methodology represents a new viewpoint in the context of forest fires, shedding light on a complex phenomenon that needs to be better understood in order to mitigate its devastating consequences, at both the economic and environmental levels.
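As an illustration of the first step, the sketch below estimates histogram-based mutual information between two state variables; the series (synthetic stand-ins for burnt area and precipitation), the bin count and the record length are illustrative assumptions, not the paper's actual configuration.

import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of I(X;Y), in nats, for two 1-D series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()              # joint probability p(x, y)
    px = pxy.sum(axis=1, keepdims=True)    # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)    # marginal p(y)
    nz = pxy > 0                           # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Synthetic monthly series: wetter months tend to burn less.
rng = np.random.default_rng(0)
precip = rng.gamma(2.0, 50.0, size=288)            # 24 years x 12 months
burnt = 1000.0 / (1.0 + precip) + rng.normal(0.0, 1.0, 288)
print(f"I(burnt; precip) = {mutual_information(burnt, precip):.3f} nats")

A value well above the level obtained for shuffled series indicates a dependence between the two variables, which is how mutual information reveals the relationships mentioned above.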
Abstract:
This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid, dividing the Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare the real-world data. Hierarchical clustering and multidimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
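A minimal sketch of the comparison step, assuming two regional event histograms as input; the magnitude bins and counts are invented for illustration, and only the classical (Shannon) Jensen–Shannon divergence is shown, not the generalized fractional variant the paper also tests.

import numpy as np

def jensen_shannon(p, q):
    """Jensen-Shannon divergence, in nats, between two discrete distributions."""
    p = np.asarray(p, float); p /= p.sum()
    q = np.asarray(q, float); q /= q.sum()
    m = 0.5 * (p + q)                      # mixture distribution

    def kl(a, b):
        nz = a > 0                         # m > 0 wherever a > 0, so this is safe
        return float((a[nz] * np.log(a[nz] / b[nz])).sum())

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Invented magnitude-bin counts for two grid regions.
region_a = [120, 80, 30, 9, 2]
region_b = [95, 90, 40, 12, 1]
print(f"JSD = {jensen_shannon(region_a, region_b):.4f} nats")

Because the divergence reduces each pair of regions to a single number, the resulting matrix can feed hierarchical clustering or multidimensional scaling directly.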
Abstract:
Complex industrial plants exhibit multiple interactions among smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries, causing a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power law (PL) distributions. For early years, the data reveal double PL behavior, while, for more recent time periods, a single PL fits the empirical data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, together with multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. Both the classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
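The following sketch shows two of the ingredients under stated assumptions: a continuous maximum-likelihood estimate of a single PL exponent (in the style of Clauset et al.) and the classical Kullback–Leibler divergence. The fatality data are synthetic, the tail cutoff xmin is arbitrary, and the generalized fractional measures are not reproduced here.

import numpy as np

def powerlaw_alpha(x, xmin):
    """Continuous MLE of the power-law exponent for the tail x >= xmin."""
    tail = np.asarray([v for v in x if v >= xmin], float)
    return 1.0 + len(tail) / np.log(tail / xmin).sum()

def kl_divergence(p, q):
    """Kullback-Leibler divergence, in nats; assumes q > 0 wherever p > 0."""
    p = np.asarray(p, float); p /= p.sum()
    q = np.asarray(q, float); q /= q.sum()
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / q[nz])).sum())

# Synthetic heavy-tailed fatality counts standing in for the accident series.
rng = np.random.default_rng(1)
fatalities = (rng.pareto(1.8, 500) + 1.0) * 10.0
print(f"estimated alpha = {powerlaw_alpha(fatalities, xmin=10.0):.2f}")

# Compare the histograms of an early and a late period (+1 avoids empty bins).
early = np.histogram(fatalities[:250], bins=10, range=(0, 300))[0] + 1
late = np.histogram(fatalities[250:], bins=10, range=(0, 300))[0] + 1
print(f"KL(early || late) = {kl_divergence(early, late):.4f} nats")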
Abstract:
In this paper, we propose the Distributed using Optimal Priority Assignment (DOPA) heuristic, which finds a feasible partitioning and priority assignment for distributed applications based on the linear transactional model. DOPA partitions the tasks and messages in the distributed system and makes use of the Optimal Priority Assignment (OPA) algorithm, known as Audsley's algorithm, to find the priorities for that partition. The experimental results show how the use of the OPA algorithm increases, on average, the number of schedulable tasks and messages in a distributed system when compared to the Deadline Monotonic (DM) assignment usually favoured in other works. Afterwards, we extend these results to the assignment of parallel/distributed applications and present a second heuristic named Parallel-DOPA (P-DOPA). In that case, we show how the partitioning process can be simplified by using the Distributed Stretch Transformation (DST), a parallel transaction transformation algorithm introduced in [1].
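For context, here is a minimal sketch of the OPA (Audsley) scheme that DOPA builds on: priorities are assigned from lowest to highest, and at each level any task that is schedulable with all remaining tasks above it may take the slot. The schedulability test is abstracted as a caller-supplied predicate, which is an assumption of this sketch rather than part of the paper's transactional model.

def audsley_opa(tasks, schedulable):
    """Audsley's Optimal Priority Assignment.

    tasks: list of task identifiers.
    schedulable(task, higher): True if `task` meets its deadline when every
        task in `higher` is assigned a higher priority than `task`.
    Returns the tasks ordered from lowest to highest priority, or None if
    no feasible priority assignment exists.
    """
    unassigned = list(tasks)
    order = []                           # filled lowest priority first
    while unassigned:
        for t in unassigned:
            others = [u for u in unassigned if u is not t]
            if schedulable(t, others):   # t can take the lowest free level
                unassigned.remove(t)
                order.append(t)
                break
        else:
            return None                  # no task fits this level: infeasible
    return order

The scheme performs at most n(n+1)/2 schedulability tests for n tasks and is optimal whenever the test does not depend on the relative order of the higher-priority tasks.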
Abstract:
23rd International Conference on Real-Time Networks and Systems (RTNS 2015), Main Track, 4–6 Nov 2015, Lille, France. Best Paper Award Nominee.
Abstract:
Article in Press, Corrected Proof
Abstract:
Proceedings of the 12th Conference on 'Dynamical Systems - Theory and Applications'
Abstract:
This paper applies multidimensional scaling techniques and the Fourier transform to visualize possible time-varying correlations between 25 stock market values. The method is useful for observing clusters of stock markets with similar behavior.
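A minimal sketch of the idea, assuming a matrix of daily returns as input; the data here are random stand-ins and the correlation-to-distance mapping is one standard choice, not necessarily the one used in the paper.

import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(2)
returns = rng.normal(size=(250, 25))     # 250 days x 25 markets (synthetic)

corr = np.corrcoef(returns.T)            # 25 x 25 correlation matrix
dist = np.sqrt(2.0 * (1.0 - corr))       # a standard correlation distance
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)
# Plotting `coords` places strongly correlated markets close together,
# making clusters of similar behavior visible; repeating the procedure on
# sliding windows exposes the time-varying structure.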
Abstract:
The internal impedance of a wire is a function of frequency. In a conductor where the conductivity is sufficiently high, the displacement current density can be neglected. In this case, the conduction current density is given by the product of the electric field and the conductivity. One aspect of high-frequency effects is the skin effect (SE). The fundamental problem with the SE is that it attenuates the higher frequency components of a signal. The SE was first verified by Kelvin in 1887. Since then, many researchers have worked on the subject, and presently a comprehensive physical model, based on the Maxwell equations, is well established. The Maxwell formalism plays a fundamental role in electromagnetic theory. These equations lead to the derivation of mathematical descriptions useful in many applications in physics and engineering. Maxwell is generally regarded as the 19th century scientist who had the greatest influence on 20th century physics, making contributions to the fundamental models of nature. The Maxwell equations involve only the integer-order calculus and, therefore, it is natural that the resulting classical models adopted in electrical engineering reflect this perspective. Recently, a closer look at some phenomena present in electrical systems, and the motivation towards the development of precise models, seem to point to the need for a fractional calculus approach. Bearing these ideas in mind, in this study we address the SE and re-evaluate the results, demonstrating its fractional-order nature.
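For reference, the half-order character of the SE is visible in the standard Maxwell-based high-frequency expressions for a round conductor of radius $a$, conductivity $\sigma$ and permeability $\mu$; these are textbook results consistent with the argument, not formulas quoted from the paper:

\[
\delta = \sqrt{\frac{2}{\omega\mu\sigma}}, \qquad
Z(\mathrm{j}\omega) \approx \frac{1}{2\pi a}\sqrt{\frac{\mathrm{j}\omega\mu}{\sigma}} \propto (\mathrm{j}\omega)^{1/2},
\]

where $\delta$ is the skin depth and $Z$ the internal impedance per unit length; the $(\mathrm{j}\omega)^{1/2}$ dependence of $Z$ is precisely a half-order (fractional) operator.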
Abstract:
This paper analyzes the performance of two cooperative robot manipulators. In order to capture the working performance, we formulate several performance indices that measure the manipulability, the effort reduction and the equilibrium between the two robots. Based on the proposed indices, we determine the optimal values of the system parameters. Furthermore, we study the implementation of fractional-order algorithms in the position/force control of two cooperative robotic manipulators holding an object.
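As a sketch of the last point, a fractional-order PD^alpha control action can be approximated with Grünwald–Letnikov weights; the gains, the order alpha and the error signal below are illustrative assumptions, not the paper's tuned controller.

import numpy as np

def gl_weights(alpha, n):
    """Grunwald-Letnikov binomial weights w_k = (-1)^k * C(alpha, k)."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k   # recursive binomial update
    return w

def fractional_derivative(signal, alpha, dt):
    """Discrete GL approximation of the alpha-order derivative of a signal."""
    w = gl_weights(alpha, len(signal))
    out = np.zeros_like(signal)
    for n in range(len(signal)):
        out[n] = (w[: n + 1] @ signal[n::-1]) / dt ** alpha
    return out

# Hypothetical PD^alpha action on a force-error trajectory.
dt, alpha, kp, kd = 0.01, 0.5, 10.0, 2.0
err = np.sin(np.linspace(0.0, 2.0 * np.pi, 200))   # stand-in error signal
u = kp * err + kd * fractional_derivative(err, alpha, dt)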
Abstract:
A new method for the study and optimization of manipulator trajectories is developed. The novel feature resides in the modeling formulation. Standard system descriptions are based on a set of differential equations which, in general, require laborious computations and may be difficult to analyze. Moreover, the derived algorithms are suited to "deterministic" tasks, such as those appearing in repetitive work, and are not well adapted to the "random" operation that occurs in intelligent systems interacting with a non-structured and changing environment. These facts motivate the development of alternative models based on distinct concepts. The proposed embedding of statistics and the Fourier transform gives a new perspective on the calculation and optimization of robot trajectories in manipulating tasks.
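A minimal sketch of the proposed embedding, under the assumption that many noisy executions of the same motion are summarized by the ensemble-averaged magnitude spectrum; the trajectory below is a synthetic stand-in.

import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 256, endpoint=False)
# 100 noisy runs of the same joint motion, with random phase per run.
runs = [np.sin(2.0 * np.pi * 3.0 * t + rng.uniform(0.0, 2.0 * np.pi))
        + 0.1 * rng.normal(size=t.size) for _ in range(100)]

spectra = np.abs(np.fft.rfft(runs, axis=1))   # magnitude spectrum of each run
mean_spectrum = spectra.mean(axis=0)          # statistical (ensemble) average
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
# `mean_spectrum` vs `freqs` is a compact signature of the trajectory ensemble
# that can be compared and optimized without integrating differential equations.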
Abstract:
Presented at the IEEE Real-Time Systems Symposium (RTSS 2015), 1–4 Dec 2015, San Antonio, USA.
Abstract:
Presented at the IEEE Real-Time Systems Symposium (RTSS 2015), 1–4 Dec 2015, San Antonio, USA.
Abstract:
Recent technological advancements and market trends are driving an interesting convergence of the High-Performance Computing (HPC) and Embedded Computing (EC) domains. On one side, new kinds of HPC applications are being required by markets needing huge amounts of information to be processed within a bounded amount of time. On the other side, EC systems are increasingly concerned with providing higher performance in real time, challenging the performance capabilities of current architectures. The advent of next-generation many-core embedded platforms has the chance of intercepting this converging need for predictable high performance, allowing HPC and EC applications to be executed on efficient and powerful heterogeneous architectures integrating general-purpose processors with many-core computing fabrics. To this end, it is of paramount importance to develop new techniques for exploiting the massively parallel computation capabilities of such platforms in a predictable way. P-SOCRATES will tackle this important challenge by merging leading research groups from the HPC and EC communities. The time-criticality and parallelisation challenges common to both areas will be addressed by proposing an integrated framework for executing workload-intensive applications with real-time requirements on top of next-generation commercial off-the-shelf (COTS) platforms based on many-core accelerated architectures. The project will investigate new HPC techniques that fulfil real-time requirements. The main sources of indeterminism will be identified, and efficient mapping and scheduling algorithms will be proposed, along with the associated timing and schedulability analysis, to guarantee the real-time and performance requirements of the applications.
Abstract:
The increasing number of television channels, on-demand services and online content is expected to contribute to a better quality of experience for customers of such services. However, the lack of efficient methods for finding the right content, adapted to personal interests, may lead to a progressive loss of clients. In such a scenario, recommendation systems are seen as a tool that can fill this gap and contribute to the loyalty of users. Multimedia content, namely films and television programmes, is usually described using a set of metadata elements that include the title, a genre, the date of production, and the list of directors and actors. This paper provides an in-depth study of how the use of different metadata elements can contribute to increasing the quality of the suggested recommendations. The analysis is conducted using the Netflix and Movielens datasets, and aspects such as the granularity of the descriptions, the accuracy metric used and the sparsity of the data are taken into account. Comparisons with collaborative approaches are also presented.
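A minimal content-based sketch of the idea, with invented titles and metadata tokens; the actual study works on the Netflix and Movielens datasets with richer descriptors.

import numpy as np

# One-hot encode genre/year/director/actor tokens per title (all invented).
catalog = {
    "Film A": {"drama", "1999", "director:X", "actor:Y"},
    "Film B": {"drama", "2001", "director:X", "actor:Z"},
    "Film C": {"comedy", "2001", "director:W", "actor:Z"},
}
vocab = sorted(set().union(*catalog.values()))
vectors = {t: np.array([tag in tags for tag in vocab], float)
           for t, tags in catalog.items()}

def recommend(title, k=2):
    """Rank the other titles by cosine similarity of their metadata vectors."""
    v = vectors[title]
    scores = {t: float(v @ u / (np.linalg.norm(v) * np.linalg.norm(u)))
              for t, u in vectors.items() if t != title}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("Film A"))   # titles sharing director/genre rank first

Varying which metadata elements enter `catalog` (only genre, genre plus directors, and so on) is exactly the kind of granularity experiment the abstract describes.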