32 results for Multidimensional. Development. Convergence. Divergence. Analysis of groupings


Relevance: 100.00%

Publisher:

Abstract:

Complex industrial plants exhibit multiple interactions among smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries, causing a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power law (PL) distributions. For early years, the data reveal double PL behavior, while, for more recent periods, a single PL fits the empirical data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, and adopt multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. Both the classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
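The classical Shannon entropy and Kullback–Leibler divergence used in this abstract can be sketched in a few lines. The two histograms below are invented stand-ins for normalized fatality-count distributions from two periods, not the paper's data, and the generalized (fractional) variants are not reproduced here.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q); assumes q > 0 wherever p > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical normalized histograms for two time periods.
p = [0.5, 0.25, 0.125, 0.125]
q = [0.4, 0.3, 0.2, 0.1]
```

The divergence is zero only when the two distributions coincide, which is what makes it usable as a dissimilarity measure for the MDS step.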

Abstract:

This paper studies the impact of energy on electricity markets using multidimensional scaling (MDS). Data from major energy and electricity markets are considered. Several maps produced by MDS are presented and discussed, revealing that this method is useful for understanding the correlations between markets. Furthermore, the results help electricity market agents hedge against Market Clearing Price (MCP) volatility.
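A minimal sketch of how such MDS maps can be produced: classical (Torgerson) MDS turns a distance matrix into 2-D coordinates via double centering and an eigendecomposition. The tiny correlation matrix below is invented for illustration, and the distance transform d = sqrt(2(1 - r)) is one common choice, not necessarily the paper's.

```python
import numpy as np

# Hypothetical correlations between three markets; markets 0 and 1 move together.
r = np.array([[1.0, 0.9, 0.2],
              [0.9, 1.0, 0.3],
              [0.2, 0.3, 1.0]])
d = np.sqrt(2.0 * (1.0 - r))                 # correlation -> distance

n = d.shape[0]
j = np.eye(n) - np.ones((n, n)) / n          # centering matrix
bmat = -0.5 * j @ (d ** 2) @ j               # double-centered squared distances
vals, vecs = np.linalg.eigh(bmat)
order = np.argsort(vals)[::-1]               # largest eigenvalues first
coords = vecs[:, order[:2]] * np.sqrt(np.maximum(vals[order[:2]], 0.0))
```

Plotting `coords` gives the kind of map discussed in the abstract: strongly correlated markets land close together.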

Abstract:

This paper applies multidimensional scaling techniques and the Fourier transform to visualize possible time-varying correlations between 25 stock market values. The method is useful for observing clusters of stock markets with similar behavior.
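One way to combine the two tools is to compare the Fourier amplitude spectra of the series and feed the resulting distance matrix to MDS. The synthetic series below are stand-ins for market index values; the exact distance measure used in the paper may differ.

```python
import numpy as np

t = np.arange(256)
series = np.array([
    np.sin(2 * np.pi * t / 32),          # market with a 32-sample cycle
    np.sin(2 * np.pi * t / 32 + 0.3),    # same rhythm, phase-shifted
    np.sin(2 * np.pi * t / 8),           # a different rhythm
])
spectra = np.abs(np.fft.rfft(series, axis=1))   # amplitude spectra ignore phase
dist = np.linalg.norm(spectra[:, None, :] - spectra[None, :, :], axis=2)
```

Because the amplitude spectrum discards phase, the two series sharing a rhythm end up close in the distance matrix even though they are shifted in time, which is exactly the clustering behavior the abstract describes.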

Abstract:

Crowdfunding (CF) is an increasingly attractive source of funding for social projects. However, to the best of our knowledge, the study of CF for social purposes has remained largely unexplored in the literature. This research provides a detailed examination of the role of CF in the early stage of social projects at the regional level. By comparing the characteristics of the projects available on the Portuguese Social Stock Exchange (PSSE) platform with others that did not use this source of financial support, we explore its role in regional development. The results show that, in most cases, both PSSE and Non-Governmental Organization projects complemented the services offered by the State or by the private sector. Furthermore, about a quarter of the projects on PSSE operated in areas addressed neither by the services of the State nor by those of the private sector. The results also show that more recent social ventures have a greater propensity to use PSSE, as do organizations that work closely with their target audience. We also observed that the use of PSSE was correlated with the geographical scope of the social venture: operating at a local or regional level seems to be strongly associated with the possibility of using social crowdfunding to finance social projects.

Abstract:

This paper analyzes musical opuses from the point of view of two mathematical tools, namely entropy and multidimensional scaling (MDS). Fourier analysis reveals fractional dynamics, but the time rhythm variations are diluted along the spectrum. The combination of time-window entropy and MDS captures the time characteristics and is well suited to treating a large volume of data. The experiments focus on a large number of compositions classified into three sets of musical styles, namely "Classical", "Jazz", and "Pop & Rock". Without loss of generality, the present study applies the tools to these sets of musical compositions in a methodology leading to clear conclusions, but extensions to other settings are straightforward. The results reveal significant differences between the musical styles, demonstrating the feasibility of the proposed strategy and motivating further developments toward a dynamical analysis of musical compositions.
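The time-window entropy idea can be sketched as follows: slide a window over the signal, histogram the values inside each window, and compute the Shannon entropy of that histogram. The signals, window size, and bin count below are all illustrative choices, not the paper's settings.

```python
import math

def window_entropy(signal, win, bins=8):
    """Shannon entropy (nats) of the value histogram in each non-overlapping window."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0          # guard against a constant signal
    out = []
    for start in range(0, len(signal) - win + 1, win):
        chunk = signal[start:start + win]
        counts = [0] * bins
        for v in chunk:
            counts[min(int((v - lo) / width), bins - 1)] += 1
        probs = [c / win for c in counts if c]
        out.append(-sum(p * math.log(p) for p in probs))
    return out

flat = [1.0] * 64                                      # a monotonous passage
varied = [(i * 7919) % 13 / 13 for i in range(64)]     # a busy passage
```

A monotonous stretch yields zero entropy in every window, while a varied stretch yields consistently high entropy, so the sequence of per-window entropies traces how the "activity" of the piece evolves over time.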

Abstract:

This paper studies human DNA from the perspective of signal processing. Six wavelets are tested for analyzing the information content of the human DNA. By adopting the real Shannon wavelet, several fundamental properties of the code are revealed. A quantitative comparison of the chromosomes, with visualization through multidimensional scaling and dendrograms, is developed.
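Before any wavelet can be applied, the symbolic sequence must become numeric. A standard first step in DNA signal processing is the Voss binary-indicator mapping, which turns a sequence into four 0/1 signals, one per base. The fragment below is made up, and the paper's actual wavelet analysis (e.g. the real Shannon wavelet) is not reproduced here.

```python
# Voss binary-indicator mapping: one indicator signal per base.
def voss_mapping(sequence):
    return {base: [1 if s == base else 0 for s in sequence]
            for base in "ACGT"}

seq = "ACGTACGGTTAC"      # made-up fragment, not real chromosome data
signals = voss_mapping(seq)
```

Each position contributes a 1 to exactly one of the four signals, so the four indicator series partition the sequence; any of them can then be fed to a wavelet or Fourier analysis as an ordinary discrete signal.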

Abstract:

Over time, the XML markup language has acquired considerable importance in application development, standards definition, and the representation of large volumes of data, such as databases. Today, processing XML documents quickly is a critical activity in a wide range of applications, which requires choosing the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language such as Java for XML processing, it becomes necessary to use effective mechanisms, e.g. APIs, that allow large documents to be read and processed appropriately. This paper presents a performance study of the main existing Java APIs that deal with XML documents, in order to identify the most suitable one for processing large XML files.
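The core trade-off such a study measures, tree-based versus streaming parsing, can be illustrated with Python's standard library as a stand-in for the Java APIs (DOM versus SAX/StAX): the tree parser materializes the whole document in memory, while the streaming parser visits elements one at a time and can discard them.

```python
import io
import time
import xml.dom.minidom
import xml.etree.ElementTree as ET

doc = "<root>" + "<item>x</item>" * 10_000 + "</root>"

# Tree-based parse: the whole document lives in memory at once.
t0 = time.perf_counter()
dom = xml.dom.minidom.parseString(doc)
dom_count = len(dom.getElementsByTagName("item"))
dom_time = time.perf_counter() - t0

# Streaming parse: elements are processed and discarded as they complete.
t0 = time.perf_counter()
stream_count = 0
for event, elem in ET.iterparse(io.BytesIO(doc.encode()), events=("end",)):
    if elem.tag == "item":
        stream_count += 1
        elem.clear()                  # free the processed subtree
stream_time = time.perf_counter() - t0
```

Both approaches see the same elements; the difference that matters for large files is the bounded memory footprint of the streaming loop, which is the property the SAX/StAX family exploits in Java.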

Abstract:

The selection of resource systems plays an important part in the integration of Distributed/Agile/Virtual Enterprises (D/A/VEs). However, as this paper points out, resource system selection remains a difficult problem to solve in a D/A/VE. Globally, the selection problem has been formulated from different perspectives, giving rise to different kinds of models and algorithms to solve it. The major goal of our final project is an intelligent and flexible web prototype tool (broker tool) that integrates all the selection model activities and tools and can adapt to each D/A/VE project or instance. To support its development, this paper presents a formulation of one kind of resource selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, defining the domain of applicability of the algorithms for the problem studied. The limitations detected create the need for other kinds of algorithms (approximate solution algorithms) outside the domain of applicability found for the algorithms simulated. For a broker tool, however, knowledge of the algorithms' limitations is very important, in order to develop and select, based on problem features, the most suitable algorithm that guarantees good performance.

Abstract:

On-chip debug (OCD) features are frequently available in modern microprocessors. Their contribution to shortening time-to-market justifies the industry investment in this area, where a number of competing or complementary proposals are available or under development, e.g. NEXUS, CJTAG, IJTAG. The controllability and observability provided by OCD infrastructures form a valuable toolbox that can be used well beyond the debugging arena, improving the return on investment by diluting its cost across a wider spectrum of application areas. This paper discusses the use of OCD features for validating fault-tolerant architectures, and in particular the efficiency of various fault injection methods provided by enhanced OCD infrastructures. The reference data for our comparative study were captured on a workbench comprising the 32-bit Freescale MPC-565 microprocessor, an iSYSTEM IC3000 debugger (iTracePro version) and the Winidea 2005 debugging package. All enhanced OCD infrastructures were implemented in VHDL, and the results were obtained by simulation within the same fault injection environment. The focus of this paper is the comparative analysis of the experimental results obtained for various OCD configurations and debugging scenarios.
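The principle behind such fault injection campaigns can be sketched in software: corrupt one bit of a stored value (emulating a transient fault written into a register through the debug port) and check whether the fault-tolerance mechanism, here a simple parity bit, detects it. This is a conceptual model only; real campaigns drive the target's OCD infrastructure, not a Python object.

```python
def parity(word):
    """Even/odd parity of a word's set bits."""
    return bin(word).count("1") % 2

def inject_fault(word, bit):
    """Flip a single bit, emulating a transient fault in a register."""
    return word ^ (1 << bit)

stored = 0b1011_0010
stored_parity = parity(stored)         # protection computed before the fault

faulty = inject_fault(stored, 5)       # inject a single-bit upset
detected = parity(faulty) != stored_parity
```

Any single-bit flip changes the parity, so the check always fires; measuring how often and how fast such checks fire across many injected faults is what an OCD-based campaign automates.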

Abstract:

The increasing complexity of VLSI circuits and the reduced accessibility of modern packaging and mounting technologies restrict the usefulness of conventional in-circuit debugging tools, such as in-circuit emulators for microprocessors and microcontrollers. However, this same trend enables the development of more complex products, which in turn require more powerful debugging tools. These conflicting demands could be met if the standard scan test infrastructures now common in most complex components were able to match the debugging requirements of design verification and prototype validation. This paper analyses the main debug requirements in the design of microprocessor-based applications and the feasibility of their implementation using the mandatory, optional and additional operating modes of the standard IEEE 1149.1 test infrastructure.
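The mechanism that makes the IEEE 1149.1 infrastructure usable for debugging is the boundary-scan chain: on each TCK cycle, the cell nearest TDO drives the output while a new TDI bit enters the front of the chain, so captured pin states can be shifted out for inspection. The cell values below are invented for illustration, and this tiny model ignores the TAP controller state machine.

```python
def shift_chain(cells, tdi_bits):
    """Shift bits through a boundary-scan register model, returning (cells, tdo)."""
    tdo = []
    for bit in tdi_bits:
        tdo.append(cells[-1])          # last cell appears on TDO
        cells = [bit] + cells[:-1]     # chain shifts one cell toward TDO
    return cells, tdo

captured = [1, 1, 0]                   # pin states sampled into the chain
cells, tdo = shift_chain(captured, [0, 0, 0])
```

After as many clocks as there are cells, the entire captured state has been observed at TDO (last cell first) and the shifted-in pattern now sits in the chain, ready to be driven onto the pins; this shift-capture-update cycle is the observability/controllability primitive the paper builds its debug requirements on.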

Abstract:

Industrial activity is inevitably associated with a certain degradation of environmental quality, because it is not possible to guarantee that a manufacturing process is totally innocuous. The eco-efficiency concept is globally accepted as a philosophy of enterprise management that encourages companies to become more competitive, innovative and environmentally responsible by linking their objectives for business excellence with their objectives for environmental excellence. This link requires an organizational methodology in which the performance of the company is consistent with sustainable development. The main purpose of this project is to apply the concept of eco-efficiency to the particular case of the metallurgical and metalworking industries, by developing the particular indicators needed and producing a manual of procedures for implementing the appropriate solution.

Abstract:

The most common techniques for stress analysis and strength prediction of adhesive joints involve analytical or numerical methods such as the Finite Element Method (FEM). However, the Boundary Element Method (BEM) is an alternative numerical technique that has been successfully applied to the solution of a wide variety of engineering problems. This work evaluates the applicability of the boundary element code BEASY as a design tool for analyzing adhesive joints. The linearity of peak shear and peel stresses with the applied displacement is studied and compared between BEASY and the analytical model of Frostig et al., considering a bonded single-lap joint under tensile loading. The BEM results are also compared with FEM in terms of stress distributions. To evaluate the mesh convergence of BEASY, the influence of mesh refinement on peak shear and peel stress distributions is assessed. Joint stress predictions are carried out numerically in BEASY and ABAQUS®, and analytically with the models of Volkersen, Goland and Reissner, and Frostig et al. The failure loads for each model are compared with experimental results. The preparation, processing, and mesh creation times are compared for all models. The BEASY results show good agreement with the conventional methods.
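Of the analytical models compared, Volkersen's is the simplest: for identical adherends, the adhesive shear stress follows a cosh profile that peaks at the overlap ends. The sketch below implements that classical closed form with illustrative geometry and material values, not the paper's specimens; the Goland–Reissner and Frostig models are not reproduced.

```python
import math

P = 1000.0    # applied load, N
b = 25.0      # joint width, mm
l = 25.0      # overlap length, mm
t = 2.0       # adherend thickness, mm
ta = 0.2      # adhesive thickness, mm
E = 70e3      # adherend Young's modulus, MPa
Ga = 1e3      # adhesive shear modulus, MPa

w = math.sqrt(2.0 * Ga / (E * t * ta))   # shear-lag parameter for identical adherends

def tau(x):
    """Adhesive shear stress at x, measured from the overlap center."""
    return P * w * math.cosh(w * x) / (2.0 * b * math.sinh(w * l / 2.0))

peak = tau(l / 2.0)       # shear stress peaks at the overlap ends
average = P / (b * l)     # uniform-shear estimate for comparison
```

Integrating the shear stress over the bondline recovers the applied load, a useful sanity check on any implementation, and the peak-to-average ratio quantifies the stress concentration that drives failure at the overlap ends.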

Abstract:

This paper studies forest fires from the perspective of dynamical systems. Burnt area, precipitation and atmospheric temperatures are interpreted as state variables of a complex system and the correlations between them are investigated by means of different mathematical tools. First, we use mutual information to reveal potential relationships in the data. Second, we adopt the state space portrait to characterize the system’s behavior. Third, we compare the annual state space curves and we apply clustering and visualization tools to unveil long-range patterns. We use forest fire data for Portugal, covering the years 1980–2003. The territory is divided into two regions (North and South), characterized by different climates and vegetation. The adopted methodology represents a new viewpoint in the context of forest fires, shedding light on a complex phenomenon that needs to be better understood in order to mitigate its devastating consequences, at both the economic and environmental levels.
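Mutual information, the first tool mentioned, measures how much knowing one discretized variable (say, a temperature class) reduces uncertainty about another (say, a burnt-area class). A minimal estimator over symbol sequences, with invented data rather than the Portuguese fire records:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information (nats) between two equal-length symbol sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

dependent = mutual_information([0, 0, 1, 1], [0, 0, 1, 1])     # perfectly coupled
independent = mutual_information([0, 0, 1, 1], [0, 1, 0, 1])   # unrelated
```

Perfectly coupled series yield the full entropy of the signal, while unrelated series yield zero, which is what makes the measure a useful first screen for relationships before building state space portraits.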

Abstract:

This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid, dividing Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multidimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
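The classical Jensen–Shannon divergence is the symmetric, bounded relative of the Kullback–Leibler divergence, which makes it a convenient distance-like measure between histograms from different grid cells. A sketch with made-up distributions; the generalized (fractional) variant is not reproduced.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon(p, q):
    """Symmetric, bounded (<= ln 2) divergence between two distributions."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.3, 0.2]      # hypothetical magnitude histogram, cell A
q = [0.2, 0.3, 0.5]      # hypothetical magnitude histogram, cell B
```

Its symmetry and boundedness are what allow a matrix of pairwise divergences to be fed directly to hierarchical clustering or MDS, as the abstract describes.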

Abstract:

A new method for the study and optimization of manipulator trajectories is developed. The novel feature resides in the modeling formulation. Standard system descriptions are based on a set of differential equations which, in general, require laborious computations and may be difficult to analyze. Moreover, the derived algorithms are suited to "deterministic" tasks, such as those appearing in repetitive work, and are not well adapted to the "random" operation that occurs in intelligent systems interacting with a non-structured and changing environment. These facts motivate the development of alternative models based on distinct concepts. The proposed embedding of statistics and the Fourier transform gives a new perspective on the calculation and optimization of robot trajectories in manipulation tasks.
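In the spirit of the Fourier side of that formulation, a sampled joint trajectory can be described by its frequency content rather than by a differential equation. The 2 Hz sinusoidal motion below is a made-up stand-in for recorded manipulator data, not the paper's method.

```python
import numpy as np

fs = 100.0                                   # sampling rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
position = np.sin(2.0 * np.pi * 2.0 * t)     # joint angle over one second, rad

spectrum = np.abs(np.fft.rfft(position))     # amplitude spectrum of the motion
freqs = np.fft.rfftfreq(len(position), d=1.0 / fs)
dominant = freqs[np.argmax(spectrum)]        # strongest frequency component
```

The spectrum compresses the trajectory into a few dominant components, and it is statistics over such spectral descriptions, rather than per-trajectory differential equations, that the proposed modeling builds on.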