882 results for automatic music analysis


Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present an algorithm for cluster analysis that integrates aspects of cluster ensembles and multi-objective clustering. The algorithm is based on a Pareto-based multi-objective genetic algorithm with a special crossover operator, which uses clustering validation measures as objective functions. The proposed algorithm can deal with data sets presenting different types of clusters, without requiring expertise in cluster analysis. Its result is a concise set of partitions representing alternative trade-offs among the objective functions. We compare the results obtained with our algorithm, in the context of gene expression data sets, to those achieved with Multi-Objective Clustering with automatic K-determination (MOCK), the algorithm most closely related to ours. (C) 2009 Elsevier B.V. All rights reserved.
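As an illustration of the Pareto-based selection the abstract describes, the following is a minimal sketch, not the authors' implementation; the candidate partitions and objective values are hypothetical, and both validation measures are assumed to be minimized:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly better in one.
    Objectives are minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scored):
    """Keep only the non-dominated (partition, objectives) pairs."""
    return [(p, s) for p, s in scored
            if not any(dominates(t, s) for _, t in scored)]

# Hypothetical candidate partitions scored by two validation measures
# (e.g. compactness and connectivity, both minimized).
candidates = [("P1", (0.2, 0.9)), ("P2", (0.4, 0.4)),
              ("P3", (0.9, 0.1)), ("P4", (0.5, 0.6))]
front = pareto_front(candidates)  # P4 is dominated by P2 and drops out
```

The returned front is exactly the "concise set of partitions representing alternative trade-offs" mentioned above.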

Relevance:

30.00%

Publisher:

Abstract:

Complex networks have been increasingly used in text analysis, including in connection with natural language processing tools, as important text features appear to be captured by the topology and dynamics of the networks. Following previous works that apply complex network concepts to text quality measurement, summary evaluation, and author characterization, we now focus on machine translation (MT). In this paper we assess the possible representation of texts as complex networks to evaluate cross-linguistic issues inherent in manual and machine translation. We show that translations of different quality generated by MT tools can be distinguished from their manual counterparts by means of metrics such as in-degree (ID), out-degree (OD), clustering coefficient (CC), and shortest paths (SP). For instance, we demonstrate that the average OD in networks of automatic translations consistently exceeds the values obtained for manual ones, and that the CC values of source texts are not preserved for manual translations, but are for good automatic translations. This probably reflects the text rearrangements humans perform during manual translation. We envisage that such findings could lead to better MT tools and automatic evaluation metrics.
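The degree metrics mentioned above can be computed on a simple word-adjacency network. This stdlib-only sketch is illustrative only; the example sentence and the edge definition (one directed edge per adjacent word pair) are assumptions, not the paper's exact representation:

```python
from collections import defaultdict

def word_adjacency_network(text):
    """Directed network with an edge w1 -> w2 for each pair of adjacent words."""
    words = text.lower().split()
    out_edges = defaultdict(set)
    for w1, w2 in zip(words, words[1:]):
        out_edges[w1].add(w2)
    return out_edges

def average_out_degree(net):
    """Total distinct out-edges divided by the number of nodes."""
    nodes = set(net) | {w for ws in net.values() for w in ws}
    return sum(len(ws) for ws in net.values()) / len(nodes)

net = word_adjacency_network("the cat sat on the mat and the dog sat too")
avg_od = average_out_degree(net)  # 10 edges over 8 distinct words
```

A higher average OD here means words connect to more distinct successors, the kind of difference the paper reports between automatic and manual translations.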

Relevance:

30.00%

Publisher:

Abstract:

An entropy-based image segmentation approach is introduced and applied to color images obtained from Google Earth. Segmentation refers to the process of partitioning a digital image in order to locate different objects and regions of interest. The application to satellite images paves the way to automated monitoring of ecological catastrophes, urban growth, agricultural activity, maritime pollution, climate change and general surveillance. Regions representing aquatic, rural and urban areas are identified and the accuracy of the proposed segmentation methodology is evaluated. The comparison with gray-level images revealed that the color information is fundamental for obtaining an accurate segmentation. (C) 2010 Elsevier B.V. All rights reserved.
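A minimal sketch of the entropy measure underlying such segmentation, assuming per-patch Shannon entropy over a coarse intensity histogram; the patch data and bin count are hypothetical, and this is not the paper's exact formulation:

```python
import math
from collections import Counter

def patch_entropy(pixels, bins=8):
    """Shannon entropy (bits) of a patch's intensity histogram, intensities 0-255.
    Uniform patches give 0; patches spread across all bins approach log2(bins)."""
    hist = Counter(min(p * bins // 256, bins - 1) for p in pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

flat = [100] * 64                   # homogeneous region (e.g. water): zero entropy
textured = list(range(0, 256, 4))   # spread intensities (e.g. urban): high entropy
```

Thresholding such per-patch entropies is one simple way homogeneous (aquatic/rural) and heterogeneous (urban) regions can be separated.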

Relevance:

30.00%

Publisher:

Abstract:

This work describes a novel methodology for automatic contour extraction from 2D images of 3D neurons (e.g. camera lucida images and other types of 2D microscopy). Most contour-based shape analysis methods cannot be used to characterize such cells because of overlaps between neuronal processes. The proposed framework is specifically aimed at the problem of contour following even in the presence of multiple overlaps. First, the input image is preprocessed in order to obtain an 8-connected skeleton with one-pixel-wide branches, as well as a set of critical regions (i.e., bifurcations and crossings). Next, for each subtree, the tracking stage iteratively labels all valid pixels of branches, up to a critical region, where it determines the suitable direction to proceed. Finally, the labeled skeleton segments are followed in order to yield the parametric contour of the neuronal shape under analysis. The reported system was successfully tested on several images, and the results for a set of three neuron images are presented here, each pertaining to a different class (i.e. alpha, delta and epsilon ganglion cells) and containing a total of 34 crossings. The algorithm successfully resolved all of these overlaps. The method has also been found to be robust even for images with close parallel segments, and it may be implemented in an efficient manner. The introduction of this approach should pave the way for more systematic application of contour-based shape analysis methods in neuronal morphology. (C) 2008 Elsevier B.V. All rights reserved.
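The branch-tracking stage can be sketched as follows. This is a simplified illustration: the skeleton coordinates are hypothetical, and the paper's key contribution, choosing the direction to proceed at crossings inside critical regions, is omitted here (the walk simply stops there):

```python
def neighbors8(p):
    """8-connected neighbourhood of a (row, col) pixel."""
    y, x = p
    return [(y + dy, x + dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0)]

def follow_branch(skel, tip):
    """Walk a one-pixel-wide skeleton from a tip pixel, labelling pixels in
    order, until reaching a dead end or a critical region (>1 way forward)."""
    path, prev, cur = [tip], None, tip
    while True:
        nbrs = [q for q in neighbors8(cur) if q in skel and q != prev]
        if len(nbrs) != 1:   # dead end or bifurcation/crossing: stop
            break
        prev, cur = cur, nbrs[0]
        path.append(cur)
    return path

# Hypothetical Y-shaped skeleton: a trunk ending in a bifurcation at (0, 3).
skel = {(0, 0), (0, 1), (0, 2), (0, 3), (1, 4), (-1, 4)}
branch = follow_branch(skel, (0, 0))
```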

Relevance:

30.00%

Publisher:

Abstract:

This paper describes an automatic device for in situ and continuous monitoring of the ageing process occurring in natural and synthetic resins widely used in art and in the conservation and restoration of cultural artefacts. The results of tests carried out under accelerated ageing conditions are also presented. This easy-to-assemble palm-top device essentially consists of oscillators based on quartz crystal resonators coated with films of the organic materials whose response to environmental stress is to be addressed. The device contains a microcontroller which selects the oscillators at pre-defined time intervals and records and stores their oscillation frequency. The ageing of the coatings, caused by the environmental stress and resulting in a shift in the oscillation frequency of the modified crystals, can be straightforwardly monitored in this way. The kinetics of this process reflects the level of damage risk associated with a specific microenvironment. In this case, natural and artificial resins broadly employed in art and in the restoration of artistic and archaeological artefacts (dammar and Paraloid B72) were applied onto the crystals. The environmental stress was represented by visible and UV radiation, since the chosen materials are known to be photochemically active, to different extents. In the case of dammar, the results obtained are consistent with previous data obtained using bench-top equipment by impedance analysis through discrete measurements, and confirm that the ageing of this material is reflected in the gravimetric response of the modified quartz crystals. As for Paraloid B72, the outcome of the assays indicates that the resin is resistant to visible light, but very sensitive to UV irradiation. The use of a continuous monitoring system, apart from being obviously more practical, is essential to identify short-term (i.e. reversible) events, like water vapour adsorption/desorption processes, and to highlight ageing trends or sudden changes in such trends. (C) 2007 Elsevier B.V. All rights reserved.
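The gravimetric reading of a quartz-crystal frequency shift is conventionally done with the Sauerbrey relation. The sketch below applies that standard textbook relation; it is not stated in the abstract, and the 5 MHz fundamental frequency used in the example is an assumption:

```python
import math

# Standard Sauerbrey relation for a quartz crystal microbalance:
#   delta_f = -C_f * delta_m / A,  with  C_f = 2 * f0^2 / sqrt(rho_q * mu_q)
RHO_Q = 2.648      # quartz density, g/cm^3
MU_Q = 2.947e11    # quartz shear modulus, g/(cm*s^2)

def mass_per_area_from_shift(delta_f_hz, f0_hz):
    """Areal mass change (g/cm^2) implied by a frequency shift (Hz)
    for a crystal of fundamental frequency f0_hz."""
    c_f = 2 * f0_hz ** 2 / math.sqrt(RHO_Q * MU_Q)
    return -delta_f_hz / c_f

# Example: a -56.6 Hz shift on a hypothetical 5 MHz crystal corresponds
# to roughly 1 microgram/cm^2 of deposited (or transformed) mass.
dm = mass_per_area_from_shift(-56.6, 5e6)
```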

Relevance:

30.00%

Publisher:

Abstract:

Over the last decade, the problem of surface inspection has received great attention from the scientific community, since quality control and the maintenance of products are key points in several industrial applications. Railway associations spend considerable money checking the railway infrastructure, a particular field in which periodical surface inspection can help the operator prevent critical situations; the maintenance and monitoring of this infrastructure is an important concern for railway associations. Surface inspection of railways is therefore also important to railroad authorities for investigating track components, identifying problems, and finding out how to solve them. In the railway industry, problems are usually found in sleepers, overhead, fasteners, rail head, switches and crossings, and in the ballast section. In this thesis work, I have reviewed research papers on AI techniques combined with non-destructive testing (NDT) techniques, which are able to collect data from the test object without causing any damage. The works reviewed demonstrate that, by adopting AI-based systems, nearly all of these problems can be addressed, and that such systems are reliable and efficient for diagnosing problems in this transportation domain. I have also reviewed solutions and products offered by different companies based on AI techniques, together with white papers provided by some of those companies. AI-based techniques such as machine vision, stereo vision, laser-based techniques and neural networks are used in most cases to solve problems otherwise handled by railway engineers. These techniques operate within the NDT approach, a very broad, interdisciplinary field that plays a critical role in assuring that structural components and systems perform their function in a reliable and cost-effective fashion.
The NDT approach ensures the uniformity, quality and serviceability of materials without damaging the material being tested. Its methods include visual and optical testing, radiography, magnetic particle testing, ultrasonic testing, penetrant testing, electromechanical testing and acoustic emission testing. Inspection is performed periodically for the sake of better maintenance, and is carried out by railway engineers with the aid of AI-based techniques. The main idea of this thesis is to demonstrate how problems in this transportation area can be reduced, based on the work done by different researchers and companies; I also provide comments on those works and proposals for better inspection methods where needed. The scope of this thesis is the automatic interpretation of data from NDT, with the goal of detecting flaws accurately and efficiently. AI techniques such as neural networks, machine vision, knowledge-based systems and fuzzy logic have been applied to a wide spectrum of problems in this area. A further aim is to provide insight into possible research methods concerning railway sleeper, fastener, ballast and overhead inspection by automatic interpretation of data. In this thesis work, I discuss problems that arise in railway sleepers, fasteners, overhead and ballasted track; I review research papers related to these areas and demonstrate how their systems work and what results they achieve, highlighting the advantages of AI techniques over the manual systems that existed previously. This work thus aims to summarize the findings of a large number of research papers deploying artificial intelligence (AI) techniques for the automatic interpretation of data from non-destructive testing (NDT).
Problems in the rail transport domain are the main focus of this work, which overall addresses the inspection of railway sleepers, fasteners, ballast and overhead.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson’s disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD have utilized a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions (‘speed’, ‘accuracy’, ‘fatigue’ and ‘arrhythmia’) and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using a 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well to visually assessed scores and were significantly different across Unified Parkinson’s Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, had good ability to discriminate between healthy elderly and patients in different disease stages, had good sensitivity to treatment interventions and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful to objectively assess the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.
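The dimension-reduction step described above can be sketched with a plain SVD-based PCA (NumPy only). The data here are random placeholders for the 24 tapping parameters, and this is not the study's actual pipeline, which additionally trains a logistic regression classifier against the visually assessed GTS scores:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Standardize the parameter matrix and project it onto its
    leading principal components (columns ordered by explained variance)."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 24))     # 20 subjects x 24 hypothetical tapping parameters
Z = pca_scores(X, n_components=4)  # reduced scores, one column per dimension
```

In the paper's setting, the four reduced columns play the role of the 'speed', 'accuracy', 'fatigue' and 'arrhythmia' scores fed to the classifier.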

Relevance:

30.00%

Publisher:

Abstract:

Libraries seek active ways to innovate amidst macroeconomic shifts, growing online education (which helps alleviate ever-growing schedule conflicts as students juggle jobs and course schedules), changing business models in publishing, and evolving information technologies. Patron-driven acquisition (PDA), also known as demand-driven acquisition (DDA), offers numerous strengths in supporting university curricula in the context of these significant shifts. PDA is a business model centered on short-term loans and subsequent purchases of ebooks resulting directly from patrons' natural use, stemming from their discovery of the ebooks in library catalogs where the ebooks' bibliographic records are loaded at regular intervals established between the library and the ebook supplier. Winthrop University's PDA plan went live in October 2011, and this article chronicles the philosophical and operational considerations, the in-library collaboration, and the technical preparations in concert with the library system vendor and ebook supplier. A short-term loan is invoked after a threshold is crossed, typically a number of pages viewed or time spent in the ebook. After a certain number of short-term loans negotiated between the library and the ebook supplier, the next short-term loan becomes an automatic purchase, after which the library owns the ebook in perpetuity. Purchasing options include single-user and multi-user licenses. Owing to high levels of need in college and university environments, Winthrop chose the multi-user license as the preferred default purchase; only where multi-user licenses are unavailable does the automatic purchase occur with a single-user license. Data on initial use between October 2011 and February 2013 reveal that of all PDA ebooks viewed, only 30% crossed the threshold into short-term loans. Of all triggered short-term loans, Psychology was the highest-using discipline; of all ebook views too brief to trigger short-term loans, Business was the highest-using area.
Although the data are still too young to draw firm conclusions, thought-provoking usage differences between academic disciplines have begun to emerge. These differences should be considered in library plans for the best possible curricular support for each academic program. As higher education struggles with costs and course-delivery methods, libraries have an enduring lead role.
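The loan-to-purchase trigger logic described above can be sketched as a simple rule; the page threshold and loan count used here are hypothetical illustrations, not Winthrop's negotiated terms:

```python
def pda_event(pages_viewed, prior_loans, page_threshold=10, loans_before_purchase=3):
    """Classify one patron session under a hypothetical PDA agreement:
    browsing below the page threshold is free; crossing it triggers a
    short-term loan; once the negotiated loan count is reached, the next
    trigger becomes an automatic purchase."""
    if pages_viewed < page_threshold:
        return "free browse"
    if prior_loans < loans_before_purchase:
        return "short-term loan"
    return "purchase"

# A brief view costs nothing; repeated substantial use converts to ownership.
events = [pda_event(3, 0), pda_event(15, 1), pda_event(15, 3)]
```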

Relevance:

30.00%

Publisher:

Abstract:

The recent advances in CMOS technology have allowed for the fabrication of transistors with submicronic dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. Such an increase in design complexity has created a need for more efficient verification tools that incorporate more appropriate physical and computational models. Timing verification aims to determine whether the timing constraints imposed on the design can be satisfied. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it has the drawback of being stimuli-dependent. Hence, in order to ensure that the critical situation is taken into account, one must exercise all possible input patterns, which is obviously infeasible given the high complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, this method may result in overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology, but also the temporal and functional relations between circuit elements.
Functional timing analysis tools may differ in three aspects: the set of sensitization conditions necessary to declare a path sensitizable (the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques, and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been exhaustively studied in the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates, which is the basic concern of this thesis. In addition, as a necessary step to set the scene, a detailed and systematic study of functional timing analysis is also presented.
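Topological (worst-case) analysis as described above reduces to a longest-path computation on the DAG. A minimal sketch with a hypothetical netlist and delays follows; note that this estimate ignores whether a path can actually propagate a transition, which is exactly the pessimism functional analysis removes:

```python
from functools import lru_cache

def critical_delay(gates, delays):
    """Topological timing analysis: longest path delay in the combinational
    DAG. `gates` maps each node to its fan-in nodes; `delays` maps each
    node to its own gate delay (0 for primary inputs)."""
    @lru_cache(maxsize=None)
    def arrival(node):
        fan_in = gates.get(node, [])
        return delays[node] + (max(map(arrival, fan_in)) if fan_in else 0)
    return max(arrival(n) for n in delays)

# Hypothetical netlist: a, b are inputs; g1 = f(a, b); g2 = f(b); out = f(g1, g2)
gates = {"g1": ["a", "b"], "g2": ["b"], "out": ["g1", "g2"]}
delays = {"a": 0, "b": 0, "g1": 2, "g2": 1, "out": 3}
critical = critical_delay(gates, delays)  # longest path b -> g1 -> out
```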

Relevance:

30.00%

Publisher:

Abstract:

The evolution of integrated circuit technologies demands the development of new CAD tools. The traditional development of digital circuits at the physical level is based on libraries of cells. These cell libraries offer a certain predictability of the electrical behavior of the design, due to the previous characterization of the cells. Moreover, different versions of each cell are required so that delay and power consumption characteristics are taken into account, increasing the number of cells in a library. Automatic full-custom layout generation is an increasingly important alternative to cell-based approaches. This strategy implements transistors and connections according to patterns defined by algorithms, so it is possible to implement any logic function, avoiding the limitations of a cell library. Analysis and estimation tools must provide predictability for automatic full-custom layouts; such tools must be able to work with layout estimates and to generate information related to delay, power consumption and area occupation. This work includes research into new methods of physical synthesis and the implementation of an automatic layout generator in which the cells are created at the moment of layout synthesis. The research investigates different strategies for the placement of elements (transistors, contacts and connections) in a layout and their effects on area occupation and circuit delay. The presented layout strategy applies delay optimization through integration with a gate sizing technique, performed in such a way that the folding method allows individual discrete sizing of transistors. The main characteristics of the proposed strategy are: power supply lines between rows, routing over the layout (channel routing is not used), circuit routing performed before layout generation, and layout generation targeting delay reduction by application of the sizing technique.
The possibility of implementing any logic function, without the restrictions imposed by a cell library, allows circuit synthesis with an optimized transistor count. This reduction in the number of transistors decreases delay and power consumption, mainly the static power consumption in submicrometer circuits. Comparisons between the proposed strategy and other well-known methods are presented, validating the proposed method.
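Transistor folding, which the strategy above uses to enable individual discrete sizing, can be sketched as splitting a sized transistor into equal parallel legs; the width values here are hypothetical illustrations:

```python
import math

def fold_transistor(width_um, max_leg_um):
    """Split a sized transistor of total width `width_um` into the fewest
    equal parallel legs ('fingers') that each fit within `max_leg_um`,
    e.g. the usable height of a layout row. Returns (legs, leg_width)."""
    legs = max(1, math.ceil(width_um / max_leg_um))
    return legs, width_um / legs

# A 10 um transistor in a row allowing 3 um legs folds into 4 legs of 2.5 um.
legs, leg_w = fold_transistor(10.0, 3.0)
```

Because the leg count is an integer, the achievable total widths are discrete, which is why folding pairs naturally with a discrete gate sizing step.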

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

In order to provide a low-cost thermal comfort system, a common household fan, 40 cm in diameter, had its manual four-button control replaced by an automatic speed control. The new control system has a temperature sensor feeding a microcontroller that, through an optically coupled DIAC/TRIAC-based circuit, varies the RMS value of the fan motor input voltage, and hence its speed, according to the room temperature. Over a wide range of velocities, the fan net power and the motor input power were measured under both control systems; the temperature of the motor stator and the voltage waveforms were also observed. Analysis of the measured values showed that the TRIAC-based control system makes the fan motor work at very low power factor and efficiency, the worst case occurring in the low-velocity range, where the highest stator temperatures were registered. The poor power factor and efficiency correlate with the harmonic content inserted into the motor input voltage waveform by the TRIAC commutation process.
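The RMS variation produced by TRIAC phase control can be sketched with the standard textbook relation for a full-wave phase-cut sine into a resistive load; the fan motor is inductive, so this is only an approximation, and the 220 V line value is an assumption:

```python
import math

def phase_cut_rms(v_rms_line, alpha):
    """RMS output voltage of a full-wave phase-controlled (TRIAC) sine wave
    for a resistive load, with firing angle `alpha` in radians:
    Vrms = Vline * sqrt(1 - alpha/pi + sin(2*alpha)/(2*pi))."""
    return v_rms_line * math.sqrt(1 - alpha / math.pi
                                  + math.sin(2 * alpha) / (2 * math.pi))

# alpha = 0 passes the full waveform; larger angles cut more of each half-cycle,
# lowering the RMS value and the fan speed, but distorting the waveform.
full = phase_cut_rms(220.0, 0.0)
half = phase_cut_rms(220.0, math.pi / 2)
```

Late firing removes a large slice of each half-cycle, and it is this chopped, harmonic-rich waveform that degrades the motor's power factor as reported above.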

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)