77 results for STATISTICAL STRENGTH


Relevance: 20.00%

Publisher:

Abstract:

In this paper we investigate how note onsets are distributed in Turkish Makam music compositions, and to what extent this distribution supports or contradicts the metrical structure of the pieces, the usul. We use MIDI data to derive the distributions in the form of onset histograms, and compare them with the metrical weights that describe the usul in theory. We compute correlation and syncopation values to estimate the degrees of support and contradiction, respectively. While the concept of syncopation is rarely mentioned in the context of this music, such a measure gives interesting insight into the structure of a piece. We show that metrical contradiction is systematically applied in some metrical structures, and we compare Western music and Turkish Makam music with regard to metrical support and contradiction. Such a study can help avoid pitfalls in later attempts to perform audio processing tasks such as beat tracking or rhythmic similarity measurement.
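As a hedged illustration of the idea of comparing an onset histogram with theoretical metrical weights, the sketch below computes a correlation-based "support" value and a simple syncopation value. The 9-beat weight template and the onset counts are invented for illustration, and the syncopation measure (onset mass on below-average-weight positions) is a generic stand-in, not the paper's exact definition.

```python
import numpy as np

# Hypothetical 9-beat usul template: metrical weights and onset counts
# below are illustrative, not real data from the paper.
metrical_weights = np.array([4, 1, 2, 1, 3, 1, 2, 1, 1], dtype=float)
onset_histogram  = np.array([120, 10, 40, 15, 90, 12, 35, 20, 25], dtype=float)

# Degree of metrical support: Pearson correlation between the onset
# histogram and the theoretical weights of the usul.
support = np.corrcoef(onset_histogram, metrical_weights)[0, 1]

# A simple syncopation measure: fraction of onset mass falling on
# positions whose weight is below the mean weight (weak positions).
weak = metrical_weights < metrical_weights.mean()
syncopation = onset_histogram[weak].sum() / onset_histogram.sum()

print(f"support = {support:.3f}, syncopation = {syncopation:.3f}")
```

With these made-up counts, onsets pile up on the heavy positions, so the support value comes out high and the syncopation value low.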

Relevance: 20.00%

Publisher:

Abstract:

This paper examines the statistical analysis of social reciprocity, that is, the balance between addressing and receiving behaviour in social interactions. Specifically, it focuses on measuring social reciprocity by means of directionality and skew-symmetry statistics at different levels. Two statistics have been used as overall measures of social reciprocity at the group level: the directional consistency and skew-symmetry statistics. Furthermore, the skew-symmetry statistic provides social researchers with complementary information at the dyadic and individual levels. Having computed these measures, however, social researchers may be interested in testing statistical hypotheses regarding social reciprocity. For this reason, a statistical procedure based on Monte Carlo sampling has been developed, allowing social researchers to describe groups and make statistical decisions.
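A minimal sketch of the group-level directional consistency statistic and a Monte Carlo test of reciprocity is given below. The sociomatrix is made up, and the null model (each dyad's total split 50/50 at random between directions) is one common choice, not necessarily the exact resampling scheme of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sociomatrix: x[i, j] = number of behaviours addressed
# from individual i to individual j (values are invented).
x = np.array([[0, 8, 2, 5],
              [3, 0, 7, 1],
              [2, 6, 0, 4],
              [5, 1, 3, 0]])

def directional_consistency(m):
    """DC = (H - L) / (H + L), where H (L) sums the larger (smaller)
    of the two counts in every dyad.  DC = 0 means perfect reciprocity,
    DC = 1 means completely unidirectional interactions."""
    upper, lower = np.triu(m, 1), np.tril(m, -1).T
    h = np.maximum(upper, lower).sum()
    l = np.minimum(upper, lower).sum()
    return (h - l) / (h + l)

dc_obs = directional_consistency(x)

# Monte Carlo test: under the null of reciprocity, each dyad's total
# is split 50/50 at random between the two directions.
n_sim = 5000
totals = np.triu(x + x.T, 1)
sims = np.empty(n_sim)
for k in range(n_sim):
    up = rng.binomial(totals, 0.5)
    m = np.triu(up, 1) + (totals - up).T
    sims[k] = directional_consistency(m)
p_value = np.mean(sims >= dc_obs)

print(f"DC = {dc_obs:.3f}, Monte Carlo p = {p_value:.3f}")
```

The p-value is the fraction of simulated groups at least as directional as the observed one; a small value would argue against reciprocity.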

Relevance: 20.00%

Publisher:

Abstract:

Excitation-continuous music instrument control patterns are often not explicitly represented in current sound synthesis techniques when applied to automatic performance. Both physical model-based and sample-based synthesis paradigms would benefit from a flexible and accurate instrument control model, enabling improved naturalness and realism. We present a framework for modeling bowing control parameters in violin performance. Nearly non-intrusive sensing techniques allow for accurate acquisition of relevant timbre-related bowing control parameter signals. We model the temporal contour of bow velocity, bow pressing force, and bow-bridge distance as sequences of short cubic Bézier curve segments. Considering different articulations, dynamics, and performance contexts, a number of note classes are defined. Contours of bowing parameters in a performance database are analyzed at note level, following a predefined grammar that dictates the characteristics of curve segment sequences for each of the classes under consideration. As a result, contour analysis of the bowing parameters of each note yields an optimal representation vector that suffices to reconstruct the original contours with significant fidelity. From the resulting representation vectors, we construct a statistical model based on Gaussian mixtures suitable for both the analysis and synthesis of bowing parameter contours. Using the estimated models, synthetic contours can be generated through a bow planning algorithm able to reproduce the constraints imposed by the finite length of the bow. Rendered contours are successfully used in two preliminary synthesis frameworks: digital waveguide-based bowed string physical modeling and sample-based spectral-domain synthesis.
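The core representation step, fitting a short contour with a cubic Bézier segment, can be sketched as a least-squares problem in the Bernstein basis. The bow-velocity contour below is synthetic (a half-sine "ramp up, ramp down" shape invented for the example), and a single segment with uniform parameterisation is used, whereas the paper fits sequences of segments under a grammar.

```python
import numpy as np

def bernstein_basis(t):
    """Cubic Bernstein basis matrix, one column per control point."""
    return np.column_stack([(1 - t) ** 3,
                            3 * (1 - t) ** 2 * t,
                            3 * (1 - t) * t ** 2,
                            t ** 3])

def fit_cubic_bezier(y):
    """Least-squares fit of one cubic Bezier segment to a sampled
    contour y, using a uniform parameterisation t in [0, 1].
    Returns the four scalar control points P0..P3."""
    t = np.linspace(0.0, 1.0, len(y))
    ctrl, *_ = np.linalg.lstsq(bernstein_basis(t), y, rcond=None)
    return ctrl

def eval_cubic_bezier(ctrl, n):
    t = np.linspace(0.0, 1.0, n)
    return bernstein_basis(t) @ ctrl

# Synthetic bow-velocity contour for one note (made-up shape),
# just to exercise the fit.
n = 100
velocity = 0.4 * np.sin(np.linspace(0, np.pi, n))
ctrl = fit_cubic_bezier(velocity)
reconstruction = eval_cubic_bezier(ctrl, n)
rmse = np.sqrt(np.mean((reconstruction - velocity) ** 2))
print(f"control points = {ctrl.round(3)}, RMSE = {rmse:.4f}")
```

Four control points per segment is what makes the representation compact: a 100-sample contour collapses to a 4-dimensional vector while reconstructing with small error.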

Relevance: 20.00%

Publisher:

Abstract:

EEG recordings are usually corrupted by spurious extra-cerebral artifacts, which should be rejected or cleaned up by the practitioner. Since manual screening of human EEGs is inherently error-prone and might induce experimental bias, automatic artifact detection is an issue of importance, and it is the best guarantee of objective and clean results. We present a new approach, based on the time-frequency shape of muscular artifacts, to achieve reliable and automatic scoring. The methodology evaluates the impact of muscular activity on the signal while placing the emphasis on the analysis of EEG activity. The method is used to discriminate evoked potentials from several types of recorded muscular artifacts, with a sensitivity of 98.8% and a specificity of 92.2%. Automatic cleaning of EEG data is then successfully achieved using this method combined with independent component analysis. The outcome of the automatic cleaning is then compared with the Slepian multitaper spectrum-based technique introduced by Delorme et al (2007 Neuroimage 34 1443–9).
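The intuition behind spectral discrimination of muscle artifacts can be sketched very simply: EMG activity has a much flatter, more high-frequency spectrum than cortical EEG. The detector below, which thresholds the fraction of power above an assumed 30 Hz split, is a crude stand-in for the paper's time-frequency method, and both test signals are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 256  # assumed sampling rate in Hz

def high_freq_ratio(signal, fs, split_hz=30.0):
    """Fraction of spectral power above split_hz.  Muscular (EMG)
    activity is broadband, so a high ratio flags a likely artifact.
    Simplified stand-in for a real time-frequency analysis."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return spectrum[freqs >= split_hz].sum() / spectrum.sum()

t = np.arange(2 * fs) / fs
# Synthetic "EEG": dominated by a 10 Hz alpha rhythm plus mild noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(len(t))
# Synthetic "EMG artifact": broadband white noise (flat spectrum).
emg = rng.standard_normal(len(t))

r_eeg, r_emg = high_freq_ratio(eeg, fs), high_freq_ratio(emg, fs)
is_artifact = r_emg > 0.5
print(f"EEG ratio = {r_eeg:.2f}, EMG ratio = {r_emg:.2f}, artifact: {is_artifact}")
```

In a cleaning pipeline along the lines of the paper, such a score would be computed per independent component, and components flagged as muscular would be removed before reconstructing the EEG.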

Relevance: 20.00%

Publisher:

Abstract:

We present a framework for modeling right-hand gestures in bowed-string instrument playing, applied to the violin. Nearly non-intrusive sensing techniques allow for accurate acquisition of relevant timbre-related bowing gesture parameter cues. We model the temporal contour of bow transversal velocity, bow pressing force, and bow-bridge distance as sequences of short segments, in particular cubic Bézier curve segments. Considering different articulations, dynamics, and contexts, a number of note classes is defined. Gesture parameter contours of a performance database are analyzed at note level, following a predefined grammar that dictates the characteristics of curve segment sequences for each of the classes under consideration. Based on dynamic programming, gesture parameter contour analysis provides an optimal curve parameter vector for each note. The information present in this parameter vector is enough to reconstruct the original gesture parameter contours with significant fidelity. From the resulting representation vectors, we construct a statistical model based on Gaussian mixtures, suitable for both analysis and synthesis of bowing gesture parameter contours. We show the potential of the model by synthesizing bowing gesture parameter contours from an annotated input score. Finally, we point out promising applications and developments.

Relevance: 20.00%

Publisher:

Abstract:

Further knowledge of the processes conditioning nitrogen use efficiency (NUE) is of great relevance to crop productivity. The aim of this paper was to characterise C and N partitioning during grain filling and their implications for NUE. Cereals such as bread wheat (Triticum aestivum L. cv Califa sur), triticale (× Triticosecale Wittmack cv. Imperioso) and tritordeum (× Tritordeum Asch. & Graebn. line HT 621) were grown under low (LN, 5 mM NH4NO3) and high (HN, 15 mM NH4NO3) N conditions. We conducted simultaneous double labelling (12CO2 and 15NH415NO3) in order to characterise C and N partitioning during grain filling. Although triticale plants showed the largest total and ear dry matter values under HN conditions, the large investment in shoot and root biomass negatively affected ear NUE. Tritordeum was the only genotype that increased NUE under both N treatments (NUEtotal), whereas in wheat no significant effect was detected. N labelling revealed that N fertilisation during post-anthesis was more relevant for grain filling in wheat and tritordeum than in triticale. The study also revealed that the investments of C and N in flag leaves and shoots, together with the "waste" of photoassimilates in respiration, conditioned the NUE of the plants, especially under LN. These results suggest that C and N use by these plants needs to be improved in order to increase ear C and N sinks, especially under LN. It is also remarkable that even though tritordeum showed the largest increase in NUE, the low yield of this cereal limits its agronomic value.

Relevance: 20.00%

Publisher:

Abstract:

Statistical properties of binary complex networks are well understood, and recently many attempts have been made to extend this knowledge to weighted networks. There are, however, subtle yet important considerations to be made regarding the nature of the weights used in this generalization. Weights can be either continuous or discrete magnitudes and, in the latter case, they can additionally be indistinguishable or distinguishable. This fact has not been addressed in the literature so far and has deep implications for the network statistics. In this work we address the problem by introducing multiedge networks: graphs in which multiple (distinguishable) connections between nodes are considered. We develop a statistical mechanics framework in which information about the most relevant observables can be obtained under a large spectrum of linear and nonlinear constraints, including those depending both on the number of multiedges per link and on their binary projection. The latter case is particularly interesting, as we show that binary projections can be understood from multiedge processes. The implications of these results are important, as many real agent-based problems mapped onto graphs require this treatment for a proper characterization of their collective behavior.
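One concrete way to see how a binary projection follows from a multiedge process: if, in a maximum-entropy ensemble constraining only the expected number of multiedges per pair, edge multiplicities are independent Poisson variables with mean λ, then a pair is connected in the binary projection with probability 1 − e^(−λ). The sketch below checks this numerically; the uniform λ and the ensemble choice are simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simplified multiedge ensemble: every node pair independently carries
# a Poisson-distributed number of distinguishable edges (uniform
# lambda chosen here arbitrarily).
n_nodes, lam, n_samples = 50, 0.8, 2000

pairs = n_nodes * (n_nodes - 1) // 2
occupied = 0  # pairs with at least one edge, summed over all samples
for _ in range(n_samples):
    w = rng.poisson(lam, size=pairs)   # edge multiplicity per pair
    occupied += np.count_nonzero(w)

# Empirical binary-projection density vs. the analytic prediction
# p = 1 - exp(-lambda) implied by the Poisson multiplicities.
p_empirical = occupied / (n_samples * pairs)
p_theory = 1 - np.exp(-lam)
print(f"empirical = {p_empirical:.4f}, theory = {p_theory:.4f}")
```

The agreement illustrates the abstract's point that the statistics of the binary graph are not independent of the underlying multiedge process that generated it.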

Relevance: 20.00%

Publisher:

Abstract:

We analyze the failure process of a two-component system with widely different fracture strengths in the framework of a fiber bundle model with localized load sharing. A fraction 0≤α≤1 of the bundle is strong and is represented by unbreakable fibers, while the fibers of the weak component have randomly distributed failure strengths. Computer simulations revealed that there exists a critical composition αc which separates two qualitatively different behaviors: below the critical point, the failure of the bundle is brittle, characterized by abrupt damage growth within the breakable part of the system. Above αc, however, the macroscopic response becomes ductile, providing stability during the entire breaking process. The transition occurs at an astonishingly low fraction of strong fibers, which can be important for applications. We show that in the ductile phase the size distribution of breaking bursts has a power-law functional form with an exponent μ=2, followed by an exponential cutoff. In the brittle phase the power law also prevails, but with a higher exponent μ=9/2. The transition between the two phases shows analogies to continuous phase transitions. Analyzing the microstructure of the damage, we found that at the beginning of the fracture process cracks nucleate randomly, while later on the growth and coalescence of cracks dominate, which gives rise to power-law distributed crack sizes.
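A minimal simulation of the model class can be sketched as follows: a periodic 1D chain of fibers, a fraction α of which are unbreakable, loaded quasi-statically, with a failing fiber's load passed to its nearest intact neighbours. The redistribution rule, chain size and α value are simplifying assumptions for illustration (and the sketch assumes at least one unbreakable fiber survives, which holds for α > 0 and moderate n with near certainty).

```python
import numpy as np

rng = np.random.default_rng(3)

def lls_bundle(n=200, alpha=0.1, rng=rng):
    """Minimal 1D fiber bundle with localized load sharing: a broken
    fiber's load is split between its nearest intact neighbours on a
    periodic chain.  A fraction alpha of fibers is unbreakable
    (infinite threshold); the rest have uniform(0, 1) thresholds.
    Returns the sizes of the breaking bursts (avalanches)."""
    thresholds = rng.uniform(0, 1, n)
    thresholds[rng.random(n) < alpha] = np.inf   # strong component
    load = np.zeros(n)
    intact = np.ones(n, dtype=bool)
    bursts = []
    while np.isfinite(thresholds[intact]).any():
        # Raise the load until the weakest intact fiber fails.
        headroom = np.where(intact, thresholds - load, np.inf)
        load[intact] += headroom.min()
        burst = 0
        while True:  # cascade of secondary failures
            failing = np.flatnonzero(intact & (load >= thresholds - 1e-12))
            if failing.size == 0:
                break
            for i in failing:
                intact[i] = False
                burst += 1
                left, right = (i - 1) % n, (i + 1) % n
                while not intact[left]:
                    left = (left - 1) % n
                while not intact[right]:
                    right = (right + 1) % n
                load[left] += load[i] / 2
                load[right] += load[i] / 2
                load[i] = 0.0
        bursts.append(burst)
    return np.array(bursts)

bursts = lls_bundle()
print(f"{len(bursts)} bursts, largest = {bursts.max()}")
```

Collecting the burst sizes over many realizations and histogramming them is how the power-law exponents quoted in the abstract would be estimated.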

Relevance: 20.00%

Publisher:

Abstract:

We describe a series of experiments in which we start with English-to-French and English-to-Japanese versions of an Open Source rule-based speech translation system for a medical domain, and bootstrap corresponding statistical systems. Comparative evaluation reveals that the rule-based systems are still significantly better than the statistical ones, despite the fact that considerable effort was invested in tuning both the recognition and translation components; moreover, a hybrid system only marginally improved recall at the cost of a loss in precision. The results suggest that rule-based architectures may still be preferable to statistical ones for safety-critical speech translation tasks.

Relevance: 20.00%

Publisher:

Abstract:

A general criterion for the design of adaptive systems in digital communications, called the statistical reference criterion, is proposed. The criterion is based on imposing the probability density function of the signal of interest at the output of the adaptive system; its application to scenarios with highly powerful interferers is the main focus of this paper. Knowledge of the pdf of the wanted signal is used as a discriminator between signals, so that interferers with differing distributions are rejected by the algorithm. Its performance is studied over a range of scenarios. Equations for gradient-based coefficient updates are derived, and the relationship with other existing criteria, such as minimum variance and the Wiener criterion, is examined.
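One simplified reading of "imposing the pdf of the wanted signal at the output" is to adapt the filter weights by stochastic gradient ascent on the target log-density of the output. The sketch below does this for a BPSK-like target (a two-Gaussian mixture at ±1); the channel, step size, and the specific cost are all assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)

n, taps, sigma2 = 20000, 4, 0.05

def dlogp(y, s2=sigma2):
    """d/dy log p(y) for p = 0.5 N(-1, s2) + 0.5 N(+1, s2).
    y is clipped for numerical safety far from the target modes."""
    y = np.clip(y, -3.0, 3.0)
    a = np.exp(-(y - 1) ** 2 / (2 * s2))
    b = np.exp(-(y + 1) ** 2 / (2 * s2))
    return (-(y - 1) * a - (y + 1) * b) / (s2 * (a + b))

symbols = rng.choice([-1.0, 1.0], n)              # wanted BPSK signal
channel = np.array([1.0, 0.4, -0.2, 0.1])          # assumed channel taps
x = np.convolve(symbols, channel)[:n] + 0.05 * rng.standard_normal(n)

w = np.zeros(taps); w[0] = 0.5                    # filter initialisation
mu = 1e-3
for k in range(taps, n):
    u = x[k - taps:k][::-1]                        # regressor vector
    y = w @ u
    w += mu * dlogp(y) * u                         # pdf-imposing update

y_out = np.array([w @ x[k - taps:k][::-1] for k in range(taps, n)])
closeness = np.mean(np.abs(np.abs(y_out[-2000:]) - 1))
print(f"mean | |y| - 1 | over last 2000 samples = {closeness:.3f}")
```

Because the update only uses the target distribution, not the transmitted symbols, it is blind in the same spirit as the criterion described: outputs are pulled toward the wanted constellation regardless of the sign ambiguity.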

Relevance: 20.00%

Publisher:

Abstract:

A maximum entropy statistical treatment of an inverse problem concerning frame theory is presented. The problem arises from the fact that a frame is an overcomplete set of vectors that defines a mapping with no unique inverse. Although any vector in the concomitant space can be expressed as a linear combination of frame elements, the coefficients of the expansion are not unique. Frame theory guarantees the existence of a set of coefficients which is “optimal” in a minimum norm sense. We show here that these coefficients are also “optimal” from a maximum entropy viewpoint.
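The non-uniqueness of frame coefficients and the minimum-norm ("optimal") solution can be demonstrated concretely with the pseudoinverse. The frame below is the standard Mercedes-Benz frame in R², chosen purely as a small example; any other coefficient vector solving the expansion differs by a null-space element and has a strictly larger norm.

```python
import numpy as np

# Toy overcomplete frame in R^2: three unit vectors at 120 degrees
# (the Mercedes-Benz frame), stacked as columns of F.
F = np.array([[0.0, 1.0],
              [-np.sqrt(3) / 2, -0.5],
              [np.sqrt(3) / 2, -0.5]]).T          # shape (2, 3)

x = np.array([1.0, 2.0])

# The expansion x = F c has infinitely many solutions c; the
# pseudoinverse (canonical dual frame) gives the minimum-norm one.
c_min = np.linalg.pinv(F) @ x

# Any other solution differs by an element of the null space of F,
# so its norm is strictly larger.
null = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)     # F @ null = 0 here
c_other = c_min + 0.7 * null

print(np.allclose(F @ c_min, x), np.allclose(F @ c_other, x))
print(np.linalg.norm(c_min) < np.linalg.norm(c_other))
```

The abstract's contribution is to show that this same minimum-norm choice is also the maximum-entropy one; the code only exhibits the minimum-norm side of that equivalence.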

Relevance: 20.00%

Publisher:

Abstract:

Objectives: To evaluate the shear bond strength and failure site of brackets bonded to dry and wet enamel. Study design: 50 teeth were divided into ten groups of 5 teeth each (10 surfaces). In half the groups the enamel was kept dry before bonding; in the other half distilled water was applied to wet the surface after etching. The following groups were established: 1) Acid/Transbond-XT (dry/wet); 2) Transbond Plus Self Etching Primer (TSEP)/Transbond-XT paste (dry/wet); 3) Concise (dry), Transbond MIP/Concise (wet); 4) FujiOrtho-LC (dry/wet); 5) SmartBond (dry/wet). Brackets were bonded to both buccal and lingual surfaces. Specimens were stored in distilled water (24 hours at 37°C) and thermocycled. Brackets were debonded using a universal testing machine (cross-head speed 1 mm/min). Failure sites were classified using a stereomicroscope. Results: No significant differences in bond strength were detected between the adhesives under wet and dry conditions, except for SmartBond, whose bond strength was significantly lower under dry conditions. For all the adhesives most bond failures were of mixed site location, except for SmartBond, which failed at the adhesive-bracket interface. Conclusions: Under wet conditions the bonding capacity of the adhesives tested was similar to that under dry conditions, with the exception of SmartBond, which improved under wet conditions.

Relevance: 20.00%

Publisher:

Abstract:

Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash-flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
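The SOM training loop itself is compact enough to sketch: each grid unit holds a prototype vector, and the best-matching unit and its grid neighbours are pulled toward every input sample with a decaying learning rate and neighbourhood radius. The "EEM" data below are synthetic clusters in a low-dimensional feature space, standing in for vectorised fluorescence matrices; grid size and schedules are arbitrary choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

def train_som(data, grid=(6, 6), epochs=20, lr0=0.5, sigma0=3.0):
    """Minimal Self-Organising Map with a Gaussian neighbourhood
    kernel and linearly decaying learning rate and radius."""
    h, w = grid
    weights = rng.random((h * w, data.shape[1]))
    # 2-D grid coordinates of every unit, for the neighbourhood kernel.
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    n_steps, step = epochs * len(data), 0
    for epoch in range(epochs):
        for xvec in rng.permutation(data):
            lr = lr0 * (1 - step / n_steps)
            sigma = max(sigma0 * (1 - step / n_steps), 0.5)
            bmu = np.argmin(((weights - xvec) ** 2).sum(axis=1))
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            theta = np.exp(-d2 / (2 * sigma ** 2))
            weights += lr * theta[:, None] * (xvec - weights)
            step += 1
    return weights

# Synthetic stand-ins for vectorised EEMs: three fluorescence "types"
# as noisy clusters in a 10-dimensional feature space (made-up data).
centers = rng.random((3, 10))
data = np.vstack([c + 0.05 * rng.standard_normal((40, 10)) for c in centers])

som = train_som(data)
# Quantisation error: mean distance from each sample to its BMU.
qe = np.mean([np.min(np.linalg.norm(som - v, axis=1)) for v in data])
print(f"quantisation error = {qe:.3f}")
```

After training, the component planes (one plane per input dimension across the grid) are what the paper's correlation analysis would operate on to group co-varying fluorescence regions into components.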

Relevance: 20.00%

Publisher:

Abstract:

The goal of this work is to create a statistical model, based only on easily computable parameters of a CSP instance, that predicts the runtime behaviour of the solving algorithms and lets us choose the best algorithm to solve the problem. Although it seems that the obvious choice should be MAC, the experimental results obtained so far show that, with large numbers of variables, other algorithms perform much better, especially for hard problems in the transition phase.
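Such an algorithm selector can be sketched as one regression model per algorithm, mapping instance features to predicted (log-)runtime, with the selector picking the argmin. Everything below is hypothetical: the feature set (variables, constraint density, tightness), the candidate algorithm names, and the synthetic "runtimes" used to fit the models stand in for real solver measurements.

```python
import numpy as np

rng = np.random.default_rng(6)

def features(n_vars, density, tightness):
    """Cheap CSP instance features (assumed set, with an intercept)."""
    return np.array([1.0, n_vars, density, tightness, density * tightness])

algorithms = ["MAC", "FC", "BT"]
n_train = 300
X = np.array([features(rng.integers(10, 100), rng.random(), rng.random())
              for _ in range(n_train)])

# Synthetic log-runtimes with different sensitivities per algorithm,
# standing in for measured solver behaviour.
true_w = {"MAC": np.array([0.5, 0.02, 2.0, 1.0, 3.0]),
          "FC":  np.array([0.2, 0.03, 1.0, 2.0, 4.0]),
          "BT":  np.array([0.1, 0.05, 0.5, 3.0, 6.0])}

models = {}
for name in algorithms:
    y = X @ true_w[name] + 0.1 * rng.standard_normal(n_train)
    models[name], *_ = np.linalg.lstsq(X, y, rcond=None)

def pick_algorithm(n_vars, density, tightness):
    """Choose the algorithm whose model predicts the lowest runtime."""
    f = features(n_vars, density, tightness)
    preds = {name: f @ w for name, w in models.items()}
    return min(preds, key=preds.get)

choice = pick_algorithm(80, 0.6, 0.7)
print(f"predicted best algorithm: {choice}")
```

The point of the design is that the feature vector must be computable before solving, so the cost of the prediction stays negligible compared to the solving time it is meant to save.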