846 results for robust speaker verification
Abstract:
CiaoPP is the abstract interpretation-based preprocessor of the Ciao multi-paradigm (Constraint) Logic Programming system. It uses modular, incremental abstract interpretation as a fundamental tool to obtain information about programs. In CiaoPP, the semantic approximations thus produced have been applied to perform high- and low-level optimizations during program compilation, including transformations such as multiple abstract specialization, parallelization, partial evaluation, resource usage control, and program verification. More recently, such semantic approximations have found novel and promising applications in the more general context of program development, such as program verification. In this work, we describe our extension of the system to incorporate Abstraction-Carrying Code (ACC), a novel approach to mobile code safety. ACC follows the standard strategy of associating safety certificates with programs, originally proposed in Proof-Carrying Code. A distinguishing feature of ACC is that we use an abstraction (or abstract model) of the program, computed by standard static analyzers, as a certificate. The validity of the abstraction on the consumer side is checked in a single pass by a very efficient and specialized abstract interpreter. We have implemented and benchmarked ACC within CiaoPP. The experimental results show that the checking phase is indeed faster than the proof generation phase, and that the sizes of certificates are reasonable. Moreover, the preprocessor provides compile-time (and run-time) tools for the certification of CLP programs with resource consumption assurances.
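To make the single-pass checking idea concrete, here is a minimal Python sketch under invented assumptions (a toy sign domain and a fixed two-assignment program; this is not CiaoPP's abstract domain or checker): the producer ships an abstraction, and the consumer merely verifies that it is a post-fixpoint of the abstract transfer function, one pass instead of an iterative fixpoint computation.

```python
# Toy abstract domain: signs {'-', '0', '+', 'top'} with 'top' above all.
ORDER = {'-': {'-', 'top'}, '0': {'0', 'top'}, '+': {'+', 'top'}, 'top': {'top'}}

def leq(a, b):
    """Abstract partial order: a is at most b."""
    return b in ORDER[a]

def transfer(cert):
    """One application of the abstract semantics F for the toy program
    'x := 1; y := x + x' (the sign of x + x equals the sign of x)."""
    out = dict(cert)
    out['x'] = '+'                    # x := 1
    out['y'] = cert.get('x', 'top')   # y := x + x
    return out

def check_certificate(cert):
    """Consumer side: accept iff F(cert) <= cert -- a single pass."""
    return all(leq(v, cert[k]) for k, v in transfer(cert).items())

print(check_certificate({'x': '+', 'y': '+'}))  # True: certificate accepted
print(check_certificate({'x': '+', 'y': '0'}))  # False: certificate rejected
```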
Abstract:
The technique of Abstract Interpretation [13] has allowed the development of sophisticated program analyses which are provably correct and practical. The semantic approximations produced by such analyses have traditionally been applied to optimization during program compilation. However, novel and promising applications of semantic approximations have recently been proposed in the more general context of program verification and debugging [3], [10], [7].
Abstract:
In an increasing number of applications (e.g., in embedded, real-time, or mobile systems) it is important or even essential to ensure conformance with respect to a specification expressing resource usages, such as execution time, memory, energy, or user-defined resources. In previous work we have presented a novel framework for data size-aware, static resource usage verification. Specifications can include both lower- and upper-bound resource usage functions. In order to statically check such specifications, both upper- and lower-bound resource usage functions (on input data sizes) that approximate the actual resource usage of the program are automatically inferred and compared against the specification. The outcome of the static checking of assertions can express intervals for the input data sizes such that a given specification can be proved for some intervals but disproved for others. After an overview of the approach, in this paper we provide a number of novel contributions: we present a full formalization, and we report on and provide results from an implementation within the Ciao/CiaoPP framework (which provides a general, unified platform for static and run-time verification, as well as unit testing). We also generalize the checking of assertions to allow preconditions expressing intervals within which the input data size of a program is supposed to lie (i.e., intervals for which each assertion is applicable), and we extend the class of resource usage functions that can be checked.
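The interval-valued outcome described above can be illustrated with a small Python sketch; the bound functions here are invented (CiaoPP infers them automatically from the program), but the three-way comparison logic is the one the abstract describes.

```python
# Invented bound functions, for illustration only.
def inferred_upper(n):   # analyzer's upper bound on actual usage
    return 2 * n * n + 5

def inferred_lower(n):   # analyzer's lower bound on actual usage
    return n * n

def spec_upper(n):       # specification: the resource budget
    return 100

def check(n):
    """Outcome of the static check for input data size n."""
    if inferred_upper(n) <= spec_upper(n):
        return 'proved'       # even the worst case meets the spec
    if inferred_lower(n) > spec_upper(n):
        return 'disproved'    # even the best case violates the spec
    return 'unknown'          # the inferred bounds straddle the spec

for n in [5, 6, 7, 10, 11]:
    print(n, check(n))
# proved for n <= 6, unknown for 7 <= n <= 10, disproved for n >= 11:
# the checker's answer is naturally interval-valued over input sizes.
```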
Abstract:
In air transportation, airline profitability is influenced by the airline's ability to build flight schedules. In order to generate operational schedules, airlines engage in a complex decision-making process referred to as airline schedule planning. Up to now, the stages of flight schedule generation have been separated and optimized sequentially. Schedule design has traditionally been decomposed into two sequential steps: frequency planning and timetable development. The purpose of the next step of schedule development, fleet assignment, is to assign available aircraft types to flight legs such that the seating capacity of an assigned aircraft closely matches flight demand and costs are minimized. Our work integrates these planning phases into one single model in order to produce more economical solutions and create fewer incompatibilities between the decisions. We propose an integrated robust approach for the schedule development step. We design the timetable ensuring that enough time is available to perform passengers' flight connections, making the system robust by avoiding misconnected passengers. An application of the model to a simplified IBERIA network is shown.
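As a heavily hedged toy illustration of the fleet-assignment subproblem named above (routes, fleets, and costs are all invented; real instances are solved as large mixed-integer programs over time-space networks, not by enumeration):

```python
# Assign an aircraft type to each flight leg so that seat capacity covers
# demand, respecting fleet sizes, at minimum cost (brute-force toy).
from itertools import product

legs = {'MAD-BCN': 160, 'MAD-LHR': 120, 'MAD-CDG': 90}   # demand per leg
fleet = {'A320': (180, 2), 'CRJ9': (100, 1)}             # (seats, aircraft owned)
cost = {('A320', l): 10_000 for l in legs} | {('CRJ9', l): 6_000 for l in legs}

best = None
for assign in product(fleet, repeat=len(legs)):
    plan = dict(zip(legs, assign))
    # seating capacity must cover demand on every leg
    if any(fleet[t][0] < legs[l] for l, t in plan.items()):
        continue
    # cannot use more aircraft of a type than the fleet owns
    if any(list(plan.values()).count(t) > n for t, (_, n) in fleet.items()):
        continue
    total = sum(cost[(t, l)] for l, t in plan.items())
    if best is None or total < best[0]:
        best = (total, plan)

print(best)   # (26000, ...): A320 on the two busy legs, CRJ9 on MAD-CDG
```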
Abstract:
This paper focuses on the railway rolling stock circulation problem in rapid transit networks, in which frequencies are high and distances are relatively short. Although the distances are not very large, service times are high due to the large number of intermediate stops required to allow proper passenger flow. The main complicating issue is the fact that the available capacity at depot stations is very low, and both capacity and rolling stock are shared between different train lines. This forces the introduction of empty train movements and rotation maneuvers to ensure sufficient station capacity and rolling stock availability. However, these shunting operations may sometimes be difficult to perform and can easily malfunction, causing localized incidents that could propagate throughout the entire network due to cascading effects. These operations are therefore penalized, with the goal of selectively avoiding them and ameliorating their high malfunction probabilities. Critical trains, defined as train services that pass through stations with a large number of passengers arriving at the platform during rush hours, are also introduced. We illustrate our model using computational experiments drawn from RENFE (the main Spanish operator of suburban passenger trains) in Madrid, Spain. The results of the model, achieved in approximately 1 min, have been received positively by RENFE planners.
Abstract:
We have designed and implemented a framework that unifies unit testing and run-time verification (as well as static verification and static debugging). A key contribution of our approach is that a unified assertion language is used for all of these tasks. We first propose methods for compiling run-time checks for (parts of) assertions which cannot be verified at compile time, via program transformation. This transformation allows checking preconditions and postconditions, including conditional postconditions, properties at arbitrary program points, and certain computational properties. The implemented transformation includes several optimizations to reduce run-time overhead. We also propose a minimal addition to the assertion language which allows defining unit tests to be run in order to detect possible violations of the (partial) specifications expressed by the assertions. This language can express, for example, the input data for performing the unit tests or the number of times that the unit tests should be repeated. We have implemented the framework within the Ciao/CiaoPP system and effectively applied it to the verification of ISO-Prolog compliance and to the detection of different types of bugs in the Ciao system source code. Several experimental results are presented that illustrate different trade-offs among program size, running time, and levels of verbosity of the messages shown to the user.
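Ciao's assertion language is Prolog-based; as a language-neutral, hedged illustration of what compiling (parts of) assertions into run-time checks amounts to, here is a Python analogue. The decorator, its name, and its semantics are invented for illustration and are not the Ciao transformation.

```python
import functools

def assertion(pre=None, post=None):
    """Wrap a function so that unverified (parts of) its specification are
    checked at run time; checks proved statically could be compiled away."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition of {fn.__name__} violated"
            result = fn(*args, **kwargs)
            if post is not None:
                # conditional postcondition: may inspect inputs and output
                assert post(result, *args, **kwargs), f"postcondition of {fn.__name__} violated"
            return result
        return wrapper
    return deco

@assertion(pre=lambda xs: all(isinstance(x, int) for x in xs),
           post=lambda ys, xs: sorted(xs) == ys)
def insertion_sort(xs):
    out = []
    for x in xs:
        out.insert(len([y for y in out if y <= x]), x)
    return out

print(insertion_sort([3, 1, 2]))   # [1, 2, 3]; both checks pass
```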
Abstract:
Fractal and multifractal concepts have grown increasingly popular in recent years in soil analysis, along with the development of fractal models. One of the common steps is to calculate the slope of a linear fit, usually by the least squares method. This shouldn't pose a special problem; however, in many situations with experimental data the researcher has to select the range of scales at which to work, neglecting the remaining points, in order to achieve the linearity that this type of analysis requires. Robust regression is a form of regression analysis designed to circumvent some limitations of traditional parametric and non-parametric methods. With this method we do not have to assume that an outlier is simply an extreme observation drawn from the tail of a normal distribution that does not compromise the validity of the regression results. In this work we have evaluated the capacity of robust regression to select the points of the experimental data to be used, trying to avoid subjective choices. Based on this analysis we have developed a new work methodology that involves two basic steps:
- Evaluation of the improvement of the linear fit when consecutive points are eliminated, based on the p-value of R. In this way we consider the implications of reducing the number of points.
- Evaluation of the significance of the difference between the slope fitted with the two extreme points and the slope fitted with the available points.
We compare the results of applying this methodology with those of the commonly used least squares method. The data selected for these comparisons come from experimental soil roughness transects and from data simulated with the midpoint displacement method, adding trends and noise. The results are discussed, indicating the advantages and disadvantages of each methodology.
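A hedged numpy-only sketch of the contrast between ordinary least squares and a robust (Huber-type, iteratively reweighted) slope estimate on log-log scaling data; the synthetic series below stands in for a roughness transect and is not the paper's data or its exact methodology.

```python
import numpy as np

rng = np.random.default_rng(0)
log_scale = np.linspace(0, 3, 20)
log_measure = 0.8 * log_scale + 0.1 * rng.normal(size=20)
log_measure[-3:] += 1.5          # points outside the scaling range (outliers)

def ols_slope(x, y):
    A = np.column_stack([x, np.ones_like(x)])
    return np.linalg.lstsq(A, y, rcond=None)[0][0]

def huber_slope(x, y, k=1.345, iters=50):
    """Iteratively reweighted least squares with Huber weights."""
    A = np.column_stack([x, np.ones_like(x)])
    w = np.ones_like(y)
    for _ in range(iters):
        beta = np.linalg.lstsq(A * w[:, None], y * w, rcond=None)[0]
        r = y - A @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        w = np.sqrt(np.minimum(1.0, k / (np.abs(r) / s + 1e-12)))
    return beta[0]

print('OLS slope  :', ols_slope(log_scale, log_measure))    # pulled up by outliers
print('Huber slope:', huber_slope(log_scale, log_measure))  # close to the true 0.8
```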
Abstract:
The understanding of embryogenesis in living systems requires reliable quantitative analysis of cell migration throughout all the stages of development. This is a major challenge for "in-toto" reconstruction based on different modalities of "in-vivo" imaging techniques, given their spatio-temporal resolution and their image artifacts and noise. Several methods for cell tracking are available, but expensive manual interaction, in time and human resources, is always required to enforce coherence. Because of this limitation it is necessary to restrict the experiments or to accept an uncontrolled error rate. Is it possible to obtain automated, reliable measurements of migration? Can we provide a seed for biologists to complete cell lineages efficiently? We propose a filtering technique that considers trajectories as spatio-temporally connected structures and prunes out those that might introduce noise and false positives, using multi-dimensional morphological operators.
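A hedged sketch of the filtering idea using scipy (the data, size threshold, and connectivity choice are invented; the paper's morphological operators may differ): treat detections in a (t, y, x) volume as one spatio-temporal image, label its connected structures, and prune those with too little temporal support to be plausible trajectories.

```python
import numpy as np
from scipy import ndimage

mask = np.zeros((10, 32, 32), dtype=bool)   # detections over (t, y, x)
for t in range(10):
    mask[t, 5 + t, 5 + t] = True            # a persistent trajectory
mask[3, 20, 20] = True                      # a one-frame false positive

# Label spatio-temporally connected structures (full 26-connectivity).
labels, n = ndimage.label(mask, structure=np.ones((3, 3, 3)))
sizes = ndimage.sum_labels(mask, labels, index=range(1, n + 1))

min_support = 3                             # tunable minimum temporal support
keep = [i + 1 for i, s in enumerate(sizes) if s >= min_support]
filtered = mask & np.isin(labels, keep)

print(n, 'structures;', len(keep), 'kept;', int(filtered.sum()), 'detections remain')
# -> 2 structures; 1 kept; 10 detections remain
```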
Abstract:
Current text-to-speech systems are developed using studio-recorded speech in a neutral style or based on acted emotions. However, the proliferation of media sharing sites would allow the development of a new generation of speech-based systems which could cope with spontaneous and styled speech. This paper proposes an architecture to deal with realistic recordings and carries out some experiments on unsupervised speaker diarization. In order to maximize the speaker purity of the clusters while keeping a high speaker coverage, the paper evaluates the F-measure of a diarization module, achieving high scores (>85%), especially when the clusters are longer than 30 seconds, even for the more spontaneous and expressive styles (such as talk shows or sports).
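The F-measure referred to above is, in its usual formulation, the harmonic mean of cluster purity and coverage; a minimal sketch with invented scores (the paper's exact purity and coverage definitions may differ):

```python
def f_measure(purity, coverage):
    """Harmonic mean of speaker purity and speaker coverage."""
    return 2 * purity * coverage / (purity + coverage)

print(f_measure(0.92, 0.88))   # ~0.90, i.e. a score above the 85% mark
```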
Abstract:
Several methods to improve multiple distant microphone (MDM) speaker diarization based on Time Delay of Arrival (TDOA) features are evaluated in this paper. All of them avoid the use of a single reference channel to calculate the TDOA values and, based on different criteria, select among all possible pairs of microphones a set of pairs that will be used to estimate the TDOAs. The evaluated methods have been named "Dynamic Margin" (DM), "Extreme Regions" (ER), "Most Common" (MC), "Cross Correlation" (XCorr) and "Principal Component Analysis" (PCA). It is shown that all methods improve the baseline results on the development set and that four of them also improve the results on the evaluation set. Improvements of 3.49% and 10.77% relative DER are obtained for DM and ER, respectively, on the test set. The XCorr and PCA methods achieve improvements of 36.72% and 30.82% relative DER on the test set. Moreover, the computational cost of the XCorr method is 20% lower than that of the baseline.
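A hedged numpy sketch of the TDOA estimation that underlies all the evaluated pair-selection methods (the methods differ in which microphone pairs are used, not in this step; real systems typically use GCC-PHAT rather than the plain cross-correlation shown here):

```python
import numpy as np

def tdoa(sig_a, sig_b):
    """Delay (in samples) of sig_b relative to sig_a via cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode='full')
    return int(np.argmax(corr)) - (len(sig_a) - 1)

rng = np.random.default_rng(1)
source = rng.normal(size=1000)
mic1 = source
mic2 = np.roll(source, 7)       # same signal arriving 7 samples later

print(tdoa(mic1, mic2))         # 7: the estimated time delay of arrival
```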
Abstract:
In this work, the robustness and stability of continuum damage models applied to material failure in soft tissues are addressed. In implicit damage models equipped with softening, the presence of negative eigenvalues in the tangent elemental matrix degrades the condition number of the global matrix, reducing the computational performance of the numerical model. Two strategies have been adapted from the literature to mitigate this performance degradation: the IMPL-EX integration scheme [Oliver, 2006], which renders the elemental matrix contribution positive definite, and arc-length-type continuation methods [Carrera, 1994], which make it possible to capture the unstable softening branch in brittle ruptures. A major drawback of the IMPL-EX integration scheme is the need to use small time steps to keep the numerical error below an acceptable value. A convergence study, limiting the maximum allowed increment of the internal variables of the damage model, is presented. Finally, numerical simulation of failure problems with fibre-reinforced materials illustrates the performance of the adopted methodology.
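As a heavily hedged one-dimensional sketch of the IMPL-EX idea (the linear extrapolation rule follows the cited scheme, but the damage history, time steps, and constitutive model here are invented): the internal damage variable used in the new step is an explicit extrapolation of implicitly computed past values, so the tangent stays positive; the price is an extrapolation error that shrinks with the step size.

```python
def implex_extrapolate(d_prev, d_curr, dt_prev, dt_curr):
    """Explicitly extrapolated internal variable for the new step."""
    return d_curr + (dt_curr / dt_prev) * (d_curr - d_prev)

# Implicitly computed damage history (monotone, invented values):
d_hist = [0.00, 0.10, 0.18]
d_tilde = implex_extrapolate(d_hist[-2], d_hist[-1], dt_prev=1.0, dt_curr=1.0)
print(d_tilde)   # ~0.26: damage is frozen at this value during the new step

# With d_tilde fixed, the 1-D stress update sigma = (1 - d_tilde) * E * eps
# is linear in the strain, so the tangent (1 - d_tilde) * E is constant and
# positive as long as d_tilde < 1 -- the positive-definiteness property.
```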
Abstract:
A novel algorithm based on bimatrix game theory has been developed to improve the accuracy and reliability of a speaker diarization system. This algorithm fuses the output data of two open-source speaker diarization programs, LIUM and SHoUT, taking advantage of the best properties of each one. The performance of this new system has been tested by means of audio streams from several movies. In preliminary results on fragments of five movies, improvements of 63% in false alarms and missed speech errors have been achieved with respect to the LIUM and SHoUT systems working alone. Moreover, we also improve the number of recognized speakers by 20%, getting close to the real number of speakers in the audio stream.
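As a heavily hedged sketch of the bimatrix-game machinery only (the payoffs below are invented and are not the paper's fusion rule): two "players" each decide whether to trust LIUM's or SHoUT's label for a segment, and pure-strategy Nash equilibria identify self-consistent fusion decisions.

```python
A = [[3, 1],    # payoff to player 1 (rows: trust LIUM / trust SHoUT)
     [2, 4]]
B = [[3, 2],    # payoff to player 2 (columns: trust LIUM / trust SHoUT)
     [1, 4]]

def pure_nash(A, B):
    """Brute-force search for pure-strategy Nash equilibria of a 2x2 game."""
    eq = []
    for i in range(2):
        for j in range(2):
            best_row = all(A[i][j] >= A[k][j] for k in range(2))
            best_col = all(B[i][j] >= B[i][k] for k in range(2))
            if best_row and best_col:
                eq.append((i, j))
    return eq

print(pure_nash(A, B))   # [(0, 0), (1, 1)]: two self-consistent fusions
```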
Abstract:
The paper presents research conducted in the Flow work package of the EU-funded UPWIND project, which focuses on improving models for flow within and downwind of large wind farms in complex terrain and offshore. The main activity is modelling the behaviour of wind turbine wakes in order to improve power output predictions.
Abstract:
FBGs are excellent strain sensors because of their small size and multiplexing capability. Tens to hundreds of sensors may be embedded into a structure, as has already been demonstrated. Nevertheless, they only afford strain measurements at local points, so unless the damage affects the strain readings in a distinguishable manner, it will go undetected. This paper shows the experimental results obtained on the wing of a UAV, instrumented with 32 FBGs, before and after small damages were introduced. Principal Component Analysis (PCA) is a multivariable analysis technique for reducing a complex data set to a lower dimension and revealing hidden patterns that underlie it. The PCA algorithm was able to distinguish the damage cases, even for small cracks.
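A hedged sketch of PCA-based damage detection of this kind (the data below are synthetic, not the UAV wing measurements, and the Q-statistic threshold is one common choice rather than necessarily the paper's): fit the principal subspace on healthy-state readings from 32 "sensors" and flag new readings whose residual outside that subspace is anomalously large.

```python
import numpy as np

rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 3))                  # a few underlying load patterns
mixing = rng.normal(size=(3, 32))
healthy = latent @ mixing + 0.05 * rng.normal(size=(200, 32))   # 32 "FBG" channels

mean = healthy.mean(axis=0)
_, _, Vt = np.linalg.svd(healthy - mean, full_matrices=False)
P = Vt[:3]                                          # retained principal directions

def q_statistic(x):
    """Squared residual of a reading outside the principal subspace."""
    r = (x - mean) - (x - mean) @ P.T @ P
    return float(r @ r)

threshold = np.quantile([q_statistic(x) for x in healthy], 0.99)
damaged = healthy[0].copy()
damaged[10] += 1.0                                  # local strain anomaly near a crack

print(q_statistic(healthy[1]) < threshold)          # expected True: healthy passes
print(q_statistic(damaged) > threshold)             # expected True: damage flagged
```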