980 results for Random test
Abstract:
Speech enhancement in stationary noise is addressed using the ideal channel selection framework. To estimate the binary mask, we propose to classify each time-frequency (T-F) bin of the noisy signal as speech or noise using Discriminative Random Fields (DRFs). The DRF objective contains two terms: an enhancement function and a smoothing term. On each T-F bin, we use an enhancement function based on a likelihood ratio test for speech presence, while an Ising model serves as the smoothing function enforcing spectro-temporal continuity in the estimated binary mask. Over successive iterations, the smoothing function is found to reduce musical noise compared with using the enhancement function alone. The binary mask is inferred from the noisy signal using the Iterated Conditional Modes (ICM) algorithm. Sentences from the NOIZEUS corpus are evaluated from 0 dB to 15 dB Signal-to-Noise Ratio (SNR) in four additive noise settings: white Gaussian noise, car noise, street noise, and pink noise. The reconstructed speech is evaluated in terms of average segmental SNR, Perceptual Evaluation of Speech Quality (PESQ), and Mean Opinion Score (MOS).
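The enhancement-plus-smoothing inference described in this abstract can be illustrated with a minimal ICM sketch in Python; the 4-connected neighbourhood, the coupling strength `beta`, and the thresholded initialisation are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def icm_binary_mask(unary_llr, beta=1.0, n_iters=5):
    """Estimate a binary T-F mask by Iterated Conditional Modes.

    unary_llr : 2-D array of per-bin log-likelihood ratios for
                speech presence (positive favours speech).
    beta      : assumed Ising coupling rewarding agreement with
                the 4-connected spectro-temporal neighbours.
    """
    mask = (unary_llr > 0).astype(int)   # initial hard decision per bin
    rows, cols = mask.shape
    for _ in range(n_iters):
        for i in range(rows):
            for j in range(cols):
                # sum of neighbour labels mapped to {-1, +1}
                s = 0
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols:
                        s += 2 * mask[ni, nj] - 1
                # local score: data term plus Ising smoothing term
                score = unary_llr[i, j] + beta * s
                mask[i, j] = 1 if score > 0 else 0
    return mask
```

With a sufficiently strong coupling, isolated bins that disagree with their spectro-temporal neighbours are flipped, which is the mechanism the abstract credits with reducing musical noise.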
Abstract:
LY12-cz aluminium alloy sheet specimens with a central hole were tested under constant-amplitude loading, Rayleigh narrow-band random loading, and a typical fighter broad-band random loading. Fatigue life was estimated by means of the nominal stress approach and Miner's rule. Stress cycles were counted using the rainflow, range, and peak-value counting methods, respectively. The estimated lives were compared with the test results, and the effects of random loading sequence and small load cycles on fatigue life were also studied.
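The linear damage summation of Miner's rule can be shown in a few lines; the Basquin-type S-N curve and its constants `C` and `m` below are hypothetical placeholders, since the abstract gives no material data:

```python
def miner_damage(stress_ranges, counts, C=1e12, m=3.0):
    """Linear damage sum D = sum(n_i / N_i) under Miner's rule.

    N_i is taken from an assumed Basquin-type S-N curve
    N = C / S^m; C and m are illustrative constants, not
    values from the study.
    """
    damage = 0.0
    for s, n in zip(stress_ranges, counts):
        n_allow = C / s ** m        # cycles to failure at stress range s
        damage += n / n_allow       # damage fraction consumed by this bin
    return damage                   # failure predicted when D >= 1
```

The counted cycles from rainflow, range, or peak-value counting would feed in as the `(stress_ranges, counts)` histogram, which is how the counting method ends up influencing the predicted life.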
Abstract:
In this paper, reanalysis fields from the ECMWF have been statistically downscaled to predict surface moisture flux and daily precipitation from large-scale atmospheric fields at two observatories (Zaragoza and Tortosa, Ebro Valley, Spain) over the 1961-2001 period. Three types of downscaling model have been built: (i) analogues, (ii) analogues followed by random forests and (iii) analogues followed by multiple linear regression. The inputs consist of predictor fields taken from the ERA-40 reanalysis; the predictands are precipitation and surface moisture flux as measured at the two observatories. To reduce the dimensionality of the problem, the ERA-40 fields have been decomposed using empirical orthogonal functions. The available daily data have been divided into two parts: a training period (1961-1996), used to find a group of about 300 analogues on which to build the downscaling model, and a test period (1997-2001), in which model performance has been assessed on independent data. For surface moisture flux, the models based on analogues followed by random forests do not clearly outperform those built on analogues plus multiple linear regression, while simple averages calculated from the nearest analogues found in the training period yielded only slightly worse results. For precipitation, the three types of model performed equally. These results suggest that most of the models' downscaling capability can be attributed to the analogue-calculation stage.
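The analogue stage common to all three models can be sketched as a nearest-neighbour search in truncated EOF (principal-component) space; the Euclidean metric and the variable names are assumptions for illustration, not the paper's exact procedure:

```python
import numpy as np

def analogue_predict(train_pcs, train_obs, target_pc, k=300):
    """Predict a local variable as the mean observation over the k
    nearest analogue days in truncated EOF/principal-component space.

    train_pcs : (n_days, n_pcs) PC scores of training-period fields
    train_obs : (n_days,) observed local variable on those days
    target_pc : (n_pcs,) PC scores of the day to be downscaled
    """
    d = np.linalg.norm(train_pcs - target_pc, axis=1)  # distance to each training day
    nearest = np.argsort(d)[:k]                        # indices of the k closest analogues
    return train_obs[nearest].mean()
```

The random forest or multiple linear regression variants would then be trained on the selected analogue days instead of taking a plain average, which is the comparison the abstract evaluates.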
Abstract:
Plant community ecologists use the null model approach to infer assembly processes from observed patterns of species co-occurrence. In about a third of published studies, the null hypothesis of random assembly cannot be rejected. When this occurs, plant ecologists interpret the observed random pattern as not environmentally constrained, but probably generated by stochastic processes. The null model approach (using the C-score and the discrepancy index) was used to test for random assembly under two simulation algorithms. Logistic regression, distance-based redundancy analysis, and constrained ordination were used to test for environmental determinism (species segregation along environmental gradients or turnover, and species aggregation). This article introduces an environmentally determined community of alpine hydrophytes that presents itself as randomly assembled. We suggest that the random pattern arises in this community as follows: two simultaneous environmental processes, one leading to species aggregation and the other to species segregation, concurrently generate the observed pattern, which turns out to be neither aggregated nor segregated, but random. A simulation study supports this suggestion. Although apparently simple, the null model approach seems to assume that a single ecological factor prevails, or that if several factors decisively influence the community, they all exert their influence in the same direction, generating either aggregation or segregation. As these assumptions are unlikely to hold in most cases, and assembly processes cannot be inferred from random patterns, we propose that plant ecologists specifically investigate the ecological processes responsible for observed random patterns, instead of trying to infer processes from patterns.
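The C-score used to test for random assembly is the mean number of checkerboard units over all species pairs; a minimal sketch for a presence-absence matrix (rows = species, columns = sites), following the standard Stone and Roberts definition:

```python
import numpy as np
from itertools import combinations

def c_score(pa):
    """Stone & Roberts' C-score for a presence-absence matrix.

    For each species pair (i, j), the number of checkerboard units
    is (r_i - S)(r_j - S), where r_i, r_j are the species' site
    totals and S is the number of sites they share.
    """
    units = []
    for i, j in combinations(range(pa.shape[0]), 2):
        shared = int(np.sum(pa[i] & pa[j]))       # sites occupied by both
        ri, rj = int(pa[i].sum()), int(pa[j].sum())
        units.append((ri - shared) * (rj - shared))
    return sum(units) / len(units)
```

In the null model approach, this observed score is compared against the distribution of scores from many randomised matrices; a score within the null distribution is what the abstract calls an apparently random assembly.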
Abstract:
Six specimens of Trachypithecus francoisi and three of T. leucocephalus were analysed using allozyme electrophoresis and random amplified polymorphic DNA (RAPD) in order to clarify the disputed taxonomic status of the white-headed langur. Among the 44 loci surveyed, only one locus (PGM-2) was found to be polymorphic; Nei's genetic distance was 0.0025. In total, thirty 10-mer arbitrary primers were used for RAPD analysis, of which 22 generated clear bands. Phylogenetic trees were constructed from the genetic distances using the neighbor-joining and UPGMA methods. The results show that T. francoisi and T. leucocephalus are not monophyletic. T. francoisi from Guangxi, China and from Vietnam could not be clearly distinguished, and they did not divide into two clusters. A t-test was performed to compare genetic distances within and between the T. leucocephalus and T. francoisi groups; the within-group distances did not differ significantly from the between-group distances at the 5% level. Our results suggest that the level of genetic differentiation between T. leucocephalus and T. francoisi is relatively low, and recent gene flow might exist between them. Combining morphological features, geographical distribution, allozyme data, RAPD data, and mtDNA sequences, we suggest that the white-headed langur might be a subspecies of T. francoisi.
Abstract:
Motor task variation has been shown to be a key ingredient in skill transfer, retention, and structural learning. However, many studies only compare training of randomly varying tasks to either blocked or null training, and it is not clear how experiencing different nonrandom temporal orderings of tasks might affect the learning process. Here we study learning in human subjects who experience the same set of visuomotor rotations, evenly spaced between -60° and +60°, either in a random order or in an order in which the rotation angle changed gradually. We compared subsequent learning of three test blocks of +30°→-30°→+30° rotations. The groups that underwent either random or gradual training showed significant (P < 0.01) facilitation of learning in the test blocks compared with a control group who had not experienced any visuomotor rotations before. We also found that movement initiation times in the random group during the test blocks were significantly (P < 0.05) lower than for the gradual or the control group. When we fit a state-space model with fast and slow learning processes to our data, we found that the differences in performance in the test block were consistent with the gradual or random task variation changing the learning and retention rates of only the fast learning process. Such adaptation of learning rates may be a key feature of ongoing meta-learning processes. Our results therefore suggest that both gradual and random task variation can induce meta-learning and that random learning has an advantage in terms of shorter initiation times, suggesting less reliance on cognitive processes.
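The fast/slow state-space model fitted in this study has a standard two-rate form; a minimal sketch with illustrative retention (A) and learning (B) rates, not the values estimated from the data:

```python
def two_state_adaptation(perturbation, Af=0.59, Bf=0.2, As=0.992, Bs=0.03):
    """Simulate a two-rate model of motor adaptation.

    Each trial's error drives two states: a fast process that
    learns quickly but forgets quickly (Bf > Bs, Af < As) and a
    slow process that learns slowly but retains well. Parameter
    values are illustrative placeholders.
    """
    xf = xs = 0.0
    out = []
    for p in perturbation:
        e = p - (xf + xs)         # movement error on this trial
        xf = Af * xf + Bf * e     # fast state: rapid learning, rapid decay
        xs = As * xs + Bs * e     # slow state: gradual learning, strong retention
        out.append(xf + xs)
    return out
```

In the study's account, gradual or random training altered the learning and retention rates of only the fast process, so fitting separate (Af, Bf) per group while sharing (As, Bs) would capture the reported test-block differences.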
Abstract:
Deformations of sandy soils around geotechnical structures generally involve strains in the small (0·01%) to medium (0·5%) range. In this strain range the soil exhibits non-linear stress-strain behaviour, which should be incorporated in any deformation analysis. In order to capture the possible variability in the non-linear behaviour of various sands, a database was constructed comprising the secant shear modulus degradation curves of 454 tests from the literature. A modified hyperbolic relationship was fitted, yielding a unique S-shaped curve of shear modulus degradation. The three curve-fitting parameters are: an elastic threshold strain γe, up to which the elastic shear modulus is effectively constant at G0; a reference strain γr, defined as the shear strain at which the secant modulus has reduced to 0·5G0; and a curvature parameter a, which controls the rate of modulus reduction. The two characteristic strains γe and γr were found to vary with sand type (i.e. uniformity coefficient), soil state (i.e. void ratio, relative density) and mean effective stress. The new empirical expression for shear modulus reduction G/G0 is shown to make predictions that are accurate within a factor of 1·13 for one standard deviation of random error, as determined from 3860 data points. The initial elastic shear modulus, G0, should always be measured if possible, but a new empirical relation is shown to provide estimates within a factor of 1·6 for one standard deviation of random error, as determined from 379 tests. The new expressions for non-linear deformation are easy to apply in practice, and should be useful in the analysis of geotechnical structures under static loading.
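The modified hyperbolic degradation curve described above can be written directly; the parameter values below are illustrative placeholders, not the database-fitted ones (strains in percent):

```python
def modulus_ratio(gamma, gamma_e=0.0007, gamma_r=0.044, a=0.88):
    """Secant shear modulus degradation G/G0 via a modified hyperbola.

    gamma_e : elastic threshold strain (G = G0 below it)
    gamma_r : reference strain controlling where G/G0 reaches 0.5
    a       : curvature parameter controlling the reduction rate
    All parameter values here are illustrative assumptions.
    """
    if gamma <= gamma_e:
        return 1.0                # below the elastic threshold: G = G0
    return 1.0 / (1.0 + ((gamma - gamma_e) / gamma_r) ** a)
```

In this form the ratio equals 0.5 at γ = γe + γr; since γe is small relative to γr, this is consistent with the abstract's definition of the reference strain.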
Abstract:
This paper presents the vulnerabilities to single event effects (SEEs) of static random access memories (SRAMs), simulated with heavy ions on the ground and observed on the SJ-5 research satellite in space. A single event upset (SEU) prediction code has been used to estimate proton-induced upset rates based on the ground-test curve of SEU cross-section versus heavy-ion linear energy transfer (LET). The result agrees with the flight data.
Abstract:
In order to use Ge and III-V materials widely in place of Si in advanced CMOS technology, the processing and integration of these materials has to be well established so that their high-mobility benefit is not swamped by imperfect manufacturing procedures. In this dissertation, a number of key bottlenecks in the realization of Ge devices are investigated. We address the challenge of forming low-resistivity contacts on n-type Ge, comparing conventional rapid thermal annealing (RTA) and advanced laser thermal annealing (LTA) techniques. LTA appears to be a feasible approach for realizing low-resistivity contacts, with a remarkably sharp germanide-substrate interface and a contact resistivity on the order of 10^-7 Ω·cm^2. Furthermore, the influence of RTA and LTA on dopant activation and leakage-current suppression in n+/p Ge junctions was compared. While LTA provided a very high active carrier concentration (> 10^20 cm^-3), it resulted in higher leakage current than RTA, which provided a lower carrier concentration (~10^19 cm^-3); this indicates a trade-off between activation level and junction leakage current. A high I_ON/I_OFF ratio of ~10^7 was obtained, which to the best of our knowledge is the best value reported for n-type Ge so far. Simulations were carried out to investigate how target sputtering, dose retention, and damage formation are generated in thin-body semiconductors by energetic ion impacts, and how they depend on the physical properties of the target material. Solid-phase epitaxy studies in wide and thin Ge fins confirmed the formation of twin-boundary defects and random nucleation growth, as in Si, but here a 600 °C annealing temperature was found to be effective in reducing these defects. Finally, a non-destructive doping technique was successfully implemented to dope Ge nanowires, whose resistivity was reduced by five orders of magnitude using a PH3-based in-diffusion process.
Abstract:
Timing-related defects are major contributors to test escapes and in-field reliability problems for very-deep submicrometer integrated circuits. Small delay variations induced by crosstalk, process variations, power-supply noise, as well as resistive opens and shorts can potentially cause timing failures in a design, thereby leading to quality and reliability concerns. We present a test-grading technique that uses the method of output deviations for screening small-delay defects (SDDs). A new gate-delay defect probability measure is defined to model delay variations for nanometer technologies. The proposed technique intelligently selects the best set of patterns for SDD detection from an n-detect pattern set generated using timing-unaware automatic test-pattern generation (ATPG). It offers significantly lower computational complexity and excites a larger number of long paths compared to a current generation commercial timing-aware ATPG tool. Our results also show that, for the same pattern count, the selected patterns provide more effective coverage ramp-up than timing-aware ATPG and a recent pattern-selection method for random SDDs potentially caused by resistive shorts, resistive opens, and process variations. © 2010 IEEE.
Abstract:
1. A first step in the analysis of complex movement data often involves discretisation of the path into a series of step-lengths and turns, for example in the analysis of specialised random walks such as Lévy flights. However, the identification of turning points, and therefore step-lengths, in a tortuous path depends on ad hoc parameter choices; consequently, studies testing for movement patterns such as Lévy flights in these data have generated debate. Studies focusing on one-dimensional (1D) data, as in the vertical displacements of marine pelagic predators, where turning points can be identified unambiguously, have nevertheless provided strong support for Lévy flight movement patterns. 2. Here, we investigate how step-length distributions in 3D movement patterns would be interpreted by tags recording in 1D (i.e. depth) and demonstrate the dimensional symmetry previously shown mathematically for Lévy-flight movements. We test the veracity of this symmetry by simulating several measurement errors common in empirical datasets and find Lévy patterns and exponents to be robust to low-quality movement data. 3. We then consider exponential and composite Brownian random walks and show that these also project into 1D with sufficient symmetry to be clearly identifiable as such. 4. By extending the symmetry paradigm, we propose a new methodology for step-length identification in 2D or 3D movement data. The methodology is successfully demonstrated in a re-analysis of wandering albatross Global Positioning System (GPS) location data previously analysed using a complex methodology to determine bird-landing locations as turning points in a Lévy walk. For these high-resolution GPS data, we show that there is strong evidence for albatross foraging patterns approximated by truncated Lévy flights spanning over 3·5 orders of magnitude. 5. Our simple methodology and freely available software can be used with any 2D or 3D movement data at any scale or resolution and are robust to common empirical measurement errors. The method should find wide applicability in the field of movement ecology, spanning the study of motile cells to humans.
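The 1D projection argument can be checked numerically with a minimal sketch that draws power-law step lengths with isotropic 3D directions and records only the vertical components, as a depth-only tag would; the exponent and sampling scheme are illustrative assumptions:

```python
import numpy as np

def levy_steps_3d_projected(n=10000, mu=2.0, seed=0):
    """Draw 3-D Levy-walk steps (power-law lengths, isotropic
    directions) and return their absolute 1-D vertical components.

    Step lengths follow P(l) ~ l^(-mu) for l >= 1 via inverse-
    transform sampling. For a uniformly random unit direction in
    3-D, the vertical component is uniform on [-1, 1].
    """
    rng = np.random.default_rng(seed)
    lengths = rng.random(n) ** (-1.0 / (mu - 1.0))  # Pareto-tailed lengths
    cos_theta = rng.uniform(-1.0, 1.0, n)           # vertical direction cosine
    return np.abs(lengths * cos_theta)
```

Fitting a power law to the projected components and comparing the exponent with `mu` is one way to exhibit the dimensional symmetry discussed in point 2.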
Abstract:
Scepticism over stated preference surveys conducted online revolves around concerns about “professional respondents” who might rush through the questionnaire without sufficiently considering the information provided. To gain insight into this phenomenon and to test the effect of response time on choice randomness, this study makes use of a recently conducted choice experiment survey on the ecological and amenity effects of an offshore windfarm in the UK. The positive relationship between self-rated and inferred attribute attendance and response time is taken as evidence of a link between response time and cognitive effort. Subsequently, the generalised multinomial logit model is employed to test the effect of response time on scale, which indicates the weight of the deterministic component relative to the error component in the random utility model. Results show that longer response time increases scale, i.e. decreases choice randomness. This positive scale effect of response time is further found to be non-linear, wearing off beyond a point at which extreme response time decreases scale. While response time does not systematically affect welfare estimates, higher response time increases their precision. These effects persist when self-reported choice certainty is controlled for. Implications of the results for online stated preference surveys and further research are discussed.
Abstract:
Ensis siliqua is regarded as an increasingly valuable fishery resource with potential for commercial aquaculture in many European countries. The genetic variation of this razor clam was analysed by randomly amplified polymorphic DNA (RAPD) in six populations from Spain, Portugal and Ireland. Of the 40 primers tested, five were chosen to assess genetic variation. A total of 61 RAPD loci were scored, ranging in size from 400 to 2000 bp. The percentage of polymorphic loci, the effective allele number and the genetic diversity were comparable among populations, and demonstrated a high level of genetic variability. Nei's genetic distances were small among the Spanish and Portuguese populations (0.051-0.065), and high between these and the Irish populations. Cluster and principal coordinate analyses supported these findings. A Mantel test performed between the geographic and genetic distance matrices showed a significant correlation (r=0.84, P
Abstract:
In this paper we present the application of Hidden Conditional Random Fields (HCRFs) to modelling speech for visual speech recognition. HCRFs may be easily adapted to model long range dependencies across an observation sequence. As a result visual word recognition performance can be improved as the model is able to take more of a contextual approach to generating state sequences. Results are presented from a speaker-dependent, isolated digit, visual speech recognition task using comparisons with a baseline HMM system. We firstly illustrate that word recognition rates on clean video using HCRFs can be improved by increasing the number of past and future observations being taken into account by each state. Secondly we compare model performances using various levels of video compression on the test set. As far as we are aware this is the first attempted use of HCRFs for visual speech recognition.
Abstract:
A wealth of palaeoecological studies (e.g. pollen, diatoms, chironomids and macrofossils from deposits such as lakes or bogs) have revealed major as well as more subtle ecosystem changes over decadal to multimillennial timescales. Such ecosystem changes are usually assumed to have been forced by specific environmental changes. Here, we test if the observed changes in palaeoecological records may be reproduced by random simulations, and we find that simple procedures generate abrupt events, long-term trends, quasi-cyclic behaviour, extinctions and immigrations. Our results highlight the importance of replicated and multiproxy data for reliable reconstructions of past climate and environmental changes.
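The point about random simulations can be reproduced with the simplest possible null model, a Gaussian random walk; repeated runs of this sketch typically display apparent trends and abrupt shifts despite containing no environmental signal:

```python
import numpy as np

def random_proxy_series(n=500, seed=1):
    """A purely stochastic 'proxy record': cumulative sums of
    independent Gaussian increments. Any trends, shifts, or
    quasi-cycles in the output arise by chance alone."""
    rng = np.random.default_rng(seed)
    return np.cumsum(rng.standard_normal(n))
```

Comparing a single observed record against an ensemble of such null series, rather than interpreting it in isolation, is one way to act on the abstract's call for replicated and multiproxy data.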