258 results for Criterion


Relevance:

10.00%

Publisher:

Abstract:

Many ecosystems worldwide are dominated by introduced plant species, leading to loss of biodiversity and ecosystem function. A common but rarely tested assumption is that these plants are more abundant in introduced vs. native communities, because ecological or evolutionary-based shifts in populations underlie invasion success. Here, data for 26 herbaceous species at 39 sites, within eight countries, revealed that species abundances were similar at native (home) and introduced (away) sites – grass species were generally abundant home and away, while forbs were low in abundance, but more abundant at home. Sites with six or more of these species had similar community abundance hierarchies, suggesting that suites of introduced species are assembling similarly on different continents. Overall, we found that substantial changes to populations are not necessarily a pre-condition for invasion success and that increases in species abundance are unusual. Instead, abundance at home predicts abundance away, a potentially useful additional criterion for biosecurity programmes.

Abstract:

A priority when designing control strategies for autonomous underwater vehicles is to account for the cost of implementation on a real vehicle while minimizing a prescribed criterion such as time, energy, payload, or a combination of these. Indeed, the major issue is that, due to the vehicles' design and the actuation modes usually under consideration for underwater platforms, the number of actuator switchings must be kept small to ensure feasibility and precision. This constraint is typically not satisfied by optimal trajectories, which might not even be piecewise constant. Our goal is to provide a feasible trajectory that minimizes the number of switchings while maintaining some qualities of the desired trajectory, such as optimality with respect to a given criterion. The one-sided Lipschitz constant is used to derive theoretical estimates. The theory is illustrated on two examples: a fully actuated underwater vehicle capable of motion in six degrees of freedom, and a minimally actuated vehicle with control motions constrained to the vertical plane.

Abstract:

In the context of ambiguity resolution (AR) for Global Navigation Satellite Systems (GNSS), decorrelation of the entries of the ambiguity vector, integer ambiguity search, and ambiguity validation are the three standard procedures for solving integer least-squares problems. This paper contributes to AR from three aspects. Firstly, the orthogonality defect is introduced as a new measure of the performance of ambiguity decorrelation methods and compared with the decorrelation number and the condition number, which are currently used as criteria for measuring the correlation of the ambiguity variance-covariance matrix. Numerically, the orthogonality defect demonstrates slightly better performance than the condition number as a measure linking decorrelation impact and computational efficiency. Secondly, the paper examines the relationship of the decorrelation number, the condition number, the orthogonality defect and the size of the ambiguity search space with the ambiguity search candidates and search nodes. The size of the ambiguity search space can be properly estimated if the ambiguity matrix is well decorrelated, and is shown to be a significant parameter in the ambiguity search process. Thirdly, a new ambiguity resolution scheme is proposed to improve search efficiency by controlling the size of the ambiguity search space. The new AR scheme combines the LAMBDA search and validation procedures, resulting in a much smaller search space and higher computational efficiency while retaining the same AR validation outcomes. In fact, the new scheme can handle the case where there is only one candidate, whereas existing search methods require at least two candidates. If there is more than one candidate, the new scheme reverts to the usual ratio-test procedure.
Experimental results indicate that this combined method can indeed improve ambiguity search efficiency for both single and dual constellations, showing the potential for processing high-dimension integer parameters in multi-GNSS environments.
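The orthogonality defect used above as a decorrelation measure has a compact definition: the product of the basis-vector norms divided by the absolute determinant, equal to 1 for a perfectly orthogonal basis. A minimal sketch of that definition only (not code from the paper; the example matrices are illustrative):

```python
import numpy as np

def orthogonality_defect(B):
    """Orthogonality defect of a basis matrix B (columns are basis vectors):
    the product of the column norms divided by |det(B)|. It equals 1 for an
    orthogonal basis and grows as the columns become more correlated, so
    smaller values indicate better decorrelation."""
    column_norms = np.linalg.norm(B, axis=0)
    return float(np.prod(column_norms) / abs(np.linalg.det(B)))

# An orthogonal basis scores exactly 1; a strongly correlated one scores higher.
identity = np.eye(2)
correlated = np.array([[1.0, 0.9],
                       [0.9, 1.0]])
```

A decorrelating transformation (such as the one in LAMBDA) would aim to drive this value toward 1.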

Abstract:

Purpose: Clinical studies suggest that foot pain may be problematic in one-third of patients in early disease. The Foot Health Status Questionnaire (FHSQ) was developed and validated to evaluate the effectiveness of conservative (orthoses, taping, stretching) and surgical interventions. Despite this, there are few validated instruments that measure foot health status in Spanish. Thus, the primary aim of the current study was to translate and psychometrically evaluate a Spanish version of the FHSQ. Methods: A cross-sectional study was designed in a university community-based podiatric clinic in the south of Spain. All participants (n = 107), recruited consecutively, completed the Spanish version of the FHSQ and the EuroQoL 5-dimension health questionnaire, and 29 participants repeated these same measures 48 h later. Data analysis included test–retest reliability, construct and criterion-related validity, and factor analyses. Results: Construct validity was appropriate, with moderate-to-high corrected item–subscale correlations (α ≥ 0.739) for all subscales. Test–retest reliability was satisfactory (ICC > 0.932). Factor analysis revealed four dimensions explaining 86.6 % of the common variance. The confirmatory factor analysis findings demonstrated that the proposed structure was well supported (comparative fit index = 0.92, standardized root mean square = 0.09). The Spanish EuroQoL 5D score correlated negatively with FHSQ pain (r = −0.445) and positively with general foot health and function (r = 0.261–0.579), confirming criterion-related validity. Conclusion: The clinimetric properties of the Spanish version of the FHSQ were satisfactory.
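Internal consistency of the kind reported here (α ≥ 0.739 per subscale) is conventionally summarised with Cronbach's alpha. A hedged sketch of the standard formula, not the authors' analysis code; the item scores below are invented for illustration:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency. `items` is a list of
    item-score lists, one per questionnaire item, with respondents in the
    same order in each list."""
    k = len(items)
    n = len(items[0])

    def sample_var(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across all items.
    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(sample_var(i) for i in items)
                          / sample_var(totals))

# Hypothetical data: three perfectly parallel items give alpha = 1.
parallel = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
```

Less consistent items pull the value below 1, which is why per-subscale thresholds such as 0.7 are used as acceptability criteria.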

Abstract:

Urban renewal is a significant issue in developed urban areas, and a particular problem for urban planners is redeveloping land to meet demand whilst ensuring compatibility with existing land use. This paper presents a geographic information systems (GIS)-based decision support tool (called LUDS) to quantitatively assess land-use suitability for site redevelopment in urban renewal areas. The tool consists of a model for the suitability analysis and an affiliated land-information database for residential, commercial, industrial, G/I/C (government/institution/community) and open space land uses. It was developed with support from interviews with industry experts, focus group meetings and an experimental trial, combined with several advanced techniques and tools, including GIS data processing and spatial analysis, multi-criterion analysis, and the AHP method for constructing the model and database. As demonstrated in the trial, LUDS assists planners in making land-use decisions and supports the planning process in assessing urban land-use suitability for site redevelopment. Moreover, it facilitates public consultation (participatory planning) by providing stakeholders with an explicit understanding of planners' views.
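The AHP step mentioned above derives criterion weights from a pairwise-comparison matrix, conventionally via its principal eigenvector. A minimal sketch under that assumption; the comparison values are hypothetical, not taken from the LUDS tool:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights from an AHP pairwise-comparison matrix, taken as
    the principal right eigenvector normalised to sum to 1."""
    values, vectors = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = np.argmax(values.real)
    w = np.abs(vectors[:, principal].real)
    return w / w.sum()

# Hypothetical 3-criterion comparison on Saaty's 1-9 scale: criterion 1 is
# moderately more important than 2 and strongly more important than 3.
M = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
weights = ahp_weights(M)
```

In a full AHP workflow one would also check the consistency ratio of the matrix before accepting the weights.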

Abstract:

Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and have moved toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in Bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors grouped by their regulatory role and corresponding promoter strength.
Our study of E. coli σ70 promoters found support at the 0.1 significance level for our hypothesis that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigations when more experimentally confirmed data are available. In our preliminary exploration of relationships between the key regulatory components in E. coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters where corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigations when more experimentally confirmed data become available. Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel.
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDifi software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept.
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the predicted regulatory interactions. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study.
We identified a set of promising feature attributes; demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques, which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
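The spectrum kernel used above for TFBS refinement has a simple core: represent each sequence by its k-mer counts and take the inner product of the two count vectors. A sketch of that idea only; the SVM training, position features, and the CRP data are beyond this fragment:

```python
from collections import Counter

def spectrum(seq, k):
    """Map a sequence to its k-mer count vector (its 'spectrum')."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(a, b, k=3):
    """Spectrum kernel: inner product of the two k-mer count vectors.
    Missing k-mers count as zero, so unrelated sequences score 0."""
    sa, sb = spectrum(a, k), spectrum(b, k)
    return sum(count * sb[kmer] for kmer, count in sa.items())
```

In practice this kernel would be plugged into an SVM (e.g. via a precomputed Gram matrix) to classify candidate binding sites.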

Abstract:

Reliable ambiguity resolution (AR) is essential to Real-Time Kinematic (RTK) positioning and its applications, since incorrect ambiguity fixing can lead to significantly biased positioning solutions. A partial ambiguity fixing technique is developed to improve the reliability of AR, involving partial ambiguity decorrelation (PAD) and partial ambiguity resolution (PAR). The decorrelation transformation can substantially amplify biases in the phase measurements; the purpose of PAD is to find the optimum trade-off between decorrelation and worst-case bias amplification. The concept of PAR refers to the case where only a subset of the ambiguities can be fixed correctly to their integers in the integer least-squares (ILS) estimation system at high success rates. As a result, RTK solutions can be derived from these integer-fixed phase measurements. This is meaningful provided that the number of reliably resolved phase measurements is sufficiently large for least-squares estimation of the RTK solutions as well. Considering the GPS constellation alone, partially fixed measurements are often insufficient for positioning. The AR reliability is usually characterised by the AR success rate. In this contribution, an AR validation decision matrix is first introduced to understand the impact of the success rate. Moreover, the AR risk probability is included for a more complete evaluation of AR reliability. We use 16 ambiguity variance-covariance matrices with different levels of success rate to analyse the relation between success rate and AR risk probability. Next, the paper examines how, during the PAD process, a bias in one measurement is propagated and amplified onto many others, leading to more than one wrong integer and affecting the success probability. Furthermore, the paper proposes a partial ambiguity fixing procedure with a predefined success rate criterion and a ratio-test in the ambiguity validation process.
In this paper, Galileo constellation data are tested with simulated observations. Numerical results from our experiment clearly demonstrate that only when the computed success rate is very high can AR validation provide decisions about the correctness of AR that are close to the real-world outcome, with both low AR risk and low false alarm probabilities. The results also indicate that the PAR procedure can automatically choose an adequate number of ambiguities to fix at a given high success rate from the multiple constellations, instead of fixing all the ambiguities. This is a benefit that multiple GNSS constellations can offer.
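The success rate that drives PAR has a closed form in the integer-bootstrapping case: a product of per-ambiguity terms over the conditional standard deviations. A sketch of that standard formula only; the sigma values below are illustrative, not from the Galileo experiment:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bootstrap_success_rate(conditional_stds):
    """Integer-bootstrapping success rate: the product, over the conditional
    ambiguity standard deviations sigma_{i|I} (in cycles), of the probability
    that each conditionally rounded ambiguity lands on the right integer."""
    rate = 1.0
    for sigma in conditional_stds:
        rate *= 2.0 * phi(1.0 / (2.0 * sigma)) - 1.0
    return rate
```

A PAR scheme of the kind described would fix only the subset of ambiguities whose cumulative product stays above the predefined success-rate criterion.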

Abstract:

This paper presents a novel technique for segmenting an audio stream into homogeneous regions according to speaker identities, background noise, music, environmental and channel conditions. Audio segmentation is useful in audio diarization systems, which aim to annotate an input audio stream with information that attributes temporal regions of the audio to their specific sources. The segmentation method introduced in this paper uses the Generalized Likelihood Ratio (GLR), computed between two adjacent sliding windows over preprocessed speech. This approach is inspired by the popular segmentation method proposed in the pioneering work of Chen and Gopalakrishnan, which uses the Bayesian Information Criterion (BIC) with an expanding search window; this paper aims to identify and address the shortcomings associated with that approach. The proposed segmentation strategy is evaluated on the 2002 Rich Transcription (RT-02) Evaluation dataset, achieving a miss rate of 19.47% and a false alarm rate of 16.94% at the optimal threshold.

Abstract:

This paper proposes the use of Bayesian approaches with the cross likelihood ratio (CLR) as a criterion for speaker clustering within a speaker diarization system, using eigenvoice modeling techniques. The CLR has previously been shown to be an effective decision criterion for speaker clustering using Gaussian mixture models. Recently, eigenvoice modeling has become an increasingly popular technique, due to its ability to adequately represent a speaker based on sparse training data, as well as to provide an improved capture of differences in speaker characteristics. The integration of eigenvoice modeling into the CLR framework, to capitalize on the advantages of both techniques, has also been shown to be beneficial for the speaker clustering task. Building on that success, this paper proposes the use of Bayesian methods to compute the conditional probabilities used in the CLR, thus effectively combining the eigenvoice-CLR framework with the advantages of a Bayesian approach to the diarization problem. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show improved clustering performance, with a 33.5% relative improvement in overall Diarization Error Rate (DER) compared to the baseline system.
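The CLR criterion scores how well each cluster's data fits the other cluster's model, relative to a background model (UBM). A toy 1-D Gaussian sketch of the formula only; real systems use eigenvoice or GMM models, and every parameter value below is invented:

```python
import math

def gauss_loglik(x, mu, var):
    """Total log-likelihood of the samples x under a 1-D Gaussian."""
    return sum(-0.5 * (math.log(2 * math.pi * var) + (v - mu) ** 2 / var)
               for v in x)

def clr(xi, xj, model_i, model_j, ubm):
    """Cross likelihood ratio between two clusters. Each model (and the UBM)
    is a (mean, variance) pair here; a higher CLR suggests the two clusters
    come from the same speaker, so the pair with the highest CLR is merged."""
    ni, nj = len(xi), len(xj)
    return ((gauss_loglik(xi, *model_j) - gauss_loglik(xi, *ubm)) / ni
            + (gauss_loglik(xj, *model_i) - gauss_loglik(xj, *ubm)) / nj)

# Toy clusters: xi and xj sit around 0 (same 'speaker'), xk around 5.
xi = [-0.2, 0.0, 0.2]
xj = [-0.1, 0.1, 0.3]
xk = [4.8, 5.0, 5.2]
same_speaker = clr(xi, xj, (0.0, 0.5), (0.1, 0.5), (2.5, 6.0))
different_speaker = clr(xi, xk, (0.0, 0.5), (5.0, 0.5), (2.5, 6.0))
```

The Bayesian variant proposed in the paper replaces the point-estimate likelihoods with conditional probabilities marginalised over the model parameters.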

Abstract:

Road traffic crashes have emerged as a major health problem around the world. Road crash fatalities and injuries have been reduced significantly in developed countries, but they are still an issue in low- and middle-income countries. The World Health Organization (WHO, 2009) estimates that the death toll from road crashes in low- and middle-income nations is more than 1 million people per year, or about 90% of the global road toll, even though these countries account for only 48% of the world's vehicles. Furthermore, it is estimated that approximately 265,000 people die every year in road crashes in South Asian countries, and Pakistan stands out with approximately 41,494 deaths per year. Pakistan has the highest rate of fatalities per 100,000 population in the region, and its road crash fatality rate of 25.3 per 100,000 population is more than three times Australia's. High numbers of road crashes not only cause pain and suffering to the population at large, but are also a serious drain on the country's economy, which Pakistan can ill afford. Most studies identify human factors as the main set of contributing factors to road crashes, well ahead of road environment and vehicle factors. In developing countries especially, attention and resources are required to improve things such as vehicle roadworthiness and poor road infrastructure. However, attention to human factors is also critical. Human factors which contribute to crashes include high-risk behaviours like speeding and drink driving, and neglect of protective behaviours such as helmet wearing and seat belt wearing. Much research has been devoted to the attitudes, beliefs and perceptions which contribute to these behaviours and omissions, in order to develop interventions aimed at increasing safer road use behaviours and thereby reducing crashes.
However, less progress has been made in addressing human factors contributing to crashes in developing countries as compared to the many improvements in road environments and vehicle standards, and this is especially true of fatalistic beliefs and behaviours. This is a significant omission, since in different cultures in developing countries there are strong worldviews in which predestination persists as a central idea, i.e. that one's life (and death) and other events have been mapped out and are predetermined. Fatalism refers to a particular way in which people regard the events that occur in their lives, usually expressed as a belief that an individual does not have personal control over circumstances and that their lives are determined through a divine or powerful external agency (Hazen & Ehiri, 2006). These views are at odds with the dominant themes of modern health promotion movements, and present significant challenges for health advocates who aim to avert road crashes and diminish their consequences. The limited literature on fatalism reveals that it is not a simple concept, with religion, culture, superstition, experience, education and degree of perceived control of one's life all being implicated in accounts of fatalism. One distinction in the literature that seems promising is the distinction between empirical and theological fatalism, although there are areas of uncertainty about how well-defined the distinction between these types of fatalism is. Research into road safety in Pakistan is scarce, as is the case for other South Asian countries. From the review of the literature conducted, it is clear that the descriptions given of the different belief systems in developing countries including Pakistan are not entirely helpful for health promotion purposes and that further research is warranted on the influence of fatalism, superstition and other related beliefs in road safety. 
Based on the information available, a conceptual framework is developed as a means of structuring and focusing the research and analysis. The framework is focused on the influence of fatalism, superstition, religion and culture on beliefs about crashes and road user behaviour. Accordingly, this research aims to provide an understanding of the operation of fatalism and related beliefs in Pakistan to assist in the development and implementation of effective and culturally appropriate interventions. The research examines the influence of fatalism, superstition, religious and cultural beliefs on risky road use in Pakistan and is guided by three research questions: 1. What are the perceptions of road crash causation in Pakistan, in particular the role of fatalism, superstition, religious and cultural beliefs? 2. How do fatalism, superstition, and religious and cultural beliefs influence road user behaviour in Pakistan? 3. Do fatalism, superstition, and religious and cultural beliefs work as obstacles to road safety interventions in Pakistan? To address these questions, a qualitative research methodology was developed. The research focused on gathering data through individual in-depth interviewing using a semi-structured interview format. A sample of 30 participants was interviewed in Pakistan in the cities of Lahore, Rawalpindi and Islamabad. The participants included policy makers (with responsibility for traffic law), experienced police officers, religious orators, professional drivers (truck, bus and taxi) and general drivers selected through a combination of purposive, criterion and snowball sampling. The transcripts were translated from Urdu and analysed using a thematic analysis approach guided by the conceptual framework.
The findings were divided into four areas: attribution of crash causation to fatalism; attribution of road crashes to beliefs about superstition and malicious acts; beliefs about road crash causation linked to popular concepts of religion; and implications for behaviour, safety and enforcement. Fatalism was almost universally evident, and expressed in a number of ways. Fate was used to rationalise fatal crashes using the argument that the people killed were destined to die that day, one way or another. Related to this was the sense of either not being fully in control of the vehicle, or not needing to take safety precautions, because crashes were predestined anyway. A variety of superstitious-based crash attributions and coping methods to deal with road crashes were also found, such as belief in the role of the evil eye in contributing to road crashes and the use of black magic by rivals or enemies as a crash cause. There were also beliefs related to popular conceptions of religion, such as the role of crashes as a test of life or a source of martyrdom. However, superstitions did not appear to be an alternative to religious beliefs. Fate appeared as the 'default attribution' for a crash when all other explanations failed to account for the incident. This pervasive belief was utilised to justify risky road use behaviour and to resist messages about preventive measures. There was a strong religious underpinning to the statement of fatalistic beliefs (this reflects popular conceptions of Islam rather than scholarly interpretations), but also an overlap with superstitious and other culturally and religious-based beliefs which have longer-standing roots in Pakistani culture. A particular issue which is explored in more detail is the way in which these beliefs and their interpretation within Pakistani society contributed to poor police reporting of crashes. 
The pervasive nature of fatalistic beliefs in Pakistan affects road user behaviour by supporting continued risk-taking behaviour on the road, and by interfering with public health messages about behaviours which would reduce the risk of traffic crashes. The widespread influence of these beliefs on the ways that people respond to traffic crashes and the death of family members contributes to low crash reporting rates and to a system which appears difficult to change. Fate also appeared to be a major contributing factor to non-reporting of road crashes. There also appeared to be a relationship between police enforcement and (lack of) awareness of road rules. It also appears likely that beliefs can influence police work, especially in the case of road crash investigation and the development of strategies. It is anticipated that the findings could be used as a blueprint for the design of interventions aimed at influencing broad-spectrum health attitudes and practices among the communities where fatalism is prevalent. The findings have also identified aspects of beliefs that have complex social implications when designing and piloting driver intervention strategies. By understanding attitudes and behaviours related to fatalism, superstition and other related concepts, it should be possible to improve the education of general road users, such that they are less likely to attribute road crashes to chance, fate, or superstition. This study also underscores the understanding of this issue in high echelons of society (e.g., policy makers, senior police officers) as their role is vital in dispelling road users' misconceptions about the risks of road crashes. The promotion of an evidence- or scientifically-based approach to road user behaviour and road safety is recommended, along with improved professional education for police and policy makers.

Abstract:

There is a worldwide demand for an increasingly sustainable built environment, which has created the need for more accurate evaluation of the level of sustainability of construction projects. This in turn requires the development of better measurement and benchmarking methods. One approach is to use a theoretical model to assess construction projects in terms of their sustainable development value (SDV) and sustainable development ability (SDA) for implementation in the project life cycle, where SDA measures the contribution of a project to development sustainability and serves as a major criterion for assessing its feasibility. This paper develops an improved SDA prototype model that incorporates the effects of dynamic factors on project sustainability, introducing two major factors concerning technological advancement and changes in people's perceptions. A case study is used to demonstrate the procedures involved in simulation and modeling; one outcome is to demonstrate the greater influence of technological advancement on project sustainability than changes in perception.

Abstract:

Purpose – The purpose of this paper is to provide a new type of entry mode decision-making model for construction enterprises involved in international business. Design/methodology/approach – A hybrid method combining the analytic hierarchy process (AHP) with the preference ranking organization method for enrichment evaluations (PROMETHEE) is used to aid entry mode decisions. The AHP is used to decompose the entry mode problem into several dimensions and determine the weight of each criterion. In addition, the PROMETHEE method is used to rank candidate entry modes and carry out sensitivity analyses. Findings – The proposed decision-making method is demonstrated to be a suitable approach to resolve the entry mode selection decision problem. Practical implications – The research provides practitioners with a more systematic decision framework and a more precise decision method. Originality/value – The paper sheds light on the further development of entry strategies for international construction markets. It not only introduces a new decision-making model for entry mode decision making, but also provides a conceptual framework with five determinants for a construction company's entry mode selection based on the unique properties of the construction industry.
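PROMETHEE ranks alternatives by pairwise preference flows, and PROMETHEE II produces a complete ranking from net flows. A minimal sketch using the 'usual' (strict-dominance) preference function; the scores and weights below are hypothetical, not from the paper:

```python
def promethee_net_flows(scores, weights):
    """PROMETHEE II net outranking flows with the 'usual' preference
    function (full preference when strictly better on a criterion).
    scores[a][c] is alternative a's score on criterion c (higher is
    better); weights should sum to 1."""
    n = len(scores)

    def preference(a, b):
        # Weighted sum over criteria on which a strictly beats b.
        return sum(w for sa, sb, w in zip(scores[a], scores[b], weights)
                   if sa > sb)

    return [sum(preference(a, b) - preference(b, a)
                for b in range(n) if b != a) / (n - 1)
            for a in range(n)]

# Three hypothetical entry modes scored on two equally weighted criteria.
flows = promethee_net_flows([[3, 3], [2, 2], [1, 1]], [0.5, 0.5])
```

In the hybrid method described, the weights would come from the AHP step rather than being set by hand, and richer preference functions (linear, Gaussian) are common.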

We sought to determine the impact of electrospinning parameters on a trustworthy criterion that could demonstrably improve the applicability of fibrous scaffolds for tissue regeneration. We used an image-analysis technique to derive a web permeability index (WPI) by modeling the formation of electrospun scaffolds. Poly(3-hydroxybutyrate) (P3HB) scaffolds were fabricated according to predetermined factor levels in a Taguchi orthogonal design. The material parameters were the polymer concentration and the conductivity and volatility of the solution; the processing parameters were the applied voltage and the nozzle-to-collector distance. Monitoring the WPI values revealed a consistent trend: as the polymer concentration or the applied voltage increased, pore interconnectivity decreased. The degree of jet instability altered the pore numbers, areas, and other structural characteristics, all of which determined the scaffold porosity and pore interconnectivity. An initial drastic increase in the WPI values was observed above a 6 wt % P3HB content because of chain entanglement. Although the solution mixture significantly (p < 0.05) changed the scaffold's architectural characteristics as a function of solution viscosity and surface tension, it had only a minor impact on the WPI values, ranking third in significance; the nozzle-to-collector distance proved to be the least important factor.
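The WPI is derived from the authors' own image-analysis model and is not reproduced here, but the kind of pore statistics it rests on, namely pore counts, areas, and interconnectivity extracted from a binarized micrograph, can be sketched. The tiny example image and the toy interconnectivity index below are illustrative assumptions only:

```python
from collections import deque

# Binarized micrograph sketch: 1 = pore (open space), 0 = fibre.
# The 6x6 example and the toy index below are illustrative assumptions.
img = [[1, 1, 0, 0, 1, 1],
       [1, 0, 0, 1, 1, 1],
       [0, 0, 1, 1, 0, 0],
       [0, 1, 1, 0, 0, 1],
       [1, 1, 0, 0, 1, 1],
       [1, 0, 0, 0, 1, 1]]

def pore_regions(img):
    """Label 4-connected pore regions with BFS; return their areas."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if img[r][c] == 1 and not seen[r][c]:
                area, q = 0, deque([(r, c)])
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                areas.append(area)
    return areas

areas = pore_regions(img)
porosity = sum(areas) / 36           # pore-area fraction of the image
# Toy interconnectivity: share of pore area in the largest connected
# region -- fewer, larger connected pores score higher.
interconnectivity = max(areas) / sum(areas)
```

A real pipeline would threshold grayscale SEM images and aggregate such statistics over many frames; the point here is only the shape of the computation.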

This paper investigates advanced channel compensation techniques for improving i-vector speaker verification performance in the presence of high intersession variability, using the NIST 2008 and 2010 SRE corpora. The performance of four channel compensation techniques is investigated: (a) weighted maximum margin criterion (WMMC), (b) source-normalized WMMC (SN-WMMC), (c) weighted linear discriminant analysis (WLDA), and (d) source-normalized WLDA (SN-WLDA). We show that, by extracting the discriminatory information between pairs of speakers as well as capturing the source variation information in the development i-vector space, the SN-WLDA-based cosine similarity scoring (CSS) i-vector system provides over 20% improvement in EER for NIST 2008 interview and microphone verification and over 10% improvement in EER for NIST 2008 telephone verification, compared to an SN-LDA-based CSS i-vector system. Further, score-level fusion techniques are analyzed to combine the best channel compensation approaches, providing over 8% improvement in DCF over the best single approach (SN-WLDA) for the NIST 2008 interview/telephone enrolment-verification condition. Finally, we demonstrate that the improvements found in the context of CSS also generalize to state-of-the-art GPLDA, with up to 14% relative improvement in EER for NIST SRE 2010 interview and microphone verification and over 7% relative improvement in EER for NIST SRE 2010 telephone verification.
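The exact WLDA formulations are not given in the abstract, but the general shape of the pipeline can be sketched: estimate a channel-compensation projection from labelled development i-vectors using a weighted pairwise between-class scatter, then score verification trials with cosine similarity in the projected space. The toy data, the inverse-squared-distance weighting, and the dimensionalities below are assumptions for illustration, not the paper's formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_spk, n_per = 10, 5, 8
# Toy development i-vectors: speaker means plus channel/session noise.
means = rng.normal(size=(n_spk, dim))
dev = np.concatenate([m + 0.3 * rng.normal(size=(n_per, dim)) for m in means])
labels = np.repeat(np.arange(n_spk), n_per)

# Weighted between-class scatter: closer speaker pairs (harder to
# separate) get larger weights -- one common WLDA-style choice,
# assumed here for illustration.
Sb = np.zeros((dim, dim))
for i in range(n_spk):
    for j in range(i + 1, n_spk):
        d = means[i] - means[j]
        Sb += (1.0 / (d @ d)) * np.outer(d, d)
# Within-class scatter from speaker-centred development vectors.
Sw = np.cov(dev - means[labels], rowvar=False)

# Channel-compensation projection = top eigenvectors of Sw^-1 Sb.
evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
order = np.argsort(evals.real)[::-1]
W = evecs[:, order[:4]].real          # keep 4 discriminant directions

def css(x, y):
    """Cosine similarity score between two projected i-vectors."""
    px, py = W.T @ x, W.T @ y
    return (px @ py) / (np.linalg.norm(px) * np.linalg.norm(py))

same_spk_score = css(dev[0], dev[1])    # trial from the same speaker
diff_spk_score = css(dev[0], dev[-1])   # trial from different speakers
```

In a real system the projection would be trained on thousands of development i-vectors and the scores calibrated against a decision threshold; source normalization additionally pools scatter statistics per source (telephone, microphone, interview).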

Context: The Ober and Thomas tests are subjective and involve a "negative" or "positive" assessment, making them difficult to apply within the paradigm of evidence-based medicine. No authors have combined the subjective clinical assessment with an objective measurement for these special tests. Objective: To compare the subjective assessment of iliotibial band and iliopsoas flexibility with the objective measurement of a digital inclinometer, to establish normative values, and to provide an evidence-based critical criterion for determining tissue tightness. Design: Cross-sectional study. Setting: Clinical research laboratory. Patients or Other Participants: Three hundred recreational athletes (125 men, 175 women; 250 in injured group, 50 in control group). Main Outcome Measure(s): Iliotibial band and iliopsoas muscle flexibility were determined subjectively using the modified Ober and Thomas tests, respectively. Using a digital inclinometer, we objectively measured limb position. Interrater reliability for the subjective assessment was compared between 2 clinicians for a random sample of 100 injured participants, who were classified subjectively as either negative or positive for iliotibial band and iliopsoas tightness. Percentage of agreement indicated interrater reliability for the subjective assessment. Results: For iliotibial band flexibility, the average inclinometer angle was -24.59 degrees +/- 7.27 degrees. A total of 432 limbs were subjectively assessed as negative (-27.13 degrees +/- 5.53 degrees) and 168 as positive (-16.29 degrees +/- 6.87 degrees). For iliopsoas flexibility, the average inclinometer angle was -10.60 degrees +/- 9.61 degrees. A total of 392 limbs were subjectively assessed as negative (-15.51 degrees +/- 5.82 degrees) and 208 as positive (0.34 degrees +/- 7.00 degrees). The critical criteria for iliotibial band and iliopsoas flexibility were determined to be -23.16 degrees and -9.69 degrees, respectively. Between-clinicians agreement was very good: 95.0% for the Thomas test and 97.6% for the Ober test. Conclusions: Subjective assessments and instrumented measurements were combined to establish normative values and critical criteria for tissue flexibility for the modified Ober and Thomas tests.
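The abstract does not state the formula behind the critical criteria, but one standard way to turn two group distributions into a single cutoff is to take the point where their fitted normal densities intersect. Applying that derivation, purely as an illustration and not as the authors' stated method, to the iliotibial-band means and standard deviations reported above yields a value near the published -23.16 degrees:

```python
import math

# Reported iliotibial-band values (degrees, mean +/- SD).
mu1, sd1 = -27.13, 5.53   # limbs assessed subjectively as negative
mu2, sd2 = -16.29, 6.87   # limbs assessed subjectively as positive

# Intersection of the two fitted normal densities: set the log-densities
# equal and solve the resulting quadratic. Illustrative derivation only;
# the authors' actual method is not specified in the abstract.
a = 1 / sd1**2 - 1 / sd2**2
b = -2 * (mu1 / sd1**2 - mu2 / sd2**2)
c = (mu1**2 / sd1**2 - mu2**2 / sd2**2) + 2 * math.log(sd2 / sd1)
disc = math.sqrt(b * b - 4 * a * c)
roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
# Keep the root lying between the two group means.
cutoff = next(r for r in roots if mu1 < r < mu2)
```

With these inputs the intersection falls close to (though not exactly at) the reported -23.16-degree criterion, which is consistent with a density-based cutoff but does not confirm it.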