196 results for Gaussian Plume model for multiple sources for Cochin


Relevance:

100.00%

Publisher:

Abstract:

Gaussian mixture models (GMMs) have become an established means of modeling feature distributions in speaker recognition systems. It is useful for experimentation and practical implementation purposes to develop and test these models in an efficient manner, particularly when computational resources are limited. A method of combining vector quantization (VQ) with single multi-dimensional Gaussians is proposed to rapidly generate a robust model approximation to the Gaussian mixture model. A fast method of testing these systems is also proposed and implemented. Results on the NIST 1996 Speaker Recognition Database suggest comparable, and in some cases improved, verification performance relative to the traditional GMM-based analysis scheme. In addition, previous research on the task of speaker identification indicated similar system performance between the VQ Gaussian-based technique and GMMs.
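As a rough illustration of the idea described above, the following sketch (function names are illustrative, not those of the paper's implementation) vector-quantises the training vectors with k-means and fits one multi-dimensional Gaussian per codebook entry, then scores a frame by its best-matching cluster Gaussian rather than the weighted sum over all components of a full GMM:

```python
import numpy as np

def fit_vq_gaussians(X, n_clusters=4, n_iter=20, seed=0):
    """Vector-quantise the training vectors with k-means, then fit one
    diagonal-covariance Gaussian per codebook entry (a fast stand-in
    for EM-trained GMM components)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), n_clusters, replace=False)]
    for _ in range(n_iter):
        # VQ step: assign each vector to its nearest centroid
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centroids[k] = X[labels == k].mean(axis=0)
    means = centroids
    variances = np.stack([
        X[labels == k].var(axis=0) + 1e-6 if np.any(labels == k)
        else np.ones(X.shape[1])
        for k in range(n_clusters)
    ])
    return means, variances

def frame_log_likelihood(x, means, variances):
    """Score a feature frame by its best-matching cluster Gaussian,
    avoiding the weighted sum over all components of a full GMM."""
    ll = -0.5 * (np.log(2 * np.pi * variances)
                 + (x - means) ** 2 / variances).sum(axis=1)
    return ll.max()
```

Scoring by the single best cluster is one plausible source of the speed-up such a VQ approximation offers over evaluating every mixture component.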

Relevance:

100.00%

Publisher:

Abstract:

Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated and moved toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation was made with regard to the relationship between transcription factors grouped by their regulatory role and corresponding promoter strength.
Our study of E. coli σ70 promoters found support at the 0.1 significance level for our hypothesis that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigations when more experimentally confirmed data are available. In our preliminary exploration of relationships between the key regulatory components in E. coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters where corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigations when more experimentally confirmed data become available. Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel.
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter predictions [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept.
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While the common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of the changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships, and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study.
We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
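The spectrum kernel named above compares two sequences through their shared k-mers: its feature map is simply the vector of k-mer counts, and the kernel value is the inner product of those vectors. A minimal sketch, assuming DNA strings (function names are illustrative, not those used in the thesis):

```python
from collections import Counter

def spectrum_features(seq, k=3):
    """Feature map of the spectrum kernel: counts of every k-mer in seq."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(s1, s2, k=3):
    """Spectrum kernel value: inner product of the two k-mer count vectors."""
    f1, f2 = spectrum_features(s1, k), spectrum_features(s2, k)
    return sum(count * f2[kmer] for kmer, count in f1.items())
```

Because the kernel is an ordinary inner product of count vectors, it can be supplied to any kernel machine, for instance an SVM trained on a precomputed Gram matrix.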

Relevance:

100.00%

Publisher:

Abstract:

Evidence currently supports the view that intentional interpersonal coordination (IIC) is a self-organizing phenomenon facilitated by visual perception of co-actors in a coordinative coupling (Schmidt, Richardson, Arsenault, & Galantucci, 2007). The present study examines how apparent IIC is achieved in situations where visual information is limited for co-actors in a rowing boat. In paired rowing boats only one of the actors [bow seat] gets to see the actions of the other [stroke seat]. Thus IIC appears to be facilitated despite the lack of important visual information for the control of the dyad. Adopting a mimetic approach to expert coordination, the present study qualitatively examined the experiences of expert performers (N=9) and coaches (N=4) with respect to how IIC was achieved in paired rowing boats. Themes were explored using inductive content analysis, which led to a layered model of control. Rowers and coaches reported the use of multiple perceptual sources in order to achieve IIC. As expected (Kelso, 1995; Schmidt & O'Brien, 1997; Turvey, 1990), rowers in the bow of a pair boat make use of visual information provided by the partner in front of them [stroke]. However, this perceptual information is subordinate to perception of the relationship between the boat hull and the water passing beside it. Stroke seat, in the absence of visual information about his/her partner, achieves coordination by picking up information about the lifting or looming of the boat's stern along with water passage past the hull. In this case it appears that apparent or desired IIC is supported by the perception of extra-personal variables, in this case boat behavior, as this perceptual information source is used by both actors. To conclude, co-actors in two-person rowing boats use multiple sources of perceptual information for apparent IIC that changes according to task constraints.
Where visual information is restricted, IIC is facilitated via extra-personal perceptual information, and apparent IIC switches to intentional extra-personal coordination.

Relevance:

100.00%

Publisher:

Abstract:

Emergency service workers (e.g., fire-fighters, police and paramedics) are exposed to elevated levels of potentially traumatising events through the course of their work. Such exposure can have lasting negative consequences (e.g., Post Traumatic Stress Disorder; PTSD) and/or positive outcomes (e.g., Posttraumatic Growth; PTG). Research has implicated trauma, occupational and personal variables that account for variance in post-trauma outcomes, yet at this stage no research has investigated these factors and their relative influence on both PTSD and PTG in a single study. Based on Calhoun and Tedeschi's (2013) model of PTG and previous research, in this study regression models of PTG and PTSD symptoms among 218 fire-fighters were tested. Results indicated organisational factors predicted symptoms of PTSD, while there was partial support for the hypothesis that coping and social support would be predictors of PTG. Experiencing multiple sources of trauma, higher levels of organisational and operational stress, and utilising cognitive reappraisal coping were all significant predictors of PTSD symptoms. Increases in PTG were predicted by experiencing trauma from multiple sources and the use of self-care coping. Results highlight the importance of organisational factors in the development of PTSD symptoms, and of individual factors for promoting PTG.

Relevance:

100.00%

Publisher:

Abstract:

Countless studies have stressed the importance of social identity, particularly its role in various organizational outcomes, yet questions remain as to how identities initially develop, shift and change based on the configuration of multiple, pluralistic relationships grounded in an organizational setting. The interactive model of social identity formation, recently proposed to explain the internalization of shared norms and values (critical in identity formation), has not received empirical examination. We analyzed multiple sources of data from nine nuclear professionals over three years to understand the construction of social identity in new entrants to an organization. Informed by our data analyses, we found support for the interactive model, and found that age and level of experience influenced whether new entrants undertook an inductive or a deductive route to group norm and value internalization. This study represents an important contribution to the study of social identity and the process by which identities are formed, particularly under conditions of duress or significant organizational disruption.

Relevance:

100.00%

Publisher:

Abstract:

This thesis investigates condition monitoring (CM) of diesel engines using acoustic emission (AE) techniques. The AE signals recorded from a small diesel engine are mixtures of multiple sources from multiple cylinders, making it difficult to interpret the information conveyed in the signals for CM purposes. This thesis develops a series of practical signal processing techniques to overcome this problem. Various experimental studies were conducted to assess the CM capabilities of AE analysis for diesel engines, and a series of modified signal processing techniques were proposed. These techniques showed promising capability for CM of multi-cylinder diesel engines using multiple AE sensors.

Relevance:

100.00%

Publisher:

Abstract:

Network coding is a method for achieving channel capacity in networks. The key idea is to allow network routers to linearly mix packets as they traverse the network so that recipients receive linear combinations of packets. Network coded systems are vulnerable to pollution attacks where a single malicious node floods the network with bad packets and prevents the receiver from decoding correctly. Cryptographic defenses to these problems are based on homomorphic signatures and MACs. These proposals, however, cannot handle mixing of packets from multiple sources, which is needed to achieve the full benefits of network coding. In this paper we address integrity of multi-source mixing. We propose a security model for this setting and provide a generic construction.
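To make the packet mixing concrete, the sketch below is a toy over GF(2), whereas practical schemes typically work over GF(2^8); names are illustrative and the paper's homomorphic signatures are not shown. Source packets are encoded as random XOR combinations tagged with their coefficient vectors, and the receiver decodes by Gaussian elimination:

```python
import random

def encode(packets, n_coded, seed=0):
    """Each coded packet is an XOR of a random subset of the source
    packets (coefficients over GF(2)), tagged with its coefficient
    vector so any receiver can decode."""
    rng = random.Random(seed)
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randint(0, 1) for _ in packets]
        if not any(coeffs):           # avoid the useless all-zero combination
            coeffs[0] = 1
        payload = 0
        for c, p in zip(coeffs, packets):
            if c:
                payload ^= p
        coded.append((coeffs, payload))
    return coded

def decode(coded, n_src):
    """Recover the source packets by Gaussian elimination over GF(2)."""
    rows = [(c[:], p) for c, p in coded]
    for col in range(n_src):
        pivot = next((r for r in range(col, len(rows)) if rows[r][0][col]), None)
        if pivot is None:
            raise ValueError("not enough independent combinations received")
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][0][col]:
                rows[r] = ([a ^ b for a, b in zip(rows[r][0], rows[col][0])],
                           rows[r][1] ^ rows[col][1])
    return [rows[i][1] for i in range(n_src)]
```

The pollution attack the paper defends against corresponds to an intermediate node injecting a (coeffs, payload) pair that is not a valid combination of the sources; the cryptographic constructions let receivers reject such pairs before decoding.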

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we identify two types of injustice as antecedents of abusive supervision and, ultimately, of subordinate psychological distress and insomnia. We examine distributive injustice (an individual's evaluation of their input-to-output ratio compared with relevant others) and interactional injustice (the quality of interpersonal treatment received when procedures are implemented). Using a sample of Filipinos in a variety of occupations, we identify two types of injustice experienced by supervisors as stressors that provoke them to display abusive supervision to their subordinates. We examine two consequences of abusive supervision: subordinate psychological distress and insomnia. In addition, we identify two moderators of these relationships, namely supervisor distress and subordinate self-esteem. We collected survey data from multiple sources including subordinates, their supervisors, and their partners. Data were obtained from 175 matched supervisor-subordinate dyads over a 6-month period, with subordinates' partners providing ratings of insomnia. Results of structural equation modelling analyses provided support for an indirect effects model in which supervisors' experience of unfair treatment cascades down the organization, resulting in subordinate psychological distress and, ultimately, in their insomnia. In addition, results partially supported the proposed moderated relationships in the cascading model. © 2010 Taylor & Francis.

Relevance:

100.00%

Publisher:

Abstract:

This study examines the process by which newly recruited nuclear engineering and technical staff came to understand, define, think, feel and behave within a distinct group that contributes directly to the organization's overall emphasis on a culture of reliability and system safety. In the field of organizational behavior, the interactive model of social identity formation has recently been proposed to explain the process by which the internalization of shared norms and values occurs, an element critical in identity formation. Using this rich model of organizational behavior, we analyzed multiple sources of data from nine new hires over a period of three years, from the time they were employed, to investigate the construction of social identity by new entrants in a complex organizational setting: a nuclear facility. Informed by our data analyses, we found support for the interactive model of social identity development and report the unexpected finding that a newly appointed member's age and level of experience appear to influence the manner in which they adapt and assimilate into their surroundings. This study represents an important contribution to the safety and reliability literature as it provides rich insight into the way newly recruited employees enact the process by which their identities are formed, and hence act, particularly under conditions of duress or significant organizational disruption in complex organizational settings.

Relevance:

100.00%

Publisher:

Abstract:

In recent years, rapid advances in information technology have led to various data collection systems which are enriching the sources of empirical data for use in transport systems. Currently, traffic data are collected through various sensors including loop detectors, probe vehicles, cell-phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources will generally result in better accuracy, increased robustness and reduced ambiguity. Despite the fact that there have been substantial advances in data assimilation techniques to reconstruct and predict the traffic state from multiple data sources, such methods are generally data-driven and do not fully utilize the power of traffic models. Furthermore, the existing methods are still limited to freeway networks and are not yet applicable in the urban context due to the enhanced complexity of the flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, un-signalized or signalized, at which the switching of the traffic lights and the turning maneuvers of the road users lead to shock-wave phenomena that propagate upstream of the intersections. This paper develops a new model-based methodology to build up a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly from loop detectors and partial observations from Bluetooth and GPS devices.
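The claim that combining complementary sources improves accuracy and reduces ambiguity can be illustrated with the simplest fusion rule, inverse-variance weighting. This is a generic illustration only, not the model-based arterial prediction method the paper develops:

```python
import numpy as np

def fuse_estimates(estimates):
    """Inverse-variance weighting: combine independent sensor estimates
    of the same quantity (e.g. link speed from loops, Bluetooth and GPS)
    into one estimate whose variance is lower than any single source's."""
    means = np.array([m for m, _ in estimates], dtype=float)
    variances = np.array([v for _, v in estimates], dtype=float)
    weights = 1.0 / variances
    fused_mean = (weights * means).sum() / weights.sum()
    fused_var = 1.0 / weights.sum()
    return fused_mean, fused_var
```

The fused variance 1 / Σ(1/σᵢ²) is always below the smallest input variance, which is the formal sense in which multiple complementary sources "result in better accuracy".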

Relevance:

100.00%

Publisher:

Abstract:

The current state of the practice in Blackspot Identification (BSI) utilizes safety performance functions based on total crash counts to identify transport system sites with potentially high crash risk. This paper postulates that total crash count variation over a transport network is a result of multiple distinct crash generating processes, including geometric characteristics of the road, spatial features of the surrounding environment, and driver behaviour factors. However, these multiple sources are ignored in current modelling methodologies, both in explaining and in predicting crash frequencies across sites. Instead, current practice employs models that imply that a single underlying crash generating process exists. This model mis-specification may lead to correlating crashes with the incorrect sources of contributing factors (e.g. concluding a crash is predominately caused by a geometric feature when it is a behavioural issue), which may ultimately lead to inefficient use of public funds and misidentification of true blackspots. This study aims to propose a latent class model consistent with a multiple crash process theory, and to investigate the influence this model has on correctly identifying crash blackspots. We first present the theoretical and corresponding methodological approach, in which a Bayesian Latent Class (BLC) model is estimated assuming that crashes arise from two distinct risk generating processes, including engineering and unobserved spatial factors. The Bayesian model is used to incorporate prior information about the contribution of each underlying process to the total crash count. The methodology is applied to the state-controlled roads in Queensland, Australia, and the results are compared to an Empirical Bayesian Negative Binomial (EB-NB) model. A comparison of goodness-of-fit measures illustrates significantly improved performance of the proposed model compared to the EB-NB model.
The detection of blackspots was also improved when compared to the EB-NB model. In addition, modelling crashes as the result of two fundamentally separate underlying processes reveals more detailed information about unobserved crash causes.
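The latent-class idea, that observed counts arise from two distinct generating processes, can be sketched with a two-component Poisson mixture fitted by EM. Note this is a frequentist toy stand-in, not the Bayesian Latent Class model with prior information estimated in the paper:

```python
import numpy as np
from math import lgamma

def poisson_mixture_em(counts, n_iter=100):
    """EM for a two-component Poisson mixture: each site's crash count
    is attributed probabilistically to one of two latent generating
    processes with rates lam[0] and lam[1]."""
    counts = np.asarray(counts, dtype=float)
    logfact = np.array([lgamma(c + 1.0) for c in counts])
    lam = np.array([max(counts.mean() * 0.5, 0.1), counts.mean() * 1.5 + 0.1])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each latent process for each site
        logp = (counts[:, None] * np.log(lam)[None] - lam[None]
                - logfact[:, None] + np.log(pi)[None])
        logp -= logp.max(axis=1, keepdims=True)
        resp = np.exp(logp)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update mixing weights and component rates
        pi = resp.mean(axis=0)
        lam = np.maximum((resp * counts[:, None]).sum(axis=0) / resp.sum(axis=0),
                         1e-6)
    return pi, lam
```

In the paper's setting the two components would correspond to engineering and unobserved spatial factors, with priors expressing their expected contributions; here the components are simply recovered from the data.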

Relevance:

100.00%

Publisher:

Abstract:

Predicting temporal responses of ecosystems to disturbances associated with industrial activities is critical for their management and conservation. However, prediction of ecosystem responses is challenging due to the complexity and potential non-linearities stemming from interactions between system components and multiple environmental drivers. Prediction is particularly difficult for marine ecosystems due to their often highly variable and complex natures and the large uncertainties surrounding their dynamic responses. Consequently, current management of such systems often relies on expert judgement and/or complex quantitative models that consider only a subset of the relevant ecological processes. Hence there exists an urgent need for the development of whole-of-system predictive models to support decision and policy makers in managing complex marine systems in the context of industry-based disturbances. This paper presents Dynamic Bayesian Networks (DBNs) for predicting the temporal response of a marine ecosystem to anthropogenic disturbances. The DBN provides a visual representation of the problem domain in terms of factors (parts of the ecosystem) and their relationships. These relationships are quantified via Conditional Probability Tables (CPTs), which estimate the variability and uncertainty in the distribution of each factor. The combination of qualitative visual and quantitative elements in a DBN facilitates the integration of a wide array of data, published and expert knowledge, and other models. Such multiple sources are often essential, as a single source of information is rarely sufficient to cover the diverse range of factors relevant to a management task. Here, a DBN model is developed for tropical, annual Halophila and temperate, persistent Amphibolis seagrass meadows to inform dredging management and help meet environmental guidelines. Specifically, the impacts of capital (e.g. new port development) and maintenance (e.g.
maintaining channel depths in established ports) dredging are evaluated with respect to the risk of permanent loss, defined as no recovery within 5 years (Environmental Protection Agency guidelines). The model is developed using expert knowledge, existing literature, statistical models of environmental light, and experimental data. The model is then demonstrated in a case study through the analysis of a variety of dredging, environmental and seagrass ecosystem recovery scenarios. In spatial zones significantly affected by dredging, such as the zone of moderate impact, shoot density has a very high probability of being driven to zero by capital dredging due to the duration of such dredging. Here, fast-growing Halophila species can recover; however, the probability of recovery depends on the presence of seed banks. On the other hand, slow-growing Amphibolis meadows have a high probability of suffering permanent loss. However, in the maintenance dredging scenario, due to the shorter duration of dredging, Amphibolis is better able to resist the impacts of dredging. For both types of seagrass meadows, the probability of loss was strongly dependent on the biological and ecological status of the meadow, as well as environmental conditions post-dredging. The ability to predict the ecosystem response under cumulative, non-linear interactions across a complex ecosystem highlights the utility of DBNs for decision support and environmental management.
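The CPT-based temporal propagation a DBN performs can be sketched with a single factor (meadow state) whose transition table is conditioned on whether dredging is active. All probabilities below are invented for illustration, not the elicited values used in the study:

```python
import numpy as np

# Hypothetical CPT for one factor (meadow shoot-density state) conditioned
# on the dredging condition.  States: 0 = lost, 1 = degraded, 2 = healthy.
# Every row is a probability distribution (sums to 1); all values invented.
CPT = {
    True:  np.array([[1.0, 0.0, 0.0],    # dredging active: pressure toward loss
                     [0.5, 0.4, 0.1],
                     [0.1, 0.5, 0.4]]),
    False: np.array([[0.9, 0.1, 0.0],    # dredging off: recovery possible
                     [0.1, 0.6, 0.3],
                     [0.0, 0.2, 0.8]]),
}

def forward(belief, dredging_schedule):
    """Propagate the belief over the meadow state through time, multiplying
    by the transition CPT matching each step's dredging condition."""
    belief = np.asarray(belief, dtype=float)
    for dredging in dredging_schedule:
        belief = belief @ CPT[dredging]
    return belief

# Long capital-dredging campaign vs a short maintenance campaign,
# each followed by the same recovery period; start from a healthy meadow.
capital = forward([0.0, 0.0, 1.0], [True] * 12 + [False] * 6)
maintenance = forward([0.0, 0.0, 1.0], [True] * 2 + [False] * 6)
```

Even in this toy, the longer capital campaign leaves a higher probability in the "lost" state than the maintenance campaign, mirroring the duration effect the abstract describes.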

Relevance:

100.00%

Publisher:

Abstract:

This article presents and evaluates Quantum Inspired models of Target Activation using Cued-Target Recall Memory Modelling over multiple sources of Free Association data. Two components were evaluated: whether Quantum Inspired models of Target Activation would provide a better framework than their classical psychological counterparts, and how robust these models are across the different sources of Free Association data. In previous work, a formal model of cued-target recall did not exist and as such Target Activation was unable to be assessed directly. Furthermore, the data source used was suspected of suffering from temporal and geographical bias. As a consequence, Target Activation was measured against cued-target recall data as an approximation of performance. Since then, a formal model of cued-target recall (PIER3) has been developed [10], with alternative sources of data also becoming available. This allowed us to directly model target activation in cued-target recall with human cued-target recall pairs and use multiple sources of Free Association data. Featural characteristics known to be important to Target Activation were measured for each of the data sources to identify any major differences that may explain variations in performance for each of the models. Each of the activation models was used in the PIER3 memory model for each of the data sources and was benchmarked against cued-target recall pairs provided by the University of South Florida (USF). Two methods were used to evaluate performance. The first involved measuring the divergence between the sets of results using the Kullback-Leibler (KL) divergence, with the second utilizing a previous statistical analysis of the errors [9]. Of the three sources of data, two were sourced from human subjects, being the USF Free Association Norms and the University of Leuven (UL) Free Association Networks.
The third was sourced from a new method put forward by Galea and Bruza (2015), in which pseudo Free Association Networks (Corpus Based Association Networks - CANs) are built using co-occurrence statistics on large text corpora. It was found that the Quantum Inspired models of Target Activation not only outperformed the classical psychological model but were also more robust across a variety of data sources.
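The first evaluation method named above, the Kullback-Leibler divergence between a model's recall distribution and the benchmark, is straightforward to compute; a small epsilon guards against zero probabilities (the smoothing choice is ours, not necessarily the article's):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) between two discrete
    distributions; eps guards against zeros before renormalising."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p = p / p.sum()
    q = q / q.sum()
    return float((p * np.log(p / q)).sum())
```

D(P || Q) is zero only when the two distributions coincide, so a smaller value means the model's predicted recall distribution sits closer to the benchmark pairs.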

Relevance:

100.00%

Publisher:

Abstract:

The firm is faced with a decision concerning the nature of intra-organizational exchange relationships with internal human resources and the nature of inter-organizational exchange relationships with market firms. In both situations, the firm can develop an exchange that ranges from a discrete exchange to a relational exchange. Transaction Cost Economics (TCE) and the Resource Dependency View (RDV) represent alternative efficiency-based explanations of the nature of the exchange relationship. The aim of the paper is to test these two theories in respect of air conditioning maintenance in retail centres. Multiple sources of information are generated from case studies of Australian retail centres to test these theories in respect of internalized operations management (concerning strategic aspects of air conditioning maintenance) and externalized planned routine air conditioning maintenance. The analysis of the data centres on pattern matching. It is concluded that the data supports TCE, on the basis of a development in TCE's contractual schema. Further research is suggested towards taking a pluralistic stance and developing a combined efficiency and power hypothesis, upon which Williamson has speculated. For practice, the conclusions also offer a timely cautionary note concerning the adoption of one approach in all exchange relationships.

Relevance:

100.00%

Publisher:

Abstract:

Participatory evaluation and participatory action research (PAR) are increasingly used in community-based programs and initiatives and there is a growing acknowledgement of their value. These methodologies focus more on knowledge generated and constructed through lived experience than through social science (Vanderplaat 1995). The scientific ideal of objectivity is usually rejected in favour of a holistic approach that acknowledges and takes into account the diverse perspectives, values and interpretations of participants and evaluation professionals. However, evaluation rigour need not be lost in this approach. Increasing the rigour and trustworthiness of participatory evaluations and PAR increases the likelihood that results are seen as credible and are used to continually improve programs and policies.

Drawing on learnings and critical reflections about the use of feminist and participatory forms of evaluation and PAR over a 10-year period, significant sources of rigour identified include:

• participation and communication methods that develop relations of mutual trust and open communication
• using multiple theories and methodologies, multiple sources of data, and multiple methods of data collection
• ongoing meta-evaluation and critical reflection
• critically assessing the intended and unintended impacts of evaluations, using relevant theoretical models
• using rigorous data analysis and reporting processes
• participant reviews of evaluation case studies, impact assessments and reports.