968 results for neural modeling
Abstract:
Gaussian mixture models (GMMs) have become an established means of modeling feature distributions in speaker recognition systems. It is useful for experimentation and practical implementation purposes to develop and test these models efficiently, particularly when computational resources are limited. A method of combining vector quantization (VQ) with single multi-dimensional Gaussians is proposed to rapidly generate a robust approximation to the Gaussian mixture model. A fast method of testing these systems is also proposed and implemented. Results on the NIST 1996 Speaker Recognition Database suggest verification performance comparable to, and in some cases better than, the traditional GMM-based analysis scheme. In addition, previous research on the task of speaker identification indicated similar system performance between the VQ-Gaussian-based technique and GMMs.
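The fast training idea described above can be sketched as k-means vector quantization followed by fitting one diagonal Gaussian per codebook cell. The abstract gives no implementation details, so the function names, cluster count, diagonal-covariance choice and likelihood scoring below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def vq_gaussian_model(frames, n_clusters=4, n_iter=20, seed=0):
    """Approximate a GMM by VQ (k-means) plus one diagonal Gaussian
    per cluster -- a sketch of the rapid model-building idea."""
    rng = np.random.default_rng(seed)
    centroids = frames[rng.choice(len(frames), n_clusters, replace=False)]
    for _ in range(n_iter):  # standard Lloyd iterations
        d = ((frames[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for k in range(n_clusters):
            if (labels == k).any():
                centroids[k] = frames[labels == k].mean(0)
    model = []
    for k in range(n_clusters):
        cluster = frames[labels == k]
        if len(cluster) == 0:  # drop empty cells
            continue
        model.append((cluster.mean(0),
                      cluster.var(0) + 1e-6,           # variance floor
                      len(cluster) / len(frames)))      # mixture weight
    means, variances, weights = (np.array(x) for x in zip(*model))
    return means, variances, weights

def log_likelihood(frames, means, variances, weights):
    """Average per-frame log-likelihood under the diagonal mixture."""
    diff = frames[:, None, :] - means[None, :, :]
    log_comp = (-0.5 * (diff ** 2 / variances).sum(-1)
                - 0.5 * np.log(2 * np.pi * variances).sum(-1)
                + np.log(weights))
    return np.logaddexp.reduce(log_comp, axis=1).mean()
```

In a verification setting, a test utterance would be scored under the claimed speaker's model and accepted if the likelihood (relative to a background model) exceeds a threshold.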
Abstract:
Although many incidents of fake online consumer reviews have been reported, very few studies have been conducted to date to examine the trustworthiness of online consumer reviews. One reason is the lack of an effective computational method for separating untruthful reviews (i.e., spam) from legitimate ones (i.e., ham), given that prominent spam features are often missing in online reviews. The main contribution of our research is the development of a novel review spam detection method underpinned by an unsupervised inferential language modeling framework. Another contribution of this work is the development of a high-order concept association mining method which provides the essential term association knowledge to bootstrap the performance of untruthful review detection. Our experimental results confirm that the proposed inferential language model, equipped with high-order concept association knowledge, is effective in untruthful review detection when compared with other baseline methods.
Abstract:
Delays are an important feature in temporal models of genetic regulation due to slow biochemical processes, such as transcription and translation. In this paper, we show how to model intrinsic noise effects in a delayed setting by either using a delay stochastic simulation algorithm (DSSA) or, for larger and more complex systems, a generalized binomial τ-leap method (Bτ-DSSA). As a particular application, we apply these ideas to modeling somite segmentation in zebrafish across a number of cells, in which two linked oscillatory genes (her1 and her7) are synchronized via Notch signaling between the cells.
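The DSSA idea, i.e. Gillespie's exact algorithm extended with a queue of delayed product completions, can be illustrated on a toy birth-death system. The reaction set, rate constants and delay below are invented for illustration and are not the her1/her7 model from the paper:

```python
import heapq
import numpy as np

def delay_ssa(k_init=1.0, k_deg=0.1, delay=5.0, t_end=100.0, seed=0):
    """Minimal DSSA sketch: a birth reaction whose product appears
    `delay` time units after initiation (modeling slow transcription/
    translation), plus first-order degradation."""
    rng = np.random.default_rng(seed)
    t, protein = 0.0, 0
    pending = []  # completion times of initiated, unfinished events
    history = [(t, protein)]
    while t < t_end:
        rates = [k_init, k_deg * protein]
        total = sum(rates)
        dt = rng.exponential(1.0 / total) if total > 0 else np.inf
        if pending and pending[0] <= t + dt:
            # a delayed product completes first; by memorylessness of the
            # exponential, the discarded waiting time can simply be redrawn
            t = heapq.heappop(pending)
            protein += 1
        else:
            t += dt
            if t >= t_end:
                break
            if rng.random() * total < rates[0]:
                heapq.heappush(pending, t + delay)  # schedule completion
            else:
                protein -= 1
        history.append((t, protein))
    return history
```

The pending-event heap is what distinguishes this from a plain SSA: initiation and completion of a delayed reaction are separate events, which is how oscillations driven by transcriptional delay can arise.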
Abstract:
Conceptual modeling continues to be an important means for graphically capturing the requirements of an information system. Observations of modeling practice suggest that modelers often use multiple modeling grammars in combination to articulate various aspects of real-world domains. We extend an ontological theory of representation to suggest why and how users employ multiple conceptual modeling grammars in combination. We provide an empirical test of the extended theory using survey data and structured interviews about the use of traditional and structured analysis grammars within an automated tool environment. We find that users of the analyzed tool combine grammars to overcome the ontological incompleteness that exists in each grammar. Users also selected their starting grammar only from a predicted subset of grammars. The qualitative data provide insights into why some deficiencies manifest in practice differently than predicted.
Abstract:
Trees, shrubs and other vegetation are of continued importance to the environment and our daily life. They provide shade around our roads and houses, offer a habitat for birds and wildlife, and absorb air pollutants. However, vegetation touching power lines is a risk to public safety and the environment, and one of the main causes of power supply problems. Vegetation management, which includes tree trimming and vegetation control, is a significant cost component of the maintenance of electrical infrastructure. For example, Ergon Energy, Australia’s largest energy distributor by geographic footprint, currently spends over $80 million a year inspecting and managing vegetation that encroaches on power line assets. Currently, most vegetation management programs for distribution systems are calendar-based ground patrols. However, calendar-based inspection by linesmen is labour-intensive, time-consuming and expensive. It also results in some zones being trimmed more frequently than needed and others not cut often enough. Moreover, it is seldom practicable to measure all the plants around power line corridors by field methods. Remote sensing data captured from airborne sensors has great potential to assist vegetation management in power line corridors. This thesis presents a comprehensive study on using spiking neural networks in a specific image analysis application: power line corridor monitoring. Theoretically, the thesis focuses on a biologically inspired spiking cortical model: the pulse coupled neural network (PCNN). The original PCNN model was simplified in order to better analyze the pulse dynamics and control the performance. New and effective algorithms were developed based on the proposed spiking cortical model for object detection, image segmentation and invariant feature extraction. The developed algorithms were evaluated in a number of experiments using real image data collected from our flight trials.
The experimental results demonstrated the effectiveness and advantages of spiking neural networks in image processing tasks. Operationally, the knowledge gained from this research project offers a good reference to our industry partner (i.e., Ergon Energy) and other energy utilities that want to improve their vegetation management activities. The novel approaches described in this thesis show the potential of using cutting-edge sensor technologies and intelligent computing techniques to improve power line corridor monitoring. The lessons learnt from this project are also expected to increase the confidence of energy companies to move from a traditional vegetation management strategy to a more automated, accurate and cost-effective solution using aerial remote sensing techniques.
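In the PCNN literature, a simplified spiking cortical model of the kind this thesis builds on is a small set of coupled per-pixel update equations: internal activity fed by the stimulus and by linking from firing neighbours, a binary pulse output, and a dynamic threshold that jumps after each pulse. The following is a generic sketch of such a model applied to segmentation; the parameters and the 3x3 linking kernel are illustrative, not the thesis's exact formulation:

```python
import numpy as np

def scm_segment(image, steps=20, f=0.9, g=0.8, h=20.0, beta=0.3):
    """Sketch of a simplified spiking cortical model (PCNN variant).
    A neuron fires when its internal activity exceeds a decaying
    threshold; firing neighbours pull similar pixels into the same
    pulse.  Returns, per pixel, the iteration of its first spike."""
    S = image.astype(float) / image.max()  # normalized stimulus
    U = np.zeros_like(S)                   # internal activity
    E = np.ones_like(S)                    # dynamic threshold
    Y = np.zeros_like(S)                   # binary pulse output
    first_fire = np.zeros(S.shape, dtype=int)
    for n in range(1, steps + 1):
        # 3x3 neighbourhood linking via shifted copies of Y (wraps at edges)
        link = sum(np.roll(np.roll(Y, dy, 0), dx, 1)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)) - Y
        U = f * U + S * (1 + beta * link)       # leaky integration + linking
        Y = (U > E).astype(float)               # spike where U beats threshold
        E = g * E + h * Y                       # threshold decays, jumps on spike
        first_fire[(first_fire == 0) & (Y > 0)] = n
    return first_fire
```

Pixels of similar intensity tend to fire on the same iteration, so the firing-time map itself acts as a segmentation: thresholding `first_fire` separates bright objects (early spikes) from the background (late spikes).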
Abstract:
Experimental action potential (AP) recordings in isolated ventricular myocytes display significant temporal beat-to-beat variability in morphology and duration. Furthermore, significant cell-to-cell differences in AP also exist, even for isolated cells originating from the same region of the same heart. However, current mathematical models of the ventricular AP fail to replicate the temporal and cell-to-cell variability in AP observed experimentally. In this study, we propose a novel mathematical framework for the development of phenomenological AP models capable of capturing cell-to-cell and temporal variability in cardiac APs. A novel stochastic phenomenological model of the AP is developed, based on the deterministic Bueno-Orovio/Fenton model. Experimental recordings of the AP are fit to the model to produce AP models of individual cells from the apex and the base of the guinea-pig ventricles. Our results show that the phenomenological model is able to capture the considerable differences in AP recorded from isolated cells originating from the two locations. We demonstrate the closeness of fit to the available experimental data that may be achieved using a phenomenological model, and also demonstrate the ability of the stochastic form of the model to capture the observed beat-to-beat variability in action potential duration.
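The general mechanism, i.e. a deterministic phenomenological model made stochastic so that action potential duration (APD) varies from beat to beat, can be caricatured with a one-variable toy model integrated by Euler-Maruyama. This is emphatically not the Bueno-Orovio/Fenton model; every equation and parameter below is an invented stand-in used only to show how noise in the dynamics produces beat-to-beat APD variability:

```python
import numpy as np

def simulate_apds(n_beats=50, tau=100.0, sigma=0.005, dt=1.0, seed=0):
    """Toy stochastic AP sketch.  Each beat, the stimulus resets the
    membrane variable v to its peak (v = 1); v then repolarizes as a
    noisy exponential decay integrated with Euler-Maruyama.  APD is
    measured as the time for v to fall below 10% of peak."""
    rng = np.random.default_rng(seed)
    apds = []
    for _ in range(n_beats):
        v, t = 1.0, 0.0
        while v > 0.1 and t < 2000.0:  # APD90-style threshold, safety cap
            # drift: exponential repolarization; diffusion: additive noise
            v += -v / tau * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        apds.append(t)
    return np.array(apds)
```

Without noise (`sigma = 0`) every beat would yield the identical APD of roughly `tau * ln(10)` ms; with noise, the distribution of APDs across beats acquires a spread, which is the qualitative behaviour the stochastic model in the abstract is built to capture.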
Abstract:
Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, few attempts have been made to explore structural damage with frequency response functions (FRFs). This paper illustrates the damage identification and condition assessment of a beam structure using a new FRF-based damage index and Artificial Neural Networks (ANNs). In practice, using all available FRF data as input to artificial neural networks makes training and convergence impossible. Therefore a data reduction technique, Principal Component Analysis (PCA), is introduced into the algorithm. In the proposed procedure, a large set of FRFs is divided into sub-sets in order to find the damage indices for different frequency points of different damage scenarios. The basic idea of this method is to establish features of the damaged structure using FRFs from different measurement points of different sub-sets of the intact structure. Then, using these features, damage indices for different damage cases of the structure are identified after reconstructing the available FRF data using PCA. The obtained damage indices, corresponding to different damage locations and severities, are introduced as input variables to the developed artificial neural networks. Finally, the effectiveness of the proposed method is illustrated and validated using the finite element model of a beam structure. The results show that the PCA-based damage index is suitable and effective for structural damage detection and condition assessment of building structures.
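The PCA reduction step, i.e. compressing long FRF vectors into a few principal-component scores before they reach the ANN, can be sketched with a plain SVD. The matrix layout (measurements by frequency points) and component count below are assumptions for illustration, not the paper's exact configuration:

```python
import numpy as np

def pca_reduce(frf_matrix, n_components=5):
    """Reduce a (measurements x frequency points) matrix of FRF
    magnitudes to a few principal-component scores -- the kind of
    compressed feature vector fed to an ANN in place of raw FRFs.
    Also returns the fraction of variance the kept components explain."""
    mean = frf_matrix.mean(axis=0)
    centered = frf_matrix - mean
    # SVD of the centered data; rows project onto right singular vectors
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ Vt[:n_components].T
    explained = (s[:n_components] ** 2).sum() / (s ** 2).sum()
    return scores, explained
```

Because measured FRFs are highly correlated across frequency points, a handful of components typically captures most of the variance, which is what makes ANN training feasible where raw FRF input would not converge.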
Abstract:
In an age where digital innovation knows no boundaries, research in the area of brain-computer interfaces and other neural interface devices goes where none has gone before. The possibilities are endless and, as dreams become reality, the implications of these remarkable developments should be considered. Some of these new devices have been created to correct or minimise the effects of disease or injury, so the paper discusses some of the current research and development in the area, including neuroprosthetics. To assist researchers and academics in identifying some of the legal and ethical issues that might arise as a result of research and development of neural interface devices, using both non-invasive techniques and invasive procedures, the paper discusses a number of recent observations of authors in the field. The issue of enhancing human attributes by incorporating these new devices is also considered. Such enhancement may be regarded as freeing the mind from the constraints of the body, but there are legal and moral issues that researchers and academics would be well advised to contemplate as these new devices are developed and used. While different fact situations surround each of these new devices, and those that are yet to come, consideration of the legal and ethical landscape may assist researchers and academics in dealing effectively with matters that arise in these times of transition. Lawyers could seek to facilitate the resolution of the legal disputes that arise in this area of research and development within the existing judicial and legislative frameworks. Whether these frameworks will suffice, or will need to change in order to enable effective resolution, is a broader question to be explored.
Abstract:
This paper argues for a renewed focus on statistical reasoning in the elementary school years, with opportunities for children to engage in data modeling. Data modeling involves investigations of meaningful phenomena, deciding what is worthy of attention, and then progressing to organizing, structuring, visualizing, and representing data. Reported here are some findings from a two-part activity (Baxter Brown’s Picnic and Planning a Picnic) implemented at the end of the second year of a current three-year longitudinal study (grade levels 1-3). Planning a Picnic was also implemented in a grade 7 class to provide an opportunity for the different age groups to share their products. Addressed here are the grade 2 children’s predictions for missing data in Baxter Brown’s Picnic, the questions posed and representations created by both grade levels in Planning a Picnic, and the metarepresentational competence displayed in the grade levels’ sharing of their products for Planning a Picnic.