Abstract:
This paper investigates the application of the Hilbert spectrum (HS), a recent tool for the analysis of nonlinear and nonstationary time series, to the study of electromyographic (EMG) signals. The HS allows the energy of a signal to be visualized in a joint time-frequency representation. In this work we illustrate the use of the HS in two distinct applications. The first is feature extraction from EMG signals. Our results showed that the instantaneous mean frequency (IMNF) estimated from the HS is a feature relevant to clinical practice: we found that the median of the IMNF decreases as the force level of the muscle contraction increases. In the second application we investigated the use of the HS for the detection of motor unit action potentials (MUAPs). MUAP detection is a basic step in EMG decomposition tools, which provide relevant information about the neuromuscular system through the morphology and firing times of MUAPs. We compared, visually, how MUAP activity is perceived on the HS with the visualizations provided by traditional time-frequency distributions (e.g. the scalogram, spectrogram and Wigner-Ville distribution). Furthermore, an alternative visualization to the HS for the detection of MUAPs is proposed and compared with a similar approach based on the continuous wavelet transform (CWT). Our results showed that both the proposed technique and the CWT allowed a clear visualization of MUAP activity on the time-frequency distributions, whereas results obtained with the HS were the most difficult to interpret, as they were strongly affected by spurious energy activity.
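For readers who want to experiment, here is a minimal sketch of the IMNF computation the abstract describes, assuming the intrinsic mode functions (IMFs) have already been extracted by empirical mode decomposition (the EMD step itself is not shown, and the function name is ours, not the authors'):

```python
import numpy as np
from scipy.signal import hilbert

def imnf_from_imfs(imfs, fs):
    """Instantaneous mean frequency (IMNF) from a Hilbert spectrum.

    imfs : array (n_imfs, n_samples) of intrinsic mode functions,
           assumed precomputed by empirical mode decomposition.
    fs   : sampling rate in Hz.
    Returns IMNF(t): the energy-weighted mean of the IMFs'
    instantaneous frequencies at each sample.
    """
    analytic = hilbert(imfs, axis=1)
    energy = np.abs(analytic) ** 2                   # instantaneous energy
    phase = np.unwrap(np.angle(analytic), axis=1)
    inst_freq = np.diff(phase, axis=1) * fs / (2 * np.pi)
    inst_freq = np.clip(inst_freq, 0.0, None)        # drop negative estimates
    e = energy[:, 1:]                                # align with diff() output
    return (e * inst_freq).sum(axis=0) / (e.sum(axis=0) + 1e-12)
```

Under the abstract's finding, the median of this IMNF trace would fall as contraction force rises.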
Abstract:
Increasingly, the microbiological scientific community is relying on molecular biology to define the complexity of the gut flora and to distinguish one organism from the next. This is particularly pertinent in the field of probiotics, and probiotic therapy, where distinguishing probiotics from the commensal flora is often warranted. Current techniques, including genetic fingerprinting, gene sequencing, oligonucleotide probes and specific primer selection, discriminate closely related bacteria with varying degrees of success. Additional molecular methods being employed to determine the constituents of complex microbiota in this area of research are community analysis, denaturing gradient gel electrophoresis (DGGE)/temperature gradient gel electrophoresis (TGGE), fluorescent in situ hybridisation (FISH) and probe grids. Certain approaches enable specific aetiological agents to be monitored, whereas others allow the effects of dietary intervention on bacterial populations to be studied. Other approaches demonstrate diversity, but may not always enable quantification of the population. At the heart of current molecular methods is sequence information gathered from culturable organisms. However, the diversity and novelty identified when applying these methods to the gut microflora demonstrate how little is known about this ecosystem. Of greater concern is the inherent bias associated with some molecular methods. As we understand more of the complexity and dynamics of this diverse microbiota, we will be in a position to develop more robust molecular-based technologies to examine it. In addition to identification of the microbiota and discrimination of probiotic strains from commensal organisms, the future of molecular biology in the field of probiotics and the gut flora will, no doubt, stretch to investigations of the functionality and activity of the microflora, and/or of specific fractions. The quest will be to demonstrate the roles of probiotic strains in vivo, and not simply their presence or absence.
Abstract:
Genetic algorithms (GAs) have been introduced into site layout planning, as reported in a number of studies. In these studies, the objective functions were defined so as to employ GAs in searching for the optimal site layout. However, few studies have investigated the actual closeness of the relationships between site facilities, even though it is these relationships that ultimately govern the site layout. This study determined that the underlying factors of site layout planning for medium-size projects include work flow, personnel flow, safety and environment, and personal preferences. By finding the weightings on these factors and the corresponding closeness indices between each pair of facilities, a closeness relationship was deduced. Two contemporary mathematical approaches, fuzzy logic theory and an entropy measure, were adopted in deriving these results, in order to minimize the uncertainty and vagueness of the collected data and improve the quality of the information. A GA was then applied to search for the optimal site layout in a medium-size government project using the GeneHunter software, with an objective function that minimized the total travel distance. An optimal layout was obtained within a short time, which suggests that the application of GAs to site layout planning is highly promising and efficient.
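A toy illustration of this kind of GA search, written from scratch rather than with the GeneHunter software the study used; the facility coordinates and trip-frequency matrix are invented, and crossover is omitted (mutation-only) for brevity:

```python
import random

# Invented problem data: 5 facilities to assign to 5 candidate locations.
N = 5
locations = [(0, 0), (0, 30), (40, 0), (40, 30), (20, 15)]   # metres
trips = [[0, 8, 2, 1, 4],      # trips[i][j]: travel frequency between
         [8, 0, 3, 2, 1],      # facilities i and j (a closeness weight)
         [2, 3, 0, 6, 2],
         [1, 2, 6, 0, 5],
         [4, 1, 2, 5, 0]]

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def total_travel(assign):
    """Objective: total weighted travel distance for one layout,
    where assign[i] is the location index of facility i."""
    return sum(trips[i][j] * dist(locations[assign[i]], locations[assign[j]])
               for i in range(N) for j in range(i + 1, N))

def ga(pop_size=50, gens=200, mut=0.2):
    pop = [random.sample(range(N), N) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=total_travel)            # elitist selection
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            child = random.choice(survivors)[:]
            if random.random() < mut:         # swap mutation keeps the
                i, j = random.sample(range(N), 2)  # layout a permutation
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=total_travel)

best = ga()
print(best, total_travel(best))
```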
Abstract:
Information modelling is a topic that has been researched a great deal, but many questions around it remain unsolved. An information model is essential in the design of a database, which is the core of an information system. Currently, most databases deal only with information that represents facts, or asserted information. The ability to capture the semantic aspects of information has to be improved, and other types of information, such as temporal and intentional information, should also be considered. Semantic Analysis, a method of information modelling, offers a way to handle these various aspects of information. It employs domain knowledge and communication acts as sources for information modelling, and it lends itself to a uniform structure in which semantic, temporal and intentional information can all be captured, laying a sound foundation for building a semantic temporal database.
Abstract:
Emergency vehicles use high-amplitude sirens to warn pedestrians and other road users of their presence. Unfortunately, the siren noise enters the vehicle and corrupts the intelligibility of two-way radio voice communications from the emergency vehicle to a control room. Often the siren has to be turned off so that the control room can hear what is being said, which endangers people's lives. A digital signal processing (DSP) based system for the cancellation of siren noise embedded within speech is presented. The system has been tested with the least mean square (LMS), normalised least mean square (NLMS) and affine projection algorithm (APA) adaptive filters, using recordings of three common types of siren (two-tone, wail and yelp) from actual test vehicles. It was found that the APA with a projection order of 2 gives improved cancellation over the LMS and NLMS with only a moderate increase in algorithm complexity and code size. This siren noise cancellation system based on the APA therefore improves on the cancellation achieved by previous systems. Removing the siren noise improves the response time of the emergency vehicle, so the system can contribute to saving lives. It also allows voice communication to take place while the siren is on, so the vehicle poses less risk when moving at high speed in heavy traffic.
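A minimal sketch of affine projection adaptive noise cancellation in the spirit of the system described above; this is a from-scratch illustration, not the authors' DSP implementation, and the filter length, step size and signal names are assumptions:

```python
import numpy as np

def apa_cancel(noise_ref, primary, order=8, proj=2, mu=0.5, delta=1e-4):
    """Affine projection algorithm (APA) noise canceller.

    noise_ref : reference siren signal picked up near the source
    primary   : speech corrupted by the siren noise
    proj      : projection order (proj=1 reduces the APA to NLMS)
    Returns the error signal, i.e. the cleaned speech estimate.
    """
    n = len(primary)
    w = np.zeros(order)                  # adaptive filter weights
    e = np.zeros(n)                      # output: cleaned speech
    for k in range(order + proj - 1, n):
        # Matrix whose columns are the last `proj` input vectors.
        X = np.column_stack(
            [noise_ref[k - j - order + 1 : k - j + 1][::-1] for j in range(proj)]
        )
        d = primary[k - proj + 1 : k + 1][::-1]        # recent desired samples
        err = d - X.T @ w                              # a-priori error vector
        # APA update: project onto the span of the last `proj` inputs.
        w += mu * X @ np.linalg.solve(X.T @ X + delta * np.eye(proj), err)
        e[k] = primary[k] - np.dot(w, noise_ref[k - order + 1 : k + 1][::-1])
    return e

# Crude demonstration with a synthetic "wail" and noise-like speech stand-in.
fs = 8000
t = np.arange(fs * 2) / fs
siren = np.sin(2 * np.pi * 700 * t + 3 * np.sin(2 * np.pi * 0.8 * t))
speech = 0.3 * np.random.randn(len(t))
cleaned = apa_cancel(noise_ref=siren, primary=speech + siren)
```

With proj=1 the same loop behaves as NLMS, which is one way to see why the APA's extra cost over NLMS is only moderate, as the abstract notes.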
Abstract:
This paper suggests a method for identifying individuals who are most suited to using virtual reality (VR) systems. The aim is to help an individual or an employer decide where that individual's skills and abilities would be best deployed. By considering a potential user's competence and temperament, a graphical representation is introduced that can then be used to delineate, if crudely, high-aptitude participants from those with lesser capabilities. By introducing standard tests for competence and a standard classifier for temperament, and by further weighting each measure with respect to the currently available technology and the application, a detailed representation of the effectiveness of different users is developed.
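A minimal sketch of the kind of weighted scoring this describes; the weights, scales and attribute names are hypothetical, and the paper's actual tests and classifier are not reproduced:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    competence: float   # score from a standard competence test, 0..1
    temperament: float  # score from a standard temperament classifier, 0..1

def vr_aptitude(c: Candidate, w_competence=0.6, w_temperament=0.4):
    """Combine the two measures into a single aptitude score.

    The weights would be tuned to the available VR technology and to
    the target application, as the paper suggests.
    """
    return w_competence * c.competence + w_temperament * c.temperament

print(vr_aptitude(Candidate(competence=0.8, temperament=0.7)))  # -> 0.76
```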
Abstract:
A statistical technique for fault analysis in industrial printing is reported. The method deals specifically with binary data, in which the results of the production process fall into two categories: rejected or accepted. The method, logistic regression, is capable of predicting future fault occurrences from the analysis of current measurements from machine-part sensors. Individual analysis of each type of fault can determine which parts of the plant have a significant influence on the occurrence of that fault; it is also possible to infer which measurable process parameters have no significant influence on the generation of these faults. Information derived from the analysis can help the operator interpret the current state of the plant, so that appropriate actions can be taken to prevent potential faults from occurring. The algorithm is being implemented as part of an applied self-learning expert system.
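A minimal sketch of fault prediction with logistic regression, assuming a matrix of sensor measurements and one binary accept/reject label per production run; the data here are synthetic and the sensor names are hypothetical:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))      # e.g. roller speed, ink flow, tension, temp
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 1).astype(int)

model = LogisticRegression().fit(X, y)

# Coefficients near zero suggest a sensor has little influence on the
# fault, mirroring the paper's use of the model to rule parameters out.
print(model.coef_)
print(model.predict_proba(X[:1]))  # fault probability for a new run
```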
Abstract:
The aim of this paper is to critically examine the application of development appraisal to viability assessment in the planning system. This evaluation covers development appraisal models in general and also their use in particular applications associated with estimating planning obligation capacity. The paper is organised into four themes:
· The context and conceptual basis for development viability appraisal
· A review of development viability appraisal methods
· A discussion of selected key inputs into a development viability appraisal
· A discussion of the applications of development viability appraisals in the planning system
It is assumed that readers are familiar with the basic models and information needs of development viability appraisal, rather than being at the cutting edge of practice and/or academe.
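For orientation, a minimal sketch of the textbook residual appraisal that underlies most development viability models; the figures are illustrative only and none of them come from the paper:

```python
def residual_land_value(gdv, build_cost, fees_pct=0.10,
                        finance_pct=0.06, profit_pct=0.20):
    """Residual appraisal: what a developer can pay for land after all
    other claims on the gross development value (GDV) are met."""
    fees = build_cost * fees_pct
    finance = (build_cost + fees) * finance_pct
    profit = gdv * profit_pct
    return gdv - build_cost - fees - finance - profit

# Planning obligations reduce the residual, which is how appraisals
# are used to estimate planning obligation capacity:
base = residual_land_value(gdv=10_000_000, build_cost=6_000_000)
with_obligation = base - 500_000
print(base, with_obligation)
```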
Abstract:
Shiga toxin-producing Escherichia coli (STEC) strains are foodborne pathogens whose ability to produce Shiga toxin (Stx) is due to the integration of an Stx-encoding lambdoid bacteriophage (Stx phage). Circulating, infective Stx phages are very difficult to isolate, purify and propagate, such that there is no information on their genetic composition and properties. Here we describe a novel approach that exploits the phages' ability to infect their host and form a lysogen, enabling purification of Stx phages through a series of sequential lysogen isolation and induction steps. A total of 15 Stx phages were rigorously purified from water samples in this way, classified by TEM and genotyped using a PCR-based multi-locus characterisation system. Each phage possessed only one variant of each target gene type, confirming its purity; 9 of the 15 phages possessed a short tail-spike gene and were identified by TEM as Podoviridae. The remaining 6 phages possessed long tails, four of which appeared to be contractile in nature (Myoviridae) and two of which were morphologically very similar to bacteriophage lambda (Siphoviridae).
Abstract:
Platelets in the circulation are triggered by vascular damage to activate, aggregate and form a thrombus that prevents excessive blood loss. Platelet activation is stringently regulated by intracellular signalling cascades which, when activated inappropriately, lead to myocardial infarction and stroke. Strategies to address platelet dysfunction have included proteomics approaches, which have led to the discovery of a number of novel regulatory proteins of potential therapeutic value. Global analysis of platelet proteomes may enhance the outcome of these studies by arranging this information in a contextual manner that recapitulates established signalling complexes and predicts novel regulatory processes. Platelet signalling networks have already begun to be exploited through the interrogation of protein datasets with in silico methodologies that locate functionally feasible protein clusters for subsequent biochemical validation. Characterization of these biological systems through analysis of the spatial and temporal organization of their component proteins is developing alongside advances in the proteomics field. This focused review highlights advances in platelet proteomics data mining approaches that complement the emerging systems biology field. We also highlight nucleated cell types as key examples that can inform platelet research. Therapeutic translation of these modern approaches to understanding platelet regulatory mechanisms will enable the development of novel anti-thrombotic strategies.
Abstract:
Three decades of ongoing executive concern over how to achieve successful alignment between business and information technology show the complexity of such a vital process. Most of the challenges of alignment are related to knowledge and organisational change, and several researchers have introduced mechanisms to address some of these challenges. However, these mechanisms pay little attention to multi-level effects, which results in a limited understanding of alignment across levels. We therefore reviewed these challenges from a multi-level learning perspective and found that business-IT alignment is related to balancing exploitation and exploration strategies with the intellectual content of the individual, group and organisational levels.
Abstract:
The problem of symmetric stability is examined within the context of the direct Liapunov method. The sufficient conditions for stability derived by Fjørtoft are shown to imply finite-amplitude, normed stability. This finite-amplitude stability theorem is then used to obtain rigorous upper bounds on the saturation amplitude of disturbances to symmetrically unstable flows. By employing a virial functional, the necessary conditions for instability implied by the stability theorem are shown to be, in fact, sufficient for instability. The results of Ooyama are improved upon insofar as a tight two-sided (upper and lower) estimate of the growth rate of (modal or nonmodal) symmetric instabilities is obtained. The case of moist adiabatic systems is also considered.
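As background for readers outside this literature (this is the standard textbook criterion, not a result claimed by the abstract): for an inviscid, Boussinesq, zonally symmetric flow on an f-plane with velocity u and buoyancy b, symmetric stability is conventionally tied to the sign of the Ertel potential vorticity:

```latex
% Standard symmetric-stability criterion (textbook background, not the
% paper's new result): the flow is symmetrically stable when fq > 0
% everywhere, with q the Ertel potential vorticity; symmetric
% instability is possible wherever fq < 0.
\[
  f\,q \;=\; f\,\bigl(f\hat{\mathbf{z}} + \nabla\times\mathbf{u}\bigr)\cdot\nabla b \;>\; 0 .
\]
```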