962 results for Series Summation Method


Relevance:

30.00%

Publisher:

Abstract:

Metal cation toxicity to basidiomycete fungi is poorly understood, despite its well-known importance in terrestrial ecosystems. Moreover, there is no reported methodology for the routine evaluation of metal toxicity to basidiomycetes. In the present study, we describe the development of a procedure to assess the acute toxicity of metal cations (Na(+), K(+), Li(+), Ca(2+), Mg(2+), Co(2+), Zn(2+), Ni(2+), Mn(2+), Cd(2+), and Cu(2+)) to the bioluminescent basidiomycete fungus Gerronema viridilucens. The method is based on the decrease in the intensity of bioluminescence resulting from injuries sustained by the fungus mycelium exposed to either essential or nonessential metal toxicants. The assay described herein enables us to propose a metal toxicity series for Gerronema viridilucens based on median effective concentration (EC50) values obtained from bioluminescence intensity versus metal concentration data: Cd(2+) > Cu(2+) > Mn(2+) ≈ Ni(2+) ≈ Co(2+) > Zn(2+) > Mg(2+) > Li(+) > K(+) ≈ Na(+) > Ca(2+), and to shed some light on the mechanism of toxic action of metal cations to basidiomycete fungi. Environ. Toxicol. Chem. 2010;29:320-326. (C) 2009 SETAC
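As an illustration of how an EC50 value can be derived from dose-response data of this kind, the following is a minimal sketch that fits a four-parameter logistic (Hill) curve with SciPy; the function names and the synthetic concentration/luminescence values are illustrative and are not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, slope):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** slope)

# Illustrative data: metal concentration (mM) vs. relative bioluminescence (%)
conc = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
lum = np.array([98.0, 90.0, 55.0, 15.0, 3.0])

# Fit the curve; p0 gives rough starting guesses for the four parameters
params, _ = curve_fit(hill, conc, lum, p0=[0.0, 100.0, 1.0, 1.0])
print(f"Estimated EC50: {params[2]:.2f} mM")
```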

Relevance:

30.00%

Publisher:

Abstract:

This work aims at combining the postulates of Chaos theory with the classification and predictive capability of Artificial Neural Networks in the field of financial time series prediction. Chaos theory provides valuable qualitative and quantitative tools for deciding on the predictability of a chaotic system. Quantitative measurements based on Chaos theory are used to decide a priori whether a time series, or a portion of a time series, is predictable, while Chaos theory based qualitative tools are used to provide further observations and analysis on predictability in cases where the measurements give negative answers. Phase space reconstruction is achieved by time-delay embedding, resulting in multiple embedded vectors. The cognitive approach suggested is inspired by the ability of some chartists to predict the direction of an index by looking at the price time series. Thus, in this work, the calculation of the embedding dimension and the separation in Takens' embedding theorem for phase space reconstruction is not limited to False Nearest Neighbor, Differential Entropy, or any other specific method; rather, this work is interested in all embedding dimensions and separations, regarded as different ways of looking at a time series by different chartists based on their expectations. Prior to prediction, the embedded vectors of the phase space are classified with Fuzzy-ART; then, for each class, a back-propagation Neural Network is trained to predict the last element of each vector, with all previous elements of the vector used as features.
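For readers unfamiliar with time-delay embedding, the following is a minimal sketch of how a scalar series can be turned into embedded vectors for a given embedding dimension and delay (separation); the function name and example parameters are illustrative and not taken from the paper.

```python
import numpy as np

def delay_embed(series, dim, delay):
    """Build the matrix of time-delay embedded vectors from a 1-D series.

    Each row is one point of the reconstructed phase space:
    [x(t), x(t + delay), ..., x(t + (dim - 1) * delay)].
    """
    series = np.asarray(series)
    n_vectors = len(series) - (dim - 1) * delay
    if n_vectors <= 0:
        raise ValueError("series too short for this dimension/delay")
    return np.column_stack(
        [series[i * delay : i * delay + n_vectors] for i in range(dim)]
    )

# Example: embed a noisy sine wave with dimension 3 and delay 2
x = np.sin(np.linspace(0, 20, 200)) + 0.05 * np.random.randn(200)
vectors = delay_embed(x, dim=3, delay=2)
print(vectors.shape)  # (196, 3)
```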

Relevance:

30.00%

Publisher:

Abstract:

The subgradient optimization method is a simple and flexible iterative algorithm for linear programming. It is much simpler than Newton's method and can be applied to a wider variety of problems; it also converges when the objective function is non-differentiable. Since an efficient algorithm should not only produce a good solution but also require little computing time, a simple algorithm that yields high-quality solutions is preferred. In this study, a series of step size parameters in the subgradient equation is examined. Performance is compared for a general piecewise function and a specific p-median problem, and we examine how the quality of the solution changes under five forms of the step size parameter.
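As a minimal sketch of the kind of experiment described, the code below minimizes a non-differentiable convex function with a subgradient method under two common step size rules (constant and diminishing); the example objective and rule names are illustrative and not taken from the study.

```python
import numpy as np

def subgradient_descent(f, subgrad, x0, step_rule, n_iters=500):
    """Generic subgradient method; returns the best point found."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, n_iters + 1):
        g = subgrad(x)
        x = x - step_rule(k) * g
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Example: minimize the piecewise-linear function f(x) = |x1 - 1| + |x2 + 2|
f = lambda x: abs(x[0] - 1) + abs(x[1] + 2)
subgrad = lambda x: np.array([np.sign(x[0] - 1), np.sign(x[1] + 2)])

for name, rule in [("constant", lambda k: 0.05),
                   ("diminishing", lambda k: 1.0 / k)]:
    _, best = subgradient_descent(f, subgrad, x0=[5.0, 5.0], step_rule=rule)
    print(f"{name:12s} step size -> best objective {best:.4f}")
```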

Relevance:

30.00%

Publisher:

Abstract:

Recently, DTW (dynamic time warping) has been recognized as the most robust distance function for measuring the similarity between two time series, and this fact has spawned a flurry of research on the topic. Most indexing methods proposed for DTW are based on the R-tree structure. Because of the high dimensionality and the loose lower bounds for the time warping distance, the pruning power of these tree structures is quite weak, resulting in inefficient search. In this paper, we propose a dimensionality reduction method motivated by observations about the inherent character of each time series. A very compact index file is constructed. By scanning the index file, we can obtain a very small candidate set, so that the number of page accesses is dramatically reduced. We demonstrate the effectiveness of our approach on real and synthetic datasets.
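For reference, the following is a minimal dynamic-programming sketch of the DTW distance between two series; it is the textbook formulation, not the indexing or dimensionality reduction method proposed in the paper.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Example: two similar series that are slightly out of phase
x = np.sin(np.linspace(0, 6, 60))
y = np.sin(np.linspace(0.5, 6.5, 60))
print(f"DTW distance: {dtw_distance(x, y):.3f}")
```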

Relevance:

30.00%

Publisher:

Abstract:

Purpose. NaCl has proven to be an effective bitterness inhibitor, but the reason remains unclear. The purpose of this study was to examine the influence of a variety of cations and anions on the bitterness of selected oral pharmaceuticals and bitter taste stimuli: pseudoephedrine, ranitidine, acetaminophen, quinine, and urea.
Method. Bitterness was evaluated by human psychophysical taste testing using a whole-mouth exposure procedure.
Results. The cations (all associated with the acetate anion) inhibited bitterness when mixed with pharmaceutical solutions to varying degrees. The sodium cation significantly (P < 0.003) inhibited bitterness of the pharmaceuticals more than the other cations. The anions (all associated with the sodium cation) also inhibited bitterness to varying degrees. With the exception of salicylate, the glutamate and adenosine monophosphate anions significantly (P < 0.001) inhibited bitterness of the pharmaceuticals more than the other anions. Also, there were several specific inhibitory interactions between ammonium, sodium and salicylate and certain pharmaceuticals.
Conclusions. We conclude that sodium was the most successful cation and glutamate and AMP were the most successful anions at inhibiting bitterness. Structure forming and breaking properties of ions, as predicted by the Hofmeister series, and other physical-chemical ion properties failed to significantly predict bitterness inhibition.

Relevance:

30.00%

Publisher:

Abstract:

Drawing as a means of recording is a very common practice in junior primary science lessons. This is largely due to the availability of the necessary materials. Also, most young children have some degree of drawing skill and enjoy drawing activities. Since 1956 the science curriculum to be implemented in primary classrooms in Victoria has changed from one based largely on nature study (biological) to one that includes physical and technological aspects. Further, there have been changes in the teaching methodologies advocated for use in science lessons. A modified Interactive Teaching Approach was used for the studies. Drawing was the main means by which the children recorded information. The topic of 'shells' was used to enable collection of data about the children's enjoyment of the activity and satisfaction with their achievement. This study was replicated using the topic 'rocks'; again data were collected concerning satisfaction and enjoyment. During a series of lessons on 'snails', data were collected concerning the achievement of 'process' and 'objective' purposes that teachers might have in mind when setting a drawing activity. In addition to providing data about purposes, the study stimulated some questions regarding the techniques the children had used in their drawings. Accordingly, data concerning the children's use of graphic techniques were collected during a series of lessons on 'oils'. The data collected and analysed in the various studies highlighted the value of drawing in junior primary school science lessons and validated strategies developed by the author to help teachers and children use drawing effectively in science activities.

Relevance:

30.00%

Publisher:

Abstract:

This paper continues the prior research undertaken by Warren and Leitch (2009), in which a series of initial research findings were presented. These findings identified that, in Australia, Supply Chain Management (SCM) systems were the weak link of Australian critical infrastructure. This paper focuses upon the security and risk issues associated with SCM systems and puts forward a new SCM Security Risk Management method, continuing the research presented at the European Conference on Information Warfare in 2009. The paper proposes a new Security Risk Analysis model that deals with the complexity of protecting SCM critical infrastructure systems and introduces a new approach that organisations can apply to protect their SCM systems. The paper describes the importance of SCM systems from a critical infrastructure protection perspective, discusses their role in supporting centres of population, and gives examples of the impact of failure. It then proposes a new SCM security risk analysis method that addresses both the security issues specific to SCM and the security issues associated with Information Security, and discusses a risk framework that can be used to protect against the associated high- and low-level security risks.

Relevance:

30.00%

Publisher:

Abstract:

Feature aggregation is a critical technique in content-based image retrieval (CBIR) that combines multiple feature distances to obtain image dissimilarity. Conventional parallel feature aggregation (PFA) schemes fail to effectively filter out irrelevant images using individual visual features before ranking the images in a collection. Series feature aggregation (SFA) is a new scheme that aims to address this problem. This paper investigates three important properties of SFA that are significant for system design. They reveal the irrelevance of feature order and the convertibility of SFA and PFA, as well as the superior performance of SFA. Furthermore, based on a Gaussian kernel density estimator, the authors propose a new method to estimate the visual threshold, which is the key parameter of SFA. Experiments conducted with the IAPR TC-12 benchmark image collection (ImageCLEF 2006), which contains over 20,000 photographic images and defined queries, show that SFA can outperform conventional PFA schemes.
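As a rough illustration of estimating a threshold from a distribution of feature distances with a Gaussian kernel density estimator, the sketch below uses SciPy's gaussian_kde; the data and the thresholding criterion (the density minimum between two modes) are assumptions made for illustration only and do not reproduce the authors' estimator.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Illustrative feature distances between a query and the collection images
rng = np.random.default_rng(0)
distances = np.concatenate([rng.normal(0.3, 0.05, 200),   # relevant-looking images
                            rng.normal(0.8, 0.10, 800)])  # irrelevant images

# Fit a Gaussian kernel density estimate over the observed distances
kde = gaussian_kde(distances)
grid = np.linspace(distances.min(), distances.max(), 1000)
density = kde(grid)

# Illustrative criterion: place the threshold at the density minimum between the modes
mask = (grid > 0.4) & (grid < 0.7)
threshold = grid[mask][np.argmin(density[mask])]

kept = distances[distances <= threshold]
print(f"threshold = {threshold:.3f}, images kept = {len(kept)}")
```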

Relevance:

30.00%

Publisher:

Abstract:

Optical inspection techniques have been widely used in industry because they are non-destructive. Since defect patterns are rooted in the manufacturing processes of the semiconductor industry, efficient and effective defect detection and pattern recognition algorithms are in great demand to find closely related causes; modifying the manufacturing processes can then eliminate defects and thus improve the yield. Defect patterns such as rings, semicircles, scratches, and clusters are the most common defects in the semiconductor industry. Conventional methods cannot identify scale-variant, shift-variant, or rotation-variant defect patterns that in fact belong to the same failure cause. To address these problems, a new approach is proposed in this paper to detect such defect patterns in noisy images. First, a novel scheme is developed to simulate datasets of these four patterns for training and testing the classifiers. Second, for real optical images, a series of image processing operations is applied in the detection stage of our method. In the identification stage, defects are resized and then identified by the trained support vector machine. An adaptive resonance theory network (ART-1) is also implemented for comparison. Classification results on both simulated data and real noisy raw data show the effectiveness of our method.
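As a minimal sketch of the identification stage described (resized defect images classified by a trained support vector machine), the code below trains an SVM on flattened images with scikit-learn; the data here are random placeholders rather than the simulated defect patterns used in the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Placeholder dataset: 400 "images" resized to 32x32 and flattened,
# with four defect classes (ring, semicircle, scratch, cluster)
rng = np.random.default_rng(0)
X = rng.random((400, 32 * 32))
y = rng.integers(0, 4, size=400)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Train a support vector machine classifier on the flattened images
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```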

Relevance:

30.00%

Publisher:

Abstract:

In anticipation of helping students mature from passive to more active learners while engaging with the issues and concepts surrounding computer security, a student-generated Multiple Choice Question (MCQ) learning strategy was designed and deployed as a replacement for an assessment task that was previously based on students providing solutions to a series of short-answer questions. To determine whether there was any educational value in students generating their own MCQs, students were required to design MCQs. Prior to undertaking this assessment activity, each participant completed a pre-test consisting of 45 MCQs based on the topics of the assessment. Following the assessment activity, the participants completed a post-test consisting of the same MCQs as the pre-test. The pre- and post-test results, as well as the post-test and assessment activity results, were tested for statistical significance. The results indicated that having students generate their own MCQs as a method of assessment did not have a negative effect on the learning experience. By providing the students with a framework based on the literature to support their engagement with the learning material, we believe the creation of well-structured MCQs resulted in a more advanced understanding of the relationships between the concepts of the learning material than plainly answering a series of short-answer questions from a textbook. Further study is required to determine to what degree this learning strategy encouraged a deeper approach to learning.

Relevance:

30.00%

Publisher:

Abstract:

Polypyrrole is a material with immensely useful properties suitable for a wide range of electrochemical applications, but its development has been hindered by cumbersome manufacturing processes. Here we show that a simple modification to the standard electrochemical polymerization method produces polypyrrole films of equivalently high conductivity and superior mechanical properties in one-tenth of the polymerization time. Preparing the film as a series of electrodeposited layers with thorough solvent washing between layering was found to produce excellent quality films even when layer deposition was accelerated by high current. The washing step between the sequentially polymerized layers altered the deposition mechanism, eliminating the typical dendritic growth and generating nonporous deposits. Solvent washing was shown to reduce the concentration of oligomeric species in the near-electrode region and hinder the three-dimensional growth mechanism that occurs by deposition of secondary particles from solution. As artificial muscles, the high density sequentially polymerized films produced the highest mechanical work output yet reported for polypyrrole actuators.

Relevance:

30.00%

Publisher:

Abstract:

Biomedical time series clustering, which automatically groups a collection of time series according to their internal similarity, is important for medical record management and inspection, such as bio-signal archiving and retrieval. In this paper, a novel framework is proposed that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSAs are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to variations of parameters, including the length of local segments and the dictionary size. Although the experimental evaluation used multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for clustering multichannel biomedical time series according to their structural similarity, which has many applications in biomedical time series management.
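As a rough sketch of the "time series as document, local segments as words" idea, the code below extracts sliding-window segments from each channel and quantizes them into a dictionary with k-means; the window length and dictionary size are illustrative parameters, and this is not the authors' H-pLSA implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def segments_to_words(channel, window, step, kmeans):
    """Extract local segments from one channel and map each to a dictionary word."""
    segs = np.array([channel[i:i + window]
                     for i in range(0, len(channel) - window + 1, step)])
    return kmeans.predict(segs)

# Placeholder multichannel series: 3 channels of 1000 samples
rng = np.random.default_rng(0)
series = rng.standard_normal((3, 1000))

window, step, dict_size = 32, 16, 50
all_segs = np.vstack([
    [ch[i:i + window] for i in range(0, len(ch) - window + 1, step)]
    for ch in series
])
kmeans = KMeans(n_clusters=dict_size, n_init=10, random_state=0).fit(all_segs)

# Each channel becomes a sequence of "words" (dictionary indices)
words = [segments_to_words(ch, window, step, kmeans) for ch in series]
print(words[0][:10])
```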

Relevance:

30.00%

Publisher:

Abstract:

Analysis of a holistic multiple time series system is a practical and crucial topic. In this paper, we study a new problem: how the data are produced underneath a multiple time series system, i.e., how to model the rules by which time series data are generated and evolve (here denoted as semantics). We assume that there exists a set of latent states that form the basis of the system and drive it: generating and evolving the data. This poses several challenges: (1) how to detect the latent states; (2) how to learn the rules based on these states; and (3) what the semantics can be used for. Hence, a novel correlation field-based semantics learning method is proposed. In this method, we first detect the latent state assignment by comprehensively considering various characteristics of multiple time series, including tick-by-tick data, temporal ordering, relationships among the multiple time series, and so on. The semantics are then learnt using Bayesian Markov characteristics. The learned semantics can be applied to various tasks, such as prediction or anomaly detection, for further analysis. We therefore propose two algorithms based on this semantic knowledge, which perform next-n-step prediction and anomaly detection, respectively. Experiments on real-world data sets demonstrate the effectiveness of the proposed method.
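As a loose illustration of prediction from latent states with Markov structure, the sketch below discretizes a series into states, estimates a transition matrix, and uses it to predict the next state; this is a generic Markov-chain example, not the correlation field-based method of the paper.

```python
import numpy as np

def fit_transition_matrix(states, n_states):
    """Estimate a Markov transition matrix from a sequence of discrete states."""
    counts = np.ones((n_states, n_states))  # Laplace smoothing
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Placeholder series discretized into 4 latent states by quantile binning
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(500))
edges = np.quantile(x, [0.25, 0.5, 0.75])
states = np.digitize(x, edges)            # values in {0, 1, 2, 3}

T = fit_transition_matrix(states, n_states=4)

# Next-step prediction: the most probable state following the current one
current = states[-1]
print("predicted next state:", int(np.argmax(T[current])))
```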

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is to design and develop an optimal motion cueing algorithm (MCA) based on the genetic algorithm (GA) that can generate high-fidelity motions within the motion simulator's physical limitations. Both angular velocity and linear acceleration are adopted as inputs to the MCA to produce the higher-order optimal washout filter. The linear quadratic regulator (LQR) method is used to constrain the human perception error between the real and simulated driving tasks. To develop the optimal MCA, the latest mathematical models of the vestibular system and simulator motion are taken into account. A reference frame with the center of rotation at the driver's head is employed to eliminate false motion cues caused by the coupling of simulator rotation into translational motion at the driver's head, as well as to reduce workspace displacement. To improve the developed LQR-based optimal MCA, a new strategy based on optimal control theory and the GA is devised. The objective is to reproduce a signal that closely follows the reference signal and avoids false motion cues by adjusting the parameters of the obtained LQR-based optimal washout filter. This is achieved by taking a series of factors into account, including the vestibular sensation error between the real and simulated cases, the main dynamic limitations, the human threshold limiter in tilt coordination, the cross-correlation coefficient, and the fluctuation of the human sensation error. It is worth pointing out that other related investigations in the literature normally do not consider the effects of these factors. The proposed GA-based optimized MCA is implemented in MATLAB/Simulink. The results show the effectiveness of the proposed GA-based method in enhancing human sensation, maximizing reference shape tracking, and reducing workspace usage.
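As a very rough sketch of tuning washout-filter parameters with a genetic algorithm, the loop below evolves a small population against a placeholder cost function; the cost, parameter bounds, and GA settings are all illustrative and do not reproduce the LQR-based sensation-error objective of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
bounds = np.array([[0.1, 5.0]] * 3)        # illustrative washout-filter parameters

def cost(params):
    """Placeholder objective standing in for the sensation-error cost."""
    return np.sum((params - np.array([1.0, 2.0, 0.5])) ** 2)

pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(30, 3))
for gen in range(100):
    scores = np.array([cost(ind) for ind in pop])
    parents = pop[np.argsort(scores)[:10]]            # selection: keep the 10 best
    children = []
    while len(children) < 20:
        a, b = parents[rng.integers(0, 10, 2)]
        child = np.where(rng.random(3) < 0.5, a, b)   # uniform crossover
        child += rng.normal(0, 0.05, 3)               # Gaussian mutation
        children.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    pop = np.vstack([parents, children])

best = pop[np.argmin([cost(ind) for ind in pop])]
print("best parameters:", np.round(best, 3))
```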