981 results for vector optimization


Relevance: 20.00%

Abstract:

This PhD research has provided novel solutions to three major challenges that have prevented the widespread deployment of speaker recognition technology: (1) combating enrolment/verification mismatch, (2) reducing the large amount of development and training data required, and (3) reducing the duration of speech required to verify a speaker. A range of applications of speaker recognition technology, from forensics in criminal investigations to secure access in banking, will benefit from the research outcomes.

Relevance: 20.00%

Abstract:

Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. HRV analysis is an important tool for observing the heart's ability to respond to the normal regulatory impulses that affect its rhythm. Like many bio-signals, HRV signals are non-linear in nature. Higher-order spectral analysis (HOS) is known to be a good tool for the analysis of non-linear systems and provides good noise immunity. A computer-based arrhythmia detection system for cardiac states is very useful in diagnostics and disease management. In this work, we studied the identification of HRV signals using features derived from HOS. These features were fed to a support vector machine (SVM) for classification. Our proposed system can classify the normal class and four other classes of arrhythmia with an average accuracy of more than 85%.
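As a rough illustration of the feature-extraction step, the sketch below computes two simple bispectrum-derived statistics with NumPy. The direct triple-product estimator and the two summary statistics are stand-ins chosen for brevity, not the feature set used in the study; in the full system such feature vectors would be fed to an SVM.

```python
import numpy as np

def bispectrum_features(x):
    """Illustrative HOS features: mean and peak bispectrum magnitude,
    using the direct estimator B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2))
    on a coarse frequency grid."""
    X = np.fft.rfft(x - np.mean(x))
    n = len(X) // 2                      # keep f1 + f2 inside the spectrum
    B = np.empty((n, n), dtype=complex)
    for i in range(n):
        for j in range(n):
            B[i, j] = X[i] * X[j] * np.conj(X[i + j])
    mag = np.abs(B)
    return np.array([mag.mean(), mag.max()])

# Toy series standing in for real RR-interval recordings.
rng = np.random.default_rng(42)
t = np.arange(256)
coupled = np.sin(0.3 * t) + np.sin(0.5 * t) + np.sin(0.8 * t)  # phase-coupled tones
noisy = rng.standard_normal(256)                               # erratic rhythm

features = np.vstack([bispectrum_features(coupled), bispectrum_features(noisy)])
```

Each row of `features` is one (tiny) feature vector; a classifier such as an SVM would be trained on many such rows with class labels.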

Relevance: 20.00%

Abstract:

Calls from 14 species of bat were classified to genus and species using discriminant function analysis (DFA), support vector machines (SVMs) and ensembles of neural networks (ENNs). Both SVMs and ENNs outperformed DFA for every species, while ENNs (mean identification rate of 97%) consistently outperformed SVMs (mean identification rate of 87%). Correct classification rates produced by the ENNs varied from 91% to 100%; calls from six species were identified with 100% accuracy. Calls from the five species of Myotis, a genus whose species are considered difficult to distinguish acoustically, had correct identification rates of 91% to 100%. Five parameters were most important for classifying calls correctly, while seven others contributed little to classification performance.

Relevance: 20.00%

Abstract:

While the philosophical motivation behind civil infrastructure management systems is to achieve an optimal level of service at minimum cost, the allocation of scarce resources among competing alternatives is still a matter of debate. It appears to be widely accepted that results from tradeoff analysis can be measured by the degree of accomplishment of the objectives. Road management systems deal not only with different asset types but also with conflicting objectives. This paper presents a case study of lifecycle optimization with tradeoff analysis for a road corridor in New Brunswick. Objectives of the study included bridge condition, road condition and road safety. A road safety index was created based on potential for improvement. Road condition was based on roughness, rutting and cracking. Initial results showed a lack of sustainability in bridge performance; therefore, bridges were broken down into components: deck, superstructure and substructure. Visual inspections, combined with the construction age of each bridge, were used to generate a surrogate apparent age. Two life-cycle analyses were conducted: one aimed to minimize overall cost while achieving sustainable results, and the other was purely an optimization used to identify required budget levels. These analyses identified the minimum required budget and demonstrated that, with the same amount of money, it was possible to achieve better levels of performance. Dominance and performance-driven criteria were combined to identify and select an optimal result. It was found that the achievement of optimally sustained results is conditioned on the availability of treatments for all asset classes across their life spans. For the case study, a disaggregated bridge condition index was introduced into the original algorithm in an attempt to achieve sustainability in all bridge components; however, the lack of early-stage treatments for substructures produced declining trends for that component.
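The dominance criterion mentioned above can be illustrated with a small Pareto filter over (budget, performance) alternatives; the numbers and the two-criterion setup are invented for illustration and are not from the case study.

```python
def non_dominated(alternatives):
    """Keep (cost, performance) alternatives not dominated by any other:
    an alternative is dominated if another costs no more and performs at
    least as well, with a strict improvement in one criterion."""
    front = []
    for cost, perf in alternatives:
        dominated = any(
            c <= cost and p >= perf and (c < cost or p > perf)
            for c, p in alternatives
        )
        if not dominated:
            front.append((cost, perf))
    return front

# Hypothetical (annual budget in $M, network performance index) pairs.
options = [(10, 62), (12, 70), (14, 69), (18, 75)]
frontier = non_dominated(options)
```

Here (14, 69) is dropped because (12, 70) is both cheaper and better; a performance-driven rule would then pick one point from the surviving frontier.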

Relevance: 20.00%

Abstract:

This paper proposes a highly reliable fault diagnosis approach for low-speed bearings. The proposed approach first extracts wavelet-based fault features that represent diverse symptoms of multiple low-speed bearing defects. The most useful fault features for diagnosis are then selected by a genetic algorithm (GA)-based kernel discriminative feature analysis cooperating with one-against-all multicategory support vector machines (OAA MCSVMs). Finally, each support vector machine is individually trained with its own feature vector, which includes the most discriminative fault features, offering the highest classification performance. In this study, the effectiveness of the proposed GA-based kernel discriminative feature analysis and the classification ability of the individually trained OAA MCSVMs are assessed in terms of average classification accuracy. In addition, the proposed GA-based kernel discriminative feature analysis is compared with four other state-of-the-art feature analysis approaches. Experimental results indicate that the proposed approach is superior to the other feature analysis methodologies, yielding average classification accuracies of 98.06% and 94.49% under rotational speeds of 50 revolutions per minute (RPM) and 80 RPM, respectively. Furthermore, the individually trained MCSVMs, each with its own optimal fault features selected by the proposed GA-based kernel discriminative feature analysis, outperform the standard OAA MCSVMs, showing average accuracies of 98.66% and 95.01% for bearings under rotational speeds of 50 RPM and 80 RPM, respectively.
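A minimal sketch of GA-based feature selection is shown below. To keep it self-contained it scores candidate feature subsets with a simple Fisher-style separation measure instead of the OAA-MCSVM accuracy used in the paper, and the data and GA settings are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy data: 2 classes, 8 features; only features 0 and 1 are informative.
X0 = rng.normal(0.0, 1.0, (50, 8))
X1 = rng.normal(0.0, 1.0, (50, 8))
X1[:, :2] += 3.0

def fitness(mask):
    """Fisher-like class-separation score of the selected features
    (stand-in for cross-validated classifier accuracy)."""
    if not mask.any():
        return 0.0
    a, b = X0[:, mask], X1[:, mask]
    gap = np.abs(a.mean(0) - b.mean(0))
    spread = a.std(0) + b.std(0) + 1e-9
    return float((gap / spread).mean())

pop = rng.integers(0, 2, (20, 8)).astype(bool)   # bit-mask chromosomes
best_history = []
for _ in range(30):
    scores = np.array([fitness(m) for m in pop])
    order = np.argsort(scores)[::-1]
    pop = pop[order]
    best_history.append(scores[order[0]])
    parents = pop[:10]                           # truncation selection
    kids = parents[rng.integers(0, 10, 10)].copy()
    flip = rng.random(kids.shape) < 0.1          # bit-flip mutation
    kids ^= flip
    pop = np.vstack([parents, kids])             # elitism keeps the best
best_mask = pop[0]
```

With elitism the best score never decreases from generation to generation, which is the property worth checking first in any GA implementation.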

Relevance: 20.00%

Abstract:

Unidirectional inductive power transfer (UIPT) systems allow loads only to consume power, while bidirectional IPT (BIPT) systems are more suitable for loads requiring two-way power flow, such as vehicle-to-grid (V2G) applications with electric vehicles (EVs). Many attempts have been made to improve the performance of BIPT systems. In a typical BIPT system, the output power is controlled using the pickup converter phase shift angle (PSA) while the primary converter regulates the input current. This paper proposes an optimized phase shift modulation strategy to minimize the coil losses of a series-series (SS) compensated BIPT system. In addition, a comprehensive study of the impact of the power converters on the overall efficiency of the system is presented. A closed-loop controller is proposed to optimize the overall efficiency of the BIPT system. Theoretical results are compared with both simulations and measurements of a 0.5 kW prototype to show the benefits of the proposed concept. The results convincingly demonstrate the applicability of the proposed system, offering high efficiency over a wide range of output power.

Relevance: 20.00%

Abstract:

One of the problems to be solved in attaining the full potential of hematopoietic stem cell (HSC) applications is the limited availability of the cells. Growing HSCs in a bioreactor offers an alternative solution to this problem. It also offers the advantage of eliminating the labour-intensive process, as well as the possible contamination, involved in the periodic nutrient replenishment of traditional T-flask stem cell cultivation. In spite of this, the optimization of HSC cultivation in a bioreactor has barely been explored. This manuscript discusses the development of a mathematical model to describe the dynamics of nutrient distribution and cell concentration in an ex vivo HSC cultivation in a microchannel perfusion bioreactor. The model was further used to optimize the cultivation by proposing three alternative feeding strategies to prevent the occurrence of nutrient limitation in the bioreactor. The evaluation of these strategies (a periodic step-change increase in the inlet oxygen concentration, a periodic step-change increase in the media inflow, and feedback control of the media inflow) shows that they can successfully improve the cell yield of the bioreactor. In general, the developed model is useful for the design and optimization of bioreactor operation.
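A toy version of such a perfusion model can be integrated with a few lines of explicit Euler stepping. The Monod kinetics, parameter values and the use of a richer feed as a stand-in for a step-change feeding strategy are all illustrative assumptions, not the manuscript's fitted model.

```python
def simulate(s_in, hours=100.0, dt=0.1):
    """Euler integration of a minimal perfusion-culture model with cell
    retention: Monod growth on one limiting nutrient.
      dX/dt = mu(S) * X
      dS/dt = D * (s_in - S) - mu(S) * X / Y
    All parameters are illustrative, not fitted to HSC data."""
    mu_max, K, Y, D = 0.05, 0.5, 0.5, 0.1   # 1/h, g/L, gX/gS, 1/h
    X, S = 0.1, s_in
    steps = round(hours / dt)
    for _ in range(steps):
        mu = mu_max * S / (K + S)
        X += dt * mu * X
        S += dt * (D * (s_in - S) - mu * X / Y)
        S = max(S, 0.0)                     # clamp Euler overshoot
    return X, S

base_yield, _ = simulate(s_in=5.0)     # constant feed
boosted_yield, _ = simulate(s_in=8.0)  # richer feed, standing in for a
                                       # step-change increase in inflow
```

Once the culture becomes nutrient-limited, biomass accumulation tracks the nutrient supply rate, so the richer feed ends with the higher cell yield; the real model would add oxygen transport and the spatial profile along the microchannel.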

Relevance: 20.00%

Abstract:

Increasing numbers of preclinical and clinical studies are utilizing pDNA (plasmid DNA) as the vector. In addition, there has been a growing trend towards larger and larger doses of pDNA in human trials. The growing demand for pDNA manufacture leads to pressure to make more in less time. A key intervention has been the use of monoliths as stationary phases in liquid chromatography. Monolithic stationary phases offer fast separation of pDNA owing to their large pore size, making pDNA in the size range from 100 nm to over 300 nm easily accessible. However, the convective transport mechanism of monoliths does not guarantee plasmid purity. The recovery of pure pDNA hinges on a proper balance among the properties of the adsorbent phase, the mobile phase and the feedstock. The effects of the pH and ionic strength of the binding buffer, the temperature of the feedstock, and the active group density and pore size of the stationary phase were considered as avenues to improve the recovery and purity of pDNA, using a methacrylate-based monolithic adsorbent and Escherichia coli DH5α-pUC19 clarified lysate as feedstock. pDNA recovery was found to be critically dependent on the pH and ionic strength of the mobile phase. A maximum recovery of approximately 92% was obtained under optimum conditions of pH and ionic strength. Increasing the feedstock temperature to 80°C increased the purity of pDNA owing to the extra thermal stability of pDNA relative to contaminants such as proteins. Results from toxicological studies of the plasmid samples using an endotoxin standard (E. coli 055:B5 lipopolysaccharide) show that the endotoxin level decreases with increasing salt concentration. Large quantities of pure pDNA can thus be obtained with minimal extra effort simply by optimizing process parameters and conditions for pDNA purification.

Relevance: 20.00%

Abstract:

Underwater wireless sensor networks (UWSNs) have recently attracted researchers' attention owing to their capacity to explore underwater areas and support applications for marine discovery and oceanic surveillance. One of the main objectives of each deployed underwater network is discovering an optimized path over the sensor nodes to transmit the monitored data to an onshore station. The process of transmitting data consumes the energy of each node, while energy is limited in UWSNs, so energy efficiency is a key challenge in underwater wireless sensor networks. Dual-sink vector-based forwarding (DS-VBF) takes both residual energy and location information into consideration as priority factors to discover an optimized routing path and save energy in underwater networks. The modified routing protocol employs dual sinks on the water surface, which improves network lifetime. Owing to the deployment of dual sinks, the packet delivery ratio and the average end-to-end delay are also enhanced. In our simulations, in comparison with VBF, the average end-to-end delay was reduced by more than 80%, the remaining energy increased by 10%, and the packet reception ratio increased by about 70%.
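The priority-factor idea can be sketched as a next-hop score that mixes normalized distance with depleted energy. The weighting, field names and numbers below are invented for illustration and do not reproduce the DS-VBF protocol's exact formula.

```python
import math

def desirability(node, target, e_max, w_half, alpha=0.5):
    """Score a candidate forwarder (smaller is better): a weighted sum of
    normalized distance to the routing target and normalized depleted
    energy. alpha, w_half (pipe half-width) and e_max are illustrative."""
    dist = math.hypot(node["x"] - target[0], node["y"] - target[1])
    return alpha * (dist / w_half) + (1 - alpha) * (1 - node["energy"] / e_max)

candidates = [
    {"id": "a", "x": 10, "y": 10, "energy": 90},
    {"id": "b", "x": 12, "y": 11, "energy": 20},  # nearby but nearly drained
    {"id": "c", "x": 30, "y": 25, "energy": 95},  # full battery but far away
]
sink = (0, 0)
next_hop = min(candidates, key=lambda n: desirability(n, sink, e_max=100, w_half=50))
```

Node "a" wins here because it balances proximity and residual energy; a pure shortest-path rule would keep draining "b", which is how energy-aware forwarding extends network lifetime.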

Relevance: 20.00%

Abstract:

The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management actions in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision-making tools can choose actions to favor such learning in two ways: implicitly, via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly, by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives: a pure management objective, a pure learning objective, and an objective that is a weighted mixture of the two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision-making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision-making tools can be improved. © 2010 Elsevier Ltd.
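A weighted mixture of the two objectives can be sketched in a few lines; the action names and payoff numbers are hypothetical.

```python
def mixed_value(action, w):
    """Weighted mixture of a management payoff and a learning payoff.
    w = 1 recovers the pure management objective, w = 0 pure learning."""
    return w * action["growth"] + (1 - w) * action["info_gain"]

# Hypothetical actions with (management, learning) payoffs in [0, 1].
actions = [
    {"name": "protect habitat", "growth": 0.9, "info_gain": 0.1},
    {"name": "trial cull",      "growth": 0.4, "info_gain": 0.8},
    {"name": "do nothing",      "growth": 0.2, "info_gain": 0.0},
]

def pick(w):
    """Choose the action maximizing the mixed objective for weight w."""
    return max(actions, key=lambda a: mixed_value(a, w))["name"]
```

Sweeping `w` from 1 to 0 moves the choice from the best pure-management action to the most informative one, which is exactly the spectrum the paper's three objectives span.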

Relevance: 20.00%

Abstract:

The standard method for deciding bit-vector constraints is via eager reduction to propositional logic. This is usually done after first applying powerful rewrite techniques. While often efficient in practice, this method does not scale on problems for which top-level rewrites cannot reduce the problem size sufficiently. A lazy solver can target such problems by doing many satisfiability checks, each of which only reasons about a small subset of the problem. In addition, the lazy approach enables a wide range of optimization techniques that are not available to the eager approach. In this paper we describe the architecture and features of our lazy solver (LBV). We provide a comparative analysis of the eager and lazy approaches, and show how they are complementary in terms of the types of problems they can efficiently solve. For this reason, we propose a portfolio approach that runs a lazy and eager solver in parallel. Our empirical evaluation shows that the lazy solver can solve problems none of the eager solvers can and that the portfolio solver outperforms other solvers both in terms of total number of problems solved and the time taken to solve them.
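The portfolio idea (run a lazy and an eager solver in parallel and take the first verdict) can be sketched with thread pools; the two solver stubs and their timings are placeholders, not LBV or a real SMT backend.

```python
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED
import time

def eager_solver(problem):
    """Stand-in for an eager bit-blasting solver: cost grows with size."""
    time.sleep(0.01 * problem["size"])
    return ("eager", "sat")

def lazy_solver(problem):
    """Stand-in for a lazy solver: fixed setup cost, scales better.
    Timings here are purely illustrative."""
    time.sleep(0.05)
    return ("lazy", "sat")

def portfolio(problem):
    """Run both solvers in parallel and return the first verdict."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(eager_solver, problem),
                   pool.submit(lazy_solver, problem)]
        done, not_done = wait(futures, return_when=FIRST_COMPLETED)
        for f in not_done:
            f.cancel()          # best effort; a running solver finishes
        return next(iter(done)).result()

winner, verdict = portfolio({"size": 20})   # large problem: lazy wins here
```

A production portfolio would kill the losing solver process outright rather than letting its thread run to completion, but the first-verdict-wins structure is the same.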

Relevance: 20.00%

Abstract:

Discounted Cumulative Gain (DCG) is a well-known ranking evaluation measure for models built with multi-graded relevance data. By treating the tagging data used in recommendation systems as an ordinal relevance set {negative, null, positive}, we propose to build a DCG-based recommendation model. We present an efficient and novel learning-to-rank method that optimizes DCG for a recommendation model using this tagging-data interpretation scheme. Evaluating the proposed method on real-world datasets, we demonstrate that it is scalable and outperforms the benchmark methods by generating a quality top-N item recommendation list.
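For reference, DCG over the ordinal tag set can be computed directly; the mapping of tags to grades {0, 1, 2} below is one natural choice for the {negative, null, positive} set, not necessarily the paper's exact scheme.

```python
import math

TAG_GRADE = {"negative": 0, "null": 1, "positive": 2}  # ordinal relevance

def dcg(tags):
    """Discounted Cumulative Gain over a ranked list of tag labels:
    DCG = sum_i (2^rel_i - 1) / log2(i + 1), with positions i from 1."""
    return sum(
        (2 ** TAG_GRADE[t] - 1) / math.log2(i + 1)
        for i, t in enumerate(tags, start=1)
    )

# A ranking that puts positively tagged items first scores higher.
good = dcg(["positive", "positive", "null", "negative"])
bad = dcg(["negative", "null", "positive", "positive"])
```

The log discount is what makes DCG a top-heavy measure: moving a positive item from rank 4 to rank 1 raises the score far more than any reshuffling further down the list, which is why it suits top-N recommendation.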