883 results for Simulation Based Method


Relevance: 80.00%

Abstract:

In this paper, numerical simulations are used in an attempt to find optimal source profiles for high frequency radiofrequency (RF) volume coils. Biologically loaded, shielded/unshielded circular and elliptical birdcage coils operating at 170 MHz, 300 MHz and 470 MHz are modelled using the FDTD method for both 2D and 3D cases. Taking advantage of the fact that some aspects of the electromagnetic system are linear, two approaches have been proposed for the determination of the drives for individual elements in the RF resonator. The first method is an iterative optimization technique with a kernel for the evaluation of RF fields inside an imaging plane of a human head model using pre-characterized sensitivity profiles of the individual rungs of a resonator; the second method is a regularization-based technique. In the second approach, a sensitivity matrix is explicitly constructed and a regularization procedure is employed to solve the ill-posed problem. Test simulations show that both methods can improve the B1-field homogeneity in both focused and non-focused scenarios. While the regularization-based method is more efficient, the first optimization method is more flexible as it can take into account other issues such as controlling SAR or reshaping the resonator structures. It is hoped that these schemes and their extensions will be useful for the determination of multi-element RF drives in a variety of applications.
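The regularization step described above can be illustrated with a minimal Tikhonov sketch. The sensitivity matrix here is random complex data standing in for the FDTD-derived rung sensitivities, and the target is a uniform field over the imaging plane; this is a generic illustration of regularized drive determination, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the sensitivity matrix: rows are voxels in the
# imaging plane, columns are resonator rungs; entries are complex B1
# sensitivities (random here, not FDTD-derived).
n_voxels, n_rungs = 40, 8
S = rng.standard_normal((n_voxels, n_rungs)) + 1j * rng.standard_normal((n_voxels, n_rungs))
b_target = np.ones(n_voxels, dtype=complex)  # uniform target B1 field

def tikhonov_drives(S, b, lam):
    """Solve min ||S d - b||^2 + lam ||d||^2 for the rung drive vector d."""
    A = S.conj().T @ S + lam * np.eye(S.shape[1])
    return np.linalg.solve(A, S.conj().T @ b)

d_ls = tikhonov_drives(S, b_target, 0.0)   # plain least squares
d_reg = tikhonov_drives(S, b_target, 1.0)  # regularized (damped) drives

# Regularization trades a slightly larger field residual for smaller,
# better-behaved drive amplitudes on an ill-posed problem.
res_ls = np.linalg.norm(S @ d_ls - b_target)
res_reg = np.linalg.norm(S @ d_reg - b_target)
```

The regularization parameter controls the usual trade-off: larger values shrink the drive amplitudes at the cost of field homogeneity.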

Relevance: 80.00%

Abstract:

The recent deregulation in electricity markets worldwide has heightened the importance of risk management in energy markets. Assessing Value-at-Risk (VaR) in electricity markets is arguably more difficult than in traditional financial markets because the distinctive features of the former result in a highly unusual distribution of returns: electricity returns are highly volatile, display seasonalities in both their mean and volatility, exhibit leverage effects and clustering in volatility, and feature extreme levels of skewness and kurtosis. With electricity applications in mind, this paper proposes a model that accommodates autoregression and weekly seasonals in both the conditional mean and conditional volatility of returns, as well as leverage effects via an EGARCH specification. In addition, extreme value theory (EVT) is adopted to explicitly model the tails of the return distribution. Compared to a number of other parametric models and simple historical simulation based approaches, the proposed EVT-based model performs well in forecasting out-of-sample VaR. In addition, statistical tests show that the proposed model provides appropriate interval coverage in both unconditional and, more importantly, conditional contexts. Overall, the results are encouraging in suggesting that the proposed EVT-based model is a useful technique in forecasting VaR in electricity markets. (c) 2005 International Institute of Forecasters. Published by Elsevier B.V. All rights reserved.
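The EVT tail-modelling idea can be sketched with a Hill-estimator tail quantile. The sample below is synthetic Pareto-tailed data standing in for electricity-return losses, and the Hill estimator is one standard EVT device, not necessarily the specific tail model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical loss sample with a Pareto-type heavy right tail (tail index 3);
# real electricity-return losses would replace this.
losses = rng.uniform(size=5000) ** (-1.0 / 3.0)

def hill_var(losses, k, p):
    """EVT tail quantile via the Hill estimator.

    Uses the k largest order statistics to estimate the tail index, then
    extrapolates the (1 - p) quantile, i.e. VaR at confidence level 1 - p.
    """
    x = np.sort(losses)[::-1]                      # descending order statistics
    threshold = x[k]                               # (k+1)-th largest value
    alpha = k / np.sum(np.log(x[:k] / threshold))  # Hill tail-index estimate
    n = len(losses)
    return threshold * (k / (n * p)) ** (1.0 / alpha), alpha

var99, alpha_hat = hill_var(losses, k=200, p=0.01)
```

Because the quantile is extrapolated from the fitted tail rather than read off the empirical distribution, the estimate remains usable at confidence levels beyond the range of the observed data.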

Relevance: 80.00%

Abstract:

With the rapid increase in both centralized video archives and distributed WWW video resources, content-based video retrieval is gaining importance. To support such applications efficiently, content-based video indexing must be addressed. Typically, each video is represented by a sequence of frames. Due to the high dimensionality of the frame representation and the large number of frames, video indexing introduces an additional degree of complexity. In this paper, we address the problem of content-based video indexing and propose an efficient solution, called the Ordered VA-File (OVA-File), based on the VA-file. OVA-File is a hierarchical structure and has two novel features: 1) partitioning the whole file into slices such that only a small number of slices are accessed and checked during k Nearest Neighbor (kNN) search and 2) efficient handling of insertions of new vectors into the OVA-File, such that the average distance between the new vectors and those approximations near that position is minimized. To facilitate a search, we present an efficient approximate kNN algorithm named Ordered VA-LOW (OVA-LOW) based on the proposed OVA-File. OVA-LOW first chooses possible OVA-Slices by ranking the distances between their corresponding centers and the query vector, and then visits all approximations in the selected OVA-Slices to compute the approximate kNN. The number of possible OVA-Slices is controlled by a user-defined parameter delta. By adjusting delta, OVA-LOW provides a trade-off between query cost and result quality. Query by video clip consisting of multiple frames is also discussed. Extensive experimental studies using real video data sets were conducted and the results showed that our methods can yield a significant speed-up over an existing VA-file-based method and iDistance with high query result quality. Furthermore, by incorporating temporal correlation of video content, our methods achieved much more efficient performance.
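The slice-ranking search described above can be sketched as follows. This toy version forms slices by sorting raw vectors along one dimension and skips the VA-file quantized approximations entirely, so it only illustrates the "rank slices by center distance, scan the top delta" idea, not the actual OVA-File layout.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy stand-in for frame feature vectors.
data = rng.standard_normal((1000, 16))

# Partition vectors into ordered slices (here: equal chunks after sorting by
# the first dimension) and precompute each slice's center.
order = np.argsort(data[:, 0])
slices = np.array_split(order, 20)
centers = np.array([data[idx].mean(axis=0) for idx in slices])

def approx_knn(query, k, delta):
    """Rank slices by center distance, scan only the delta closest slices."""
    rank = np.argsort(np.linalg.norm(centers - query, axis=1))
    cand = np.concatenate([slices[i] for i in rank[:delta]])
    d = np.linalg.norm(data[cand] - query, axis=1)
    nearest = cand[np.argsort(d)[:k]]
    return sorted(np.linalg.norm(data[nearest] - query, axis=1))

q = rng.standard_normal(16)
approx = approx_knn(q, k=5, delta=3)   # cheap, possibly approximate
exact = approx_knn(q, k=5, delta=20)   # delta = all slices => exact kNN
```

Setting delta to the total number of slices degenerates to an exact scan, which is the trade-off knob the abstract describes.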

Relevance: 80.00%

Abstract:

Testing for simultaneous vicariance across comparative phylogeographic data sets is a notoriously difficult problem hindered by mutational variance, the coalescent variance, and variability across pairs of sister taxa in parameters that affect genetic divergence. We simulate vicariance to characterize the behaviour of several commonly used summary statistics across a range of divergence times, and to characterize this behaviour in comparative phylogeographic data sets having multiple taxon-pairs. We found Tajima's D to be relatively uncorrelated with other summary statistics across divergence times, and using simple hypothesis testing of simultaneous vicariance given variable population sizes, we counter-intuitively found that the variance across taxon pairs in Nei and Li's net nucleotide divergence (pi(net)), a common measure of population divergence, is often inferior to using the variance in Tajima's D across taxon pairs as a test statistic to distinguish ancient simultaneous vicariance from variable vicariance histories. The opposite and more intuitive pattern is found for testing more recent simultaneous vicariance, and overall we found that depending on the timing of vicariance, one of these two test statistics can achieve high statistical power for rejecting simultaneous vicariance, given a reasonable number of intron loci (> 5 loci, 400 bp) and a range of conditions. These results suggest that components of these two composite summary statistics should be used in future simulation-based methods which can simultaneously use a pool of summary statistics to test the comparative phylogeographic hypotheses we consider here.
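The test statistic discussed above, the variance of Tajima's D across taxon pairs, can be sketched as follows. The D formula is the textbook one (Tajima 1989); the haplotypes here are random binary toy data rather than coalescent simulations, so only the mechanics are shown.

```python
import numpy as np

def tajimas_d(seqs):
    """Tajima's D for a set of equal-length binary (0/1) haplotype arrays."""
    m = np.asarray(seqs)
    n = m.shape[0]
    # Segregating sites and mean pairwise differences.
    S = int(np.any(m != m[0], axis=0).sum())
    pairs = n * (n - 1) / 2
    pi = sum(np.sum(m[i] != m[j])
             for i in range(n) for j in range(i + 1, n)) / pairs
    # Standard constants (Tajima 1989).
    a1 = sum(1.0 / i for i in range(1, n))
    a2 = sum(1.0 / i**2 for i in range(1, n))
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n**2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1, e2 = c1 / a1, c2 / (a1**2 + a2)
    return (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))

# Hypothetical two-taxon-pair comparison: the proposed test statistic is the
# variance of D across taxon pairs.
rng = np.random.default_rng(3)
pair_ds = [tajimas_d(rng.integers(0, 2, size=(8, 400))) for _ in range(2)]
var_d = np.var(pair_ds)
```

In the actual method this variance would be compared against its null distribution under simulated simultaneous vicariance.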

Relevance: 80.00%

Abstract:

One approach to microbial genotyping is to make use of sets of single-nucleotide polymorphisms (SNPs) in combination with binary markers. Here we report the modification and automation of a SNP-plus-binary-marker-based approach to the genotyping of Staphylococcus aureus and its application to 391 S. aureus isolates from southeast Queensland, Australia. The SNPs used were arcC210, tpi243, arcC162, gmk318, pta294, tpi36, tpi241, and pta383. These provide a Simpson's index of diversity (D) of 0.95 with respect to the S. aureus multilocus sequence typing database and define 61 genotypes and the major clonal complexes. The binary markers used were pvl, cna, sdrE, pT181, and pUB110. Two novel real-time PCR formats for interrogating these markers were compared. One of these makes use of light upon extension (LUX) primers and biplexed reactions, while the other is a streamlined modification of kinetic PCR using SYBR green. The latter format proved to be more robust. In addition, automated methods for DNA template preparation, reaction setup, and data analysis were developed. A single SNP-based method for ST-93 (Queensland clone) identification was also devised. The genotyping revealed the numerical importance of the South West Pacific and Queensland community-acquired methicillin-resistant S. aureus (MRSA) clones and the clonal complex 239 Aus-1/Aus-2 hospital-associated MRSA. There was a strong association between the community-acquired clones and pvl.
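The Simpson's index of diversity (D) quoted above has a simple closed form: the probability that two isolates drawn at random without replacement belong to different genotypes. A minimal sketch (the genotype counts below are illustrative, not the study's data):

```python
def simpsons_d(counts):
    """Simpson's index of diversity for a list of per-genotype isolate counts:
    the probability that two isolates drawn without replacement belong to
    different genotypes."""
    n_total = sum(counts)
    return 1.0 - sum(n * (n - 1) for n in counts) / (n_total * (n_total - 1))

# Example: 10 isolates split across three genotypes.
d = simpsons_d([5, 3, 2])
```

A value of D = 0.95, as reported for the eight-SNP set, means a 95% chance that two randomly chosen isolates are resolved as different genotypes.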

Relevance: 80.00%

Abstract:

Study Design. Development of an automatic measurement algorithm and comparison with manual measurement methods. Objectives. To develop a new computer-based method for automatic measurement of vertebral rotation in idiopathic scoliosis from computed tomography images and to compare the automatic method with two manual measurement techniques. Summary of Background Data. Techniques have been developed for vertebral rotation measurement in idiopathic scoliosis using plain radiographs, computed tomography, or magnetic resonance images. All of these techniques require manual selection of landmark points and are therefore subject to interobserver and intraobserver error. Methods. We developed a new method for automatic measurement of vertebral rotation in idiopathic scoliosis using a symmetry ratio algorithm. The automatic method provided values comparable with Aaro and Ho's manual measurement methods for a set of 19 transverse computed tomography slices through apical vertebrae, and with Aaro's method for a set of 204 reformatted computed tomography images through vertebral endplates. Results. Confidence intervals (95%) for intraobserver and interobserver variability using manual methods were in the range 5.5 to 7.2. The mean (+/- SD) difference between automatic and manual rotation measurements for the 19 apical images was -0.5 degrees +/- 3.3 degrees for Aaro's method and 0.7 degrees +/- 3.4 degrees for Ho's method. The mean (+/- SD) difference between automatic and manual rotation measurements for the 204 endplate images was 0.25 degrees +/- 3.8 degrees. Conclusions. The symmetry ratio algorithm allows automatic measurement of vertebral rotation in idiopathic scoliosis without intraobserver or interobserver error due to landmark point selection.
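The symmetry-based idea behind the automatic method can be sketched on a 2-D point set: score every candidate axis by how well the points map onto themselves under reflection, and take the best-scoring axis. This is a simplified point-cloud analogue (axis through the origin, grid search), not the paper's intensity-based symmetry ratio algorithm on CT slices.

```python
import numpy as np

def reflect(points, theta):
    """Reflect 2-D points about the axis through the origin at angle theta."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return points @ np.array([[c, s], [s, -c]]).T

def symmetry_score(points, theta):
    """Mean distance from each point to its nearest reflected neighbour;
    near zero when theta is the true symmetry axis."""
    mirrored = reflect(points, theta)
    d = np.linalg.norm(points[:, None, :] - mirrored[None, :, :], axis=2)
    return d.min(axis=1).mean()

def find_symmetry_axis(points, n_angles=180):
    """Grid search over candidate axis angles in [0, pi)."""
    thetas = np.linspace(0, np.pi, n_angles, endpoint=False)
    scores = [symmetry_score(points, t) for t in thetas]
    return thetas[int(np.argmin(scores))]

# Synthetic 'vertebra outline': points made exactly symmetric about a
# 20-degree axis, so the search should recover that angle.
rng = np.random.default_rng(4)
half = rng.standard_normal((30, 2))
points = np.vstack([half, reflect(half, np.deg2rad(20))])
est = find_symmetry_axis(points)
```

Because no landmark points are selected by hand, a search of this kind has no interobserver or intraobserver component, which is the property the abstract emphasizes.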

Relevance: 80.00%

Abstract:

One of the critical challenges in automatic recognition of TV commercials is to generate a unique, robust and compact signature. Uniqueness indicates the ability to identify the similarity among commercial video clips which may have slight content variation. Robustness means the ability to match commercial video clips containing the same content but probably with different digitalization/encoding, some noisy data, and/or transmission and recording distortion. Efficiency is the capability of matching commercial video sequences effectively with low computation cost and storage overhead. In this paper, we present a binary signature based method, which meets all three criteria above, by combining the techniques of ordinal and color measurements. Experimental results on a large real commercial video database show that our novel approach delivers significantly better performance compared to existing methods.
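The ordinal half of the measurement can be sketched as follows: partition a frame into blocks, rank the block mean intensities, and use the rank vector as the signature. Ranks are invariant under any monotonic intensity change, which is what gives robustness to encoding and brightness differences. The grid size and distance function below are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

def ordinal_signature(frame, grid=2):
    """Rank the mean intensities of grid x grid blocks; the rank vector is
    the signature."""
    h, w = frame.shape
    blocks = [frame[i * h // grid:(i + 1) * h // grid,
                    j * w // grid:(j + 1) * w // grid].mean()
              for i in range(grid) for j in range(grid)]
    return tuple(np.argsort(np.argsort(blocks)))  # double argsort -> ranks

def signature_distance(sig_a, sig_b):
    """L1 distance between rank vectors (0 means identical ordinal pattern)."""
    return int(np.abs(np.array(sig_a) - np.array(sig_b)).sum())

rng = np.random.default_rng(5)
frame = rng.uniform(0, 255, size=(8, 8))
brighter = 1.5 * frame + 20  # simulated re-encoding / brightness shift
```

The brightness-shifted copy yields the identical rank vector, so the two clips match at distance zero despite the intensity change.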

Relevance: 80.00%

Abstract:

Non-technical losses (NTL) identification and prediction are important tasks for many utilities. Data from the customer information system (CIS) can be used for NTL analysis. However, in order to perform NTL analysis accurately and efficiently, the original CIS data need to be pre-processed before any detailed NTL analysis can be carried out. In this paper, we propose a feature selection based method for CIS data pre-processing in order to extract the most relevant information for further analysis such as clustering and classification. By removing irrelevant and redundant features, feature selection is an essential step in the data mining process: finding an optimal subset of features improves the quality of results through faster processing, higher accuracy, and simpler models with fewer features. A detailed feature selection analysis is presented in the paper. Both time-domain and load shape data are compared based on accuracy, consistency and the statistical dependencies between features.
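Removing irrelevant and redundant features can be sketched with a greedy relevance-minus-redundancy filter. This is a generic mRMR-style heuristic on synthetic data, shown only to make the "irrelevant vs redundant" distinction concrete; it is not the paper's specific procedure.

```python
import numpy as np

def greedy_select(X, y, n_select):
    """Greedy filter: pick the feature with the highest |corr(feature, y)|
    penalized by its maximum |corr| with any already-selected feature."""
    n_feat = X.shape[1]
    rel = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_feat)])
    selected = [int(np.argmax(rel))]
    while len(selected) < n_select:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            red = max(abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected)
            if rel[j] - red > best_score:
                best, best_score = j, rel[j] - red
        selected.append(best)
    return selected

# Synthetic data: feature 0 drives the target, feature 2 duplicates feature 0
# (redundant), feature 1 is pure noise (irrelevant).
rng = np.random.default_rng(6)
x0 = rng.standard_normal(500)
X = np.column_stack([x0,
                     rng.standard_normal(500),
                     x0 + 0.01 * rng.standard_normal(500)])
y = x0 + rng.standard_normal(500)
chosen = greedy_select(X, y, n_select=2)
```

The redundancy penalty keeps the near-duplicate feature out of the selected subset even though its relevance score is high.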

Relevance: 80.00%

Abstract:

Management of collaborative business processes that span multiple business entities has emerged as a key requirement for business success. These processes are embedded in sets of rules describing complex message-based interactions between parties such that if a logical expression defined on the set of received messages is satisfied, one or more outgoing messages are dispatched. The execution of these processes presents significant challenges since each content-rich message may contribute towards the evaluation of multiple expressions in different ways and the sequence of message arrival cannot be predicted. These challenges must be overcome in order to develop an efficient execution strategy for collaborative processes in an intensive operating environment with a large number of rules and very high throughput of messages. In this paper, we present a discussion on issues relevant to the evaluation of such expressions and describe a basic query-based method for this purpose, including suggested indexes for improved performance. We conclude by identifying several potential future research directions in this area. © 2010 IEEE. All rights reserved
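The indexing idea can be sketched as follows: map each message type to the rules that reference it, so an arriving message triggers re-evaluation of only those rules. The rule structure, predicates, and message payloads below are hypothetical illustrations, not the paper's query formulation.

```python
from collections import defaultdict

class RuleEngine:
    """Index rules by the message types they reference so an arriving message
    re-evaluates only the rules that mention its type."""

    def __init__(self):
        self.rules = {}                # rule_id -> (required types, predicate)
        self.index = defaultdict(set)  # message type -> rule_ids
        self.received = {}             # message type -> latest payload
        self.fired = set()

    def add_rule(self, rule_id, types, predicate):
        self.rules[rule_id] = (frozenset(types), predicate)
        for t in types:
            self.index[t].add(rule_id)

    def on_message(self, msg_type, payload):
        self.received[msg_type] = payload
        fired_now = []
        for rid in self.index[msg_type]:
            types, pred = self.rules[rid]
            # Evaluate only when every referenced message has arrived.
            if rid not in self.fired and types.issubset(self.received):
                if pred(self.received):
                    self.fired.add(rid)
                    fired_now.append(rid)
        return fired_now  # hypothetical outgoing-message dispatch point

engine = RuleEngine()
engine.add_rule("ship", {"order", "payment"},
                lambda m: m["payment"]["amount"] >= m["order"]["total"])
engine.add_rule("remind", {"order"}, lambda m: m["order"]["total"] > 100)

r1 = engine.on_message("order", {"total": 150})
r2 = engine.on_message("payment", {"amount": 150})
```

Because the index restricts evaluation to affected rules, cost scales with the number of rules referencing the arriving message type rather than with the total rule population.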

Relevance: 80.00%

Abstract:

A new control algorithm using a parallel braking resistor (BR) and a serial fault current limiter (FCL) for power system transient stability enhancement is presented in this paper. The proposed control algorithm can prevent transient instability during the first swing by immediately removing the transient energy gained during the faulted period. It can also reduce generator oscillation time and efficiently return the system to the post-fault equilibrium. The algorithm uses a new system-energy-function-based method to choose the optimal switching point. The parallel BR and serial FCL resistor can be switched at the calculated optimal point to obtain the best control result. This method allows optimum dissipation of the transient energy caused by the disturbance, returning the system to equilibrium in minimum time. Case studies are given to verify the efficiency and effectiveness of this new control algorithm.
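The first-swing effect of a braking resistor can be illustrated on a single-machine-infinite-bus swing equation. The switching logic below (absorb power whenever the rotor accelerates forward after fault clearing) is a crude heuristic standing in for the paper's energy-function-based optimal switching point, and all per-unit parameters are invented for the sketch.

```python
import numpy as np

def simulate(use_br, t_end=3.0, dt=1e-3):
    """Single-machine-infinite-bus swing equation (per unit). A fault reduces
    transfer capability for 0.15 s; afterwards an optional braking resistor
    absorbs power whenever the rotor speed deviation is positive."""
    M, Pm = 0.1, 0.8                 # inertia constant, mechanical power
    delta = np.arcsin(Pm / 1.0)      # pre-fault equilibrium angle
    omega = 0.0
    peak = delta
    for step in range(int(t_end / dt)):
        t = step * dt
        pmax = 0.2 if t < 0.15 else 1.0  # faulted, then post-fault network
        p_br = 0.3 if (use_br and t >= 0.15 and omega > 0) else 0.0
        acc = (Pm - pmax * np.sin(delta) - p_br) / M   # swing equation
        omega += acc * dt
        delta += omega * dt
        peak = max(peak, delta)
    return peak

peak_plain = simulate(use_br=False)
peak_br = simulate(use_br=True)
```

Dissipating energy while the rotor swings forward shrinks the first-swing peak angle, which is the stability margin the control algorithm is designed to protect.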

Relevance: 80.00%

Abstract:

Investment in mining projects, like most business investment, is susceptible to risk and uncertainty. The ability to effectively identify, assess and manage risk may enable strategic investments to be sheltered and operations to perform closer to their potential. In mining, geological uncertainty is seen as the major contributor to not meeting project expectations. The need to assess and manage geological risk for project valuation and decision-making translates to the need to assess and manage risk in any pertinent parameter of open pit design and production scheduling. This is achieved by taking geological uncertainty into account in the mine optimisation process. This thesis develops methods that enable geological uncertainty to be effectively modelled and the resulting risk in long-term production scheduling to be quantified and managed. One of the main accomplishments of this thesis is the development of a new, risk-based method for the optimisation of long-term production scheduling. In addition to maximising economic returns, the new method minimises the risk of deviating from production forecasts, given the understanding of the orebody. This ability represents a major advance in the risk management of open pit mining.

Relevance: 80.00%

Abstract:

We have carried out a discovery proteomics investigation aimed at identifying disease biomarkers present in saliva and, more specifically, early biomarkers of inflammation. The proteomic characterization of saliva is possible due to the straightforward and non-invasive sample collection that allows repetitive analyses for pharmacokinetic studies. These advantages are particularly relevant in the case of newborn patients. The study was carried out with samples collected during the first 48 hours of life of the newborns according to an approved Ethics Committee procedure. In particular, salivary samples were collected from healthy newborns and from one infected newborn (n=1). Proteins were extracted through cycles of sonication, precipitated in ice-cold acetone, resuspended and resolved by 2D electrophoresis. MALDI TOF/TOF mass spectrometry analysis was performed on each spot to identify the proteins. We then compared the healthy and infected newborn salivary proteomes in order to investigate proteins differentially expressed under inflammatory conditions. In particular, the protein alpha-1-antitrypsin (A1AT), which is correlated with inflammation, was found to be differentially expressed in the infected newborn's saliva. Therefore, in the second part of the project we aimed to develop a robust LC-MS based method that identifies and quantifies this inflammatory protein in saliva, which might represent the first relevant step towards diagnosing a condition of inflammation with a non-invasive assay. The same LC-MS method is also useful for investigating the presence of the F allelic variant of A1AT in biological samples, which is correlated with the onset of pulmonary diseases.
In the last part of the work we analysed newborn saliva samples in order to investigate how phospholipids and mediators of inflammation (eicosanoids) vary under inflammatory conditions; a trend in lysophosphatidylcholine composition according to inflammatory status was observed.

Relevance: 80.00%

Abstract:

A simple method for training the dynamical behavior of a neural network is derived. It is applicable to any training problem in discrete-time networks with arbitrary feedback. The algorithm resembles back-propagation in that an error function is minimized using a gradient-based method, but the optimization is carried out in the hidden part of state space, either instead of, or in addition to, weight space. Computational results are presented for some simple dynamical training problems, one of which requires response to a signal 100 time steps in the past.
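The state-space optimization idea can be sketched as follows: hold the weights fixed and run gradient descent on the hidden-state trajectory itself, with a penalty tying consecutive states to the network dynamics. The tiny network, numerical gradient, and penalty weighting below are all illustrative choices, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
W = 0.5 * rng.standard_normal((2, 2))  # fixed recurrent weights
target = np.array([0.5, -0.3])         # desired final state
T = 5                                  # trajectory length

def loss(states):
    """Final-state error plus a penalty tying consecutive states to the
    network dynamics x_{t+1} = tanh(W x_t)."""
    x = states.reshape(T, 2)
    dyn = sum(np.sum((x[t + 1] - np.tanh(W @ x[t])) ** 2) for t in range(T - 1))
    return np.sum((x[-1] - target) ** 2) + dyn

def num_grad(f, z, eps=1e-5):
    """Central-difference gradient, adequate for a tiny illustration."""
    g = np.zeros_like(z)
    for i in range(z.size):
        zp, zm = z.copy(), z.copy()
        zp[i] += eps
        zm[i] -= eps
        g[i] = (f(zp) - f(zm)) / (2 * eps)
    return g

# Gradient descent on the state trajectory, not on the weights.
states = rng.standard_normal(T * 2)
initial = loss(states)
for _ in range(200):
    states -= 0.05 * num_grad(loss, states)
final = loss(states)
```

Driving the penalty term toward zero forces the optimized trajectory to be (approximately) realizable by the fixed dynamics while still reaching the target.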

Relevance: 80.00%

Abstract:

The n-tuple pattern recognition method has been tested using a selection of 11 large data sets from the European Community StatLog project, so that the results could be compared with those reported for the 23 other algorithms the project tested. The results indicate that this ultra-fast memory-based method is a viable competitor with the others, which include optimisation-based neural network algorithms, even though the theory of memory-based neural computing is less highly developed in terms of statistical theory.
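A minimal n-tuple classifier can be sketched as follows: random tuples of pixel positions each read a small binary address from a pattern, each class memorizes the addresses seen during training, and classification counts address matches. The pattern sizes, tuple count, and toy classes below are invented for illustration, not the StatLog configuration.

```python
import numpy as np

rng = np.random.default_rng(8)
N_PIXELS, N_TUPLES, TUPLE_SIZE = 16, 20, 3
tuples = [rng.choice(N_PIXELS, size=TUPLE_SIZE, replace=False)
          for _ in range(N_TUPLES)]

def addresses(pattern):
    """Each n-tuple reads its pixels and forms a small binary address."""
    return [tuple(pattern[t]) for t in tuples]

class NTupleClassifier:
    def __init__(self):
        self.memory = {}  # class -> one seen-address set per tuple

    def train(self, pattern, label):
        mem = self.memory.setdefault(label, [set() for _ in range(N_TUPLES)])
        for i, addr in enumerate(addresses(pattern)):
            mem[i].add(addr)

    def score(self, pattern, label):
        mem = self.memory[label]
        return sum(addr in mem[i] for i, addr in enumerate(addresses(pattern)))

    def predict(self, pattern):
        return max(self.memory, key=lambda c: self.score(pattern, c))

# Toy 4x4 binary patterns: class A = top half set, class B = left half set.
a = np.zeros(16, dtype=int); a[:8] = 1
b = np.zeros(16, dtype=int); b[[0, 1, 4, 5, 8, 9, 12, 13]] = 1
clf = NTupleClassifier()
clf.train(a, "A")
clf.train(b, "B")
```

Training is a single pass of set insertions and classification is table lookups, which is why the method is described as ultra-fast and memory-based.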

Relevance: 80.00%

Abstract:

Marketing scholars are increasingly recognizing the importance of investigating phenomena at multiple levels. However, the analysis methods currently dominant within marketing may not be appropriate for dealing with multilevel or nested data structures. We identify the state of contemporary multilevel marketing research, finding that typical empirical approaches within marketing research may be less effective at explicitly taking account of multilevel data structures than those in other organizational disciplines. A Monte Carlo simulation, based on results from a previously published marketing study, demonstrates that different approaches to analyzing the same data can produce very different results (both in terms of power and effect size). The implication is that marketing scholars should be cautious when analyzing multilevel or other grouped data, and we provide a discussion of and introduction to the use of hierarchical linear modeling for this purpose.
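The danger of ignoring nesting can be shown with a small Monte Carlo of the same flavor, though with invented parameters rather than those of the published study: a group-level predictor, a true slope of zero, and group random intercepts. Naive OLS that ignores the grouping rejects the (true) null far more often than the nominal 5%.

```python
import numpy as np

rng = np.random.default_rng(9)

def naive_ols_rejects(n_groups=20, group_size=10, crit=1.96):
    """One replication: does naive OLS (ignoring the grouping) reject the
    true null of a zero slope at the nominal 5% level?"""
    x_g = rng.standard_normal(n_groups)      # group-level predictor
    u_g = rng.standard_normal(n_groups)      # group random intercepts
    x = np.repeat(x_g, group_size)
    y = np.repeat(u_g, group_size) + rng.standard_normal(n_groups * group_size)
    xc, yc = x - x.mean(), y - y.mean()
    slope = (xc @ yc) / (xc @ xc)
    resid = yc - slope * xc
    se = np.sqrt(resid @ resid / (len(y) - 2) / (xc @ xc))
    return abs(slope / se) > crit

# Empirical Type I error rate across replications; nominal is 0.05.
rate = np.mean([naive_ols_rejects() for _ in range(300)])
```

With these settings the design effect is roughly 1 + (m - 1) * ICC = 5.5, so the naive standard errors are badly understated; a hierarchical linear model with group-level random intercepts would restore the nominal error rate.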