973 results for Approximations


Relevance:

10.00%

Publisher:

Abstract:

The electroencephalogram (EEG) is an important noninvasive tool used in the neonatal intensive care unit (NICU) for the neurologic evaluation of the sick newborn infant. It provides an excellent assessment of at-risk newborns and helps formulate a prognosis for long-term neurologic outcome. Automated analysis of neonatal EEG data in the NICU can provide valuable information to the clinician, facilitating medical intervention. The aim of this thesis is to develop a system for automatic classification of neonatal EEG, which can be divided into two parts: (1) classification of neonatal EEG as seizure or nonseizure, and (2) classification of neonatal background EEG into several grades based on the severity of injury, using atomic decomposition. Atomic decomposition techniques use redundant time-frequency dictionaries for sparse signal representations or approximations. The first contribution of this thesis is a novel time-frequency dictionary coherent with the neonatal EEG seizure states. This dictionary is able to track the time-varying nature of the EEG signal. It was shown that, by using atomic decomposition with the proposed dictionary, the transition of neonatal EEG from nonseizure to seizure states could be detected efficiently. The second contribution is a neonatal seizure detection algorithm using several time-frequency features derived from the proposed dictionary. The time-frequency features obtained from the atoms in the novel dictionary improved seizure detection accuracy compared with features obtained from the raw EEG signal. With the assistance of a supervised multiclass SVM classifier and several time-frequency features, several methods to automatically grade EEG were explored. In summary, the techniques proposed in this thesis contribute to the application of advanced signal processing for the automatic assessment of neonatal EEG recordings.
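The abstract does not name the decomposition algorithm, but atomic decomposition over a redundant dictionary is commonly computed with a greedy scheme such as matching pursuit. A minimal sketch, using a hypothetical random dictionary rather than the thesis's EEG-specific dictionary:

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=10):
    """Greedy sparse approximation: at each step select the dictionary
    atom (unit-norm column) most correlated with the current residual."""
    residual = signal.astype(float).copy()
    coeffs, chosen = [], []
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual
        k = int(np.argmax(np.abs(correlations)))
        c = correlations[k]
        residual = residual - c * dictionary[:, k]
        coeffs.append(c)
        chosen.append(k)
    return chosen, coeffs, residual

# Toy example: overcomplete dictionary of unit-norm random atoms.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)
x = 2.0 * D[:, 5] - 1.5 * D[:, 40]          # sparse ground-truth signal
atoms, amps, res = matching_pursuit(x, D, n_atoms=5)
print(np.linalg.norm(res) < np.linalg.norm(x))
```

With a low-coherence dictionary, the two true atoms dominate the correlations and are selected in the first iterations; the residual norm shrinks at every step.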


We describe a strategy for Markov chain Monte Carlo analysis of non-linear, non-Gaussian state-space models involving batch analysis for inference on dynamic, latent state variables and fixed model parameters. The key innovation is a Metropolis-Hastings method for the time series of state variables based on sequential approximation of filtering and smoothing densities using normal mixtures. These mixtures are propagated through the non-linearities using an accurate, local mixture approximation method, and we use a regenerating procedure to deal with potential degeneracy of mixture components. This provides accurate, direct approximations to sequential filtering and retrospective smoothing distributions, and hence a useful construction of global Metropolis proposal distributions for simulation of posteriors for the set of states. This analysis is embedded within a Gibbs sampler to include uncertain fixed parameters. We give an example motivated by an application in systems biology. Supplemental materials provide an example based on a stochastic volatility model as well as MATLAB code.


This paper analyzes a class of common-component allocation rules, termed no-holdback (NHB) rules, in continuous-review assemble-to-order (ATO) systems with positive lead times. The inventory of each component is replenished following an independent base-stock policy. In contrast to the usually assumed first-come-first-served (FCFS) component allocation rule in the literature, an NHB rule allocates a component to a product demand only if it will yield immediate fulfillment of that demand. We identify metrics as well as cost and product structures under which NHB rules outperform all other component allocation rules. For systems with certain product structures, we obtain key performance expressions and compare them to those under FCFS. For general product structures, we present performance bounds and approximations. Finally, we discuss the applicability of these results to more general ATO systems. © 2010 INFORMS.


Transformation optics (TO) is a powerful tool for the design of electromagnetic and optical devices with novel functionality derived from the unusual properties of the transformation media. In general, the fabrication of TO media is challenging, requiring spatially varying material properties with both anisotropic electric and magnetic responses. Though metamaterials have been proposed as a path to achieving such complex media, the properties required by the most general transformations remain elusive and cannot be implemented by state-of-the-art fabrication techniques. Here, we propose faceted approximations of TO media of arbitrary shape, in which the volume of the TO device is divided into flat metamaterial layers. These layers can be readily implemented by standard fabrication and stacking techniques. We illustrate our approximation approach for the specific example of a two-dimensional, omnidirectional "invisibility cloak", and quantify its performance using the total scattering cross section as a practical figure of merit. © 2012 American Institute of Physics.


This article examines how the emerging ideal type of the Euroregion is asserting itself in Europe. By analysing discourse produced by institutions, economic actors and the media, we reconstruct the definition of the Euroregional project from a variety of enunciative positions, independently of the languages or geographical locations of the Euroregions. On the one hand, the results highlight metaphors characteristic of European political discourse (construction, experimentation, the body) that help establish the imaginary of a territorial continuum in Europe. On the other hand, the results reveal grey areas (dissent, approximations, dispersions, competition) that make the definition of the Euroregional project vague and difficult for citizens to grasp. The analysis draws on an authentic multilingual corpus in order to detect regularities in Euroregional discourse. It mobilises simple but verifiable textometric results that serve as reference points for the qualitative analysis.


The equilibrium structure of the hydrogen-bonded complex H2O···HF has been calculated ab initio using the CCSD(T) method with basis sets up to sextuple-zeta quality with diffuse functions, taking into account the basis set superposition error correction. The calculations confirm the importance of diffuse functions and of the counterpoise correction for obtaining an accurate geometry. The most important point is that the basis set convergence is extremely slow and, for this reason, an accurate ab initio structure requires a very large basis set. Nevertheless, the ab initio structure is significantly different from the experimental r0 and rm structures. Analysis of the basis set convergence and of the approximations used in determining the experimental structures indicates that the ab initio structure is expected to be more reliable.
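For reference, the counterpoise correction of Boys and Bernardi, the standard estimate of the basis set superposition error referred to here, evaluates each monomer in the full dimer basis (superscripts denote the basis set, subscripts the geometry, and the argument the system):

```latex
% Counterpoise-corrected interaction energy (Boys–Bernardi):
% each monomer energy is computed in the complete dimer basis AB.
\Delta E_{\mathrm{int}}^{\mathrm{CP}}
  = E_{AB}^{AB}(AB) \;-\; E_{A}^{AB}(A) \;-\; E_{B}^{AB}(B)
```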


Edge-element methods have proved very effective for 3-D electromagnetic computations and are widely used on unstructured meshes. However, the accuracy of standard edge elements can be criticised because of their low order. This paper analyses discrete dispersion relations together with numerical propagation accuracy to determine the effect of tetrahedral shape on the phase accuracy of standard 3-D edge-element approximations in comparison to other methods. Scattering computations for the sphere obtained with edge elements are compared with results obtained with vertex elements, and a new formulation of the far-field integral approximations for use with edge elements is shown to give improved cross sections over conventional formulations.


We consider the multilevel paradigm and its potential to aid the solution of combinatorial optimisation problems. The multilevel paradigm is a simple one, which involves recursive coarsening to create a hierarchy of approximations to the original problem. An initial solution is found (sometimes for the original problem, sometimes for the coarsest) and then iteratively refined at each level. As a general solution strategy, the multilevel paradigm has been in use for many years and has been applied to many problem areas (most notably in the form of multigrid techniques). However, with the exception of the graph partitioning problem, multilevel techniques have not been widely applied to combinatorial optimisation problems. In this paper we address the issue of multilevel refinement for such problems and, with the aid of examples and results in graph partitioning, graph colouring and the travelling salesman problem, make a case for its use as a metaheuristic. The results provide compelling evidence that, although the multilevel framework cannot be considered a panacea for combinatorial problems, it can provide an extremely useful addition to the combinatorial optimisation toolkit. We also give a possible explanation for the underlying process and extract some generic guidelines for its future use on other combinatorial problems.
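As a concrete illustration of the paradigm (not drawn from the paper), here is a toy multilevel scheme in Python for balanced number partitioning: adjacent items are merged to coarsen, a greedy solution is found at the coarsest level, and a single-move local search refines the projected solution at each level on the way back up:

```python
def coarsen(nums):
    """Merge adjacent pairs; remember which fine items each coarse item covers."""
    coarse, groups = [], []
    for i in range(0, len(nums) - 1, 2):
        coarse.append(nums[i] + nums[i + 1])
        groups.append([i, i + 1])
    if len(nums) % 2:
        coarse.append(nums[-1])
        groups.append([len(nums) - 1])
    return coarse, groups

def greedy_partition(nums):
    """Initial solution at the coarsest level: largest item to the lighter side."""
    side, sums = [0] * len(nums), [0, 0]
    for i in sorted(range(len(nums)), key=lambda i: -nums[i]):
        s = 0 if sums[0] <= sums[1] else 1
        side[i] = s
        sums[s] += nums[i]
    return side

def refine(nums, side):
    """Local search: move a single item across whenever it reduces the imbalance."""
    improved = True
    while improved:
        improved = False
        diff = (sum(v for v, s in zip(nums, side) if s == 0)
                - sum(v for v, s in zip(nums, side) if s == 1))
        for i, v in enumerate(nums):
            delta = -2 * v if side[i] == 0 else 2 * v
            if abs(diff + delta) < abs(diff):
                side[i] ^= 1
                diff += delta
                improved = True
    return side

def multilevel_partition(nums, min_size=2):
    if len(nums) <= min_size:
        return refine(nums, greedy_partition(nums))
    coarse, groups = coarsen(nums)
    coarse_side = multilevel_partition(coarse, min_size)
    side = [0] * len(nums)
    for cs, group in zip(coarse_side, groups):   # project coarse solution up
        for i in group:
            side[i] = cs
    return refine(nums, side)

nums = [7, 3, 8, 2, 5, 4, 1, 6]
side = multilevel_partition(nums)
imbalance = abs(sum(v for v, s in zip(nums, side) if s == 0)
                - sum(v for v, s in zip(nums, side) if s == 1))
print(imbalance)   # → 0 (a perfect 18/18 split of the total 36)
```

The single-move refinement alone can stall in poor local optima; the coarse levels give it the "global view" the paradigm relies on, since moving one coarse item moves a whole group of fine items at once.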


Multilevel approaches to computational problems are pervasive across many areas of applied mathematics and scientific computing. The multilevel paradigm uses recursive coarsening to create a hierarchy of approximations to the original problem; an initial solution is then found for the coarsest problem and iteratively refined and improved at each level, from coarsest to finest. The solution process is aided by the global perspective (or 'global view') imparted to the optimisation by the coarsening. This paper looks at the application of multilevel approaches to the Vehicle Routing Problem.


This work explores the impact of response time distributions on high-rise building evacuation. The analysis utilises response times extracted from printed accounts and interviews of evacuees from the WTC North Tower evacuation of 11 September 2001. Evacuation simulations produced using these "real" response time distributions are compared with simulations produced using instant and engineering response time distributions. Results suggest that while typical engineering approximations to the response time distribution may produce reasonable evacuation times for up to 90% of the building population, this approach may underestimate total evacuation times by as much as 61%. These observations are applicable to situations involving large high-rise buildings in which travel times are generally expected to be greater than response times.


We discuss the application of the multilevel (ML) refinement technique to the Vehicle Routing Problem (VRP), and compare it to its single-level (SL) counterpart. Multilevel refinement recursively coarsens to create a hierarchy of approximations to the problem and refines at each level. An SL algorithm, which uses a combination of standard VRP heuristics, is developed first to solve instances of the VRP. An ML version, which extends the global view of these heuristics, is then created, using variants of the construction and improvement heuristics at each level. Finally, some multilevel enhancements are developed. Experimentation is used to find suitable parameter settings, and the final version is tested on two well-known VRP benchmark suites. Results comparing both SL and ML algorithms are presented.


We discuss the application of the multilevel (ML) refinement technique to the Vehicle Routing Problem (VRP), and compare it to its single-level (SL) counterpart. Multilevel refinement recursively coarsens to create a hierarchy of approximations to the problem and refines at each level. An SL heuristic, termed the combined node-exchange composite heuristic (CNCH), is developed first to solve instances of the VRP. An ML version (the ML-CNCH) is then created, using the construction and improvement heuristics of the CNCH at each level. Experimentation is used to find a suitable combination, which extends the global view of these heuristics. Results comparing both SL and ML are presented.


The multilevel paradigm as applied to combinatorial optimisation problems is a simple one, which at its most basic involves recursive coarsening to create a hierarchy of approximations to the original problem. An initial solution is found, usually at the coarsest level, and then iteratively refined at each level, coarsest to finest, typically by using some kind of heuristic optimisation algorithm (either a problem-specific local search scheme or a metaheuristic). Solution extension (or projection) operators transfer the solution from one level to another. As a general solution strategy, the multilevel paradigm has been in use for many years and has been applied to many problem areas (multigrid techniques, for example, can be viewed as a prime instance of the paradigm). Overview papers such as [] attest to its efficacy. However, with the exception of the graph partitioning problem, multilevel techniques have not been widely applied to combinatorial problems. In this chapter we survey the use of multilevel combinatorial techniques, discuss recent developments, and consider their ability to boost the performance of (meta)heuristic optimisation algorithms.


The eigenphase formulation of Blatt and Biedenharn is applied to fine-structure transitions in ²P atoms colliding with ¹S perturbers. Consideration is given to the limit of weak spin-orbit interaction. If the eigenphases are equal to the phase shifts for elastic scattering by the molecular potentials, then the expression for the total cross section reduces to the expression derived in the elastic approximation. However, a numerical comparison for the Li(2p ²P) + He(¹S) system shows that the elastic molecular phase shifts are not good approximations to the eigenphases. Hence the elastic approximation cannot be reliable.


Recent growth in the shape-from-shading psychophysics literature has been paralleled by the increasing availability of computer graphics hardware and software, to the extent that most psychophysical studies in this area now employ computer lighting algorithms. The most widely used algorithm in shape-from-shading psychophysics is the Phong lighting model. This model, and other shading models of its genre, produce readily interpretable images of three-dimensional scenes. However, such algorithms are only approximations of how light interacts with real objects in the natural environment. Nevertheless, the results of psychophysical experiments using these techniques have been used to infer the processes underlying the perception of shape-from-shading in natural environments. It is important to establish whether this substitution is ever valid. We report a series of experiments investigating whether two recently reported illusions seen in computer-generated, Phong-shaded images also occur for solid objects under real illuminants. The two illusions investigated are three-dimensional curvature contrast and the illuminant-position effect on perceived curvature. We show that both effects do occur for solid objects, and that their magnitudes are equivalent regardless of whether subjects are presented with ray-traced or solid objects.
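The Phong model itself is simple: intensity is the sum of ambient, diffuse and specular terms computed from the surface normal and the light and view directions. A minimal sketch in Python (the coefficient values are arbitrary, not taken from the study):

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def phong_intensity(normal, light_dir, view_dir,
                    ka=0.1, kd=0.6, ks=0.3, shininess=32):
    """Classic Phong model: ambient + diffuse + specular.
    All direction vectors point away from the surface point."""
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diffuse = max(np.dot(n, l), 0.0)
    r = 2.0 * np.dot(n, l) * n - l              # mirror reflection of the light
    specular = max(np.dot(r, v), 0.0) ** shininess if diffuse > 0 else 0.0
    return ka + kd * diffuse + ks * specular

# Intensity at a point lit from directly along its normal, viewed head-on:
i = phong_intensity(normal=[0, 0, 1], light_dir=[0, 0, 1], view_dir=[0, 0, 1])
print(round(i, 2))   # → 1.0 (full ambient 0.1 + diffuse 0.6 + specular 0.3)
```

The specular exponent (shininess) controls highlight tightness; it is exactly this kind of parameterised approximation, rather than physical light transport, whose perceptual validity the study tests.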