960 results for Runge-Kutta formulas.


Relevance: 10.00%

Publisher:

Abstract:

This paper gives a review of recent progress in the design of numerical methods for computing the trajectories (sample paths) of solutions to stochastic differential equations. We give a brief survey of the area, focusing on a number of application areas where approximations to strong solutions are important, with particular emphasis on computational biology, and give the necessary analytical tools for understanding some of the important concepts associated with stochastic processes. We present the stochastic Taylor series expansion as the fundamental mechanism for constructing effective numerical methods, give general results that relate local and global order of convergence, and mention the Magnus expansion as a mechanism for designing methods that preserve the underlying structure of the problem. We also present various classes of explicit and implicit methods for strong solutions, based on the underlying structure of the problem. Finally, we discuss implementation issues relating to maintaining the Brownian path, efficient simulation of stochastic integrals, and variable-step-size implementations based on various types of control.
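
To make the connection to implementation concrete, the following is a minimal sketch of the Euler-Maruyama scheme, the simplest strong method obtained by truncating the stochastic Taylor expansion; the drift and diffusion functions (geometric Brownian motion) and the step count are illustrative assumptions, not taken from the paper.

```python
# Minimal Euler-Maruyama sketch for a scalar SDE
#   dX_t = a(X_t) dt + b(X_t) dW_t.
import numpy as np

def euler_maruyama(a, b, x0, t_end, n_steps, rng):
    dt = t_end / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))   # Brownian increment ~ N(0, dt)
        x[k + 1] = x[k] + a(x[k]) * dt + b(x[k]) * dw
    return x

rng = np.random.default_rng(0)
path = euler_maruyama(a=lambda x: 0.5 * x, b=lambda x: 0.2 * x,
                      x0=1.0, t_end=1.0, n_steps=1000, rng=rng)
```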

Relevance: 10.00%

Publisher:

Abstract:

Objectives: Obesity is a disease of excess body fat in which health is adversely affected; it is therefore prudent to base the diagnosis of obesity on a measure of percentage body fat. The body composition of a group of Australian children of Sri Lankan origin was studied to evaluate the applicability of some bedside techniques for measuring percentage body fat. Methods: Height (H) and weight (W) were measured and BMI (W/H²) calculated. Bioelectrical impedance analysis (BIA) was performed using the tetrapolar technique with an 800 μA current at 50 Hz frequency. Total body water, determined by deuterium dilution, was used as the reference method; fat-free mass, and hence fat mass (FM), was derived using age- and gender-specific constants. Percentage FM was estimated using four predictive equations based on BIA and anthropometric measurements. Results: Twenty-seven boys and 15 girls were studied, with mean ages of 9.1 years and 9.6 years, respectively. Girls had a significantly higher FM than boys. The mean percentage FM of boys (22.9 ± 8.7%) was above the limit for obesity, and for girls (29.0 ± 6.0%) it was just below the cut-off. BMI was comparatively low. All but the BIA equation in boys underestimated percentage FM. The impedance index and weight showed a strong association with total body water (r² = 0.96, P < 0.001). Except for BIA in boys, all other techniques underdiagnosed obesity. Conclusions: Sri Lankan Australian children appear to have a high percentage of body fat despite a low BMI, and some of the available indirect techniques are not helpful in the assessment of body composition. Ethnic- and/or population-specific predictive equations therefore need to be developed when indirect methods such as BIA or anthropometry are used to assess body composition, especially in a multicultural society.
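
The reference-method derivation chain described above (total body water by deuterium dilution → fat-free mass → fat mass → percentage fat) reduces to simple arithmetic; a minimal sketch follows, with a generic adult hydration constant standing in for the age- and gender-specific constants actually used in the study.

```python
# Sketch of the derivation: BMI from height and weight, fat-free mass (FFM)
# from total body water (TBW) via a hydration constant, fat mass by difference.
# The 0.73 L/kg hydration value is a commonly quoted adult placeholder, not
# one of the study's child-specific constants.

def body_composition(weight_kg, height_m, tbw_litres, hydration=0.73):
    bmi = weight_kg / height_m ** 2        # BMI = W / H^2
    ffm = tbw_litres / hydration           # FFM = TBW / hydration fraction
    fm = weight_kg - ffm                   # FM = W - FFM
    pct_fm = 100.0 * fm / weight_kg        # percentage body fat
    return bmi, pct_fm

bmi, pct_fm = body_composition(weight_kg=30.0, height_m=1.35, tbw_litres=17.0)
```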

Relevance: 10.00%

Publisher:

Abstract:

This research work analyses techniques for implementing a cell-centred finite-volume time-domain (ccFV-TD) computational methodology for the purpose of studying microwave heating. Various state-of-the-art spatial and temporal discretisation methods employed to solve Maxwell's equations on multidimensional structured grid networks are investigated, and the dispersive and dissipative errors inherent in those techniques examined. Both staggered and unstaggered grid approaches are considered. Upwind schemes using a Riemann solver and intensity vector splitting are studied and evaluated. Staggered and unstaggered Leapfrog and Runge-Kutta time integration methods are analysed in terms of phase and amplitude error to identify which method is the most accurate and efficient for simulating microwave heating processes. The implementation and migration of typical electromagnetic boundary conditions from staggered-in-space to cell-centred approaches is also considered. In particular, an existing perfectly matched layer absorbing boundary methodology is adapted to formulate a new cell-centred boundary implementation for the ccFV-TD solvers. Finally, for microwave heating purposes, a comparison of analytical and numerical results for standard case studies in rectangular waveguides allows the accuracy of the developed methods to be assessed. © 2004 Elsevier Inc. All rights reserved.
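
For orientation, a minimal sketch of the staggered leapfrog (Yee-type) update for Maxwell's equations in one dimension is shown below; this is the kind of time integration whose phase and amplitude errors the work analyses, not the ccFV-TD formulation itself, and the grid, source and material constants are illustrative assumptions.

```python
# 1D staggered leapfrog update: E on integer nodes, H on half nodes,
# advanced alternately with a Courant-limited time step.
import numpy as np

eps0, mu0 = 8.854e-12, 4e-7 * np.pi
nx, dx = 200, 1e-3
dt = 0.5 * dx / 3e8                      # Courant-limited time step

Ez = np.zeros(nx)                        # E sampled at integer nodes
Hy = np.zeros(nx - 1)                    # H sampled at half nodes (staggered)

for n in range(500):
    Hy += dt / (mu0 * dx) * (Ez[1:] - Ez[:-1])          # half-step-offset H update
    Ez[1:-1] += dt / (eps0 * dx) * (Hy[1:] - Hy[:-1])   # E update (PEC end walls)
    Ez[nx // 2] += np.sin(2 * np.pi * 1e10 * n * dt)    # soft source excitation
```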

Relevance: 10.00%

Publisher:

Abstract:

Chaotic orientations of a top containing a fluid-filled cavity are investigated analytically and numerically under small perturbations. The top spins and rolls in non-sliding contact with a rough horizontal plane, and the fluid in the ellipsoidal cavity is considered to be ideal and describable by finitely many degrees of freedom. A Hamiltonian structure is established to facilitate the application of Melnikov-Holmes-Marsden (MHM) integrals. In particular, chaotic motion of the liquid-filled top is identified, via the MHM integrals, as arising from transversal intersections between the stable and unstable manifolds of an approximated, disturbed flow of the liquid-filled top. The developed analytical criteria are cross-checked against numerical simulations using a fourth-order Runge-Kutta algorithm with adaptive time steps.
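
A minimal sketch of adaptive-step Runge-Kutta integration of the kind used for the numerical cross-check is given below; the damped, periodically forced pendulum serves only as a stand-in chaotic system, not the liquid-filled top's equations of motion.

```python
# Adaptive RK45 integration of a forced pendulum (illustrative chaotic system).
import numpy as np
from scipy.integrate import solve_ivp

def forced_pendulum(t, y, damping=0.2, drive=1.2, omega=2.0 / 3.0):
    theta, p = y
    return [p, -damping * p - np.sin(theta) + drive * np.cos(omega * t)]

sol = solve_ivp(forced_pendulum, (0.0, 200.0), [0.1, 0.0],
                method="RK45", rtol=1e-8, atol=1e-10)   # adaptive time steps
theta, momentum = sol.y
```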

Relevance: 10.00%

Publisher:

Abstract:

OBJECTIVE. The match between the reading level of occupational therapy education materials and older clients' reading ability and comprehension was determined. The sociodemographic and literacy characteristics that influenced clients' reading ability and comprehension were also investigated. METHOD. The reading level of 110 written education materials (handouts, brochures, and information leaflets) distributed to older clients (65 years of age and older) by occupational therapists working in Queensland hospitals was analyzed using the Flesch formula. The reading ability of 214 older persons (mean age 77 years, 63% female) was assessed using the Rapid Estimate of Adult Literacy in Medicine. Participants' comprehension of information of increasing reading difficulty was measured using the Cloze procedure. RESULTS. The written materials required a mean reading level between the ninth and tenth grades, whereas participants' mean reading ability was seventh to eighth grade; some materials may therefore have been too difficult for participants to read and understand. Participants with a managerial, professional, or clerical background (p = 0.001) and those who perceived that they read well (p = 0.001) had significantly higher reading ability. Older age was significantly related to poorer comprehension (p = 0.018), with participants 75 years of age and over having a mean comprehension score of 25.6 compared to 30.3 for those 65 to 74 years of age. CONCLUSION. Occupational therapists must analyze the reading level of the written education materials they develop for and use with clients by applying readability formulas. There should be a match between the reading level of written materials and clients' reading ability. Clients' reading ability may be assessed informally by discussing years of education and literacy habits, or formally using reading assessments. Content and design characteristics should also be considered when developing written education materials for clients.
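
For reference, the Flesch Reading Ease score used to grade such handouts can be computed as in the following minimal sketch; the vowel-group syllable counter is a crude heuristic, and published readability tools estimate syllables more carefully.

```python
# Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
import re

def count_syllables(word):
    # crude approximation: count groups of consecutive vowels
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * len(words) / sentences - 84.6 * syllables / len(words)

score = flesch_reading_ease("Place the ice pack on your wrist for ten minutes.")
```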

Relevance: 10.00%

Publisher:

Abstract:

A new integration scheme is developed for nonequilibrium molecular dynamics simulations in which the temperature is constrained by a Gaussian thermostat. The utility of the scheme is demonstrated by applying it to the SLLOD algorithm, the standard nonequilibrium molecular dynamics algorithm for studying shear flow. Unlike conventional integrators, the new integrators are constructed using operator-splitting techniques to ensure stability and little or no drift in the kinetic energy. Moreover, they require minimal computer memory and are straightforward to program. Numerical experiments show that the efficiency and stability of the new integrators compare favorably with those of conventional integrators such as the Runge-Kutta and Gear predictor-corrector methods. (C) 1999 American Institute of Physics. [S0021-9606(99)50125-6]
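
A minimal illustration of the operator-splitting idea behind such integrators is sketched below: the equations of motion are split into sub-flows that can each be applied exactly, and one time step is their symmetric (Strang) composition. The harmonic oscillator used here is a generic example, not the SLLOD/Gaussian-thermostat equations of motion themselves.

```python
# Strang (symmetric) splitting of a step into "force" and free-streaming flows.
import numpy as np

def strang_step(q, p, dt, force=lambda q: -q, mass=1.0):
    p = p + 0.5 * dt * force(q)      # half step of the force flow
    q = q + dt * p / mass            # full step of the free-streaming flow
    p = p + 0.5 * dt * force(q)      # second half step of the force flow
    return q, p

q, p = 1.0, 0.0
for _ in range(1000):
    q, p = strang_step(q, p, dt=0.01)
```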

Relevance: 10.00%

Publisher:

Abstract:

Formal methods have significant benefits for developing safety-critical systems, in that they allow for correctness proofs, model checking of safety and liveness properties, deadlock checking, etc. However, formal methods do not scale very well and demand specialist skills when applied to real-world systems. For these reasons, the development and analysis of large-scale safety-critical systems will require effective integration of formal and informal methods. In this paper, we use such an integrative approach to automate Failure Modes and Effects Analysis (FMEA), a widely used system safety analysis technique, using a high-level graphical modelling notation (Behavior Trees) and model checking. We inject component failure modes into the Behavior Trees and translate the resulting Behavior Trees to SAL code. This enables us to model-check whether the system, in the presence of these faults, satisfies its safety properties, specified as temporal logic formulas. The benefit of this process is tool support that automates the tedious and error-prone aspects of FMEA.
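
As a hedged illustration (the component names are invented, not taken from the paper), a safety property of the kind checked after fault injection can be written in linear temporal logic as

$$\mathbf{G}\,\bigl(\mathit{SensorStuckHigh} \rightarrow \mathbf{G}\,\neg\,\mathit{HeaterOn}\bigr),$$

i.e. once the injected failure mode SensorStuckHigh occurs, the controller must never command HeaterOn again; the model checker either verifies such a property over the fault-injected model or returns a counterexample trace that feeds the FMEA report.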

Relevance: 10.00%

Publisher:

Abstract:

Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture the fundamentally discrete and stochastic nature of cellular biology - most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, so the corresponding probabilities and numbers of each molecular species all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time steps, so faster approximations such as the Poisson and Binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of the Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
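
As a concrete illustration, a minimal sketch of the Poisson τ-leap approximation for a single decay reaction X → ∅ with propensity a(x) = c·x is given below; an exact SSA would instead draw one exponentially distributed waiting time per reaction event. The rate constant, leap size and initial copy number are illustrative assumptions.

```python
# Poisson tau-leap: fire a Poisson-distributed number of reaction events per
# fixed leap of length tau, instead of simulating every event individually.
import numpy as np

def poisson_tau_leap(x0, c, tau, n_leaps, rng):
    x = x0
    trajectory = [x]
    for _ in range(n_leaps):
        k = rng.poisson(c * x * tau)     # number of firings in this leap
        x = max(x - k, 0)                # keep the copy number non-negative
        trajectory.append(x)
    return np.array(trajectory)

rng = np.random.default_rng(1)
traj = poisson_tau_leap(x0=1000, c=0.1, tau=0.05, n_leaps=200, rng=rng)
```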

Relevance: 10.00%

Publisher:

Abstract:

Domain-specific information retrieval is increasingly in demand. Not only domain experts but also average, non-expert users are interested in searching for domain-specific (e.g., medical and health) information in online resources. However, a typical problem for average users is that the search results are always a mixture of documents with different levels of readability. Non-expert users may want to see documents with higher readability at the top of the list, so the search results need to be re-ranked in descending order of readability. It is often impractical for domain experts to manually label the readability of documents in large databases, so computational models of readability need to be investigated. However, traditional readability formulas are designed for general-purpose text and are insufficient for the technical materials encountered in domain-specific information retrieval, while more advanced algorithms such as textual coherence models are computationally too expensive for re-ranking a large number of retrieved documents. In this paper, we propose an effective and computationally tractable concept-based model of text readability. In addition to the textual genres of a document, our model also takes into account domain-specific knowledge, i.e., how the domain-specific concepts contained in the document affect its readability. Three major readability formulas are proposed and applied to health and medical information retrieval. Experimental results show that our proposed readability formulas lead to remarkable improvements, in terms of correlation with users' readability ratings, over four traditional readability measures.
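
As a purely hypothetical sketch of the re-ranking step (the scoring weights and concept-difficulty lookup are invented for illustration and are not the paper's concept-based formulas), readability-aware re-ranking can be organised as follows.

```python
# Blend a surface readability proxy with a penalty for unfamiliar domain
# concepts, then sort retrieved documents so the most readable come first.
def readability_score(doc, concept_difficulty, alpha=0.6):
    words = doc["text"].split()
    surface = 1.0 / (1.0 + sum(len(w) for w in words) / max(1, len(words)))
    penalty = sum(concept_difficulty.get(w.lower(), 0.0) for w in words)
    penalty /= max(1, len(words))
    return alpha * surface - (1 - alpha) * penalty

def rerank(results, concept_difficulty):
    return sorted(results,
                  key=lambda d: readability_score(d, concept_difficulty),
                  reverse=True)
```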

Relevance: 10.00%

Publisher:

Abstract:

The central aim of this research is to evaluate the values and possibilities of the covenant preached in Deuteronomy. To that end, I seek to grasp the tension inherent in any kind of covenant. I carry out this exercise, first, within the field of hermeneutics itself. I suggest a subaltern reading that brings together different struggles within liberationist interpretations (feminist, queer and postcolonial). In the process, I forge the work of the "organic exegete", namely the interpreter who articulates dissident voices in order to confront systemic structures of subordination. After this theoretical proposition, I evaluate Deuteronomy as discourses concatenated in the form of an archive. The main suggestion is that the Deuteronomic texts were collected or produced in support of an ideal of berit (covenant). This ideal originates in the material now arranged in 4:44-26+28: an atavistic communal contract with Yhvh. This result is made possible by rhetorical criticism of the text and of its propagandistic interests from its archival origin onward. After an honest comparison with the treaties of the Ancient Near East, the pedagogy of obedience intrinsic to the contract can no longer be denied. I often call this the "collusion of the holy people". Rhetorical criticism, however, does not lead merely to a reification of this ideal of berit; rather, it points to the internal debate of the community. A rhetorical contract, after all, holds silenced memories within itself so that the propaganda can take effect. It is at this point that I look for collisions of memories, especially within the prohibitive pericopes of the contract. All the Deuteronomic "refuse", so to speak, is marked by two basic formulas: ki to'abat yhvh, "it is an abomination to Yhvh", and ubi'arta ha-ra mi-qirbeka, "you shall exterminate the per/verted from your midst". I devote myself to the texts marked by these formulas, fostering an episodic unification of the "abominable" and the "per/verted". I assess the particular struggle of each one, in order then to propose a subaltern agenda that promotes social justice through recognition and redistribution. The abominable and per/verted covenant within Deuteronomy presents a radically democratic proposal (i) in favour of a culture open to the Other and (ii) against pyramidal authoritarian structures. I note, therefore, that with this double tactic the imperial values of hierarchization and of the subtraction of Deuteronomic brotherhood are rhetorically brought into debate within the community.

Relevance: 10.00%

Publisher:

Abstract:

An interactive hierarchical Generative Topographic Mapping (HGTM) [HGTM] has been developed to visualise complex data sets. In this paper, we build a more general visualisation system by extending the HGTM visualisation system in three directions: (1) We generalise HGTM to noise models from the exponential family of distributions; the basic building block is the Latent Trait Model (LTM) developed in [Kabanpami]. (2) We give the user a choice of initialising the child plots of the current plot in either interactive or automatic mode. In the interactive mode the user interactively selects "regions of interest" as in [HGTM], whereas in the automatic mode an unsupervised minimum message length (MML)-driven construction of a mixture of LTMs is employed. (3) We derive general formulas for magnification factors in latent trait models. Magnification factors are a useful tool for improving our understanding of the visualisation plots, since they can highlight the boundaries between data clusters. The unsupervised construction is particularly useful when high-level plots are covered with dense clusters of highly overlapping data projections, making it difficult to use the interactive mode; such a situation often arises when visualising large data sets. We illustrate our approach on a toy example and apply our system to three more complex real data sets.
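
For reference, in the original GTM setting the (areal) magnification factor at a latent point x is usually expressed through the Jacobian of the smooth mapping y from the latent space into data space,

$$ M(\mathbf{x}) \;=\; \sqrt{\det\!\bigl(J(\mathbf{x})^{\mathsf T} J(\mathbf{x})\bigr)}, \qquad J_{ij}(\mathbf{x}) \;=\; \frac{\partial y_i(\mathbf{x})}{\partial x_j}; $$

large values of M indicate that the plot locally stretches the data manifold, which is what makes cluster boundaries visible. The exact expressions for latent trait models are the ones derived in the paper itself; the formula above is quoted only as the standard GTM analogue.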

Relevance: 10.00%

Publisher:

Abstract:

Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research work concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigations using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems. Fixed, dynamic and hybrid channel assignments are considered. The formulas agree with previously published simulation results. Simulation programs are described for the evaluation of the speech traffic of mobiles and for the investigation of a possible computer network for the control of the speech traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels performing various control functions, which include the setting-up and clearing-down of calls, the hand-over of calls between cells and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers interconnected via voice-grade data channels would be capable of providing satisfactory control.
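
For context, the classical single-cell Erlang B blocking probability, which the modified formulas extend to mixed land-to-mobile and mobile-to-mobile traffic, can be computed with the standard stable recursion; the traffic and channel values below are illustrative.

```python
# Erlang B via the recursion B(0) = 1, B(n) = A*B(n-1) / (n + A*B(n-1)),
# where A is the offered traffic in erlangs and n the number of channels.
def erlang_b(traffic_erlangs, channels):
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic_erlangs * b / (n + traffic_erlangs * b)
    return b

blocking = erlang_b(traffic_erlangs=10.0, channels=15)   # ~0.037 blocking probability
```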

Relevance: 10.00%

Publisher:

Abstract:

The research carried out in this thesis was mainly concerned with the effects of large induction motors and their transient performance in power systems. Computer packages using the three-phase co-ordinate frame of reference were developed to simulate induction motor transient performance. A technique using matrix algebra was developed to allow the three-phase co-ordinate method to be extended to analyse asymmetrical and symmetrical faults on both sides of the three-phase delta-star transformer which is usually required when connecting large induction motors to the supply system. System simulation, applying these two techniques, was used to study the transient stability of a power system. The response of a typical system, loaded with a group of large induction motors, two three-phase delta-star transformers, a synchronous generator and an infinite system, was analysed. The computer software developed to study this system has the advantage that different types of fault at different locations can be studied by simple changes in the input data. The research also investigated the possibility of using different integration routines, such as the Runge-Kutta-Gill, Runge-Kutta-Fehlberg and predictor-corrector methods. This investigation enabled the computation time to be reduced, which is necessary when solving the induction motor equations expressed in terms of the three-phase variables. The outcome of this investigation was utilised in analysing an introductory model (containing only minimal control action) of an isolated system having a significant induction motor load compared to the size of the generator energising the system.
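
A minimal sketch of a predictor-corrector (PECE) step of the kind compared against the Runge-Kutta variants is shown below: a two-step Adams-Bashforth predictor followed by a trapezoidal Adams-Moulton corrector, applied to a simple test equation rather than the machine model.

```python
# Second-order Adams-Bashforth-Moulton PECE step for y' = f(t, y).
def abm2_step(f, t, y, f_prev, h):
    f_n = f(t, y)
    y_pred = y + h * (1.5 * f_n - 0.5 * f_prev)        # AB2 predictor
    y_corr = y + 0.5 * h * (f_n + f(t + h, y_pred))    # AM2 (trapezoidal) corrector
    return y_corr, f_n

f = lambda t, y: -2.0 * y                              # simple test ODE
t, h, y = 0.0, 0.01, 1.0
f_prev = f(t - h, y)                                   # crude start-up value
for _ in range(100):
    y, f_prev = abm2_step(f, t, y, f_prev, h)
    t += h
```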

Relevance: 10.00%

Publisher:

Abstract:

We assess the accuracy of the Visante anterior segment optical coherence tomographer (AS-OCT) and present improved formulas for the measurement of surface curvature and axial separation. Measurements are made in physical model eyes. Accuracy is compared for measurements of corneal thickness (d1) and anterior chamber depth (d2) using built-in AS-OCT software versus the improved scheme. The improved scheme enables measurements of lens thickness (d3) and surface curvature, in the form of conic sections specified by vertex radii and conic constants. These parameters are converted to surface coordinates for error analysis. The built-in AS-OCT software typically overestimates (mean ± standard deviation, SD) d1 by +62 ± 4 μm and d2 by +4 ± 88 μm. The improved scheme reduces the d1 (-0.4 ± 4 μm) and d2 (0 ± 49 μm) errors while also reducing the d3 errors from +218 ± 90 μm (uncorrected) to +14 ± 123 μm (corrected). Surface x-coordinate errors gradually increase toward the periphery. Considering the central 6-mm zone of each surface, the x-coordinate errors for the anterior and posterior corneal surfaces reached +3 ± 10 and 0 ± 23 μm, respectively, with the improved scheme; those of the anterior and posterior lens surfaces reached +2 ± 22 and +11 ± 71 μm, respectively. Our improved scheme reduced AS-OCT errors and could therefore enhance pre- and postoperative assessments of keratorefractive or cataract surgery, including measurement of accommodating intraocular lenses. © 2007 Society of Photo-Optical Instrumentation Engineers.
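
For reference, the conic-section surface description used for the curvature fits reduces to the standard sag equation, sketched below with illustrative vertex radius and conic constant values rather than the study's fitted parameters.

```python
# Conic sag: z(x) = x^2 / (R + sqrt(R^2 - (1 + Q) x^2)), with vertex radius R
# and conic constant Q, giving surface coordinates for error analysis.
import numpy as np

def conic_sag(x, vertex_radius, conic_constant):
    R, Q = vertex_radius, conic_constant
    return x ** 2 / (R + np.sqrt(R ** 2 - (1.0 + Q) * x ** 2))

x = np.linspace(-3e-3, 3e-3, 61)            # central 6-mm zone, in metres
z_anterior_cornea = conic_sag(x, vertex_radius=7.8e-3, conic_constant=-0.2)
```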

Relevance: 10.00%

Publisher:

Abstract:

The generating functional method is employed to investigate the synchronous dynamics of Boolean networks, providing an exact result for the system dynamics via a set of macroscopic order parameters. The topology of the networks studied and their constituent Boolean functions represent the system's quenched disorder and are sampled from a given distribution. The framework accommodates a variety of topologies and Boolean function distributions and can be used to study both the noisy and noiseless regimes; it enables one to calculate correlation functions at different times that are inaccessible via commonly used approximations. It is also used to determine conditions for the annealed approximation to be valid, to explore phases of the system under different levels of noise, and to obtain results for models with strong memory effects, where existing approximations break down. Links between Boolean networks and general Boolean formulas are identified and results common to both system types are highlighted. © 2012 Copyright Taylor and Francis Group, LLC.
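
To fix ideas, a minimal sketch of the synchronous Boolean network dynamics being analysed is given below: each node has a fixed random in-neighbourhood and a fixed random Boolean function (the quenched disorder), and all nodes update in parallel at every time step. The network size, in-degree and noiseless update rule are illustrative choices.

```python
# Synchronous update of a random Boolean network with quenched disorder.
import numpy as np

rng = np.random.default_rng(0)
N, K, T = 100, 3, 50                               # nodes, in-degree, time steps

inputs = rng.integers(0, N, size=(N, K))           # quenched random topology
tables = rng.integers(0, 2, size=(N, 2 ** K))      # quenched random Boolean functions
state = rng.integers(0, 2, size=N)

powers = 2 ** np.arange(K)
for _ in range(T):
    idx = state[inputs] @ powers                   # encode each node's K inputs
    state = tables[np.arange(N), idx]              # synchronous (parallel) update
```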