Abstract:
In this paper we present the composite Euler method for the strong solution of stochastic differential equations driven by d-dimensional Wiener processes. This method is a combination of the semi-implicit Euler method and the implicit Euler method. At each step either the semi-implicit Euler method or the implicit Euler method is used in order to obtain better stability properties. We give criteria for selecting the semi-implicit Euler method or the implicit Euler method. For the linear test equation, the convergence properties of the composite Euler method depend on the criteria for selecting the methods. Numerical results suggest that the convergence properties of the composite Euler method applied to nonlinear SDEs are the same as those applied to linear equations. The stability properties of the composite Euler method are shown to be far superior to those of the Euler methods, and numerical results show that the composite Euler method is a very promising method. (C) 2001 Elsevier Science B.V. All rights reserved.
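For the scalar linear test equation dX = lam*X dt + mu*X dW, the two ingredient schemes and the step-by-step switching idea can be sketched as follows. The switching rule used here (keep the fully implicit step's divisor safely away from zero) is an illustrative assumption, not one of the paper's criteria:

```python
import numpy as np

def composite_euler_linear(lam, mu, x0, T, n_steps, seed=0):
    """Composite Euler sketch for the scalar linear test SDE
    dX = lam*X dt + mu*X dW.  At each step we take either the
    semi-implicit (drift-implicit) Euler step or the fully implicit
    Euler step; the selection rule below is a hypothetical stand-in
    for the paper's criteria."""
    rng = np.random.default_rng(seed)
    h = T / n_steps
    x = x0
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(h))
        implicit_div = 1.0 - lam * h - mu * dW       # divisor of the implicit step
        if abs(implicit_div) > 0.5:                  # hypothetical switching rule
            x = x / implicit_div                     # fully implicit Euler step
        else:
            x = (x + mu * x * dW) / (1.0 - lam * h)  # semi-implicit Euler step
    return x

# usage: a stiff, strongly decaying linear SDE
x_end = composite_euler_linear(lam=-50.0, mu=0.1, x0=1.0, T=1.0, n_steps=200)
```

Note that for mu = 0 both branches reduce to the deterministic implicit Euler update x/(1 - lam*h), whose A-stability is the kind of stability property the composite method is designed to retain.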
Abstract:
This article discusses the design of a comprehensive evaluation of a community development programme for young people 'at-risk' of self-harming behaviour. It outlines considerations in the design of the evaluation and focuses on the complexities and difficulties associated with the evaluation of a community development programme. The challenge was to fulfil the needs of the funding body for a broad, outcome-focused evaluation while remaining close enough to the programme to accurately represent its activities and potential effects at a community level. Specifically, the strengths and limitations of a mixed-method evaluation plan are discussed with recommendations for future evaluation practice.
Abstract:
A new wavelet-based method for solving population balance equations with simultaneous nucleation, growth and agglomeration is proposed, which uses wavelets to express the functions. The technique is very general, powerful and overcomes the crucial problems of numerical diffusion and stability that often characterize previous techniques in this area. It is also applicable to an arbitrary grid to control resolution and computational efficiency. The proposed technique has been tested for pure agglomeration, simultaneous nucleation and growth, and simultaneous growth and agglomeration. In all cases, the predicted and analytical particle size distributions are in excellent agreement. The presence of moving sharp fronts can be addressed without the prior investigation of the characteristics of the processes. (C) 2001 Published by Elsevier Science Ltd.
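The pure-agglomeration test case mentioned above has well-known closed-form behaviour for a constant kernel, which makes it a convenient benchmark. Below is a plain sectional (finite-bin) explicit-Euler discretization of the discrete Smoluchowski equation, a deliberately simple stand-in for such a benchmark and not the paper's wavelet scheme:

```python
import numpy as np

def smoluchowski_constant_kernel(n0, beta, t_end, dt, n_bins):
    """Explicit-Euler sketch of the discrete Smoluchowski agglomeration
    equation with a constant kernel beta.  This is a plain sectional
    discretization used as a benchmark stand-in, not the wavelet
    technique of the paper.  n[k] is the number density of
    size-(k+1) particles; the initial condition is monodisperse."""
    n = np.zeros(n_bins)
    n[0] = n0
    steps = int(round(t_end / dt))
    for _ in range(steps):
        birth = np.zeros(n_bins)
        for k in range(1, n_bins):
            # size k+1 is formed from sizes i+1 and k-i, i.e. i + j = k - 1
            birth[k] = 0.5 * beta * np.dot(n[:k], n[k - 1 :: -1])
        death = beta * n * n.sum()
        n = n + dt * (birth - death)
    return n

n = smoluchowski_constant_kernel(n0=1.0, beta=1.0, t_end=1.0, dt=1e-3, n_bins=64)
```

For a constant kernel the analytical total number is N(t) = n0 / (1 + beta*n0*t/2), i.e. 2/3 at t = 1 for these parameters, which the bin totals reproduce closely.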
Abstract:
A flow tagging technique based upon ionic fluorescence in strontium is investigated for applications to velocity measurements in gas flows. The method is based upon a combination of two laser-based spectroscopic techniques, i.e. resonantly-enhanced ionisation and laser-induced ionic fluorescence. Strontium is first ionised and then planar laser-induced fluorescence is utilised to give 2D 'bright images' of the ionised region of the flow at a given time delay. The results show that this method can be used for velocity measurements. The velocities were measured in two types of air-acetylene flames - a slot burner and a circular burner - yielding velocities of 5.1 +/- 0.1 m/s and 9.3 +/- 0.2 m/s, respectively. The feasibility of the method for the determination of velocities in faster flows than those investigated here is discussed.
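The velocity extraction itself reduces to displacement over delay; a minimal sketch (the numbers below are hypothetical, chosen only to mirror the scale of the slot-burner result):

```python
def tag_velocity(displacement_m, delay_s):
    """Flow-tagging velocimetry reduces to displacement over delay:
    the tagged 'bright image' moves with the flow between the write
    (ionisation) pulse and the read (fluorescence) pulse."""
    return displacement_m / delay_s

# hypothetical numbers: a tagged region moving 0.51 mm in 100 us
v = tag_velocity(displacement_m=0.51e-3, delay_s=100e-6)  # 5.1 m/s
```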
Abstract:
We describe the progress towards developing a patient-rated toxicity index that meets all of the patient-important attributes defined by the OMERACT Drug Safety Working Party. These attributes are frequency, severity, importance to the patient, importance to the clinician, impact on economics, impact on activities, and integration of adverse effects with benefits. The Stanford Toxicity Index (STI) has been revised to collect all attributes with the exception of impact on activities. However, since the STI is a part of the Health Assessment Questionnaire (HAQ), impact on activities is collected by the HAQ. In particular, a new question asks patients to rate overall satisfaction, taking into consideration both benefits and adverse effects. The next step in the development of this tool is to ensure that the STI meets the OMERACT filter of truth, discrimination, and feasibility. Although truth and feasibility have been confirmed by comparisons within the ARAMIS database, discrimination needs to be assessed in clinical trials.
Abstract:
The application of the N-1-(4,4-dimethyl-2,6-dioxocyclohexylidene)ethyl (Dde) linker for the solid-phase synthesis of oligosaccharides is described. The oligosaccharide products can be cleaved from the resin by hydrazine, ammonia or primary amines, but the linker is stable under the conditions of oligosaccharide synthesis. The first sugar can be attached to the resin linker via a vinylogous amide bond, or by ether linkage using a p-aminobenzyl alcohol converter. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
We develop a new iterative filter diagonalization (FD) scheme based on Lanczos subspaces and demonstrate its application to the calculation of bound-state and resonance eigenvalues. The new scheme combines the Lanczos three-term vector recursion for the generation of a tridiagonal representation of the Hamiltonian with a three-term scalar recursion to generate filtered states within the Lanczos representation. Eigenstates in the energy windows of interest can then be obtained by solving a small generalized eigenvalue problem in the subspace spanned by the filtered states. The scalar filtering recursion is based on the homogeneous eigenvalue equation of the tridiagonal representation of the Hamiltonian, and is simpler and more efficient than our previous quasi-minimum-residual filter diagonalization (QMRFD) scheme (H. G. Yu and S. C. Smith, Chem. Phys. Lett., 1998, 283, 69), which was based on solving for the action of the Green operator via an inhomogeneous equation. A low-storage method for the construction of Hamiltonian and overlap matrix elements in the filtered-basis representation is devised, in which contributions to the matrix elements are computed simultaneously as the recursion proceeds, allowing coefficients of the filtered states to be discarded once their contribution has been evaluated. Application to the HO2 system shows that the new scheme is highly efficient and can generate eigenvalues with the same numerical accuracy as the basic Lanczos algorithm.
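The first ingredient, the three-term Lanczos recursion that builds a tridiagonal representation of the Hamiltonian, can be sketched as follows. The paper's scalar filtering recursion is not reproduced here; in this small demo, eigenvalues in a window are simply read off the tridiagonal matrix, and full reorthogonalization is added for numerical robustness:

```python
import numpy as np

def lanczos_tridiag(H, v0, m):
    """Three-term Lanczos recursion: returns (alpha, beta) defining an
    m x m tridiagonal representation of the symmetric matrix H."""
    n = len(v0)
    V = np.zeros((m, n))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    v = v0 / np.linalg.norm(v0)
    v_prev = np.zeros(n)
    b = 0.0
    for j in range(m):
        V[j] = v
        w = H @ v - b * v_prev
        alpha[j] = v @ w
        w -= alpha[j] * v
        # full reorthogonalization keeps this small demo numerically honest
        w -= V[: j + 1].T @ (V[: j + 1] @ w)
        if j < m - 1:
            b = np.linalg.norm(w)
            beta[j] = b
            v_prev, v = v, w / b
    return alpha, beta

def window_eigenvalues(alpha, beta, lo, hi):
    """Eigenvalues of the tridiagonal matrix lying in the window [lo, hi]."""
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    evals = np.linalg.eigvalsh(T)
    return evals[(evals >= lo) & (evals <= hi)]

# usage: Ritz values of a random symmetric 'Hamiltonian' in a window
rng = np.random.default_rng(1)
A = rng.normal(size=(200, 200))
H = (A + A.T) / 2
alpha, beta = lanczos_tridiag(H, rng.normal(size=200), m=120)
low_evals = window_eigenvalues(alpha, beta, -5.0, 5.0)
```

Extremal eigenvalues converge quickly in the Lanczos subspace; interior window eigenvalues generally need the filtering machinery the paper describes to converge as reliably.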
Abstract:
Traditional gentamicin dosing every 8–24 h depending on age and weight in neonates does not provide the ideal concentration–time profile to both optimize the concentration-dependent killing by aminoglycosides and minimize toxicity. Fifty-three neonates were audited prospectively while receiving gentamicin 2.5 mg/kg every 8–24 h, aiming for peak concentrations (Cmax) of 6–10 mg/L and trough concentrations (Cmin) 10 mg/L after the first dose. The mean area under the concentration versus time curve AUC0–24 was 93 mg•h/L (target = 100 mg•h/L). The extended interval dosing achieved higher Cmax values while ensuring that overall exposure per 24 h was acceptable. Prospective testing of the method demonstrated good predictive ability.
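The exposure targets above can be reasoned about with a one-compartment, IV-bolus model at steady state. This is an illustrative simplification, not the audit's population method, and the parameter values below are hypothetical per-kilogram figures:

```python
import numpy as np

def gentamicin_profile(dose_mg, interval_h, vd_L, cl_L_per_h):
    """One-compartment IV-bolus sketch (an illustrative model, not the
    audit's method): steady-state peak, trough, and daily AUC for a
    given dose and dosing interval."""
    k = cl_L_per_h / vd_L                                    # elimination rate constant
    cmax = (dose_mg / vd_L) / (1 - np.exp(-k * interval_h))  # steady-state peak
    cmin = cmax * np.exp(-k * interval_h)                    # steady-state trough
    auc24 = (dose_mg * 24 / interval_h) / cl_L_per_h         # daily AUC = daily dose / CL
    return cmax, cmin, auc24

# hypothetical neonate, per kg: Vd 0.5 L, CL 0.05 L/h, 2.5 mg every 12 h
cmax, cmin, auc = gentamicin_profile(dose_mg=2.5, interval_h=12,
                                     vd_L=0.5, cl_L_per_h=0.05)
```

Note that the daily AUC depends only on the daily dose and clearance, which is why an extended interval can raise Cmax while leaving total 24-h exposure unchanged.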
Abstract:
A multidisciplinary collaborative study examining cognition in a large sample of twins is outlined. A common experimental protocol and design is used in The Netherlands, Australia and Japan to measure cognitive ability using traditional IQ measures (i.e., psychometric IQ), processing speed (e.g., reaction time [RT] and inspection time [IT]), and working memory (e.g., spatial span, delayed response [DR] performance). The main aim is to investigate the genetic covariation among these cognitive phenotypes in order to use the correlated biological markers in future linkage and association analyses to detect quantitative trait loci (QTLs). We outline the study and methodology, and report results from our preliminary analyses that examine the heritability of processing speed and working memory indices, and their phenotypic correlation with IQ. Heritability of Full Scale IQ was 87% in the Netherlands, 83% in Australia, and 71% in Japan. Heritability estimates for processing speed and working memory indices ranged from 33–64%. Associations of IQ with RT and IT (−0.28 to −0.36) replicated previous findings with those of higher cognitive ability showing faster speed of processing. Similarly, significant correlations were indicated between IQ and the spatial span working memory task (storage [0.31], executive processing [0.37]) and the DR working memory task (0.25), with those of higher cognitive ability showing better memory performance. These analyses establish the heritability of the processing speed and working memory measures to be used in our collaborative twin study of cognition, and support the findings that individual differences in processing speed and working memory may underlie individual differences in psychometric IQ.
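Twin-based heritability estimates like those quoted can be illustrated with the classical Falconer decomposition from MZ and DZ twin correlations. This is a back-of-envelope device, not the structural-equation model fitting such studies normally report, and the correlations below are hypothetical, chosen to land near the reported 87%:

```python
def falconer_estimates(r_mz, r_dz):
    """Classical Falconer decomposition from twin correlations:
      A (additive genetic, 'heritability') = 2 * (r_mz - r_dz)
      C (shared environment)               = 2 * r_dz - r_mz
      E (unique environment)               = 1 - r_mz
    A simple approximation, not the model fitting used in practice."""
    a = 2 * (r_mz - r_dz)
    c = 2 * r_dz - r_mz
    e = 1 - r_mz
    return a, c, e

# hypothetical twin correlations giving heritability near 0.87
a, c, e = falconer_estimates(r_mz=0.87, r_dz=0.435)
```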
Abstract:
My purpose here is to put forward a conception of genre as a way to conduct Futures Studies. To demonstrate the method, I present some examples of contemporary political and corporate discourses and contextualise them in broader institutional and historical settings. I elaborate the method further by giving examples of ‘genre chaining’ and ‘genre hybridity’ (Fairclough 1992, 2000) to show how past, present, and future change can be viewed through the lens of genre.
Abstract:
Understanding the genetic architecture of quantitative traits can greatly assist the design of strategies for their manipulation in plant-breeding programs. For a number of traits, genetic variation can be the result of segregation of a few major genes and many polygenes (minor genes). The joint segregation analysis (JSA) is a maximum-likelihood approach for fitting segregation models through the simultaneous use of phenotypic information from multiple generations. Our objective in this paper was to use computer simulation to quantify the power of the JSA method for testing the mixed-inheritance model for quantitative traits when it was applied to the six basic generations: both parents (P-1 and P-2), F-1, F-2, and both backcross generations (B-1 and B-2) derived from crossing the F-1 to each parent. A total of 1968 genetic model-experiment scenarios were considered in the simulation study to quantify the power of the method. Factors that interacted to influence the power of the JSA method to correctly detect genetic models were: (1) whether there were one or two major genes in combination with polygenes, (2) the heritability of the major genes and polygenes, (3) the level of dispersion of the major genes and polygenes between the two parents, and (4) the number of individuals examined in each generation (population size). The greatest levels of power were observed for the genetic models defined with simple inheritance; e.g., the power was greater than 90% for the one major-gene model, regardless of the population size and major-gene heritability. Lower levels of power were observed for the genetic models with complex inheritance (major genes and polygenes), low heritability, small population sizes and a large dispersion of favourable genes between the two parents; e.g., the power was less than 5% for the two major-gene model with a heritability value of 0.3 and population sizes of 100 individuals.
The JSA methodology was then applied to a previously studied sorghum data-set to investigate the genetic control of the putative drought-resistance trait osmotic adjustment in three crosses. The previous study concluded that there were two major genes segregating for osmotic adjustment in the three crosses. Application of the JSA method resulted in a change in the proposed genetic model. The presence of the two major genes was confirmed with the addition of an unspecified number of polygenes.