955 results for Random Subspace Method
Abstract:
This article discusses the design of a comprehensive evaluation of a community development programme for young people 'at-risk' of self-harming behaviour. It outlines considerations in the design of the evaluation and focuses on the complexities and difficulties associated with the evaluation of a community development programme. The challenge was to fulfil the needs of the funding body for a broad, outcome-focused evaluation while remaining close enough to the programme to accurately represent its activities and potential effects at a community level. Specifically, the strengths and limitations of a mixed-method evaluation plan are discussed with recommendations for future evaluation practice.
Abstract:
A new wavelet-based method for solving population balance equations with simultaneous nucleation, growth and agglomeration is proposed, which uses wavelets to express the functions. The technique is very general, powerful and overcomes the crucial problems of numerical diffusion and stability that often characterize previous techniques in this area. It is also applicable to an arbitrary grid to control resolution and computational efficiency. The proposed technique has been tested for pure agglomeration, simultaneous nucleation and growth, and simultaneous growth and agglomeration. In all cases, the predicted and analytical particle size distributions are in excellent agreement. The presence of moving sharp fronts can be addressed without the prior investigation of the characteristics of the processes. (C) 2001 Published by Elsevier Science Ltd.
Abstract:
Objective. The diagnostic value of tests for antimyeloperoxidase antibodies (anti-MPO) for systemic vasculitis is less established than that for cytoplasmic antineutrophil cytoplasmic antibody (cANCA)/antiproteinase 3 antibodies (anti-PR3). Controversy exists regarding the optimal utilization of indirect immunofluorescence (IIF) ANCA testing versus antigen-specific ANCA testing. To summarize the pertinent data, we conducted a metaanalysis examining the diagnostic value of ANCA testing systems that include assays for anti-MPO. Methods. We performed a structured Medline search and reference list review. Target articles in the search strategy were those reporting the diagnostic value of immunoassays for anti-MPO for the spectrum of systemic necrotizing vasculitides that includes Wegener's granulomatosis, microscopic polyangiitis, the Churg-Strauss syndrome, and isolated pauci-immune necrotizing or crescentic glomerulonephritis, regardless of other types of ANCA tests. Inclusion criteria required specification of a consecutive or random patient selection method and the use of acceptable criteria for the diagnosis of vasculitis exclusive of ANCA test results. Weighted pooled summary estimates of sensitivity and specificity were calculated for anti-MPO alone, anti-MPO + perinuclear ANCA (pANCA), and anti-MPO/pANCA + anti-PR3/cANCA. Results. Of 457 articles reviewed, only 7 met the selection criteria. Summary estimates of sensitivity and specificity (against disease controls only) of assays for anti-MPO for the diagnosis of systemic necrotizing vasculitides were 37.1% (confidence interval 26.6% to 47.6%) and 96.3% (CI 94.1% to 98.5%), respectively. When the pANCA pattern by IIF was combined with anti-MPO testing, the specificity improved to 99.4%, with a lower sensitivity, 31.5%. The combined ANCA testing system (anti-PR3/cANCA + anti-MPO/pANCA) increased the sensitivity to 85.5% with a specificity of 98.6%. Conclusion. 
These results suggest that while anti-MPO is relatively specific for the diagnosis of systemic vasculitis, the combination system of immunoassays for anti-MPO and IIF for pANCA is highly specific and both tests should be used together given the high diagnostic precision required for these conditions. Because patients with ANCA associated vasculitis have either anti-MPO with pANCA or anti-PR3 with cANCA, and rarely both, a combined ANCA testing system including anti-PR3/cANCA and anti-MPO/pANCA is recommended to optimize the diagnostic performance of ANCA testing. (J Rheumatol 2001;28:1584-90)
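The weighted pooling of per-study sensitivity and specificity described above can be sketched as follows. The per-study counts below are purely hypothetical illustrations, not the data from the seven included studies, and sample-size weighting is only one common weighting choice (inverse-variance weighting is another).

```python
import numpy as np

# Hypothetical per-study data (NOT the values from the metaanalysis above):
# vasculitis patients testing positive for anti-MPO (tp) out of n_dis,
# and disease controls testing negative (tn) out of n_ctrl.
tp     = np.array([12, 30, 9])
n_dis  = np.array([40, 75, 25])
tn     = np.array([95, 180, 60])
n_ctrl = np.array([100, 185, 62])

# Sample-size-weighted pooled estimates: pooling the raw counts is
# equivalent to weighting each study's proportion by its sample size.
pooled_sens = tp.sum() / n_dis.sum()
pooled_spec = tn.sum() / n_ctrl.sum()

def wald_ci(p, n):
    """Approximate 95% confidence interval (normal approximation)."""
    se = np.sqrt(p * (1 - p) / n)
    return p - 1.96 * se, p + 1.96 * se

print(pooled_sens, wald_ci(pooled_sens, n_dis.sum()))
print(pooled_spec, wald_ci(pooled_spec, n_ctrl.sum()))
```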
Abstract:
A flow tagging technique based upon ionic fluorescence in strontium is investigated for application to velocity measurements in gas flows. The method is based upon a combination of two laser-based spectroscopic techniques, i.e. resonantly enhanced ionisation and laser-induced ionic fluorescence. Strontium is first ionised, and planar laser-induced fluorescence is then utilised to give 2D 'bright images' of the ionised region of the flow at a given time delay. The results show that this method can be used for velocity measurements. Velocities were measured in two types of air-acetylene flames, a slot burner and a circular burner, yielding velocities of 5.1 +/- 0.1 m/s and 9.3 +/- 0.2 m/s, respectively. The feasibility of the method for determining velocities in flows faster than those investigated here is discussed.
Abstract:
We describe the progress towards developing a patient-rated toxicity index that meets all of the patient-important attributes defined by the OMERACT Drug Safety Working Party. These attributes are frequency, severity, importance to the patient, importance to the clinician, impact on economics, impact on activities, and integration of adverse effects with benefits. The Stanford Toxicity Index (STI) has been revised to collect all attributes with the exception of impact on activities. However, since the STI is a part of the Health Assessment Questionnaire (HAQ), impact on activities is collected by the HAQ. In particular, a new question asks patients to rate overall satisfaction, taking into consideration both benefits and adverse effects. The next step in the development of this tool is to ensure that the STI meets the OMERACT filter of truth, discrimination, and feasibility. Although truth and feasibility have been confirmed by comparisons within the ARAMIS database, discrimination needs to be assessed in clinical trials.
Abstract:
Activated sludge models are used extensively in the study of wastewater treatment processes. While various commercial implementations of these models are available, many people need to code the models themselves using the simulation packages available to them. Quality assurance of such models is difficult. While benchmarking problems have been developed and are available, comparing simulation data with that of commercial models leads only to the detection, not the isolation, of errors. Identifying the errors in the code is time-consuming. In this paper, we address the problem by developing a systematic and largely automated approach to the isolation of coding errors. There are three steps: firstly, possible errors are classified according to their place in the model structure and a feature matrix is established for each class of errors. Secondly, an observer is designed to generate residuals, such that each class of errors imposes a subspace, spanned by its feature matrix, on the residuals. Finally, localising the residuals in a subspace isolates the coding errors. The algorithm proved capable of rapidly and reliably isolating a variety of single and simultaneous errors in a case study using the ASM 1 activated sludge model. In this paper a newly coded model was verified against a known implementation. The method is also applicable to the simultaneous verification of any two independent implementations, and hence is useful in commercial model development.
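The subspace-localisation step in the three-step scheme above can be sketched numerically: given a residual vector and one feature matrix per error class, attribute the residual to the class whose subspace explains it best. All names, dimensions, and matrices below are illustrative placeholders, not taken from the ASM 1 case study.

```python
import numpy as np

# Hypothetical feature matrices: the columns of each span the residual
# subspace that one class of coding error would impose (names illustrative).
rng = np.random.default_rng(0)
F = {
    "kinetic_rate_error":  rng.standard_normal((6, 2)),
    "stoichiometry_error": rng.standard_normal((6, 2)),
    "transfer_term_error": rng.standard_normal((6, 2)),
}

def isolate_error(residual, feature_matrices):
    """Return the error class whose subspace best explains the residual,
    i.e. minimises the norm of the component orthogonal to span(F_i)."""
    scores = {}
    for name, Fi in feature_matrices.items():
        # Least-squares projection of the residual onto span(Fi)
        coeffs, *_ = np.linalg.lstsq(Fi, residual, rcond=None)
        scores[name] = np.linalg.norm(residual - Fi @ coeffs)
    return min(scores, key=scores.get)

# A residual generated by a fault lying in the "stoichiometry" subspace
r = F["stoichiometry_error"] @ np.array([1.5, -0.7])
print(isolate_error(r, F))
```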
Abstract:
The application of the N-1-(4,4-dimethyl-2,6-dioxocyclohexylidene)ethyl (Dde) linker for the solid-phase synthesis of oligosaccharides is described. The oligosaccharide products can be cleaved from the resin by hydrazine, ammonia or primary amines, but the linker is stable under the conditions of oligosaccharide synthesis. The first sugar can be attached to the resin linker via a vinylogous amide bond, or by ether linkage using a p-aminobenzyl alcohol converter. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
We compare the performance of two different low-storage filter diagonalisation (LSFD) strategies in the calculation of complex resonance energies of the HO2 radical. The first is carried out within a complex-symmetric Lanczos subspace representation [H. Zhang, S.C. Smith, Phys. Chem. Chem. Phys. 3 (2001) 2281]. The second involves harmonic inversion of a real autocorrelation function obtained via a damped Chebychev recursion [V.A. Mandelshtam, H.S. Taylor, J. Chem. Phys. 107 (1997) 6756]. We find that while the Chebychev approach has the advantage of utilizing real algebra in the time-consuming process of generating the vector recursion, the Lanczos method (using complex vectors) requires fewer iterations, especially for the low-energy part of the spectrum. The overall efficiency in calculating resonances is comparable for the two methods on this challenging system. (C) 2001 Elsevier Science B.V. All rights reserved.
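A minimal sketch of the Lanczos recursion underlying the first strategy, assuming a real symmetric Hamiltonian matrix rather than the complex-symmetric representation used in the article. Each iteration needs only one matrix-vector product and two stored vectors, which is the low-storage property the abstract refers to.

```python
import numpy as np

def lanczos(H, v0, m):
    """Standard Hermitian Lanczos recursion (sketch; the article uses a
    complex-symmetric variant). Returns the m x m tridiagonal projection
    of H, whose eigenvalues (Ritz values) approximate eigenvalues of H."""
    n = H.shape[0]
    alpha = np.zeros(m)        # diagonal of the tridiagonal matrix
    beta = np.zeros(m - 1)     # off-diagonal
    v = v0 / np.linalg.norm(v0)
    v_prev = np.zeros(n)
    for j in range(m):
        w = H @ v
        alpha[j] = np.real(np.vdot(v, w))
        w = w - alpha[j] * v - (beta[j - 1] * v_prev if j > 0 else 0)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            v_prev, v = v, w / beta[j]
    return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

# Demo: extremal Ritz values converge quickly to extremal eigenvalues.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 50))
H = (A + A.T) / 2
T = lanczos(H, rng.standard_normal(50), 25)
ritz = np.linalg.eigvalsh(T)
```

Without reorthogonalisation, long recursions lose orthogonality and produce spurious copies of converged eigenvalues; practical implementations (including the LSFD schemes compared above) must account for this.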
Abstract:
Time-dependent wavepacket evolution techniques demand the action of the propagator, exp(-iHt/ℏ), on a suitable initial wavepacket. When a complex absorbing potential is added to the Hamiltonian to combat unwanted reflection effects, polynomial expansions of the propagator are selected on their ability to cope with non-Hermiticity. An efficient subspace implementation of the Newton polynomial expansion scheme has been devised that requires fewer dense matrix-vector multiplications than its grid-based counterpart. Performance improvements are illustrated with some benchmark one- and two-dimensional examples. (C) 2001 Elsevier Science B.V. All rights reserved.
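A minimal sketch of a Newton polynomial expansion of the propagator (not the article's subspace implementation), assuming a Hermitian H with a real spectrum in natural units where ℏ = 1. Handling a complex absorbing potential, as in the article, would require interpolation nodes chosen in the complex plane rather than the real Chebyshev points used here.

```python
import numpy as np

def newton_propagator_apply(H, v, t, n_nodes=16):
    """Apply exp(-i*H*t) to v via a Newton interpolating polynomial in H.
    One matrix-vector product per expansion term."""
    evals = np.linalg.eigvalsh(H)          # cheap here; estimated in practice
    a, b = evals.min(), evals.max()
    k = np.arange(n_nodes)
    # Chebyshev points on [a, b] as interpolation nodes
    nodes = 0.5 * (a + b) + 0.5 * (b - a) * np.cos((2 * k + 1) * np.pi / (2 * n_nodes))
    # Divided differences of f(z) = exp(-i z t)
    dd = np.exp(-1j * nodes * t).astype(complex)
    for j in range(1, n_nodes):
        dd[j:] = (dd[j:] - dd[j - 1:-1]) / (nodes[j:] - nodes[:-j])
    # Accumulate result = sum_k dd[k] * prod_{j<k}(H - z_j I) v
    w = v.astype(complex)
    result = dd[0] * w
    for kk in range(1, n_nodes):
        w = H @ w - nodes[kk - 1] * w
        result += dd[kk] * w
    return result

# Demo: compare against exact diagonalisation on a small Hermitian matrix.
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 6))
H = (A + A.T) / 2
v = rng.standard_normal(6)
approx = newton_propagator_apply(H, v, t=0.5)
E, U = np.linalg.eigh(H)
exact = U @ (np.exp(-1j * E * 0.5) * (U.conj().T @ v))
```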
Abstract:
Traditional gentamicin dosing every 8–24 h depending on age and weight in neonates does not provide the ideal concentration–time profile to both optimize the concentration-dependent killing by aminoglycosides and minimize toxicity. Fifty-three neonates were audited prospectively while receiving gentamicin 2.5 mg/kg every 8–24 h, aiming for peak concentrations (Cmax) of 6–10 mg/L and trough concentrations (Cmin) 10 mg/L after the first dose. The mean area under the concentration versus time curve (AUC0–24) was 93 mg·h/L (target = 100 mg·h/L). The extended interval dosing achieved higher Cmax values while ensuring that overall exposure per 24 h was acceptable. Prospective testing of the method demonstrated good predictive ability.
Abstract:
Hereditary nonpolyposis colorectal cancer syndrome (HNPCC) is an autosomal dominant condition accounting for 2–5% of all colorectal carcinomas as well as a small subset of endometrial, upper urinary tract and other gastrointestinal cancers. An assay to detect the underlying defect in HNPCC, inactivation of a DNA mismatch repair enzyme, would be useful in identifying HNPCC probands. Monoclonal antibodies against hMLH1 and hMSH2, two DNA mismatch repair proteins which account for most HNPCC cancers, are commercially available. This study sought to investigate the potential utility of these antibodies in determining the expression status of these proteins in paraffin-embedded formalin-fixed tissue and to identify key technical protocol components associated with successful staining. A set of 20 colorectal carcinoma cases of known hMLH1 and hMSH2 mutation and expression status underwent immunoperoxidase staining at multiple institutions, each of which used their own technical protocol. Staining for hMSH2 was successful in most laboratories while staining for hMLH1 proved problematic in multiple labs. However, a significant minority of laboratories demonstrated excellent results including high discriminatory power with both monoclonal antibodies. These laboratories appropriately identified hMLH1 or hMSH2 inactivation with high sensitivity and specificity. The key protocol point associated with successful staining was an antigen retrieval step involving heat treatment and either EDTA or citrate buffer. This study demonstrates the potential utility of immunohistochemistry in detecting HNPCC probands and identifies key technical components for successful staining.
Abstract:
My purpose here is to put forward a conception of genre as a way to conduct Futures Studies. To demonstrate the method, I present some examples of contemporary political and corporate discourses and contextualise them in broader institutional and historical settings. I elaborate the method further by giving examples of ‘genre chaining’ and ‘genre hybridity’ (Fairclough 1992, 2000) to show how past, present, and future change can be viewed through the lens of genre.
Abstract:
Objectives. Intrusive memories of extreme trauma can disrupt a stepwise approach to imaginal exposure. Concurrent tasks that load the visuospatial sketchpad (VSSP) of working memory reduce the vividness of recalled images. This study tested whether relief of distress from competing VSSP tasks during imaginal exposure is at the cost of impaired desensitization. Design. This study examined repeated exposure to emotive memories using 18 unselected undergraduates and a within-subjects design with three exposure conditions (Eye Movement, Visual Noise, Exposure Alone) in random, counterbalanced order. Method. At baseline, participants recalled positive and negative experiences, and rated the vividness and emotiveness of each image. A different positive and negative recollection was then used for each condition. Vividness and emotiveness were rated after each of eight exposure trials. At a post-exposure session 1 week later, participants rated each image without any concurrent task. Results. Consistent with previous research, vividness and distress during imaging were lower during Eye Movements than in Exposure Alone, with passive visual interference giving intermediate results. A reduction in emotional responses from Baseline to Post was of similar size for the three conditions. Conclusion. Visuospatial tasks may offer a temporary response aid for imaginal exposure without affecting desensitization.