995 results for "Resolution algorithm"
Abstract:
The use of computational fluid dynamics simulations for calibrating a flush air data system is described. In particular, the flush air data system of the HYFLEX hypersonic vehicle is used as a case study. The HYFLEX air data system consists of nine pressure ports located flush with the vehicle nose surface, connected to onboard pressure transducers. After appropriate processing, surface pressure measurements can be converted into useful air data parameters. The processing algorithm requires an accurate pressure model, which relates air data parameters to the measured pressures. In the past, such pressure models have been calibrated using combinations of flight data, ground-based experimental results, and numerical simulation. We perform a calibration of the HYFLEX flush air data system using computational fluid dynamics simulations exclusively. The simulations are used to build an empirical pressure model that accurately describes the HYFLEX nose pressure distribution over a range of flight conditions. We believe that computational fluid dynamics provides a quick and inexpensive way to calibrate the air data system and is applicable to a broad range of flight conditions. When tested with HYFLEX flight data, the calibrated system is found to work well. It predicts vehicle angle of attack and angle of sideslip to accuracy levels that generally satisfy flight control requirements. Dynamic pressure is predicted to within the resolution of the onboard inertial measurement unit. We find that wind-tunnel experiments and flight data are not necessary to accurately calibrate the HYFLEX flush air data system for hypersonic flight.
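A minimal sketch of the kind of inversion such a system performs: given a pressure model, the air data state (angle of attack, angle of sideslip, dynamic pressure) is recovered from the port pressures by nonlinear least squares. The modified-Newtonian model form, the port layout, and all numbers below are illustrative assumptions, not the HYFLEX calibration.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical nose-cap port layout: (pitch, yaw) tilts of each port
# normal from the body axis, in degrees.  Not the HYFLEX geometry.
port_angles = np.deg2rad([[0, 0], [20, 0], [-20, 0], [0, 20], [0, -20],
                          [40, 0], [-40, 0], [0, 40], [0, -40]])
normals = np.array([[np.cos(p) * np.cos(y), np.sin(p), np.cos(p) * np.sin(y)]
                    for p, y in port_angles])

def model(state, p_inf=1000.0):
    """Modified-Newtonian port pressures for a given air data state."""
    alpha, beta, q = state
    u = np.array([np.cos(alpha) * np.cos(beta),   # freestream direction
                  np.sin(alpha),                  # in body axes
                  np.cos(alpha) * np.sin(beta)])
    cos_theta = normals @ u                       # flow incidence at each port
    return p_inf + q * np.clip(cos_theta, 0.0, None) ** 2

def estimate(p_measured):
    """Recover (alpha, beta, q) from the nine port pressures."""
    fit = least_squares(lambda s: model(s) - p_measured, x0=[0.0, 0.0, 5000.0])
    return fit.x

# Synthetic round trip: generate pressures from a known state, then invert.
truth = [np.deg2rad(5.0), np.deg2rad(-2.0), 8000.0]
print(estimate(model(truth)))   # ~ [0.0873, -0.0349, 8000]
```

With nine ports and only three unknowns the system is overdetermined, which is what makes a purely model-based (CFD-calibrated) inversion practical.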
Abstract:
To translate and transfer solution data between two totally different meshes (i.e. mesh 1 and mesh 2), a consistent point-searching algorithm for solution interpolation in unstructured meshes consisting of 4-node bilinear quadrilateral elements is presented in this paper. The proposed algorithm has the following significant advantages: (1) The use of a point-searching strategy allows a point in one mesh to be accurately related to an element (containing this point) in another mesh. Thus, to translate/transfer the solution of any particular point from mesh 2 to mesh 1, only one element in mesh 2 needs to be inversely mapped. This certainly minimizes the number of elements to which the inverse mapping is applied. In this regard, the present algorithm is very effective and efficient. (2) Analytical solutions for the local coordinates of any point in a four-node quadrilateral element, which are derived in a rigorous mathematical manner in the context of this paper, make it possible to carry out the inverse mapping process very effectively and efficiently. (3) The use of consistent interpolation enables the interpolated solution to be compatible with the original solution and therefore guarantees an interpolated solution of extremely high accuracy. After the mathematical formulations of the algorithm are presented, the algorithm is tested and validated through a challenging problem. The related results from the test problem have demonstrated the generality, accuracy, effectiveness, efficiency and robustness of the proposed consistent point-searching algorithm. Copyright (C) 1999 John Wiley & Sons, Ltd.
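To make the inverse-mapping step concrete: given the four corner coordinates of a bilinear quadrilateral and a physical point, the task is to find the local coordinates (xi, eta) at which the element's shape functions reproduce that point. The paper derives closed-form solutions; the sketch below uses an equivalent Newton iteration instead, the common general-purpose alternative.

```python
import numpy as np

def inverse_map_quad4(xy_nodes, point, tol=1e-12, max_iter=25):
    """Invert the bilinear map of a 4-node quadrilateral: find (xi, eta)
    such that sum_i N_i(xi, eta) * x_i equals `point`.  Newton-iteration
    sketch; the paper itself uses analytical solutions."""
    xi = np.zeros(2)                        # start at the element centre
    for _ in range(max_iter):
        s, t = xi
        # bilinear shape functions and their derivatives w.r.t. (s, t)
        N = 0.25 * np.array([(1-s)*(1-t), (1+s)*(1-t),
                             (1+s)*(1+t), (1-s)*(1+t)])
        dN = 0.25 * np.array([[-(1-t), -(1-s)],
                              [ (1-t), -(1+s)],
                              [ (1+t),  (1+s)],
                              [-(1+t),  (1-s)]])
        residual = N @ xy_nodes - point     # mapped position minus target
        if np.linalg.norm(residual) < tol:
            break
        J = xy_nodes.T @ dN                 # 2x2 Jacobian dx/d(xi)
        xi -= np.linalg.solve(J, residual)
    return xi                               # point is inside iff |xi| <= 1

# Unit-square element: the point (0.25, 0.75) maps back to (-0.5, 0.5).
quad = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
print(inverse_map_quad4(quad, np.array([0.25, 0.75])))
```

Once (xi, eta) is known, evaluating the donor element's shape functions at that point gives the consistent interpolation weights.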
Abstract:
OBJECTIVE: To evaluate a diagnostic algorithm for pulmonary tuberculosis based on smear microscopy and objective response to a trial of antibiotics. SETTING: Adult medical wards, Hlabisa Hospital, South Africa, 1996-1997. METHODS: Adults with chronic chest symptoms and an abnormal chest X-ray had sputum examined for Ziehl-Neelsen stained acid-fast bacilli by light microscopy. Those with negative smears were treated with amoxycillin for 5 days and assessed. Those who had not improved were treated with erythromycin for 5 days and reassessed. Response was compared with mycobacterial culture. RESULTS: Of 280 suspects who completed the diagnostic pathway, 160 (57%) had a positive smear, 46 (17%) responded to amoxycillin, 34 (12%) responded to erythromycin and 40 (14%) were treated as smear-negative tuberculosis. The sensitivity (89%) and specificity (84%) of the full algorithm for culture-positive tuberculosis were high. However, 11 patients (positive predictive value [PPV] 95%) were incorrectly diagnosed with tuberculosis, and 24 cases of tuberculosis (negative predictive value [NPV] 70%) were not identified. NPV improved to 75% when anaemia was included as a predictor. Algorithm performance was independent of human immunodeficiency virus status. CONCLUSION: A sputum smear microscopy plus trial-of-antibiotics algorithm, applied among a selected group of tuberculosis suspects, may increase diagnostic accuracy in district hospitals in developing countries.
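The stepwise pathway the study evaluates can be written out as a simple decision function; the labels below are illustrative paraphrases of the abstract, not clinical guidance.

```python
def classify_suspect(smear_positive: bool,
                     improved_on_amoxycillin: bool,
                     improved_on_erythromycin: bool) -> str:
    """Diagnostic pathway from the abstract: smear first, then two
    sequential 5-day antibiotic trials for smear-negative suspects."""
    if smear_positive:
        return "smear-positive tuberculosis"
    if improved_on_amoxycillin:
        return "non-tuberculous (responded to amoxycillin)"
    if improved_on_erythromycin:
        return "non-tuberculous (responded to erythromycin)"
    return "treat as smear-negative tuberculosis"
```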
Abstract:
In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipelined implementations in signal processing. Here, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound on the highest order of stable filters is given. Then the minimum-order stable linear predictor is obtained by solving an optimization problem. The popular genetic algorithm approach is adopted, since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
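A minimal sketch of the genetic-algorithm step, under assumptions of my own (fitness = squared one-step prediction error, a large penalty for unstable coefficient vectors, blend crossover with Gaussian mutation); the paper's exact operators are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def is_stable(a):
    """All poles of 1 / (1 + a[0] z^-1 + ... + a[-1] z^-N) inside the unit circle."""
    return bool(np.all(np.abs(np.roots(np.r_[1.0, a])) < 1.0))

def prediction_error(a, x):
    """Squared one-step error of the predictor x_hat[n] = -sum_k a[k] x[n-1-k]."""
    N, err = len(a), 0.0
    for n in range(N, len(x)):
        xhat = -sum(a[k] * x[n - 1 - k] for k in range(N))
        err += (x[n] - xhat) ** 2
    return err

def ga_design(x, order, pop=40, gens=60):
    """Evolve predictor coefficients; instability is penalised heavily."""
    cost = lambda a: prediction_error(a, x) + (0.0 if is_stable(a) else 1e9)
    P = rng.normal(scale=0.3, size=(pop, order))
    for _ in range(gens):
        elite = P[np.argsort([cost(a) for a in P])[:pop // 2]]
        pairs = elite[rng.integers(0, len(elite), size=(pop - len(elite), 2))]
        w = rng.random((len(pairs), 1))                           # blend crossover
        children = w * pairs[:, 0] + (1 - w) * pairs[:, 1]
        children += rng.normal(scale=0.05, size=children.shape)  # mutation
        P = np.vstack([elite, children])
    return min(P, key=cost)

# Usage: recover a stable AR(2) predictor from a synthetic signal.
x = np.zeros(300)
for n in range(2, 300):
    x[n] = 1.2 * x[n-1] - 0.5 * x[n-2] + 0.1 * rng.normal()
print(ga_design(x, order=2))   # close to the true coefficients [-1.2, 0.5]
```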
Abstract:
An equivalent algorithm is proposed to simulate the thermal effects of magma intrusion in geological systems composed of porous rocks. Based on physical and mathematical equivalence, the original magma solidification problem, with a moving boundary between the rock and the intruded magma, is transformed into a new problem without the moving boundary but with a physically equivalent heat source. From the analysis of an ideal solidification model, the physically equivalent heat source is determined in this paper. The major advantage of the proposed equivalent algorithm is that a fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of intruded magma solidification using the conventional finite element method. The related numerical results have demonstrated the correctness and usefulness of the proposed equivalent algorithm for simulating the thermal effect of intruded magma solidification in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
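The fixed-mesh idea can be illustrated with the closely related apparent-heat-capacity treatment, where the latent heat of solidification is folded into an effective heat capacity over the freezing interval, so no boundary needs tracking. The 1-D finite-difference sketch and all material numbers below are assumptions for illustration, not the paper's formulation (which derives its equivalent source from an ideal solidification model and uses finite elements).

```python
import numpy as np

# 1-D fixed-grid conduction with latent heat folded into an apparent
# heat capacity over the freezing interval.  Placeholder properties.
nx, dx = 201, 5.0                      # grid points, spacing (m)
k, rho, cp = 2.5, 2700.0, 1000.0       # conductivity, density, heat capacity
Lh = 4.0e5                             # latent heat of magma (J/kg)
T_sol, T_liq = 1100.0, 1200.0          # solidus / liquidus (deg C)

T = np.full(nx, 20.0)                  # country rock
T[90:111] = 1250.0                     # intruded magma sill

def apparent_cp(T):
    """Latent heat smeared over the freezing interval acts as extra cp."""
    cp_eff = np.full_like(T, cp)
    freezing = (T > T_sol) & (T < T_liq)
    cp_eff[freezing] += Lh / (T_liq - T_sol)
    return cp_eff

dt = 0.4 * dx**2 * rho * cp / k        # conservative explicit step
for _ in range(20000):                 # fixed mesh, no boundary tracking
    lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * k * lap / (rho * apparent_cp(T)[1:-1])
```

Because the latent-heat release appears only as a temperature-dependent term, the mesh never changes, which is the practical payoff the abstract describes.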
Abstract:
The aim of this work was to exemplify the specific contribution of both two- and three-dimensional (3D) X-ray computed tomography to characterising earthworm burrow systems. To achieve this purpose we used 3D mathematical morphology operators to characterise burrow systems resulting from the activity of an anecic species (Aporrectodea nocturna) and an endogeic species (Allolobophora chlorotica), when the two species were introduced either separately or together into artificial soil cores. Images of these soil cores were obtained using a medical X-ray tomography scanner. Three-dimensional reconstructions of burrow systems were obtained using a specifically developed segmentation algorithm. To study the differences between burrow systems, a set of classical tools of mathematical morphology (granulometries) was used. Granulometries based on different structuring elements clearly separated the different burrow systems. They enabled us to show that burrows made by the anecic species were thicker, longer, more vertical and more continuous, but less sinuous, than burrows of the endogeic species. The granulometry transform of the soil matrix showed that burrows made by A. nocturna were more evenly distributed than those of A. chlorotica. Although good discrimination was possible when only one species was introduced into a soil core, it was not possible to separate the burrows of the two species from each other when both were introduced into the same soil core. This limitation, partly due to the insufficient spatial resolution of the medical scanner, precluded the use of the morphological operators to study putative interactions between the two species.
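For readers unfamiliar with granulometries: the binary structure is opened with structuring elements of increasing size, and the volume removed at each size yields a size distribution. A sketch under assumed details (a simple 6-connected structuring element grown by iteration; the paper deliberately uses several different structuring elements to probe thickness, length and orientation):

```python
import numpy as np
from scipy import ndimage

def granulometry(mask, max_radius=4):
    """Volume-based granulometry of a binary 3D image: open with
    increasingly large structuring elements and record how much of
    the structure survives each opening."""
    ball = ndimage.generate_binary_structure(3, 1)
    volumes = [int(mask.sum())]
    for r in range(1, max_radius + 1):
        opened = ndimage.binary_opening(mask, structure=ball, iterations=r)
        volumes.append(int(opened.sum()))
    return -np.diff(volumes)          # volume removed per size class

# Toy check: a thin tube disappears at small radii, a thick one later.
vol = np.zeros((40, 40, 40), dtype=bool)
vol[5:35, 10:13, 10:13] = True        # thin "burrow"
vol[5:35, 25:33, 25:33] = True        # thick "burrow"
print(granulometry(vol))
```

Comparing such size distributions between cores is what lets burrows of different species be told apart when only one species is present.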
Abstract:
A graph clustering algorithm constructs groups of closely related parts and machines separately. After the part and machine groups are matched for the fewest intercell moves, a refining process runs on the initial cell formation to decrease the number of intercell moves further. A simple modification of this main approach can handle practical constraints, such as the common constraint bounding the maximum number of machines in a cell. Our approach substantially improves computational time. More importantly, it also improves the number of intercell moves when the computational results are compared with the best known solutions from the literature. (C) 2009 Elsevier Ltd. All rights reserved.
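The objective and a refinement pass are easy to state in code. A sketch under my own assumptions (a greedy part-reassignment refinement; the paper's exact refinement procedure and constraint handling are not reproduced):

```python
import numpy as np

def intercell_moves(A, machine_cell, part_cell):
    """Count operations (1s in the machine-part incidence matrix A)
    whose machine and part are assigned to different cells."""
    m_idx, p_idx = np.nonzero(A)
    return int(np.sum(machine_cell[m_idx] != part_cell[p_idx]))

def refine_parts(A, machine_cell, part_cell, n_cells):
    """One greedy refinement pass: move each part to the cell holding
    most of the machines it visits."""
    for p in range(A.shape[1]):
        machines = np.nonzero(A[:, p])[0]
        counts = np.bincount(machine_cell[machines], minlength=n_cells)
        part_cell[p] = int(np.argmax(counts))
    return part_cell

# Usage: a 4-machine / 4-part example with a deliberately poor start.
A = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 1, 1]])
machine_cell = np.array([0, 0, 1, 1])
part_cell = np.array([0, 1, 1, 0])
part_cell = refine_parts(A, machine_cell, part_cell, n_cells=2)
print(intercell_moves(A, machine_cell, part_cell))   # 0 after refinement
```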
Abstract:
Pulmonary interstitial emphysema is a common complication of mechanical ventilation in preterm babies. We report a case of severe unilateral pulmonary interstitial emphysema in a premature newborn, treated with high-frequency oscillatory ventilation, lateral decubitus positioning and selective intubation. After complete radiological resolution of the emphysema in the left lung, the patient was studied by electrical impedance tomography, which identified a marked reduction of ventilation in the left lung despite the radiological resolution of the cysts. This finding indicates that functional abnormalities may persist well after radiological resolution of such lesions.
Abstract:
Extended gcd computation is interesting in itself. It also plays a fundamental role in other calculations. We present a new algorithm for solving the extended gcd problem. This algorithm has a particularly simple description and is practical. It also provides refined bounds on the size of the multipliers obtained.
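For reference, the problem being solved: given integers a and b, find g = gcd(a, b) together with multipliers x and y such that ax + by = g. The textbook iterative algorithm below is the baseline whose multiplier sizes the paper refines; it is not the paper's new algorithm.

```python
def ext_gcd(a: int, b: int) -> tuple[int, int, int]:
    """Iterative extended Euclid: returns (g, x, y) with a*x + b*y == g."""
    x0, x1, y0, y1 = 1, 0, 0, 1
    while b:
        q, a, b = a // b, b, a % b
        x0, x1 = x1, x0 - q * x1
        y0, y1 = y1, y0 - q * y1
    return a, x0, y0

g, x, y = ext_gcd(240, 46)
assert (g, 240 * x + 46 * y) == (2, 2)   # gcd(240, 46) = 2 = 240*(-9) + 46*47
```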
Abstract:
Study Design. A clinical study was conducted on 39 patients with acute, first-episode, unilateral low back pain and unilateral, segmental inhibition of the multifidus muscle. Patients were allocated randomly to a control or treatment group. Objectives. To document the natural course of lumbar multifidus recovery and to evaluate the effectiveness of specific, localized exercise therapy on muscle recovery. Summary of Background Data. Acute low back pain usually resolves spontaneously, but the recurrence rate is high. Inhibition of the multifidus occurs with acute, first-episode low back pain, and pathologic changes in this muscle have been linked with poor outcome and recurrence of symptoms. Methods. Patients in group 1 received medical treatment only. Patients in group 2 received medical treatment and specific, localized exercise therapy. Outcome measures for both groups included 4 weekly assessments of pain, disability, range of motion, and size of the multifidus cross-sectional area. Independent examiners were blinded to group allocation. Patients were reassessed at a 10-week follow-up examination. Results. Multifidus muscle recovery was not spontaneous on remission of painful symptoms in patients in group 1. Muscle recovery was more rapid and more complete in patients in group 2, who received exercise therapy (P = 0.0001). Other outcome measurements were similar for the two groups at the 4-week examination. Although they resumed normal levels of activity, patients in group 1 still had decreased multifidus muscle size at the 10-week follow-up examination. Conclusions. Multifidus muscle recovery is not spontaneous on remission of painful symptoms. Lack of localized muscle support may be one reason for the high recurrence rate of low back pain following the initial episode.
Abstract:
Qu-Prolog is an extension of Prolog which performs meta-level computations over object languages, such as predicate calculi and lambda-calculi, which have object-level variables, and quantifier or binding symbols creating local scopes for those variables. As in Prolog, the instantiable (meta-level) variables of Qu-Prolog range over object-level terms, and in addition other Qu-Prolog syntax denotes the various components of the object-level syntax, including object-level variables. Further, the meta-level operation of substitution into object-level terms is directly represented by appropriate Qu-Prolog syntax. Again as in Prolog, the driving mechanism in Qu-Prolog computation is a form of unification, but this is substantially more complex than for Prolog because of Qu-Prolog's greater generality, and especially because substitution operations are evaluated during unification. In this paper, the Qu-Prolog unification algorithm is specified, formalised and proved correct. Further, the analysis of the algorithm is carried out in a framework which straightforwardly allows the 'completeness' of the algorithm to be proved: though fully explicit answers to unification problems are not always provided, no information is lost in the unification process.
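For orientation, here is the classical first-order unification core that Qu-Prolog's algorithm generalises; none of the Qu-Prolog extensions (object-level variables, quantifiers, substitution evaluation during unification) appear in this sketch. The term representation is an assumption: uppercase strings are variables, lowercase strings are constants, tuples are compound terms.

```python
def is_var(t):
    # Assumed encoding: uppercase strings are (meta-level) variables
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    # Chase variable bindings in substitution s to the representative term
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, s) for a in t[1:])

def unify(t1, t2, s=None):
    """Return a substitution unifying t1 and t2, or None on failure."""
    s = {} if s is None else s
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    if is_var(t1):
        return None if occurs(t1, t2, s) else {**s, t1: t2}
    if is_var(t2):
        return unify(t2, t1, s)
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None

# f(X, g(Y)) unifies with f(g(Z), g(a)) under {X: g(Z), Y: a}.
print(unify(('f', 'X', ('g', 'Y')), ('f', ('g', 'Z'), ('g', 'a'))))
```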
Abstract:
Conducting dielectric samples are often used in high-resolution experiments at high field. It is shown that significant amplitude and phase distortions of the RF magnetic field may result from perturbations caused by such samples. Theoretical analyses demonstrate the spatial variation of the RF field amplitude and phase across the sample, and comparisons of the effect are made for a variety of sample properties and operating field strengths. Although the effect is highly nonlinear, it tends to increase with increasing field strength, permittivity, conductivity, and sample size. There are cases, however, in which increasing the conductivity of the sample improves the homogeneity of the amplitude of the RF field across the sample at the expense of distorted RF phase. It is important that the perturbation effects be calculated for the experimental conditions used, as they have the potential to reduce the signal-to-noise ratio of NMR experiments and may increase the generation of spurious coherences. The effect of RF-coil geometry on the coherences is also modeled, with the use of homogeneous resonators such as the birdcage design being preferred. Recommendations are made concerning methods of reducing sample-induced perturbations. Experimental high-field imaging and high-resolution studies demonstrate the effect. (C) 1997 Academic Press.
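The scaling claim can be made concrete with two standard lengths: the RF wavelength inside the sample and the conductive skin depth. When either approaches the sample diameter, B1 amplitude and phase vary appreciably across the sample. The numbers below (a saline-like sample at 600 MHz) are illustrative assumptions, not values from the paper.

```python
import numpy as np

mu0, c = 4e-7 * np.pi, 3.0e8
f = 600e6                        # 1H frequency in a 14.1 T magnet (Hz)
eps_r, sigma = 80.0, 1.0         # saline-like permittivity, conductivity (S/m)

wavelength = c / (f * np.sqrt(eps_r))                    # RF wavelength in sample
skin_depth = np.sqrt(2.0 / (2 * np.pi * f * mu0 * sigma))

print(f"in-sample RF wavelength: {wavelength * 100:.1f} cm")   # ~5.6 cm
print(f"skin depth:              {skin_depth * 100:.1f} cm")   # ~2.1 cm
# Both lengths shrink as field strength, permittivity or conductivity
# rise, which is why the perturbations grow under those conditions.
```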
Abstract:
Purpose: To evaluate the changes over time in the pattern and extent of parenchymal abnormalities in asbestos-exposed workers after cessation of exposure, and to compare 3 proposed semiquantitative methods with a careful side-by-side comparison of the initial and follow-up computed tomography (CT) images. Materials and Methods: The study included 52 male asbestos workers (mean age ± SD, 62.2 ± 8.2 years) who had baseline high-resolution CT after cessation of exposure and follow-up CT 3 to 5 years later. Two independent thoracic radiologists quantified the findings according to the scoring systems proposed by Huuskonen, Gamsu, and Sette, and then did a side-by-side comparison of the 2 sets of scans without awareness of the dates of the CT scans. Results: There was no difference in the prevalence of the 2 most common parenchymal abnormalities (centrilobular small dotlike or branching opacities and interstitial lines) between the initial and follow-up CT scans. Honeycombing (20%) and traction bronchiectasis and bronchiolectasis (50%) were seen more commonly on the follow-up CT than on the initial examination (10% and 33%, respectively) (P = 0.01). Increased extent of parenchymal abnormalities was evident on side-by-side comparison in 42 (81%) patients but resulted in an increase in score in at least 1 semiquantitative system in only 16 (31%) patients (all P > 0.01, signed test). Conclusions: The majority of patients with previous asbestos exposure show evidence of disease progression on CT at 3 to 5 years of follow-up, but this progression is usually not detected by the 3 proposed semiquantitative scoring schemes.
Abstract:
An algorithm for explicit integration of structural dynamics problems with multiple time steps is proposed that averages accelerations to obtain subcycle states at a nodal interface between regions integrated with different time steps. With integer time step ratios, the resulting subcycle updates at the interface sum to give the same effect as a central difference update over a major cycle. The algorithm is shown to have good accuracy, and its stability properties in linear elastic analysis are similar to those of constant-velocity subcycling algorithms. The implementation of a generalised form of the algorithm with non-integer time step ratios is presented. (C) 1997 by John Wiley & Sons, Ltd.
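For context, this is the single-rate central-difference update that the interface averaging must reproduce over a major cycle; the multi-time-step interface handling itself is not reproduced in this sketch, and the 2-DOF system is a placeholder.

```python
import numpy as np

# Single-rate central difference (velocity-Verlet form) on a 2-DOF
# spring-mass system; placeholder values, not from the paper.
M = np.array([1.0, 1.0])                    # lumped masses
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])                # stiffness matrix
dt = 0.01                                   # well under the stability limit
u, v = np.array([0.1, 0.0]), np.zeros(2)    # initial displacement, velocity
a = -(K @ u) / M                            # initial acceleration

for _ in range(1000):
    v_half = v + 0.5 * dt * a               # half-step velocity
    u = u + dt * v_half                     # displacement update
    a = -(K @ u) / M                        # acceleration at the new state
    v = v_half + 0.5 * dt * a               # complete the velocity update
```

In a subcycled scheme, regions advance with their own time steps; the paper's contribution is how the interface nodes are updated (by averaging accelerations) so that the subcycle updates sum to exactly this update over a major cycle.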