903 results for Algorithm-oriented design
Abstract:
A rapid spherical harmonic calculation method is used for the design of Nuclear Magnetic Resonance shim coils. The aim is to design each shim such that it generates a field described purely by a single spherical harmonic. By applying simulated annealing techniques, coil arrangements are produced through the optimal positioning of current-carrying circular arc conductors of rectangular cross-section. This involves minimizing the undesirable harmonics in relation to a target harmonic. The design method is flexible enough to be applied for the production of coil arrangements that generate fields consisting significantly of either zonal or tesseral harmonics. Results are presented for several coil designs which generate tesseral harmonics of degree one.
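The optimization strategy named in this abstract, simulated annealing, can be sketched generically. The cost function and cooling parameters below are illustrative stand-ins, not the paper's harmonic objective; a real implementation would evaluate the unwanted spherical harmonic content of a candidate arc arrangement.

```python
import math
import random

def simulated_annealing(cost, initial, neighbor,
                        t0=1.0, t_min=1e-4, alpha=0.95, steps=100):
    """Generic simulated-annealing minimizer with Metropolis acceptance."""
    state, best = initial, initial
    t = t0
    while t > t_min:
        for _ in range(steps):
            cand = neighbor(state)
            delta = cost(cand) - cost(state)
            # Always accept downhill moves; accept uphill moves with
            # Boltzmann probability exp(-delta / t) to escape local minima.
            if delta < 0 or random.random() < math.exp(-delta / t):
                state = cand
                if cost(state) < cost(best):
                    best = state
        t *= alpha  # geometric cooling schedule
    return best

# Hypothetical example: optimize two "arc position" parameters to
# minimize a toy quadratic penalty standing in for the residual
# (non-target) harmonic content of the generated field.
random.seed(0)
cost = lambda s: (s[0] - 0.7) ** 2 + (s[1] + 0.3) ** 2
neighbor = lambda s: tuple(x + random.uniform(-0.1, 0.1) for x in s)
result = simulated_annealing(cost, (0.0, 0.0), neighbor)
```

The annealer only needs a cost function and a neighbor move, which is what makes the approach flexible enough to cover both zonal and tesseral targets.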
Abstract:
In this paper we propose a new framework for evaluating designs based on work domain analysis, the first phase of cognitive work analysis. We develop a rationale for a new approach to evaluation by describing the unique characteristics of complex systems and by showing that systems engineering techniques only partially accommodate these characteristics. We then present work domain analysis as a complementary framework for evaluation. We illustrate this technique with an example, showing how the Australian Defence Force used work domain analysis to evaluate design proposals for a new system called Airborne Early Warning and Control. This case study also demonstrates that work domain analysis is a useful and feasible approach that complements standard techniques for evaluation and that promotes a central role for human factors professionals early in the system design and development process. Actual or potential applications of this research include the evaluation of designs for complex systems.
Abstract:
Background: A variety of methods for prediction of peptide binding to major histocompatibility complex (MHC) have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMM), or artificial neural networks (ANN). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We performed a comparison of the performance of six methods applied to the prediction of two human MHC class I molecules, including binding matrices and motifs, ANNs, and HMMs. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMM become more useful predictors. ANN and HMM are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
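As a toy illustration of the binding-matrix approach compared in this abstract, a position-specific scoring matrix assigns each residue at each peptide position an independent log-odds contribution, and a peptide's predicted binding strength is the sum over positions. The matrix values below are invented for illustration and are not real MHC binding data.

```python
def score_peptide(peptide, matrix):
    """Sum per-position log-odds scores for a peptide.

    `matrix` is a list (one dict per position) mapping amino-acid letters
    to scores; residues absent from a position's dict contribute 0.
    Higher totals mean stronger predicted binding.
    """
    return sum(matrix[i].get(aa, 0.0) for i, aa in enumerate(peptide))

# Hypothetical 9-mer matrix: only a few anchor positions carry weight,
# mimicking how class I binding motifs concentrate on anchor residues.
matrix = (
    [{"L": 1.2, "M": 0.8},   # position 1 anchor
     {"A": 0.5},
     {"V": 0.9}]
    + [{} for _ in range(5)]  # positions 4-8 uninformative here
    + [{"V": 1.5, "L": 1.1}]  # C-terminal anchor
)

total = score_peptide("LAVAAAAAV", matrix)  # 1.2 + 0.5 + 0.9 + 1.5
```

Candidate binders are then ranked by this score against a threshold, which is why matrix methods need more training peptides than simple motifs but fewer than ANNs or HMMs.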
Abstract:
Ecological interface design (EID) is proving to be a promising approach to the design of interfaces for complex dynamic systems. Although the principles of EID and examples of its effective use are widely available, few readily available examples exist of how the individual displays that constitute an ecological interface are developed. This paper presents the semantic mapping process within EID in the context of prior theoretical work in this area. The semantic mapping process that was used in developing an ecological interface for the Pasteurizer II microworld is outlined, and the results of an evaluation of the ecological interface against a more conventional interface are briefly presented. Subjective reports indicate features of the ecological interface that made it particularly valuable for participants. Finally, we outline the steps of an analytic process for using EID. The findings presented here can be applied in the design of ecological interfaces or of configural displays for dynamic processes.
Abstract:
An equivalent algorithm is proposed to simulate thermal effects of the magma intrusion in geological systems, which are composed of porous rocks. Based on the physical and mathematical equivalence, the original magma solidification problem with a moving boundary between the rock and intruded magma is transformed into a new problem without the moving boundary but with a physically equivalent heat source. From the analysis of an ideal solidification model, the physically equivalent heat source has been determined in this paper. The major advantage in using the proposed equivalent algorithm is that the fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of the intruded magma solidification using the conventional finite element method. The related numerical results have demonstrated the correctness and usefulness of the proposed equivalent algorithm for simulating the thermal effect of the intruded magma solidification in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
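The equivalent-source idea can be illustrated with a deliberately simplified 1-D sketch (not the paper's finite element formulation): instead of tracking a moving rock/magma boundary, the latent heat released by solidification is folded into an equivalent volumetric heat source on a fixed mesh. All grid sizes and parameter values below are arbitrary illustrations.

```python
def step_temperature(T, q, dx, dt, kappa):
    """One explicit finite-difference step of dT/dt = kappa*d2T/dx2 + q.

    Interior nodes are updated; the boundary nodes keep their initial
    (Dirichlet) values. Stable when kappa*dt/dx**2 <= 0.5.
    """
    Tn = T[:]
    for i in range(1, len(T) - 1):
        Tn[i] += dt * (kappa * (T[i + 1] - 2 * T[i] + T[i - 1]) / dx ** 2
                       + q[i])
    return Tn

n, dx, dt, kappa = 51, 1.0, 0.2, 1.0   # kappa*dt/dx**2 = 0.2, stable
T = [0.0] * n                           # initial rock temperature (scaled)
q = [0.0] * n
for i in range(20, 31):
    q[i] = 0.5  # equivalent heat source over the fixed "intrusion" region

for _ in range(200):
    T = step_temperature(T, q, dx, dt, kappa)
```

Because the mesh never moves, the same code (and, in the paper's setting, the same finite element mesh with a variable time step) handles the whole solidification history.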
Abstract:
The efficient expression and purification of an interfacially active peptide (mLac21) was achieved by using bioprocess-centered molecular design (BMD), wherein key bioprocess considerations are addressed during the initial molecular biology work. The 21 amino acid mLac21 peptide sequence is derived from the lac repressor protein and is shown to have high affinity for the oil-water interface, causing a substantial reduction in interfacial tension following adsorption. The DNA coding for the peptide sequence was cloned into a modified pET-31(b) vector to permit the expression of mLac21 as a fusion to ketosteroid isomerase (KSI). Rational iterative molecular design, taking into account the need for a scalable bioprocess flowsheet, led to a simple and efficient bioprocess yielding mLac21 at 86% purity following ion exchange chromatography (and >98% following chromatographic polishing). This case study demonstrates that it is possible to produce acceptably pure peptide for potential commodity applications using common scalable bioprocess unit operations. Moreover, it is shown that BMD is a powerful strategy that can be deployed to reduce bioseparation complexity. (C) 2004 Wiley Periodicals, Inc.
Abstract:
Proteins incorporated into phospholipid Langmuir-Blodgett (LB) films are a good model system for biomembranes and enzyme immobilization studies. The specific fluidity of biomembranes, an important requisite for enzymatic activity, is naturally controlled by varying phospholipid compositions. In a model system, instead, LB film fluidity may be varied by covering the top layer with different substances able to interact simultaneously with the phospholipid and the protein to be immobilized. In this study, we immobilized a carbohydrate-rich Neurospora crassa alkaline phosphatase (NCAP) in monolayers of the sodium salt of dihexadecylphosphoric acid (DHP), a synthetic phospholipid that provides very condensed Langmuir films. The binding of NCAP to DHP LB films was mediated by the anionic polysaccharide iota-carrageenan (iota-car). Combining results from surface isotherms and the quartz crystal microbalance technique, we concluded that the polysaccharide was essential to promote the interaction between DHP and NCAP and also to increase the fluidity of the film. An estimate of the DHP:iota-car ratio within the film also revealed that the polysaccharide binds to the DHP LB film in an extended conformation. Furthermore, the investigation of the polysaccharide conformation at the molecular level, using sum-frequency vibrational spectroscopy (SFG), indicated a preferential conformation of the carrageenan molecules with the sulfate groups oriented toward the phospholipid monolayer, and both the hydroxyl and ether groups interacting preferentially with the protein. These results demonstrate how interfacial electric fields can reorient and induce conformational changes in macromolecules, which may significantly affect intermolecular interactions at interfaces.
This detailed knowledge of the interaction mechanism between the enzyme and the LB film is relevant to design strategies for enzyme immobilization when orientation and fluidity properties of the film provided by the matrix are important to improve enzymatic activity.
Abstract:
In this work a new approach for designing planar gradient coils is outlined for use in an existing MRI apparatus. A technique that allows for gradient field corrections inside the diameter-sensitive volume is presented. These corrections are brought about by making changes to the wire paths that constitute the coil windings, and hence it is called the path correction method. The existing well-known target field method is used to gauge the performance of a typical gradient coil. The gradient coil design methodology is demonstrated for planar openable gradient coils that can be inserted into an existing MRI apparatus. The path corrected gradient coil is compared to the coil obtained using the target field method. It is shown that, by using a wire path correction with optimized variables, winding patterns can be obtained that deliver high magnetic gradient field strengths and large imaging regions.
Abstract:
Pulmonary Vascular Research Institute
Coronary CT angiography using 64 detector rows: methods and design of the multi-centre trial CORE-64
Abstract:
Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its non-invasive nature and high sensitivity and negative predictive value as found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios suggestive of small study bias, highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective "CORE-64" trial ("Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography using 64 Detectors"). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography in nine centres worldwide in comparison to conventional coronary angiography. In conclusion, the multi-centre, multi-institutional and multi-continental trial CORE-64 has great potential to ultimately assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.
Abstract:
Objectives: Pneumothorax is a frequent complication during mechanical ventilation. Electrical impedance tomography (EIT) is a noninvasive tool that allows real-time imaging of regional ventilation. The purpose of this study was to 1) identify characteristic changes in the EIT signals associated with pneumothoraces; 2) develop and fine-tune an algorithm for their automatic detection; and 3) prospectively evaluate this algorithm for its sensitivity and specificity in detecting pneumothoraces in real time. Design: Prospective controlled laboratory animal investigation. Setting: Experimental Pulmonology Laboratory of the University of Sao Paulo. Subjects: Thirty-nine anesthetized mechanically ventilated supine pigs (31.0 +/- 3.2 kg, mean +/- SD). Interventions: In a first group of 18 animals monitored by EIT, we either injected progressive amounts of air (from 20 to 500 mL) through chest tubes or applied large positive end-expiratory pressure (PEEP) increments to simulate extreme lung overdistension. This first data set was used to calibrate an EIT-based pneumothorax detection algorithm. Subsequently, we evaluated the real-time performance of the detection algorithm in 21 additional animals (with normal or preinjured lungs), submitted to multiple ventilatory interventions or traumatic punctures of the lung. Measurements and Main Results: Primary EIT relative images were acquired online (50 images/sec) and processed according to a few imaging-analysis routines running automatically and in parallel. Pneumothoraces as small as 20 mL could be detected with a sensitivity of 100% and a specificity of 95% and could be easily distinguished from parenchymal overdistension induced by PEEP or recruiting maneuvers. Their location was correctly identified in all cases, with a total delay of only three respiratory cycles. Conclusions: We created an EIT-based algorithm capable of detecting early signs of pneumothoraces in high-risk situations, which also identifies its location.
It requires that the pneumothorax occur or enlarge at least minimally during the monitoring period. Such detection was operator-free and in quasi real-time, opening opportunities for improving patient safety during mechanical ventilation.
Abstract:
Extended gcd computation is interesting in itself. It also plays a fundamental role in other calculations. We present a new algorithm for solving the extended gcd problem. This algorithm has a particularly simple description and is practical. It also provides refined bounds on the size of the multipliers obtained.
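The paper's own algorithm is not reproduced in the abstract; for context, the classical iterative extended Euclidean algorithm it refines, which computes the gcd together with the Bezout multipliers, can be sketched as:

```python
def extended_gcd(a, b):
    """Return (g, x, y) with a*x + b*y == g == gcd(a, b).

    Iterative extended Euclidean algorithm: the multiplier pairs
    (x, y) are updated in lockstep with the remainder sequence of
    the ordinary Euclidean algorithm.
    """
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    return old_r, old_x, old_y
```

The size of the multipliers x and y produced by this classical scheme is exactly what the refined bounds mentioned in the abstract tighten.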
Abstract:
Qu-Prolog is an extension of Prolog which performs meta-level computations over object languages, such as predicate calculi and lambda-calculi, which have object-level variables, and quantifier or binding symbols creating local scopes for those variables. As in Prolog, the instantiable (meta-level) variables of Qu-Prolog range over object-level terms, and in addition other Qu-Prolog syntax denotes the various components of the object-level syntax, including object-level variables. Further, the meta-level operation of substitution into object-level terms is directly represented by appropriate Qu-Prolog syntax. Again as in Prolog, the driving mechanism in Qu-Prolog computation is a form of unification, but this is substantially more complex than for Prolog because of Qu-Prolog's greater generality, and especially because substitution operations are evaluated during unification. In this paper, the Qu-Prolog unification algorithm is specified, formalised and proved correct. Further, the analysis of the algorithm is carried out in a framework which straightforwardly allows the 'completeness' of the algorithm to be proved: though fully explicit answers to unification problems are not always provided, no information is lost in the unification process.
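For contrast with Qu-Prolog's richer procedure, here is a minimal sketch of plain Prolog-style first-order unification, with none of the object-level binders or substitution operators whose evaluation makes the Qu-Prolog algorithm substantially harder. Variables are modeled as strings starting with an uppercase letter and compound terms as tuples; the occurs check is omitted for brevity, as in most Prolog implementations.

```python
def is_var(t):
    """Prolog convention: identifiers starting uppercase are variables."""
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Follow variable bindings to the representative term."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution (dict) unifying a and b, or None on failure."""
    subst = dict(subst or {})
    stack = [(a, b)]
    while stack:
        x, y = (walk(t, subst) for t in stack.pop())
        if x == y:
            continue
        if is_var(x):
            subst[x] = y          # bind meta-level variable (no occurs check)
        elif is_var(y):
            subst[y] = x
        elif isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
            stack.extend(zip(x, y))  # same arity: unify arguments pairwise
        else:
            return None              # functor/arity clash
    return subst

# f(X, g(Y)) unified with f(a, g(b)) binds X -> a, Y -> b.
s = unify(("f", "X", ("g", "Y")), ("f", "a", ("g", "b")))
```

Qu-Prolog must additionally decide when two object-level terms under pending substitutions can, cannot, or might unify, which is why its answers are not always fully explicit.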
Abstract:
The main aim of this study is to evaluate the capacity of human dental pulp stem cells (hDPSC), isolated from deciduous teeth, to reconstruct large-sized cranial bone defects in nonimmunosuppressed (NIS) rats. To our knowledge, these cells were not used before in similar experiments. We performed two symmetric full-thickness cranial defects (5 x 8 mm) on each parietal region of eight NIS rats. In six of them, the left side was supplied with collagen membrane only and the right side (RS) with collagen membrane and hDPSC. In two rats, the RS had collagen membrane only and nothing was added at the left side (controls). Cells were used after in vitro characterization as mesenchymal cells. Animals were euthanized at 7, 20, 30, 60, and 120 days postoperatively and cranial tissue samples were taken from the defects for histologic analysis. Analysis of the presence of human cells in the new bone was confirmed by molecular analysis. The hDPSC lineage was positive for the four mesenchymal cell markers tested and showed osteogenic, adipogenic, and myogenic in vitro differentiation. We observed bone formation 1 month after surgery in both sides, but a more mature bone was present in the RS. Human DNA was polymerase chain reaction-amplified only at the RS, indicating that this new bone had human cells. The use of hDPSC in NIS rats did not cause any graft rejection. Our findings suggest that hDPSC is an additional cell resource for correcting large cranial defects in rats and constitutes a promising model for reconstruction of human large cranial defects in craniofacial surgery.
Abstract:
Background: Despite significant advancements in psychopharmacology, treating major depressive disorder (MDD) is still a challenge considering the efficacy, tolerability, safety, and economical costs of most antidepressant drugs. One approach that has been increasingly investigated is modulation of cortical activity with tools of non-invasive brain stimulation - such as transcranial magnetic stimulation and transcranial direct current stimulation (tDCS). Due to its profile, tDCS seems to be a safe and affordable approach. Methods and design: The SELECT TDCS trial aims to compare sertraline vs. tDCS in a double-blinded, randomized, factorial trial enrolling 120 participants to be allocated to four groups to receive sertraline + tDCS, sertraline, tDCS or placebo. Eligibility criteria are moderate-to-severe unipolar depression (Hamilton Depression Rating Scale >17) not currently on sertraline treatment. Treatment will last 6 weeks and the primary outcome is depression change on the Montgomery-Asberg Depression Rating Scale (MADRS). Potential biological markers that mediate response, such as BDNF serum levels, Val66Met BDNF polymorphism, and heart rate variability will also be examined. A neuropsychological battery with a focus on executive functioning will be administered. Discussion: With this design we will be able to investigate whether tDCS is more effective than placebo in a sample of patients free of antidepressants, and in addition we will be able to secondarily compare the effect sizes of sertraline vs. tDCS, as well as tDCS vs. the combination of tDCS and sertraline. (C) 2010 Elsevier Inc. All rights reserved.