111 results for Parallel methods
Abstract:
I noted with interest the article by Drs Perrin and Guex, entitled "Edema and leg volume: Methods of assessment," published in Angiology 51:9-12, 2000. This was a timely and comprehensive review of the various methods in clinical use for the assessment of peripheral edema, notably in the leg. I would like to take this opportunity to alert readers to a further technique useful for this purpose, namely, bioelectrical impedance analysis. An early report [1] described its use for the measurement of edema in the leg, but other than its successful use for the assessment of edema in the arm following mastectomy [2,1], the potential of the method remains to be fully realized. This is unfortunate since the method directly and quantifiably measures edema.
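For readers unfamiliar with the technique, the Python sketch below shows how a segmental bioimpedance reading can be turned into a fluid-volume estimate using the simple cylinder model V ≈ ρL²/Z; the resistivity and impedance values are hypothetical placeholders, not figures from the letter.

def segmental_fluid_volume(length_cm, impedance_ohm, resistivity_ohm_cm=250.0):
    """Cylinder-model estimate of segmental fluid volume (mL): V = rho * L^2 / Z."""
    return resistivity_ohm_cm * length_cm ** 2 / impedance_ohm

# A fall in limb impedance between visits implies an increase in fluid volume.
baseline = segmental_fluid_volume(length_cm=40.0, impedance_ohm=55.0)
follow_up = segmental_fluid_volume(length_cm=40.0, impedance_ohm=48.0)
print(f"Estimated change in leg fluid volume: {follow_up - baseline:.0f} mL")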
Abstract:
Depending on the size and shape of the materials, the methods employed to achieve effective fluidization during fluid bed drying vary from the use of simple hole distributors for small, lightweight materials to special techniques for larger and/or moist materials. This paper reviews common air distributors used in fluidized bed drying of food particulates. It also reviews special methods of fluidizing larger, irregular food particulates.
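To give a sense of the design calculation involved, the sketch below estimates minimum fluidization velocity with the Wen and Yu correlation for particles of increasing size; the particle and air properties are assumed example values, not data from the review.

import math

def minimum_fluidization_velocity(d_p, rho_p, rho_g=1.2, mu=1.8e-5, g=9.81):
    """Return u_mf (m/s) for particle diameter d_p (m) and density rho_p (kg/m^3)."""
    ar = rho_g * (rho_p - rho_g) * g * d_p ** 3 / mu ** 2   # Archimedes number
    re_mf = math.sqrt(33.7 ** 2 + 0.0408 * ar) - 33.7       # Wen & Yu correlation
    return re_mf * mu / (rho_g * d_p)

# Larger (and heavier, moister) particles need much higher superficial air velocities.
for d_p in (0.5e-3, 2e-3, 5e-3):
    print(f"d_p = {d_p * 1e3:.1f} mm -> u_mf ~ {minimum_fluidization_velocity(d_p, 1200):.2f} m/s")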
Abstract:
Recent research has begun to provide support for the assumptions that memories are stored as a composite and are accessed in parallel (Tehan & Humphreys, 1998). New predictions derived from these assumptions, and from the Chappell and Humphreys (1994) implementation of these assumptions, were tested. In three experiments, subjects studied relatively short lists of words. Some of the lists contained two similar targets (thief and theft) or two dissimilar targets (thief and steal) associated with the same cue (ROBBERY). As predicted, target similarity affected performance in cued recall but not free association. Contrary to predictions, two spaced presentations of a target did not improve performance in free association. Two additional experiments confirmed and extended this finding. Several alternative explanations for the target similarity effect, which incorporate assumptions about separate representations and sequential search, are rejected. The importance of the finding that, in at least one implicit memory paradigm, repetition does not improve performance is also discussed.
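As an aside for readers, the toy Python sketch below (not the Chappell and Humphreys model itself) shows what "composite storage with parallel access" can mean operationally: cue-target pairs are superimposed in a single matrix memory, and one matrix product probes every stored association at once. The vocabulary and dimensionality are invented.

import numpy as np

rng = np.random.default_rng(0)
dim = 256
vectors = {w: rng.standard_normal(dim) / np.sqrt(dim)
           for w in ("robbery", "thief", "theft", "doctor", "nurse")}

memory = np.zeros((dim, dim))
for cue, target in [("robbery", "thief"), ("robbery", "theft"), ("doctor", "nurse")]:
    memory += np.outer(vectors[target], vectors[cue])   # composite storage: one shared trace

retrieved = memory @ vectors["robbery"]                 # parallel access: one probe hits all pairs
scores = {w: float(retrieved @ v) for w, v in vectors.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1]))    # thief and theft rank highest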
Abstract:
The dendrite coherency point of Al-Si-Cu alloys was determined by thermal analysis and by rheological measurement, with parallel measurements performed at two cooling rates for aluminum alloys across a wide range of silicon and copper contents. Contrary to previous findings, the two methods yield significantly different values for the fraction solid at the dendrite coherency point. This disparity is greatest for alloys of low solute concentration. The results from this study also contradict previously reported trends in the effect of cooling rate on the dendrite coherency point. Consideration of the results shows that thermal analysis is not a valid technique for the measurement of coherency. Analysis of the results from rheological testing indicates that silicon concentration has a dominant effect on grain size and dendritic morphology, independent of cooling rate and copper content, and thus is the factor that determines the fraction solid at dendrite coherency for Al-Si-Cu alloys.
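For orientation, the sketch below illustrates the two-thermocouple thermal-analysis criterion, in which coherency is commonly identified at the minimum of the wall-centre temperature difference; the cooling curves are synthetic placeholders, not data from this study.

import numpy as np

def coherency_time(time_s, t_centre, t_wall):
    """Return the time (s) at which delta-T = T_wall - T_centre reaches its minimum."""
    delta_t = np.asarray(t_wall) - np.asarray(t_centre)
    return time_s[int(np.argmin(delta_t))]

time_s = np.linspace(0, 300, 301)                 # synthetic cooling curves
t_centre = 620 - 0.4 * time_s
t_wall = t_centre - 8 * np.exp(-((time_s - 120) / 40) ** 2) - 2
print(f"Coherency detected at t ~ {coherency_time(time_s, t_centre, t_wall):.0f} s")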
Abstract:
A number of techniques have been developed to study the disposition of drugs in the head and, in particular, the role of the blood-brain barrier (BBB) in drug uptake. The techniques can be divided into three groups: in-vitro, in-vivo and in-situ. The most suitable method depends on the purpose(s) and requirements of the particular study being conducted. In-vitro techniques involve the isolation of cerebral endothelial cells so that direct investigations of these cells can be carried out. The most recent preparations are able to maintain structural and functional characteristics of the BBB by simultaneously culturing endothelial cells with astrocytic cells. The main advantages of the in-vitro methods are the elimination of anaesthetics and surgery. In-vivo methods consist of a diverse range of techniques and include the traditional Brain Uptake Index and indicator diffusion methods, as well as microdialysis and positron emission tomography. In-vivo methods maintain the cells and vasculature of an organ in their normal physiological states and anatomical position within the animal. However, the shortcomings include renal and hepatic elimination of solutes as well as the inability to control blood flow. In-situ techniques, including the perfused head, are more technically demanding. However, these models have the ability to vary the composition and flow rate of the artificial perfusate. This review is intended as a guide for selecting the most appropriate method for studying drug uptake in the brain.
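To make one of the in-vivo methods concrete, the sketch below computes the classical Brain Uptake Index: the ratio of test to reference tracer in brain, normalised by the same ratio in the injectate. The count values are hypothetical placeholders.

def brain_uptake_index(test_brain, ref_brain, test_injectate, ref_injectate):
    """BUI (%) = 100 * (test/reference in brain) / (test/reference in injectate)."""
    return 100.0 * (test_brain / ref_brain) / (test_injectate / ref_injectate)

# Example: a drug taken up about a third as readily as the freely diffusible reference.
bui = brain_uptake_index(test_brain=1200, ref_brain=3600,
                         test_injectate=50000, ref_injectate=50000)
print(f"BUI ~ {bui:.0f}%")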
Abstract:
The compound eyes of mantis shrimps, a group of tropical marine crustaceans, incorporate principles of serial and parallel processing of visual information that may be applicable to artificial imaging systems. Their eyes include numerous specializations for analysis of the spectral and polarizational properties of light, and include more photoreceptor classes for analysis of ultraviolet light, color, and polarization than occur in any other known visual system. This is possible because receptors in different regions of the eye are anatomically diverse and incorporate unusual structural features, such as spectral filters, not seen in other compound eyes. Unlike eyes of most other animals, eyes of mantis shrimps must move to acquire some types of visual information and to integrate color and polarization with spatial vision. Information leaving the retina appears to be processed into numerous parallel data streams leading into the central nervous system, greatly reducing the analytical requirements at higher levels. Many of these unusual features of mantis shrimp vision may inspire new sensor designs for machine vision.
Abstract:
This paper presents a means of structuring specifications in real-time Object-Z: an integration of Object-Z with the timed refinement calculus. Incremental modification of classes using inheritance and composition of classes to form multi-component systems are examined. Two approaches to the latter are considered: using Object-Z's notion of object instantiation and introducing a parallel composition operator similar to those found in process algebras. The parallel composition operator approach is both more concise and allows more general modelling of concurrency. Its incorporation into the existing semantics of real-time Object-Z is presented.
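Purely as an illustration of the process-algebra idea referred to above (and not the real-time Object-Z semantics itself), the Python sketch below computes the traces of a parallel composition that interleaves independent events and synchronises on events shared by both alphabets; the component processes are invented examples.

def parallel(traces_p, alpha_p, traces_q, alpha_q):
    """Return the traces of P || Q, synchronising on the shared alphabet."""
    shared = alpha_p & alpha_q

    def merge(p, q):
        if not p and not q:
            return {()}
        out = set()
        if p and p[0] not in shared:                     # P moves independently
            out |= {(p[0],) + t for t in merge(p[1:], q)}
        if q and q[0] not in shared:                     # Q moves independently
            out |= {(q[0],) + t for t in merge(p, q[1:])}
        if p and q and p[0] == q[0] and p[0] in shared:  # joint step on a shared event
            out |= {(p[0],) + t for t in merge(p[1:], q[1:])}
        return out

    return {t for p in traces_p for q in traces_q for t in merge(p, q)}

sensor = {("sample", "report")}          # hypothetical component processes
logger = {("report", "store")}
print(parallel(sensor, {"sample", "report"}, logger, {"report", "store"}))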
Abstract:
Background: A variety of methods for prediction of peptide binding to major histocompatibility complex (MHC) have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMM), or artificial neural networks (ANN). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We performed a comparison of the performance of six methods applied to the prediction of two human MHC class I molecules, including binding matrices and motifs, ANNs, and HMMs. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMM become more useful predictors. ANN and HMM are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
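To make the matrix-based approach concrete, the sketch below scores 9-mer peptides with a position-specific weight matrix. The matrix is randomly generated purely to show the mechanics; a real matrix would be trained on peptides of known binding affinity to the MHC molecule of interest.

import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
random.seed(1)
matrix = [{aa: random.gauss(0.0, 1.0) for aa in AMINO_ACIDS} for _ in range(9)]

def matrix_score(peptide):
    """Sum of position-specific weights; higher scores suggest stronger binding."""
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

candidates = ["SIINFEKLM", "ALAKAAAAM", "GILGFVFTL"]
for pep in sorted(candidates, key=matrix_score, reverse=True):
    print(f"{pep}  score = {matrix_score(pep):+.2f}")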
Abstract:
Simulations provide a powerful means to help gain the understanding of crustal fault system physics required to progress towards the goal of earthquake forecasting. Cellular Automata are efficient enough to probe system dynamics but their simplifications render interpretations questionable. In contrast, sophisticated elasto-dynamic models yield more convincing results but are too computationally demanding to explore phase space. To help bridge this gap, we develop a simple 2D elastodynamic model of parallel fault systems. The model is discretised onto a triangular lattice and faults are specified as split nodes along horizontal rows in the lattice. A simple numerical approach is presented for calculating the forces at medium and split nodes such that general nonlinear frictional constitutive relations can be modeled along faults. Single and multi-fault simulation examples are presented using a nonlinear frictional relation that is slip and slip-rate dependent in order to illustrate the model.
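As a much-reduced illustration (a single spring-loaded block rather than the paper's 2D triangular-lattice split-node scheme), the sketch below shows how a slip- and slip-rate-dependent friction law enters an explicit time-stepping loop; all parameter values are invented, chosen only to produce a stick-slip event.

import numpy as np

k, m, v_load = 5.0, 1.0, 0.1                      # spring stiffness, block mass, loading velocity
mu_s, mu_d, d_c, v_c = 0.6, 0.4, 1e-2, 1e-2       # static/dynamic friction, slip and rate scales
normal_force, dt = 10.0, 1e-3

x = v = slip = peak_rate = 0.0
for step in range(100_000):
    t = step * dt
    shear = k * (v_load * t - x)                  # elastic driving force from the loading spring
    mu = mu_d + (mu_s - mu_d) * np.exp(-slip / d_c) / (1.0 + abs(v) / v_c)
    if v == 0.0 and abs(shear) <= mu * normal_force:
        friction = shear                          # stuck: friction balances the load exactly
    else:
        friction = mu * normal_force * np.sign(v if v != 0.0 else shear)
    v += (shear - friction) / m * dt              # semi-implicit (symplectic) update
    x += v * dt
    slip += abs(v) * dt
    peak_rate = max(peak_rate, abs(v))

print(f"peak slip rate ~ {peak_rate:.2f} m/s, total slip ~ {slip:.2f} m")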
Abstract:
Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. Conclusions include: there is a discrepancy between the characteristics of the populations upon which predictive equations are based and those of current populations; the tools are not well understood; and patient care can be compromised by inappropriate application of the tools. Data comparing tools and methods are presented and issues for practitioners are discussed. (C) 2003 International Life Sciences Institute.
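As an example of the kind of predictive equation such reviews compare, the sketch below applies the Mifflin-St Jeor resting energy expenditure equation with illustrative activity and stress factors; the patient values and adjustment factors are hypothetical.

def mifflin_st_jeor(weight_kg, height_cm, age_y, sex):
    """Resting energy expenditure (kcal/day) from the Mifflin-St Jeor equation."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_y
    return base + 5 if sex == "male" else base - 161

def estimated_requirement(ree_kcal, activity_factor=1.2, stress_factor=1.0):
    """Total energy requirement after clinical adjustment factors."""
    return ree_kcal * activity_factor * stress_factor

ree = mifflin_st_jeor(weight_kg=70, height_cm=175, age_y=60, sex="male")
print(f"REE ~ {ree:.0f} kcal/day, adjusted ~ {estimated_requirement(ree, 1.3, 1.1):.0f} kcal/day")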
Abstract:
The apposition compound eyes of stomatopod crustaceans contain a morphologically distinct eye region specialized for color and polarization vision, called the mid-band. In two stomatopod superfamilies, the mid-band is constructed from six rows of enlarged ommatidia containing multiple photoreceptor classes for spectral and polarization vision. The aim of this study was to begin to analyze the underlying neuroarchitecture, the design of which might reveal clues as to how the visual system interprets and communicates to deeper levels of the brain the multiple channels of information supplied by the retina. Reduced silver methods were used to investigate the axon pathways from different retinal regions to the lamina ganglionaris and from there to the medulla externa, the medulla interna, and the medulla terminalis. A swollen band of neuropil, here termed the accessory lobe, projects across the equator of the lamina ganglionaris, the medulla externa, and the medulla interna and represents, structurally, the retina's mid-band. Serial semithin and ultrathin resin sections were used to reconstruct the projection of photoreceptor axons from the retina to the lamina ganglionaris. The eight axons originating from one ommatidium project to the same lamina cartridge. Seven short visual fibers end at two distinct levels in each lamina cartridge, thus geometrically separating the two channels of polarization and spectral information. The eighth visual fiber runs axially through the cartridge and terminates in the medulla externa. We conclude that spatial, color, and polarization information is divided into three parallel data streams from the retina to the central nervous system. (C) 2003 Wiley-Liss, Inc.
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development (e.g. formal) methods. Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, despite observable behaviour within functional programming that suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
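A toy Python rendering of the central idea (a list represented directly by its fold, i.e. a partially-applied foldr) is given below; the names are invented for the sketch and it is not the paper's own notation.

def nil(cons_fn, empty):
    """The empty list, represented directly as its fold."""
    return empty

def cons(head, tail):
    """Prepend an element; the result is again a function, i.e. a partially-applied fold."""
    return lambda cons_fn, empty: cons_fn(head, tail(cons_fn, empty))

xs = cons(1, cons(2, cons(3, nil)))           # the "list" [1, 2, 3]
total = xs(lambda h, acc: h + acc, 0)         # folding with (+, 0) sums the list
back = xs(lambda h, acc: [h] + acc, [])       # folding with (prepend, []) rebuilds a Python list
print(total, back)                            # 6 [1, 2, 3]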
Abstract:
Background: A major goal in the post-genomic era is to identify and characterise disease susceptibility genes and to apply this knowledge to disease prevention and treatment. Rodents and humans have remarkably similar genomes and share closely related biochemical, physiological and pathological pathways. In this work we utilised the latest information on the mouse transcriptome as revealed by the RIKEN FANTOM2 project to identify novel human disease-related candidate genes. We define a new term, patholog, to mean a homolog of a human disease-related gene encoding a product (transcript, anti-sense or protein) potentially relevant to disease. Rather than just focus on Mendelian inheritance, we applied the analysis to all potential pathologs regardless of their inheritance pattern. Results: Bioinformatic analysis and human curation of 60,770 RIKEN full-length mouse cDNA clones produced 2,578 sequences that showed similarity (70-85% identity) to known human disease genes. Using a newly developed biological information extraction and annotation tool (FACTS) in parallel with human expert analysis of 17,051 MEDLINE scientific abstracts, we identified 182 novel potential pathologs. Of these, 36 were identified by computational tools only, 49 by human expert analysis only and 97 by both methods. These pathologs were related to neoplastic (53%), hereditary (24%), immunological (5%), cardiovascular (4%), or other (14%) disorders. Conclusions: Large scale genome projects continue to produce a vast amount of data with potential application to the study of human disease. For this potential to be realised we need intelligent strategies for data categorisation and the ability to link sequence data with relevant literature. This paper demonstrates the power of combining human expert annotation with FACTS, a newly developed bioinformatics tool, to identify novel pathologs from within large-scale mouse transcript datasets.
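As a sketch of the similarity screen described above, the code below filters hits to the 70-85% identity band used in the analysis; the clone names, gene names and identity values are invented placeholders.

hits = [
    {"clone": "RIKEN_0001", "human_gene": "BRCA2", "identity": 82.4},
    {"clone": "RIKEN_0002", "human_gene": "CFTR",  "identity": 96.1},
    {"clone": "RIKEN_0003", "human_gene": "MECP2", "identity": 74.8},
    {"clone": "RIKEN_0004", "human_gene": "APOE",  "identity": 63.0},
]

def candidate_pathologs(hits, low=70.0, high=85.0):
    """Return hits whose percent identity falls within the candidate band."""
    return [h for h in hits if low <= h["identity"] <= high]

for h in candidate_pathologs(hits):
    print(f"{h['clone']} -> {h['human_gene']} ({h['identity']:.1f}% identity)")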
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is in the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
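To illustrate how such an uncertainty interval can be produced, the sketch below runs a Monte Carlo simulation over assumed cost and DALY distributions and reports the 2.5th and 97.5th percentiles of the resulting ratio; all distributions and parameter values are invented for the example, not taken from the study.

import numpy as np

rng = np.random.default_rng(42)
n = 100_000

cost_aud = rng.normal(loc=2_000_000, scale=300_000, size=n)   # programme cost (A$), assumed normal
dalys_averted = rng.gamma(shape=25, scale=40, size=n)         # DALYs averted, assumed gamma (mean ~1000)

icer = cost_aud / dalys_averted                               # A$ per DALY averted, per simulation draw
median = np.median(icer)
lo, hi = np.percentile(icer, [2.5, 97.5])
print(f"Cost per DALY averted: A${median:,.0f} (95% UI A${lo:,.0f} to A${hi:,.0f})")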