540 results for refining
Abstract:
BACKGROUND: Patients, clinicians, researchers and payers are seeking to understand the value of using genomic information (as reflected by genotyping, sequencing, family history or other data) to inform clinical decision-making. However, challenges exist to widespread clinical implementation of genomic medicine, a prerequisite for developing evidence of its real-world utility. METHODS: To address these challenges, the National Institutes of Health-funded IGNITE (Implementing GeNomics In pracTicE; www.ignite-genomics.org) Network, comprising six projects and a coordinating center, was established in 2013 to support the development, investigation and dissemination of genomic medicine practice models that seamlessly integrate genomic data into the electronic health record and deploy tools for point-of-care decision making. IGNITE site projects are aligned in their purpose of testing these models, but individual projects vary in scope and design, including exploring genetic markers for disease risk prediction and prevention, developing tools for using family history data, incorporating pharmacogenomic data into clinical care, refining disease diagnosis using sequence-based mutation discovery, and creating novel educational approaches. RESULTS: This paper describes the IGNITE Network and member projects, including network structure, collaborative initiatives, clinical decision support strategies, methods for return of genomic test results, and educational initiatives for patients and providers. Clinical and outcomes data from individual sites and network-wide projects are expected to be published over the next few years.
CONCLUSIONS: The IGNITE Network is an innovative series of projects and pilot demonstrations that aims to enhance the translation of validated, actionable genomic information into clinical settings and to develop and apply outcome measures for genome-based clinical interventions within a pragmatic framework, providing early data and proofs of concept on the utility of these interventions. Through these efforts and collaboration with other stakeholders, IGNITE is poised to significantly accelerate the integration of genomic information into medical practice.
Abstract:
Three hundred participants, including volunteers from an obsessional support group, filled in questionnaires relating to disgust sensitivity, health anxiety, anxiety, fear of death, fear of contamination and obsessionality as part of an investigation into the involvement of disgust sensitivity in types of obsessions. Overall, the data supported the hypothesis that a relationship does exist between disgust sensitivity and the targeted variables. A significant predictive relationship was found between disgust sensitivity and total scores on the obsessive compulsive inventory (OCI; Psychological Assessment 10 (1998) 206) for both frequency and distress of symptomatology. Disgust sensitivity scores were significantly related to health anxiety scores and general anxiety scores and to all the obsessional subscales, with the exception of hoarding. Additionally, multiple regression analyses revealed that disgust sensitivity may be more specifically related to washing compulsions: frequency of washing behaviour was best predicted by disgust sensitivity scores. Washing distress scores were best predicted by health anxiety scores, though disgust sensitivity entered in the second model. It is suggested that further research on the relationship between disgust sensitivity and obsessionality could be helpful in refining the theoretical understanding of obsessions.
Abstract:
Most lead bullion is refined by pyrometallurgical methods. This involves a series of processes that remove the antimony (softening), silver (Parkes process), zinc (vacuum dezincing) and, if need be, bismuth (Betterton-Kroll process). The first step, softening, removes the antimony, arsenic and tin by air oxidation in a furnace or by the Harris process. Next, in the Parkes process, zinc is added to the melt to remove the silver and gold. Insoluble zinc, silver and gold compounds are skimmed off from the melt surface. Excess zinc added during desilvering is removed from the lead bullion using one of three methods: vacuum dezincing, chlorine dezincing, or Harris dezincing. The present study concentrates on the vacuum dezincing process for lead refining. The main aims of the research are to develop mathematical models, using Computational Fluid Dynamics (CFD) and a Surface Averaged Model (SAM), to predict the process behaviour under various operating conditions, thus providing detailed information on the process and insight into its response to changes in key operating parameters. Finally, the model will be used to optimise the process in terms of initial feed concentration, temperature, vacuum height, cooling rate, etc.
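As a loose illustration of the kind of behaviour a surface-averaged model predicts, zinc removal through the melt surface under vacuum can be sketched as first-order decay of the bulk concentration. All coefficients and operating values below are invented for the sketch and are not taken from the study:

```python
# Toy surface-averaged sketch of vacuum dezincing: zinc evaporates from the
# melt surface, so the bulk concentration decays roughly first-order with a
# rate set by an (invented) mass-transfer coefficient k, surface area A and
# melt volume V. This is an illustration, not the paper's CFD/SAM model.

import math

def zinc_concentration(c0, k, area, volume, t):
    """First-order surface-limited removal: C(t) = C0 * exp(-k*A/V * t)."""
    return c0 * math.exp(-k * area / volume * t)

# Invented operating point: 0.55 wt% Zn, k = 1e-4 m/s, A = 3 m^2, V = 6 m^3.
# Time chosen so the concentration halves: t = ln(2) / (k*A/V).
halved = zinc_concentration(0.55, 1e-4, 3.0, 6.0,
                            t=math.log(2) / (1e-4 * 3.0 / 6.0))
```

A larger surface-to-volume ratio or mass-transfer coefficient shortens the half-time proportionally, which is the kind of parameter sensitivity the study's models are meant to quantify.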
Abstract:
Computational analysis software is now widely accepted as a key industrial tool for plant design and process analysis. This is due in part to increased accuracy in the models, larger and faster computer systems and better graphical interfaces that allow easy use of the technology by engineers. The use of computational modelling to test new ideas and analyse current processes helps to take the guesswork out of industrial process design and offers attractive cost savings. An overview of computer-based modelling techniques as applied to the materials processing industry is presented and examples of their application are provided in the contexts of the mixing and refining of lead bullion and the manufacture of lead ingots.
Abstract:
This is the first report from ALT’s new Annual Survey, launched in December 2014. The survey was primarily for ALT members (individuals, or those at an organisation which is an organisational member), but it could also be filled in by others, perhaps those interested in taking out membership. The report and data highlight emerging work areas that are important to the survey respondents. Analysis of the survey responses indicates a number of areas ALT should continue to support and develop. Priorities for the membership are ‘Intelligent use of learning technology’ and ‘Research and practice’; aligned to this is the value placed by respondents on communication via the ALT Newsletter/News, social media and Research in Learning Technology. The survey also reveals that ‘Data and Analytics’ and ‘Open Education’ are areas the majority of respondents find increasingly important, so our community may benefit from development opportunities ALT can provide. The survey is also a reminder that ALT has an essential role in enabling members to develop research and practice in areas which might be considered minority interests. For example, whilst the majority of respondents did not rate areas such as ‘Digital and Open Badges’ and ‘Game Based Learning’ as important, there are still members who consider these areas very significant and increasingly valuable, and as such ALT will continue to better support these groups within our community. Whilst ALT has conducted previous surveys of the ALT membership, this is the first iteration in this form. ALT has committed to surveying the sector on an annual basis, refining the core question set while trying to preserve the opportunity for longitudinal analysis.
Abstract:
Accurate quantification of the carbohydrate content of biomass is crucial for many bio-refining applications. The standardised NREL two-stage complete acid hydrolysis protocol was evaluated for its suitability for seaweeds, as the protocol was originally developed for lignocellulosic feedstocks. The compositional differences between the major polysaccharides in seaweeds and terrestrial plants, and seaweed’s less recalcitrant nature, suggest the NREL-based protocol may be too extreme: degradation of liberated sugars into furan compounds may underestimate carbohydrate content and yield erroneous data. An optimised analysis method for carbohydrate quantification in the brown seaweed L. digitata was thus developed and evaluated. Results from this study revealed that stage 1 of the assay was crucial for optimisation, whereas stage 2 proved less critical. The newly optimised protocol for L. digitata yielded 210 mg of carbohydrate per g of biomass, compared with only 166 mg/g from the original NREL protocol. Use of the new protocol on two other species of seaweed also gave consistent results: higher carbohydrate yields and significantly lower generation of sugar degradation products than the original protocol. This study demonstrated the importance of specific individual optimisations of the protocol for accurate sugar quantification, particularly for different species of seaweed.
Abstract:
Food is one of the main exogenous sources of genotoxic compounds. In heated food products, polycyclic aromatic hydrocarbons (PAHs) represent a priority group of genotoxic, mutagenic and/or carcinogenic chemical pollutants with adverse long-term health effects. People can be exposed to these compounds through different environments and via various routes: inhalation, ingestion of food and water, and even percutaneously. The presence of these compounds in food may be due to environmental contamination, to industrial handling and processing of foods, and to oil processing and refining. The highest levels of these compounds are found in smoked foods, in seafood from polluted waters, in grilled meats and, to a lesser extent, in vegetable fats and oils. Lower levels of PAHs are found in vegetables and in cereals and cereal products.
Abstract:
Radiocarbon dating has rarely been used for chronological problems relating to the Anglo-Saxon period. The "flatness" of the calibration curve and the resultant wide range in calendrical dates provide little advantage over traditional archaeological dating in this period. Recent advances in Bayesian methodology have, however, created the possibility of refining and checking the established chronologies, based on typology of artifacts, against 14C dates. The calibration process, within such a confined age range, however, relies heavily on the structural accuracy of the calibration curve. We have therefore re-measured, at decadal intervals, a section of the Irish oak chronology for the period AD 495–725. These measurements have been included in IntCal04.
Abstract:
Genome-scale metabolic models promise important insights into cell function. However, the definition of pathways and functional network modules within these models, and in the biochemical literature in general, is often based on intuitive reasoning. Although mathematical methods have been proposed to identify modules, which are defined as groups of reactions with correlated fluxes, there is a need for experimental verification. We show here that multivariate statistical analysis of the NMR-derived intra- and extracellular metabolite profiles of single-gene deletion mutants in specific metabolic pathways in the yeast Saccharomyces cerevisiae identified outliers whose profiles were markedly different from those of the other mutants in their respective pathways. Application of flux coupling analysis to a metabolic model of this yeast showed that the deleted gene in an outlying mutant encoded an enzyme that was not part of the same functional network module as the other enzymes in the pathway. We suggest that metabolomic methods such as this, which do not require any knowledge of how a gene deletion might perturb the metabolic network, provide an empirical method for validating and ultimately refining the predicted network structure.
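The idea of flagging a mutant whose metabolite profile is markedly different from the others in its pathway can be sketched with a simple distance-based outlier test. This is a toy stand-in for the paper's multivariate statistical analysis; the profile values and mutant names below are invented:

```python
# Hypothetical illustration: flag a gene-deletion mutant whose metabolite
# profile deviates markedly from the other mutants assigned to its pathway.

from statistics import mean, stdev

def pathway_outliers(profiles, threshold=2.0):
    """Return mutants whose Euclidean distance to the pathway's mean
    profile exceeds `threshold` standard deviations of all distances."""
    names = list(profiles)
    n_metab = len(next(iter(profiles.values())))
    centroid = [mean(profiles[n][i] for n in names) for i in range(n_metab)]
    dist = {n: sum((v - c) ** 2 for v, c in zip(profiles[n], centroid)) ** 0.5
            for n in names}
    mu, sd = mean(dist.values()), stdev(dist.values())
    return [n for n in names if sd > 0 and (dist[n] - mu) / sd > threshold]

# Four mutants with similar (invented) profiles and one clear outlier.
profiles = {
    "mut_a": [1.0, 2.0, 1.1],
    "mut_b": [1.1, 2.1, 1.0],
    "mut_c": [0.9, 1.9, 1.2],
    "mut_d": [1.0, 2.05, 1.05],
    "mut_e": [5.0, 7.0, 6.0],   # profile unlike the rest of the pathway
}
print(pathway_outliers(profiles, threshold=1.5))  # → ['mut_e']
```

In the paper's setting, such an outlier is then cross-checked against flux coupling analysis to ask whether the deleted gene really belongs to the same functional module as its nominal pathway.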
Abstract:
Radiocarbon dating has been used infrequently as a chronological tool for research in Anglo-Saxon archaeology. Primarily, this is because the uncertainty of calibrated dates provides little advantage over traditional archaeological dating in this period. Recent advances in Bayesian methodology in conjunction with high-precision 14C dating have, however, created the possibility of both testing and refining the established Anglo-Saxon chronologies based on typology of artifacts. The calibration process within such a confined age range, however, relies heavily on the structural accuracy of the calibration curve. We have previously reported decadal measurements on a section of the Irish oak chronology for the period AD 495–725 (McCormac et al. 2004). In this paper, we present decadal measurements for the periods AD 395–485 and AD 735–805, which extend the original calibration set.
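The calibration step these abstracts refer to can be illustrated with a toy example: given a measured 14C age and a calibration curve, each candidate calendar year is weighted by the likelihood of the measurement. The "curve" below is an invented linear stand-in, not IntCal data:

```python
# Toy illustration of converting a measured 14C age into a calendar-age
# probability distribution against a calibration curve (invented values).

import math

def calibrate(c14_age, c14_err, curve):
    """curve: {cal_year_BP: (curve_14c_age, curve_err)}.
    Returns a normalized probability per calendar year."""
    probs = {}
    for year, (mu, sig) in curve.items():
        var = c14_err ** 2 + sig ** 2  # combine measurement and curve errors
        probs[year] = math.exp(-((c14_age - mu) ** 2) / (2 * var)) / math.sqrt(var)
    total = sum(probs.values())
    return {y: p / total for y, p in probs.items()}

# Invented, roughly linear curve segment: 8 14C yr per decade of calendar time.
curve = {25000 + 10 * i: (20950 + 8 * i, 20) for i in range(11)}
post = calibrate(21000, 25, curve)
best = max(post, key=post.get)  # → 25060, the curve point closest to 21000
```

A "flat" stretch of the real curve corresponds to many calendar years sharing nearly equal likelihood, which is exactly why calibrated Anglo-Saxon dates come out so wide.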
Abstract:
Although data quality and weighting decisions impact the outputs of reserve selection algorithms, these factors have not been closely studied. We examine these methodological issues in the use of reserve selection algorithms by comparing: (1) quality of input data and (2) use of different weighting methods for prioritizing among species. In 2003, the government of Madagascar, a global biodiversity hotspot, committed to tripling the size of its protected area network to protect 10% of the country’s total land area. We apply the Zonation reserve selection algorithm to distribution data for 52 lemur species to identify priority areas for the expansion of Madagascar’s reserve network. We assess the similarity of the areas selected, as well as the proportions of lemur ranges protected in the resulting areas when different forms of input data were used: extent of occurrence versus refined extent of occurrence. Low overlap between the areas selected suggests that refined extent of occurrence data are highly desirable, and to best protect lemur species, we recommend refining extent of occurrence ranges using habitat and altitude limitations. Reserve areas were also selected for protection based on three different species weighting schemes, resulting in marked variation in proportional representation of species among the IUCN Red List of Threatened Species extinction risk categories. This result demonstrates that assignment of species weights influences whether a reserve network prioritizes maximizing overall species protection or maximizing protection of the most threatened species.
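The effect of species weighting on which areas get selected can be sketched with a simple greedy coverage heuristic. This is a toy stand-in, not the Zonation algorithm; the cells, ranges, and weights below are invented:

```python
# Hypothetical sketch of weight-driven reserve selection: pick cells that
# maximize the summed weight of newly covered species (greedy coverage).

def greedy_select(cells, ranges, weights, n_pick):
    """Pick `n_pick` cells, each time maximizing the weighted number of
    species gaining their first protected cell."""
    chosen, covered = [], set()
    for _ in range(n_pick):
        def gain(cell):
            return sum(weights[sp] for sp in ranges
                       if cell in ranges[sp] and sp not in covered)
        best = max((c for c in cells if c not in chosen), key=gain)
        chosen.append(best)
        covered.update(sp for sp in ranges if best in ranges[sp])
    return chosen, covered

# Three candidate cells; species weighted by an (invented) threat category.
ranges = {"lemur_A": {1, 2}, "lemur_B": {2}, "lemur_C": {3}}
weights = {"lemur_A": 1.0, "lemur_B": 1.0, "lemur_C": 3.0}  # C = most threatened
chosen, covered = greedy_select([1, 2, 3], ranges, weights, n_pick=2)
print(chosen)  # → [3, 2]: the heavily weighted species lives only in cell 3
```

With uniform weights the same code would pick cell 2 first (it covers two species), which mirrors the abstract's point: the weighting scheme decides whether the network favours overall coverage or the most threatened species.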
Abstract:
Background
Recently, clinical and research attention has been focused on refining weaning processes to improve outcomes for critically ill patients who require mechanical ventilation. One such process, use of a weaning protocol, has yielded conflicting results, arguably because of the influence of existing context and processes.
Objective
To compare international data to assess differences in context and processes in intensive care units that could influence weaning.
Methods
Review of existing national data on provision of care for critically ill patients, including structure, staffing, skill mix, education, roles, and responsibilities for weaning in intensive care units of selected countries.
Results
Australia, New Zealand, Denmark, Norway, Sweden, and the United Kingdom showed similarities in critical care provision, structure, skill mix, and staffing ratios in intensive care units. Weaning in these countries is generally a collaborative process between nurses and physicians. Notable differences in intensive care units in the United States were the frequent use of an open structure and inclusion of respiratory therapists on the intensive care unit’s health care team. Nurses may be excluded from direct management of ventilator weaning in some institutions, as this role is primarily assumed by respiratory therapists guided by medical directives. Availability of critical care beds was highest in the United States and lowest in the United Kingdom.
Conclusion
Context and processes of care that could influence ventilator weaning outcomes varied considerably across countries. Further quantification of these contextual influences should be considered when translating research findings into local clinical practice and when designing randomized, controlled trials.
Abstract:
We present a Spatio-temporal 2D Models Framework (STMF) for 2D-Pose tracking. Space and time are discretized, and a mixture of probabilistic "local models" is learnt associating 2D Shapes and 2D Stick Figures. These spatio-temporal models generalize well for a particular viewpoint and state of the tracked action, but some spatio-temporal discontinuities can appear along a sequence as a direct consequence of the discretization. To overcome this problem, we propose to apply a Rao-Blackwellized Particle Filter (RBPF) in the 2D-Pose eigenspace, thus interpolating unseen data between view-based clusters. The fitness of the predicted 2D-Poses to the images is evaluated by combining our STMF with spatio-temporal constraints. A robust, fast and smooth human motion tracker is obtained by tracking only the few most important dimensions of the state space and by refining deterministically with our STMF.
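The general mechanics of particle filtering in a low-dimensional pose space can be sketched in one dimension. The code below is a plain bootstrap filter with invented motion and observation models, not the Rao-Blackwellized variant the paper uses:

```python
# Minimal bootstrap particle filter in one "eigen-dimension": predict with a
# random-walk motion model, weight by a Gaussian observation likelihood,
# then resample. Simplified stand-in for an RBPF; all parameters invented.

import math
import random

def particle_filter(observations, n_particles=500,
                    motion_sd=0.3, obs_sd=0.5, seed=0):
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # predict: diffuse particles with the motion model
        particles = [p + rng.gauss(0.0, motion_sd) for p in particles]
        # weight: Gaussian likelihood of the observation given each particle
        weights = [math.exp(-((z - p) ** 2) / (2 * obs_sd ** 2))
                   for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # estimate: weighted mean of the particle cloud, then resample
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# Track an (invented) drifting observation in the 1D state space.
track = particle_filter([0.0, 0.5, 1.0, 1.5])
```

Tracking only the few dominant eigenspace dimensions, as the abstract describes, keeps the particle count manageable because the cost of covering the state space grows quickly with its dimensionality.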
Abstract:
The Kawakawa/Oruanui tephra (KOT) is a key chronostratigraphic marker in terrestrial and marine deposits of the New Zealand (NZ) sector of the southwest Pacific. Erupted early during the Last Glacial Maximum (LGM), the wide distribution of the KOT enables inter-regional alignment of proxy records and facilitates comparison between NZ climatic variations and those from well-dated records elsewhere. We present 22 new radiocarbon ages for the KOT from sites and materials considered optimal for dating, and apply Bayesian statistical methods via OxCal4.1.7 that incorporate stratigraphic information to develop a new age probability model for KOT. The revised calibrated age, ±2 standard deviations, for the eruption of the KOT is 25,360 ± 160 cal yr BP. The age revision provides a basis for refining marine reservoir ages for the LGM in the southwest Pacific.
Abstract:
This paper proposes a two-level 3D human pose tracking method for a specific action captured by several cameras. The generation of pose estimates relies on fitting a 3D articulated model to a Visual Hull generated from the input images. First, an initial pose estimate is constrained by a low-dimensional manifold learnt by Temporal Laplacian Eigenmaps. Then, an improved global pose is calculated by refining individual limb poses. The validation of our method uses a public standard dataset and demonstrates its accuracy and computational efficiency. © 2011 IEEE.