862 results for COMBINATORIAL TECHNOLOGIES
Abstract:
A set of 13 US-based experts in post-combustion and oxy-fuel combustion CO2 capture systems responded to an extensive questionnaire asking their views on the present status and expected future performance and costs of amine-based, chilled-ammonia, and oxy-combustion retrofits of coal-fired power plants. This paper presents the experts' responses on technology maturity, the ideal plant characteristics for early adopters, and the extent to which R&D and deployment incentives will affect costs. It also presents best estimates and 95% confidence limits for the energy penalties associated with amine-based systems. The results show a general consensus that amine-based systems are closer to commercial application but offer limited potential for improving performance and lowering costs; chilled ammonia and oxy-combustion offer greater potential for cost reductions, but with greater uncertainty regarding scale and technical feasibility. © 2011 Elsevier Ltd.
Abstract:
Gemstone Team FRESH
Abstract:
BACKGROUND: Measurement of CD4+ T-lymphocytes (CD4) is a crucial parameter in the management of HIV patients, particularly in determining eligibility to initiate antiretroviral treatment (ART). A number of technologies exist for CD4 enumeration, with considerable variation in cost, complexity, and operational requirements. We conducted a systematic review of the performance of technologies for CD4 enumeration. METHODS AND FINDINGS: Studies were identified by searching the electronic databases MEDLINE and EMBASE using a pre-defined search strategy. Data on test accuracy and precision included bias and limits of agreement with a reference standard, and misclassification probabilities around the CD4 thresholds of 200 and 350 cells/μl over a clinically relevant range. The secondary outcome measure was test imprecision, expressed as the % coefficient of variation. Thirty-two studies evaluating 15 CD4 technologies were included, of which fewer than half presented data on bias and misclassification against the same reference technology. At CD4 counts <350 cells/μl, bias ranged from -35.2 to +13.1 cells/μl, while at counts >350 cells/μl it ranged from -70.7 to +47 cells/μl, with the BD FACSCount as the reference technology. Misclassification around the 350 cells/μl threshold ranged from 1-29% for upward classification, resulting in under-treatment, and 7-68% for downward classification, resulting in over-treatment. Fewer than half of these studies reported within-laboratory precision or reproducibility of the CD4 values obtained. CONCLUSIONS: A wide range of bias and percent misclassification around treatment thresholds was reported for the CD4 enumeration technologies included in this review, and few studies reported assay precision. The lack of standardised methodology for test evaluation, including the use of different reference standards, is a barrier to assessing relative assay performance and could hinder the introduction of new point-of-care assays in the countries where they are most needed.
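The accuracy and precision measures used in the review (bias, Bland-Altman 95% limits of agreement, % coefficient of variation, and misclassification around the 350 cells/μl treatment threshold) can be sketched as follows. The paired CD4 counts are invented for illustration and are not data from any included study.

```python
# Illustrative sketch of the review's accuracy and precision measures:
# bias and 95% limits of agreement against a reference technology,
# imprecision as % coefficient of variation, and misclassification
# around the 350 cells/ul treatment threshold. All numbers are made up.
from statistics import mean, stdev

def bias_and_loa(test, reference):
    """Mean difference (bias) and Bland-Altman 95% limits of agreement."""
    diffs = [t - r for t, r in zip(test, reference)]
    b, s = mean(diffs), stdev(diffs)
    return b, (b - 1.96 * s, b + 1.96 * s)

def percent_cv(replicates):
    """Within-laboratory imprecision as % coefficient of variation."""
    return 100 * stdev(replicates) / mean(replicates)

# Made-up paired CD4 counts (cells/ul): candidate technology vs reference.
test = [180, 250, 360, 400, 510]
ref = [195, 240, 340, 390, 530]

bias, (lo, hi) = bias_and_loa(test, ref)

# Upward misclassification (reference below 350 but test at/above it) leads
# to under-treatment; downward misclassification leads to over-treatment.
upward = sum(r < 350 <= t for t, r in zip(test, ref))
downward = sum(t < 350 <= r for t, r in zip(test, ref))

print(round(bias, 1), upward, downward)   # → 1.0 1 0
print(round(percent_cv([310, 330, 320]), 1))  # → 3.1
```

Note that a small mean bias can coexist with wide limits of agreement, which is exactly why misclassification around treatment thresholds is reported separately.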
Abstract:
The conception of the FUELCON architecture, a composite tool for generating and validating patterns for assigning fuel assemblies to positions in the grid of a reactor core section, has evolved throughout the history of the project. Different options for the various subtasks were possible, envisioned, or actually explored or adopted. We project these successive, or even concomitant, configurations of the architecture into a meta-architecture, which, not by chance, reflects basic choices in the field's history over the last decade.
Abstract:
The consecutive, partly overlapping emergence of expert systems and then neural computation methods among intelligent technologies is reflected in the evolving scene of their application to nuclear engineering. This paper provides a bird's-eye view of the state of such applications in the domain, along with a review of one task that is perhaps the most economically important: refueling design in nuclear power reactors.
Abstract:
Preface [Special issue containing a selection of papers presented at the International Symposium on Combinatorial Optimisation (CO2000), held at the University of Greenwich, London, 12-14 July 2000.]
Abstract:
We consider the multilevel paradigm and its potential to aid the solution of combinatorial optimisation problems. The multilevel paradigm is a simple one, which involves recursive coarsening to create a hierarchy of approximations to the original problem. An initial solution is found (sometimes for the original problem, sometimes for the coarsest approximation) and then iteratively refined at each level. As a general solution strategy, the multilevel paradigm has been in use for many years and has been applied to many problem areas (most notably in the form of multigrid techniques). However, with the exception of the graph partitioning problem, multilevel techniques have not been widely applied to combinatorial optimisation problems. In this paper we address the issue of multilevel refinement for such problems and, with the aid of examples and results in graph partitioning, graph colouring and the travelling salesman problem, make a case for its use as a metaheuristic. The results provide compelling evidence that, although the multilevel framework cannot be considered a panacea for combinatorial problems, it can provide an extremely useful addition to the combinatorial optimisation toolkit. We also give a possible explanation for the underlying process and extract some generic guidelines for its future use on other combinatorial problems.
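The coarsen/solve/refine cycle described above can be sketched concretely for graph bisection. The greedy edge-matching coarsening and swap-based refinement below are deliberately simple stand-ins for the heavier machinery the paper surveys, not the algorithms it evaluates.

```python
# Minimal multilevel graph bisection: coarsen by edge matching, find an
# initial solution at the coarsest level, then project back and refine at
# each level. The coarsening and refinement rules are toy stand-ins.

def coarsen(adj):
    """Merge each vertex with one unmatched neighbour (greedy edge matching).
    Returns the coarse adjacency and a map: representative -> fine vertices."""
    merge = {}
    for v in sorted(adj):
        if v in merge:
            continue
        partner = next((u for u in sorted(adj[v]) if u not in merge), None)
        merge[v] = v
        if partner is not None:
            merge[partner] = v
    coarse = {merge[v]: set() for v in adj}
    for v in adj:
        for u in adj[v]:
            if merge[v] != merge[u]:
                coarse[merge[v]].add(merge[u])
    groups = {}
    for v, rep in merge.items():
        groups.setdefault(rep, []).append(v)
    return coarse, groups

def cut(adj, part):
    """Number of edges crossing the bisection."""
    return sum(part[v] != part[u] for v in adj for u in adj[v]) // 2

def refine(adj, part, sweeps=5):
    """Greedy refinement: swap cross-partition pairs that reduce the cut."""
    best = cut(adj, part)
    for _ in range(sweeps):
        for a in sorted(part):
            for b in sorted(part):
                if part[a] == part[b]:
                    continue
                part[a], part[b] = part[b], part[a]
                new = cut(adj, part)
                if new < best:
                    best = new
                else:
                    part[a], part[b] = part[b], part[a]  # undo the swap
    return part

def balanced_split(adj):
    """Trivial balanced initial solution."""
    vs = sorted(adj)
    return {v: int(i >= len(vs) / 2) for i, v in enumerate(vs)}

def multilevel_bisect(adj):
    if len(adj) <= 2:
        return balanced_split(adj)       # solve at the coarsest level
    coarse, groups = coarsen(adj)
    if len(coarse) == len(adj):          # coarsening stalled; just refine
        return refine(adj, balanced_split(adj))
    coarse_part = multilevel_bisect(coarse)
    # Project the coarse solution onto the finer graph, then refine it.
    part = {v: coarse_part[rep] for rep, vs in groups.items() for v in vs}
    return refine(adj, part)

# Two 4-cycles joined by the single edge (3, 4); the optimal cut is 1.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2, 4},
       4: {3, 5, 7}, 5: {4, 6}, 6: {5, 7}, 7: {4, 6}}
part = multilevel_bisect(adj)
print(cut(adj, part))  # → 1
```

The point of the hierarchy is visible even here: the coarse solution hands the refinement a partition that is already globally sensible, so local moves only have to polish it.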
Abstract:
Problems in preserving the quality of granular material products are complex and arise from a series of sources during transport and storage. Whether designing a new plant or, more likely, analysing problems that give rise to product quality degradation in existing operations, the process engineer needs practical measurement and simulation tools and technologies, both to identify the source of such problems and then to design them out. As part of a major research programme on quality in particulate manufacturing, computational models have been developed for segregation in silos, degradation in pneumatic conveyors, and the development of caking during storage, which use, where possible, micro-mechanical relationships to characterize the behaviour of granular materials. The objective of the work presented here is to demonstrate the use of these computational models of unit processes in the analysis of large-scale processes involving the handling of granular materials. This paper presents a set of simulations of a complete large-scale granular materials handling operation, involving the discharge of the material from a silo, its transport through a dilute-phase pneumatic conveyor, and its storage in a big bag under varying environmental temperature and humidity conditions. Conclusions are drawn on the capability of the computational models to represent key granular processes, including particle size segregation, degradation, and moisture-migration caking.
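The chaining of unit-process models described above (silo discharge, then pneumatic conveying, then big-bag storage) can be sketched as a toy pipeline. The size classes, segregation and breakage fractions, and caking index below are illustrative assumptions, not the paper's micro-mechanical models.

```python
# Toy sketch of chaining unit-process models for a granular handling
# operation: silo discharge -> dilute-phase pneumatic conveying -> big-bag
# storage. A particle size distribution (mass fractions) flows through the
# stages; every coefficient here is an assumption for illustration only.

def silo_discharge(feed, segregation=0.10):
    """Size segregation at discharge: a toy rule that enriches the drawn
    stream in fines at the expense of the coarse fraction."""
    psd = dict(feed)
    shift = segregation * psd["coarse"]
    psd["coarse"] -= shift
    psd["fine"] += shift
    return psd

def pneumatic_convey(psd, breakage=0.15):
    """Degradation in the conveyor: a fraction of each coarser class
    breaks down into the next finer class."""
    out = dict(psd)
    broken = breakage * out["coarse"]
    out["coarse"] -= broken
    out["medium"] += broken
    broken = breakage * out["medium"]
    out["medium"] -= broken
    out["fine"] += broken
    return out

def caking_index(psd, relative_humidity):
    """Toy moisture-migration caking tendency in storage: grows with fines
    content and relative humidity, capped at 1.0."""
    return min(1.0, psd["fine"] * relative_humidity / 0.25)

feed = {"fine": 0.20, "medium": 0.50, "coarse": 0.30}
in_bag = pneumatic_convey(silo_discharge(feed))
risk = caking_index(in_bag, relative_humidity=0.70)
print({k: round(v, 3) for k, v in in_bag.items()}, round(risk, 2))
```

The structure, not the coefficients, is the point: each stage consumes the previous stage's output, so upstream segregation and degradation feed directly into the downstream caking prediction.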
Abstract:
The use of games technology in education is not a new phenomenon. Even back in the days of 286 processors, PCs were used in some schools along with (what now looks like) primitive simulation software to teach a range of different skills and techniques – from basic programming using Logo (the turtle-style car with a pen at the back that could be used to draw on the floor – always a good way of attracting the attention of school kids!) up to quite sophisticated replications of physical problems, such as working out the trajectory of a missile to blow up an enemy's tank. So why are games not more widely used in education (especially in FE and HE)? Can they help to support learners even at this advanced stage in their education? In this article we aim to provide an overview of the use of game technologies in education (almost as a small literature review for interested parties) and then go into more depth on one particular example we aim to introduce from the coming academic year (Sept. 2006) to help with the teaching and assessment of one area of our Multimedia curriculum. Of course, we will not yet be able to provide the reader with data on how successful this is, but we will be running a blog (http://themoviesineducation.blogspot.com/) to keep interested parties up to date with the progress of the project and, hopefully, to help others set up similar solutions themselves. We will also only consider a small element of the implementation here and cover how such assessment processes could be used in a broader context. The use of a game to aid learning and improve achievement is suggested because traditional methods of engagement are currently failing on some levels. By this we mean that various parts of the production process we normally cover in our Multimedia degree are becoming difficult to monitor and continually assess.
Abstract:
The aim of this work is to improve retrieval and navigation services over bibliographic data held in digital libraries. This paper presents the design and implementation of OntoBib, an ontology-based bibliographic database system that adopts ontology-driven search in its retrieval. The presented work exemplifies how a digital library of bibliographic data can be managed using Semantic Web technologies and how utilizing domain-specific knowledge improves both search efficiency and the navigation of web information and document retrieval.
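The kind of ontology-driven retrieval described can be sketched with a minimal subject hierarchy: a query for a broad concept also retrieves records indexed under narrower concepts, which plain keyword matching would miss. The hierarchy, records, and property names below are illustrative assumptions, not OntoBib's actual schema.

```python
# Toy sketch of ontology-driven bibliographic search: queries are expanded
# over a subject hierarchy (narrower -> broader links), so a search for a
# broad concept also finds records indexed under narrower ones. The
# hierarchy and records are invented for illustration.

# A minimal "ontology": narrower subject -> broader subject.
BROADER = {
    "graph partitioning": "combinatorial optimisation",
    "travelling salesman problem": "combinatorial optimisation",
    "combinatorial optimisation": "operations research",
}

RECORDS = [
    {"title": "Multilevel refinement for graph partitioning",
     "subject": "graph partitioning"},
    {"title": "Granular material handling simulations",
     "subject": "particle technology"},
]

def ancestors(subject):
    """All broader concepts of a subject, following the hierarchy upward."""
    out = set()
    while subject in BROADER:
        subject = BROADER[subject]
        out.add(subject)
    return out

def ontology_search(query_subject):
    """Return titles whose subject matches the query directly or via a
    broader-concept link; a keyword matcher would only find direct hits."""
    return [r["title"] for r in RECORDS
            if r["subject"] == query_subject
            or query_subject in ancestors(r["subject"])]

print(ontology_search("combinatorial optimisation"))
# → ['Multilevel refinement for graph partitioning']
```

In a real Semantic Web implementation the same expansion would be expressed declaratively, e.g. as a SPARQL property path over the subject hierarchy, rather than as a hand-written loop.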
Abstract:
Assembly processes used to bond components to printed circuit boards can have a significant impact on these boards and the final packaged component. Traditional approaches to bonding components to printed circuit boards result in heat being applied across the whole board assembly, which can lead to board warpage and possibly high residual stresses. Another approach, discussed in this paper, is to use Variable Frequency Microwave (VFM) heating to cure adhesives and underfills and bond components to printed circuit boards. In terms of energy use, VFM technology is much more cost-effective than convection/radiation heating. This paper discusses the impact of traditional reflow-based processes on flexible substrates and demonstrates the possible advantages of using localised variable frequency microwave heating to cure materials in an electronic package.
Abstract:
Numerical modelling technology and software are now being used to underwrite the design of many microelectronic and microsystems components. The demands on these analysis tools are increasing dramatically as the user community faces the challenge of producing reliable products in ever shorter lead times. This leads to the requirement for analysis tools to represent the interactions amongst distinct phenomena and physics at multiple length and time scales. Multi-physics and multi-scale technology is now becoming a reality with many code vendors. This chapter discusses the current status of modelling tools that assess the impact of nano-technology on the fabrication, packaging, and testing of microsystems. The chapter is broken down into three sections: modelling technologies; the application of modelling to fabrication; and the application of modelling to assembly/packaging, test, and metrology.