882 results for Branch and bound algorithms
Abstract:
In this paper, a Computational Fluid Dynamics framework is presented for the modelling of key processes which involve granular material (i.e. segregation, degradation, caking). Appropriate physical models and sophisticated algorithms have been developed for the correct representation of the different material components in a granular mixture. The various processes, which arise from the micromechanical properties of the different mixture species, can be obtained and parametrised in a DEM/experimental framework, thus enabling the continuum theory to correctly account for the micromechanical properties of a granular system. The present study establishes the link between micromechanics and continuum theory and demonstrates the model's capabilities in simulations of processes which are of great importance to the process engineering industry and involve granular materials in complex geometries.
Abstract:
We discuss the application of the multilevel (ML) refinement technique to the Vehicle Routing Problem (VRP) and compare it to its single-level (SL) counterpart. Multilevel refinement recursively coarsens the problem to create a hierarchy of approximations and then refines a solution at each level of that hierarchy. A SL algorithm, which uses a combination of standard VRP heuristics, is developed first to solve instances of the VRP. A ML version, which extends the global view of these heuristics, is then created, using variants of the construction and improvement heuristics at each level. Finally, some multilevel enhancements are developed. Experimentation is used to find suitable parameter settings, and the final version is tested on two well-known VRP benchmark suites. Results comparing both the SL and ML algorithms are presented.
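A minimal, illustrative sketch of the multilevel idea described above: recursively coarsen by matching nearby customers, solve the coarsest instance, then project back and refine at each level. For brevity it refines a single giant tour with randomised 2-opt rather than the authors' full construction and improvement heuristics for capacitated routes; all function names and parameters are hypothetical.

```python
# Sketch of multilevel refinement on a simplified routing tour
# (not the paper's VRP heuristics): coarsening by nearest-pair
# matching, refinement by randomised 2-opt.
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def coarsen(pts):
    """One level: pair each point with its nearest unmatched neighbour
    and replace the pair by its midpoint; return coarse points + groups."""
    used, groups = set(), []
    for i, p in enumerate(pts):
        if i in used:
            continue
        best, best_d = None, float("inf")
        for j in range(i + 1, len(pts)):
            if j not in used and dist(p, pts[j]) < best_d:
                best, best_d = j, dist(p, pts[j])
        if best is None:
            groups.append([i])
        else:
            used.add(best)
            groups.append([i, best])
    coarse = [tuple(sum(pts[k][c] for k in g) / len(g) for c in (0, 1))
              for g in groups]
    return coarse, groups

def tour_length(tour, pts):
    return sum(dist(pts[tour[i - 1]], pts[tour[i]]) for i in range(len(tour)))

def two_opt(tour, pts, iters=2000):
    """Refinement step: accept random segment reversals that shorten the tour."""
    for _ in range(iters):
        i, j = sorted(random.sample(range(len(tour)), 2))
        if j - i < 2:
            continue
        cand = tour[:i] + tour[i:j][::-1] + tour[j:]
        if tour_length(cand, pts) < tour_length(tour, pts):
            tour = cand
    return tour

def multilevel(pts, coarsest=4):
    """Coarsen until small, solve, then uncoarsen and refine at each level."""
    if len(pts) <= coarsest:
        return two_opt(list(range(len(pts))), pts)
    coarse, groups = coarsen(pts)
    coarse_tour = multilevel(coarse, coarsest)
    tour = [k for c in coarse_tour for k in groups[c]]  # project back
    return two_opt(tour, pts)

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(40)]
print("ML tour:", round(tour_length(multilevel(pts), pts), 3))
print("SL tour:", round(tour_length(two_opt(list(range(40)), pts), pts), 3))
```

The last two lines mirror the SL-versus-ML comparison in the abstract: refinement applied once at the finest level versus refinement applied across the whole hierarchy.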
Abstract:
This paper provides a summary of our studies on robust speech recognition based on a new statistical approach: the probabilistic union model. We consider speech recognition in which part of the acoustic features may be corrupted by noise. The union model is a method for basing the recognition on the clean part of the features, thereby reducing the effect of the noise on recognition. In this respect, the union model is similar to the missing feature method. However, the two methods achieve this end through different routes. The missing feature method usually requires the identity of the noisy data for noise removal, while the union model combines the local features based on the union of random events, to reduce the dependence of the model on information about the noise. We previously investigated the applications of the union model to speech recognition involving unknown partial corruption in frequency bands, in time duration, and in feature streams. Additionally, a combination of the union model with conventional noise-reduction techniques was studied, as a means of dealing with a mixture of known or trainable noise and unknown, unexpected noise. In this paper, a unified review of each of these applications is provided in the context of dealing with unknown partial feature corruption, giving the appropriate theory and implementation algorithms, along with an experimental evaluation.
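To illustrate the contrast with the usual product rule, here is a minimal sketch of one common formulation of an order-m union combination of per-stream likelihoods (a sum of products over all subsets that leave out m streams). This is an assumption-laden simplification, not necessarily the exact model used in the paper.

```python
# Sketch: order-m probabilistic union of per-stream likelihoods.
# Assumption: the combined score sums products over all (N - m)-subsets,
# so at least one term ignores any m corrupted streams.
from itertools import combinations
from math import prod

def union_score(likelihoods, m):
    n = len(likelihoods)
    return sum(prod(likelihoods[i] for i in subset)
               for subset in combinations(range(n), n - m))

clean = [0.8, 0.7, 0.75]   # all three streams clean
noisy = [0.8, 0.7, 1e-6]   # third stream badly corrupted

print("product rule, clean:", prod(clean))
print("product rule, noisy:", prod(noisy))            # collapses towards zero
print("union m=1, clean:   ", union_score(clean, 1))
print("union m=1, noisy:   ", union_score(noisy, 1))  # 0.8*0.7 term survives
```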
Abstract:
Nineteen B-type stars, selected from the Palomar-Green Survey, have been observed at infrared wavelengths to search for possible infrared excesses, as part of an ongoing programme to investigate the nature of early-type stars at high Galactic latitudes. The resulting infrared fluxes, along with Strömgren photometry, are compared with theoretical flux profiles to determine whether any of the targets show evidence of circumstellar material, which may be indicative of post-main-sequence evolution. Eighteen of the targets have flux distributions in good agreement with theoretical predictions. However, one star, PG 2120+062, shows a small near-infrared excess, which may be due either to a cool companion of spectral type F5-F7, or to circumstellar material, indicating that it may be an evolved object such as a post-asymptotic giant branch star, in the transition region between the asymptotic giant branch and planetary nebula phases, with the infrared excess due to recent mass loss during giant branch evolution.
Abstract:
In this paper we concentrate on direct semi-blind spatial equalizer design for MIMO systems with Rayleigh fading channels. Our aim is to develop an algorithm which can outperform the classical training-based method using the same training information, while avoiding the slow convergence and local minima that afflict purely blind methods. A general semi-blind cost function is first constructed which incorporates both the training information from the known data and higher-order statistics (HOS) of the unknown sequence. Then, based on this cost function, we propose two semi-blind iterative and adaptive algorithms to find the desired spatial equalizer. To further improve the performance and convergence speed of the proposed adaptive method, we propose a technique to find the optimal choice of step size. Simulation results demonstrate the performance of the proposed algorithms in comparison with competing schemes.
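A minimal sketch of the semi-blind idea: blend a supervised LMS update on known training symbols with a blind constant-modulus (CMA) update, one common HOS-based criterion, on the unknown symbols. The single-channel form, the mixing weight lam, and the toy channel are illustrative assumptions; the paper's cost function and MIMO structure may differ.

```python
# Sketch: semi-blind adaptive equalization mixing LMS (trained) and
# CMA (blind, higher-order statistics) updates. Single-channel toy,
# not the paper's MIMO design; mu and lam are hypothetical settings.
import numpy as np

def semi_blind_equalize(x, pilots, n_taps=5, mu=0.01, lam=0.5, r2=1.0):
    """x: received samples; pilots: {sample index: known symbol}."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                       # centre-spike initialisation
    for k in range(n_taps, len(x)):
        u = x[k - n_taps:k][::-1]              # regressor, most recent first
        y = np.vdot(w, u)                      # equalizer output w^H u
        if k in pilots:                        # supervised (training) term
            e = pilots[k] - y
            step = mu
        else:                                  # blind CMA term
            e = y * (r2 - abs(y) ** 2)
            step = lam * mu
        w = w + step * np.conj(e) * u          # stochastic-gradient update
    return w

rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], size=2000)         # BPSK symbols
h = np.array([1.0, 0.4, -0.2])                 # toy channel
x = np.convolve(s, h)[: len(s)] + 0.05 * rng.standard_normal(len(s))
pilots = {k: s[k - 3] for k in range(5, 105)}  # 100 training symbols, delay 3
w = semi_blind_equalize(x.astype(complex), pilots)
print("equalizer taps:", np.round(w, 3))
```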
Abstract:
This paper outlines how the immediate life support (ILS) course was incorporated into an undergraduate nursing curriculum in a university in Northern Ireland. It also reports on how the students perceived the impact of this course on their clinical practice. The aim was to develop the students' ability to recognise the acutely ill patient and to determine the relevance of this to clinical practice. Prior to this, the ILS course was only available to qualified nurses, and this paper reports on the first time students were provided with an ILS course in an undergraduate setting. The ILS course was delivered to 89 third-year nursing students (Adult Branch) and comprised one full teaching day per week over two weeks. Recognised Advanced Life Support (ALS) instructors, in keeping with the United Kingdom Resuscitation Council guidelines, taught the students. Participants completed a 17-item questionnaire which included an open-ended section for student comment. Questionnaire data were analysed descriptively using SPSS version 15.0; open-ended responses were analysed by content and thematic analysis. Results: Student feedback indicated that the ILS course helped them understand what constituted the acutely ill patient and the role of the nurse in managing a deteriorating situation. Students also reported that they valued the experience for highlighting gaps in their knowledge. Conclusion: The inclusion of the ILS course provides students with the necessary skills to assess and manage the deteriorating patient. In addition, the data from this study suggest that the ILS course should be delivered in an inter-professional setting, i.e. taught jointly with medical students.
Abstract:
We study the predictability of a theoretical model for earthquakes, using a pattern recognition algorithm similar to the CN and M8 algorithms known in seismology. The model, which is a stochastic spring-block model with both global correlation and local interaction, becomes more predictable as the strength of the global correlation or the local interaction is increased.
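As a rough illustration of the kind of model and prediction test described above, here is a toy stochastic spring-block cellular automaton (in the Olami-Feder-Christensen spirit) with a crude activity-based alarm standing in for CN/M8-style precursor functionals. The grid size, coupling, thresholds and alarm rule are all invented for illustration and do not reproduce the authors' model.

```python
# Toy stochastic spring-block automaton with a crude precursor alarm.
# All parameters are illustrative, not the paper's model.
import random

random.seed(1)
N, ALPHA = 20, 0.2                      # grid size, neighbour coupling (<0.25)
stress = [[random.random() for _ in range(N)] for _ in range(N)]

def step():
    """Load every block (with a small stochastic term), then relax
    avalanches: a block at stress >= 1 topples, passing ALPHA * stress
    to each neighbour. Returns the avalanche size (number of topplings)."""
    for i in range(N):
        for j in range(N):
            stress[i][j] += 0.002 + 0.002 * random.random()
    size = 0
    queue = [(i, j) for i in range(N) for j in range(N) if stress[i][j] >= 1]
    while queue:
        i, j = queue.pop()
        if stress[i][j] < 1:
            continue
        s, stress[i][j] = stress[i][j], 0.0
        size += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < N and 0 <= nj < N:
                stress[ni][nj] += ALPHA * s
                if stress[ni][nj] >= 1:
                    queue.append((ni, nj))
    return size

sizes = [step() for _ in range(5000)]

# Crude alarm: flag a window as dangerous when recent activity is high,
# a stand-in for the pattern-recognition functionals of CN/M8.
hits = alarms = larges = 0
for t in range(50, len(sizes)):
    alarm = sum(1 for s in sizes[t - 50:t] if s > 0) > 35
    large = sizes[t] >= 30
    alarms += alarm
    larges += large
    hits += alarm and large
print(f"large events: {larges}, alarm windows: {alarms}, hits: {hits}")
```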
Abstract:
Due to the increasing prevalence of large-scale systems with multicore, multisocket nodes, many scientific applications are programmed using hybrid models that combine message passing and shared memory. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared memory or message passing, in isolation. The potential solution space, and thus the challenge, increases substantially when optimizing hybrid models, since the possible resource configurations grow exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, improved energy efficiency in hybrid parallel applications on large-scale systems is increasingly needed. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74 percent on average and up to 13.8 percent) with some performance gain (up to 7.5 percent) or negligible performance loss.
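The core loop such schemes share is simple: predict time and power for each (thread count, frequency) configuration, then pick the configuration minimizing energy subject to a performance constraint. A minimal sketch follows, with deliberately crude stand-in models (Amdahl-style time, cubic-in-frequency dynamic power); the paper's statistically trained predictors are more sophisticated.

```python
# Sketch: choosing a DCT+DVFS configuration from predicted time/power.
# The predictive models here are illustrative stand-ins, not the paper's.
from itertools import product

THREADS = [2, 4, 8, 16]
FREQS_GHZ = [1.2, 1.6, 2.0, 2.4]

def predict_time(threads, freq, work=100.0, serial_frac=0.1):
    """Amdahl-style stand-in: serial + parallel work, scaled by clock."""
    return (work * serial_frac + work * (1 - serial_frac) / threads) / freq

def predict_power(threads, freq, p_static=20.0, c=1.5):
    """Static power plus a dynamic term ~ threads * f^3 (DVFS cube rule)."""
    return p_static + c * threads * freq ** 3

def best_config(max_slowdown=1.25):
    """Minimum predicted energy among configs within 25% of baseline time."""
    baseline = predict_time(max(THREADS), max(FREQS_GHZ))
    best = None
    for n, f in product(THREADS, FREQS_GHZ):
        t = predict_time(n, f)
        if t > max_slowdown * baseline:
            continue                          # violates performance bound
        e = t * predict_power(n, f)           # energy = time x power
        if best is None or e < best[0]:
            best = (e, n, f, t)
    return best

e, n, f, t = best_config()
print(f"pick {n} threads @ {f} GHz: time {t:.2f} s, energy {e:.1f} J")
```

Under these stand-in models the search trades a modest slowdown (running at 2.0 GHz instead of 2.4 GHz) for a large cut in the cubic dynamic-power term, which is exactly the kind of configuration choice the abstract describes.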
Abstract:
BACKGROUND: Antibiotics are frequently prescribed for older adults who reside in long-term care facilities, and a substantial proportion of antibiotic use in this setting is inappropriate. Antibiotics are often prescribed for asymptomatic bacteriuria, a condition for which randomized trials of antibiotic therapy indicate no benefit and, in fact, harm. This proposal describes a randomized trial of diagnostic and therapeutic algorithms to reduce the use of antibiotics in residents of long-term care facilities. METHODS: In this ongoing study, 22 nursing homes have been randomized either to use of the algorithms (11 nursing homes) or to usual practice (11 nursing homes). The algorithms describe the signs and symptoms for which it would be appropriate to send urine cultures or to prescribe antibiotics. The algorithms are introduced through in-service sessions for nursing staff and one-on-one, case-scenario-based sessions for physicians. The primary outcome of the study is courses of antibiotics per 1000 resident-days. Secondary outcomes include urine cultures sent and antibiotic courses for urinary indications. Focus groups and semi-structured interviews with key informants will be used to assess the process of implementation and to identify key factors for sustainability.
Abstract:
The optimization of full-scale biogas plant operation is of great importance in making biomass a competitive source of renewable energy. The implementation of innovative control and optimization algorithms, such as Nonlinear Model Predictive Control, requires an online estimation of the operating state of the biogas plant. This state estimation allows optimal control and operating decisions to be made according to the actual state of the plant. In this paper such a state estimator is developed using a calibrated simulation model of a full-scale biogas plant, based on the Anaerobic Digestion Model No. 1. The use of advanced pattern recognition methods shows that model states can be predicted from basic online measurements such as biogas production, CH4 and CO2 content in the biogas, pH value, and substrate feed volume of known substrates. The machine learning methods used are trained and evaluated on synthetic data created by simulating the biogas plant model over a wide range of possible operating regions. Results show that the operating state vector of the modelled anaerobic digestion process can be predicted with an overall accuracy of about 90%. This facilitates the application of state-based optimization and control algorithms to full-scale biogas plants and therefore fosters the production of eco-friendly energy from biomass.
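In outline, the estimator is a supervised regression from online measurements to internal model states. The sketch below trains a multi-output random forest on synthetic data; the synthetic target states here are arbitrary functions invented for illustration, not outputs of the ADM1-based plant model used in the paper.

```python
# Sketch: pattern-recognition state estimation from online measurements.
# Synthetic inputs and target states are invented stand-ins; the paper
# generates its training data from a calibrated ADM1 plant model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.uniform(500, 1500, n),   # biogas production [m3/d]
    rng.uniform(50, 65, n),      # CH4 content [%]
    rng.uniform(30, 45, n),      # CO2 content [%]
    rng.uniform(6.8, 7.8, n),    # pH value
    rng.uniform(20, 80, n),      # substrate feed volume [m3/d]
])
# stand-in two-component "state vector" (e.g. VFA, ammonia) plus noise
Y = np.column_stack([
    0.002 * X[:, 0] - 0.3 * X[:, 3] + 0.01 * X[:, 4],
    0.05 * X[:, 1] - 0.04 * X[:, 2],
]) + 0.05 * rng.standard_normal((n, 2))

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
estimator = RandomForestRegressor(n_estimators=200, random_state=0)
estimator.fit(X_tr, Y_tr)
print("held-out R^2:", round(estimator.score(X_te, Y_te), 3))
```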
Abstract:
Here we review the recent progress made in the detection, examination, characterisation and interpretation of oscillations manifesting in small-scale magnetic elements in the solar photosphere. This region of the Sun's atmosphere is especially dynamic and, importantly, permeated with an abundance of magnetic field concentrations. Such magnetic features can span diameters of hundreds to many tens of thousands of km, and are thus commonly referred to as the 'building blocks' of the magnetic solar atmosphere. However, it is the smallest magnetic elements that have risen to the forefront of solar physics research in recent years. These structures, which include magnetic bright points, are often at the diffraction limit of even the largest solar telescopes. Importantly, it is the improvements in facilities, instrumentation, imaging techniques and processing algorithms during recent years that have allowed researchers to examine the motions, dynamics and evolution of such features on the smallest spatial and temporal scales to date. It is clear that while these structures may demonstrate significant magnetic field strengths, their small sizes make them prone to buffeting by the ubiquitous surrounding convective plasma motions. Here, it is believed that magnetohydrodynamic waves can be induced, which propagate along the field lines, carrying energy upwards to the outermost extremities of the solar corona. Such wave phenomena can exist in a variety of guises, including fast and slow magneto-acoustic modes, in addition to Alfvén waves. Coupled with rapid advancements in magnetohydrodynamic wave theory, we are now in an ideal position to thoroughly investigate how wave motion is generated in the solar photosphere, which oscillatory modes are most prevalent, and the role that these waves play in supplying energy to various layers of the solar atmosphere.
Abstract:
In this paper, we propose a malware categorization method that models malware behavior in terms of instructions using PageRank. PageRank computes the ranks of web pages from the structure of the links between them; applied in malware analysis, it can likewise compute ranks of instructions that capture the structural information of the instruction sequences. Our malware categorization method uses the computed ranks as features in machine learning algorithms. In the evaluation, we compare the effectiveness of different PageRank algorithms and also investigate bagging and boosting algorithms for improving categorization accuracy.
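A minimal sketch of the first stage of such a pipeline: build a directed transition graph over instruction mnemonics from a trace, compute PageRank by power iteration, and use the rank vector as a feature vector (which would then feed a bagging or boosting classifier). The mnemonic vocabulary and the trace below are invented; the paper's graph construction may differ.

```python
# Sketch: PageRank over an instruction-transition graph as ML features.
# VOCAB and the trace are toy inputs for illustration.
VOCAB = ["mov", "push", "pop", "call", "ret", "jmp", "cmp", "xor"]

def pagerank_features(trace, d=0.85, iters=50):
    """Directed edges between consecutive instructions; standard
    power iteration with damping d; dangling mass spread uniformly."""
    n = len(VOCAB)
    idx = {m: i for i, m in enumerate(VOCAB)}
    out = [[] for _ in range(n)]
    for a, b in zip(trace, trace[1:]):
        out[idx[a]].append(idx[b])
    rank = [1.0 / n] * n
    for _ in range(iters):
        nxt = [(1 - d) / n] * n
        for u in range(n):
            if out[u]:
                share = d * rank[u] / len(out[u])
                for v in out[u]:
                    nxt[v] += share
            else:                               # dangling node
                for v in range(n):
                    nxt[v] += d * rank[u] / n
        rank = nxt
    return rank

trace = ["push", "mov", "xor", "cmp", "jmp", "mov", "call",
         "push", "mov", "pop", "ret"]
features = pagerank_features(trace)
for mnem, r in sorted(zip(VOCAB, features), key=lambda t: -t[1]):
    print(f"{mnem:5s} {r:.3f}")
# in the full method, these rank vectors become classifier inputs
```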
Abstract:
This paper describes a novel RISC microprocessor that can be utilised to rapidly develop a reprogrammable, high-performance embedded security-processing system in SoC designs. Generic and innovative algorithm-specific instructions have been developed for a wide range of private-key and hash algorithms. To the authors' knowledge, this is the first generic cryptographic microprocessor to be reported in the literature.
Abstract:
The human genome sequencing project opened the way for the emergence of new transdisciplinary research areas, such as computational biology, bioinformatics and biostatistics. One of the results emerging from this advent was DNA microarray technology, which allows the expression profiles of thousands of genes to be studied when subjected to external perturbations. Although it is a relatively consolidated technology, it still presents a vast set of challenges, notably from the point of view of computation and information systems. Examples include the optimisation of data-processing procedures and the development of methodologies for the semi-automatic interpretation of results. The main objective of this work was to explore new technical solutions to streamline the procedures for storing, sharing and analysing data from microarray experiments. To this end, an analysis was carried out of the requirements associated with the main stages of an experiment; the main shortcomings were identified, improvement strategies were proposed, and new solutions were presented. For laboratory data management, a LIMS (Laboratory Information Management System) is proposed that enables the management of all the data generated and the procedures performed. This system also integrates a solution for sharing experiments, so as to promote the collaborative participation of several researchers in the same project, even when using different LIMS. In the context of data analysis, a model is presented that facilitates the integration of processing and analysis algorithms into the developed system. Finally, a solution is proposed to facilitate the biological interpretation of a set of differentially expressed genes, through tools that integrate information from several biomedical databases.
Abstract:
The development of high-throughput genome sequencing equipment has dramatically increased the amount of available data. However, to uncover relevant information from the analysis of these data, increasingly specific software is required, oriented towards particular tasks that help the researcher reach conclusions as quickly as possible. This is where bioinformatics arises as a fundamental ally of biology, since it takes advantage of computational methods and infrastructures to develop algorithms and software applications. On the other hand, new biological questions usually have to be answered with new, specific solutions, so application development becomes a permanent challenge for software engineers. It was in this context that the main objectives of this work emerged, centred on the analysis of triplets and of repeats in DNA primary structures. To this end, new methods and new algorithms were proposed that allow large volumes of data to be processed and results to be obtained from them. For the analysis of codon and amino-acid triplets, a system was proposed with two strands: on the one hand, the processing of the data; on the other, the publication of the processed data on the Web through a visual query-composition mechanism. Regarding repeat analysis, a system was proposed and developed to identify repeated nucleotide and amino-acid patterns in specific sequences, with particular application to orthologous genes. The proposed solutions were subsequently validated through case studies that attest to the value of the work developed.
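To make the two strands of analysis concrete, here is a minimal sketch of codon-triplet counting in a fixed reading frame and of repeated k-mer detection over a DNA primary structure; the toy sequence and parameters are illustrative only, not the systems developed in the thesis.

```python
# Sketch: codon-triplet counting and repeated-pattern detection in DNA.
from collections import Counter

def codon_counts(seq, frame=0):
    """Non-overlapping triplets (codons) in the chosen reading frame."""
    return Counter(seq[i:i + 3] for i in range(frame, len(seq) - 2, 3))

def repeated_kmers(seq, k, min_count=2):
    """All overlapping k-mers occurring at least min_count times."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return {kmer: c for kmer, c in counts.items() if c >= min_count}

seq = "ATGAAACCCGGGTTTATGAAACCCTAG"  # toy sequence with a repeated block
print("codons:", codon_counts(seq).most_common(3))
print("repeated 6-mers:", repeated_kmers(seq, 6))
```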