976 results for optimisation methods


Relevance:

30.00%

Publisher:

Abstract:

This research focuses on the optimisation of resource utilisation in wireless mobile networks while taking into account the users' experienced quality of video streaming services. The study specifically considers the new generation of mobile communication networks, i.e. 4G-LTE, as the main research context. The background study provides an overview of the main properties of the relevant technologies investigated, including video streaming protocols and networks, video service quality assessment methods, the infrastructure and related functionalities of LTE, and resource allocation algorithms in mobile communication systems. A mathematical model based on an objective, no-reference quality assessment metric for video streaming, namely Pause Intensity, is developed in this work for evaluating the continuity of streaming services. The analytical model is verified by extensive simulation and subjective testing of the joint impairment effects of pause duration and pause frequency. Various types of video content and different levels of impairment were used in the validation tests. It is shown that Pause Intensity is closely correlated with subjective quality measurement in terms of the Mean Opinion Score, and that this correlation is content independent. Based on the Pause Intensity metric, an optimised resource allocation approach is proposed for given user requirements, communication system specifications and network performance. This approach addresses both system efficiency and fairness when establishing appropriate resource allocation algorithms, together with the correlation between the required and allocated data rates per user. Pause Intensity plays a key role here, representing the required level of Quality of Experience (QoE) to ensure the best balance between system efficiency and fairness. The 3GPP Long Term Evolution (LTE) system is used as the main application environment in which the proposed research framework is examined, and the results are compared with existing scheduling methods in terms of achievable fairness, efficiency and correlation. Adaptive video streaming technologies are also investigated and combined with our approach to determining the distribution of QoE performance across the network. The resulting scheduling process is controlled through the prioritisation of users according to their perceived quality of the services received, while a trade-off between fairness and efficiency is maintained through an online adjustment of the scheduler's parameters. Furthermore, Pause Intensity is applied as a regulator to realise the rate adaptation function during the end user's playback of the adaptive streaming service. The adaptive rates under various channel conditions, and the shape of the QoE distribution amongst the users for different scheduling policies, are demonstrated in the context of LTE. Finally, work on the interworking between the mobile communication system at the macro-cell level and different deployments of WiFi technologies throughout the macro-cell is presented. A QoE-driven approach is proposed to analyse the offloading mechanism for user data (e.g. video traffic), while the new rate distribution algorithm reshapes the network capacity across the macro-cell. The scheduling policy derived is used to regulate the performance of resource allocation across the fairness-efficiency spectrum. The associated offloading mechanism can properly control the number of users within the coverage of the macro-cell base station and of each of the WiFi access points involved. The performance of non-seamless, user-controlled mobile traffic offloading (through mobile WiFi devices) is evaluated and compared with that of standard operator-controlled WiFi hotspots.
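
As an illustration of the kind of metric described above, the sketch below combines pause duration and pause frequency into a single continuity score. It is a minimal, hypothetical computation assuming the metric grows with both the fraction of playback time spent stalled and the stall rate; the exact Pause Intensity formula used in the thesis is not given in the abstract.

```python
# Hypothetical sketch of a pause-based continuity metric combining pause
# duration and pause frequency, in the spirit of the Pause Intensity idea.
# The exact formula used in the thesis is not reproduced here.

def pause_intensity(pause_durations, playback_time):
    """Combine total pause time and pause count over a playback session."""
    if playback_time <= 0:
        raise ValueError("playback_time must be positive")
    pause_ratio = sum(pause_durations) / playback_time      # fraction of time stalled
    pause_frequency = len(pause_durations) / playback_time  # stalls per second
    return pause_ratio * pause_frequency                    # joint impairment measure

# Example: three stalls of 1.5 s, 2.0 s and 0.5 s in a 120 s session
print(pause_intensity([1.5, 2.0, 0.5], playback_time=120.0))
```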

Relevance:

30.00%

Publisher:

Abstract:

Tool life is an important factor to be considered during the optimisation of a machining process, since cutting parameters can be adjusted to optimise tool changing, reducing production cost and time. The performance of a tool is also directly linked to the generated surface roughness, which matters in cases where there are strict surface quality requirements. The prediction of tool life and the resulting surface roughness in milling operations has attracted considerable research effort. The research reported herein focuses on defining the influence of milling cutting parameters such as cutting speed, feed rate and axial depth of cut on three major tool performance parameters, namely tool life, material removal and surface roughness. The research seeks to define methods that will allow the selection of optimal parameters for best tool performance when face milling 416 stainless steel bars. For this study, the Taguchi method was applied using a special orthogonal array design that allows the entire parameter space to be studied with only a small number of experiments, saving experimental cost and time. The findings were that cutting speed has the most influence on tool life and surface roughness and very limited influence on material removal. Finally, tool performance can be judged either from tool life or from the volume of material removed.
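
For readers unfamiliar with the Taguchi approach mentioned above, the sketch below sets up an L9 orthogonal array for three factors at three levels and applies a larger-the-better signal-to-noise analysis. The tool-life values are placeholders for illustration, not the study's measurements.

```python
import numpy as np

# L9 orthogonal array for three factors (cutting speed, feed rate, axial
# depth of cut) at three levels each; level indices are 0, 1, 2 per run.
L9 = np.array([
    [0, 0, 0], [0, 1, 1], [0, 2, 2],
    [1, 0, 1], [1, 1, 2], [1, 2, 0],
    [2, 0, 2], [2, 1, 0], [2, 2, 1],
])
# Hypothetical tool-life response for each of the nine runs (minutes)
tool_life = np.array([41, 38, 30, 35, 28, 33, 22, 27, 25], dtype=float)

# "Larger-the-better" signal-to-noise ratio for each run
sn = -10.0 * np.log10(1.0 / tool_life**2)

# Mean S/N per level of each factor: the factor with the largest spread
# between level means has the strongest influence (cutting speed, for this
# placeholder data).
factors = ["cutting speed", "feed rate", "axial depth of cut"]
for j, name in enumerate(factors):
    level_means = [sn[L9[:, j] == lvl].mean() for lvl in range(3)]
    print(name, "delta S/N =", round(max(level_means) - min(level_means), 2))
```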

Relevance:

30.00%

Publisher:

Abstract:

The increasing number of renewable energy generators introduced into the electricity grid is putting pressure on its stability and management, as the output of renewable energy sources cannot be accurately predicted or fully controlled. Together with the additional pressure of fluctuations in demand, this presents a problem more complex than the one the current methods of controlling electricity distribution were designed for. A global, approximate and distributed optimisation method for power allocation that accommodates uncertainties and volatility is suggested and analysed. It is based on a probabilistic method known as message passing [1], which has deep links to statistical physics methodology. This principled optimisation method is based on local calculations and inherently accommodates uncertainties; it is of modest computational complexity and provides good approximate solutions. We consider uncertainty and fluctuations drawn from a Gaussian distribution and incorporate them into the message-passing algorithm. We examine the effect that increasing uncertainty has on the transmission cost, and how the placement of volatile nodes within a grid, such as renewable generators or consumers, affects it.
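
As a small, hedged illustration of how Gaussian fluctuations enter such a cost-based formulation (not a reproduction of the paper's message-passing algorithm): with a quadratic cost on the power carried by a line, a zero-mean Gaussian fluctuation of variance sigma^2 simply adds sigma^2 to the expected cost, which is the kind of local correction a min-sum style update can absorb.

```python
import numpy as np

# Illustrative only: expected quadratic line cost under Gaussian node
# fluctuations xi ~ N(0, sigma^2): E[(y + xi)^2] = y^2 + sigma^2.
rng = np.random.default_rng(0)

def expected_line_cost(y, sigma, n_samples=200_000):
    xi = rng.normal(0.0, sigma, n_samples)
    return np.mean((y + xi) ** 2)

y, sigma = 2.0, 0.5
print(expected_line_cost(y, sigma))   # ~ y**2 + sigma**2 = 4.25
```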

Relevance:

30.00%

Publisher:

Abstract:

Introduction: There is increasing evidence that electronic prescribing (ePrescribing) or computerised provider/physician order entry (CPOE) systems can improve the quality and safety of healthcare services. However, it has also become clear that their implementation is not straightforward and may create unintended or undesired consequences once in use. In this context, qualitative approaches have been particularly useful and their interpretative synthesis could make an important and timely contribution to the field. This review will aim to identify, appraise and synthesise qualitative studies on ePrescribing/CPOE in hospital settings, with or without clinical decision support. Methods and analysis: Data sources will include the following bibliographic databases: MEDLINE, MEDLINE In Process, EMBASE, PsycINFO, Social Policy and Practice via Ovid, CINAHL via EBSCO, The Cochrane Library (CDSR, DARE and CENTRAL databases), Nursing and Allied Health Sources, Applied Social Sciences Index and Abstracts via ProQuest and SCOPUS. In addition, other sources will be searched for ongoing studies (ClinicalTrials.gov) and grey literature: Healthcare Management Information Consortium, Conference Proceedings Citation Index (Web of Science) and Sociological abstracts. Studies will be independently screened for eligibility by 2 reviewers. Qualitative studies, either standalone or in the context of mixed-methods designs, reporting the perspectives of any actors involved in the implementation, management and use of ePrescribing/CPOE systems in hospital-based care settings will be included. Data extraction will be conducted by 2 reviewers using a piloted form. Quality appraisal will be based on criteria from the Critical Appraisal Skills Programme checklist and Standards for Reporting Qualitative Research. Studies will not be excluded based on quality assessment. A postsynthesis sensitivity analysis will be undertaken. Data analysis will follow the thematic synthesis method. Ethics and dissemination: The study does not require ethical approval as primary data will not be collected. The results of the study will be published in a peer-reviewed journal and presented at relevant conferences.

Relevance:

30.00%

Publisher:

Abstract:

Phospholipid oxidation by adventitious damage generates a wide variety of products with potentially novel biological activities that can modulate the inflammatory processes associated with various diseases. Understanding the biological importance of oxidised phospholipids (OxPLs) and their potential role as disease biomarkers requires precise information about the abundance of these compounds in cells and tissues. There are many chemiluminescence and spectrophotometric assays available for detecting oxidised phospholipids, but they all have limitations. Mass spectrometry coupled with liquid chromatography is a powerful and sensitive approach that can provide detailed information about the oxidative lipidome, but challenges remain. The aim of this work is to develop improved methods for the detection of OxPLs by optimising chromatographic separation, testing several reversed-phase columns and solvent systems, and using targeted mass spectrometry approaches. Initial experiments were carried out using oxidation products generated in vitro to optimise the chromatographic separation and mass spectrometry parameters. We evaluated the chromatographic separation of oxidised phosphatidylcholines (OxPCs) and oxidised phosphatidylethanolamines (OxPEs) using C8, C18 and C30 reversed-phase columns, a polystyrene-divinylbenzene-based monolithic column and a mixed-mode hydrophilic interaction (HILIC) column, interfaced with mass spectrometry. Our results suggest that the monolithic column was best able to separate short-chain OxPCs and OxPEs from long-chain oxidised and native PCs and PEs. However, the variation in the charge of the polar head groups and the extreme diversity of oxidised species make the analysis of several classes of OxPLs within one analytical run impractical. We therefore evaluated and optimised the chromatographic separation of OxPLs by serially coupling two columns, HILIC and monolithic, which provided larger coverage of OxPL species in a single analytical run.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes an implementation of a method capable of integrating parametric, feature-based CAD models built in commercial software (CATIA) with the SU2 software framework. To exploit the adjoint-based methods for aerodynamic optimisation within SU2, a formulation for obtaining geometric sensitivities directly from the commercial CAD parameterisation is introduced, enabling the calculation of gradients with respect to CAD-based design variables. To assess the accuracy and efficiency of the alternative approach, two aerodynamic optimisation problems are investigated: an inviscid 3D problem with multiple constraints, and a viscous 2D high-lift aerofoil problem without constraints. Initial results show the new parameterisation obtaining reliable optima, with levels of performance similar to those of the software's native parameterisations. In the final paper, details of computing the CAD sensitivities will be provided, including their accuracy as well as the linking of geometric sensitivities to aerodynamic objective functions and constraints; the impact on the robustness of the overall method will be assessed and alternative parameterisations will be included.
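
The chain rule implied by this approach can be sketched as follows: the gradient of an objective J with respect to a CAD parameter combines the adjoint surface sensitivity dJ/dX with a geometric sensitivity dX/dalpha obtained by perturbing the CAD parameter and regenerating the surface. Whether the paper computes the geometric sensitivities exactly this way is not stated in the abstract; the regenerate_surface callable and the toy linear surface below are hypothetical stand-ins, not CATIA or SU2 APIs.

```python
import numpy as np

def geometric_sensitivity(regenerate_surface, alphas, i, h=1e-4):
    """Finite-difference dX/dalpha_i: perturb one CAD parameter, regenerate the surface."""
    a_plus, a_minus = alphas.copy(), alphas.copy()
    a_plus[i] += h
    a_minus[i] -= h
    return (regenerate_surface(a_plus) - regenerate_surface(a_minus)) / (2.0 * h)

def cad_gradient(dJ_dX, regenerate_surface, alphas):
    """Project adjoint surface sensitivities onto each CAD design variable."""
    return np.array([
        np.sum(dJ_dX * geometric_sensitivity(regenerate_surface, alphas, i))
        for i in range(len(alphas))
    ])

# Toy stand-in for a parametric CAD surface: 10 points whose coordinates
# depend linearly on two design parameters.
basis = np.random.default_rng(1).normal(size=(2, 10, 3))
surface = lambda a: a[0] * basis[0] + a[1] * basis[1]

dJ_dX = np.ones((10, 3))             # stand-in for adjoint surface sensitivity
print(cad_gradient(dJ_dX, surface, np.array([1.0, 0.5])))
```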

Relevance:

30.00%

Publisher:

Abstract:

Schistosomiasis is a chronically debilitating helminth infection with a significant socio-economic and public health impact. Accurate diagnostics play a pivotal role in achieving current schistosomiasis control and elimination goals. However, many of the current diagnostic procedures, which rely on the detection of schistosome eggs, have major limitations, including a lack of accuracy and the inability to detect pre-patent infections. DNA-based detection methods provide a viable alternative to the tests commonly used for schistosomiasis diagnosis. Here we describe the optimisation of a novel droplet digital PCR (ddPCR) duplex assay for the diagnosis of Schistosoma japonicum infection, which provides improved detection sensitivity and specificity. The assay involves the amplification of two specific and abundant target gene sequences in S. japonicum: a retrotransposon (SjR2) and a portion of a mitochondrial gene (nad1). The assay detected target sequences in different sources of schistosome DNA isolated from adult worms, schistosomules and eggs, and exhibits a high level of specificity, thereby representing an ideal tool for the detection of low levels of parasite DNA in different clinical samples, including parasite cell-free DNA in the host circulation and other bodily fluids. Moreover, being quantitative, the assay can be used to determine parasite infection intensity and could provide an important tool for the detection of low-intensity infections in low-prevalence schistosomiasis-endemic areas.
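
The quantitative character mentioned above comes from the standard Poisson statistics of droplet digital PCR rather than anything specific to this assay. A minimal sketch is shown below; the droplet volume of about 0.85 nL is a typical value assumed for illustration, not a parameter taken from the paper.

```python
import math

# Standard ddPCR quantification: from the fraction of negative droplets,
# estimate the mean number of target copies per droplet (Poisson), then
# convert to copies per microlitre of reaction.
def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    negative_fraction = (total - positive) / total
    copies_per_droplet = -math.log(negative_fraction)        # Poisson estimate
    return copies_per_droplet / (droplet_volume_nl * 1e-3)   # per microlitre

# Example: 1200 positive droplets out of 15000 accepted droplets
print(round(ddpcr_copies_per_ul(1200, 15000), 1))
```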

Relevance:

30.00%

Publisher:

Abstract:

International audience

Relevance:

30.00%

Publisher:

Abstract:

The Kidney Exchange Problem (KEP) is an optimisation problem that was first discussed in Rapaport (1986) but has only more recently become the subject of substantial work by combinatorial optimisation researchers, in parallel with its increased prevalence in the medical community. In the basic formulation of a KEP, each instance of the problem features a directed graph D = (V, A). Each node i ∈ V represents an incompatible pair, wherein the patient needs to trade kidneys with the patient of another incompatible pair. The goal is to find an optimal set of cycles such that as many patients as possible receive a transplant. The problem is further complicated by the imposition of a cycle-size constraint, usually taken to be 3 or 4. Kidney exchange programs around the world implement different algorithms to solve the allocation problem by matching up kidneys from potential donors to patients. In some systems all transplants are considered equally desirable, whereas in others ranking criteria, such as the age of the patient or the distance they will need to travel, are applied; hence the multi-criteria nature of the KEP. To address this multi-criteria aspect, in this paper we propose a two-stage approach for the kidney exchange optimisation problem. In the first stage the goal is to find the optimal number of exchanges, and in the second stage the goal is to maximise the weighted sum of the kidney matches, subject to the added constraint that the number of exchanges must remain optimal. The idea can potentially be extended to multiple objectives by repeating the process over multiple runs. In our preliminary numerical experiments, we first find the maximum number of kidney matches using an existing open-source exact algorithm of Anderson et al. (2015). This solution is then used as the initial solution for the stage-two optimisation problem, in which two heuristic methods, steepest ascent and random ascent, are implemented to obtain good-quality solutions to the objective of maximising the total weight of exchanges. The neighbourhood is obtained by two-swaps. It is our intention in the future to implement a varying neighbourhood scheme within the same two heuristic frameworks, or within another meta-heuristic framework.
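
The toy sketch below illustrates the two-stage idea on a small hypothetical instance: cycles of length at most 3 are enumerated, the maximum number of transplants is found first, and among vertex-disjoint cycle packings achieving that number the one with the largest total weight is kept. It is a brute-force illustration of the formulation, not the exact algorithm of Anderson et al. (2015) or the ascent heuristics used in the paper.

```python
from itertools import combinations

# Arcs of a small compatibility digraph: donor of one pair -> patient of
# another pair, with a match weight (hypothetical data).
arcs = {
    (0, 1): 2.0, (1, 0): 1.0,
    (1, 2): 1.5, (2, 3): 1.0, (3, 1): 2.5,
    (2, 0): 1.0, (0, 2): 1.0,
}

def cycles_up_to_3(arcs):
    """Enumerate directed 2-cycles and 3-cycles, each listed once."""
    nodes = {v for a in arcs for v in a}
    out = []
    for i in nodes:
        for j in nodes:
            if i < j and (i, j) in arcs and (j, i) in arcs:
                out.append((i, j))                      # 2-cycle
            for k in nodes:
                if len({i, j, k}) == 3 and i == min(i, j, k):
                    if (i, j) in arcs and (j, k) in arcs and (k, i) in arcs:
                        out.append((i, j, k))           # 3-cycle
    return out

def weight(cycle):
    return sum(arcs[(cycle[i], cycle[(i + 1) % len(cycle)])] for i in range(len(cycle)))

cycles = cycles_up_to_3(arcs)
best = (0, 0.0, ())
for r in range(len(cycles) + 1):
    for pack in combinations(cycles, r):
        used = [v for c in pack for v in c]
        if len(used) == len(set(used)):                 # vertex-disjoint packing
            key = (len(used), sum(weight(c) for c in pack))  # transplants first, weight second
            if key > best[:2]:
                best = (*key, pack)
print("transplants:", best[0], "weight:", best[1], "cycles:", best[2])
```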

Relevance:

30.00%

Publisher:

Abstract:

Compression ignition (CI) engine design is subject to many constraints, which presents a multi-criteria optimisation problem that the engine researcher must solve. In particular, the modern CI engine must not only be efficient, but must also deliver low gaseous, particulate and life cycle greenhouse gas emissions so that its impact on urban air quality, human health and global warming is minimised. Consequently, this study undertakes a multi-criteria analysis which seeks to identify alternative fuels, injection technologies and combustion strategies that could potentially satisfy these CI engine design constraints. Three datasets are analysed with the Preference Ranking Organization Method for Enrichment Evaluations and Geometrical Analysis for Interactive Aid (PROMETHEE-GAIA) algorithm to explore the impact of (1) an ethanol fumigation system; (2) alternative fuels (20% biodiesel and synthetic diesel) and alternative injection technologies (mechanical direct injection and common rail injection); and (3) various biodiesel fuels made from three feedstocks (soy, tallow and canola) tested at several blend percentages (20-100%), on the resulting emissions and efficiency profile of the various test engines. The results show that moderate ethanol substitutions (~20% by energy) at moderate load, high-percentage soy blends (60-100%), and alternative fuels (biodiesel and synthetic diesel) provide an efficiency and emissions profile that yields the most "preferred" solutions to this multi-criteria engine design problem. Further research is, however, required to reduce Reactive Oxygen Species (ROS) emissions with alternative fuels, and to deliver technologies that do not significantly reduce the median diameter of particle emissions.
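
For context, a minimal PROMETHEE II sketch is shown below, using the "usual" preference function and equal criterion weights; the alternative scores are placeholders, not the engine data analysed in the study, and the GAIA geometric visualisation stage is not reproduced.

```python
import numpy as np

# Rows: alternatives (e.g. fuels/technologies); columns: criteria oriented so
# that larger is better (emission criteria entered negated). Placeholder data.
scores = np.array([
    [0.42, -3.1, -0.8],      # e.g. efficiency, -NOx, -PM (hypothetical)
    [0.40, -2.5, -0.6],
    [0.38, -2.0, -0.9],
])
weights = np.array([1/3, 1/3, 1/3])

n = len(scores)
pref = np.zeros((n, n))
for a in range(n):
    for b in range(n):
        if a != b:
            # usual criterion: full preference whenever a beats b on a criterion
            pref[a, b] = np.sum(weights * (scores[a] > scores[b]))

phi_plus = pref.sum(axis=1) / (n - 1)    # how strongly each alternative outranks the others
phi_minus = pref.sum(axis=0) / (n - 1)   # how strongly it is outranked
net_flow = phi_plus - phi_minus
print("ranking (best first):", np.argsort(-net_flow))
```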

Relevance:

30.00%

Publisher:

Abstract:

In empirical methods for the reinforcement design of underground excavations, an even distribution of rock bolts is generally recommended. This work proves that such a design is not necessarily optimal and shows how state-of-the-art reinforcement design could be improved through topology optimisation techniques. The Bidirectional Evolutionary Structural Optimisation (BESO) method has been extended to consider nonlinear material behaviour. An elastic, perfectly plastic Mohr-Coulomb model is utilised for both the original rock and the reinforced rock. The external work along the tunnel wall is taken as the objective function. Various in situ stress conditions, with different horizontal stress ratios and different geostatic stress magnitudes, are investigated through several examples. The outcomes show that the proposed approach is capable of improving tunnel reinforcement design. Moreover, the significant difference between the optimal reinforcement distributions obtained from linear and nonlinear analyses demonstrates the importance of realistic nonlinear material properties for the final outcome.
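
The sketch below illustrates, in general terms, the ranking-and-update step that characterises BESO-type methods: elements are ranked by a sensitivity number and reinforcement is reassigned to the highest-ranked elements while the reinforced volume fraction moves towards a target at a fixed evolution rate. The sensitivities here are random placeholders standing in for values a finite-element analysis would supply; this is a schematic, not the paper's implementation.

```python
import numpy as np

def beso_update(design, sensitivity, target_fraction, evolution_rate=0.02):
    """design: 0/1 array (1 = reinforced element); sensitivity: same shape."""
    n = design.size
    current = design.sum() / n
    # Move the reinforced fraction towards the target by at most the evolution rate
    next_fraction = np.clip(
        current + np.sign(target_fraction - current) * evolution_rate,
        min(current, target_fraction), max(current, target_fraction))
    n_keep = int(round(next_fraction * n))
    order = np.argsort(-sensitivity)          # most beneficial elements first
    new_design = np.zeros_like(design)
    new_design[order[:n_keep]] = 1            # reinforce the top-ranked elements
    return new_design

rng = np.random.default_rng(0)
design = (rng.random(100) < 0.5).astype(int)  # initial reinforcement layout
sens = rng.random(100)                        # stand-in for FE-derived sensitivities
print(beso_update(design, sens, target_fraction=0.3).sum(), "elements reinforced")
```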

Relevance:

30.00%

Publisher:

Abstract:

The performance of scintillation detectors, composed of a scintillator crystal coupled to a photodetector, depends critically on the efficiency with which scintillation photons are collected and extracted from the crystal to the sensor. In highly pixelated imaging systems (e.g. PET, CT), scintillators must be arranged in compact arrays with form factors that are unfavourable for photon transport, to the detriment of detector performance. The aim of the project is to optimise the performance of these pixel detectors by identifying the sources of light loss related to the spectral, spatial and angular characteristics of the scintillation photons incident on the scintillator faces. Such information, acquired by Monte Carlo simulation, allows appropriate weighting when evaluating the gains achievable through scintillator structuring methods aimed at improved light extraction towards the photodetector. A factorial design was used to assess the magnitude of the parameters affecting light collection, notably the absorption of the adhesive materials that hold the crystal arrays together and the optical performance of the reflectors, both of which have a considerable impact on light output. Moreover, a reflector widely used for its exceptional optical performance was characterised under more realistic conditions than immersion in air, in which its reflectivity is always reported. A significant loss of reflectivity when it is inserted within scintillator arrays was revealed by simulation and then confirmed experimentally. This explains the high levels of optical crosstalk observed and opens the way to array assembly methods that, depending on the application, either limit or take advantage of this unsuspected transparency.
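
As a small illustration of the kind of photon-transport question such a Monte Carlo simulation answers, the sketch below estimates the fraction of isotropically emitted photons that reach the exit face within the critical angle for total internal reflection. The refractive indices (about 1.8 for an LYSO-like crystal, 1.5 for the optical coupling) are illustrative assumptions, not the thesis parameters, and the sketch ignores reflectors, adhesives and multiple bounces.

```python
import numpy as np

rng = np.random.default_rng(0)

def escape_fraction(n_crystal=1.8, n_out=1.5, n_photons=1_000_000):
    """Fraction of photons heading to the exit face that lie inside the escape cone."""
    cos_theta_c = np.sqrt(1.0 - (n_out / n_crystal) ** 2)   # critical angle cosine
    cos_theta = rng.uniform(-1.0, 1.0, n_photons)           # isotropic emission
    heading_to_face = cos_theta > 0.0                        # travelling towards the face
    within_cone = cos_theta > cos_theta_c                    # not totally internally reflected
    return within_cone.sum() / heading_to_face.sum()

print(escape_fraction())   # about 1 - cos(theta_c), roughly 0.45 for these indices
```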