25 results for Interior point methods

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance: 90.00%

Abstract:

In this paper, we consider the secure beamforming design for an underlay cognitive radio multiple-input single-output broadcast channel in the presence of multiple passive eavesdroppers. Our goal is to design a jamming noise (JN) transmit strategy to maximize the secrecy rate of the secondary system. By utilizing the zero-forcing method to eliminate the interference caused by the JN to the secondary user, we study the joint optimization of the information and JN beamforming for secrecy rate maximization of the secondary system while satisfying all the interference power constraints at the primary users, as well as the per-antenna power constraint at the secondary transmitter. For an optimal beamforming design, the original problem is a nonconvex program, which can be reformulated as a convex program by applying the rank relaxation method. To this end, we prove that the rank relaxation is tight and propose a barrier interior-point method to solve the resulting saddle point problem based on a duality result. To find the global optimal solution, we transform the considered problem into an unconstrained optimization problem. We then employ the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method to solve the resulting unconstrained problem, which helps reduce the complexity significantly compared to conventional methods. Simulation results show the fast convergence of the proposed algorithm and substantial performance improvements over existing approaches.
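The barrier-plus-BFGS strategy described above can be sketched on a toy problem. Everything below (the objective, the constraints, the barrier schedule) is a hypothetical stand-in for illustration, not the paper's beamforming formulation: the inequality constraints are folded into a logarithmic barrier, and the resulting unconstrained problem is solved with SciPy's BFGS while the barrier parameter is tightened.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem (illustrative only): maximize log(1 + x0) subject to a
# power-style constraint x0 + x1 <= 2 and x >= 0, by folding the
# constraints into a log-barrier and solving the unconstrained problem
# with BFGS for an increasing barrier parameter t.

def barrier_objective(x, t):
    # Negative utility plus log-barrier terms; infeasible points get +inf.
    slacks = np.array([2.0 - x[0] - x[1], x[0], x[1]])
    if np.any(slacks <= 0):
        return np.inf
    return -np.log1p(x[0]) - (1.0 / t) * np.sum(np.log(slacks))

x = np.array([0.5, 0.5])                      # feasible starting point
for t in [1.0, 10.0, 100.0, 1000.0]:          # tighten the barrier gradually
    x = minimize(barrier_objective, x, args=(t,), method="BFGS").x

# As t grows, x approaches the constrained optimum (x0 -> 2, x1 -> 0).
```

Warm-starting each BFGS solve from the previous barrier solution is what keeps the inner iterations cheap as the barrier sharpens.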

Relevance: 80.00%

Abstract:

An appreciation of the quantity of streamflow derived from the main hydrological pathways involved in transporting diffuse contaminants is critical when addressing a wide range of water resource management issues. In order to assess hydrological pathway contributions to streams, it is necessary to provide feasible upper and lower bounds for flows in each pathway. An important first step in this process is to provide reliable estimates of the slower-responding groundwater pathways and subsequently the quicker overland and interflow pathways. This paper investigates the effectiveness of a multi-faceted approach applying different hydrograph separation techniques, supplemented by lumped hydrological modelling, for calculating the Baseflow Index (BFI), with the aim of developing an integrated approach to hydrograph separation. A semi-distributed, lumped and deterministic rainfall-runoff model known as NAM has been applied to ten catchments (ranging from 5 to 699 km²). While this modelling approach is useful as a validation method, NAM itself is also an important tool for investigation. The separation techniques produce a large variation in BFI: a difference of 0.741 in predicted BFI for one catchment when the less reliable fixed-interval, sliding-interval and local-minimum turning point methods are included. This variation is reduced to 0.167 when these methods are omitted. The Boughton and Eckhardt algorithms, while quite subjective in their use, provide quick and easily implemented approaches for obtaining physically realistic hydrograph separations. It is observed that while the different separation techniques give varying BFI values for each of the catchments, a recharge coefficient approach developed in Ireland, when applied in conjunction with the Master Recession Curve Tabulation method, predicts estimates in agreement with those obtained using the NAM model, and these estimates are also consistent with the study catchments' geology.
These two separation methods, in conjunction with the NAM model, were selected to form an integrated approach to assessing BFI in catchments.
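The Eckhardt algorithm mentioned above is a two-parameter recursive digital filter. A minimal sketch follows; the parameter values (recession constant a and BFImax) and the synthetic hydrograph are illustrative, not values calibrated for the study catchments.

```python
import numpy as np

def eckhardt_baseflow(q, a=0.98, bfi_max=0.80):
    """Recursive digital filter (Eckhardt form) for baseflow separation.

    q: streamflow series; a: recession constant; bfi_max: maximum BFI.
    Both parameters here are illustrative defaults.
    """
    b = np.empty_like(q, dtype=float)
    b[0] = q[0] * bfi_max                    # initial baseflow guess
    for t in range(1, len(q)):
        bt = ((1 - bfi_max) * a * b[t - 1]
              + (1 - a) * bfi_max * q[t]) / (1 - a * bfi_max)
        b[t] = min(bt, q[t])                 # baseflow cannot exceed total flow
    return b

q = np.array([5.0, 20.0, 50.0, 30.0, 15.0, 8.0, 6.0, 5.5])  # synthetic storm hydrograph
b = eckhardt_baseflow(q)
bfi = b.sum() / q.sum()                      # Baseflow Index for the record
```

The BFI then falls out directly as the ratio of filtered baseflow volume to total flow volume over the record.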

Relevance: 80.00%

Abstract:

Over the last two decades, ionic liquids have gained importance as alternative solvents to conventional VOCs in the field of homogeneous catalysis. This success is due not only to their ability to dissolve a large amount of metal catalysts, but also to their potential to enhance yields of enantiopure products. The preparation of a specific enantiomer is highly desired and sought after in the pharmaceutical industry. This work presents a study of the solubility in water and in a water/methanol mixture of a set of ILs composed of the bis(trifluoromethylsulfonyl)imide anion and the N-alkyl-triethyl-ammonium cation (abbrev. [NR,222][NTf2]), with the alkyl chain R ranging from 6 to 12 carbons. Mutual solubilities between the ILs and water, as well as between the ILs and the methanol/water mixture, were investigated in detail. These solubilities were measured using two well-known and accurate experimental techniques based on volumetric and cloud-point methods. Both methods enabled us to measure the Tx diagrams reflecting the mutual solubilities between water (or water/methanol) and the selected ILs in the temperature range from 293.15 to 338.15 K. The data were fitted using the modified Flory-Huggins equation proposed by de Sousa and Rebelo and compared with predictions from the COSMO-RS methodology.

Relevance: 80.00%

Abstract:

An optimal day-ahead scheduling method (ODSM) for the integrated urban energy system (IUES) is introduced, which considers the reconfigurable capability of an electric distribution network. The hourly topology of a distribution network, a natural gas network, the energy centers including the combined heat and power (CHP) units, different energy conversion devices and demand responsive loads (DRLs), are optimized to minimize the day-ahead operation cost of the IUES. The hourly reconfigurable capability of the electric distribution network utilizing remotely controlled switches (RCSs) is explored and discussed. The operational constraints from the unbalanced three-phase electric distribution network, the natural gas network, and the energy centers are considered. The interactions between the electric distribution network and the natural gas network take place through conversion of energy among different energy vectors in the energy centers. An energy conversion analysis model for the energy center was developed based on the energy hub model. A hybrid optimization method based on genetic algorithm (GA) and a nonlinear interior point method (IPM) is utilized to solve the ODSM model. Numerical studies demonstrate that the proposed ODSM is able to provide the IUES with an effective and economical day-ahead scheduling scheme and reduce the operational cost of the IUES.
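The two-level structure of the hybrid GA/IPM solver can be illustrated schematically. Everything below is a simplification for illustration: an exhaustive search over binary switch states stands in for the GA, an SLSQP solve stands in for the nonlinear interior point method, and the cost model is hypothetical.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

# Outer loop: discrete search over switch configurations (GA stand-in).
# Inner loop: continuous dispatch sub-problem (IPM stand-in).
# The loss model below is a toy dependence on topology, purely illustrative.

def inner_dispatch_cost(config):
    demand = 10.0
    loss = 0.05 + 0.02 * sum(config)         # hypothetical config-dependent losses
    res = minimize(lambda x: (x[0] ** 2) * (1 + loss),
                   x0=[demand],
                   constraints=[{"type": "eq", "fun": lambda x: x[0] - demand}],
                   method="SLSQP")           # continuous constrained solve
    return res.fun

# Enumerate 3 remotely controlled switches; a GA would search this space
# stochastically instead of exhaustively.
best = min(itertools.product([0, 1], repeat=3), key=inner_dispatch_cost)
```

The point of the decomposition is that the discrete topology variables never enter the continuous solver; each candidate topology is scored by the cost of its own optimal dispatch.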

Relevance: 80.00%

Abstract:

We consider a linear precoder design for an underlay cognitive radio multiple-input multiple-output broadcast channel, where the secondary system, consisting of a secondary base-station (BS) and a group of secondary users (SUs), is allowed to share the same spectrum with the primary system. All the transceivers are equipped with multiple antennas, each of which has its own maximum power constraint. Assuming the zero-forcing method to eliminate the multiuser interference, we study the sum rate maximization problem for the secondary system subject to both per-antenna power constraints at the secondary BS and the interference power constraints at the primary users. The problem of interest differs from the ones studied previously, which often assumed a sum power constraint and/or a single antenna employed at either both the primary and secondary receivers or the primary receivers. To develop an efficient numerical algorithm, we first invoke the rank relaxation method to transform the considered problem into a convex-concave problem based on a downlink-uplink result. We then propose a barrier interior-point method to solve the resulting saddle point problem. In particular, in each iteration of the proposed method we find the Newton step by solving a system of discrete-time Sylvester equations, which helps reduce the complexity significantly compared to the conventional method. Simulation results are provided to demonstrate fast convergence and effectiveness of the proposed algorithm.
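The complexity advantage claimed above comes from solving matrix equations in their structured form rather than vectorising them into a large Kronecker-product linear system. As a small self-contained illustration (not the paper's actual Newton system), SciPy provides a direct solver for the continuous-time Sylvester equation AX + XB = Q; the discrete-time variant used in the paper is handled analogously.

```python
import numpy as np
from scipy.linalg import solve_sylvester

# Random small instance: solve A X + X B = Q for X without ever forming
# the (n*m) x (n*m) vectorised system, which is what makes this cheap.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
Q = rng.standard_normal((4, 3))

X = solve_sylvester(A, B, Q)
residual = np.linalg.norm(A @ X + X @ B - Q)   # should be near machine precision
```

For n x n blocks the structured solve costs O(n³) (via Schur decompositions), versus O(n⁶) for naively inverting the vectorised system.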

Relevance: 40.00%

Abstract:

The particle size characteristics and encapsulation efficiency of microparticles prepared using triglyceride materials and loaded with two model water-soluble drugs were evaluated. Two emulsification procedures based on o/w and w/o/w methodologies were compared to a novel spray congealing procedure. After extensive modification of both emulsification methods, encapsulation efficiencies of 13.04% tetracycline HCl and 11.27% lidocaine HCl were achievable in a Witepsol®-based microparticle. This compares to much improved encapsulation efficiencies close to 100% for the spray congealing method, which was shown to produce spherical particles of ~58 µm. Drug release studies from a Witepsol® formulation loaded with lidocaine HCl showed a temperature-dependent release mechanism, which displayed diffusion-controlled kinetics at temperatures of ~25 °C, but exhibited almost immediate release when triggered using temperatures close to that of skin. Therefore, such a system may find application in topical semi-solid formulations, where a temperature-induced burst release is preferred.

Relevance: 40.00%

Abstract:

Major food adulteration and contamination events occur with alarming regularity and are known to be episodic, with the question being not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can lead to them becoming significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This can make the task of deciding which analytical methods are more suitable to collect and analyse (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable and ideally handheld and/or remote sensor devices that can be taken to, or be positioned on/at-line at, points of vulnerability along complex food supply networks, and require a minimum amount of background training to acquire information-rich data rapidly (ergo point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We discuss a future perspective of how this range of detection methods in the growing sensor portfolio, along with developments in computational and information sciences such as predictive computing and the Internet of Things, will together form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is a problem of systems, and therefore requires systems-level solutions and thinking.

Relevance: 30.00%

Abstract:

The results of a study aimed at determining the most important experimental parameters for automated, quantitative analysis of solid dosage form pharmaceuticals (seized and model 'ecstasy' tablets) are reported. Data obtained with a macro-Raman spectrometer were complemented by micro-Raman measurements, which gave information on particle size and provided excellent data for developing statistical models of the sampling errors associated with collecting data as a series of grid points on the tablets' surface. Spectra recorded at single points on the surface of seized MDMA-caffeine-lactose tablets with a Raman microscope (λex = 785 nm, 3 µm diameter spot) were typically dominated by one or other of the three components, consistent with Raman mapping data which showed the drug and caffeine microcrystals were ca 40 µm in diameter. Spectra collected with a microscope from eight points on a 200 µm grid were combined, and in the resultant spectra the average value of the Raman band intensity ratio used to quantify the MDMA:caffeine ratio, µr, was 1.19 with an unacceptably high standard deviation, σr, of 1.20. In contrast, with a conventional macro-Raman system (150 µm spot diameter), combined eight-grid-point data gave µr = 1.47 with σr = 0.16. A simple statistical model which could be used to predict σr under the various conditions used was developed. The model showed that the decrease in σr on moving to a 150 µm spot was too large to be due entirely to the increased spot diameter, but was consistent with the increased sampling volume that arose from a combination of the larger spot size and depth of focus in the macroscopic system. With the macro-Raman system, combining 64 grid points (0.5 mm spacing and 1-2 s accumulation per point) to give a single averaged spectrum for a tablet was found to be a practical balance between minimizing sampling errors and keeping overhead times at an acceptable level.
The effectiveness of this sampling strategy was also tested by quantitative analysis of a set of model ecstasy tablets prepared from MDEA-sorbitol (0-30% by mass MDEA). A simple univariate calibration model of averaged 64-point data had R² = 0.998 and an r.m.s. standard error of prediction of 1.1%, whereas data obtained by sampling just four points on the same tablet showed deviations from the calibration of up to 5%.
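The univariate calibration step described above can be sketched as follows. The intensity ratios and noise values below are synthetic numbers chosen for illustration, not the study's data; the structure (a straight-line fit of band-intensity ratio against known drug content, inverted to predict content, then summarised by R² and an r.m.s. error of prediction) matches the abstract's description.

```python
import numpy as np

# Synthetic calibration set: known MDEA content (% by mass) vs a
# band-intensity ratio with small added measurement noise (illustrative).
known_pct = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
noise = np.array([0.01, -0.02, 0.015, 0.0, -0.01, 0.02, -0.005])
ratio = 0.04 * known_pct + 0.02 + noise

slope, intercept = np.polyfit(known_pct, ratio, 1)   # univariate linear model
predicted_pct = (ratio - intercept) / slope          # invert the calibration

ss_res = np.sum((ratio - (slope * known_pct + intercept)) ** 2)
ss_tot = np.sum((ratio - ratio.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot                      # goodness of fit
rmsep = np.sqrt(np.mean((predicted_pct - known_pct) ** 2))  # r.m.s. error of prediction
```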

Relevance: 30.00%

Abstract:

Aims. A magneto-hydrostatic model is constructed with spectropolarimetric properties close to those of solar photospheric magnetic bright points.
Methods. Results of solar radiative magneto-convection simulations are used to produce the spatial structure of the vertical component of the magnetic field. The horizontal component of the magnetic field is reconstructed using the self-similarity condition, while the magneto-hydrostatic equilibrium condition is applied to the standard photospheric model with the magnetic field embedded. Partial ionisation processes are found to be necessary for reconstructing the correct temperature structure of the model.
Results. The structures obtained are in good agreement with observational data. By combining the realistic structure of the magnetic field with the temperature structure of the quiet solar photosphere, the continuum formation level above the equipartition layer can be found. Preliminary results are shown of wave propagation through this magnetic structure. The observational consequences of the oscillations are examined in continuum intensity and in the Fe I 6302 Å magnetically sensitive line.

Relevance: 30.00%

Abstract:

BACKGROUND: In 2005, the European Commission recommended that all member states should establish or strengthen surveillance systems for monitoring the use of antimicrobial agents. There is no evidence in the literature of any surveillance studies having been specifically conducted in nursing homes (NHs) in Northern Ireland (NI).

OBJECTIVE: The aim of this study was to determine the prevalence of antimicrobial prescribing and its relationship with certain factors (e.g. indwelling urinary catheterization, urinary incontinence, disorientation, etc.) in NH residents in NI.

METHODS: This project was carried out in NI as part of a wider European study under the protocols of the European Surveillance of Antimicrobial Consumption group. Two point-prevalence surveys (PPSs) were conducted in 30 NHs in April and November 2009. Data were obtained from nursing notes, medication administration records and staff in relation to antimicrobial prescribing, facility and resident characteristics and were analysed descriptively.

RESULTS: The point prevalence of antimicrobial prescribing was 13.2% in April 2009 and 10.7% in November 2009, with a 10-fold difference existing between the NHs with the highest and lowest antimicrobial prescribing prevalence during both PPSs. The same NH had the highest rate of antimicrobial prescribing during both April (30.6%) and November (26.0%). The group of antimicrobials most commonly prescribed was the penicillins (April 28.6%, November 27.5%) whilst the most prevalent individual antimicrobial prescribed was trimethoprim (April 21.3%, November 24.3%). The majority of antimicrobials were prescribed for the purpose of preventing urinary tract infections (UTIs) in both April (37.8%) and in November (46.7%), with 5% of all participating residents being prescribed an antimicrobial for this reason. Some (20%) antimicrobials were prescribed at inappropriate doses, particularly those which were used for the purpose of preventing UTIs. Indwelling urinary catheterization and wounds were significant risk factors for antimicrobial use in April [odds ratio (OR) (95% CI) 2.0 (1.1, 3.5) and 1.8 (1.1, 3.0), respectively] but not in November 2009 [OR (95% CI) 1.6 (0.8, 3.2) and 1.2 (0.7, 2.2), respectively]. Other resident factors, e.g. disorientation, immobility and incontinence, were not associated with antimicrobial use. Furthermore, none of the NH characteristics investigated (e.g. number of beds, hospitalization episodes, number of general practitioners, etc.) were found to be associated with antimicrobial use in either April or November 2009.

CONCLUSIONS: This study has identified a high overall rate of antimicrobial use in NHs in NI, with variability evident both within and between homes. More research is needed to understand which factors influence antimicrobial use and to determine the appropriateness of antimicrobial prescribing in this population in general and more specifically in the management of recurrent UTIs.
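The odds ratios with 95% confidence intervals reported in the results follow the standard log-odds formula on a 2x2 exposure/outcome table. A minimal sketch is below; the counts are hypothetical, since the abstract gives only the resulting intervals, not the raw data.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """OR and 95% CI from a 2x2 table.

    a: exposed with outcome, b: exposed without outcome,
    c: unexposed with outcome, d: unexposed without outcome.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: e.g. catheterized residents on/off antimicrobials
# vs non-catheterized residents on/off antimicrobials.
or_, lo, hi = odds_ratio_ci(20, 50, 30, 150)
# An interval that excludes 1.0 indicates a statistically significant association.
```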

Relevance: 30.00%

Abstract:

Based upon the original application to the European Commission, this article gives insights into the thinking of the Euroidentities team at the point that the project began. The question “Is the European ‘identity project’ failing?” is posed in the sense that the political and economic attainments of the European Union have not been translated into a sense of identity with or commitment to Europe from the populaces that have benefited from them. The urgency of European ‘identity work’ is asserted, with a number of levels for the construction of European identity being hypothesized. Euroidentities is intended to break conceptual ground by bringing together on an equal footing two apparently antagonistic views of identity, the collective and institutional and the individual and biographical, to give a more anchored and nuanced view of identity formation and transformation than either can provide on its own. Rather than following the dominant approaches to research on European identity, which have been macro-theoretical and ‘top-down’, retrospective in-depth qualitative biographical interviews are planned, since they provide the ideal means of gaining insight into the formation of a European identity or multiple identities from the ‘bottom-up’ perspective of non-elite groups. The reliability of analysis will be buttressed by the use of contrastive comparison between cases, culminating in contrastive comparison across the national project teams between cases drawn from different ‘sensitized groups’ that provide the fieldwork structure of the project. The paper concludes with a summary of some of the more significant findings.

Relevance: 30.00%

Abstract:

This paper compares the applicability of three ground survey methods for modelling terrain: one-man electronic tachymetry (TPS), real-time kinematic GPS (GPS), and terrestrial laser scanning (TLS). Vertical accuracy of digital terrain models (DTMs) derived from GPS, TLS and airborne laser scanning (ALS) data is assessed. Point elevations acquired by the four methods represent two sections of a mountainous area in Cumbria, England. They were chosen so that the presence of non-terrain features is constrained to the smallest amount. The vertical accuracy of the DTMs was addressed by subtracting each DTM from TPS point elevations. The error was assessed using exploratory measures including statistics, histograms, and normal probability plots. The results showed that the internal measurement accuracy of TPS, GPS, and TLS was below a centimetre. TPS and GPS can be considered equally applicable alternatives for sampling the terrain in areas accessible on foot. The highest DTM vertical accuracy was achieved with GPS data, both on sloped terrain (RMSE 0.16 m) and flat terrain (RMSE 0.02 m). TLS surveying was the most efficient overall but veracity of terrain representation was subject to dense vegetation cover. Therefore, the DTM accuracy was the lowest for the sloped area with dense bracken (RMSE 0.52 m) although it was the second highest on the flat unobscured terrain (RMSE 0.07 m). ALS data represented the sloped terrain more realistically (RMSE 0.23 m) than the TLS. However, due to a systematic bias identified on the flat terrain the DTM accuracy was the lowest (RMSE 0.29 m), which was above the level stated by the data provider. Error distribution models were more closely approximated by a normal distribution defined using the median and normalized median absolute deviation, which supports the use of robust measures in DEM error modelling and its propagation. © 2012 Elsevier Ltd.
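The two error summaries the comparison relies on can be sketched directly: the conventional RMSE, and the normalized median absolute deviation (NMAD) used in the robust analysis of DTM errors. The error values below are synthetic, with one gross outlier added to show why the robust measure is preferred.

```python
import numpy as np

def rmse(err):
    """Root mean square error of a vector of elevation errors."""
    return float(np.sqrt(np.mean(np.square(err))))

def nmad(err):
    """Normalized median absolute deviation.

    The factor 1.4826 scales the MAD to match the standard deviation
    of a normal distribution, making NMAD a robust sigma estimate.
    """
    return float(1.4826 * np.median(np.abs(err - np.median(err))))

# Synthetic DTM errors (m): small residuals plus one gross outlier,
# e.g. a vegetation-contaminated point.
err = np.array([0.01, -0.02, 0.03, 0.00, -0.01, 0.02, 0.80])
# RMSE is inflated by the single outlier; NMAD stays close to the
# spread of the bulk of the errors.
```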

Relevance: 30.00%

Abstract:

Last interglacial sediments in unglaciated Alaska and Yukon (eastern Beringia) are commonly identified by palaeoecological indicators and stratigraphic position ~2-5 m above the regionally prominent Old Crow tephra (124 ± 10 ka). We demonstrate that this approach can yield erroneous age assignments using data from a new exposure at the Palisades, a site in interior Alaska with numerous exposures of last interglacial sediments. Tephrochronology, stratigraphy, plant macrofossils, pollen and fossil insects from a prominent wood-rich organic silt unit are all consistent with a last interglacial age assignment. However, six 14C dates on plant and insect macrofossils from the organic silt range from non-finite to 4.0 14C ka BP, indicating that the organic silt instead represents a Holocene deposit with a mixed-age assemblage of organic material. In contrast, wood samples from presumed last interglacial organic-rich sediments elsewhere at the Palisades, in a similar stratigraphic position with respect to Old Crow tephra, yield non-finite 14C ages. Given that local permafrost thaw since the last interglaciation may facilitate reworking of older sediments into new stratigraphic positions, minimum constraining ages based on 14C dating or other methods should supplement age assignments for last interglacial sediments in eastern Beringia that are based on palaeoecology and stratigraphic association with Old Crow tephra.

Relevance: 30.00%

Abstract:

Modern business practices in engineering are increasingly turning to post-manufacture service provision in an attempt to generate additional revenue streams and ensure commercial sustainability. Maintainability has always been a consideration during the design process, but in the past it has generally been considered to be of tertiary importance behind manufacturability and primary product function in terms of design priorities. The need to draw whole-life considerations into concurrent engineering (CE) practice has encouraged companies to address issues such as maintenance earlier in the design process, giving equal importance to all aspects of the product lifecycle. The consideration of design for maintainability (DFM) early in the design process has the potential to significantly reduce maintenance costs and improve overall running efficiencies as well as safety levels. However, a lack of simulation tools still hinders the adaptation of CE to include practical elements of design, and therefore further research is required to develop methods by which ‘hands on’ activities such as maintenance can be fully assessed and optimised as concepts develop. Virtual Reality (VR) has the potential to address this issue, but the application of these traditionally high-cost systems can require complex infrastructure, and their use has typically focused on aesthetic aspects of mature designs. This paper examines the application of cost-effective VR technology to the rapid assessment of aircraft interior inspection during conceptual design. It focuses on the integration of VR hardware with a typical desktop engineering system and examines the challenges with data transfer, graphics quality and the development of practical user functions within the VR environment.
Conclusions drawn to date indicate that the system has the potential to improve maintenance planning through the provision of a usable environment for inspection which is available as soon as preliminary structural models are generated as part of the conceptual design process. Challenges still exist in the efficient transfer of data between the CAD and VR environments as well as the quantification of any benefits that result from the proposed approach. The result of this research will help to improve product maintainability, reduce product development cycle times and lower maintenance costs.

Relevance: 30.00%

Abstract:

The environmental quality of land can be assessed by calculating relevant threshold values, which differentiate between concentrations of elements resulting from geogenic and diffuse anthropogenic sources and concentrations generated by point sources of elements. A simple process allowing the calculation of these typical threshold values (TTVs) was applied across a region of highly complex geology (Northern Ireland) to six elements of interest: arsenic, chromium, copper, lead, nickel and vanadium. Three methods for identifying domains (areas where a readily identifiable factor can be shown to control the concentration of an element) were used: k-means cluster analysis, boxplots and empirical cumulative distribution functions (ECDF). The ECDF method was most efficient at determining areas of both elevated and reduced concentrations and was used to identify domains in this investigation. Two statistical methods for calculating normal background concentrations (NBCs) and upper limits of geochemical baseline variation (ULBLs), currently used in conjunction with legislative regimes in the UK and Finland respectively, were applied within each domain. The NBC methodology was constructed to run within a specific legislative framework, and its use on this soil geochemical data set was influenced by the presence of skewed distributions and outliers. In contrast, the ULBL methodology was found to calculate more appropriate TTVs that were generally more conservative than the NBCs. TTVs indicate what a "typical" concentration of an element would be within a defined geographical area and should be considered alongside the risk that each of the elements poses in these areas to determine potential risk to receptors.
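The ECDF construction at the heart of the domain-identification step can be sketched in a few lines. The synthetic lognormal concentrations and the choice of the 95th percentile as a candidate threshold are illustrative assumptions only; the study's actual NBC and ULBL procedures are more involved.

```python
import numpy as np

def ecdf(values):
    """Return (sorted values, step heights) of the empirical CDF."""
    x = np.sort(values)
    y = np.arange(1, len(x) + 1) / len(x)
    return x, y

# Synthetic, right-skewed concentrations, as is typical of trace elements
# in soil (lognormal is an illustrative choice, not the study's model).
rng = np.random.default_rng(1)
conc = rng.lognormal(mean=2.0, sigma=0.5, size=500)

x, y = ecdf(conc)
# A high quantile of the ECDF as a simple stand-in for a typical
# threshold value separating baseline from anomalous concentrations.
threshold = float(np.quantile(conc, 0.95))
```

Plotting y against x for different candidate domains makes breaks in the distribution visible, which is how ECDFs help flag areas of elevated or reduced concentrations.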